Robots Learning Without Us? New Study Cuts Humans From Early Testing
Training social robots to interact effectively may no longer require humans in the loop at every stage, thanks to a new study from the University of Surrey and the University of Hamburg.
The study, which will be presented at this year's IEEE International Conference on Robotics and Automation (ICRA), introduces a simulation method that lets researchers test their social robots without recruiting human participants, making early-stage research faster and more scalable.
The research team developed a dynamic scanpath prediction model for a humanoid robot, enabling it to predict where a person would look in a social setting. The model was evaluated on two publicly available datasets, and the results showed that the humanoid robot could reproduce human-like eye movements.
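To illustrate the general idea of scanpath prediction (not the authors' actual model), the toy sketch below greedily picks a sequence of fixation points from a saliency map using inhibition of return, then scores the predicted scanpath against a made-up human one with a simple average-distance measure. All names, map sizes, and the error metric are illustrative assumptions, not details from the study.

```python
import numpy as np

def predict_scanpath(saliency, n_fixations=5, suppress_radius=10):
    """Toy predictor: return (row, col) fixations chosen greedily from a 2D saliency map."""
    sal = saliency.astype(float).copy()
    h, w = sal.shape
    ys, xs = np.mgrid[0:h, 0:w]
    fixations = []
    for _ in range(n_fixations):
        r, c = np.unravel_index(np.argmax(sal), sal.shape)
        fixations.append((r, c))
        # Suppress a disc around the chosen point so the next fixation moves elsewhere
        # (a crude "inhibition of return").
        mask = (ys - r) ** 2 + (xs - c) ** 2 <= suppress_radius ** 2
        sal[mask] = -np.inf
    return fixations

def mean_fixation_error(predicted, observed):
    """Average Euclidean distance between paired predicted and observed fixations."""
    p = np.asarray(predicted, dtype=float)
    o = np.asarray(observed, dtype=float)
    return float(np.mean(np.linalg.norm(p - o, axis=1)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    saliency = rng.random((64, 64))  # stand-in for a model-produced saliency map
    human_scanpath = [(10, 12), (40, 33), (22, 50), (55, 8), (30, 30)]  # hypothetical ground truth
    robot_scanpath = predict_scanpath(saliency, n_fixations=5)
    print("Predicted fixations:", robot_scanpath)
    print("Mean error vs. human scanpath:", mean_fixation_error(robot_scanpath, human_scanpath))
```

In practice, evaluations of this kind compare predicted fixation sequences against recorded human gaze data from benchmark datasets, which is what allows testing without live participants.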
Dr Di Fu, co-lead of the study and Lecturer in Cognitive Neuroscience at the University of Surrey, said:
"Our method allows us to test whether a robot is paying attention to the right things -- just as a human would -- without needing real-time human supervision. What's exciting is that the model remains accurate even in noisy, unpredictable environments, making it a promising tool for real-world applications like education, healthcare, and customer service."
Social robots are designed to interact with people using speech, gestures, and expressions, making them useful in education, healthcare, and customer service. Examples include Pepper, a retail assistant, and Paro, a therapeutic robot used with dementia patients.