Bengaluru: Researchers at the International Institute of Information Technology Bangalore (IIIT-B) have developed a robotic model that they say understands humans and interacts with them based on emotions.
Existing models predict engagement behaviour directly from multi-modal features, without incorporating personality inferences or theories of interpersonal behaviour from human-human interactions, according to the researchers.
A research paper by Soham Joshi, Arpitha Malavalli and Shrisha Rao was recently published by PLOS, an open-access publisher.
“We want the robot to show cognitive improvement in a sensible way, not in a completely fixed manner. If a person poses a query with a particular attitude, the response from a human would differ accordingly. What we want to see is an automated system that more accurately models the engagement and behaviour of the humans interacting with it and adjusts its responses accordingly,” Rao, the faculty supervisor for the project, said.
Rao further explained that this is a pipeline in which long-standing psychological theories have been brought into the field of Artificial Intelligence (AI).
“We’ve used classical psychological theories about human personality, engagement and so on to help the robot or system build a better model of how to predict human engagement and adjust its behaviour,” he added.
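The general shape of such a pipeline can be illustrated with a toy sketch: infer a personality trait from behavioural cues first, then use that inference to reinterpret the raw multi-modal signals when estimating engagement. All feature names, weights and thresholds below are hypothetical illustrations, not values from the paper.

```python
# Toy sketch of a personality-aware engagement pipeline.
# Every cue name and coefficient here is a made-up placeholder.

def infer_extraversion(speech_rate: float, gaze_contact: float) -> float:
    """Toy personality inference: map two normalised cues (0..1)
    to a Big Five extraversion estimate (0..1)."""
    return max(0.0, min(1.0, 0.6 * speech_rate + 0.4 * gaze_contact))

def predict_engagement(features: dict) -> float:
    """Baseline-style prediction straight from multi-modal features,
    with no personality model."""
    return max(0.0, min(1.0, 0.5 * features["gaze_contact"] + 0.5 * features["smile"]))

def predict_engagement_with_personality(features: dict) -> float:
    """Personality-aware prediction: the inferred trait rescales how
    the raw behavioural cues are interpreted."""
    extraversion = infer_extraversion(features["speech_rate"], features["gaze_contact"])
    base = predict_engagement(features)
    # An introvert showing the same outward cues may be more engaged than
    # the raw signal suggests; boost the estimate as extraversion decreases.
    return max(0.0, min(1.0, base * (1.2 - 0.4 * extraversion)))

person = {"speech_rate": 0.3, "gaze_contact": 0.5, "smile": 0.6}
print(round(predict_engagement(person), 2))                   # 0.55
print(round(predict_engagement_with_personality(person), 2))  # 0.58
```

The point of the sketch is the structure, not the numbers: the same observed cues yield a different engagement estimate once a personality inference sits between the features and the prediction.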
Joshi said that state-of-the-art systems simply use multi-modal neural networks to predict behaviour. “But what about emotion?” he asks.
According to Rao, this is an attempt to bring psychology into the game.
“The robots/models must have some understanding of psychology and consider the interest and age of the person they interact with, instead of ignoring the human element,” he said.
It is an important breakthrough in bringing psychology into the mix, the researchers say, adding that further development is possible in due course.
The technology can be used to create better personal assistive devices. Current assistive devices have no awareness of the user's mood and give static responses. The new technology helps build devices that understand emotions and mood, the researchers say.
The technology can also be helpful in industrial settings, where it could gauge a person's mental state. It could likewise issue an alert if the person behind the wheel appears sleepy or sick based on their behaviour, helping prevent accidents caused by driver fatigue, they say.