New approach brings robot pets closer to reality
By ANI | Saturday, September 18, 2010
WASHINGTON - The day may not be far off when you can keep a robot pet that mimics the behaviour of a real one: researchers in Taiwan have come closer to developing a robot vision module that might one day recognize human facial expressions and respond appropriately.
Users expect their robot pets to be almost as good as the “robots” they see in 3D movies and games.
The researchers, Wei-Po Lee, Tsung-Hsien Yang and Bingchiang Jeng of National Sun Yat-sen University, have now turned to neural networks to help them break the cycle of repetitive behaviour in robot toys and to endow the toys with something approaching emotional responses to interactions.
“We have developed a user-centric interactive framework that employs a neural network-based approach to construct behaviour primitives and behaviour arbitrators for robots,” explained the team.
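The paper itself is not quoted in code, but the idea of separate behaviour primitives chosen by a neural-network "arbitrator" can be sketched in a few lines of Python. Everything below (the sensor inputs, the primitive names, the random weights) is illustrative guesswork, not the researchers' actual design.

```python
import numpy as np

# Hypothetical behaviour primitives: simple, self-contained actions.
def wag_tail():
    return "wagging tail"

def approach_owner():
    return "approaching owner"

def back_away():
    return "backing away"

PRIMITIVES = [wag_tail, approach_owner, back_away]

# A tiny feedforward "arbitrator" that maps sensor readings to a primitive.
# The weights here are random stand-ins; in a framework like the one the
# team describes, they would be shaped by interaction with the owner.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))  # 4 sensor inputs -> 8 hidden units
W2 = rng.normal(size=(8, 3))  # 8 hidden units -> 3 primitives

def arbitrate(sensors):
    """Select the behaviour primitive with the highest network output."""
    hidden = np.tanh(sensors @ W1)
    scores = hidden @ W2
    return PRIMITIVES[int(np.argmax(scores))]

# A made-up sensor vector, e.g. [owner proximity, noise, light, touch].
print(arbitrate(np.array([0.2, 0.8, 0.1, 0.5]))())
```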
Their evaluation suggests the approach should allow them to construct an emotion-based pet robot much more quickly than conventional design and prototyping methods allow.
Building fully autonomous artificial creatures with intelligence akin to that of humans is a very long-term goal of robot design and computer science.
On the way to such machines, home entertainment and utility devices such as “Tamagotchi” digital pets, domestic toy robots such as Aibo the robotic dog, and even the Roomba robotic vacuum cleaner have been developed.
At the same time, popular science fiction culture has raised consumer expectations.
“With current technologies in computing and electronics and knowledge in ethology, neuroscience and cognition, it is now possible to create embodied prototypes of artificial living toys acting in the physical world,” explained Wei-Po Lee and colleagues at the National Sun Yat-sen University, Kaohsiung.
The team explained that there are three major issues to be considered in robot design.
The first is to construct an appropriate control architecture so that the robot can behave coherently. The second is to develop natural ways for the robot to interact with a person. The third is to embed emotional responses and behaviour into the robot’s computer.
The researchers hope to address all three issues with a behaviour-based architecture built on a neural network, one that could allow the owner of a robot pet to reconfigure the device to “learn”, or evolve, new behaviours while ensuring that the pet functions properly in real time.
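As a rough illustration of what such owner-driven reconfiguration could look like, the sketch below uses a perceptron-style update to shift a simple arbitrator toward the behaviour an owner rewards. The sensor context, behaviour names, and learning rule are assumptions made for the sake of example, not the team's published method.

```python
import numpy as np

# A toy sketch of owner-driven "learning": a linear arbitrator whose weights
# are nudged whenever the owner corrects the pet's chosen behaviour.
# Behaviours, sensors, and update rule are illustrative assumptions.
BEHAVIOURS = ["wag tail", "approach", "back away"]
rng = np.random.default_rng(1)
weights = rng.normal(scale=0.1, size=(4, 3))  # 4 sensors -> 3 behaviours

def choose(sensors):
    """Pick the behaviour with the highest linear score."""
    return int(np.argmax(sensors @ weights))

def owner_feedback(sensors, desired, lr=0.1):
    """Reinforce the behaviour the owner wanted; weaken the mistaken one."""
    chosen = choose(sensors)
    if chosen != desired:
        weights[:, desired] += lr * sensors
        weights[:, chosen] -= lr * sensors

# The owner repeatedly signals that this "friendly" context means approach.
friendly = np.array([0.9, 0.1, 0.8, 0.2])
for _ in range(20):
    owner_feedback(friendly, desired=BEHAVIOURS.index("approach"))

print(BEHAVIOURS[choose(friendly)])  # after the corrections: "approach"
```

Because the update only fires when the pet gets it wrong, the toy learner settles down once its choice matches the owner's wishes, a crude stand-in for the real-time stability the researchers say their framework must guarantee.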
The team has evaluated the framework by building robot controllers that successfully carry out a variety of tasks.
Now they, like countless other research teams across the globe, are working on vision modules for robots.
The technique is not yet fully mature, but they ultimately hope to build a robot pet that can recognize its owner’s facial expressions and perhaps respond accordingly.
Such a development has major implications for interactive devices, computers and functional robots of the future.
The study has been published in the current issue of the International Journal of Modelling, Identification and Control. (ANI)