Deceptive robots to revolutionise military, search and rescue operations

By ANI
Friday, September 10, 2010

WASHINGTON - In what is believed to be the first detailed examination of robot deception, researchers at the Georgia Institute of Technology made a robot evade an enemy soldier by creating a false trail and hiding so that it would not be caught.

In the future, robots capable of deception may be valuable for several different areas, including military and search and rescue operations.

A search and rescue robot may need to deceive in order to calm a panicking victim or gain the victim’s cooperation.

Robots on the battlefield capable of deception would be able to hide from and mislead the enemy, keeping both themselves and valuable information safe.

“We have developed algorithms that allow a robot to determine whether it should deceive a human or other intelligent machine and we have designed techniques that help the robot select the best deceptive strategy to reduce its chance of being discovered,” said Ronald Arkin, a Regents professor in the Georgia Tech School of Interactive Computing.

Because the researchers explored the phenomenon of robot deception from a general perspective, the study’s results apply to both robot-robot and human-robot interactions.

“Most social robots will probably rarely use deception, but it’s still an important tool in the robot’s interactive arsenal because robots that recognize the need for deception have advantages in terms of outcome compared to robots that do not recognize the need for deception,” said the study’s co-author, Alan Wagner.

For this study, the researchers focused on the actions, beliefs and communications of a robot attempting to hide from another robot, and used these to develop programs that successfully produced deceptive behaviour.

Their first step was to teach the deceiving robot how to recognize a situation that warranted the use of deception.

They used interdependence theory and game theory to develop algorithms that tested the value of deception in a specific situation.

A situation had to satisfy two key conditions to warrant deception: there must be conflict between the deceiving robot and the seeker, and the deceiver must benefit from the deception.

Once a situation was deemed to warrant deception, the robot carried out a deceptive act by providing a false communication to benefit itself.
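A minimal sketch of what such a test might look like in code, assuming a simple payoff table for each robot (the function names and numbers below are illustrative assumptions, not the researchers’ actual algorithms):

```python
# Illustrative sketch only: the names and payoff values are assumptions,
# not taken from the Georgia Tech study.

def conflict_exists(hider_payoffs, seeker_payoffs):
    """Condition 1: the outcome the hider prefers differs from the seeker's preferred outcome."""
    hider_best = max(hider_payoffs, key=hider_payoffs.get)
    seeker_best = max(seeker_payoffs, key=seeker_payoffs.get)
    return hider_best != seeker_best

def deception_pays(hider_payoffs):
    """Condition 2: the hider expects a better outcome by deceiving than by being honest."""
    return hider_payoffs["deceive"] > hider_payoffs["honest"]

def should_deceive(hider_payoffs, seeker_payoffs):
    """Deceive only when both conditions described in the study hold."""
    return conflict_exists(hider_payoffs, seeker_payoffs) and deception_pays(hider_payoffs)

# Example: a pursued hider values escaping; the seeker values making a capture.
hider = {"deceive": 5, "honest": -2}
seeker = {"deceive": -5, "honest": 2}
print(should_deceive(hider, seeker))  # True -> a false communication is worthwhile
```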

The technique developed by the Georgia Tech researchers based a robot’s deceptive action selection on its understanding of the individual robot it was attempting to deceive.
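To picture that partner-model step, here is a minimal sketch assuming the hider holds a probability estimate of where a particular seeker will look; it directs the false trail toward the spot that seeker finds most plausible and hides in the spot least likely to be searched (hypothetical names and values, not the authors’ implementation):

```python
# Illustrative sketch only: 'seeker_belief' is a hypothetical model mapping each
# hiding place to the probability this particular seeker assigns to it.

def choose_deception(seeker_belief):
    # Lay the false trail toward the place this seeker finds most plausible ...
    decoy = max(seeker_belief, key=seeker_belief.get)
    # ... and actually hide where this seeker is least likely to look.
    hideout = min(seeker_belief, key=seeker_belief.get)
    return decoy, hideout

belief = {"left corridor": 0.6, "centre room": 0.3, "right corridor": 0.1}
decoy, hideout = choose_deception(belief)
print(f"Lay a false trail toward the {decoy}; hide in the {hideout}")
```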

To test their algorithms, the researchers ran 20 hide-and-seek experiments with two autonomous robots.

“The experimental results weren’t perfect, but they demonstrated the learning and use of deception signals by real robots in a noisy environment,” said Wagner.

“The results were also a preliminary indication that the techniques and algorithms described in the paper could be used to successfully produce deceptive behavior in a robot,” he added.

The results of the study are published online in the International Journal of Social Robotics. (ANI)
