"PLEASE put down your weapon. You have 20 seconds to comply." So said the armed robot in Paul Verhoeven's 1987 movie RoboCop.
The suspect drops his weapon but a fault in the robot's software means it opens fire anyway. Nearly two decades later, such fictional weapon-toting robots are looking startlingly close to reality – and New Scientist has discovered that some may eventually help to decide who is friend and who is foe.
Sometime in the coming months, chances are that we'll be seeing TV reports that an armed remote-controlled robot has been used in anger for the first time. "They will appear when they appear. I can't talk about when that may be," says Bob Quinn, general manager at Foster-Miller of Waltham, Massachusetts, whose machine-gun-equipped robot, called Sword, was certified safe for use by the US forces in June.
Robots have already shown their mettle in defensive roles, detonating improvised bombs in the UK, Israel, Iraq and Afghanistan. Foster-Miller's Talon robot and its rival PackBot, from the Massachusetts-based company iRobot, are the lightweight robots now used for these tasks. These tracked machines, controlled by an operator sitting in an armoured vehicle, are capable of being driven at high speed and use manipulator arms and grippers to place a small explosive charge to disable a suspected bomb.
Now versions of these robots are being developed that will allow troops to manoeuvre and fire a variety of weapons. iRobot has built a prototype equipped with a 20-round shotgun. "It will be able to fire over four dozen different kinds of shotgun ammunition, everything from large slugs that would kill an elephant, to buckshot that would cover a wide area," says Joe Dyer, head of iRobot's military division. Foster-Miller's Sword is a variant of Talon in which the manipulator arm has been replaced by a rotating machine-gun carrier. "It's for urban combat and perimeter security and it's fully controlled by the soldier," Quinn says. Touted uses include checking out a potential ambush.
Both companies stress that there is always a human in control of the robots. Apart from a planned autonomous "return home" function, neither Sword nor the iRobot prototype operates autonomously.
Nevertheless, more complex machines may soon be on the drawing board. A research request issued in August by the Pentagon's Office of Naval Research (ONR) shows that military robots may one day be asked to make some important decisions on their own. The ONR wants to engineer mobile robots that can "understand cooperative and uncooperative" people, and inform their operator if those people seem to pose a threat. It hopes to do this using artificial intelligence software fed with data from a "remote physiological stress monitoring" system, combined with speech, face and gesture recognition. From this the robot would draw inferences about the threat a person poses.
It's a prospect that is causing some concern. "It is ethically problematic to use software that may work in lab conditions but not under a whole range of extreme conditions, such as when you suspect someone might be a suicide bomber," says Kirsten Dautenhahn, an AI expert at the University of Hertfordshire in the UK.
Lucy Suchman, an expert in human-computer interaction at Lancaster University, UK, is even more critical: "This plan is just ridiculous. It involves the worst kind of simplistic profiling. It's a fantasy on the part of technology enthusiasts within the Pentagon." Quinn, however, disagrees. The ONR is not known for wasting research dollars, he says, and what it funds usually happens – even if it is 10 years away. "Recognition technology is progressing fast. I think it will separate the wheat from the chaff," he predicts.
THIS ARTICLE APPEARS IN NEW SCIENTIST MAGAZINE ISSUE: 23 SEPTEMBER 2006
Author: Paul Marks, New Scientist Technology Correspondent