Washington Post
May 6, 2007
Pg. D1
In the Field of Battle (Or Even Above It), Robots Are a Soldier's Best Friend
By Joel Garreau, Washington Post Staff Writer
The most effective way to find and destroy a land mine is to step on it.
This has bad results, of course, if you're a human. But not so much if you're a robot and have as many legs as a centipede sticking out from your body. That's why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however -- an Army colonel -- blew a fuse.
The colonel ordered the test stopped.
Why? asked Tilden. What's wrong?
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
* * *
The wars in Afghanistan and Iraq have become an unprecedented field study in human relationships with intelligent machines. These conflicts are the first in history to see widespread deployment of thousands of battle bots. Flying bots range in size from Learjets to eagles. Some ground bots are like small tanks. Others are the size of two-pound dumbbells, designed to be thrown through a window to scope out the inside of a room. Bots search caves for bad guys, clear roads of improvised explosive devices, scoot under cars to look for bombs, spy on the enemy and, sometimes, kill humans.
Even more startling than these machines' capabilities, however, are the effects they have on their friendly keepers who, for example, award their bots "battlefield promotions" and "purple hearts." "Ours was called Sgt. Talon," says Sgt. Michael Maxson of the 737th Ordnance Company (EOD). "We always wanted him as our main robot. Every time he was working, nothing bad ever happened. He always got the job done. He took a couple of detonations in front of his face and didn't stop working. One time, he actually did break down in a mission, and we sent another robot in and it got blown to pieces. It's like he shut down because he knew something bad would happen." The troops promoted the robot to staff sergeant -- a high honor, since that usually means a squad leader. They also awarded it three "purple hearts."
Humans have long displayed an uncanny ability to make emotional connections with their manufactured helpmates. Car owners for generations have named their vehicles. In "Cast Away," Tom Hanks risks his life to save a volleyball named Wilson, who has become his best friend and confidant. Now that our creations display elements of intelligence, however, the bonds humans forge with their machines are even more impressive. Especially when humans credit their bots with saving their lives.
Ted Bogosh recalls one day in Camp Victory, near Baghdad, when he was a Marine master sergeant running the robot repair shop.
That day, an explosive ordnance disposal technician walked through his door. The EODs, as they are known, are the people who -- with their robots -- are charged with disabling Iraq's most virulent scourge, the roadside improvised explosive device. In this fellow's hands was a small box. It contained the remains of his robot. He had named it Scooby-Doo.
"There wasn't a whole lot left of Scooby," Bogosh says. The biggest piece was its 3-by-3-by-4-inch head, containing its video camera. On the side had been painted "its battle list, its track record. This had been a really great robot."
The veteran explosives technician looming over Bogosh was visibly upset. He insisted he did not want a new robot. He wanted Scooby-Doo back.
"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
The bots even show elements of "personality," Bogosh says. "Every robot has its own little quirks. You sort of get used to them. Sometimes you get a robot that comes in and it does a little dance, or a karate chop, instead of doing what it's supposed to do." The operators "talk about them a lot, about the robot doing its mission and getting everything accomplished." He remembers the time "one of the robots happened to get its tracks destroyed while doing a mission." The operators "duct-taped them back on, finished the mission and then brought the robot back" to a hero's welcome.
Near the Tigris River, operators even have been known to take their bot fishing. They put a fishing rod in its claw and retire back to the shade, leaving the robot in the sun.
Of the fish, Bogosh says, "Not sure if we ever caught one or not."
'Sort of Alive'
What do you mean "robot"?
Does a machine have to declare independence from its humans to qualify? In 2005, four bots competing in the Defense Advanced Research Projects Agency (DARPA) Grand Challenge successfully traversed 132 miles of the Mojave Desert all by themselves.
Most, however, are more tightly connected to their humans.
The American military and paramilitary intelligence forces are legendarily skittish about fielding an intelligent weapon that's entirely autonomous. They like having a human in the decision-making loop. If anything goes wrong -- if a Boy Scout troop were to be mistaken for an al-Qaeda cell -- they want to have a person to blame. They hate explaining that the robot had a glitch in its algorithm.
So where does the air vehicle called the Predator fit? It is unmanned, and impressive. In 2002, in Yemen, one run by the CIA came up behind an SUV full of al-Qaeda leaders and successfully fired a Hellfire missile, leaving a large smoking crater where the vehicle used to be. Was this the first bot to incinerate Homo sapiens? It is an artificially intelligent machine. But a remote human told it to fire the missile. So can it be said that we now actually have murderous robots? Reasonable people differ. The fellows in the SUV, of course, might find these distinctions overly fine.
More significant than autonomy, thinks Rodney Brooks, may be the way humans have evolved to recognize instantly when an entity behaves like it's alive -- "animate" is the word he uses. Brooks is director of the MIT Computer Science and Artificial Intelligence Laboratory, co-founder and chief technology officer of the pioneering firm iRobot and author of "Flesh and Machines: How Robots Will Change Us."
What the battle bots are teaching us is how easily we identify our own creations as animate.
Digital pets like the Tamagotchi or the Furby, designed to be cute, have long caused children to make spooky levels of connection. Sherry Turkle, founder of the MIT Initiative on Technology and Self, quotes kids describing intelligent machines as "sort of alive."
Robots at MIT with fanciful names like Cog and Kismet are intentionally built to display what look like emotions. Kismet can listen, and speak with expression. Its cartoonish eyes, ears, eyebrows, eyelids and lips move to create facial expressions that make it appear to be happy, sad, disgusted, calm, interested, angry or surprised.
Humans respond so readily to Kismet, created by Cynthia Breazeal, that graduate students working in the lab at night have been known to put up a curtain between themselves and the bot, Brooks reports. They couldn't stand the way it seemed to gaze around and stare at them. It broke their concentration. These humans are as sophisticated about robots as anyone on Earth. Yet even they are freaked by Kismet's lifelike behavior. "We're programmed biologically to respond to certain sorts of things," Brooks explains.