The beastly Terminator T-600 model is an eight-foot-tall brute, armed to the teeth and wrapped in rubber skin. Easy to spot at close range, the T-600s use their somewhat human-like appearance to get high-caliber weapons into striking range.
Walking around with damaged rubber skin, the T-600s look like extras from a George Romero zombie movie.
You’re probably more familiar with the T-800 models – machines encased in living tissue indistinguishable from human beings – famously played by Arnold Schwarzenegger in leather jacket and shades in the 1984 classic, The Terminator.
Well… he’s back. The Governator’s face has been digitally added to the latest installment in the franchise, Terminator Salvation, so that he once again portrays the first series of T-800s through the magic of CGI.
The twisty plot lines of the first three Terminator movies involve both time travel and timeline alteration. The terminators – machines directed by the self-aware AI (artificial intelligence) computer network Skynet – have the sole mission to completely annihilate humanity. A man named John Connor starts the Tech-Com resistance to defeat them and free humankind.
Of course the machines are evil. And of course we fear for John Connor’s life as he tries to save us and our progeny from a robotic war of annihilation. Such is the logic of Hollywood.
Or… do we need to rethink this? The trailers for Terminator Salvation allude to a new character, Marcus Wright. He’s a stranger whose last memory is of being human on death row. He starts to raise questions about the possibility of being “human” while encased in robotic terminator armor.
In the new movie, this terminator-like bot with human memories may hold the key to the salvation of humankind. This puts a new spin on the popular notion of evil robots at war.
Are our fears of evil robot uprisings with zombie-like T-600s justified? What are the real-world moral implications of using bots to fight a “just war” – for example, if terminators had been around to help defeat Adolf Hitler during World War II?
Is “terminator salvation” simply an ironic contradiction in terms, an oxymoron? Or can bots be programmed to make morally responsible decisions in war?
Robots in War: Today’s Reality
Amy Goodman reported that three days after President Obama took office, an unmanned U.S. Predator drone fired missiles at houses in Pakistan’s Administered Tribal Areas. Twenty-two people were reported killed, including three children.
According to a Reuters poll, the U.S. has carried out thirty such drone attacks on alleged al-Qaeda targets inside Pakistan since last summer, killing some 250 people.
There has also been a dramatic increase in the use of ground robotics. When U.S. forces went into Iraq in 2003, they had zero robotic units on the ground. Now there are as many as 12,000.
Some of the robots are used to dismantle landmines and roadside bombs, but a new generation of bots is designed to be fighting machines. One bot, known as SWORDS, can operate an M-16 rifle and a rocket launcher.
In the new Terminator movie, the fictional Skynet computer network directs a variety of hunter-killer robots: aerial and land-based drones, as well as motorcycle-like Mototerminators, serpent-shaped Hydrobots, and the terrifying and gigantic Harvesters.
Alarmingly, many of these bots exist in some form today — drones like Predator and Reaper, the ground-based TALON, iRobot’s PackBots, and Boston Dynamics’ BigDogs.
P.W. Singer, author of Wired For War, who advised President Obama on science during the 2008 campaign, believes that we are witnessing the dawn of the robot warrior age. (See R.U. Sirius’ upcoming interview with Peter Singer, later this week.)
“Just look at the numbers,” says Singer. “We went into Iraq in 2003 with zero robots. Now we have 12,000 on the ground. They come in all shapes and sizes, from tiny machines to robots bigger than an 18-wheeler truck.”
“There are ones that fit on my little finger and ones with the wingspan of a football field.”
You can find many of them on YouTube. Parental guidance advised:
BigDog – With a built-in computer that controls locomotion, BigDog is equipped with sensors that help it adapt to varying conditions. The sensors provide stereo vision and measure joint force, joint position, and ground contact, aiding continuous movement. Most importantly, the bot carries a laser gyroscope that helps it keep its balance under extreme conditions. BigDog, still in the prototype phase, can maintain its balance while packing a payload of up to 340 pounds over inhospitable terrain.
PackBot – About the size of a lawn mower, the PackBot mounts cameras and sensors, as well as a nimble arm with four joints. It moves using four “flippers” – tiny treads that can also rotate on an axis, allowing the small bot not only to roll forward and backward using the treads as a tank would, but also to flip its tracks up and down (sort of like a seal in motion) to climb stairs, rumble over rocks, squeeze down twisting tunnels, and even swim underwater.
TALON – Made by Foster-Miller Inc., whose offices are a few miles from those of the better-known robotics company iRobot, the TALON has been remodeled into a “killer app,” the Special Weapons Observation Reconnaissance Detection System, or SWORDS. The new design allows users to mount different weapons on the bot – an M-16 rifle, a machine gun, or a grenade or rocket launcher – and to swap them out easily.
MARCbot (Multi-Function Agile Remote-Controlled Robot) – One of the smallest but most commonly used robots in Iraq, the MARCbot looks like a toy truck with a video camera mounted on a tiny antenna-like mast. Costing only $5,000, this minuscule bot is used to scout for enemies and to search under cars for hidden explosives.
Predator – At 27 feet in length, this propeller-powered drone is just a bit smaller than a Cessna plane. Perhaps its most useful feature is that it can spend up to 24 hours in the air, at heights up to 26,000 feet. When the drone flies out of bases in the war zone, the human pilot and sensor operator are 7,500 miles away, flying the planes via satellite from a set of converted single-wide trailers located mostly at Nellis and Creech Air Force bases in Nevada.
Raven – Just over three feet long (there is an even smaller version called Wasp that carries a camera the size of a peanut), these little bots are tossed into the air by individual soldiers and fly just above the rooftops, transmitting video images of what’s down the street or on the other side of the hill. Medium-sized drones such as the Shadow circle over entire neighborhoods, at heights above 1,500 feet, to monitor for anything suspicious.
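The BigDog entry above credits a laser gyroscope with keeping the robot upright under extreme conditions. BigDog’s actual control software is proprietary and far more sophisticated; as a rough illustration, a textbook proportional-derivative (PD) correction driven by gyroscope readings captures the basic idea (the function name, units, and gains here are hypothetical, not taken from the real robot):

```python
def gyro_balance_correction(tilt_deg: float, tilt_rate_dps: float,
                            kp: float = 2.0, kd: float = 0.5) -> float:
    """Corrective command computed from gyroscope readings.

    A proportional-derivative (PD) rule: push back against the current
    tilt (the kp term) and damp how quickly the tilt is changing (the
    kd term). A positive tilt yields a negative, opposing correction.
    """
    return -(kp * tilt_deg + kd * tilt_rate_dps)
```

For example, a 10-degree lean with no angular velocity yields a correction of -20.0 under the default gains; a real legged robot would feed a signal like this into per-leg force control hundreds of times per second.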
The U.S. military is the biggest investor in robot soldiers. The Army’s Future Combat Systems program was budgeted at $240 billion over the next 20 years, but Secretary of Defense Robert Gates recently decided to cut $160 billion from it. Ever-resourceful Army planners and defense contractors are looking for ways to cannibalize parts of the program to keep them going on a smaller budget.
Singer is worried that in the rush to bring out ever more advanced systems, many lethal robots will be rolled out before they are ready.
It’s a chilling prospect. “Imagine a laptop armed with an M16 machine-gun,” says Noel Sharkey, a professor of robotics and artificial intelligence at Sheffield University. One of the biggest concerns is that this growing army of robots could stray out of communication range.
“Just imagine a rogue robot roaming off the battlefield and into a nearby village,” he says. “Without experts to shut it down, the results could be catastrophic.”
Robots in War: When Robots Decide for Themselves
What happens when robots decide what to do on their own? One nightmare real-life incident was recently reported in the Daily Mail.
“There was nowhere to hide,” one witness stated. “The rogue gun began firing wildly, spraying high explosive shells at a rate of 550 a minute, swinging around through 360 degrees like a high-pressure hose.”
A young female officer rushed forward to try to shut the robotic gun down – but it was too late. “She couldn’t, because the computer gremlin had taken over,” a witness later said.
The rounds from the automated gun ripped into her and she collapsed to the ground. By the time the robot had emptied its magazine, nine soldiers lay dead (including the woman officer). Another 14 were seriously injured. A government report later blamed the bloodbath on a “software glitch.”
The robotic weapon was a computer-controlled MK5 anti-aircraft system, with two huge 35mm cannons. The South African troops never knew what hit them.
Ultimately, the complexity of coordinating an attack using advanced autonomous robotics like the MK5 will require a sophisticated computer network. The Terminator films depict the fictional Cyberdyne Systems Corporation in Sunnyvale, California, which develops the Skynet network of AI supercomputers.
Skynet initially replaces human beings as commercial and military aircraft pilots, but ultimately takes control of all other military weapons systems, including nuclear missiles and terminators. This leads to nuclear “Judgment Day” when a self-aware Skynet decides that humans are in the way.
Here’s another frightening real-world prospect: the U.S. military is currently in the process of developing a network of supercomputers as part of the Army’s Future Combat Systems (FCS) program. As the lead systems integrator for the FCS, The Boeing Company has a larger role than most prime contractors have had on previous defense projects. While the Army selected General Dynamics and BAE Systems to make robotic ground vehicles, Boeing received a contract award for the program’s computer network.
The ground vehicles include an array of infantry carriers and reconnaissance, medical, command, and combat vehicles. The Army is evaluating the computer network as part of a revised, scaled-back plan due in September.
The Department of Defense (DoD) is also financing studies of autonomous, or self-governing, armed robots that could find and destroy targets on their own. On-board computer programs, not flesh-and-blood people, would decide whether to fire their weapons. “The trend is clear – warfare will continue and autonomous robots will ultimately be deployed in its conduct,” says Ron Arkin, a robotics expert at the Georgia Institute of Technology in Atlanta. Arkin advocates the development of an ethical guidance system or “ethical governor” akin to the governors used to control steam engines.
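Arkin’s published “ethical governor” architecture is far more elaborate than any few-line sketch, but the steam-engine analogy is instructive: like a mechanical governor, the ethical layer can only suppress an action, never initiate one. The toy sketch below illustrates that gating idea; every name, field, and threshold in it is a hypothetical illustration, not Arkin’s actual design:

```python
from dataclasses import dataclass

@dataclass
class Target:
    """Hypothetical perception output for a potential target."""
    is_combatant: bool                # classified as a combatant?
    classification_confidence: float  # 0.0 to 1.0
    near_protected_site: bool         # e.g. hospital or school nearby

def ethical_governor(target: Target, min_confidence: float = 0.95) -> bool:
    """Return True only if every hard constraint permits engagement.

    The governor can only veto; the tactical logic that *wants* to
    fire lives elsewhere and must ask this layer for permission.
    """
    if not target.is_combatant:
        return False        # never engage non-combatants
    if target.classification_confidence < min_confidence:
        return False        # uncertainty forbids engagement
    if target.near_protected_site:
        return False        # protected sites are off-limits
    return True

def fire_if_permitted(target: Target) -> str:
    """Final gate: consult the governor last, after all other logic."""
    return "ENGAGE" if ethical_governor(target) else "HOLD FIRE"
```

The design choice that matters is asymmetry: the governor sits between decision and actuation and can only return “hold fire,” which is exactly what Sharkey doubts can work in practice, since the hard part is not the gate itself but the perception that fills in fields like `is_combatant`.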
Autonomous armed robotic systems probably will be operating by 2020, according to John Pike, an expert on defense and intelligence matters and the director of the security Web site GlobalSecurity.org in Washington.
Sheffield University’s Noel Sharkey says, “We are sleepwalking into a brave new world where robots decide who, where and when to kill.”
Sharkey, a critic of autonomous armed robots, says that Arkin’s ethical governor is “a good idea in principle… Unfortunately, it’s doomed to failure at present because no robots or AI systems could discriminate between a combatant and an innocent, that sensing ability just does not exist.”
Selmer Bringsjord, an artificial intelligence expert at Rensselaer Polytechnic Institute in Troy, N.Y., is worried as well.
“I’m concerned. The stakes are very high,” says Bringsjord. “If we give robots the power to do nasty things, we have to use logic to teach them not to do unethical things. If we can’t figure this out, we shouldn’t build any of these robots.”
Philosopher Peter Asaro at Rutgers University also worries about the ethics of robot warfare. In a conversation with R.U. Sirius, Asaro discusses the implications of Ron Arkin’s robotics work at Georgia Tech. “He thinks we can actually make robots super-moral, and thereby reduce civilian casualties and war crimes,” says Asaro.
Robotics expert Arkin envisions an architecture for lethal robots that allows them to question their orders.
Asaro is a bit more hard-headed about this. He rightly points out that it would be “a hard sell” to convince the military to build robots that might disobey orders. “But they actually do tell soldiers to disobey illegal orders.”
“I don’t think we are likely to see this capability in robots any time soon,” he concludes.
Peter Asaro doesn’t see bots replacing humans – he sees them more as a way to reduce the number of soldiers needed to fight a war. “I don’t see them improving the capabilities of the military, but rather just automating them.”
In spite of recent budget cuts, it seems clear that the Pentagon is moving ahead with robotic warfare including a Skynet-like network of supercomputers to direct robotic operations.
Will we see something akin to the zombie-like T-600s or Schwarzenegger-in-leather T-800s in the next 10 years? Probably not. However, with the rapidly increasing sophistication of technology like PackBot and BigDog, it’s not unlikely that we’ll see bipedal humanoid-like bots on the battlefields of the near future.
Perhaps Ron Arkin is right when he asserts that we can make robots super-moral, and thereby reduce civilian casualties and war crimes. Maybe – just maybe – the fictional robotic Marcus Wright in the new Terminator movie, with human memories and compassion, suggests a way forward.
We’re clearly not there yet. With deals like the U.S. Army’s recent $16.8 million contract with iRobot for the PackBot 510 series, we will likely see more “software glitches” like the one reported in South Africa before we achieve a super-moral bot, if such a thing is possible. Meanwhile, let’s hope that the chances of a Judgment-Day-like glitch in the Army’s upcoming FCS program are nil, or at least very slight.
And we can also hope – or maybe insist – that robotic ethical guidance systems are fully funded and field-tested before more sophisticated AI systems are brought online to direct battlefield operations.
A future human soldier’s salvation may one day depend on a terminator. As actress Moon Bloodgood says to Christian Bale in Terminator Salvation, “He saved my life. I saw a man, not a machine.”