On Reddit on Wednesday, several U.S. veterans mourned their dead. One remembered “Boomer”: “Those goddamn Mahdi Army scum took him from this world far too early,” the poster wrote. Another responded, “I am sorry for your loss,” and then described his own fallen comrade’s “full burial detail with 21 gun salute.” The exchange would be somber, grim, and sadly unremarkable, except for one key detail: The two “lost” soldiers weren’t soldiers at all, but robots.
As the Iraq and Afghanistan conflicts have unfolded, the military has been expanding its use of robots on the battlefield. Often, these mechanical helpmates are deployed to carry out high-risk tasks related to the inspection, detection, and defusing of explosives. Their benefits are obvious: They save human lives, cannot be harmed by biological or chemical weapons, and don’t get tired or emotional. But are soldiers becoming too invested in their AI buddies? And could such sentimental attachment cloud their decision-making?
Julie Carpenter, who recently earned a Ph.D. in education from the University of Washington, will explore this question in an upcoming book about human/robot relations. She interviewed 23 explosive ordnance disposal personnel—22 men and one woman—who worked with robot sidekicks, looking at how they imagined the bots in relation to themselves. She found that some troops anthropomorphized their machines, assigning them names (at times painting them on), genders, and even personalities. And while the soldiers denied that affection for the bots colored their combat strategy, they reported feeling sad and angry when the equipment was destroyed.
“They [soldiers] were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet,” Carpenter told UW Today. What’s more, “some operators reported they saw their robots as an extension of themselves.”
The implications of this otherwise-sweet trend are worrying. As the military develops robots that increasingly resemble people or animals—seeking physiologies that can “climb stairs and maneuver in narrow spaces and on challenging natural terrain”—emotional transference seems ever more likely. And if troops care too much about the bots to put them in danger, that hesitance could compromise outcomes in the field.
Rarely does empathy, especially in the armed forces, become a risk rather than a blessing. But even those who are understandably saddened by Boomer’s loss should remember that “he” “died” so that real men and women didn’t have to.