A new study published in the Journal of Experimental Psychology: Applied has found that giving casino slot machines humanlike properties can increase one’s willingness to interact with them, thereby promoting riskier gambling behavior. Apparently something with the appearance of a mind is more welcoming than a lifeless machine.
This finding might be good news not just for the gambling industry, but also for scientists interested in designing more engaging machine systems—like assistive robots for the elderly. However, it also highlights a cognitive vulnerability that could easily be exploited by those who are intent on manipulating humans for financial gain.
This vulnerability stems from the fact that all people share the tendency to ascribe human characteristics to nonhuman agents and objects. This behavior is known as anthropomorphism: we anthropomorphize things by treating them as though they have intentions, mental states, emotions, and consciousness. We do this so effortlessly that some believe anthropomorphism is a trait that was selected for by evolution because of its usefulness as a predictive strategy. For example, treating approaching animals and venomous insects as if they possessed an intent to harm was a tactic necessary for survival. One can easily imagine how anthropomorphizing even nonliving things, like rocks falling in an avalanche or branches flying in a storm, would also have been beneficial.
Apple and Google, among other companies, have capitalized on our tendency to anthropomorphize. Lifelike automated systems like Siri, with its friendly but edgy personality, or the GPS service Google Maps, with its pleasant female voice, make us feel more at ease and engaged. Such examples suggest that humans are inclined to trust things with minds more than spiritless software.
As harmless as anthropomorphizing might seem, the authors of the gambling study, researchers at the University of Milano-Bicocca, hypothesized that in certain contexts, such as gambling, this mental habit could have distinctly undesirable psychological and behavioral consequences. Their theory was based on previous research showing that frequent gamblers tend to talk to slot machines, insulting the devices or attributing emotions to them. The investigators wanted to determine whether a link exists between slot machine anthropomorphism and gambling, and more specifically, whether humanizing a slot machine makes people more willing to gamble, and to gamble for longer.
Four studies were carried out in which participants were exposed to either an anthropomorphic or a regular description of a real online slot machine before interacting with it. Participants believed they were evaluating the program’s graphics for a marketing study, and were told they could quit whenever they felt they had formed a good impression of it. The anthropomorphic instructions explained how the slot machine worked by referring to it as “she,” while the nonanthropomorphic instructions called the machine “it.” Importantly, participants in both conditions were told that they didn’t need to use any type of strategy, as the outcomes would always be beyond their control. At the end, subjects were unexpectedly asked how much they agreed with 15 statements designed to assess their tendency to anthropomorphize the machine. The statements involved qualities like consciousness, intentions, and emotions; typical examples were “The slot machine has free will” and “The slot machine can be mean to me.”
The results showed a number of interesting things. Not only did anthropomorphizing slot machines increase gambling regardless of whether money was involved—it also increased emotional arousal, which led to gambling for longer and greater losses overall.
The authors interpreted this as evidence that people are more inclined to gamble when they think they are dealing with a mind and not a machine operating mathematically. It’s as though the brain has learned to automatically base its gambling behavior on one simple principle relating to the odds of achieving success: Minds are fallible where machines are not.
While these results are welcome news for those who want to design robotic systems that engage people more effectively and thereby aid them better, they also have some scary implications. They strongly suggest that any lifeless object, if imbued with the right features, can take on the form of an intentional, moral agent in one’s perception. This could yield more enticing gambling machines that hurt people and their families not only economically but psychologically as well.
Additionally, someone with less-than-noble intentions could apply the same anthropomorphizing strategies in unexpected ways. For example, humanlike machines or programs could be more effective at convincing people to give out personal information. Weaponized robots that appear to have minds and morals might be able to better engage their targets and even gain their trust.
So when creating technology that interacts with people, scientists and engineers must carefully consider how human to make it.