Translated by Malcolm DeBevoise.
KASPAR (Kinesics and Synchronization in Personal Assistant Robotics) is a robot originally conceived as part of a research project begun in the late 1990s by artificial intelligence researcher Kerstin Dautenhahn and her collaborators at the University of Reading in England. Initially, the objective was to develop “robotic therapy games” to facilitate communication with autistic children and to help them interact with others. In 2005, now at the University of Hertfordshire, the KASPAR Project was formally launched with the aim of developing a “social” robot having two missions: first and mainly, to be a “social mediator” responsible for facilitating communication between autistic children and the people with whom they are in daily contact—other children (autistic or not), therapists, teachers, and parents—and, second, to serve as a therapeutic and learning tool designed to stimulate social development in these children. The objective was to teach young people with autism a variety of skills that most of us master, more or less fully, without any need for special education: understanding others’ emotions and reacting appropriately, expressing our own feelings, playing in a group while letting everyone take turns, and imitating and cooperating with others. The idea of using playmate robots for therapeutic purposes came from a well-attested observation in the literature on children with autism: Early intervention can help them acquire cognitive and social skills they would otherwise be incapable of developing.
This therapeutic and educational project requires a robotic partner whose social presence is at once obvious and reassuring, because its behavior is easily anticipated and understood. KASPAR is a humanoid robot the size of a small child of about 3 years old. Its physical appearance, in keeping with the usual interpretation of the “uncanny valley” effect, is not overly realistic. Dautenhahn and her team achieved a reduction in the complexity of social communication, while also managing to avoid the complications created by excessive likeness, through an extreme simplification of facial features. The robot’s face is a skin-colored silicone mask devoid of the details that normally make it possible to determine age, gender, emotional intensity, and so forth. On the one hand, this deliberate lack of definition gives free rein to the child’s imagination, allowing him to think of KASPAR as a playmate, or at least as someone he feels comfortable being with. On the other hand, it gives KASPAR’s designers considerable leeway in engineering (and later programming) customized versions to suit a variety of needs.
Children with autism interact quite readily with KASPAR from the first meeting. In its present version, the robot is dressed as a little boy. It is capable of moving its torso, arms, and head. It can also open and close its mouth and eyes. This restricted range of movements gives KASPAR “minimal” emotional expressiveness, uncomplicated and easy to interpret. Taken together, the movement of eyes and arms, posture, and voice permit it to express several basic emotions: joy, sadness, surprise.
But KASPAR is not an autonomous robot. An operator controls its movements and speech using what is known as the “Wizard of Oz” technique—it is piloted by a person monitoring the interaction. Its usefulness as a social mediator and as an instrument of therapy therefore depends on human intervention. KASPAR succeeds in getting children with autism to take part in a wide range of interactive games that are not usually accessible to them because they involve activities such as imitation, turn taking, and shared gaze. In this way, KASPAR’s principal role is as a social mediator in the relationship between an autistic child and a therapist, a teacher, or other children. Studies suggest that KASPAR’s expressive minimalism, implying an extreme simplicity of interpretation, furnishes these children with a sufficiently predictable and reassuring social context within which they are able to play with others and try new things.
One such application, made possible by the robot’s touch-responsive epidermal covering, RoboSkin, involves a game that teaches the child to exert an appropriate degree of force when interaction brings her into physical contact with others. Some children with autism who exhibit tactile hypersensitivity or hyposensitivity find it difficult to properly modulate the strength they bring to bear while playing. Here again, interaction with KASPAR furnishes such children with a protected environment that is easily understood and reassuring. When a child fails to correctly judge the amount of force that should be used, the interaction nonetheless continues without interruption and the child is not made to feel rejected. Instead, KASPAR sends a clear message—“Ouch! That hurts!”—without getting angry and ejecting the child from the game, as other children often do.
KASPAR is often used with “high-functioning” autistic children as well. To be able to join in playing a video game, for example, the child with autism must imitate the movements that the robot executes. In this way, thanks to KASPAR’s mediation, he learns to cooperate and to take turns playing the game with a neurotypical child. The robot can also be used to help children with autism discover their body image, following its example. By touching and naming this or that part of its humanoid body—the nose, the ear, the arm—the robot teaches the child to do the same thing.
That children feel comfortable and reassured during this kind of robotic interaction is also the basis for a number of recent developments, among them the suggestion that KASPAR might be used by police for questioning children who are victims of abuse or surviving witnesses of accidents or criminal acts. These children often find it difficult, or are afraid, to speak candidly to an adult. According to the Metropolitan Police in London, they frequently do not furnish useful leads, and they sometimes give false or misleading information. The guiding assumption is that when interrogators, even if they are trained social workers, hear certain accounts of abuse, they find it very hard not to transmit nonverbal signals that the child finds disconcerting, causing him to hesitate and either stop talking or change his story. A robot, it is thought, may be perceived by the child as a more neutral and less threatening conversational partner, making it possible for him to treat a delicate situation as a sort of game.
This proposal raises an ethical question about the potential uses of robots, though, one that is often mischaracterized. During a police interrogation using KASPAR technology, a child would not be aware that she is actually speaking to a human being controlling the robot, and the interaction would serve the express purpose of inducing her to give information that she might not give otherwise. With this use of KASPAR, the operator would be more directly fooling the child, making her believe that she is speaking to a robot when in fact a human being is listening in and doing the talking. But this deception nonetheless assumes a form opposite to the one for which robots are usually reproached. Robots are typically accused of having false emotions, pretending to have emotions like those of human beings without the corresponding internal state. Here, to the contrary, the idea behind this proposed use of KASPAR is to reassure the child by making her believe that she is dealing simply with a robot. If the child is fooled, then, it is not by a robot, not by a machine, but by adults, by other human beings.
The moral issues raised by the uses of a robot like KASPAR, then, should be wholly separate from the perennial question of whether robots’ emotions are true or authentic. Interactions with emotive or empathic robots will amount to sharing with them an experience that is more or less similar to the ones we have with pets or that a child has with a stuffed toy animal. These relationships are not artificial. Nor are they false, though they may very well be unbalanced, confused, or perverse. Whether or not these relationships are altogether healthy, it is never a question of being fooled by our dog or our teddy bear. Childless people who leave all their money to their cats are not really victims of feline cunning.
The KASPAR model’s undoubted successes expose the shortcomings of most current ethical thinking about the use of robots in educational institutions, hospitals, and specialized centers for children and adults with particular disabilities. Social robots are often thought of as a way of delegating the obligations associated with such care to machines, rather than taking responsibility for them ourselves. Robotic interaction partners such as KASPAR do not replace human beings. They only support them in providing aid and treatment.
The effectiveness of all the learning experiences KASPAR provides to autistic children proceeds from the fact that the robot offers the child a relaxed atmosphere in which, unlike the social situations she is used to, she is protected from the awkward and often distressing consequences of her many errors of interpretation. The robot never reacts in a reproving or dismissive way when the child behaves inappropriately. Instead, it gently corrects her while at the same time providing reassurance, by means of firm and unsurprising responses that are unlikely to be misunderstood and that encourage the child to persevere in the difficult work of learning social skills.
The ability to give both comfort and cheer—feelings we experience in the company of pets as well—is common to many robots used in a therapeutic setting, not only ones such as Paro, a robotic stuffed animal that serves as a “therapeutic companion” to those suffering from dementia, but also robots designed to help people who have suffered a stroke or other cerebrovascular accident, or any injury that suddenly compromises one’s ability to perform elementary daily tasks. In fact, in circumstances where motor and cognitive rehabilitation is inseparable from social rehabilitation, several studies show that many patients prefer to perform therapeutic exercises under the supervision of a robot rather than a human nurse, whose presence is apt to disturb or embarrass them in trying moments, when they prefer not to have to compare themselves to other people.
These robots are at their most effective as substitutes for human partners—intrinsically temporary substitutes, designed to make it easier to establish social ties with those who, for one reason or another, have a hard time fitting into their social environment. And the empathetic responses that arise from these experiences, what the filmmaker Phie Ambo has called “mechanical love,” cannot simply be seen as false emotions, or emotions that have a positive effect only by manipulating and deceiving robots’ human partners. The real emotions that emerge and develop in the course of these interactions produce real social responses, making it easier for those who have trouble interacting with others to enter into a social ecology—a human world that otherwise would, for the most part, be closed to them. As imperfect as they still may be, these robots give us at least a glimpse of what an ideal artificial social agent might be like: machines that are capable of assisting in the continuous coordination of human social relations, that will integrate themselves into a complex ecology of mind, each in its own distinctive way.
Extract edited from Living with Robots by Paul Dumouchel and Luisa Damiano, published by Harvard University Press. Copyright © 2017 by the President and Fellows of Harvard College. Used by permission. All rights reserved.