I was fifteen years old when the film The Terminator was released, and it offered just about everything a fifteen-year-old boy in the 1980s could ever wish for in a film: a mind-bending, time-warping science fiction plot; a massively powerful, ruthless, motorcycle-riding, robotic bad guy; a clever, heroic, and thoroughly human good guy; and of course a hot but resourceful female love interest. Like most adolescent boys, I was unsure which male character I would rather be; you are supposed to identify with the hero, but there was something kind of alluring about being a robot who could not be hurt, either physically or emotionally.
But the terminator in the first film was pretty frightening. It was so…well…inhuman. Like a runaway computer program, it pursued its goal without concern and without emotion. Although it had adopted a humanoid form, one could not relate to it like a human. It had absolutely no empathy. The terminator was strictly an “it.”
When the second film came out in 1991, it was hailed for its cutting-edge computerized graphic effects, but the graphics were not the only thing that had significantly changed (spoiler alert!). In a clever twist, a terminator with the same humanoid form shows up, but this time it has been sent to protect the young future leader from a new, more advanced model. The young future leader quickly learns to trust the good terminator and begins to relate to it as if it were human. In classic Star Trek fashion, the terminator begins to express more human traits, including a form of empathy and self-sacrifice. The film concludes with a tearful farewell scene, in which the young future leader cries over the destruction of the good terminator just as he would over a fallen human friend. The machine had become a kind of person, a sort of “thou.”
This shift in human-machine relations is emblematic of what Sherry Turkle discusses in the first section of her new book, entitled Alone Together: Why We Expect More from Technology and Less from Each Other. Turkle is a psychologist by training who has spent the last thirty years or so investigating the way humans relate to technology, especially computerized communication media. In this recent book, however, she spends the first section discussing various kinds of artificial intelligence and robotic toys: Tamagotchis, Furbies, AIBOs, and My Real Babies. She notes that children relate to these sociable toys in different ways than kids of my generation related to our Merlin, Simon, and Speak & Spell devices.
When children of my generation encountered these early “computational objects,” we were challenged to decide what exactly these new things were. Was the Speak & Spell just a noisy new kind of toy, or was it somehow intelligent, or even “alive”? After all, it could ask me questions, respond to my answers, and beat me at spelling games, just like my mother could. But it was also a bit like the original terminator; its voice and mannerisms were highly mechanical, and it had a very limited repertoire of interaction. Before long, I felt no remorse when I turned it off and tossed it aside.
In the 1990s, children began to encounter a decidedly different sort of toy: one that not only seemed to think, but also to move and relate to them like a fellow creature. Despite the fact that Tamagotchis had only a digital manifestation, children felt real remorse when their Tamagotchis died, and would often “bury” them and buy new ones rather than simply reset their current one. When the Furby came out, the creature was given not only a physical manifestation but also a voice, one that initially spoke “Furbish.” Children were encouraged to teach their Furby English, and amazingly the Furby seemed to respond to the teaching; in actuality, the Furby was pre-programmed to shift gradually to English no matter what happened, but the illusion helped the child bond with the toy in a way that went beyond the typical child-doll relationship. Children who “raised” these new social toys considered them to be “kind-of alive,” something more than a toy, perhaps closer to a pet.
Turkle observed this new classification first-hand through an experiment designed by Freedom Baird, a graduate of the MIT Media Lab. Baird developed a sort of Turing Test for the heart (see previous post), which was designed to determine “under what conditions a creature is deemed alive enough for people to experience an ethical dilemma if it is distressed” (loc 1062). Baird had her participants hold a Barbie doll, a Furby, and a biological gerbil upside down for as long as their emotions would allow them to do so. None had trouble dangling Barbie upside down, and nearly all released the gerbil as soon as it showed signs of distress. When the participants flipped the Furby over, it began to whine and say that it was scared, causing most to feel guilty and turn it back upright within thirty seconds. The participants, many of whom were adults who fully understood that the Furby was just a robot, found it difficult to torment the toy because its cries made them think of it as a fellow creature.
But is there anything wrong with children, or even adults, relating to their technological devices like fellow creatures? Turkle thinks there is. In the opening of the book, she states her concerns clearly:
These days, insecure in our relationships and anxious about intimacy, we look to technology for ways to be in relationships and protect ourselves from them at the same time…. We bend to the inanimate with new solicitude. We fear the risks and disappointments of relationships with our fellow humans. We expect more from technology and less from each other (loc 153).
Turkle worries that these social robotic toys are only the beginning. What will happen when clever engineers develop “My Real Girlfriend”? Will socially awkward men prefer the company of a robotic girlfriend that is programmed to assert its will just enough to keep up the illusion?
Even if robotic girlfriends never come to pass, Turkle’s quote hints at another technologically mediated form of relationship with which most of us are already quite familiar: social network media. That is the focus of the second part of her book, and will likewise be the focus of my next post.
For now, here is the ending scene from Terminator 2: