Monthly Archives: October 2011

On Monster Stories

In a few days it will be Halloween, a time when we dress up as, or tell stories about, monsters and other things that are meant to scare us. But have you ever wondered why we tell monster stories? They are entertaining for sure, but is there a deeper reason why we like to flirt with these frightening tales of inhuman creatures?

There are, of course, many different kinds of monster stories; different categories of spookiness, if you will. There are the “beast within” stories that describe the monster that lurks inside all of us (werewolves, Dr Jekyll and Mr Hyde, the Hulk). There are the stories about vampires, who both repulse and attract us, tempting us to join them and succumb to our latent, unbridled sensuality (e.g., Dracula, the Twilight series). There are the stories about ghosts, spirits, and other supernatural monsters, against which we feel completely helpless despite our advanced scientific knowledge and powerful technologies. Then there are the zombies, those inarticulate and slightly uncoordinated undead, who remind us of the fate that eventually awaits us all (my favorite being Shaun of the Dead). And then there are the unholy products of human creation: Golems, Frankenstein’s monster, the Terminator, the Cylons, and all manner of robots, cyborgs, and androids.

It is this last category of monsters that I want to focus on today: the unholy products of our secondary creation. We might call these “Promethean monsters” in reference to the Greek Titan Prometheus who not only gave mortal humans the first and most elemental of our technologies, domesticated fire, but also in some accounts played a role in the creation of humans by forming them out of clay.

These stories, I think, are interesting to examine for two reasons. First, they give us a window into understanding the source culture’s anxieties about scientific and technological change. The way these monsters are created, animated, and ultimately killed (or not killed) speaks volumes about the ways in which a culture is trying to grapple with the uncontrollable forces it feels it has unleashed upon itself.

For example, one of the inspirations for Mary Shelley’s Frankenstein story was a discussion she and her friends had about the new science of “galvanism,” which had discovered that electricity applied to the muscles of dead animals seemed to make them move as if they were alive again. Many at the time were postulating that a proper amount of electrical current applied to the human body might also bring it back to life, a prospect that no doubt sparked the imagination of young Shelley.

Promethean monsters from science fiction also provide this same kind of mirror to our own culture’s techno-scientific anxieties. The Terminator series reflected our worries over nuclear holocaust, computerized automation, networked computers, and the possibility of artificial intelligence. More recent films such as Splice express deep anxieties over the potential and ethics of biological engineering.

The second reason why I think it is important to pay attention to Promethean monster stories is that they also tend to reveal what the source culture thinks it means to be human, a person, or a child of God. Because these monsters are created by humans, and because they are often very human-like in appearance and behavior, they raise the question of what makes them different from ourselves. In these stories, the monsters act like a foil to humanity: a creature that is similar to its creator, but remains distinct in some specific way that highlights that essential quality we think makes us human.

For example, consider Philip K. Dick’s classic 1968 dystopian novel Do Androids Dream of Electric Sheep? Some of you may know the story from the movie Blade Runner, which was loosely based on the novel. Both the novel and the movie imagine a futuristic world in which a powerful corporation has developed a series of organically-grown androids that are virtually identical to humans. In the movie, they are known as “replicants,” which is such a good name that I’m going to use it even though it never appears in the novel (I’m sure Dick would have used it had he thought of it).

The brains of these replicants are designed by the brilliant and enigmatic head of the corporation, and in the latest models, the corporation has implanted false memories of parents and a childhood. These false memories cause the replicants themselves to assume that they are human, and since they are virtually indistinguishable, both physically and behaviorally, from their human creators, they are almost undetectable…except for one very interesting and provocative trait: the replicants are incapable of experiencing empathy.

In the novel, this lack of empathy acts like a foil to the religion (known as Mercerism) practiced by the humans still left on Earth. Practicing the religion consists of using an “empathy box” to join the religion’s hero (Wilbur Mercer) on his repetitive climb up a steep cliff. The empathy box allows the worshipers not only to join the plight of Mercer, but also to be connected in a kind of group consciousness with the remaining inhabitants of a post-nuclear holocaust Earth. By feeling the presence of other worshipers, they reach out to one another and are reassured that they are not alone.

In both the novel and the film, the protagonist, police detective Rick Deckard, becomes so calloused by his work and living situation that he begins to wonder whether he too might be a replicant. In order to outwit the more physically powerful replicants, he increasingly has to think like they do, which starts to wear away his ability to care about the feelings and needs of other humans. This culminates in him having an illicit affair with a replicant, choosing it over his human wife.

Dick, like Shelley, is playing with what it means to be alive, what it means to be a person, and just who in the relationship is the real “monster.” Empathy may be a uniquely human trait in Dick’s dystopian world, but it is a trait that must be practiced. When humans make choices that deny their inherent empathetic capabilities, they quickly become just like the monsters they oppose. In other words, being fully human is not something we just inherently enjoy; it is something that needs to be constantly lived into.

To end, here’s an incredibly provocative clip from the series Caprica, which is a prequel to the recent reboot of Battlestar Galactica. It raises all kinds of questions about our relationship with robotic secondary creations, whether the “differently sentient” would be due the same rights as a person, and most interestingly, whether they too could have a relationship with our creator God.

Technological Domestication

When the iPad was first introduced, I read every review of it I could find, but one of them has stuck with me more than the others. The reviewer likened the iPad to a new puppy, something that filled your life with love and joy, but also annoyed you as it chewed up your favorite slippers, shredded your pillow, and peed all over your new carpet. The reviewer was anxious for the iPad to transition into that good old dog who sat by your side, provided unwavering companionship, and behaved the way you wanted it to.

What I loved most about that review was how it perfectly captured one of my favorite concepts from media and technology studies: domestication. Metaphorically speaking, new technologies are similar to untrained puppies; they create chaos and upheaval in their owners’ lives when first introduced, but their owners typically respond by domesticating them: reshaping their behaviors, and sometimes even their physical attributes (e.g. neutering), so that they better fit the existing social order. A house with a dog is never the same as a house without one, but a well-domesticated dog bends as much to its owners as its owners bend to it.

Domestication theory, like it sounds, posits that technological adoption is an active process where designers, producers, marketers, and consumers struggle to work out what a new device or system actually is, and what it is good for. As opposed to the more traditional view where technologies enter the consumer space and are assumed to have one-way “impacts” on culture, domestication researchers stress the ways in which people wrestle with and often reshape technologies as they fit them into their everyday lives.

For example, consider the introduction of a television into a household. I’m just old enough that I remember the first time my parents brought home a large (maybe 15″) color television. Before that, we had a very small black-and-white television that we sometimes watched, but this new color set was the first real TV we ever had. Although the artifact itself carried with it some suggestions for how it should be used, it did not completely determine how we fit it into our lives. It had the look of a piece of furniture, so it could have fit well into our main living area, but my parents were the sort that wanted to relegate the TV to a separate, designated room. This placement sent the message to us boys that watching TV was something out of the ordinary, something to be done occasionally and purposefully.

My parents also carefully regulated what we watched on that television, and when we watched it. My brother and I desperately loved The Six Million Dollar Man, but we also quickly learned that we had to remain on our best behavior to watch it, as it aired just after our normal bedtime. Sadly, we missed many of the episodes due to our inability to resist fighting with one another, so I never did find out what happened when Steve Austin met the Sasquatch. Watching TV on a sunny day was also verboten; my mother was particular in her desire that we go outside and play whenever we had the chance to do so. Perhaps she just wanted to watch her own shows in peace….

Like all good parents, mine were also concerned about regulating the way in which we watched television: sitting too close to the set would reap condemnations and warnings that we’d soon go blind, which I’m guessing was a popular urban myth at the time. Sitting upside down on the couch, which seemed perfectly fun to us, was also never tolerated. If we were going to watch TV, we needed to watch it, not play around. All of this communicated that watching TV was serious business, and not something you did aimlessly while you played with other things.

My point is that while the physical artifact and the programming streamed through it suggested or even encouraged particular patterns of use, they did not entirely determine how that device was incorporated into my family’s home. My parents domesticated that television: our house was never the same after it was introduced, but the physical placement of the device, and the way in which our use of it was regulated, reshaped our understanding of what it was, and what it was good for.

So where was the TV in your childhood house, and what rules did your parents establish (or not establish) regarding its use? How are you actively domesticating new technologies that are entering your life today? Are your domestication efforts proving successful, or are your new devices metaphorically chewing your coffee table legs to bits?

The Artistry and Engineering of Steve Jobs

In response to the death of Steve Jobs earlier this week, there has been a virtual flood of great writing reflecting on the man himself, his accomplishments, or his influence on authors’ personal lives. I’ve enjoyed reading all this, but the one source that has caught my attention the most is an oral history interview that Steve Jobs did with the Smithsonian back in 1995.

Although the interview was conducted while he was at NeXT (after he had been forced out of Apple and before he returned), Jobs was asked to reflect a little on his time at Apple. He started by describing what it was like to work there in the early years:

Apple was this incredible journey. I mean we did some amazing things there. The thing that bound us together at Apple was the ability to make things that were going to change the world. That was very important. We were all pretty young. The average age in the company was mid-to-late twenties. Hardly anybody had families at the beginning and we all worked like maniacs and the greatest joy was that we felt we were fashioning collective works of art much like twentieth century physics. Something important that would last, that people contributed to and then could give to more people; the amplification factor was very large.

Notice how he described the way they thought about what they were doing: “…we felt we were fashioning collective works of art much like twentieth century physics.” For Jobs, there was little distinction between building computers, practicing science, and creating art. The interviewer picked up on this, and asked him to explain why he used the word ‘art’ instead of ‘engineering’. Jobs replied:

I actually think there’s actually very little distinction between an artist and a scientist or engineer of the highest calibre. I’ve never had a distinction in my mind between those two types of people. They’ve just been to me people who pursue different paths but basically kind of headed to the same goal which is to express something of what they perceive to be the truth around them so that others can benefit by it.

The interviewer then tried to clarify this by asking if “the artistry is in the elegance of the solution, like chess playing or mathematics?” Jobs disagreed, saying it was more profound than that:

No. I think the artistry is in having an insight into what one sees around them. Generally putting things together in a way no one else has before and finding a way to express that to other people who don’t have that insight so they can get some of the advantage of that insight that makes them feel a certain way or allows them to do a certain thing. I think that a lot of the folks on the Macintosh team were capable of doing that and did exactly that. If you study these people a little bit more what you’ll find is that in this particular time, in the 70’s and the 80’s the best people in computers would have normally been poets and writers and musicians. Almost all of them were musicians. A lot of them were poets on the side. They went into computers because it was so compelling. It was fresh and new. It was a new medium of expression for their creative talents. The feelings and the passion that people put into it were completely indistinguishable from a poet or a painter. Many of the people were introspective, inward people who expressed how they felt about other people or the rest of humanity in general into their work, work that other people would use. People put a lot of love into these products, and a lot of expression of their appreciation came to these things. It’s hard to explain.

It may be hard to explain, but anyone who has worked in the computer industry knows exactly what Jobs is talking about. When I started writing software for a living in 1991, I too was struck by how many of my coworkers were musicians, or artists in some other field. We had all gotten into computers not because we had always been nerdy, engineering types, but because we saw the inherent creativity involved in designing and building software, and the amazing flexibility of the computer as a creative platform.

What Jobs is getting at here is the deep link between art and craft, a link that is embedded in the very word we use to describe all that cool stuff that Apple made: ‘technology’. As I described in an earlier post, the Greek root of the word is typically translated as art or craft, so the literal meaning of technology is just “the study of art or craft.” In English we use the term ‘artist’ to describe someone who makes decorative things and ‘artisan’ to describe someone who makes practical things, but people like Jobs and his employees at Apple demonstrated just how blurry and permeable that distinction really is.

In fact, artists and artisans are really doing the same thing, just in different ways: they develop “an insight into what [they see] around them” and then put “things together in a way no one else has before…finding a way to express that to other people who don’t have that insight so they can get some of the advantage of that insight….”

My first computer was an Apple IIe, and I write this now on an iMac. In between I’ve used many different kinds of computers and operating systems, all of which were the products of talented artist-engineers. But Steve Jobs and the “collective works of art” he inspired and directed have probably had the most profound impact on my life. That first Apple IIe sparked my imagination and drew me into a new creative world that changed the course of my life.

Thanks Steve. Rest in Peace.

The Mixed Blessing of Shuffle

I have an embarrassing confession to make: my closet is full of shirts that are all some shade of solid blue or grey. I am a bit neurotic in this way; I have a hard time feeling comfortable wearing a patterned shirt, and I can’t seem to bring myself to buy reds, greens, or anything terribly far away from blue. This becomes most apparent on laundry day, when my stack of t-shirts oscillates within a very narrow spectrum of color, and my wife looks at me with that look of “how many blue shirts do you need? Are you allergic to other colors?”

This is, of course, slightly hypocritical coming from my wife, who owns something like fifty different versions of a black skirt. Whenever we go shopping, she pulls yet another black skirt off the rack, holds it up to her and says “what do you think?” My usual smart-aleck response goes something like “oh look—a black skirt; just like all those other black skirts you already have!” She then responds with “No, this is completely different…see?”

The truth is, we are both stuck in a rut when it comes to clothing. I keep buying solid blue or grey shirts (preferably a nice shade of blue-grey), and she keeps buying black skirts. We go to the store with all the best intentions of branching out into other colors, patterns, and styles, but we invariably keep buying the same outfit, over and over again.

Most of us tend to get stuck in ruts like these, buying the same outfits, cooking the same meals, walking or driving the exact same routes everyday (I have a very particular path I take to the market each day, even though I could mix it up and go different ways, encountering different sights and people). It also seems to get worse with age; the older we get, the harder it seems to break out of our established patterns and try something different.

Interestingly, my wife and I have also been noticing lately that organizations tend to get stuck in ruts when it comes to hiring new people, and academic institutions seem to suffer from this quite acutely. They say they want to hire more interdisciplinary scholars, or more women, or more ethnic minorities, but when it comes down to it, they just buy the same outfit over and over again. They opt for the familiar, the person that looks and sounds most like what they are used to. Over time, the rut gets deeper and deeper, and the organization gets more and more entrenched. Eventually it becomes insular, inflexible, and irrelevant.

I’ve been thinking about all of this because I have recently started using the shuffle feature on my iPod. I have to admit that I resisted the whole digital music phenomenon for quite a while. My wife and I have a large stack of compact discs, and the thought of spending hours ripping them into iTunes seemed like far too much work. We were also suspicious of the playback quality, and wondered how iTunes would handle classical and live recordings, where the atomic unit was an entire symphony or album, not a single track.

But over the last few months we have slowly ripped most of our collection, and I bought a cable so that we could play our iPod through our amplified stereo. We have even put together specialized playlists, such as “mellow” music for weeknight dinners when we need to slough off the day’s stress. And then we started using the shuffle feature.

Ordinarily, I tend to avoid shuffle-type features because they are built upon the assumption that the song (or single track) is an isolated and independent entity. This is probably true for most pop and rock music, but really isn’t true for other genres like classical, live concerts, or concept albums. The movements of a symphony are often split into separate tracks on a recording, but they are meant to be played together and in order. They make little sense when separated from their sibling movements or played in a random order. Similarly, live concert recordings have a kind of emotional flow that is carefully planned by the artist, so shuffling the tracks around usually leads to jarring transitions.
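The alternative implied here, randomizing at the album level rather than the track level, is easy to sketch. The few lines of Python below are purely illustrative (the function name, the data, and the whole approach are my own assumptions, not any real iPod or iTunes feature): albums play in a random order, but the tracks within each album keep their composed sequence.

```python
import random

def album_aware_shuffle(tracks, rng=random):
    """Shuffle a playlist at the album level.

    `tracks` is a list of (album, track_title) pairs, already in each
    album's intended order. Albums are played in a random order, but the
    tracks inside each album stay in sequence, so a symphony's movements
    are never separated or reordered.
    """
    by_album = {}   # album name -> its tracks, in original order
    order = []      # album names, in first-seen order
    for album, title in tracks:
        if album not in by_album:
            by_album[album] = []
            order.append(album)
        by_album[album].append(title)
    rng.shuffle(order)  # randomize only the order of whole albums
    return [(album, title) for album in order for title in by_album[album]]

playlist = [
    ("Symphony No. 5", "I. Allegro con brio"),
    ("Symphony No. 5", "II. Andante con moto"),
    ("Abbey Road", "Come Together"),
    ("Abbey Road", "Something"),
]
shuffled = album_aware_shuffle(playlist)
```

However the albums land, the two movements of the symphony always play together and in order, which is exactly the property a track-level shuffle destroys.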

But as I started experimenting with the shuffle feature on my iPod, I noticed something rather surprising: I started listening to music that I haven’t listened to in years, and I was really enjoying it. I have several recordings that I bought, listened to once or twice, and then never listened to again. It wasn’t that I disliked the music; it was simply that the music was a little too different from the rut that I had gotten myself into. When I reached for a CD to play, I tended to select one from the same subset of CDs that I always selected from. I kept buying the same outfit. I was stuck in a rut. And the shuffle feature was starting to get me out of it.

My conclusion? Shuffle is a mixed blessing. Sometimes it’s not at all appropriate, but other times, it can act as a helpful mechanism that forces you out of your ruts. Like all technologies, it engenders several kinds of social changes all at once. Some of those changes will be intentional and obvious, while others will be more hidden and unexpected. And all of these changes can be seen as either negative or positive depending on the context.

Now my question to you is, do organizations, and especially academic institutions, need a kind of “shuffle feature” to force them out of their ruts? I’m not suggesting that they select candidates in a random fashion, nor that they purposely hire for difference over competency. What I am suggesting, however, is that these institutions need to think about how they may be seeing very qualified candidates as not adequate simply because those candidates don’t look or sound like the kind of person they are used to hiring. Organizations tend to create an archetype of their ideal candidate, and not surprisingly, the archetype looks a lot like those who are already working there. In other words, they keep buying the same outfit, over and over again, and don’t realize how homogeneous and insular they are becoming.