In 1991, I was a fresh-faced, fairly naive information systems major who was about to graduate from college. A few months before the end of school, an alumnus who worked for Microsoft came to our seminar and showed us a video of a speech Bill Gates had made the year before at Comdex. The speech was entitled “Information at Your Fingertips,” and it was Bill’s first attempt at articulating a vision for the future of the PC industry, a future where everyone would have instant and easy access to whatever information they could ever need or want (he gave another, better-known version of the speech in 1995). Watching it today, one can’t help but smile at Bill’s enormous glasses, bad haircut, and cheesy delivery, but at the time, his vision looked incredibly cool to me. I knew then that I desperately wanted to be a part of making it happen.
I jumped into the software industry shortly after graduation, and spent nearly a decade designing, building, and managing software that could deliver information to people’s fingertips. Although I had studied information systems, I did so at a small, integrative liberal arts college, so most of what I learned about the practice of software engineering was actually acquired on the job. I learned C, then C++, and a smattering of other higher-level languages. I became adept at relational databases and SQL. I read books on algorithms, object-oriented theory, design patterns, human-computer interaction, and obscure programming tricks. I learned to evaluate the efficiency of everything I did, to seek the optimal solution. I read Dilbert religiously. I watched a lot of sci-fi. I became an engineer.
As I acquired the technical skills of software programming, I also took on some of the more annoying behaviors that are often characteristic of engineers. I became quite arrogant, assuming that my computer skills were evidence of a broader intellect that enabled me to have the correct opinion on just about anything. I became easily frustrated when people chose what I deemed to be a suboptimal course of action. I figured that I was capable of solving just about any problem given the right set of tools and techniques. And by “any problem,” I meant any problem: automating sales reports was really just a special case of solving world hunger, homelessness, and the troubled Middle East. All that was needed, I naively assumed, was a bit of rational decision making, supported by better computer systems that could catalog and deliver the right information at the right time.
After a few years, however, I started to notice that with every set of problems we solved, a whole new set of problems seemed to emerge. We would start every project with the greatest ambitions and expectations, but by the end we were already starting to see its shortcomings and thinking “oh well, we’ll fix that in the next version” (and we always assumed there would be a “next version,” even though our customers would have probably preferred us to just fix the problems in the existing one). Throughout the 1990s, we did automate scores of routine tasks, and developed tools that could catalog and retrieve information in ways similar to Bill’s vision, but our greatest social problems still seemed as intractable as ever. In some ways, we may have actually made them worse.
By the late 1990s, I was starting to get pretty cynical about the software industry in particular, and technology in general, so one of my friends suggested that I read Neil Postman’s book Technopoly. It was just what I needed. I can still remember how the following passage completely stopped me in my tracks:
You need only ask yourself, What is the problem in the Middle East, or South Africa, or Northern Ireland? Is it lack of information that keeps these conflicts at fever pitch? Is it lack of information about how to grow food that keeps millions at starvation levels? Is it lack of information that brings soaring crime rates and physical decay to our cities? Is it lack of information that leads to high divorce rates and keeps the beds of mental institutions filled to overflowing? (60)
I stayed in the software industry for a few more years, but reading Technopoly eroded my faith in modern technology’s ability to solve our larger social problems. I channeled my inner grumpy old man and started to wonder whether modern technology was actually more a cause of our social ills than a solution to them. I read Thoreau and pined for the simpler life. We got rid of our TV and spent more time reading. We bought a dining table made from reclaimed factory floor boards. We replaced the overhead electric light with a candelabra that we diligently lit each night. I exchanged my power tools for manual ones. I replaced my GoreTex with wool. I bought a push mower. I became a Romantic.
Well, sort of. I’m a city-boy at heart, and I never really learned how to appreciate poetry, so I was never quite the card-carrying Romantic. Still, I became much more of a techno-pessimist and eagerly read all the prominent Christian critics of modern technology. I also began to wonder whether one could really be both an engineer and a sincere Christian. If, as Ellul and Borgmann claimed, industrialists and engineers were primarily responsible for the modern mindset, including all the social ills that it led to, how could a sincere Christian continue to do that kind of work?
Shortly thereafter, I left software to go back to graduate school, hoping to deepen my understanding of the ways in which modern technology had influenced our culture, and to determine whether my Christian and my engineering selves could really co-exist. I had never been much of a historian (business and computer science are perhaps some of the most ahistorical fields there are), but the critics I most admired seemed to be well-versed in the history of technology, so I thought I should pursue that as well. It turned out to be a good decision, but not for the reasons I originally thought.
As I began to study the history and sociology of technology, I discovered that most critics of technology, especially the ones who write for a popular audience, rely on a theory that is no longer supported by most historians. That theory, commonly known as “technological determinism,” posits that technologies have a kind of one-way, deterministic “impact” on any society that adopts them. The stronger forms of this theory also hold that technological innovations advance according to an internal logic that makes technological progression inevitable and unstoppable.
Although technological determinism was the dominant historical theory for the first half of the 20th century, most current historians consider it to be only half right. Technologies most certainly change the societies that adopt them, but those changes are rarely, if ever, deterministic. Instead, detailed historical cases show that consumers play very active roles in shaping our understanding of what a new device is and what it is good for. In some cases, they also instigate a physical or functional reshaping of the new device as they seek to make it fit better into their lives (for example, the Kosher mobile phone).
This discovery opened up the possibility that I, as a Christian who was also passionate about technology, could actively engage in the reshaping and redeeming of these new devices. When we think as technological determinists, we are left with a fairly bleak choice: adopt the new device and suffer the inevitable consequences, or reject it completely and hope we can convince others to do the same. As Sherry Turkle has reminded us, this is the language of addiction—it’s similar to the way an addict thinks about his or her drugs. But when we realize that both engineers and consumers play active roles in the shaping of new technologies, a new possibility arises: the opportunity for a participatory redemption.
This realization also helped me see how I might reintegrate my Christian and engineering selves. If technologies did not have deterministic impacts and did not advance entirely according to their own logic, then it was dreadfully important for more Christians to be actively involved in not only the engineering of new devices and systems, but also their early adoption. If Christians aren’t there to inject their own values into the design, production, marketing, and adoption of new technologies, we really have no excuse if we don’t like how things turn out. Blaming deterministic outcomes just obscures what is really a lack of engagement.
I also began to realize that my Romantic reaction was just as short-sighted as the techno-optimism of my youth. It was certainly good to question the purported benefits of modern technology, and perhaps reject a few things that were really more of a distraction than a help, but to deny the flourishing I felt when designing and building software was to deny an important part of who I was made to be. Not all of us are made to be farmers or poets. Some of us are made to be engineers and artisans.
Are you a Christian involved in some kind of engineering practice? If so, how do you integrate your faith and your work? What makes a Christian engineer different from a secular one?