March 28, 2010
Class with Oliver O’Donovan last week was unusually enlightening even by the high standards of that class. Deep insights and great quotes poured from the sage professor in a sparkling effusion, and I couldn’t take notes fast enough (especially as I had to use pen and paper). Some of the most interesting thoughts came on the question of how we should view technology from an ethical standpoint--is technology neutral? (The texts for the class were two fantastic essays on technology by the Canadian philosopher George Grant).
It is a standard platitude among--well, among most anybody in mainstream Western thought, but particularly among conservatives, including, more often than not, Christian conservatives--that “technology is a neutral tool; it isn’t good or evil in itself, it depends on how you use it.” The problem with that seeming truism is that the two halves of the statement are not saying the same thing. It is a truism that any technology is not good or evil in itself, for indeed, only actions, not objects, are good or evil. But it does not follow from this that the object or technology is a neutral tool--that is to say, that we can use it however we would like, for good or evil, and that it does not incline, by its nature, toward certain good or evil uses.
Take, for instance, a large nuclear warhead. In itself, it is neither good nor evil. I could build one simply to put in my backyard as a piece of decoration. It might be imprudent and wasteful, but not necessarily evil...certainly, the weapon in itself wouldn’t be evil. But this would be basically because I had decided not to use the weapon, since sticking it in my backyard would hardly constitute a use. It would be impossible to use this piece of technology, in any sensible way, without doing evil. So, a large nuclear weapon is not simply a neutral tool like a hammer--it is a tool that works for certain purposes, and that cannot be morally used for those purposes.
This is an extreme example, but it proves the point--all technologies that we develop are inclined to work in certain ways, and to force us to use them in certain ways rather than others, if they are to be at all useful. If they are significant enough, they will begin to mold our lives in certain new and fundamental ways, and these will not be “neutral”--they will have, perhaps, certain features that we could call “good,” and others that will certainly be “evil” and lamentable.
The computer, of course, is a preeminent example (and is the one that Grant himself chooses). The computer has had a profound impact on how we live our lives--we are as much subject to it as it is subject to us. It is certainly not a mere tool in our hands, for us to do with as we will, like the hammer (if even the hammer is that). And while this impact has been good in certain ways, in other ways it has been terrible--according to O’Donovan, especially on the university. The computer’s predisposition to record what is measurable and quantifiable has meant that its adoption by the educational system has driven that system to focus on precisely the least significant features of education--those that are measurable and quantifiable. The result, according to O’Donovan, is that the university, historically understood, is now in its death throes. This is not a “neutral” development. For some, it might be a good development. For those with more sense, it is almost certainly an evil development. Email, too, as O’Donovan discussed, is a tremendously convenient innovation, yet one with an almost limitless capacity for harm, posing entirely novel temptations and risks for our society. Technology, then, is not neutral; it is quite perilous.
A blithe confidence in technology, then, a careless indifference to the way it shapes our lives in directions that we may not wish to go, is not a responsible ethical posture for Christians. Does this mean that we must be Amish and flee technology? Does this mean that if we foresee the risks of the computer, of email, of mechanized agriculture, etc., that we must simply slam on the brakes and retreat from these non-neutral developments?
No, because, though they may not be neutral, that does not mean they cannot be neutralized, in O’Donovan’s terminology. If we do not naively imagine that the computer is a neutral tool, and we embrace it as a society with open eyes, well aware of the changes we are facing and the risks we are incurring, then we will be in a position to make careful moral judgments and establish social disciplines and limitations to neutralize the harm that might be caused by a careless use of the new technology. For instance, we might seek to consciously establish social conventions regulating the use of email, or make decisions within communities and institutions about what sorts of dialogues and functions could be legitimately carried out by email. We might seek to develop computer technology in directions that would counteract its tendency to promote institutional homogeneity (as has in fact now begun to happen with the tools the internet has provided).
This posture, however, would require three things that we are not necessarily willing to do as a society: 1) establish limitations on the ways that new technologies can be used--e.g., the kind of internet censorship that everyone acts like is a short step from Stalin; 2) be willing to say an absolute “no” to certain kinds of technology--just because we can do it doesn’t mean we should do it (e.g., a lot of the issues surrounding genetic engineering); 3) be willing to slow down the pace of technological growth. As it stands now, we are simply innovating far faster than we as a society can digest what new technologies mean, craft appropriate responses, and cultivate the moral discipline to use them properly. All sides of the political spectrum seem to presuppose that unhampered innovation is the way to go, and that the solution to all our problems is more and faster innovation. But, if we are not going to be destroyed by our own creations, we must be willing to focus less of our energies on developing new technologies, and more on learning how to wisely use the ones we have.
This posture, recommended by O’Donovan, seems to me the clearest and most intelligent approach to the ethics of technology that I have yet heard. Commonsensical, really, and avoiding the unnecessary radicalism of an Amish approach, yet, when you get down to it, still very radical in its own way; perhaps too radical for our inebriated technological society to ever hear.