I have been reading James Gleick’s new book “The Information: A History, a Theory, a Flood” and have been enjoying it very much. This particular infovore especially enjoys feeding on information about information – technology, language, computers, and so on. Consequently, Gleick’s book hits all the sweet spots for me.
I am about halfway through the book, and Gleick is discussing the contributions of Claude Shannon to information theory and computer science. He describes how Shannon was thinking about self-replicating machines, particularly computers, in the 1950s (though this Wikipedia article claims that John Bernal’s work on the topic predated Shannon’s by about twenty years). This type of forward thinking wasn’t always well received by audiences, some of whom thought that Shannon was meddling with things that should not be meddled with. This is a common reaction to creativity in the sciences, perhaps stemming from belief systems in the West. The influence of Judeo-Christian beliefs about creativity can be seen when creative products frighten people about their possible consequences. These fears stem from the belief that only God can create “ex nihilo” and that divine instruction is necessary for human creativity. This belief manifests itself in the Second Commandment: “You shall not make for yourself a graven image, or any likeness of anything that is in heaven above, or that is in the earth beneath, or that is in the water under the earth…”. Additionally, it was believed that human curiosity was something that should be constrained and that the primary role of humans was to “serve and obey.”
This belief is also reflected in the Ancient Greek myths: what happened to Prometheus when he gave fire to man, Daedalus creating the wings that Icarus used to fly too close to the sun, and Pandora opening the box that released all the evils upon the world. We can tell when these beliefs are affecting our reactions to creative products when those products are met with cries of “It’s unnatural!” or “Some things man was not meant to know!”, or by invoking the story of Frankenstein’s monster.
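Shannon’s notion of a machine that reproduces itself has a small, concrete analogue in modern software: a quine, a program whose output is its own source code. As a hedged illustration (this is a standard Python quine pattern, not anything from Gleick’s book or Shannon’s own work):

```python
# A quine: a program that prints its own source code.
# The string s serves as both the program's instructions and its data,
# loosely analogous to how a self-replicating machine must carry a
# description of itself in order to build a copy.
s = 's = {!r}\nprint(s.format(s))'
print(s.format(s))
```

Running this prints exactly the two lines of the program itself; feeding that output back into the interpreter reproduces it again, indefinitely. The trick mirrors the conceptual problem Shannon and von Neumann wrestled with: the machine’s blueprint must be used twice, once as instructions to follow and once as raw material to copy.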
As Gleick relates in his book, Shannon’s proposal of a machine that could reproduce itself, and the methods by which it could do so, was not always well received by the public, causing them to fret about the possible consequences of these innovations. What struck me about this discussion in the book was how prescient some of the public’s worries were, and how something that was seen as a negative consequence of these innovations is actually considered a positive outcome by our current standards (which is probably why Gleick included it in his book). A salient example is one member of the press’s reaction to Shannon’s ideas about self-replicating computers, which caused him to write the following:
“What happens if you switch on one of these mechanical computers but forget to turn them off before you leave for lunch? Well, I’ll tell you. The same thing would happen with computers in America that happened with jack rabbits. Before you could multiple 701,945,240 by 879,030,546, every family in the country would have a little computer of their own…” (p. 267, emphasis added)
Admittedly, the reason we are currently inundated with little computers isn’t uncontrolled self-replication of the machines. The “horror” predicted in the 1950s by that reporter has happened, though. Yet we don’t perceive this as a problem. We see it as making possible the information age, increased interconnectedness with family, friends, and colleagues, and the ability to access any information we would care to access (and many kinds we would not care to access). A perceived horror is now seen as a benefit. How does that happen? Context? Habituation? Likely, it is due to our very human adaptation to and use of the product.
Tools are created by us to serve us. Tools that serve humanity’s needs well are adopted and used. Those that don’t, disappear. Humans adapt the tools to satisfy their needs, and these needs really don’t change over time. We seek social interactions with others, information with which to make decisions and solve problems, games to play, and ways in which we can be creative. The tools we make help us to satisfy these needs. Consequently, we should worry less about the possible implications of new technologies. Or maybe, we should be brave and be open to exploring how new technologies can be used while always being mindful of the ways in which those technologies can be misused.