Some of the best cultural observers in the late twentieth century discerned the initial impact of digital computers on our society and tried to remind anyone who would listen of the dangers. Their thoughts help us remember what’s central to living a full human life in this world full of shiny, wonderful gadgets and always, always, the next new thing. Joseph Campbell’s life spanned a good part of the twentieth century. Born in 1904, this renowned expert in world mythology lived through two world wars, the Depression, the dropping of the atomic bomb, Vietnam, the domestic mess of the sixties, and the relentless encroachment of machines, first the mechanical ones and then the electronic ones, before his death in 1987. Late in life he spent many hours in interviews with Bill Moyers, the cream of which eventually became “The Power of Myth.” A highly popular, deeply interesting set of interchanges gleaned from those conversations aired on PBS soon after Campbell’s death. Subsequently, the entire set of conversations appeared in book form under the same title.
Although Campbell believed that we live in a demythologized world, he found that students around the country were attracted to his lectures in large numbers, mostly, he speculated, because mythology provided messages
unlike what ordinary course work at colleges and universities offered in his day. Myths are “stories about the wisdom of life. . . . What we’re learning in our schools is not the wisdom of life. We’re learning technologies, we’re
getting information. There’s a reluctance on the part of faculties to indicate the life values of their subjects.”
One major reason for this was increasing specialization, something that has intensified in the twenty-first century. Campbell pointed out that specialization necessarily limits the field in which one considers any problem and tends to eliminate the life values, especially the human and cultural aspects of any specific issue. Generalists, on the other hand, have the advantage of a broader perspective and the ability to make more complex associations and perhaps gain deeper insights as well. They can take something learned in one specialty and relate it to something learned in a different specialty. By so doing, they can discover similar patterns or contradictions or discontinuities that aren’t apparent when one specializes in a narrow field.
Growing specialization and a greater focus on the literal, factual level of life, “the news of the day and the problems of the hour,” have only become more commonplace since the 1980s. Information technologies, with their data gluts, information overloads, knowledge “management,” and, most recently, big data, have put an enormous emphasis on the technologies themselves and have changed the pursuit of knowledge into a process of learning how to access the information one might need to know at some point or other in the future. As a result, the continuum of data, information, knowledge, and wisdom has become jumbled, their meanings confused. Some now describe knowledge as “actionable information.” Others, emphasizing dramatic changes in the state of knowledge due to the Internet, claim that the nature of knowledge has changed fundamentally. Knowledge now resides in networks, they maintain. It can’t possibly reside in an individual’s head. In fact, knowledge is probably, in David Weinberger’s words, “too big to know.” As for wisdom, many seem to equate wisdom today with the consensus of a crowd or, even worse, the dynamics of the marketplace.
Like Joseph Campbell, the journalist and medical researcher Norman Cousins lived through the bulk of the twentieth century and observed the onslaught of technology with similar ambivalence and prescience. "The essential problem of man in a computerized age,” he wrote in “The Poet and the Computer” (1990), isn’t any
different than it was in previous times. “That problem is not solely how to be more productive, more comfortable, more content, but how to be more sensitive, more sensible, more proportionate, more alive. The computer makes possible a phenomenal leap in human proficiency . . . But the question persists and indeed grows whether the computer makes it easier or harder for human beings to know who they really are, to identify their real problems, to respond more fully to beauty, to place adequate value on life, and to make their world safer than it now is.”
Computers as electronic brains can help enormously in vital research of many sorts, Cousins wrote. “But they can’t eliminate the foolishness and decay that come from the unexamined life. Nor do they connect a man to the things he has to be connected to—the reality of pain in others; the possibilities of creative growth in himself; the memory of the race; and the rights of the next generation.” These things matter, Cousins went on to say, because in the computer age “there may be a tendency to mistake data for wisdom, just as there is a tendency to confuse logic with values, and intelligence with insight.” All of which makes this bright and shiny present and that enchanting next new thing seem quite ephemeral and even trivial in comparison to the really exciting journey of life and the challenge of how to live it fully in the midst of—and perhaps in spite of—all our digital machines.