Digital Athena

Technology and the Human Spirit

5/6/2014

0 Comments

 
Read Sven Birkerts's classic, elegant book, The Gutenberg Elegies, and you will be enchanted with the prescience and wise insights of this gifted writer as he explores what it could mean to lose the habit of prolonged reading in an age of information. And prepare to be haunted by his stark vision of what he calls "the argument of our time--the argument between technology and the soul." Now twenty years old, this brief meditation on the fate of reading in an electronic age reminds us what we may be losing as we race through the early years of the twenty-first century.

Some contend that the argument is over, the battle decided. Technology has won, they say. It dominates our lives. It dictates how we live, how we work, how we think. The reading chair is empty now. We are tied to our screens, our alerts, our ringtones, our texts.

Twenty years later the vision Birkerts articulated still hits home, but it does need some tweaking. First of all, I would leave out the soul, which carries too many connotations of specific religious dogmas, especially of an afterlife. Better that we talk about the human spirit with all its physical, mental, and emotional facets. And I think that, at this point, we're in a different position vis-à-vis the technologies that permeate our lives. It should be less an argument, which suggests loud voices taking definite sides and vigorously debating strong positions, and more of a search. We do struggle with the relationship between technology and the human spirit, but it's more of a dialogue than an argument, more of an exploration of how we shall move forward with all this technology we've created. The real question has become how we shall live our lives under the conditions we face, and how we shall live them well.

This is why we tell each other stories, stories like The Gutenberg Elegies, that help us understand what it's like to live our lives today.


0 Comments

The Human Brain Is a Computer? The Limits of Metaphor

4/26/2014

0 Comments

 
Metaphors matter. They matter a lot, shaping not only the way we communicate but also how we think, feel, and even behave. George Lakoff and Mark Johnson explained this well in their now classic work, Metaphors We Live By. Their premier example in that book analyzed how the concept "argument" becomes colored by its close association with the metaphor "war." Thus "argument is war." Here are some of the expressions they found that structure how we think about an argument as a war:

Your claims are indefensible.

He attacked every weak point in my argument.

His criticisms were right on target.

He shot down all my arguments.

Essentially, Lakoff and Johnson contend that metaphors shape the way we experience and understand the associated concepts, so that in the case of argument, for example, we in part understand it, act, and talk about it in terms of war. It's not a dance. It's not a writing process. It's a battle.

The widespread use of the metaphor of the computer to describe the workings of the human brain today has a similar effect. By using such an analogy, people are accepting the implication that the human brain is simply a logical device. This leads to such statements, and by implication such activities, as the following:

IBM's Blue Brain Project is attempting to reverse-engineer the human brain.

Modern architectural design acknowledges that how buildings are structured influences how people interface.

The position of department head requires an expert multitasker capable of processing multiple projects at any given time.

His behavior does not compute.

Human beings do possess logical functions. But the danger of using the digital computer, which executes algorithms built from simple logical operations such as IF, THEN, ELSE, and COPY, as a metaphor for the brain is what it leaves out: messy feelings, ambiguous behaviors, irrational thoughts, and the natural ebb and flow of memories. It also leaves out the influences of our subconscious--and the rest of our physical, organic bodies--on how we think, act, and make decisions. Thinking of the brain as a computer says very little about what it feels like to be a human being, very little about what it feels like to be alive.
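To see what the metaphor does capture, consider a deliberately trivial sketch in Python of purely rule-based decision-making; the function and its inputs are invented for illustration, not drawn from any source discussed here:

```python
# A hypothetical sketch of the kind of purely rule-based "reasoning" the
# computer metaphor evokes: every outcome follows mechanically from
# IF/THEN/ELSE tests and copied values. Nothing here models mood, memory,
# or the body -- which is exactly the point of the critique above.

def decide_to_go_outside(temperature_f: float, is_raining: bool) -> str:
    if is_raining:
        return "stay inside"          # IF ... THEN
    elif temperature_f < 40:
        return "wear a coat"          # ELSE IF ...
    else:
        return "go outside"           # ELSE

choice = decide_to_go_outside(55.0, is_raining=False)  # COPY the result
print(choice)  # -> "go outside", with no trace of how any of it "feels"
```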

In The Myth of the Machine, Lewis Mumford argued that far too much emphasis has been placed on distinguishing humans from animals on the basis of our tool-making capacities. He wrote that there was nothing uniquely human in our tool-making. After all, apes use tools. Rather, it was the human mind, "based on the fullest use of all his bodily organs," and that mind's capacity to create language and use symbols that allowed human beings to build the social organizations and civilizations that distinguish us from other animals. It was through symbols and language that humans rose above a purely animal state. The ability to create symbols, to be conscious of life and death, of past and future, of tears and hopes, distinguishes humans from other animals far more than any tool-making capability. "The burial of the body tells us more about man's nature than would the tool that dug the grave."

If we continue to distinguish human beings from other animals along the lines of tool-making, Mumford believed, the trajectory would be quite dire:

"In terms of the currently accepted picture of the relation of man to technics, our age is passing from the primeval state of man, marked by his invention of tools and weapons for the purpose of achieving mastery over the forces of nature, to a radically different condition, in which he will have not only conquered nature, but detached himself as far as possible from the organic habitat."

So we need to be careful about using the metaphor of the computer, our most modern of tools, to describe our minds and what it means to be a human being.









0 Comments

Ray Kurzweil's Mind

1/23/2014

0 Comments

 
Ray Kurzweil incessantly dreams of the future. And it's a future he describes as a "human-machine civilization." In How To Create a Mind: The Secret of Human Thought Revealed, Kurzweil looks forward to a time when technology will have advanced to the point where it will be possible to gradually replace all the parts of the body and brain with nonbiological parts. And he claims that this will not change people's identities any more than the natural, gradual replacement of the cells in our bodies does now. All this will come about after scientists and engineers, who are currently working on brain models in many different organizations and areas of the world, succeed in creating a complete model of the human brain. Kurzweil contends that the neocortex functions hierarchically and that it works by pattern recognition. Therefore, he argues, it is possible to write algorithms that will simulate how the brain actually works. That, in combination with increasing miniaturization, will make the substitution of nonbiological components possible by the 2030s.
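To give a rough sense of what "hierarchical pattern recognition" means in algorithmic terms, here is a toy sketch in Python; it is an invented illustration of the general idea of recognizers feeding recognizers, not Kurzweil's actual model:

```python
# A toy, two-level hierarchical pattern recognizer: low-level recognizers
# turn raw "strokes" into letters, and a higher level turns letters into a
# word. Purely illustrative; the patterns and data are made up.

# Level 1: recognize letters from raw strokes.
LETTER_PATTERNS = {
    ("|", "-", "|"): "H",
    ("|", "_", "|"): "U",
}

# Level 2: recognize a word from the letters recognized below it.
WORD_PATTERNS = {
    ("H", "U", "H"): "HUH",
}

def recognize(strokes_per_letter):
    """Run the low-level recognizers, then feed their output upward."""
    letters = tuple(LETTER_PATTERNS.get(tuple(s), "?") for s in strokes_per_letter)
    return WORD_PATTERNS.get(letters, "unknown word")

print(recognize([["|", "-", "|"], ["|", "_", "|"], ["|", "-", "|"]]))  # -> HUH
```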

That the human brain is akin to a digital computer is still a big and very contentious issue in neuroscience and cognitive psychology circles. In the January issue of Scientific American, Yale professor of psychology John Bargh summarizes some of the latest thinking about this problem. Specifically, he addresses the major role of the unconscious in how people make decisions, how they behave in various situations, and how they perceive themselves and the world around them. There is a complex dynamic between our controlled conscious thought processes and the unconscious, often automatic, processes of which we are not aware. Nobelist Daniel Kahneman explained this phenomenon in Thinking, Fast and Slow. Automatic thought processes happen quickly and do not include planning or deliberation.

Even Daniel Dennett, an eminent philosopher and cognitive scientist who has long held that neurons function as simple on-off switches, making them logical switches similar to digital bits, has recently changed his mind about the analogy of the human mind to a computer: "We're beginning to come to grips with the idea," he says in a recent Edge talk, "that your brain is not this well-organized hierarchical control system where everything is in order. . . . In fact, it's much more like anarchy. . . ." Yet even with this concession Dennett is still inclined to use the computer as a metaphor for the human brain. This leads him to make a curious statement, one which actually begs the question: "The vision of the brain as a computer, which I still champion, is changing so fast. The brain's a computer, but it's so different from any computer you're used to. It's not your desktop or your laptop at all."

By his own admission, Dennett's talk is highly speculative: "I'd be thrilled if 20 percent of it was right." What I think he means is that the brain is like a computer that is far more complex than existing machines but that it also has intention. The neurons are "selfish," and they are more like agents than computer instructions, which in turn are more like slaves. "You don't have to worry about one part of your laptop going rogue and trying out something on its own that the rest of the system doesn't want to do." Computers, on the other hand, are made up of "mindless little robotic slave prisoners." So I'm not sure how helpful it is for Dennett to think of the brain as a computer at all. And Dennett's views on neurons and agents, combined with the more recent thinking about the impact of the unconscious on conscious thought, lead me to conclude that Ray Kurzweil's dream of someday replacing the human brain with robotic switches is just that: a dream.
0 Comments

Cyberspace: Lost on a Dark Journey

11/22/2013

0 Comments

 
Earlier this month, The New York Times Book Review published a special issue devoted to technology and its effects on our lives and our books. The editors asked a group of writers for their take on how the Internet had changed the art of storytelling. Several writers emphasized that they tried to express a sense of modern fragmentation, of some loss of a sense of a whole, as if the continuity of narrative and the idea of life as a journey had been obscured in our current life. Others expressed the need to rediscover the sense of mystery that lies at the bottom of what we call life. They believe the role for writers is to dig deeper into the mysteries and wonder of life, even in this age of technology, when there is so much superficial activity available to us that our experience easily becomes disrupted and meaning of any sort becomes elusive. Still others note that although corporate economics constantly tries to attract us with yet another novel technological gadget or twist, the truly successful technologies are those that resonate with the basic experience we have as human beings.

Writer Ander Monson turns all this on its head in an interesting way. Being incessantly bombarded by small bits of narratives, he says, is to “experience the past . . . the distant, darkened past” in the sense that one feels palpably what it was like to be in a labyrinth such as the one Daedalus built, according to Greek myth, for the monster known as the Minotaur. It provides an ancient analogy for the experience of “trying to find the line of ascent in a wall of information; the trail of URLs I click through in my morning’s misinforming.” In current terms, then, the labyrinth becomes the Internet itself and its endless information.

Pondering this brings him round to the fundamental experience of our lives today: “It’s dark down here,” Monson writes, “and lonely. I am drawn mostly, insistently to the human voice. How powerful and necessary the solo voice, the experience of being someone, something else for a little while.” Expressing this experience, Monson declares, will remain what he calls “literature’s killer app” because the act of writing about it is concerned with words and hence “impervious to the threat by everything that’s not the word.”

It is a journey into the darkness not unlike the one T.S. Eliot described in “East Coker” as he recounted his own battle with writing:

“And so each venture
Is a new beginning, a raid on the inarticulate . . . 
And what there is to conquer
By strength and submission, has already been discovered
Once or twice, or several times, by men whom one cannot hope
To emulate . . .
There is only the fight to recover what has been lost
And found and lost again and again: and now, under conditions
 That seem unpropitious.”

Unpropitious indeed are our times. Yet it is heartening to see these writers probing to find the common threads of our experience and trying to express what it means to be human today.

0 Comments

How Big Is Big Data?

7/12/2013

1 Comment

 
Big Data. The very concept seems to demand, indeed require, that massive pronouncements and claims of Herculean proportions should follow. Such a concept must inevitably overwhelm previous trends and satisfy even the most unbelievable expectations. But what in truth is the story that proponents of Big Data are (loudly) proclaiming? And is it a fad that's here today, only to be gone tomorrow? Or does it indicate a more deeply embedded belief system, part of a living myth, for our time?

To find an answer to this question, I turned to the latest book on the subject, appropriately entitled Big Data, with one of those absolutely headline-grabbing subtitles that is designed to boggle the mind (and presumably make the casual observer pick up the book and, hopefully, buy it):  A Revolution That Will Transform How We Live, Work, and Think. OK, I thought, so what kind of a transformation are we talking about here?

First let me say that the authors come well credentialed. Viktor Mayer-Schönberger teaches at the Oxford Internet Institute at Oxford University and, we are told, is the author of eight books and countless articles. He is a "widely recognized authority" on big data. His co-author, Kenneth Cukier, hails from the upper echelons of journalism: he's the data editor for The Economist and has written for other prominent publications as well, including Foreign Affairs.

This was a good place to start, I thought, to learn about the story of big data and the kind of changes—oops, I mean transformations—that it was inevitably going to produce in our world. The major transformation the authors predict is that soon computer systems will be replacing, or at the very least augmenting, human judgment in countless areas of our lives. The chief reason for this is the enormous amount of data that has recently become available. Digital technology now gives us easy and cheap access to large amounts of information, frequently collecting it passively, invisibly, and automatically.

The result is a major change in the general mindset. People are looking at data to find patterns and correlations rather than setting up hypotheses to prove causality: "The ideal of identifying causal mechanisms is a self-congratulatory illusion; big data overturns this. Yet again we are at a historical impasse where 'god is dead.' That is to say, the certainties that we believed in are once again changing. But this time they are being replaced, ironically, by better evidence."
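To make concrete what looking for correlations without a causal hypothesis amounts to in practice, here is a minimal sketch in Python using the pandas library; the dataset, column names, and threshold are invented for illustration:

```python
# A minimal sketch of correlation-first analysis: scan a dataset for
# strongly correlated pairs of variables without starting from any
# causal hypothesis. Column names and data are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "search_volume": [120, 180, 240, 300, 280, 210],
    "store_visits":  [ 80, 110, 150, 190, 175, 140],
    "rainfall_mm":   [ 10,  30,   5,  12,  40,  22],
})

corr = df.corr()  # pairwise Pearson correlations
# Report pairs whose correlation exceeds an arbitrary threshold.
for a in corr.columns:
    for b in corr.columns:
        if a < b and abs(corr.loc[a, b]) > 0.8:
            print(f"{a} ~ {b}: r = {corr.loc[a, b]:.2f}")
# The output says nothing about *why* the variables move together --
# which is precisely the "what, not why" stance the authors describe.
```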

So there you have it. God is dead, yet again. Only this time the god is the god of the scientific method, of causality. Out with the "why," in with the "what." If Google can identify an outbreak of the H1N1 flu and pinpoint areas with significantly high rates of infection, the authors ask, is there any reason to worry about why this is occurring in such places when we already know the what: there's an outbreak of flu, and it is especially heavy in these locations.

We have, my friends, slid into the gentle valley of the "Good Enough." Correlation is good enough for now. It's fast, it's cheap, it's here, let's use it. We'll get around to the why later, maybe, if it's not too complicated and expensive to find out. And here are some of the examples the authors use as proof that correlation is good enough: “After all, Amazon can recommend the ideal book. Google can rank the most relevant website, Facebook knows our likes, and LinkedIn divines whom we know.” Such exaggerated attribution of insight and intuition to computer algorithms is so common these days that it’s seldom even called out.


That's the transformation, according to the authors, that we have to look forward to. And behind their predictions lies a sense that the movement toward reliance on the results of big data to understand our world is not just inevitable but that the data itself, the vast invisible presence in our modern lives, also contains a power and energy of incalculable value and ever-improving predictive powers. They call it “big-data consciousness”: “Seeing the world as information, as oceans of data that can be explored at ever greater breadth and depth, offers us a perspective on reality that we did not have before. It is a mental outlook that may penetrate all areas of life. Today we are a numerate society because we presume that the world is understandable with numbers and math. . . . Tomorrow, subsequent generations may have a ‘big-data consciousness’—the presumption that there is a quantitative component to all that we do, and that data is indispensable for society to learn from.”

And the heroes of this transformation? They are the people who can wield this data well—who can write the algorithms that will move us beyond our superstitions and preconceptions to new insights into the world in which we live. These are the new Galileos of our day because they will be confronting existing institutions and ways of thinking. In a clever turn of what I like to call "The Grandiose Analogy," the authors compare the use of statistics by Billy Beane of Moneyball fame to Galileo's pioneering observations using a telescope to support Copernicus's theory that the Earth was not the center of the universe: "Beane was challenging the dogma of the dugout, just as Galileo's heliocentric views had affronted the authority of the Catholic Church." It's another attempt to elevate by association the comparatively banal practice of putting a winning baseball team together on a shoestring to the level of the world-shattering scientific observation that the earth, and by extension mankind, is not at the center of God's universe after all.

If you can ignore the hyperboles in this book, however—and given the number of them this is no small challenge—you can come to see the reality of what big data actually is and what kinds of contributions its use might make to our lives. The scientific method isn't going away. The march of science to discover and test its best hypotheses at any given time will continue. In fact the patterns and correlations unearthed by big-data methods may form the basis for new hypotheses and bring us even closer to understanding the "why" of many things to come.

Nonetheless, within some contexts, big data can produce actionable information. In marketing, for example, Amazon can use the knowledge that people who read Civil War histories may also like a particular subset of mystery writers to boost sales through its customer recommendation algorithms. Google's ability to detect flu outbreaks also produces actionable information. The NIH and other medical institutions can take actions based on such findings to make vaccines plentiful in certain areas, produce more vaccines if feasible, prepare hospitals and medical offices for the spike in demand, and publish other public health guidelines.
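As a rough illustration of how a correlation like the Civil War/mystery example can be turned into a recommendation, here is a toy co-occurrence sketch in Python; the purchase data are invented, and this is not a description of Amazon's actual system:

```python
# A toy co-occurrence recommender: suggest items that are frequently
# bought alongside an item the customer already owns. Purely illustrative;
# real recommendation systems are far more sophisticated.
from collections import Counter
from itertools import combinations

purchases = [  # hypothetical customer baskets
    {"civil_war_history", "mystery_novel_A"},
    {"civil_war_history", "mystery_novel_A", "cookbook"},
    {"civil_war_history", "mystery_novel_B"},
    {"gardening_guide", "cookbook"},
]

co_counts = Counter()
for basket in purchases:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1

def recommend(item, top_n=2):
    """Rank other items by how often they co-occur with `item`."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("civil_war_history"))  # e.g. ['mystery_novel_A', 'mystery_novel_B']
```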

Still, there are some real problems with heralding the quantification of everything into digitally manipulable form as the answer to myriad issues. The supposition fails to take into account any fundamental issues except those obvious ones involving privacy and surveillance. First of all, there are the insurmountable problems that complex algorithms create. That very complexity produces higher and higher risks for errors in the writing and executing of the code. That same complexity makes it very difficult to judge whether the results reflect reality. The very fact that such algorithms may challenge our intuition makes it difficult to validate their results without having an understanding of the "why," or even a sense of the assumptions and content of the algorithms themselves.

Statistics can be powerful tools, but there was also a wonderful book called How To Lie with Statistics that came out nearly sixty years ago and is no doubt still relevant today. The authors of Big Data claim that knowledge and experience may not be so important in the big data world: "When you are stuffed silly with data, you can tap that instead, and to greater effect. Thus those who can analyze big data may see past superstitions and conventional thinking not because they're smart, but because they have the data." The authors also suggest that a special team of “algorithmists” could oversee all the algorithms to ensure that they do not invade the privacy of individuals or cross other boundaries. I’m afraid Mayer-Schönberger and Cukier really ought to talk to the SEC about Wall Street and its algorithms to see how well that’s been working out!

Finally, the proponents of big data want to discount intuition, common sense, experience, knowledge, insight, and even serendipity and ingenuity, never mind wisdom. In their quest to elevate the digitalization of everything, they neglect those very qualities, qualities which cannot be digitized. As the saying often attributed to Einstein reminds us: "Not everything that can be counted counts, and not everything that counts can be counted."

1 Comment

Myth of the Ultimate Machine Age: The Genie and the Bottle

6/25/2013

0 Comments

 
There’s a semi-apocryphal story about Norbert Wiener, the brilliant, visionary MIT mathematician. It is said that he used to walk around the halls of the campus with his eyes closed and a finger on the wall to ensure that he did not lose his way. One day he was traveling what is fondly known as the “Infinite Corridor,” which stretches 825 feet from the main lobby of MIT’s central building west to east through five major buildings housing classrooms and offices. One of the classrooms in session happened to have its door open, and Wiener simply entered the classroom, walked completely around the perimeter, and went out the door again as he made his way toward his destination—to the silent amazement (and amusement) of the professor as well as his students.

Recently the New York Times published an excerpt from a long-lost article that Norbert Wiener wrote in 1949. Originally solicited by the oddball Sunday editor of the Times, Lester Markel, it was mysteriously either lost by Markel or abandoned by Wiener, or both. In any event, a researcher recently found the piece among Wiener’s papers at the MIT archives. In it Wiener wrote about “what the ultimate machine age is likely to be.” He expounded on future automated systems well beyond what then existed and on smart computers and smart gauges that would integrate one machine with another in various manufacturing processes.

Although he did not foresee the economic shift in the value of information versus manufacturing, the revolution he did envision was profound and his predictions dire: “These new machines have a great capacity for upsetting the present basis of industry, and for reducing the economic value of the routine factory employee to a point at which he is not worth hiring at any price. If we combine our machine-potentials of a factory with the valuation of human beings on which our present factory system is based, we are in for an industrial revolution of unmitigated cruelty. . . Moreover if we move in the direction of making machines which learn and whose behavior is modified by experience, we must face the fact that every degree of independence we give the machine is a degree of possible defiance of our wishes. The genie in the bottle will not willingly go back in the bottle, nor have we any reason to expect them to be well disposed to us. In short, it is only a humanity which is capable of awe, which will also be capable of controlling the new potentials which we are opening for ourselves. We can be humble and live a good life with the aid of the machines, or we can be arrogant and die.”

Would that our writers and our thinkers and our leaders of corporations today, instead of blithely hailing the onslaught of robots and marveling at increased productivity and the brilliance of our technology, had some of the compassion and wisdom that Wiener possessed in 1949.


  

0 Comments

Virtual Pull over the Edge

5/17/2013

1 Comment

 
Why is digital technology so exciting? Why is all this fast change that technology brings always so inevitable and wonderful? There’s an eerily similar narrative that so many books follow these days as they recount the breakthroughs in technology and how they are changing our culture. Written by highly credentialed and respected scholars and technology writers, they all seem to begin by announcing a revolution that is taking place, a level of change that will dramatically (and of course rapidly) affect how we live, how we think, how we interact, or sometimes all of the above. The books present many studies and discuss their ramifications to support their theses, all of which are pretty rational and often good food for thought. 

So far, so good. The problem with these books really appears toward the end of the works. There the authors somehow feel compelled to extensively predict how some particular aspect of digital technology will inevitably transform various aspects of our lives in drastic ways, some of which may enhance our lives and some of which may simply make things more complex, or more artificial, or more alienated than they already are. In most instances, one is left thinking that writing the book got the authors so enmeshed in their own material that they ventured over the edge and into some great unknown by the end of the work. Maybe all these authors/editors need to do is just lop off the last 40 pages.

To show the typical trajectory of such books, let’s take a look at Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution by Jeremy Bailenson, founding director of Stanford’s Virtual Human Interaction Lab, and Jim Blascovich, Distinguished Professor of Psychology at UC Santa Barbara.

First there’s the revolutionary thesis:
“We sit on the cusp of a new world fraught with astonishing possibility, potential, and peril as people shift from face-to-face to virtual interaction. If by 'virtual' one means 'online,' then nearly a third of the world’s population is doing so already.”

Then they announce the intent of the book:
 “In this book, we provide an account of how virtual reality is changing human nature, societies, and cultures as we know them. Our goal is to familiarize readers with the pros and cons of the brave new world in which we live. We explore notions of consciousness, perception, neuroscience, media technology, social interaction, and culture writ large, as they pertain to virtual reality—and vice versa.”

Blascovich and Bailenson are genuinely excited because they see a new frontier of opportunity for the behavioral sciences based on all the data that is becoming available through online social interactions. While the environment may be virtual, the behavior of the individuals is “real.” And virtual interactions can change individual behavior in the real world as well as people’s sense of themselves. Virtual experiences change how people make important decisions and how they react viscerally to real-life situations. Examples of studies and analyses of various data sets take up the majority of the volume and are meant to convince the reader that there is indeed a revolution underway. 

Some of their narratives and analyses are quite interesting and could have some meaningful implications for understanding how and why humans behave in the ways that they do and how therapists can use virtual reality software of various sorts, including explicitly targeted exercises and games, to help people change negative behavior patterns or improve their abilities or performance in some areas.

It’s in the final portion of the book, however, that the real problem arises. And it occurs in an area where so many technology writers get into hot water: Predicting the future. The authors paint a series of semi-plausible scenarios for various activities from market research to legal trial evidence to surgical training as well as physical therapy, airplane-pilot training, even virtual vacations. Some of these things are well on the way to becoming part of our lives (although I myself will probably resist virtual tourism to my dying day). 

Then Bailenson and Blascovich take another small leap into the more distant future that really puts them over the edge. First they discuss the “promise” of avatars, which they say will become “perceptually indistinguishable” from their real counterparts. People will be able to automatically interact with their avatars without the need for hardware or even voice commands so that they may be unaware that they are actually incorporating an avatar into their body. It will be an experience akin to wearing contact lenses. The authors also posit that avatars will “walk among us” so that we might not even realize that the figure approaching us is actually an avatar. This kind of speculation seems to seduce the authors into a discussion of how avatars will change close relationships (which inevitably in such works always seems to devolve into a discussion of virtual sex). 

Equally predictable and mundane are the applications of virtual reality to the domains of religion and education. A couple of examples are enough to illustrate how silly the claims about the future can get:

On religion, the authors write that having a virtual reality experience of Moses’ parting of the Red Sea, including the ability to smell the flopping fishes, would deepen our understanding of the miracle. Frankly, I prefer the Charlton Heston version myself. (And in any event, the land is dry when the Israelites walk through the path, according to the Bible.)

On education, things are even worse when you consider that one of the authors is a psychologist: The authors envision a “virtual tutoring system that will combine virtual reality, nanotech, and artificial intelligence” capabilities and provide the most complete educational experience anyone could hope for. The salient feature of this transition from “physical to digital” learning environments seems to be the elimination of the notably dull textbook in favor of options such as movies and virtual reality programs. This means the children wouldn’t have to read anything at all—a great improvement, as any psychologist will tell you, for the development of the brain and for the learning process itself.

With improvements such as these one can only hope that the virtual revolution is just that—virtual, that is, not for real.

1 Comment

Building Wonderland Bit by Bit

3/30/2013

1 Comment

 
When something is digitized, whether it is some text, an image, a video, or a series of sounds, it becomes broken up into a language made up of just ones and zeros, the universal language known as the binary code of electronic communications. Each letter becomes a series of digits. Every image first becomes a series of pixels, each of which is then translated into a series of digits. In the end the whole audio-visual world can be reduced to an endless series of ones and zeros, and we are swept down a rabbit hole where everything becomes “content,” separated from its forms and often from its context as well. This is the world in which mash-ups are considered high art, and it is also the world in which data, information, and knowledge are jumbled together, morphing into undifferentiated instantiations of the same "content."
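To make the reduction concrete, here is a small Python sketch showing how a letter, a pixel, and a sound sample all end up as strings of ones and zeros; the sample values are arbitrary:

```python
# Everything becomes bits: a letter, a pixel, and a sound sample all reduce
# to strings of ones and zeros once digitized. Sample values are arbitrary.

letter = "A"
letter_bits = format(ord(letter), "08b")         # 'A' -> '01000001'

pixel = (255, 128, 0)                            # one orange pixel (R, G, B)
pixel_bits = "".join(format(c, "08b") for c in pixel)

sound_sample = 20000                             # one 16-bit audio sample
sound_bits = format(sound_sample, "016b")

for label, bits in [("letter", letter_bits),
                    ("pixel", pixel_bits),
                    ("sound", sound_bits)]:
    print(f"{label:>6}: {bits}")
# Once encoded, nothing in the bit strings themselves says which was a poem,
# which a picture, and which a note of music -- they are all "content".
```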

Digitalization is the great leveler of meaning and value in our time. It can make entities seem both discrete and connected at the same time. If I search on Google for “paradise,” the first thing that appears will be an advertisement for the Paradise Rock Club on Commonwealth Avenue in Boston (since I live in the environs), followed by bakeries, a small town in Michigan, pictures of tropical islands, and innumerable stores and restaurants that have adopted the popular name. Occasional Wikipedia entries are scattered about, alluding to another world.

It is only in the middle of the fifth page (does anyone ever go that deeply into a search?) that I finally come across what I was really after: information about Dante’s epic poem Paradise. What’s more, except occasionally for the first entry, all the results appear in the same format accompanied by descriptions of roughly equal length. Rock clubs, tropical islands, and world-class masterpieces—all appear of equal weight when sorted by search engines such as Google or Bing. (Admittedly, Dante’s work appears closer to the top if one searches on “paradiso.”) In the world of “Content” (and let’s not forget “Big Data”), life does indeed seem to be getting, as Alice might observe, "curiouser and curiouser" by the day.


1 Comment

Intimations of Humanity: Words from the Wise at the Start of the Digital Age 

1/30/2013

1 Comment

 
“Our problem today is that we are not well acquainted with the literature of the spirit. We’re interested in the news of the day and the problems of the hour.”  Joseph Campbell, The Power of Myth (1988)

Some of the best cultural observers in the late twentieth century discerned the initial impact of digital computers on our society and tried to remind anyone who would listen of its dangers. Their thoughts help us remember what’s central to living a full human life in this world full of shiny, wonderful gadgets and always, always, the next new thing.  Joseph Campbell’s life spanned a good part of the twentieth century. Born in 1904, this renowned expert in world mythology lived through two world wars, the Depression, the dropping of the atomic bomb, Vietnam, the domestic mess of the sixties, and the relentless encroachment of machines, first the mechanical ones and then the electronic ones, before his death in 1987. Late in life he spent many hours in interviews with Bill Moyers, the cream of which eventually became “The Power of Myth.” A highly popular, deeply interesting set of interchanges gleaned from those conversations aired on PBS soon after Campbell’s death. Subsequently, the entire set of conversations appeared in book form under the same title.

Although Campbell believed that we live in a demythologized world, he found that students around the country were attracted to his lectures in large numbers, mostly, he speculated, because mythology provided messages unlike what ordinary course work at colleges and universities offered in his day. Myths are “stories about the wisdom of life. . . . What we’re learning in our schools is not the wisdom of life. We’re learning technologies, we’re getting information. There’s a reluctance on the part of faculties to indicate the life values of their subjects.”

One major reason for this was increasing specialization, something that has intensified in the twenty-first century. Campbell pointed out that specialization necessarily limits the field in which one considers any problem and tends to eliminate the life values, especially the human and cultural aspects of any specific issue. Generalists, on the other hand, have the advantage of a broader perspective and the ability to make more complex associations and perhaps gain deeper insights as well. They can take something learned in one specialty and relate it to something learned in a different specialty. By so doing, they can discover similar patterns or contradictions or discontinuities that aren’t apparent when one specializes in a narrow field.

Growing specialization and a greater focus on the literal, factual level of life, “the news of the day and the problems of the hour,” have only become more commonplace since the 1980s. Information technologies, with their data gluts, information overloads, knowledge “management,” and, most recently, big data, have put an enormous emphasis on the technologies themselves and have changed the pursuit of knowledge into a process of learning how to access the information one might need to know at some point or other in the future. As a result, the continuum of data, information, knowledge, and wisdom has become jumbled, their meanings confused. Some now describe knowledge as “actionable information.” Others, emphasizing dramatic changes in the state of knowledge due to the Internet, claim that the nature of knowledge has changed fundamentally. Knowledge now resides in networks, they maintain. It can’t possibly reside in an individual's head. In fact, knowledge is probably, in David Weinberger’s words, “too big to know.” As for wisdom, many seem to equate wisdom today with the consensus of a crowd or, even worse, the dynamics of the marketplace.

Like Joseph Campbell, the journalist and medical researcher Norman Cousins lived through the bulk of the twentieth century and observed the onslaught of technology with similar ambivalence and prescience. “The essential problem of man in a computerized age,” he wrote in “The Poet and the Computer” (1990), isn’t any different than it was in previous times. “That problem is not solely how to be more productive, more comfortable, more content, but how to be more sensitive, more sensible, more proportionate, more alive. The computer makes possible a phenomenal leap in human proficiency . . . But the question persists and indeed grows whether the computer makes it easier or harder for human beings to know who they really are, to identify their real problems, to respond more fully to beauty, to place adequate value on life, and to make their world safer than it now is.”

Computers as electronic brains can help enormously in vital research of many sorts, Cousins wrote. “But they can’t eliminate the foolishness and decay that come from the unexamined life. Nor do they connect a man to the things he has to be connected to—the reality of pain in others; the possibilities of creative growth in himself; the memory of the race; and the rights of the next generation.” These things matter, Cousins went on to say, because in the computer age “there may be a tendency to mistake data for wisdom, just as there is a tendency to confuse logic with values, and intelligence with insight.” All of which makes this bright and shiny present and that enchanting next new thing seem quite ephemeral and even trivial in comparison to the really exciting journey of life and the challenge of how to live it fully in the midst of—and perhaps in spite of—all our digital machines.

1 Comment

 Myths for Our Time: Kevin Kelly’s Technium

1/22/2013

1 Comment

 
One day, Carl Jung wrote in a memoir, he suddenly realized that, although he had written extensively about myths and personal transformations, he did not know what myth he himself was living by: “I took it upon myself to get to know ‘my’ myth, and I regarded this as the task of tasks.” In What Technology Wants, former Wired editor and technology writer Kevin Kelly takes on a similar job. He went on his own quest, spending seven years reading and talking to others about what he considers the central personal challenge of our time:  how to understand the “essence” of modern technology and find the appropriate personal relationship to it. What Kelly actually discovered was his own myth, the story he (and many others) grapple with today about the technology that pervades our lives and how to live with it. 

Kelly calls the multitude of technologies that surround us and interact with each other the “technium.” For Kelly, the technium has a life of its own. Because of the countless feedback loops and complex interactions that exist in and between various technologies today, the technium, he claims, has become a sentient, autonomous entity. It represents “the greater, global, massively interconnected system of technology vibrating around us.” More than a set of technologies, the technium has become “a self-reinforcing system of creation,” from which new perspectives, relationships, and influences “emerge.”

What should we make of such large claims? To try to put this theory into perspective, I like to place it within the context of the work Joseph Campbell did with the history of world mythologies. He observed that myths are archetypal stories about the common experiences human beings share. As the stories accumulate, they become a symbolic system that expresses the human condition of a certain time. The images of any given system are drawn from the immediate environment. Thus when a people roam the land in a hunting culture, as the American Plains Indians did, they create myths and rituals concerning the animals; for the Plains Indians, these centered on the buffalo. In an agrarian culture, the myths center on the earth, on seeds, on planting, growing, and harvesting as symbols of birth, life, death, and renewal. Kelly, finding our modern world permeated with machines and their technologies, focuses on the story of those technologies and our relationship to them.

Campbell observed that even in the 1980s machines were finding their way into our mythology. He pointed out that Star Wars explores the problem of whether the machine is going to dominate humanity or serve it. In fact Campbell praised Star Wars as a story of mythic proportion that said “technology is not going to save us. Our computers, our tools, our machines are not enough. We have to rely on our intuition, our true being.” This was the message Obi-Wan Kenobi gave Luke when he told him to turn off his computer and use the force he had within. Campbell believed we needed new myths for modern times. And he thought it would have to be the poets and visionaries who would devise those new myths by listening to the song of the universe and creating new metaphors to express it. “Humanity,” as Campbell reminded his readers and students often, “comes not from the machine but from the heart.”

With his vision of the technium, Kevin Kelly offers a different interpretation of our current state of affairs. Through his own quest, he says, he has learned to listen to the machines of technology for enlightenment. “Seeing our world through technology’s eyes has, for me, illuminated its larger purpose.” Technology, he finds, is a much larger force than we had previously imagined. It is as large as nature itself, and our response to it should be similar to how people have traditionally responded to nature. While in the past people have looked to nature for enlightenment, now they should look to the technium: “We can see more of God in a cell phone than in a tree frog,” Kelly submits.

What’s more, Kelly argues, humans have less and less influence over the collective force of technologies, whose power he traces back to the beginning of the universe: “It follows its own momentum begun at the big bang.” In positing the technium and describing what technology “wants,” Kelly is in effect forging a new myth for our age: Technology is a unifying, evolving entity ever increasing in its power and reach. “Technology is stitching together all the minds of the living, wrapping the planet in a vibrating cloak of electronic nerves, entire continents of machines conversing with one another, the whole aggregation watching itself through a million cameras posted daily. How can this not stir that organ in us that is sensitive to something larger than ourselves?”

Joseph Campbell observed that all living myths, that is, myths that speak to the common human condition at a certain period of time, have one thing in common: They assume some kind of unity that transcends the reality of what we observe in our lives, a unity that connects all. In the transcendent reality, “everything links and accords with everything else.” Kelly’s quest and his illumination are yet another example of humanity’s quest to envision something larger than ourselves. Even if we don’t actually call it something sacred, the attitude of worship nonetheless remains. It certainly emerges very strongly in What Technology Wants.


 
1 Comment
    Cynthia's Blog Plan

    I'll aim to post here a few times a month, based on current events and my ongoing research.