Digital Athena

Checking In . . .

9/7/2015

My beloved blog has languished here while I have attended to some major personal crises. I have not been totally out of touch with the issues of technology and culture that brought me to start this blog and add essays to it a few years back. I am excited to announce that I am writing a book and am doing all the chores such an undertaking entails--not just the research, thinking, and writing for the book but all the usual promotional preparations for getting the work published. 

I'm hoping in the next few weeks to offer a preview of the general outlines of the book and its goals. The working title for now (and I'm rather liking it) is: The Stories We Tell: A Living Mythology for the Digital Age.


Sign up for my RSS feed or send me your email in a comment if you want to be notified when the first installment of my updates on writing the book appears.

Technology and the Human Spirit

5/6/2014

Read Sven Birkerts's classic, elegant book, The Gutenberg Elegies, and you will be enchanted with the prescience and wise insights of this gifted writer as he explores what it could mean to lose the habit of prolonged reading in an age of information. And prepare to be haunted by his stark vision of what he calls "the argument of our time--the argument between technology and the soul." Now twenty years old, this brief meditation on the fate of reading in an electronic age reminds us what we may be losing as we race through the early years of the twenty-first century.

Some contend that the argument is over, the battle decided. Technology has won, they say. It dominates our lives. It dictates how we live, how we work, how we think. The reading chair is empty now. We are tied to our screens, our alerts, our ringtones, our texts.

Twenty years later, the vision Birkerts articulated still hits home, but it does need some tweaking. First of all, I would leave out the soul, which carries too many connotations of specific religious dogmas, especially of an afterlife. Better that we talk about the human spirit with all its physical, mental, and emotional facets. And I think we are now in a different position vis-à-vis the technologies that permeate our lives. It should be less an argument, which suggests loud voices taking definite sides and vigorously debating strong positions, and more of a search. We do struggle with the relationship between technology and the human spirit, but it's more of a dialogue than an argument, more of an exploration of how we shall move forward with all this technology we've created. The real question has become how we shall live our lives under the conditions we face, and how we shall live them well.

This is why we tell each other stories, stories like The Gutenberg Elegies, that help us understand what it's like to live our lives today.



The Human Brain Is a Computer? The Limits of Metaphor

4/26/2014

Metaphors matter. They matter a lot, shaping not only the way we communicate but also how we think, feel, and even behave. George Lakoff and Mark Johnson explained this well in their now classic work, Metaphors We Live By. Their premier example in that book analyzed how the concept "argument" becomes colored by its close association with the metaphor "war." Thus "argument is war." Here are some of the expressions they found that structure how we think about an argument as a war:

Your claims are indefensible.

He attacked every weak point in my argument.

His criticisms were right on target.

He shot down all my arguments.

Essentially, Lakoff and Johnson contend that metaphors impact the way we experience and understand the associated concepts, so that in the case of argument, for example, we in part understand, act, and talk about it in terms of war. It's not a dance. It's not a writing process. It's a battle.

The widespread use of the metaphor of the computer to describe the workings of the human brain today has a similar effect. By using such an analogy, people accept the implication that the human brain is simply a logical device. This leads to statements, and by implication activities, such as the following:

IBM's Blue Brain Project is attempting to reverse-engineer the human brain.

Modern architectural design acknowledges that how buildings are structured influences how people interface.

The position of department head requires an expert multitasker capable of processing multiple projects at any given time.

His behavior does not compute.

Human beings do possess logical functions. But the danger of using the digital computer, which runs algorithms built from conditional operations such as IF, THEN, ELSE, and COPY, as a metaphor for the brain is what it leaves out: messy feelings, ambiguous behaviors, irrational thoughts, and the natural ebb and flow of memories. It also leaves out the influences of our subconscious--and the rest of our physical, organic bodies--on how we think, act, and make decisions. Thinking of the brain as a computer addresses very little about what it feels like to be a human being, very little about what it feels like to be alive.
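
To make concrete just how reductive that picture is, here is a minimal sketch, in Python, of what decision-making looks like when it is nothing but conditional tests and copying. The rules and names are invented purely for illustration:

    # A hypothetical sketch of purely algorithmic "decision making":
    # every outcome follows mechanically from IF/THEN/ELSE tests and the
    # copying of stored values. Rules and names are invented for illustration.
    def decide(stimulus, memory):
        if stimulus in memory:            # IF a pattern is already stored...
            return memory[stimulus]       # THEN copy back the stored response.
        else:                             # ELSE fall through to a default rule.
            memory[stimulus] = "default response"
            return memory[stimulus]

    memory = {"greeting": "smile"}
    print(decide("greeting", memory))     # -> smile
    print(decide("insult", memory))       # -> default response

Nothing in such a procedure is ambiguous, felt, or unconscious, which is precisely what the metaphor leaves out.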

In The Myth of the Machine, Lewis Mumford argued that far too much emphasis has been placed on distinguishing humans from animals by our tool-making capacities. He wrote that there was nothing uniquely human in our tool-making. After all, apes use tools. Rather, it was the human mind, "based on the fullest use of all his bodily organs," and that mind's capacity to create language and use symbols, that allowed human beings to build the social organizations and civilizations that distinguish us from other animals. It was through symbols and language that humans rose above a purely animal state. The ability to create symbols, to be conscious of life and death, of past and future, of tears and hopes, distinguishes humans from other animals far more than any tool-making capability. "The burial of the body tells us more about man's nature than would the tool that dug the grave."

If we continue to distinguish human beings from other animals along the lines of tool-making, Mumford believed, the trajectory would be quite dire:

"In terms of the currently accepted picture of the relation of man to technics, our age is passing from the primeval state of man, marked by his invention of tools and weapons for the purpose of achieving mastery over the forces of nature, to a radically different condition, in which he will have not only conquered nature, but detached himself as far as possible from the organic habitat."

So we need to be careful about using the metaphor of the computer, our most modern of tools, to describe our minds and what it means to be a human being.


Creatures of the Screen, or Heroes in Life?

2/11/2014

While many are happily seduced by the wonders and innovations of our contemporary high-tech life, others see danger lurking in our ever-growing reliance on digital technology. Nicholas Carr has a solid piece in a recent Atlantic Monthly about the hazards of progressive automation. One major development he explores is the unintended consequences of airplane autopilot systems. Carr discusses two recent fatal crashes: a Continental Connection commuter flight between Newark and Buffalo that killed all 49 passengers and crew, and an Air France flight from Rio de Janeiro to Paris that crashed into the Atlantic, killing all 228 on board. In both cases the autopilot disconnected, forcing the pilot to take control. And in both cases the pilots reacted by taking the wrong action, causing their planes to lose airspeed, stall, and crash. So it seems that while autopilot systems have contributed to greater air safety over time, they have also contributed to pilot errors and new types of accidents.
 
Studies show that pilots, and others whose work has been largely automated, become complacent. Workers develop a kind of blind confidence that computers will operate perfectly, an attitude that fails to acknowledge the danger that increasingly complex computer systems, as they interact with each other, may malfunction. Workers, in effect, become computer monitors, Carr argues. They become less aware of the processes they oversee and often less attentive to the tasks they actually have to do. Automation can also make workers just plain rusty in performing ordinary tasks so that, when the computer system malfunctions or fails, they make mistakes. Skills decline when they go unpracticed, and workers can actually forget how jobs are supposed to be done. “Knowing,” Carr reminds us, “requires doing.” By separating workers from the work, ends are achieved without workers grappling with the means. “Computer automation severs the ends from the means,” Carr explains. And he claims “it’s the work itself—the means—that make us who we are.”

Automation, in effect, changes who we are. We become passive, unengaged “creatures of the screen.” I recall the overwhelming public embrace of Chesley “Sully” Sullenberger after his spectacular and highly skilled landing of a US Airways plane on the Hudson, which saved the lives of all 155 people on board. He was proclaimed a “hero” and showered with honors. But I do not think it was simply “The Miracle on the Hudson” that drew people’s attention to him and made him into a popular hero. Rather, it was the back story—the story of how he had been a strong advocate for safety all his life, how he maintained his own skills and practiced alertness. He understood the limitations of the automated systems he used, and above all he worked hard to live with the integrity, humility, and value system that defined his life and his work. The reviewer of Sully’s autobiography in The Washington Post summed up public perception well:

“Sullenberger’s all-American life story is so compelling that it screams to be required reading for all young people, or anybody else who needs confirmation that courage, dignity and extraordinary competence can still be found in this land.... [A] remarkable life story.”

Carr’s question in the end is the right one: “Does our essence still lie in what we know, or are we now content to be defined by what we want?” Are we to become “creatures of the screen,” or are we to maintain our full humanity, each of us heroes in our own way, by continuing to know and to learn by doing, rather than letting our machines work on the assumption that the human being is probably the weakest link in any given system?


Ray Kurzweil's Mind

1/23/2014

Ray Kurzweil incessantly dreams of the future. And it's a future he describes as a "human-machine civilization." In How to Create a Mind: The Secret of Human Thought Revealed, Kurzweil looks forward to a time when technology will have advanced to the point where it will be possible to gradually replace all the parts of the body and brain with nonbiological parts. And he claims that this will not change people's identities any more than the natural, gradual replacement of the cells in our bodies does now. All this will come about after scientists and engineers, who are currently working on brain models in many different organizations and areas of the world, succeed in creating a complete model of the human brain. Kurzweil contends that the neocortex functions hierarchically and that it works by pattern recognition. Therefore, he argues, it is possible to write algorithms that will simulate how the brain actually works. That, in combination with increasing miniaturization, will make such substitution of nonbiological components possible by the 2030s.
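
For readers who want a feel for what hierarchical pattern recognition means in practice, here is a minimal sketch in Python. It illustrates only the general idea (layers of recognizers feeding their outputs upward), not Kurzweil's actual model; the levels and patterns are invented:

    # Illustrative only: low-level recognizers detect simple features and pass
    # their results up to higher-level recognizers, echoing Kurzweil's account
    # of the neocortex. The structure and patterns here are invented.
    def recognize_letters(signal):
        # Lowest level: extract letter patterns from a raw input stream.
        return [ch for ch in signal if ch.isalpha()]

    def recognize_words(letters, lexicon):
        # Middle level: group letter patterns into known word patterns.
        text = "".join(letters)
        return [word for word in lexicon if word in text]

    def recognize_phrase(words):
        # Top level: fire only when the expected lower-level patterns appear.
        return "greeting detected" if {"hello", "world"} <= set(words) else None

    lexicon = {"hello", "world"}
    letters = recognize_letters("h-e-l-l-o w-o-r-l-d!")
    print(recognize_phrase(recognize_words(letters, lexicon)))  # greeting detected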

That the human brain is akin to a digital computer is still a big and very contentious issue in neuroscience and cognitive psychology circles. In the January issue of Scientific American, Yale professor of psychology John Bargh summarizes some of the latest thinking about this problem. Specifically, he addresses the major role of the unconscious in how people make decisions, how they behave in various situations, and how they perceive themselves and the world around them. There is a complex dynamic between our controlled conscious thought processes and the unconscious, often automatic, processes of which we are not aware. Nobelist Daniel Kahneman explained this phenomenon in Thinking, Fast and Slow. Automatic thought processes happen quickly and do not include planning or deliberation.

Even Daniel Dennett, an eminent philosopher and cognitive scientist who long held that neurons function as simple on-off switches, making them logical switches similar to digital bits, has recently changed his mind about the analogy of the human mind to a computer: "We're beginning to come to grips with the idea," he says in a recent Edge talk, "that your brain is not this well-organized hierarchical control system where everything is in order. . . . In fact, it's much more like anarchy. . . ." Yet even with this concession Dennett is still inclined to use the computer as a metaphor for the human brain. This leads him to make a curious statement, one that actually begs the question: "The vision of the brain as a computer, which I still champion, is changing so fast. The brain's a computer, but it's so different from any computer you're used to. It's not your desktop or your laptop at all."

By his own admission, Dennett's talk is highly speculative: "I'd be thrilled if 20 percent of it was right." What I think he means is that the brain is like a computer that is far more complex than existing machines, but one that also has intention. The neurons are "selfish," and they are more like agents than computer instructions, which in turn are more like slaves. "You don't have to worry about one part of your laptop going rogue and trying out something on its own that the rest of the system doesn't want to do." Computers, on the other hand, are made up of "mindless little robotic slave prisoners." So I'm not sure how helpful it is for Dennett to think of the brain as a computer at all. And Dennett's views on neurons and agents, combined with the more recent thinking about the impact of the unconscious on conscious thought, lead me to conclude that Ray Kurzweil's dream of someday replacing the human brain with robotic switches is just that: a dream.

James Gleick's Dark Journey: Searching for Vital Experience in a Virtual World

1/13/2014

James Gleick’s The Information: A History, A Theory, A Flood undertakes the enormous task of narrating the cultural, technical, and theoretical approaches to information over many centuries, from the basic binary system used by African drummers in ancient times through forms of writing and printing. He discusses the major developments of the nineteenth and twentieth centuries, including Morse and the telegraph, Babbage and the analytical engine, and gives special emphasis to the theories of Shannon and Turing and those who followed in their footsteps as theorists began to think about information in more quantitative ways.
 
Freeman Dyson, in his review of Gleick’s book for The New York Review of Books, observes that this quantification of information actually blurs the line between information and data. In discussing the consequences of Moore’s Law for the growth of cheaper and more capacious information storage, Dyson notes that in 1949 Shannon constructed a table of the various existing stores of memory. The largest store in Shannon’s table was the US Library of Congress, estimated to contain one hundred trillion bits of information. “Today,” Dyson says, “a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world.” Dyson in effect implies that when you think about information as bits to be stored and manipulated, you are in truth thinking more about discrete pieces of data than about what is commonly thought of as information.
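
The arithmetic behind Dyson's observation is worth making explicit (a back-of-the-envelope sketch; the one-hundred-trillion-bit figure is Shannon's estimate as Dyson reports it):

    # Converting Shannon's 1949 estimate for the Library of Congress into
    # familiar storage units. Illustrative arithmetic only.
    bits = 100e12                # one hundred trillion bits
    bytes_ = bits / 8            # 1.25e13 bytes
    terabytes = bytes_ / 1e12    # decimal terabytes
    print(terabytes)             # 12.5 -- a handful of ordinary disk drives today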

Gleick’s book guides us cogently and neutrally through the vast history and theory of information. Yet as he approaches the end he shares more personal insights and experiences with his readers. As he contemplates today's exponential growth in the US Library of Congress’s store of information, Gleick expresses the confusion and anxiety such massive growth and amassing of data produce in one’s sense of identity and experience of life: “As the train hurtled onward, its passengers sometimes felt the pace foreshortening their sense of their own history. Moore’s law had looked simple on paper, but its consequences left people struggling to find metaphors with which to understand their experience.” One familiar metaphor, he suggests, is “the cloud”: “All that information—all that information capacity—looms over us, not quite visible, not quite tangible, but awfully real; amorphous, spectral; hovering nearby, yet not situated in any one place. Heaven must once have felt this way to the faithful.”

Many today express wonder and awe at the vast network of interconnectedness in the nodes of the Internet. But what really is the nature of those connections and their structure? “The network has a structure,” Gleick muses, “and that structure stands upon a paradox. Everything is close, and everything is far, at the same time. This is why cyberspace can feel not just crowded but lonely. You can drop a stone into a well and never hear a splash.” Gleick ends his massive work on a note of gloomy uncertainty about the whole phenomenon. Using the analogy of the library for the Internet, he concludes: “We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence, reading the history of the past and of the future, collecting our thoughts and collecting the thoughts of others, and every so often glimpsing mirrors, in which we may recognize creatures of information.” All this leaves us wondering if there is anything real and vital in that virtual universe.


The Unbelievers: Spiritual Experience without God

12/17/2013

Two Heroes Offer Lessons from Their Personal Journeys 

Modern heroes don’t slay monsters who roam the world threatening the innocent and the vulnerable. Nor do they rescue damsels in distress. Unlike folk heroes or action figures, the true modern heroes, as the mythographer Joseph Campbell described them, undertake inward journeys, moving beyond their personal limitations to experience some inspiration or vision. In so doing, they gain a different understanding of what it means to live a human life. Such experiences resonate deeply within their own spirits, changing how they think and feel. In effect, the experience is a transformative one. Once a hero has taken such a journey, the next task is to return to the world and tell others about it.

In a new documentary out this week, “The Unbelievers,” scientists Richard Dawkins, professor emeritus of the public understanding of science at Oxford University, and Lawrence Krauss, a cosmologist from Arizona State University, travel from Australia to England—and many places in between—to share the insights they have learned from their lifetimes of academic research and writing. They are just two of the leaders of an emerging group of scientists and other intellectuals known as the “new atheists.” In mythical terms they are on a mission to proclaim that contemporary science, not religion, gives us the better vision about what our universe is like and what the nature of the human condition is within that universe. 

Dawkins and Krauss tell their listeners that a universe can arise from nothing, needing neither a god nor some miracle to explain its existence. And, in the general scheme of things, human beings are quite insignificant. The two scientists suggest that instead of turning to a god to give their lives meaning, people should find their own meaning and savor their lifetimes. Dawkins in particular suggests that people can look to the scientific discoveries of our time for inspiration about the world around us: “Science is wonderful; science is beautiful,” Dawkins proclaims. “Religion is not wonderful; it is not beautiful. It gets in the way.”

Joseph Campbell also thought that science could indeed offer a “far more marvelous, mind-blowing revelation than anything the pre-scientific world could ever have imagined.” In fact, he said, the discoveries of science make the stories of the Bible and other ancient religions look childlike and tame by comparison. He also thought that the dogma of established religion obscured the true function of religion as a living myth, that is, to give a person a direct experience of the rapture of what it feels like to be alive. 

For Campbell, the question was whether science and technology would eventually make religion and magic fade away, or whether science might someday help us better understand how the symbolic forms of myth arise from the human spirit. He speculated that psychology (and no doubt neuroscience should be added to this now) could help us see why our common dreams become public myths, myths that can have positive, life-furthering ends. On the other hand, Campbell thought that it would be artists—fiction writers and poets, painters and sculptors, musicians and architects—who would articulate the contemporary vision. They, not the scientists, would interpret the symbols and create the stories that would help human beings live a good life under any circumstances.

Today, the scientists are making similar claims for their own domain. They say they can help us understand who we are in this universe. And in some senses they can, but there is a difference. Scientists don't deal with symbols, metaphors, and stories. They deal with facts. Facts are processed through the intellect, whereas symbols, metaphors, and stories affect people at a deeper—dare we say spiritual?—level. Paradoxically, though, the unbelievers, the atheists, are more genuinely participating in building a living myth for today’s world than are the proponents of the ancient religions. This is especially true of the ancient Western religions, whose traditional sacred texts reflect far more foreign and primitive views of who we are as human beings and where we stand in this grand universe of ours.


Google Gives Many a Sense of Special Powers

12/6/2013

If anyone doubts there are signs of a living myth emerging from the culture in which we live today, all they need do is read some of the scientific research and the works of other leading writers and academics. The current literature on technology and culture is bursting with references to the various ways in which the Internet is affecting our consciousness. And as the pre-eminent myth expert Joseph Campbell pointed out, one hallmark of a living myth, something that “hits you where it counts,” involves the transformation of consciousness in one way or another, depending on the time and circumstances in which the myth develops. Today, with technology permeating our experience, it seems natural that it would play a part in how we express what it means to be human.

Researchers have found that when people use Google to answer trivia questions that others pose, they gain an elevated sense of their own knowledge. Daniel Wegner and Adrian Ward write in this month’s Scientific American that the near-instantaneous speed with which the Internet returns results to our screens may even lead people to consider the vast amount of information on the Internet an extension of their own personal memories.
 
In their study, the authors divided the participants into two groups: one group could use Google to answer trivia questions posed by testers, and the other group answered the questions without access to the search engine. To make sure that the people without access had a sense of success similar to those who used Google, the group without access was sometimes told they were correct when they had not in fact given the right answer. Even with such controls, the group that used Google came away with the illusion that their own mental capacities were stronger; the group without access to Google did not.

The authors point out a telling irony of this information age: we have a generation of people now who think they know more than previous generations, although their habitual use of the Internet for searching for information actually indicates that they may know even less about the world around them than their forebears. As is usual with this type of generalization, unfortunately, the authors end their article with a highly speculative set of musings. Perhaps, they posit, people are becoming part of the “Intermind,” as they call the blending of individual minds with the Internet. And, liberated from the necessity of remembering mere facts, people may find their minds have more energy for creativity, allowing them to transcend the current limits of their memory and thought processes. Wegner and Ward conclude: “We are simply merging the self with something greater, forming a transactive partnership not just with other humans but with an information source more powerful than any the world has ever seen.” This kind of leap, from factual research to speculative, visionary proclamations of major changes in our psychological experience and sense of self, is becoming more and more common among otherwise well-credentialed, well-respected writers. It reflects the need to believe that the changes wrought by our experience of a highly technologized world are fundamental and indeed mind-altering in their nature.


Cyberspace: Lost on a Dark Journey

11/22/2013

Earlier this month, The New York Times Book Review published a special issue devoted to technology and its effects on our lives and our books. The editors asked a group of writers for their take on how the Internet had changed the art of storytelling. Several writers emphasized that they tried to express a sense of modern fragmentation, of some loss of a sense of a whole, as if the continuity of narrative and the idea of life as a journey had been obscured in our current life. Others expressed the need to rediscover the sense of mystery that is at the bottom of what we call life. They believe that the role for writers is to dig deeper into the mysteries and wonder of life, even in this age of technology, when there is so much superficial activity available to us that our experience easily becomes disrupted and meaning of any sort becomes elusive. Still others note that although corporate economics constantly tries to attract us with yet another novel technological gadget or twist, the truly successful technologies are those that resonate with the basic experience we have as human beings.

Writer Ander Monson turns all this on its head in an interesting way. Being incessantly bombarded by small bits of narrative, he says, is to “experience the past . . . the distant, darkened past” in the sense that one feels palpably what it was like to be in a labyrinth such as the one Daedalus, according to Greek myth, built for the monster known as the Minotaur. It provides an ancient analogy for the experience of “trying to find the line of ascent in a wall of information; the trail of URLs I click through in my morning’s misinforming.” In current terms, then, the labyrinth becomes the Internet itself and its endless information.

Pondering this brings him round to the fundamental experience of our contemporary lives: “It’s dark down here,” Monson writes, “and lonely. I am drawn mostly, insistently to the human voice. How powerful and necessary the solo voice, the experience of being someone, something else for a little while.” Expressing this experience, Monson declares, will remain what he calls “literature’s killer app,” because the act of writing about it is concerned with words and hence “impervious to the threat by everything that’s not the word.”

It is a journey into the darkness not unlike the one T.S. Eliot described in “East Coker” as he recounted his own battle with writing:

“And so each venture
Is a new beginning, a raid on the inarticulate . . . 
And what there is to conquer
By strength and submission, has already been discovered
Once or twice, or several times, by men whom one cannot hope
To emulate . . .
There is only the fight to recover what has been lost
And found and lost again and again: and now, under conditions
That seem unpropitious.”

Unpropitious indeed are our times. Yet it is heartening to see these writers probing to find the common threads of our experience and trying to express what it means to be human today.


Big Data: Cold Water from the New York Times

8/21/2013

James Glanz, an investigative reporter for the Times, brings some hard facts and dissenting opinions to bear on the current big claims about Big Data as “the new oil” of our economy. Glanz cites Northwestern economics professor Robert Gordon, who says that comparing the economic impact of Big Data to that of oil in the late nineteenth and early twentieth centuries is simply a silly form of exaggeration: “Gasoline made from oil made possible a transportation revolution as cars replaced horses and as commercial air transportation replaced railroads. If anybody thinks that personal data are comparable to real oil and real vehicles, they don’t appreciate the realities of the last century.”

Nor does the parallel to the rise of the electricity grid carry much weight with some. In terms of numbers, the comparison is certainly tempting: from 2005 to 2012 the volume of data on the Internet increased 1,696%. But the revolutions that the unleashing of electricity produced in manufacturing processes, ways of daily living, and transportation have no match in the rise of “Big Data” to date. In fact, during the very period that has seen the increase in Big Data, we have experienced a lackluster economy in which productivity, which had risen largely due to automation from the 1970s through the start of the 2000s, has actually shrunk. Productivity growth decreased 1.8% annually from 2005 to 2012.
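
To put that growth figure in perspective, a quick back-of-the-envelope calculation helps (the 1,696% figure comes from the comparison above; the conversion to a multiplier and an annual rate is mine):

    # Unpacking the cited growth figure: a 1,696% increase means the end
    # volume is roughly 18 times the starting volume. Illustrative only.
    gain = 16.96                          # 1,696% expressed as a fraction
    multiplier = 1 + gain                 # ~17.96x growth from 2005 to 2012
    years = 7
    annual = multiplier ** (1 / years) - 1
    print(round(multiplier, 1))           # 18.0
    print(f"{annual:.0%}")                # ~51% compound growth per year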

In part, such outlandish claims are one of the hazards of predicting the future, always a difficult if not impossible task to get right. Yet they may also borrow something from the spirit of the times: we have grown so accustomed to enormous, “revolutionary” sorts of changes in the last two decades that some believe the sheer size of the growth rate in Big Data must signal something equally unprecedented and huge on the horizon. Yet in the end some economists have observed that the new analytics, which companies use to mine Big Data, just allows those companies to cannibalize the customer base of their competitors or make the case for digital advertising over print and other traditional media even stronger. It creates incremental or sideways changes, not revolutions in the economy. And still others muse that the current framework for the use of Big Data may be just plain wrong. In the end, they posit, the context in which our futurists have placed Big Data, like cloud computing, may end up simply being “a mirage.” Big Data and cloud computing may be incorporated into our economic and business practices in ways we have yet to even envision. I think we’ll just have to wait and see on this one . . .


    Cynthia's Blog Plan

    I'll aim to post here a few times a month, based on current events and my ongoing research.