Digital Athena

Technology and the Human Spirit

5/6/2014

Read Sven Birkerts's classic, elegant book, The Gutenberg Elegies, and you will be enchanted with the prescience and wise insights of this gifted writer as he explores what it could mean to lose the habit of prolonged reading in an age of information. And prepare to be haunted by his stark vision of what he calls "the argument of our time--the argument between technology and the soul." Now twenty years old, this brief meditation on the fate of reading in an electronic age reminds us what we may be losing as we race through the early years of the twenty-first century.

Some contend that the argument is over, the battle decided. Technology has won, they say. It dominates our lives. It dictates how we live, how we work, how we think. The reading chair is empty now. We are tied to our screens, our alerts, our ringtones, our texts.

Twenty years later, the vision Birkerts articulated still hits home, but it does need some tweaking. First of all, I would leave out the soul, which carries too many connotations of specific religious dogmas, especially of an afterlife. Better that we talk about the human spirit, with all its physical, mental, and emotional facets. And I think we're now in a different position vis-à-vis the technologies that permeate our lives. It should be less an argument, which suggests loud voices taking definite sides and vigorously debating strong positions, and more of a search. We do struggle with the relationship between technology and the human spirit, but it's more of a dialogue than an argument, more of an exploration of how we shall move forward with all this technology we've created. The real question has become how we shall live our lives under the conditions we face, and how we shall live them well.

This is why we tell each other stories, stories like The Gutenberg Elegies, that help us understand what it's like to live our lives today.



Creatures of the Screen, or Heroes in Life?

2/11/2014

While many are happily seduced by the wonders and innovations of our contemporary high-tech life, others see danger lurking in our ever-growing reliance on digital technology. Nicholas Carr has a solid piece in a recent Atlantic Monthly about the hazards of progressive automation. One major development he explores is the unintended consequences of airplane autopilot systems. Carr discusses two recent fatal crashes: a Continental Connection commuter flight between Newark and Buffalo that killed all 49 passengers and crew, and an Air France flight from Rio de Janeiro to Paris that crashed into the Atlantic, killing all 228 on board. In both cases, the autopilot disconnected, forcing the pilot to take control. And in both cases the pilots reacted by taking the wrong action, causing their planes to stall and crash. So it seems that while autopilot systems have contributed to greater air safety over time, they have also contributed to pilot errors and new types of accidents.
 
Studies show that pilots, and others whose work has been largely automated, become complacent. Workers develop a kind of blind confidence that computers will operate perfectly, an attitude that fails to acknowledge the danger that increasingly complex computer systems, as they interact with each other, may malfunction. Workers, in effect, become computer monitors, Carr argues. They become less aware of the processes they oversee and often less attentive to the tasks they actually have to do. Automation can also make workers just plain rusty at performing ordinary tasks, so that, when the computer system malfunctions or fails, they make mistakes. Skills decline when they go unpracticed, and workers can actually forget how jobs are supposed to be done. “Knowing,” Carr reminds us, “requires doing.” By separating workers from the work, ends are achieved without workers grappling with the means. “Computer automation severs the ends from the means,” Carr explains. And he claims “it’s the work itself—the means—that make us who we are.”

Automation, in effect, changes who we are. We become passive, unengaged “creatures of the screen.” I recall the overwhelming public embrace of Chesley “Sully” Sullenberger after his spectacular and highly skilled landing of a US Airways plane on the Hudson, which saved the lives of all 155 people on board. He was proclaimed a “hero” and showered with honors. But I do not think it was simply “The Miracle on the Hudson” that drew people’s attention to him and made him into a popular hero. Rather it was the back story: how he had been a strong advocate for safety all his life, how he had maintained his own skills and practiced alertness, how he understood the limitations of the automated systems he used, and, above all, how he had worked hard to live with the integrity, humility, and value system that defined his life and his work. The reviewer of Sully’s autobiography in The Washington Post summed up public perception well:

“Sullenberger’s all-American life story is so compelling that it screams to be required reading for all young people, or anybody else who needs confirmation that courage, dignity and extraordinary competence can still be found in this land.... [A] remarkable life story.”

Carr’s question in the end is the right one: “Does our essence still lie in what we know, or are we now content to be defined by what we want?” Are we to become “creatures of the screen,” or are we to maintain our full humanity, each of us heroes in our own way, by continuing to know and to learn by doing, rather than letting our machines work on the assumption that the human being is probably the weakest link in any given system?


Virtual Pull over the Edge

5/17/2013

Why is digital technology so exciting? Why is all the fast change that technology brings presented as inevitable and wonderful? There’s an eerily similar narrative that so many books follow these days as they recount the breakthroughs in technology and how they are changing our culture. Written by highly credentialed and respected scholars and technology writers, they all seem to begin by announcing a revolution that is taking place, a level of change that will dramatically (and of course rapidly) affect how we live, how we think, how we interact, or sometimes all of the above. The books present many studies and discuss their ramifications to support their theses, all of which are pretty rational and often good food for thought.

So far, so good. The problem with these books really appears toward the end. There the authors somehow feel compelled to predict at length how some particular aspect of digital technology will inevitably transform various aspects of our lives in drastic ways, some of which may enhance our lives and some of which may simply make things more complex, or more artificial, or more alienated than they already are. In most instances, one is left thinking that writing the book got the authors so enmeshed in their own material that they ventured over the edge and into some great unknown by the end of the work. Maybe all these authors and their editors need to do is just lop off the last forty pages.

To show the typical trajectory of such books, let’s take a look at Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution by Jeremy Bailenson, founding director of Stanford’s Virtual Human Interaction Lab, and Jim Blascovich, Distinguished Professor of Psychology at UC Santa Barbara.

First there’s the revolutionary thesis:
“We sit on the cusp of a new world fraught with astonishing possibility, potential, and peril as people shift from face-to-face to virtual interaction. If by 'virtual' one means 'online,' then nearly a third of the world’s population is doing so already.”

Then they announce the intent of the book:
 “In this book, we provide an account of how virtual reality is changing human nature, societies, and cultures as we know them. Our goal is to familiarize readers with the pros and cons of the brave new world in which we live. We explore notions of consciousness, perception, neuroscience, media technology, social interaction, and culture writ large, as they pertain to virtual reality—and vice versa.”

Blascovich and Bailenson are genuinely excited because they see a new frontier of opportunity for the behavioral sciences based on all the data that is becoming available through online social interactions. While the environment may be virtual, the behavior of the individuals is “real.” And virtual interactions can change individual behavior in the real world as well as people’s sense of themselves. Virtual experiences change how people make important decisions and how they react viscerally to real-life situations. Examples of studies and analyses of various data sets take up the majority of the volume and are meant to convince the reader that there is indeed a revolution underway. 

Some of their narratives and analyses are quite interesting and could have some meaningful implications for understanding how and why humans behave in the ways that they do and how therapists can use virtual reality
software of various sorts, including explicitly targeted exercises and games, to help people change negative behavior patterns or improve their abilities or performance in some areas. 

It’s in the final portion of the book, however, that the real problem arises. And it occurs in an area where so many technology writers get into hot water: predicting the future. The authors paint a series of semi-plausible scenarios for various activities, from market research to legal trial evidence to surgical training, as well as physical therapy, airplane-pilot training, even virtual vacations. Some of these things are well on the way to becoming part of our lives (although I myself will probably resist virtual tourism to my dying day).

Then Bailenson and Blascovich take another small leap into the more distant future that really puts them over the edge. First they discuss the “promise” of avatars, which they say will become “perceptually indistinguishable” from their real counterparts. People will be able to automatically interact with their avatars without the need for hardware or even voice commands so that they may be unaware that they are actually incorporating an avatar into their body. It will be an experience akin to wearing contact lenses. The authors also posit that avatars will “walk among us” so that we might not even realize that the figure approaching us is actually an avatar. This kind of speculation seems to seduce the authors into a discussion of how avatars will change close relationships (which inevitably in such works always seems to devolve into a discussion of virtual sex). 

Equally predictable and mundane are the applications of virtual reality to the domains of religion and education. A couple of examples are enough to illustrate how silly the claims about the future can get:

On religion, the authors write that having a virtual reality experience of Moses parting the Red Sea, including the ability to smell the flopping fishes, would deepen our understanding of the miracle. Frankly, I prefer the Charlton Heston version myself. (And in any event, the land is dry when the Israelites walk through the path, according to the Bible.)

On education, things are even worse when you consider that one of the authors is a psychologist. The authors envision a “virtual tutoring system that will combine virtual reality, nanotech, and artificial intelligence” capabilities and provide the most complete educational experience anyone could hope for. The salient feature of this transition from “physical to digital” learning environments seems to be the elimination of the notably dull textbook in favor of options such as movies and virtual reality programs. This means the children wouldn’t have to read anything at all—a great improvement, as any psychologist will tell you, for the development of the brain and for the learning process itself.

With improvements such as these one can only hope that the virtual revolution is just that—virtual, that is, not for real.


Myths of Our Time: Why the Web Attracts Us

4/4/2013

The nonstop chatter about the power of the Internet has reached a level of cacophony that is hard to dismiss or even ignore. All sorts of people are fascinated with its ever-growing size, with its ubiquity, with its endless variety, and, of course, with its “promise.” Depending on whom you listen to, the Web promises specific transformative powers, from the spread of democracy to the end of history, from equal access to all information to a repository of the sum of human knowledge that is utterly unfathomable in its size and breadth. The Web’s decentralized structure itself has even become a model for all sorts of peer-to-peer networks, from which “collective intelligence” will emerge to solve many of the world’s thorniest problems. It’s enough to make you think that surely the cultural equivalent of the second coming is at hand.

The Web has indeed captured the general imagination, but what is actually emerging is a common living mythology for our time. A major component of any mythology is power of epic proportions, recounted in larger-than-life stories. Today’s ever-expanding Internet of website nodes offers a story of the power of technology and may well provide a symbol that helps us understand the experience of what it is like to live today.

In ancient times, humans constructed mythological symbols out of their physical environment and their way of life. In the hunting and herding societies, animals played key roles in the myths and rituals. In the agrarian societies, the planting cycle provided the focus for myths. So it is not surprising that, in a society woven through with various digital technologies that have changed the way we live and work—as well as the way we think and interact with others—that technology should find a central place in the myths of our day.

And by myths here I mean living, vital myths, which are neither true nor false but through their symbols speak directly to what it means to experience life at a particular time in history. The famous mythographer Joseph Campbell (1904-1987) spent his life studying, writing, and teaching others about the world of mythologies. Whereas dreams are private myths, Campbell would say, myths are public dreams. Mythologies are really stories that contain archetypal symbols, symbols that have been used in countless inflections throughout the mythologies of the world.

Campbell found remarkably similar and detailed stories of deaths and resurrections, virgin births, heroes’ journeys, and many other images and narratives. Like Jung and many others, Campbell emphasized that these similarities existed and resonated with so many people throughout the ages because myths originate in the unconscious. They are biologically grounded in the psyche, which Campbell defined as “the inward experience of the human body, which is essentially the same in all human beings, with the same organs, the same instincts, the same impulses, the same conflicts, the same fears.”

So one key to the Web’s powerful attraction for many people, as they approach it from different angles with varying interpretations and emphases, is that the web, which is sometimes called a net, is itself an archetypal symbol that recurs in other cultures and mythologies as a metaphor for, among other things, interconnectedness. One classic mythical symbol is the Hindu “Net of Indra,” or “Net of Gems.” The Net of Indra is an infinite net that contains a gem at every crossing of one thread with another. Each gem reflects all the other gems. Everything is interrelated, and everything that occurs does so in relation to everything else. Campbell sees a similar insight in the nineteenth-century philosopher Arthur Schopenhauer’s idea about the shape of an individual’s life. Schopenhauer observed that toward the end of your life you can look back and see a consistent order, a plan, to it. People you seem to have met by chance become important agents in the structure of your life. And you too have unintentionally served a similar role in the lives of others, so that one gathers a larger vision of the unfolding of life “like one big symphony, with everything unconsciously structuring everything else.” Schopenhauer wrote that it is as if a single dreamer were dreaming a dream in which all the characters dream as well, so that everything links to everything else. James Joyce developed a similar theme in his final work, Finnegans Wake.

Campbell also told an American Indian story in which the web again plays a central role in conveying the idea of interconnection. Chief Seattle wrote to the President of the United States in 1852 in response to an inquiry from the government about buying tribal lands to accommodate new influxes of immigrants from Europe. The basic theme was the interdependence of all of nature: “But how can you buy or sell the sky? The land? The idea is strange to us. If we do not own the freshness of the air and the sparkle of the water, how can you buy them? . . . The earth does not belong to man, man belongs to the earth. . . . Man did not weave the web of life, he is merely a strand in it. Whatever he does to the web, he does to himself.”

Power, light, vastness, interconnection, transcendence beyond the visible—all these characteristics of the web converge in these various images. And this may explain why our electronic web—with its pulsing light, expanding toward some unknown and unseen space, connecting countless people, institutions, and sources of information through an endless array of light-emitting nodes—captures the imagination of so many today as they dream the dream of life in the here and now.

As Joseph Campbell always maintained, myths may evoke mystery and awe, their symbols leading forward. They point to clues of the spiritual potentialities of human life. While Campbell said it was impossible to predict what the next mythology might be, any more than it’s possible to predict what one might dream on any given night, he did believe that any new myth would have to take into account the planet as a whole and include the machines of our modern life. This is the focus of our mythology: the story of the progress of technology, with the computer engineers as our magicians and the web as the source of all knowledge, both our Delphic oracle and the symbol of the interconnected nodes of the human race.


Building Wonderland Bit by Bit

3/30/2013

When something is digitized, whether it is some text, an image, a video, or a series of sounds, it is broken down into a language made up of just ones and zeros, the universal language known as the binary code of electronic communications. Each letter becomes a series of digits. Every image first becomes a series of pixels, each of which is then translated into a series of digits. In the end the whole audio-visual world can be reduced to a vast series of ones and zeros, and we are swept down a rabbit hole where everything becomes “content,” separated from its forms and often from its context as well. This is the world in which mash-ups are considered high art, and it is also the world in which data, information, and knowledge are jumbled together, morphing into undifferentiated instantiations of the same “content.”
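For readers curious what this reduction actually looks like, here is a minimal sketch of my own (not from any of the books discussed) showing how a letter and a pixel each become a series of ones and zeros in Python:

```python
# A minimal sketch of digitization: a letter and a pixel reduced to ones and zeros.

def to_bits(byte_values):
    """Render each byte value (0-255) as an eight-digit binary string."""
    return [format(b, "08b") for b in byte_values]

# Each letter becomes a series of digits (here, via its UTF-8 byte value).
print(to_bits("A".encode("utf-8")))   # ['01000001']

# Each pixel is first a set of color intensities, each then translated into digits.
orange_pixel = (255, 128, 0)          # red, green, blue values
print(to_bits(orange_pixel))          # ['11111111', '10000000', '00000000']
```

Text, image, or sound, the computer sees only the same undifferentiated stream of bits; everything else about the original, its form and its context, lives in how software chooses to interpret them.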

Digitization is the great leveler of meaning and value in our time. It can make entities seem both discrete and connected at the same time. If I search on Google for “paradise,” the first thing that appears will be an advertisement for the Paradise Rock Club on Commonwealth Avenue in Boston (since I live in the environs), followed by bakeries, a small town in Michigan, pictures of tropical islands, and innumerable stores and restaurants that have adopted the popular name. Occasional Wikipedia entries are scattered about, alluding to another world.

It is only in the middle of the fifth page (does anyone ever go that deeply into a search?) that I finally come across what I was really after: information about Dante’s epic poem Paradise. What’s more, except occasionally for the first entry, all the results appear in the same format, accompanied by descriptions of roughly equal length. Rock clubs, tropical islands, and world-class masterpieces—all appear of equal weight when sorted by search engines such as Google or Bing. (Admittedly, Dante’s work appears closer to the top if one searches on “paradiso.”) In the world of “Content” (and let’s not forget “Big Data”), life does indeed seem to be getting, as Alice might observe, “curiouser and curiouser,” by the day.



Is Our Digital Future Inevitable or Do We Have Options?

12/10/2012

Back to my blog after some professional and personal interruptions. I thought I’d begin again by talking about the way many people so readily embrace the new technologies that stream out of software and hardware companies and into their lives. Most dismiss objections about the changes in our lives, in our relationships—indeed in our brains—that those new technologies may trigger. For better or worse, it’s inevitable, people say. Stopping the changes, or even the rate of change, is impossible now. Many pundits and members of the digerati enjoy not just defining the current trends but also predicting the future, whether it be the next new thing or a broad vision of social change over the upcoming twenty years or more.

But is it all inevitable? I recently came across another take on the issue of inevitability and the impossibility of stopping the relentless march of change over time. In Thomas Mann’s Doctor Faustus, the narrator reflects on the consensus in his intellectual circle in Munich that, as the 1930s unfolded, Germany was in for “hard and dark times that would scoff at humanity, for an age of great wars and sweeping revolution, presumably leading far back beyond the Christian civilization of the Middle Ages and restoring instead the Dark Ages that preceded its birth and had followed the collapse of antiquity.”

Yet Mann’s narrator observes that no one objected to those conclusions. No one said this dark version of the future must be changed, must be avoided. No one said: “We must somehow intervene and stop this from happening.” Instead they reveled in the cleverness of their insights, in their recognition of the facts and their inevitable results. They said: “It’s coming. It’s coming, and once it’s here we will find ourselves at the crest of the moment. It is interesting, it is even good—simply because it is what is coming, and to recognize that fact is both achievement and enjoyment enough. It is not up to us to take measures against it as well.”

It is a predicament well worth remembering, I believe, as we listen to our own technology enthusiasts. Our dark age ahead may not have death camps and atomic bombs, but it has the possibility of being just as pernicious and inhumane. It could well be a time when, in celebrating the wonders of technology, we ignore the essence of what it means to be human. We would do well to consider our choices while we still can.


Mythology for Our Time: The Hero As Multiprocessor

7/18/2012

“I am a multitasker,” my ten-year-old niece declared with a triumphant grin at a recent family get-together. I was horrified, frankly. After all the neuroscientists have been telling us lately about the limitations of our working memory—most people can hold only about seven items in their working memory at any given moment—and about how switching back and forth between tasks actually makes people less efficient, I was appalled to see a member of our younger generation presenting multitasking as a positive achievement and a model for how to negotiate life.

If recent surveys and current trends are any indication, by the time my niece is 15, she will be checking her Facebook account, watching TV, texting several friends, and doing her homework in a rapid cycle of sequences for seven or eight or more hours per day. She will have acquired 365 friends on Facebook and will sleep with her cell phone under her pillow. She will spend a great deal of her time tethered to her machines, alone, “communicating” with others through a truncated set of texting words, abbreviations, and acronyms. On some days, the closest she will come to deep emotion will be a string of emoticons. Her time alone will resemble not solitude, in which some contemplation of oneself and one’s life might occur, but rather a muffled isolation within an electronic cocoon.

What draws people to the spell of multitasking? Why is this goal so valued as a continuous activity today? I think it began with a set of metaphors that started making their way into our language, probably in the 1970s, possibly even earlier. I was first struck by this during a conversation with a businessman conversant with computer programming, as he described how he “interfaced” with his client. When I asked him what he meant by “interface,” he told me he meant how people connected, just like the 8- or 12-pronged plugs that connected a computer terminal to a mainframe. By the seventies, we had begun to speak and think of some mechanical aspects of human thinking. By the eighties, the use of computer terminology to describe human thought had become commonplace. We “processed” information. We “transferred” knowledge. We “crunched” the numbers. In short, we began to think of ourselves more as calculators than as people. Multiprocessing seemed natural after that.

With the ubiquity of digital devices today, people have begun to emulate the microprocessors with which they share their lives. They have adopted the rhythm of multitasking, breaking down large tasks into smaller steps and processing multiple activities in a nearly simultaneous way. There are many problems with these analogies and the changes in our behavior they foster, but I’ll just mention two. One, we humans are not made to be multitaskers. We basically can do only one relatively involved task at a time (most of us can walk and chew gum at the same time, but that’s different from activities that require real focus). The second problem involves the whole idea of equating human activity with computers. It leaves out very large parts of what makes us human in the first place: creativity, self-awareness, morality, and our abilities to love, trust, empathize, grieve, and experience a whole range of emotions that machines can never understand. All these experiences color our thoughts, one would hope, and make them more deeply human along the way.


Mythology for Our Time III: Using Video Games to Fix Reality

6/26/2012

“The world without spirit is a wasteland. . . . What is the nature of a wasteland? It is a land where everybody is living an inauthentic life, doing as other people do, doing as you’re told, with no courage for your own life.” (Joseph Campbell, The Power of Myth)

Reality is broken, and video gaming may well provide a way of fixing it, according to Jane McGonigal, a game designer and author of Reality Is Broken: Why Games Make Us Better and How They Can Change the World. The subtitle actually sums up the argument of the book. McGonigal argues that playing video games can help people find their core strengths. Essentially she believes that one can use video gaming as positive psychology therapy to learn how to become more optimistic, proactive, engaged, and creative in solving real-world problems. Not surprisingly, the heroes of her book are the video game designers. She believes they can inspire people to give their lives more meaning and lead them to believe they are participating in epic actions, epic lives. She also suggests that people are likely to be more optimistic if they create alternate reality games in real life based on their favorite superhero mythology.

However, it is the subject of the main title, this so-called “brokenness” of reality, that provides a real clue to the mythology of our time. Reality (that is, real life) is disappointing, and in a series of bold statements, McGonigal tells us just how reality is failing us and why games are better. Here’s a sample:

“Compared with games, reality is too easy. Games challenge us with voluntary obstacles and help us put our personal strengths to better use.” Behind this statement is the sad and abiding idea that our real lives are boring, our real work an involuntary burden of unwanted tasks done at someone else’s bidding. “We are wasting our lives,” McGonigal explains.

And again:

“Compared with games, reality is unproductive. Games give us clearer missions and more satisfying, hands-on work.” Reality, it seems, is unstructured and offers few if any opportunities for satisfying work. Again, the work of our everyday lives is inherently tedious, and the goals are often ill-defined and hard to figure out.

One last sample:

“Compared with games, reality is disconnected. Games build strong social bonds and lead to more active social networks.” Real life is isolating, the author says. She cites the demise of extended communities in our everyday lives and refers to Robert Putnam’s landmark work Bowling Alone (2000) about the collapse of organizations and civic participation in the latter part of the twentieth century.

McGonigal argues that video gaming and alternate reality games can be powerful paths to boosting happiness, improving problem-solving and perseverance, and even providing sparks of a sense of community, all of which can be applied to real-world experiences. To be sure, games of all sorts can be fun and give players a change of pace and respite from the responsibilities of life. But McGonigal goes way beyond the fun part. She is right in saying that we need to take games more seriously, that they are not just an evil force in society offering opportunities for people to waste their time or play incessantly and addictively. But it is questionable whether she is also right in claiming that video games are truly transformational and provide positive experiences that can influence the way people act and think in their real lives away from the video game screen. Her evidence is anecdotal and largely unconvincing.

In the end, it is McGonigal’s perspective that is truly askew. Reality isn’t broken. It’s the relationship between people’s inner lives and their external reality that is out of whack. Life is complex, messy, full of demands, disappointments, inconveniences, and responsibilities. Virtual worlds and games, on the other hand, offer more structure, clearer goals, and hence new ways to feel successful and to communicate. But this does not by any means lead to authentic living. In the mid-1980s, the renowned mythology expert Joseph Campbell observed that many people were leading inauthentic lives. He said that they weren’t connected to their own inner spirit. Nor did they have a sense of the fundamental mystery of life in general. Without a sense of who they really were and their place in the universe, it was not possible to be genuinely engaged with others. And this basis for leading an authentic life, Campbell wrote repeatedly, is what a living myth can provide.

Reality may seem broken for video gamers because the life on the screen is so vivid, so complete in its opportunity for vicarious heroism. It is the land of superheroes and super tasks, mythological in the sense that characters and events are larger than life. But these things are not representative of a living mythology, which would inspire inward illumination and outer wonder through its symbols and narratives about modern life. “Myths inspire the realization of the possibility of your perfection, the fullness of your strength, and the bringing of solar light into the world. Slaying monsters [and here Joseph Campbell meant slaying the monsters within the individual] is slaying the dark things,” Campbell told Bill Moyers. “Myths grab you somewhere down inside.” Video games may excite, may amuse, may well elevate one’s mood, but they do not hit you down deep within your spirit. They do not change your life as Campbell defined it when he spoke of living myths.

0 Comments

The Non-Stop Now of Social Media: From Wise Crowds to Group Narcissism

4/2/2012

5 Comments

 
Yesterday I attended a conference on social networking at the Boston Museum of Fine Arts and was once again struck by how absolutely overwhelmed and engulfed our modern lives are with digital machines and technology. They are omnipresent. We carry them with us wherever we go. We transact business through them. We communicate with friends and family and even with that larger amorphous network of acquaintances and distant mutual friends and their mutual acquaintances. And if we are honest, most of us will admit that we even take those devices into our bedrooms with us when we retire for the night. It is now so easy for our lives, our minute-to-minute experience of life, to become permeated and mediated by our technologies.

The conference, “Are Social Networks Really Social?” was sponsored jointly by the museum and the Boston Institute for Psychotherapy, so I would guess therapists of various sorts were in the overwhelming majority. But there were also more ordinary folk like myself who came out of curiosity to hear three speakers on the topic of social media: (1) a psychologist and author who specializes in technology, Sherry Turkle, (2) a novelist, Helen Schulman, who has written about the nature and consequences of all things digital on the lives of ordinary families, and (3) an artist, Rachel Perry Welty, who has explored media like Facebook and Twitter as performance spaces. Over a long and rich afternoon those speakers and the audience pondered how social media—everything from email to smartphones to Facebook—affects both our relationships with others and our own psyches. There was a general consternation, even fear, and some sadness too, about how distracted, unfocused, and isolated individuals are becoming in our society.

Many ramifications of such behavior came up: people are less productive, and they’re less capable of sustained and complex thinking. Some observed that there’s an intolerance, perhaps even an inability, to cultivate solitude. Not unrelated is a strong tendency to avoid, even again to fear, having direct conversations with others. And this of course leads to a lack of intimacy as well as empathy. Sherry Turkle worries that many people are actually substituting “connections” for “conversations.”

I’m thinking it may be even worse than that, because when people post some thoughts on Facebook, send out a tweet, or text someone, they are often not “connecting” so much as they are “performing.” My own experience with Facebook and its current invitation to participate, “What’s on your mind?”, is that it is a site for self-promotion, or what Norman Mailer once humbly (or maybe wryly) titled a collection of his short works: Advertisements for Myself. (Mailer was very good at the self-promotion thing, well before Web 2.0.)

The Yale computer scientist David Gelernter recently wrote a diatribe in The Wall Street Journal, primarily against the careless, disposable nature of “digital words” and how sloppy and lazy (and idiotic) texting and smiley faces really are. Yes, I agree that it’s all regrettable, and one can only hope that everyone will come to their senses eventually. But what is even more interesting is Gelernter’s observation that Generation-i is stuck in the “now,” neither pondering the past nor planning for the future. It’s the permeation and flow of the continuously new. “Merely glance at any digital gadget and you learn immediately what all your friends are doing, seeing, hearing, . . . and (if you care) what’s going on now any place on earth. The whole world is screaming at you. Under the circumstances, young people abandon themselves to the present. Group narcissism takes over, as Generation-i falls in love with its own image reflected on the surface of the cybersphere.”

Group narcissism. It seems we have far more of this phenomenon than we do of wise crowds and smart mobs. But it’s not clinical narcissism, which is a serious personality disorder characterized by dramatic emotional behavior, a deep need for admiration, an inflated sense of self-importance, and an underlying fragile self-esteem. No, this narcissism refers to the simpler classical myth of the beautiful Narcissus, who fell deeply in love with his own image in a pool of water as he drank from it. But he could never embrace or possess the image, so rather than relinquish it he lay down by the side of the pool and was gradually consumed by the flame of his own self-passion.

5 Comments

Our Worst Selves, or Toward an Understanding of Anthony Weiner

6/20/2011

0 Comments

 
We are not, apparently, our best selves when we are online, according to Elias Aboujaoude, author of the new book, Virtually You: The Dangerous Powers of the e-Personality. In fact, being online seems to bring out the worst in some people, from harboring delusions of grandeur (thinking we have God-like capabilities) to out-of-control gambling (there are currently 1,300 websites devoted to gambling, and the number of participants is growing at an alarming rate every year). Impulsivity, infantile regression, viciousness, narcissism, and aggression: Aboujaoude sees these as the five major unpleasant traits that may emerge in the e-personality of any of us.

Dr. Aboujaoude is a psychiatrist and the director of both the Obsessive Compulsive Disorder Clinic and the Impulse Control Disorders Clinic at Stanford University School of Medicine. Based on his clinical experience and a good bit of research, Aboujaoude observes that tendencies developed online in one’s e-personality can affect real-life behaviors as well. Along the way he offers many very thoughtful insights into how online behaviors evolve and how they can ultimately change overall personality.

One quite relevant one involves how texting and sites like Second Life can impact real-life behaviors:

“The inner child that comes alive online or on the iPhone or Blackberry keypad acquires a ‘voice’ that is playful and orthographically challenged like a kid’s, but calculating and potentially dangerous, like an adult’s.” He says it is this “toxic mix of dark desires in the virtual world and the immature, barely oral phase to which many adults regress online” that contribute to much of the sexual, pornographic, and predatory communications on the Internet.

Aboujaoude also observes that the speed of online communication, coupled with the facelessness of such interactions, inclines people to more quickly enter into discussions of intimate details of their lives with strangers. Online communication can rapidly become “hyperpersonal,” which in turn contributes to real-life sexual “hook-ups” when people first meet—because by the time they meet they feel they know everything about the other person. Anthony Weiner aside, these kinds of insights go a long way to explaining how people we thought to be perfectly sensible can behave quite differently online (and perhaps offline in their real lives as well).

In the end, the author does not advocate that we simply turn off our connections to the Internet and return to life as it was fifteen years or so ago. No matter how serious the consequences of our being online can become, he has no illusions about a return to yesteryear. Rather, he suggests we use caution in our actions online. He believes we need to first know ourselves before we can safely and maturely interact online. Aboujaoude calls for more research into online behavior as well as for new paradigms both for parenting and educating our children. And he holds onto the hope that we will survive the changes the Internet is making just as we survived the invention and proliferation of the steam engine in the Industrial Revolution. The question remains, however, whether the changes the Internet brings are not more insidious, more pernicious, and more pervasive than those brought about by the inventions of the Industrial Age. No one as yet seems to know. I suppose we’ll just have to wait and see.

0 Comments