Digital Athena

Virtual Pull over the Edge

5/17/2013


 
Why is digital technology so exciting? Why is all the fast change that technology brings always presented as so inevitable and wonderful? There’s an eerily similar narrative that so many books follow these days as they recount the breakthroughs in technology and how those breakthroughs are changing our culture. Written by highly credentialed and respected scholars and technology writers, they all seem to begin by announcing a revolution that is taking place, a level of change that will dramatically (and of course rapidly) affect how we live, how we think, how we interact, or sometimes all of the above. The books present many studies and discuss their ramifications to support their theses, all of which are pretty rational and often good food for thought.

So far, so good. The problem with these books really appears toward the end. There the authors somehow feel compelled to predict, at length, how some particular aspect of digital technology will inevitably transform various aspects of our lives in drastic ways, some of which may enhance our lives and some of which may simply make things more complex, more artificial, or more alienated than they already are. In most instances, one is left thinking that writing the book got the authors so enmeshed in their own material that by the end they ventured over the edge and into some great unknown. Maybe all these authors and editors need to do is lop off the last 40 pages.

To show the typical trajectory of such books, let’s take a look at Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution by Jeremy Bailenson, founding director of Stanford’s Virtual Human Interaction Lab, and Jim Blascovich, Distinguished Professor of Psychology at UC Santa Barbara.

First there’s the revolutionary thesis:
“We sit on the cusp of a new world fraught with astonishing possibility, potential, and peril as people shift from face-to-face to virtual interaction. If by 'virtual' one means 'online,' then nearly a third of the world’s population is doing so already.”

Then they announce the intent of the book:
 “In this book, we provide an account of how virtual reality is changing human nature, societies, and cultures as we know them. Our goal is to familiarize readers with the pros and cons of the brave new world in which we live. We explore notions of consciousness, perception, neuroscience, media technology, social interaction, and culture writ large, as they pertain to virtual reality—and vice versa.”

Blascovich and Bailenson are genuinely excited because they see a new frontier of opportunity for the behavioral sciences based on all the data that is becoming available through online social interactions. While the environment may be virtual, the behavior of the individuals is “real.” And virtual interactions can change individual behavior in the real world as well as people’s sense of themselves. Virtual experiences change how people make important decisions and how they react viscerally to real-life situations. Examples of studies and analyses of various data sets take up the majority of the volume and are meant to convince the reader that there is indeed a revolution underway. 

Some of their narratives and analyses are quite interesting. They could have meaningful implications for understanding how and why humans behave in the ways that they do, and for how therapists can use virtual reality software of various sorts, including explicitly targeted exercises and games, to help people change negative behavior patterns or improve their abilities or performance in some areas.

It’s in the final portion of the book, however, that the real problem arises. And it occurs in an area where so many technology writers get into hot water: predicting the future. The authors paint a series of semi-plausible scenarios for various activities, from market research to legal trial evidence to surgical training, as well as physical therapy, airplane-pilot training, even virtual vacations. Some of these things are well on the way to becoming part of our lives (although I myself will probably resist virtual tourism to my dying day).

Then Bailenson and Blascovich take another small leap into the more distant future that really puts them over the edge. First they discuss the “promise” of avatars, which they say will become “perceptually indistinguishable” from their real counterparts. People will be able to interact with their avatars automatically, without the need for hardware or even voice commands, so that they may be unaware they are actually incorporating an avatar into their body. It will be an experience akin to wearing contact lenses. The authors also posit that avatars will “walk among us,” so that we might not even realize the figure approaching us is actually an avatar. This kind of speculation seems to seduce the authors into a discussion of how avatars will change close relationships (which in such works inevitably devolves into a discussion of virtual sex).

Equally predictable and mundane are the applications of virtual reality to the domains of religion and education. A couple of examples are enough to illustrate how silly the claims about the future can get:

On religion, the authors write that a virtual reality experience of Moses parting the Red Sea, including the ability to smell the flopping fishes, would deepen our understanding of the miracle. Frankly, I prefer the Charlton Heston version myself. (And in any event, the land is dry when the Israelites walk through, according to the Bible.)

On education, things are even worse when you consider that one of the authors is a psychologist: the authors envision a “virtual tutoring system” that will combine virtual reality, nanotech, and artificial intelligence capabilities and provide the most complete educational experience anyone could hope for. The salient feature of this transition from “physical to digital” learning environments seems to be the elimination of the notably dull textbook in favor of options such as movies and virtual reality programs. This means the children wouldn’t have to read anything at all—a great improvement, as any psychologist will tell you, for the development of the brain and for the learning process itself.

With improvements such as these, one can only hope that the virtual revolution is just that—virtual, which is to say, not for real.


A Sign of the Times: Spiritual Alternatives to Digital Gadgets

8/8/2012


 
Take your pick: Do you like the continual sounds from your smartphone announcing yet another text message or phone call, or would you rather opt for an hour or so of listening to Mahler or watching whirling dervishes from Istanbul?

There appears to be a wave of spirituality gaining force in the classical music world, and it is in part a reaction to our heavily technologized modern lifestyle. As Lincoln Center’s Artistic Director Jane Moss sees it, the ubiquitous cacophony of cellphones, smartphones, and other digital gadgets is not only enormously seductive but also a barrier to having a full interior life: “People are looking for larger experiences in a cyberworld” that has become more and more “like eating candy.” She has organized an annual White Light Festival in New York as a way to give audiences a chance to experience “transcendence.” Moss stressed that the festival is not about sacred music but about transcendence. This year’s festival features seventeen offerings from international performers and groups, from a French Baroque ensemble to Indian Sufi mystics to contemporary American composers, along with many classics from the Western tradition.

Other organizations are joining in: The annual Salzburg Festival, now the summer home of the Vienna Philharmonic, this year added a 10-day Spiritual Overture to its program, while the Lucerne Festival in Switzerland has created a summer festival simply called “Faith.” A “Credo” series in that program explores seven different spiritual visions from the perspective that every religion is legitimate and each is only an approximation of what ultimately remains unexpressed. And the Pittsburgh Symphony is expanding a program it calls Music of the Spirit, an annual set of performances designed to show the symphony’s “deep commitment to promoting and spreading a spiritual and universal message.”

Joseph Campbell, writing from the 1950s through the 1980s in the field of comparative mythology, could not possibly have foreseen how technology would permeate our lives as it does today, yet he did observe even then that people had generally lost the ability to think and feel in metaphorical terms: “Our thinking is largely discursive, verbal, linear,” he told Bill Moyers in the conversations that were eventually aired on PBS and published as The Power of Myth. “One of the problems today is that we are not well acquainted with the literature of the spirit. We’re interested in the news of the day and the problems of the hour.” Were Campbell alive today, he might have added that we are so tethered to our digital machines and gadgets that we have no time for an inner life at all and may well be losing the capacity to ever develop one.

Campbell did not believe that contemporary society had a living, functioning mythology. And it is myths, as he points out, that provide “clues to the spiritual potentialities of human life.” The deep vitality of a culture’s mythology comes from the power of “its symbols as metaphors, delivering not simply the idea, but a sense of the actual participation in such a realization of transcendence,” he wrote late in his life in The Inner Reaches of Outer Space. If we were to have a new mythology in the future, he believed it would be up to the artists to create it. He also believed it would have to be a global mythology, one that takes into account and tries to express the rapture and the wonder of what it is like to be alive as human beings on this planet Earth within our solar system and “the cluster of twenty galaxies of which our galaxy is a member, which local cluster, in turn, is represented as but one of thousands of such local clusters of galaxies, themselves gathered in superclusters in a universe whose limits are not yet known.” Although Campbell focused primarily on the art of literature and storytelling, I think he would see good signs in how various musical organizations in the US and Europe are starting to offer programs that combine Western and Eastern spiritual traditions, providing transcendence and the richness of a strong inner spiritual life through art in their own medium.

The Non-Stop Now of Social Media: From Wise Crowds to Group Narcissism

4/2/2012


 
Yesterday I attended a conference on social networking at the Boston Museum of Fine Arts and was once again struck with how absolutely overwhelmed and engulfed our modern lives are with digital machines and technology. They are omnipresent. We carry them with us wherever we go. We transact business through them. We communicate with friends and family and even with that larger amorphous network of acquaintances and distant mutual friends and their mutual acquaintances. And if we are honest, most of us will admit that we even take those devices into our bedrooms with us when we retire for the night. It is now so easy for our lives, our minute-to-minute experience of life, to become permeated and mediated by our technologies.

The conference, “Are Social Networks Really Social?” was sponsored jointly by the museum and the Boston Institute for Psychotherapy, so I would guess therapists of various sorts were in the overwhelming majority. But there were more ordinary folk like myself who came out of curiosity to hear three speakers on the topic of social media: (1) a psychologist and author who specializes in technology, Sherry Turkle, (2) a novelist, Helen Schulman, who has written about the nature and consequences of all things digital on the lives of ordinary families, and (3) an artist, Rachel Perry Welty, who has explored media like Facebook and Twitter as performance spaces. Over a long and rich afternoon those speakers and the audience pondered how social media—everything from email to smartphones to Facebook—affects both our relationships with others and our own psyches. There was general consternation, even fear, and some sadness too, about how distracted, unfocused, and isolated individuals are becoming in our society.

Many ramifications of such behavior came up: People are less productive, and they’re less capable of sustained and complex thinking. Some observed that there’s an intolerance of solitude, perhaps even an inability to cultivate it. Not unrelated is a strong tendency to avoid, even to fear, having direct conversations with others. And this of course leads to a lack of intimacy as well as empathy. Sherry Turkle worries that many people are actually substituting “connections” for “conversations.”

I’m thinking it may even be worse than that, because when people post some thoughts on Facebook, send out a tweet, or text someone, they are often not “connecting” so much as they are “performing.” My own experience with Facebook and its current invitation to participate (“What’s on your mind?”) is that it is a site for self-promotion, or, as Norman Mailer once humbly (or maybe wryly) titled a collection of his short works, “Advertisements for Myself.” (Mailer was very good at the self-promotion thing, well before Web 2.0.)

The Yale computer scientist David Gelernter recently wrote a diatribe in The Wall Street Journal aimed primarily at the careless, disposable nature of “digital words” and at how sloppy and lazy (and idiotic) texting and smiley faces really are. Yes, I agree that it’s all regrettable, and one can only hope that everyone will come to their senses eventually. But what is even more interesting is Gelernter’s observation that “Generation-i” is stuck in the “now,” neither pondering the past nor planning for the future. It’s the permeation and flow of the continuously new. “Merely glance at any digital gadget and you learn immediately what all your friends are doing, seeing, hearing, . . . and (if you care) what’s going on now any place on earth. The whole world is screaming at you. Under the circumstances, young people abandon themselves to the present. Group narcissism takes over, as Generation-i falls in love with its own image reflected on the surface of the cybersphere.”

Group narcissism. It seems we have far more of this phenomenon than we do of wise crowds and smart mobs. But it’s not clinical narcissism, which is a serious personality disorder characterized by dramatic emotional behavior, a deep need for admiration, an inflated sense of self-importance, and an underlying fragile self-esteem. No, this narcissism refers to the simpler classical myth of the beautiful Narcissus, who fell deeply in love with his own image in a pool of water as he drank from it. But he could never embrace or possess the image, so rather than relinquish it, he lay down by the side of the pool and was gradually consumed by the flame of his own self-passion.


Texts and the Texter

10/25/2011


 
 “We shape our buildings; thereafter they shape us,” Winston Churchill observed about the symbiotic relationship between our architecture and ourselves. The same may be said for how we interact with our technologies.

Take a look at texting. The numbers seem to grow all the time, but as of the Kaiser Foundation study published in January 2010, young people were sending on average 3,000 texts per month and were spending four times as much time texting as they were actually talking on their phones. And texting has influenced communications in several ways:

First of all, because people text on their cellphones, most must use a virtual keyboard on a touchscreen (BlackBerry owners get to use tiny physical keys, which is slightly more user-friendly, I suppose). In either case, the keys are much smaller than the average computer keyboard’s keys, so it’s easy to make mistakes. Using the virtual keyboard also creates another level of awkwardness because you have to shift to a second (and on some cellphones a third) view to access all the characters on the QWERTY keyboard. In addition, texting has the “short message service” limit of 160 characters.
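
To make the character math concrete, here is a minimal sketch, mine and not from the post, of how a message’s length translates into SMS segments. It assumes the 160-character limit mentioned above for a single message and the common 153-character payload for each part of a longer, concatenated message.

```python
# Illustrative sketch: how message length maps to SMS segments.
# Assumptions: 160 characters fit in a single message; once a message
# must be split, each part carries roughly 153 characters because some
# space goes to the headers that stitch the parts back together.

def sms_segments(text: str, single_limit: int = 160, multi_limit: int = 153) -> int:
    """Return how many SMS segments a message of this length would occupy."""
    if len(text) <= single_limit:
        return 1
    return -(-len(text) // multi_limit)  # ceiling division

if __name__ == "__main__":
    short = "running late, b4 u order get me the usual"
    long_msg = "A fully spelled-out message written in complete sentences. " * 4
    print(len(short), sms_segments(short))      # fits in one segment
    print(len(long_msg), sms_segments(long_msg))  # needs more than one segment
```

The squeeze toward staying inside a single segment is one more nudge toward the contracted shorthand described below.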

Then there’s the speed at which the communication is sent. Texts are delivered pretty much instantaneously. This leads people to think that they must respond at roughly the same speed. Delaying a response seems, for many, to imply that you’re ignoring the person who contacted you.

The combination of an awkward virtual keyboard, the limited length, and the pressure to respond rapidly engenders a shorthand of contracted words (Xlnt for excellent, rite for write), pictograms (b4 for before, @om for atom), initialisms (N for no, LOL for laughing out loud, CWOT for complete waste of time), and nonstandard acronyms (anfscd for and now for something completely different, btdt for been there, done that, hhoj for ha, ha, only joking). Notice how the shorthand becomes more and more cryptic, and we haven’t even talked about the emoticons—those variations on the ubiquitous smiley face built from strings of punctuation.

I know I’m old—way over thirty—but texting seems to me like the new pig Latin—another code designed to communicate secretly and to exclude others. In the case of pig Latin, the aim was to exclude parents. And for some ages the same may be true of today’s texting. It’s a silent and secret form of communication one can do in one’s lap under the dinner table. So, essentially, the technology of sending written messages via cell phones creates private languages.

Texting can be a convenient way to quickly notify someone, but the effects, especially for younger people, can be more far-reaching and burdensome and hardly convenient. Sherry Turkle met with one sixteen-year-old named Sanjay during her research for her new book Alone Together. He expressed anxiety and frustration around texting. He turned off his phone while he spoke with Turkle for an hour. Turkle writes: “At the end of our conversation, he turns his phone back on. He looks at me ruefully, almost embarrassed. He has received over a hundred text messages as we were speaking. Some are from his girlfriend, who, he says, ‘is having a meltdown.’ Some are from a group of close friends trying to organize a small concert. He feels a lot of pressure to reply to both situations and begins to pick up his books and laptop so he can find a quiet place to set himself to the task. . . . ‘I can’t imagine doing this when I get older.’ And then, more quietly, ‘How long do I have to continue doing this?’” Sounds more like he’s facing a prison sentence than the joy of continuous connection . . .


Is Addiction a Useful Concept for Media Use?

5/26/2011


 
Let’s face it, we are all “users.” Anyone who accesses a computer software program has been known as a “user” for many years now. No doubt the term originated somewhere back in the mainframe age, when people had to sit at a terminal, log on, and identify themselves before they could access the mammoth machines, machines that filled whole buildings in the sixties and whose power is now dwarfed by that of your average smartphone. Today, however, many experts from various specialties are starting to denounce the addictive capacity of the latest technologies. They are saying many users are in fact becoming addicts. But how meaningful is it to talk about addiction when referring to people who constantly use computers and mobile devices to surf the Web, text with their friends, and check for email?

The latest draft of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders, a set of standards published by the American Psychiatric Association, posits a new category of mental disorder, called “Behavioral Addictions,” and it suggests, for starters, just one such disorder: gambling. The physiological rationale for the new category is that such behavior has the same clinical pattern as substance addictions, that is, an activity originally undertaken for pleasure becomes compulsive. The addicted person ceases to derive much pleasure from the activity but continues to pursue the pattern despite diminishing gains and increasing cost. Addicts lose control over their behavior. The activity begins to control them. Neurobiologically, experts claim, addictive behaviors follow the same path in the brain, generating the euphoria that dopamine creates, and leading addicts to repeat their behavior in search of new pleasures.

So how do experts define the criteria for “Internet” addiction? To begin with, they’ve identified a set of core criteria that must all be present:

Preoccupation—Thinking constantly about previous online activity or anticipating the next one.

Tolerance—Needing longer periods online in order to feel satisfied.

Lack of control—Finding it impossible to cut back or stop.

Withdrawal—Stopping induces restlessness, irritability, other changes in mood.

Unintended overuse—Repeatedly staying online longer than intended.

In addition, the user must experience at least one of three criteria indicating that the online activity is negatively affecting his or her life: (1) losing or jeopardizing something important, such as a job, a big opportunity, or a personal relationship, (2) concealing and/or lying about time spent online, and (3) using the activity to escape real-life difficulties.
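
Read purely as a decision rule, the proposed structure is simple: all of the core criteria above must be present, plus at least one of the three signs of real-life harm. Here is a minimal sketch of that combination logic in Python; it is illustrative only, and the names are paraphrases of the criteria above, not clinical definitions.

```python
# Illustrative only: the combination logic of the proposed criteria.
# The labels paraphrase the criteria listed above.

CORE_CRITERIA = {
    "preoccupation",
    "tolerance",
    "lack_of_control",
    "withdrawal",
    "unintended_overuse",
}

IMPAIRMENT_CRITERIA = {
    "lost_or_jeopardized_something_important",
    "concealing_or_lying_about_time_online",
    "using_the_activity_to_escape",
}

def meets_proposed_criteria(observed: set) -> bool:
    """All core criteria present, plus at least one sign of real-life harm."""
    return CORE_CRITERIA <= observed and bool(IMPAIRMENT_CRITERIA & observed)

# Every core criterion plus one impairment criterion -> True
print(meets_proposed_criteria(CORE_CRITERIA | {"using_the_activity_to_escape"}))
# A few criteria on their own -> False
print(meets_proposed_criteria({"preoccupation", "tolerance", "using_the_activity_to_escape"}))
```

The sketch only shows how the two groups of criteria combine; the hard part, of course, is judging whether any given criterion is actually present in a person’s life.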

Psychologist Sherry Turkle doesn’t like the idea of labeling computer overuse as an addiction because, she claims, it calls for one solution: stopping. And she believes we must learn how to live with our technologies, that we can’t go back. “The idea of addiction, with its one solution that we know we won’t take, makes us feel hopeless. We have to find a way to live with seductive technology and make it work to our purpose,” Turkle writes in her new book Alone Together: Why We Expect More from Technology and Less from Each Other.

It is certainly true that we can’t go back—the Internet and cell phones aren’t going away, and there will no doubt be even more seductive technologies to come. Still, it is hard to ignore the neurological science that tells us some people are rewiring their brains in ways that make them crave more media use, causing them to lose control of their time and how they spend it. Yet it may also be true that the only people at risk are those predisposed to develop some sort of compulsive, addictive behavior in any event. But I do have one more nagging thought that just won’t go away. It’s what one of Sherry Turkle’s young research subjects, a sixteen-year-old girl, observed about the pull and the power of our modern technology, and it perhaps identifies the real problem with this postmodern life of ours: “Technology is bad because people are not as strong as its pull.”


Web 2.0: A Conversation Lost

5/13/2011


 
The art of conversation is so twentieth century. It seems that Web 2.0 has replaced the need for conversing entirely. For those who send hundreds of text messages each day and constantly check and update their Facebook Walls, even phone calls are passé—they’re far too time-consuming, too emotionally demanding, and just plain too complicated. Deval, a senior in high school whom Sherry Turkle cites in her new book Alone Together, observes: “A long conversation with someone you don’t want to talk to that badly can be a waste of time.” By texting, Deval explains, he only has to deal with direct information and not waste time on conversation fillers. At the same time, however, he confesses that he doesn’t really know how to have a conversation, at least not yet. He thinks he might soon start to talk on the phone as a way to learn how to have an actual conversation: “For later in life, I’ll need to learn how to have a conversation, learn how to find common ground as I can have something to talk about, rather than spending my life in awkward silence.”

Neurologists and psychologists worry a lot today about the lack of face-to-face and voice-to-voice interaction that Web 2.0 enables. They point out that it is especially important for adolescents to have direct interaction with others because it is during the late teenage years and early twenties that the brain develops the ability to understand how others feel and how one’s actions may affect others around them. The underdeveloped frontal lobes of younger teenagers, explains Dr. Gary Small, Director of the UCLA Memory and Aging Research Center, lead teenagers to seek out situations that provide instant gratification. Younger teenagers tend to be self-absorbed. They also tend to lack mature judgment, are unable to understand danger in certain situations, and have trouble putting things in perspective.

One prevalent habit that impedes the normal development of the frontal lobes to the level of maturity one expects to see in adults by their mid-twenties is multitasking, says Dr. Small. The ability of multiple gadgets to allow young adults (and others) to listen to music, watch TV, email or text, and work on homework at the same time can lead to a superficial understanding of information. And all this technology feeds the desire for novelty and instant gratification, not complex thinking or deep learning. Abstract reasoning also remains undeveloped in such an environment.

High school senior Deval believes he can learn to have conversations by talking on the phone. But mastering the art of conversation is not the same kind of learning as figuring out how to use the latest smartphone. Experts say it takes practice in listening to other people and learning how to read their faces and other gestures to fully understand what another person is feeling and saying. There are deeply intuitive aspects to learning how to fully converse with someone, what Gary Small calls the “empathetic neural circuitry” that is part of mature emotional intelligence. Researchers say it is too early to know whether “Digital Natives,” those born after 1980 who have grown up using all kinds of digital devices as a natural part of the rhythm of their lives, will develop empathy at all, and if they do develop it, how it might differ from what empathy means today.

What the experts do know is that more hours spent in front of electronic screens can actually atrophy the neural circuitry that people develop to recognize and interpret nonverbal communication. And these skills are a significant part of what makes us human. Their mastery helps define personal and professional successes as well. Understanding general body language, reading facial expressions, and making eye contact are all part of the art of empathy. So in this age of superconnectivity, where communications are everywhere and we are always on, we seem to risk losing many of the basic skills that are the hallmarks of effective communication itself.

See also

Alone Together by Sherry Turkle

iBrain: Surviving the Technological Alteration of the Modern Mind by Gary Small MD and Gigi Vorgan

Tweens Use Media 30% More than 5 Years Ago

2/17/2010


 
New Kaiser study shows dramatic increase in all media use except print

TV still dominates, contradicting popular image of proactive youth

Heavy media users especially unhappy

In the Kaiser Foundation’s third survey of media use among 8- to 18-year-olds, researchers found a dramatic increase in media use in the 11- to 14-year-old group as opposed to the 8- to 10-year-old group. Eleven- to 14-year-olds spent almost 9 hours a day using media. The number of hours for that age group rises to 12 hours per day when multitasking hours are counted twice to fully reflect total media exposure. For all 8- to 18-year-olds, the average use, which had been stable in the studies conducted in 1999 and 2004, increased this time by over an hour to 7 hours and 38 minutes, about the time adults spend at work.

Texting: 118 per Day

The media tracked by the study include iPod/MP3 players, video game consoles, computers, cell phones, television, radio, and printed materials. (Reading printed materials was the only tracked activity that decreased in the amount of time devoted to it.) The study did not include computer use for schoolwork. Nor did it cover texting and cell phone calls, which for 8- to 18-year-olds averaged one hour and 30 minutes and 33 minutes per day, respectively. The number of texts sent was surprisingly high: 118 per day was the average. If you add texting and cell phone calls to other media use, you get an average of 9 hours and 40 minutes per day, significantly more time than students spend in school and on homework.
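
The arithmetic behind that last figure is easy to check. Here is a short Python sketch, illustrative only; the numbers are the ones quoted above, and the one-minute difference from the stated total comes down to rounding.

```python
# Quick check of the arithmetic, using the figures quoted in this post:
# 7 hours 38 minutes of daily media use, plus 1 hour 30 minutes of texting
# and 33 minutes of cell phone calls, which the study tracked separately.

def to_minutes(hours: int, minutes: int) -> int:
    """Convert an hours-and-minutes figure to total minutes."""
    return hours * 60 + minutes

media_use = to_minutes(7, 38)   # average daily media use, all 8- to 18-year-olds
texting = to_minutes(1, 30)     # texting, tracked separately
calls = to_minutes(0, 33)       # cell phone calls, tracked separately

total = media_use + texting + calls
hours, minutes = divmod(total, 60)
print(f"{hours} hours {minutes} minutes")  # -> 9 hours 41 minutes, i.e. roughly 9:40 a day
```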

The So-Called “Fun” Generation

Many books and articles have recently trumpeted the creative, interactive ways that the younger generation interacts with new media, including mashups and YouTube videos. Some writers explain that, after all, something must be “fun” for this generation to have any interest in it. But of course. The impression you get is that youths are constantly and joyfully experimenting with new technologies and reveling in new social connections. But among heavy users, those who use media more than 16 hours per day, 60% report being bored frequently, 32% admit being sad or unhappy often, and 33% report that they get into trouble a lot.

Life as Sound Bites

Those numbers about youths and negative moods directly contradict the popular image of youths as socially active, energized tinkerers. In fact today’s youth, especially those most involved in media use, may simply be seeking to escape their boredom or unhappiness. The compulsiveness of youths who send 118 text messages per day also suggests a chronic underlying loneliness in spite of all the gee-whiz access to information, technology, and friends. Could it be that those mediated, truncated messages and the constant media use just create a lot of background noise, trivial diversions, and mind-numbing entertainments? Isn’t tweeting, after all, just another example of enforced brevity, the sound bites of our contemporary life? These youths think that they have happy outer lives, proclaiming by large percentages that they have lots of friends, get along with their parents, and are generally happy at school, but when asked about their inner lives, the picture becomes considerably darker and more complicated.

TV Still Rules

The other surprising fact in this study is the ongoing popularity of TV in the media use of youths. This too contradicts all those who celebrate the brilliant creative use of new media by youths. Watching TV remains the number one activity for 8- to 18-year-olds—about 4 hours and 30 minutes per day, up 40 minutes from five years ago. Eleven- to 14-year-olds spend 5 hours per day watching TV. One reason for the increase is broader access: Kids can now more easily record shows, view them on demand, and watch TV on their laptops, smartphones, and iPods. There are also more TVs than ever in teenagers’ bedrooms, a situation that naturally increases TV time. In addition, 37% report that the family car has either a TV or a DVD player.

A Vicious Cycle?

The Kaiser study notes that there is no clear cause-and-effect relationship between low levels of personal contentment and heavy media use, although the correlation held when controlled for age, gender, race, parent education, and single- versus two-parent households. The researchers note that the cause and effect could go either way—or both: Children who are bored or unhappy may seek out more media to escape their angst. On the other hand, heavy use of media, especially television, may itself create boredom and unhappiness. It seems likely that time spent watching TV and boredom, unhappiness, and/or sadness actually form a vicious cycle, reinforcing the very negative moods that youth are trying to escape. To me, one of the saddest images of our empty modern life is that of a teenager watching some inane reality show in his room and texting his friends (or tweeting to the world) about the inanity he is watching.

Twitter Babble

Pear Analytics recently conducted a study of Twitter content and found that 40% of all tweets are “pointless babble.”




See also
Kaiser Foundation Report

Books about Generation M on Amazon
Born Digital: Understanding the First Generation of Digital Natives by John Palfrey and Urs Gasser
Grown Up Digital: How the Net Generation Is Changing Your World by Don Tapscott
