Digital Athena

Is Our Digital Future Inevitable or Do We Have Options?

12/10/2012

Back to my blog after some professional and personal interruptions. I thought I’d begin again by talking about the way many people so readily embrace the new technologies that stream out of software and hardware companies and into their lives. Most dismiss objections about the changes in our lives, in our relationships—indeed in our brains—that those new technologies may trigger. For better or worse, it’s inevitable, people say. Stopping the changes, or even the rate of change, is impossible now. Many pundits and members of the digerati enjoy not just defining the current trends but also predicting the future, whether it be the next new thing or a broad vision of social change over the upcoming twenty years or more.

But is it all inevitable? I recently came across another take on the issue of inevitability and the impossibility of stopping the relentless march of change over time. In Thomas Mann’s Doctor Faustus, the narrator reflects on the consensus in his intellectual circle in Munich that as the 1930s unfolded, Germany was in for “hard and dark times that would scoff at humanity, for an age of great wars and sweeping revolution, presumably leading far back beyond the Christian civilization of the Middle Ages and restoring instead the Dark Ages that preceded its birth and had followed the collapse of antiquity.”

Yet Mann’s narrator observes that no one objected to those conclusions. No one said this dark version of the future must be changed, must be avoided. No one said: “We must somehow intervene and stop this from happening.” Instead they reveled in the cleverness of their insights, in their recognition of the facts and their inevitable results. They said: “‘It’s coming. It’s coming, and once it’s here we will find ourselves at the crest of the moment. It is interesting, it is even good—simply because it is what is coming, and to recognize that fact is both achievement and enjoyment enough. It is not up to us to take measures against it as well.’”

It is a predicament well worth remembering, I believe, as we listen to our own technology enthusiasts. Our dark age ahead may not have death camps and atomic bombs, but it has the possibility of being just as pernicious and inhumane. It could well be a time when, in celebrating the wonders of technology, we ignore the best of what it means to be human. We would do well to consider our choices while we still can.


A Sign of the Times: Spiritual Alternatives to Digital Gadgets

8/8/2012

Take your pick: Do you like the continual sounds from your smartphone announcing yet another text message or phone call, or would you rather opt for an hour or so of listening to Mahler or watching whirling dervishes from Istanbul?

There appears to be a wave of spirituality gaining force in the classical music world, in part a reaction to our heavily technologized modern lifestyle. As Lincoln Center’s Artistic Director Jane Moss sees it, the ubiquitous cacophony of cellphones, smartphones, and other digital gadgets is not only enormously seductive but also a barrier to having a full interior life: “People are looking for larger experiences in a cyberworld” that has become more and more “like eating candy.” She has organized an annual White Light Festival in New York as a way to give audiences a chance to experience “transcendence.” Moss stressed that the festival is not about sacred music but about transcendence. This year’s program features seventeen offerings from international performers and groups, from a French Baroque ensemble to Indian Sufi mystics to contemporary American composers, along with many classics from the Western tradition.

Other organizations are joining in: The annual Salzburg Festival, now the summer home of the Vienna Philharmonic, this year added a ten-day Spiritual Overture to its program, while the Lucerne Festival in Switzerland has created a summer festival simply called “Faith.” A “Credo” series in that program explores the spiritual visions of seven different religions from the perspective that every religion is legitimate and each is only an approximation of what ultimately remains unexpressed. And the Pittsburgh Symphony is expanding a program it calls Music of the Spirit, an annual set of performances designed to show the symphony’s “deep commitment to promoting and spreading a spiritual and universal message.”

Joseph Campbell, writing from the 1950s through the 1980s in the field of comparative mythology, could not possibly have foreseen how technology would permeate our lives as it does today, yet he did observe even then that people had generally lost the ability to think and feel in metaphorical terms: “Our thinking is largely discursive, verbal, linear,” he told Bill Moyers in the conversations that were eventually aired on PBS and published as The Power of Myth. “One of the problems today is that we are not well acquainted with the literature of the spirit. We’re interested in the news of the day and the problems of the hour.” Were Campbell alive today, he might in addition have observed that we are so tethered to our digital machines and gadgets that we have no time for an inner life at all and may well be losing the capacity to ever develop one.

Campbell did not believe that contemporary society had a living, functioning mythology. And it is myths, as he points out, that provide “clues to the spiritual potentialities of human life.” The deep vitality of a culture’s mythology comes from the power of “its symbols as metaphors, delivering not simply the idea, but a sense of the actual participation in such a realization of transcendence,” he wrote late in his life in The Inner Reaches of Outer Space. If we were to have a new mythology in the future, he believed it would be up to the artists to create it, and he also believed that it would have to be a global mythology, taking into account and trying to express the rapture and the wonder of what it is like to be alive as human beings on this planet Earth within our solar system and “the cluster of twenty galaxies of which our galaxy is a member, which local cluster, in turn, is represented as but one of thousands of such local clusters of galaxies, themselves gathered in superclusters in a universe whose limits are not yet known.” Although Campbell focused primarily on the art of literature and storytelling, I think he would see good signs in how various musical organizations in the US and Europe are starting to offer programs that combine Western and Eastern spiritual traditions, offering transcendence and the richness of a strong inner spiritual life through art in their own medium.

Mythology for Our Time: The Hero As Multiprocessor

7/18/2012

“I am a multitasker,” my ten-year-old niece declared with a triumphant grin at a recent family get-together. I was horrified, frankly. After all that the neuroscientists have been telling us lately about the limitations of our working memory—most people can hold only about seven items in working memory at any given moment—and about how switching back and forth between tasks actually makes people less efficient, I was appalled to see a member of our younger generation present multitasking as a positive achievement and a model for how to negotiate life.

If recent surveys and current trends are any indication, by the time my niece is 15, she will be checking her Facebook account, watching TV, texting several friends, and doing her homework in a rapid cycle for seven or eight or more hours per day. She will have acquired 365 friends on Facebook and will sleep with her cell phone under her pillow. She will spend a great deal of her time tethered to her machines, alone, “communicating” with others through a truncated set of texting words, abbreviations, and acronyms. The closest she might come on some days to deep emotion will be a string of emoticons. Her time alone will not resemble solitude, where some contemplation of oneself and one’s life might occur. Rather, it will be more of a muffled isolation within an electronic cocoon.

What draws people to the spell of multitasking? Why is this goal so valued as a continuous activity today? I think it began with a set of metaphors that started making their way into our language, probably in the 1970s, possibly even earlier. I was first struck by this during a conversation with a businessman conversant with computer programming, as he described how he “interfaced” with his client. When I asked him what he meant by “interface,” he told me he meant how people connected, just like the 8- or 12-pronged plugs that connected a computer terminal to a mainframe. By the seventies, we had begun to speak and think of some mechanical aspects of human thinking. By the eighties, the use of computer terminology to describe human thought had become commonplace. We “processed” information. We “transferred” knowledge. We “crunched” the numbers. In short, we began to think of ourselves more as calculators than as people. Multiprocessing seemed a natural next step after that.

With the ubiquity of digital devices today, people have begun to emulate the microprocessors with which they share their lives. They have adopted the rhythm of multitasking, breaking down large tasks into smaller steps and processing multiple activities in a nearly simultaneous way. There are many problems with these analogies and the changes in our behavior they foster, but I’ll mention just two. One, we humans are not made to be multitaskers. We can basically do only one relatively involved task at a time (most of us can walk and chew gum at the same time, but that’s different from activities that require real focus). The second problem involves the whole idea of equating human activity with computers. It leaves out very large parts of what makes us human in the first place: creativity, self-awareness, morality, and our abilities to love, trust, empathize, grieve, and experience a whole range of emotions that machines can never understand. All these experiences color our thoughts, one would hope, and make them more deeply human along the way.
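It is worth spelling out what a processor actually does when it “multitasks,” because it is precisely the behavior described above: slice each job into fragments, rotate among them, and pay a toll at every switch. Here is a minimal Python sketch of that round-robin time-slicing; the task names, time quantum, and switch cost are invented for illustration:

```python
from collections import deque

def round_robin(tasks, quantum=2, switch_cost=1):
    """Simulate one core "multitasking" by time-slicing.

    tasks: dict mapping a task name to units of work remaining.
    Returns total time elapsed, including context-switch overhead.
    """
    queue = deque(tasks.items())
    clock = 0
    while queue:
        name, remaining = queue.popleft()
        done = min(quantum, remaining)
        clock += done  # time spent on real work
        if remaining - done > 0:
            clock += switch_cost  # the toll for switching away mid-task
            queue.append((name, remaining - done))
    return clock

work = {"homework": 6, "texting": 4, "tv": 4}
print("time-sliced:", round_robin(work))                     # 18 units
print("one task at a time:", round_robin(work, quantum=99))  # 14 units
```

Even this toy model shows the overhead: the same fourteen units of work take longer when constantly interrupted, and the switch cost in a human brain is far larger than the mechanical one in a chip.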


Mythology for Our Time III: Using Video Games to Fix Reality

6/26/2012

“The world without spirit is a wasteland. . . . What is the nature of a wasteland? It is a land where everybody is living an inauthentic life, doing as other people do, doing as you’re told, with no courage for your own life.” Joseph Campbell, The Power of Myth

Reality is broken, and video gaming may well provide a way to fix it, according to Jane McGonigal, a game designer and author of Reality Is Broken: Why Games Make Us Better and How They Can Change the World. The subtitle sums up the argument of the book: McGonigal argues that playing video games can help people find their core strengths. Essentially she believes that one can use video gaming as positive-psychology therapy to learn how to become more optimistic, proactive, engaged, and creative in solving real-world problems. Not surprisingly, the heroes in her book are the video game designers. She believes they can inspire people to give their lives more meaning and lead them to believe they are participating in epic actions, epic lives. She also suggests that people are likely to be more optimistic if they create alternate reality games in real life based on their favorite superhero mythology.

However, it is the subject of the main title, this so-called “brokenness” of reality, that provides a real clue to the mythology of our time. Reality (that is, real life) is disappointing, and in a series of bold statements, McGonigal tells us just how reality is failing us and why games are better. Here’s a sample:

“Compared with games, reality is too easy. Games challenge us with voluntary obstacles and help us put our personal strengths to better use.” Behind this statement is the sad and abiding idea that our real lives are boring, our real work an involuntary burden of unwanted tasks done at someone else’s bidding. “We are wasting our lives,” McGonigal explains.

And again:

“Compared with games, reality is unproductive. Games give us clearer missions and more satisfying, hands-on work.” Reality, it seems, is unstructured and offers few if any opportunities for satisfying work. Again, the work of our everyday lives is inherently tedious, and its goals are often ill-defined and hard to figure out.

One last sample:

“Compared with games, reality is disconnected. Games build strong social bonds and lead to more active social networks.” Real life is isolating, the author says. She cites the demise of extended communities in our everyday lives and refers to Robert Putnam’s landmark work Bowling Alone (2000) about the collapse of organizations and civic participation in the latter part of the twentieth century.

McGonigal argues that video gaming and alternate reality games can be powerful paths to boosting happiness, improving problem-solving and perseverance, and even sparking a sense of community, all of which can be applied to real-world experiences. To be sure, games of all sorts can be fun and give players a change of pace and a respite from the responsibilities of life. But McGonigal goes way beyond the fun part. She is right in saying that we need to take games more seriously, that they are not just an evil force in society offering opportunities for people to waste their time or play incessantly and addictively. But it is questionable whether she is also right in claiming that video games are truly transformational and provide positive experiences that can influence the way people act and think in their real lives away from the video game screen. Her evidence is anecdotal and largely unconvincing.

In the end, it is McGonigal’s perspective that is truly askew. Reality isn’t broken. It’s the relationship between people’s inner lives and their external reality that is out of whack. Life is complex, messy, full of demands, disappointments, inconveniences, and responsibilities. Virtual worlds and games, on the other hand, offer more structure, clearer goals, and hence new ways to feel successful and to communicate. But this does not by any means lead to authentic living. In the mid-1980s, the renowned mythology expert Joseph Campbell observed that many people were leading inauthentic lives. He said that they weren’t connected to their own inner spirit, nor did they have a sense of the fundamental mystery of life in general. Without a sense of who they really were and their place in the universe, it was not possible to be genuinely engaged with others. And this basis for leading an authentic life, Campbell wrote repeatedly, is what a living myth can provide.

Reality may seem broken for video gamers because life on the screen is so vivid, so complete in its opportunity for vicarious heroism. It is the land of superheroes and super tasks, mythological in the sense that characters and events are larger than life. But these things are not representative of a living mythology, which would inspire inward illumination and outer wonder through its symbols and narratives about modern life. “Myths inspire the realization of the possibility of your perfection, the fullness of your strength, and the bringing of solar light into the world. Slaying monsters [and here Joseph Campbell meant slaying the monsters within the individual] is slaying the dark things,” Campbell told Bill Moyers. “Myths grab you somewhere down inside.” Video games may excite, may amuse, may well elevate one’s mood, but they do not hit you down deep within your spirit. They do not change your life as Campbell defined it when he spoke of living myths.


Myths for Our Time (II): The Internet as Planetary Computer

6/1/2012

“People say that what we’re all seeking is a meaning for life. I don’t think that’s what we’re really seeking. I think what we’re seeking is an experience of being alive, so that our life experiences on the purely physical plane have resonances within our own innermost being and reality, so that we actually feel the rapture of being alive. That’s what it’s all finally about, and that’s what these clues in myths help us to find within ourselves. . . . Myths are clues to the spiritual potentialities of the human life. . . . We need myths [today] that will identify the individual not with his local group but with the planet.” Joseph Campbell, The Power of Myth

For many today, the Internet seems to be a powerful presence. Why does it have such a deep resonance within our imaginations? How has it become so central to our contemporary life? And what does that say about the lives we live and our values? People find the phenomenon of the Web full of possibilities. Many believe that there’s something magical in its very existence and that it offers access to knowledge and powerful modes of communication that are fundamentally different from what we have had in the past. There is also the pervading sense that the Internet is changing us, both individually and communally, in very important ways. 

One way of understanding the role of the Internet in our culture is to consider it as a metaphor and potentially part of a mythology that expresses some essence of what it means to be alive today. Many envision the Internet as an ever-expanding, boundless entity with near-infinite connections both to other people and to sources of knowledge. And this powerful pull of the Internet seems to me to come from its similarities both to our sense of our outer world—that ever-expanding universe of which we are such a minute part—and to our inner world—the endless depth of our own psyche, imagination, and unconscious with its potential links to communal metaphors and myths. Both these worlds, the outer and the inner, are ineffable, boundless, and to a certain extent mysterious, unknowable.

The Internet shares these characteristics and hence seems to offer a similar potential for knowledge, insight, even adventure. It’s cyberspace, after all, a place for journeys. One clicks on an icon (our computers do have their own “iconographies,” just as mythologies do). Microsoft Windows offers users an “Explorer” program to cross the threshold into the vast and unknown space called the Internet. The potential seen in the boundless, gargantuan phenomenon of the Web leads many people to make large claims: Kevin Kelly, founding editor of Wired, calls the Internet a “planetary computer,” a “global computer,” and even a “large-scale sentience” with a distributed and vast intelligence that grows “smarter” by the second as millions of users provide ever more information merely by clicking on a specific website because, in so doing, they indicate their preferences, their interests.
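Stripped of the mysticism, the “grows smarter by the second” claim rests on what engineers call implicit feedback: every click is logged and treated as a tiny vote of interest. A minimal Python sketch of the idea, with users and topics invented purely for illustration:

```python
from collections import Counter

# Each click is logged as (user, topic of the page visited). The names and
# topics here are made up; real systems infer topics from the URL or content.
clicks = [
    ("alice", "mythology"), ("alice", "music"), ("alice", "mythology"),
    ("bob", "gadgets"), ("bob", "gadgets"), ("bob", "mythology"),
]

def preference_profile(log, user):
    """Infer a user's interests purely from what they clicked on."""
    counts = Counter(topic for who, topic in log if who == user)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

print(preference_profile(clicks, "alice"))
# {'mythology': 0.666..., 'music': 0.333...} -- no one asked Alice anything;
# the "intelligence" is just arithmetic over her behavior.
```

Whether an aggregation of such tallies amounts to “sentience” is, of course, exactly the question.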

Is there transcendence here, one might ask, the kind of move beyond our ordinary life toward the ultimate mystery of the universe and the source of life itself? Kevin Kelly and others seem to think this is possible, that there is the promise of ultimate knowledge, the unknowable, in the Internet: “Currently,” he writes in What Technology Wants, “we are prejudiced against machines, because all the machines we have met so far have been uninteresting. As they gain in sentience, that won’t be true,” Kelly writes. “What technology wants is increasing sentience. This does not mean evolution will move us only toward one universal supermind. Rather in the course of time the technium tends to self-organize into as many varieties of mind as is possible. . . . The universe is so huge, so vast in its available mysteries, that it will require every possible type of mind to comprehend it. The technium’s job is to invent a million or a billion varieties of comprehension.”

Much hinges on what Kelly means by technium, a word he coined because he found that “culture” was too “small” and did not for him convey a sense of “self-propelling momentum.” (I would point out that the word “culture” is also associated with organic growth.) Kelly reaches toward a new kind of mystical sense in defining his technium: it is “the greater, global, massively interconnected system of technology vibrating around us.” And this is not just hardware and software but all “culture, art, social institutions, and intellectual creations,” along with this impulse (and here he quite anthropomorphizes technology) of essentially “what technology wants,” which is to generate more technology, more inventions, more connections. For Kelly, as for many enthusiasts of the Internet, the “technium” seems to be alive. But is it? And is it truly self-organizing, or is it just some version of Larry Page standing behind a curtain like the Wizard of Oz?

But the real question at the end of the day about what this technology “wants” is: does it have any place left for humanity and the spiritual potential that Joseph Campbell alludes to when he talks about the myths human beings create out of their own dreams, their own imaginations, their own psyches? Or is this new myth a myth of the machine as an all-knowing and all-powerful deity—in short, a god? Or maybe it’s all just smoke and mirrors, concocting something rather more illusory than elusive.


Myths for Our Time (I): The Web and Human Knowledge

5/14/2012

I recently had a chance to revisit a wonderful series of conversations Bill Moyers had with Joseph Campbell, a renowned and innovative scholar of comparative mythology, a long-time teacher at Sarah Lawrence, and the author of many books, beginning with The Hero with a Thousand Faces (1949). The Moyers conversations, called The Power of Myth, originally aired on PBS in the late eighties. But now, even twenty-five years later, there is much rich, relevant material in those programs and in a companion book that includes all twenty-four hours of those talks, which were whittled down to a mere six hours for the PBS series itself.

Campbell and Moyers spoke over a period of two years at George Lucas’s Skywalker Ranch and later at the Museum of Natural History in New York. In the course of their talks, Joseph Campbell repeated several times in different contexts that our contemporary life had no relevant myths. Things were changing too fast, he believed, for a mythology to form. He defined myths as metaphors, stories that harmonize our lives with reality. They express the experience of living in terms appropriate to a specific time. But our lives have essentially been demythologized in the latter part of the twentieth century (and perhaps even earlier). Yet the old myths are still useful as guides, Campbell always maintained; they provide messages and hints about what it means to be alive.

Various journalists, scholars, and innovative thinkers are writing about the nature of our life today and how we can accommodate the prevailing technology and flood of information and live successfully amid it all. In effect, they are attempting to articulate various parts of the dominant symbols, metaphors, and stories—a mythology of sorts for today. And so I thought it would be interesting to use this blog to explore some of these writings within the context of mythology as Campbell defined it and to bring some of his wisdom to bear on the problem of how we live in this world of the early twenty-first century.

One of the most powerful symbols of our time is of course the Internet, also known as the World Wide Web or just the Web, and it is having a profound influence on the way we think about things. David Weinberger, whose book Too Big to Know I reviewed on January 24th of this year, writes about the shape and nature of knowledge: Human knowledge, Weinberger argues, is assuming the shape of—and the scale of—the Internet. One solution to the resulting information overload, then, is to build not hierarchies but networks. Weinberger claims this is a serious shift in knowledge itself (although I believe it may be more a shift in how we perceive and manage information). Weinberger writes: “The Internet’s abundant capacity has removed the old artificial constraints on publishing—including getting our content checked and verified.” In case you don’t think this is necessarily a good thing, he expands on this vision: “The new strategy of publishing everything we find thus results in an immense cloud of data, free of theory, published before verified, and available to anyone with an Internet connection.”
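Weinberger’s hierarchy-versus-network contrast can be made concrete in data-structure terms: a hierarchy gives every fact exactly one place, while a network lets anything link to anything. A toy sketch in Python, with the topics and links invented for illustration:

```python
# A hierarchy: each topic has one parent, one fixed place in the scheme.
tree = {
    "knowledge": ["science", "art"],
    "science": ["physics", "biology"],
    "physics": ["optics"],
}

# A network: topics link freely, across any classification.
web = {
    "optics": {"physics", "photography", "vision"},
    "photography": {"optics", "art"},
    "vision": {"optics", "neuroscience"},
}

def reachable(links, start):
    """Follow links outward from one topic -- a crude model of browsing."""
    seen, stack = set(), [start]
    while stack:
        topic = stack.pop()
        if topic not in seen:
            seen.add(topic)
            stack.extend(links.get(topic, ()))
    return seen

print(reachable(web, "optics"))
# {'optics', 'physics', 'photography', 'vision', 'art', 'neuroscience'}:
# one entry point wanders into art and neuroscience alike, neighbors no
# single tree of knowledge would have placed side by side.
```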

This may sound a bit like the Wikipedia version of knowledge, but with less rigorous rules. In fact, it closely resembles a free-for-all of knowledge. Weinberger sees the structure of the Internet changing our understanding of scientific facts. He claims that authority no longer reigns, even in scientific research, because truth is always being debated and revised. But that has been the nature of science for centuries, as science learns more and more about the world around us. Weinberger’s claim that we can best learn to use the Net by understanding that authority, or the truth, is “the last page in the linked chain you visit” does not follow. It seems to me more whimsy than anything else. In fact, given the uneven quality of the information one can find on the Internet, it simply doesn’t make sense to say that the last page is the final word on a given topic. The last page could be completely specious, contradicting many highly informative pages that preceded it.

The idea that only now is knowledge networked is also very questionable. As C.W. Anderson points out in his review of Too Big To Know in The Atlantic, knowledge has always been networked, just not electronically. Using the example of finding the population of Pittsburgh in 1983 in an almanac, Anderson writes: “What do almanacs, census bureaus, government funding streams, volunteers, the notebooks volunteers carry, and libraries amount to, if not a network?” So too I would point out that learning has never been a linear process. As one researches a topic, one might move from a chapter in one book to a journal article to three books on the topic, and on and on, until one is satisfied with one’s grasp of the knowledge available. It wasn’t as easy as clicking on links, but it was always a process of exploration, with some serendipity and surprise bound to be part of the experience.

Nevertheless, Weinberger’s whole thesis is representative of a growing body of literature that derives its energy, its vision, and its sense of mystery (often verging on mysticism) from the image of the Internet, and this phenomenon in itself bears closer examination. Is this the beginning of a new mythology, a new set of symbols and stories that help us explain to ourselves and each other what it means to experience life today? Or is it just an overreaction to a powerful but essentially mechanistic intrusion of new electronic capabilities into our environment? Moyers writes that the last time he saw Joseph Campbell, he asked him if he still believed “that we are at this moment participating in one of the very greatest leaps of the human spirit to a knowledge not only of outside nature but also of our own deep inward mystery.” Campbell thought about that for a moment and then replied, “The greatest ever.” Perhaps the next phase is always the greatest ever, when it comes to science. I’m not sure, though, that our grounding in our own human spirits is making a comparable leap forward.
See also: the Joseph Campbell Foundation, a nonprofit organization dedicated to preserving his memory and works, including making available a large collection of work unpublished during his lifetime.


The Non-Stop Now of Social Media: From Wise Crowds to Group Narcissism

4/2/2012

Yesterday I attended a conference on social networking at the Boston Museum of Fine Arts and was once again struck by how absolutely overwhelmed and engulfed our modern lives are with digital machines and technology. They are omnipresent. We carry them with us wherever we go. We transact business through them. We communicate with friends and family and even with that larger amorphous network of acquaintances and distant mutual friends and their mutual acquaintances. And if we are honest, most of us will admit that we even take those devices into our bedrooms with us when we retire for the night. It is now so easy for our lives, our minute-to-minute experience of life, to become permeated and mediated by our technologies.

The conference, “Are Social Networks Really Social?” was sponsored jointly by the museum and the Boston Institute for Psychotherapy, so I would guess therapists of various sorts were in the overwhelming majority. But there were also more ordinary folk like myself, who came out of curiosity to hear three speakers on the topic of social media: (1) a psychologist and author who specializes in technology, Sherry Turkle; (2) a novelist, Helen Schulman, who has written about the nature and consequences of all things digital on the lives of ordinary families; and (3) an artist, Rachel Perry Welty, who has explored media like Facebook and Twitter as performance spaces. Over a long and rich afternoon those speakers and the audience pondered how social media—everything from email to smartphones to Facebook—affects both our relationships with others and our own psyches. There was a general consternation, even fear, and some sadness too, about how distracted, unfocused, and isolated individuals are becoming in our society.

Many ramifications of such behavior came up: People are less productive, and they’re less capable of sustained and complex thinking. Some observed that there’s an intolerance, perhaps even an inability, to actually cultivate solitude. Not unrelated is a strong tendency to avoid, even again to fear, having direct conversations with others. And this of course leads to a lack of intimacy as well as empathy. Sherry Turkle worries that many people are actually substituting “connections” for “conversations.”

I’m thinking it may be even worse than that, because when people post some thoughts on Facebook, send out a tweet, or text someone, they are often not “connecting” so much as they are “performing.” My own experience with Facebook and its current invitation to participate, “What’s on your mind?”, is that it is a site for self-promotion, or, as Norman Mailer once humbly (or maybe wryly) titled a collection of his short works, “Advertisements for Myself.” (Mailer was very good at the self-promotion thing, well before Web 2.0.)

The Yale computer scientist David Gelernter recently wrote a diatribe in The Wall Street Journal primarily against the careless, disposable nature of “digital words,” and how sloppy and lazy (and idiotic) texting and smiley faces really are. Yes, I agree that it’s all regrettable, and one can only hope that everyone will come to their senses eventually. But what is even more interesting is Gelernter’s observation that “Generation-i” is stuck in the “now,” neither pondering the past nor planning for the future. It’s the permeation and flow of the continuously new. “Merely glance at any digital gadget and you learn immediately what all your friends are doing, seeing, hearing, . . . and (if you care) what’s going on now any place on earth. The whole world is screaming at you. Under the circumstances, young people abandon themselves to the present. Group narcissism takes over, as Generation-i falls in love with its own image reflected on the surface of the cybersphere.”

Group narcissism. It seems we have far more of this phenomenon than we do of wise crowds and smart mobs. But it’s not clinical narcissism, which is a serious personality disorder characterized by dramatic emotional behavior, a deep need for admiration, an inflated sense of self-importance, and an underlying fragile self-esteem. No, this narcissism refers to the simpler classical myth of the beautiful Narcissus, who fell deeply in love with his own image in a pool of water as he drank from it. But he could never embrace or possess the image, so rather than relinquish it he lay down by the side of the pool and was gradually consumed by the flame of his own self-passion.


Facebook Era: Is the End of Its Dominance Near?

3/23/2012

It's starting to look like the beginning of the end. Signs are appearing, here and there, suggesting that the dominance of Facebook in the public imagination, and the widespread obsession in popular culture with all things Facebook, may be waning, oddly enough, even before it launches its initial public offering later this spring. A recent New York Times story about a new social website called Pinterest suggests some reasons why.

Pinterest's express goal is similar to Facebook's, but with a twist. Pinterest intends to "connect everyone in the world through the 'things' they find interesting." It declares itself a "virtual pinboard," which allows users to place pictures of objects they like and organize them. Others can view them and copy them for their own uses. Interest categories include food and recipes, wedding planning, garden styles, and just about anything else you can think of. Pinterest at once both leverages Facebook and has certain advantages over the Facebook model. It uses a "tie-in" with Facebook, so that each time a user signs up, the Pinterest website automatically "follows" all of that user's Facebook friends who are also members of Pinterest, sending them email notifications of the new enrollee.
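In programming terms the tie-in is a simple sign-up hook. Here is a hypothetical Python sketch of the flow as described; the function names and data shapes are invented and are not either company's actual API:

```python
def on_signup(new_user, facebook_friends, members, send_email):
    """When someone joins, auto-follow their Facebook friends already on
    the site and email those friends about the new enrollee."""
    members[new_user] = {"following": set()}
    for friend in facebook_friends.get(new_user, []):
        if friend in members:
            members[new_user]["following"].add(friend)
            send_email(friend, f"{new_user} just joined and is following you")

members = {"amy": {"following": set()}, "ben": {"following": set()}}
friends = {"carol": ["amy", "ben", "dave"]}  # dave isn't on Pinterest yet
on_signup("carol", friends, members, lambda to, msg: print(to, "<-", msg))
# amy <- carol just joined and is following you
# ben <- carol just joined and is following you
```

The effect is that every new member arrives with a ready-made audience, which is what makes the tie-in such cheap growth for the newcomer.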

The new website also distinguishes itself from Facebook by expressly discouraging self-promotion: "If there is a photo or project you're proud of, pin away! However, try not to use Pinterest purely as a tool for self-promotion." Take that, Facebook! Pinterest also may have a market advantage over Facebook. The site attracts a certain type of person, one who likes to collect or curate or scrapbook. The members tend to be hobbyists, and as such they tend to be avid. One investor characterizes them as "voracious" in their use of the website. Ironically, it tends to encourage true sharing in a way Facebook does not. As Susan Etlinger, a technology analyst and consultant, describes the difference: "Facebook used to be about connecting with your friends, but now it's so focused on the individual, with curating your timeline and how you present yourself to the world." Facebook isn't structured to bring together communities of interest.

Furthermore, because Pinterest members tend to be both collectors and people who shop online, they represent an active group of purchasers with vast market potential down the road. Like so many start-ups, Pinterest has to date relied on private investors and venture capital, to the tune of $40 million. But eventually they'll have to figure out how they're going to make money, and many investors believe their member base could represent a real gold mine.
 
Other news out of the South by Southwest technology conference in the past few days indicates that start-ups are distancing themselves from Facebook as they see the social networking giant becoming more profit-oriented. They worry that the rich data Facebook currently provides to its users could become accessible only at a price in the future. Although they for the most part continue to have Facebook as part of their model, more and more start-ups are making sure they build their own apps in ways that allow them to be independent of Facebook. As one entrepreneur puts it, Facebook is "our greatest opportunity and our greatest risk." For more see "Start-Ups Resist Facebook's Pull" in last Wednesday's edition of The Wall Street Journal.

Is Knowledge Dead? David Weinberger Seems To Think So . . .

1/24/2012

We’ve had the end of many things lately. It started with the end of modernism. The postmodernists declared that everything is an interpretation that occurs within a particular context, and that both the interpretation and the context are products of a particular culture and historical point in time. Hogwash, detractors have argued for years. But now along comes David Weinberger, who seems to have counted himself among the detractors for some years. In his latest book, Too Big to Know, Weinberger proclaims that the Internet has vindicated those crazy postmodernists after all. Derrida and his gang were right all along. And knowledge as we are used to thinking about it is dead, a passé concept from a bygone era.

According to Weinberger, things have changed because knowledge is no longer found just in books but also on the Net, where it is linked into complex configurations that defy the weight of authority. Apparently anything goes on the Internet, and Weinberger seems to revel in it as he celebrates our new age without traditional knowledge: “Welcome to the life of knowledge once it has been taken down from its shelf. It is misquoted, degraded, enhanced, incorporated, passed around through a thousand degrees of misunderstanding, and assimilated to the point of invisibility.” Knowledge, which used to be part of a pyramid that included data, information, knowledge, and wisdom, has become unknowable and impossible to master, Weinberger argues. He finds the shapelessness of knowledge reinvigorating, although he notes that this has unfortunately deprived knowledge of its foundations.

Weinberger’s argument is far-reaching: He claims that the very nature of knowledge is different because of the Internet. His rather jazzy subtitle draws the outline of the argument: “Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room.” Today, knowledge is messy. And it’s so complexly linked that the human brain can no longer fully comprehend it. “Knowledge now is the unshaped web of connections within which expressions of ideas live.” And it’s constantly being revised and debated so that knowledge has become a never-ending process.

I have to object to such a view. Just because any crackpot (including, I suppose, possibly me) can post an idiotic opinion or false facts or illogical arguments or bad poetry on the Internet doesn’t mean that knowledge is devoid of truth. If we say that the shape and content of the Internet determines what knowledge is, then we and our core humanity are truly lost. We are doomed to the wise crowd of the lowest common denominator and the smart mob in any random street.

Yes, we live in an age of “Big Data,” where sensors and tracking software record an enormous number of data points, and yes, such vast amounts of data make it easier to go wrong, but that still doesn’t mean there might not be a pattern in the data that could divulge some information. It’s still possible that collecting and analyzing enough information might lead to new insights and real knowledge. And yes, the Internet seems capable of holding infinite amounts of data and information. But hasn’t knowledge always been an open-ended affair? That’s what Hamlet was trying to tell Horatio when he told him there were “more things in Heaven and earth, Horatio, than are dreamt of in your philosophy.” And as for the fact that all this information seems hopelessly fragmentary, ninety years ago T.S. Eliot was complaining about the same thing as he wrote in The Waste Land about the mere “fragments I have shored against my ruins.” Sure, the Internet may be unfathomable. But so too are the human heart and the human brain.
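That “easier to go wrong” point is the statistician’s multiple-comparisons problem: screen enough random data streams against anything and something will correlate by sheer chance. A small Python illustration, using nothing but made-up noise:

```python
import random

random.seed(0)
n = 30
target = [random.gauss(0, 1) for _ in range(n)]  # the series we "explain"

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Screen 1,000 purely random "sensor feeds" against the target.
feeds = [[random.gauss(0, 1) for _ in range(n)] for _ in range(1000)]
best = max(abs(corr(feed, target)) for feed in feeds)
print(f"strongest correlation found in pure noise: {best:.2f}")
# Typically well above 0.5: an apparent "pattern" guaranteed by volume alone.
```

Which is also why the reverse claim stands: a genuine signal, tested honestly, can still rise above that noise floor.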

I find it heartening and enlightening to listen to scientists and artists who grapple with the mysteries of life at the edge of knowledge. The neuroscientist and researcher David Eagleman explained it well in a recent interview on NPR: “We’re always looking for patterns. . . . I’ve spent my life in science. . . . It is the single most useful pursuit that we have in terms of trying to figure out what is going on in the world. . . . But at some point the pier of science comes to an end, and we’re standing at the end of the pier and looking at uncharted waters that go for as far as the eye can see. Most of what we’re surrounded with is mystery, and what one comes to understand in a life of science is the vastness of our ignorance.”

But that doesn’t mean he didn’t go back to his lab in the morning.


Does Thinking Have a Future?

1/11/2012

I recently bought a promising book called The Future of Thinking: Learning Institutions in the Digital Age at the MIT Press Bookstore. Great title, I thought, and it looks like it’s written by two smart people: Cathy Davidson from Duke and David Theo Goldberg from the University of California system. Plus, the project was underwritten by the MacArthur Foundation as part of its series of Reports on Digital Media and Learning. I enthusiastically plopped a copy into my basket and went on browsing. This’ll be good, I thought.

The book itself, however, disappointed, and it’s been a little hard to figure out why. The actual book is less about thinking than about the learning environment the authors envision for the future. But that’s OK. I understand that the structures of our existing siloed educational institutions were conceived centuries ago. They certainly don’t reflect the way people are learning informally outside of those institutions today. But wait, I asked myself: do they have to? The authors seem to say yes. Is this the new reality?

Although from time to time the authors claim that they do not advocate using digital tools and technologies just because they are there, they do in fact believe that learning within institutions should better reflect how people interact outside of formal learning environments today, including through social networking tools, massively multiplayer online video gaming, virtual learning institutions, interactive collaborations, and open-access public forums. So let’s see where that takes us.

Existing traditional educational institutions are failing us, the authors argue. They envision future institutions as "mobilizing networks." This is all very up-to-date indeed, I thought. Instead of top-down authoritative teaching and learning, the mobilizing network would support peer-to-peer learning and collaborative knowledge production. Digital learning, they emphasize, is participatory learning—which is in part code for not "teaching to the test." That’s fine. Teaching to the test has never worked very well anyway.

But there was still another problem with this book about the future of thinking. There’s a way of arguing in this book that says: “Here we are. Here are roughly the outlines of the arguments. Here are some drawbacks. But still we must go on with our vision, mustn’t we?” And the authors assume an audience that is fully on board with their collaborative thinking. For example, they are concerned about how Web 2.0 as a network of "many-to-many collaborating and customizing together" may evolve in the wrong way as corporations such as Google gain control over more and more personal, institutional, and national information. But never mind: "Yet even though the concept is vague or open to exploitative, monopolistic, or oligopolistic (wow!) practices, Web 2.0 is a convenient way of signaling a new type of institution. It is one where contributions are distributed rather than coming from a single physical location and where ideas are shared outside the normal rules of tenure, credentialing, and professional peer review." Is there any room in their collaborative world for skepticism? For questioning whether loose collaboration-for-all is right for every age group and every discipline at all times?

There's also often a troubling lack of in-depth reasoning behind their advocacy of certain processes in the new forms of learning. Many people read differently these days, the authors argue. By implication our institutions should reflect these new processes, apparently with no analysis of their inherent value. Here's how the authors redefine reading for the digital age: "Even online reading . . . has become collaborative, interactive, nonlinear, and relational, engaging multiple voices. We browse, scan, connect in mid-paragraph if not mid-sentence to related material, look up information relevant or related to what we are reading. Sometimes this mode of relational reading might draw us completely away from the original text, hypertextually streaming us into completely new threads and pathways." It's an interesting description of what often happens online, but does it have anything to do with learning? Is it supposed to in the future?

Collaborative, many-to-multitudes, virtual, peer-to-peer—the authors present a remix, if you will, of some au courant concepts. From Chris Anderson’s The Long Tail, they project from economic and business theory onto the role of the university: "If we do indeed live on the long tail, . . . then virtual institutions may be the long virtual tail that wags the dog of traditional institutions without which it could not exist." Huh? Other popular ideas, such as those from Clay Shirky’s Here Comes Everybody: The Power of Organizing without Organizations and Yochai Benkler’s The Wealth of Networks, are added into the mix, all seemingly part of the ideal audience for this book.

I suspect the repetitive, jingoistic, and sometimes contradictory statements that emerge from the text reflect how the text was generated. You got it: very collaboratively. Not only did the two authors collaborate, but they then posted the draft online and invited comments—for a year. The draft was also presented in three additional public forums. Lastly, the authors worked to incorporate many of the comments and concerns voiced by others. It is a form of writing by committee that can wobble under the weight of the various points of view if not carefully shepherded by one (or even two) good writers with a single strong vision.

It isn’t that the book doesn’t offer food for thought on many issues. It does. How do we create multidisciplinary forums and projects within the currently rigid institutions? Is learning these days more a process of learning how to learn than of learning what? Is it less about actually acquiring the information, since the information will always be there to be acquired when needed? And what about credibility on the Internet? How prominent an emphasis do we need to give to teaching students how to discern credible sources of information as a new part of that learning "how" process?

But as for the future of thinking . . . well, it seems there still needs to be more thought put into that. I just don’t know how collaborative it has to be.
