The Cro-Magnon Brain in the Information Age
We are a limited race, we humans. In fact, that’s part of what it has always meant to be human—to make errors and mistakes, to forget car keys and birthdays, to be imperfect in ways large and small. Yet it still comes as a surprise to many of us that the brains we are born with today are almost identical to the brains of Cro-Magnon humans of 40,000 years ago. They had the same brain volume and anatomy that we have, Torkel Klingberg writes in The Overflowing Brain: Information Overload and the Limits of Working Memory (New York: Oxford University Press, 2009).
We certainly live in a world of far more information, in both volume and complexity, than our primitive ancestors did. But how well do we manage in this digital age of nonstop information and 24/7 stimulation? Not as well as we sometimes think, contends Klingberg, a professor and researcher in cognitive neuroscience. And that is largely because of the built-in limitations of working memory, known as “Miller’s mental bandwidth” or “the magical number seven.” In the 1950s, in one of the most influential articles of twentieth-century psychology, George Miller proposed that psychologists look at the brain the way physicists look at copper wires: as a communication channel that handles a certain amount of information over a given period of time. He proposed the number seven (plus or minus two) as the number of pieces of information that working memory can retain simultaneously.
Working memory temporarily stores and manipulates the information needed for such complex tasks as language comprehension, learning, reasoning, and problem-solving. This type of memory is closely associated with controlled attention; in fact, the two overlap in terms of neural activity. Simply put, you must pay attention to what you want to remember. Attention, like working memory, has limits: the brain can receive information from only one source at a time. Both controlled attention and working memory are required to solve complex problems, follow instructions, and keep goals in mind—all hallmarks of the cognitive powers of the adult human brain.
One of the most common experiences of reaching the limits of working memory is receiving—and trying to remember—a set of oral directions: Turn left on Pacific and proceed three miles, take a right at the Mobil station, go two blocks and take a right on Oakhurst North for about half a mile. Then take a left . . . and so on. Somewhere along the way, working memory finds its limit. Attention flags. The brain simply cannot absorb and retain any more of the stream of information.
Now, fifty years after George Miller’s paper, information technology has created an environment that bombards us with far more information than we can ever absorb and retain. To complicate the problem further, neuroscientists now know that the brain is plastic and flexible throughout an individual’s life, evolving over time based on each individual’s experience. Basically, we are what we attend to. Not only does the brain store memories, it redraws its own map, rewiring connections based on brain activity. Areas expand with use; others shrink with disuse. Our brains remodel themselves throughout our lifetimes, responding and changing according to the stimuli and content that we feed them on a regular basis. And, interestingly enough, the brain responds astonishingly fast to Internet use. Studies have shown that people who have never before used a search engine to explore the Internet begin to rewire the neural circuitry of their brains after as little as five hours of exposure.
Over time the brain can even strengthen its circuitry and become functionally larger through being challenged, whether with puzzles, learning a new musical instrument or language, developing other new skills, reading, or playing games. There are also exercises and routines that can help expand working memory and strengthen attentional power. But can the brain actually do two things at once?
While neuroscientists are not entirely sure how the brain handles two tasks at once, they are closing in on a likely theory. According to several recent studies, working memory and attention are confined to a small number of clearly definable areas in the brain. When two tasks demand access to these areas, a bottleneck arises, the simultaneous tasks interfere with one another, and, as a result, the brain switches back and forth between the tasks to accomplish the work at hand.
Thus it seems we can multitask only to the degree that some activities are largely automatic, needing little or no working memory. For example, we can walk and talk at the same time. We can even drive and talk at the same time. But when we try to drive while holding involved phone conversations, the consequences of multitasking quickly become apparent. Under some driving conditions, a single second of inattention can be dangerous, so switching back and forth between watching the road and dialing a phone number (or, worse yet, texting) is far more hazardous than, say, talking on the phone while sorting through junk email at your desk.
It has become popular to claim that members of the Millennial generation were essentially “born digital.” Those born in 1980 or later, so the story goes, have been exposed from an early age to the continual, competing claims of multiple media. They have learned to live with the myriad possibilities of split-screen attention. In fact, they seem to thrive in an environment where they are constantly interrupted while doing one task to attend to another. Those who first confronted this hectic environment as adults don’t do as well as their younger counterparts because the adult brain is less malleable, although it is still capable of significant changes.
Research is mounting, however, to disprove this theory. In fact, multitasking may not only be overrated—technically it may not even be possible. A group at UCLA recently demonstrated that multitasking impairs learning and memory even in teenagers. The researchers found that you can accomplish some learning while multitasking, but it is harder to retrieve the learned information, and it is especially hard to apply it to a new context or connect it with other information to gain new insights. University of Michigan researchers have concluded that multitasking makes people less productive, largely because of the costs involved in switching from one task to another and re-establishing the context of the problem. “Multitasking is going to slow you down,” says David Meyer, “increasing the chances of mistakes.” Vanderbilt University researcher René Marois agrees: “We are under the impression that we have this brain that can do more than it often can.”
A different study, this one at Oxford University’s Institute for the Future of the Mind, tested the impact of interruptions and found them to be a significant drain on the brain’s power in both younger and older generations. Researchers studied how interruptions affected two age groups, 18- to 21-year-olds and 35- to 39-year-olds. Both groups were asked to use a simple code to translate images into numbers within a time limit of 90 seconds. When they worked with no interruptions, the younger group performed the task 10% faster than the older one. When interruptions in the form of IMs, text messages, or phone calls were added, however, the performances of the two groups were equal. The researchers concluded that while the younger minds worked faster, the older minds had “more fluid intelligence, so they are better able to block out interruptions and choose what to focus on.” Fluid intelligence, which enables reasoning and the solving of nonverbal problems, critically depends on working memory.
In spite of evidence to the contrary, however, multitasking is if anything becoming more widespread: It is estimated that workers are interrupted every three minutes during the day, half the time by themselves. When people multitask, they tend to feel more productive, powerful, important, stimulated, hopeful of something new. This attraction to stimulation and novelty can cause “acquired attention deficit disorder,” a term coined by Harvard Medical School’s John Ratey, a specialist in attention. Ratey also contends that multiple technological devices can become an addiction. “It’s like a dopamine squirt,” he says, citing research showing that the brain’s reaction to such stimulation from high-tech devices follows the same path as known addictive behaviors. Many people casually brush this objection aside. A typical reaction is that of Charles Lax, a managing general partner at a Boston venture capital firm, who multitasks all the time: “We all suffer a kind of ADD,” he says. “But it’s not a problem. Being able to process lots of data allows me to be more efficient and productive.”
Ned Hallowell, another attention specialist and frequent collaborator with Dr. Ratey, sees the darker side of acquired attention deficit disorder. It has all the pitfalls of ADD but none of the pluses (ADD sufferers are frequently high-performance individuals who can at times create original ideas and innovative strategies). People with “acquired” attention deficit disorder, on the other hand, experience only the negative aspects of the disorder as their brains reach their limits processing growing amounts of data. Their ability to solve problems declines. They make more mistakes. Hallowell observes: “As our minds fill with noise—feckless synaptic events signifying nothing—the brain gradually loses its capacity to attend fully and thoroughly to anything.” Attention is hobbled, and the brain experiences alarms of “fear, anxiety, impatience, irritability, anger, or panic.” The body’s systems also go into red alert, dimming intelligence and prompting extreme behaviors, such as meltdowns or total avoidance of the problems at hand. Some people, Hallowell adds, deal with this condition better than others, but, he warns, no one can completely control the executive function of the brain, so one should take care not to go too far down this path in the first place. What he’s really saying is that we should be careful what we attend to, lest we lose the ability to pay attention at all.
See also This Is Your Brain Online