The Shallows

I've got a review of The Shallows, a new book by Nicholas Carr on the internet and the brain, in the NY Times:

Socrates started what may have been the first technology scare. In the "Phaedrus," he lamented the invention of books, which "create forgetfulness" in the soul. Instead of remembering for themselves, Socrates warned, new readers were blindly trusting in "external written characters." The library was ruining the mind.

Needless to say, the printing press only made things worse. In the 17th century, Robert Burton complained, in "The Anatomy of Melancholy," of the "vast chaos and confusion of books" that make the eyes and fingers ache. By 1890, the problem was the speed of transmission: one eminent physician blamed "the pelting of telegrams" for triggering an outbreak of mental illness. And then came radio and television, which poisoned the mind with passive pleasure. Children, it was said, had stopped reading books. Socrates would be pleased.

In "The Shallows: What the Internet Is Doing to Our Brains," the technology writer Nicholas Carr extends this anxiety to the 21st century. The book begins with a melodramatic flourish, as Carr recounts the pleas of the supercomputer HAL in "2001: A Space Odyssey." The machine is being dismantled, its wires unplugged: "My mind is going," HAL says. "I can feel it."

For Carr, the analogy is obvious: The modern mind is like the fictional computer. "I can feel it too," he writes. "Over the last few years, I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory." While HAL was silenced by its human users, Carr argues that we are sabotaging ourselves, trading away the seriousness of sustained attention for the frantic superficiality of the Internet. As Carr first observed in his much discussed 2008 article in The Atlantic, "Is Google Making Us Stupid?," the mere existence of the online world has made it much harder (at least for him) to engage with difficult texts and complex ideas. "Once I was a scuba diver in a sea of words," Carr writes, with typical eloquence. "Now I zip along the surface like a guy on a Jet Ski."

Much of Carr's argument revolves around neuroscience, as he argues that our neural plasticity means that we quickly become mirrors to our mediums; the brain is an information-processing machine that's shaped by the kind of information it processes. And so we get long discussions of Eric Kandel, Aplysia and the malleability of brain cells. (Having worked in the Kandel lab for several years, I'm a big fan of this research program. I just never expected the kinase enzymes of sea slugs to be applied to the internet.)

As I make clear in the review, I was not entirely convinced by Carr's arguments:

There is little doubt that the Internet is changing our brain. Everything changes our brain. What Carr neglects to mention, however, is that the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind. For instance, a comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention. This surprising result led the scientists to propose that even simple computer games like Tetris can lead to "marked increases in the speed of information processing." One particularly influential study, published in Nature in 2003, demonstrated that after just 10 days of playing Medal of Honor, a violent first-person shooter game, subjects showed dramatic increases in visual attention and memory.

Carr's argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a "book-like text." Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn't making us stupid -- it's exercising the very mental muscles that make us smarter.

This doesn't mean that the rise of the Internet won't lead to the loss of important mental talents; every technology comes with trade-offs. Look, for instance, at literacy itself: when children learn to decode letters, they usurp large chunks of the visual cortex previously devoted to object recognition. The end result is that literate humans are less able to "read" the details of the natural world.

On his blog, Carr disagrees with me:

I was startled to find him claim that "the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind." I think that's incorrect, even while I'm happy to acknowledge that brain studies are imprecise and can be interpreted in different ways (and that the definition of what's "good for the mind" will vary from person to person).

As evidence, Carr refers to a very interesting review by Patricia Greenfield, a developmental psychologist at UCLA. The problem with the review is that, while it covers many important topics (from the Flynn effect to the tradeoffs involved in multitasking), it only discusses a single study that actually looked at the cognitive effects of the internet.

This was tested in a communication studies class where students were generally encouraged to use their laptops during lectures, in order to explore lecture topics in greater detail on the Internet and in library databases. Half of the students were allowed to keep their laptops open, while the other half (randomly assigned) had to close their laptops. Students in the closed laptop condition recalled significantly more material in a surprise quiz after class than did students in the open laptop condition. Although these results may be obvious, many universities appear to be unaware of the learning decrement produced by multitasking when they wire classrooms with the intention of improving learning.

Now this is a compelling finding, and I agree with Professor Greenfield that it should lead colleges to reconsider having the internet in the lecture hall. (Although it's also worth noting that the students in the internet cohort didn't get lower grades in the class.) But as the paper itself makes clear, this was not a study about the cognitive effects of the world wide web. (After all, most of us don't surf the web while listening to a professor.) Instead, the experiment was designed to explore the hazards of multitasking:

The work here explored the effects of engaging in multiple tasks simultaneously on traditional outcome measures of performance. While methodologically the procedures employed in the present study differ somewhat from those of the classic divided attention paradigm, the essence of those procedures has been preserved, and the resulting performance decrement obtained. In two studies, students performing multiple tasks performed significantly poorer on immediate measures of memory for the to-be-learned content.

Given this paucity of evidence, I think it's far too soon to be drawing firm conclusions about the negative effects of the web. Furthermore, as I note in the review, the majority of experiments that have looked directly at the effects of the internet, video games and online social networking have actually found significant cognitive benefits. Video games improve visual attention and memory, Facebook users have more friends (in real life, too) and preliminary evidence suggests that surfing the web "engages a greater extent of neural circuitry...[than] reading text pages."

Now these studies are all imperfect and provisional. (For one thing, it's not easy to play with Google while lying still in a brain scanner.) But they certainly don't support the hypothesis that the internet, as Carr writes, is turning us into "mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory."

To get around these problematic findings, Carr spends much of the book dwelling on the costs of multitasking. Here he is on much firmer scientific ground: the brain is a bounded machine, which is why talking on the phone makes us more likely to crash the car. (Interestingly, video games seem to improve our ability to multitask.) This isn't a new idea - Herbert Simon was warning about the poverty of attention fifty years ago - although I have little doubt that the internet makes it slightly easier for us to multitask while working and reading. (Personally, I multitask more while watching television than while online.)

But even here the data is complicated. Some studies, for instance, have found that distraction encourages unconscious processing, which leads to improved decisions in complex situations. (In other words, the next time you're faced with a really difficult choice, you might want to study the information and then multitask on the web for a few hours.) Other studies have found that temporary distractions can increase creativity, at least when it comes to solving difficult creative puzzles. Finally, there is a growing body of evidence on the benefits of mind wandering, which is what happens when the spotlight of attention begins to shift inwards. Does this mean we should always be distracted? Of course not. But it does suggest that focused attention is not always ideal. The larger lesson, I think, is that we should be wary of privileging certain types of thinking over others. The mind is a pluralistic machine.

One last note: Carr makes many important, timely and eloquent points about the cultural losses that accrue with the arrival of new technologies. (This seems like an apt place to add that Carr is an awesome writer; The Shallows was full of graceful prose.) I'm a literary snob, and I have a weakness for dense novels and modernist poetry. I do worry, like Carr, that the everywhereness of the internet (and television before that) is making it harder for people to disappear down the worm hole of difficult literature. This is largely because the book is a quiet medium, and leaves much of the mind a bit bored. (This helps explain why many mind wandering paradigms give undergrads readings from War and Peace; Tolstoy is great for triggering daydreams, which suggests that literature doesn't always lead to the kind of sustained attention that Carr desires.)

But this cultural argument doesn't require brain scans and lab studies. One doesn't need to name drop neural plasticity in order to hope that we will always wrestle with the challenging texts of Auden, Proust and even Tolstoy. Carr and I might disagree about the science, but I think we both agree that the act of engaging with literature is an essential element of culture. (It might not be "good" for my brain, but it's certainly good for the mind.) We need Twitter and The Waste Land.

UPDATE: Nick Carr posts a typically thoughtful reply in the comments, and I reply to his reply.

Comments

Students, even at advanced levels, who leave their laptops open in class often spend a lot of time doing distracting things that are unrelated to their courses. Of course they do worse on pop quizzes. The same thing would happen if you had half the class read the Phaedrus during their lecture: while that half would usually wind up learning more than the students listening to the professor, they wouldn't remember as many details about what went on in the room.

By Law Student (not verified) on 06 Jun 2010

I was at an education conference about two years ago, where an eminent psychologist gave a speech on precisely this topic (I believe he actually cited some of Carr's work, as the name sounds very, very familiar).

The essence of the speech was 'Google is diluting our minds and scientific research'; it focused on the alleged neurological (and social) effects of having ready access to information.

From a casual perspective, I see the argument - one could rightly interpret the number of attention-deficit kids in Computer Science as indicative of the problem. Or, perhaps, one could interpret it as indicative of a failure to treat the problem - or that "problematic" children most readily find their way into the discipline, etc., etc...

I'd be more easily convinced if this wasn't said about every single medium ever developed, that we later took for granted and built the world around...

Jonah, I loved your review and share most of your views. I address a very similar topic in my book "The Principle of Relevance". In a nutshell, my theory is that the evolution of technology is inevitable and positive in that it gives unlimited access to information and knowledge, but it obviously comes at a price, which is an increased use and dilution of our attention span. I believe that rather than thinking that "the internet makes us stupid," it is up to us to become aware of the level of our information processing skills and upscale them to match the potential of technology, and to train ourselves to become powerful knowledge workers - masters of our wonderful and evolving technology - rather than random, superficial surfers who are victims of that same technology.

Stefania Lucchetti
Author of "The Principle of Relevance"

Dear Jonah,

Thanks for your kind words here and in the review about my writing. As I noted on my blog, I'm a big fan of The Frontal Cortex. (You're always distracting me, but in a good way.) Still, in this case I have to say that I think you're wrong - I don't believe that, as you assert, "the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind" - but, thanks to your more expansive explanation in this post, I now understand the source of the divergence in our views. You are much more restrictive in defining the scientific evidence relevant to an assessment of the Net's effect on the mind than I am. You (like Professor Greenfield, I should note) view studies of video games as relevant, even when they don't involve direct Net use. I agree with that; I think video games are a useful proxy for certain aspects of Net use. But in my analysis I also include as relevant evidence the many studies that have been done of hypertext, hypermedia, screen-based multimedia, and on-screen multitasking. It seems to me that, if we accept that the Net is a hypermedia, multimedia, and multitasking system, then it would be foolish to ignore studies of the cognitive effects of those elements (as you seem to be arguing we should). But I would go further than that. I also believe (as I discuss in The Shallows) that studies of interruptions, distractions, and attentiveness, particularly their effects on the formation of long-term memories and the synthesis of complex conceptual knowledge, are relevant to any study of the Net's effects on our thinking. And I would (and do) include studies of the behavior of Web users, including eye-tracking studies - these, too, strike me as relevant. I'm sure you have good reasons to exclude all of these sorts of studies, while including video game research, but to me, when it comes to examining the mental effects of a complex medium like the Net, it seems wise to draw on as broad a set of evidence as possible (while also, of course, keeping in mind the limitations of all these studies). I think that, in looking at the extant evidence broadly and in total, it's hard not to come away profoundly concerned about the Net's effects on the deepest, and to me most valuable, forms of human thought.

We are, however, on the same page when it comes to literature.

Best regards,

Nick Carr

So Carr's complaint is essentially that our brains are adapting to their environment? How is this problematic? Was it a tragedy 1.5 million years ago when the use of simple tools re-wired human brains?

I see no problem with our brains evolving in real time to better prepare themselves for an unknown future. If Carr wants his brain to stay prepared for the year 1997, that's his decision.

@ Eric

Agreed. Comparisons could be made to the shift in human societies from hunter-gatherer to agrarian. The physiological effects on humans have been profound and certainly not always positive. (I don't think there were too many morbidly obese hunter-gatherers.) But who today would advocate a return to a hunter-gatherer society?

As multiple points of research have revealed, we are awash in a sea of information. It is having profound effects on humans, both positive and negative; of that there is no doubt. What Carr's writings will perhaps accomplish is to make us more aware of the negative effects of the Information Age so - unlike with the agrarian revolution and the industrial revolution - we can lessen their impact.

In other words, we have to take the good with the bad, there's no way around it. However, let's be fully aware of the bad, rather than ignore it or be completely ignorant of it as we have in the aftermath of past societal revolutions.

Agree with Carr about The Frontal Cortex being awesome but very distracting! Sometimes I get tempted to try Ritalin or Adderall - not great for my creativity, but at least I'd be able to focus for long periods of time and resist distraction - the simple force of will doesn't seem to be enough for me. Hope you'll tell us more about these drugs and their effects soon.

Nick, thanks so much for your thoughtful and generous reply. Interactions like this almost make the internet worthwhile! I am more cautious when it comes to making generalizations that link the internet to "studies of interruptions, distractions, and attentiveness," as you put it. That's for a couple of reasons. Firstly, the justification for doing so relies largely on anecdotal reports and a single Stanford study with sample sizes of approximately 30 undergrads. And even then it remains unclear how much of that data is about the informational properties of the internet, and how much is about shuffling between the internet and the TV and the phone. Given these empirical limitations, I think it's very important (especially at this early stage) to constrain our speculations about what the internet is doing to our brain, especially when (in my opinion) many of the best controlled studies have linked computer use to cognitive enhancements.

Here's another example: In 2008, scientists at Michigan did a very clever study that showed that a certain kind of activity closely associated with modern life led to dramatic decreases in working memory, visual attention, and positive affect. (Needless to say, these decreases were much more dramatic than anything that's been linked to the internet.) What was this activity? It was walking down a city street. When people walk down the street, they are forced to exert cognitive control and top-down attention, and all that mental effort takes a temporary toll on their brain. Based on this data, I could easily make the case that it's better for the brain to stay home and play Google than go for a stroll in the metropolis. But I think that would be a shortsighted argument, based on a limited reading of a very limited data set.

Let's take this entire review and subsequent conversation to ground for a moment. Let's leave the numerous inconclusive studies, the theories in progress and the variable perceptions and arguments from academics, scientists and authors; let's just take a stroll down the street, to use Jonah's example.

What do we see with our own eyes? What is your gut reaction in the busy coffee shop as you observe those around you clutching their electronic devices? What comes to mind on the bus when it's packed full of school kids clicking away? What basic natural instincts are being awakened in such situations? Spend some time visualising more scenarios where technology is being over-used or abused. When you're ready, answer this question as honestly as possible.

Do you feel empathy or sympathy?

In How We Decide, Jonah Lehrer makes clear that emotional attunement to one's natural and social environment - which does have a neurological basis - is crucial to making judicious judgments and ad hoc decisions. I am just curious: are there any studies demonstrating that video games and web surfing are beneficial for this aspect of our mental lives? Or are they improving mostly the nerdish cognitive skills boasted by the neuroscientists conducting all those clever experiments?

By Ivelin Sardamov (not verified) on 07 Jun 2010

I think it is quite healthy to question the effects of new technology, regardless of its popularity and widespread adoption.

I see people walking down the streets with friends, who instead of talking, are texting away in silence. I just ate in a cafe next to a table with two guys who ate their entire meal surfing the internet on their cellphones, and spoke perhaps two words throughout the entire meal. I see cops texting while driving, and I witness my nephews going into depressions if they can't play videogames sometime during the day.

My concern is that these technologies are slowly trumping everyday reality for many people. Choosing between talking to your real live friend right next to you versus texting someone who's not, why is the electronic version of reality increasingly winning?

By Jonathan Hall (not verified) on 07 Jun 2010

I love both Rough Type and the Frontal Cortex and am a regular reader of both; there is a sparkspray on this one.

The incremental approach concerning the technological possibilities - from the mind to the book to the press to the net - doesn't sit right. I'm reminded of the adage "swans mate for life", somehow suggesting that if only we could be more 'swan-like' things would be really rosy. I'm just not sure we should be making our decisions on present-day tech by making historical reflections; the thoughts of birds don't bear much on my marital decisions, and B.C.E. thoughts of admittedly great men concerning the hi-tech of the time are difficult to move on.

I've been on the web about 16 years and wrote one of the first 5000 web pages. Through this time I have found it surprising how the possibilities afforded by the Internet have been rapidly transformed into moral imperatives; it's rather amazing how it seems to perpetuate itself by convincing its 'users' that it tops sliced bread, in a million different ways. I applaud both of you for being two of the few and best voices to point this sort of thing out - Facebook, Twitter, Google really could be crap. The fact that the battle is so one-sided speaks volumes, perhaps dangerously so.

Finally the possibility of the Net destroying our collective minds seems longitudinal in nature; I don't think we are going to get very far by conducting studies, much like we aren't going to solve the issue of near-shore drilling by improving blowout preventers. These are complex issues that deserve to be treated complexly and might be handled better with other tools more up for the job, such as executive judgment, white knuckles, and political shot-calling.

Jonathan Hall nails my point with pinpoint precision, thanks! Just clipped the comments and started a blog over on Amplify, hoping to get some meaningful answers to your question...

Why is the electronic version of reality increasingly winning?

Blog Post: http://bit.ly/bv5MHk

Jonah,

I agree that individual studies should be interpreted very conservatively, but I don't think these studies should be ignored. It's when you look at them in total - and I should note that there are several interruption studies that look specifically at email and instant messaging - that you see patterns emerge, sometimes striking ones. There's a vast literature, as you know, on the cognitive effects of interruptions during computer use (I refer your readers to interruptions.net), and I think all of it is relevant.

Regarding your city street point: you'll recall I compare being on a busy urban street to being online in the last chapter of the book! We love the city street and the web for many good reasons, but we should also be aware that they aren't conducive to some of the deepest - and to me most valuable - forms of thought our brains are capable of.

Cheers,

Nick

This is just a quick note to say that this blog is simply amazing. It's so refreshing to see such a civilised and thoughtful academic debate on the internet. As an undergraduate psychology student, my learning has been greatly enhanced by the articles I've read here.

My opinion, for what it's worth, is that there is no use in dichotomising this debate. As Dustin said, Nick Carr's book can serve to raise awareness of the negative aspects of internet use, which is important, I believe, in giving us a more balanced perspective. It's certainly made me question my internet use and whether or not my brain is over-reliant on one information medium.

However, the benefits of the internet are unmistakably enormous. I don't know whether I would ever have been exposed to fascinating ideas such as those we are discussing without the internet. It has given me a network (albeit, a somewhat anonymous one) of like-minded individuals with whom I can debate and discuss the topics that matter to me.

In any case, I'm going to approach my internet use a little more conservatively in the future and I think I'll make that little extra effort to tackle a good book like I used to. Nick's new one seems like a good place to start.

Kind regards,
John.

I would agree that there is a significant change occurring in our cognitive processes as a result of emerging information technology, though it seems far from conclusive that the net effect of this change is negative.

Following on the warnings of Socrates, does anyone here dispute that books have been a cognitive boon for humankind? With the advent of smartphones, I and millions of others have virtually instantaneous access to vast stores of knowledge. Yesterday over dinner naked mole rats came up in conversation, and I pulled out my Droid, used voice search, and had gobs of new information to add to the discussion. On a recent trip to San Francisco, I used walking navigation to find my way around the city. Personally, it seems a huge benefit to be able to spend fewer cognitive resources on storing large amounts of obscure facts or spatial maps. Instead, I take on the comparatively small cognitive load of learning how to effectively access the information and use it.

So I get the opposite subjective feeling from the one Carr describes. As I get older and become more immersed in information technology, I feel smarter.

But even assuming for a moment that Carr is correct in his assessment (which is far from conclusive), is there a call for a specific agenda? Is this a call to arms for a new Luddite revolution to smash the servers and free ourselves from the tyranny of ubiquitous computing?

By Derek James (not verified) on 08 Jun 2010

I'm a big fan of The Frontal Cortex, and I certainly welcome the hypothesis-generating work put forth in The Shallows. It's great seeing you two hash it out a bit more in the comments here, and I think it pulls on a fundamental difference that's being left out of these debates. I'm currently (trying to) do a PhD on some of the neurocognitive correlates of internet behavior patterns, and I don't agree at all with Nick's review of the literature. First of all, as John points out, there are a great many studies demonstrating cognitive enhancements from digital media. As you have both discussed, these may not map directly onto the kinds of 'deep thinking' discussed in the book. But I'm not so sure we should be sucked into the allure of this metaphor. Where exactly is the line between distraction and deep thought?

I'm that young man sabotaging myself on teh interwebz yet I found Nicholas Carr ... paradox?

I'm trying to get a handle on this. If I understand the dialog between Lehrer and Carr correctly (?), they agree on the value of sustained attention for certain purposes such as reading immersion, and agree that aspects of web use develop attention allocation patterns that work against that ability. They disagree, however ... (1) on how central sustained attention is to cognitive function in other areas (Lehrer says "I agree about literature" as if to say literature is not a general proxy for other cognitive skills), and (2) on whether the use of the web is really in itself intrinsically equivalent to multitasking, and whether skills and habits of using the web could be developed so that it is less disruptive to sustained attention, or at least develops compensatory abilities.

Am I homing in on it or getting the wrong impression?

Thanks for this wonderful post and followup discussion!!

After all of this information, I, too, am struggling a little to understand what the real issue is that's being discussed. Is the internet bad for the brain? Or is the way people use it bad for the brain? Are we short-cycling the learning and development process by multitasking and using the vast information on the net?

Many students spend five minutes studying, five minutes sifting through emails, five minutes on Facebook, back to studying for a scant moment, now texting with friends, now a quick Google search to find some answers, maybe a jump to YouTube to find that new clip that's going viral, again. Now jump back to Facebook to check if anyone has replied to the post!! And believe it, these students are brilliant enough to pass the next test, so does that mean these habits are improving their minds?? And then the worst of it is to claim they have been studying the whole time. Six hours of work, one hour of productivity…

So I think some of the discussion here is comparing apples to oranges: there are those who use multitasking skills and tools in very productive ways, and each new tool makes them better, and there are those who do not. The difference may have nothing to do with technology or multitasking, or anything that's being discussed here.

With the advent of technology, information is easy to find, but in today's classrooms it's more about hitting the right marks regardless of what it actually teaches you. If we want to grow and learn and be more capable people and have more facts at our disposal, all of the tools help. But without intention (vision, planning, goal setting), discipline, work ethic, intellectual curiosity, etc., the tools don't make you smarter, but they might make you feel smarter, and they'll probably make you look smarter too…

Thank You

A good book and an appropriate review. In addition - surprise, surprise! - a very decent discussion on the blog. What more can we ask for? For a beginning: a more comprehensive perspective. Let me explain, in the hope that both Nicholas Carr and Jonah Lehrer will take my lines as an invitation to read a book they might have skipped because it is sooooo old.

Yes, The Civilization of Illiteracy, a book 15 years in the making, was published in 1996. It identifies the reasons (not just the symptoms) for the discontinuity between the former paradigm of literacy and the new one, of many literacies. Its main thesis is: We are what we do. That is, our pragmatics affects every aspect of our lives: work, leisure, values, education, religion, war, family and sexual mores.

Due to my life circumstances, the discontinuity between the "old world" of the humanities and the "new world" of progress grounded in the "digital revolution" struck me after I arrived in the USA and began teaching (1980). Unless you were part of it, you will not recall what was going on in education (and value definition) at that time. The book argued passionately for a well-grounded understanding of the necessary character of the change it discussed. Given Nick's and Jonah's intellectual profiles, I am sure that had they known about my book, they would have further clarified their respective views. Maybe even changed some of the positions they argue for (or against). Socrates - the great standard argument - is rarely put in the proper context; therefore we keep writing and arguing about symptoms. May I suggest that you, and all those who came up with opinions on this blog, take a look at The Civilization of Illiteracy? It is available as a download at several sites (my own included: www.nadin.ws). Just Google.

In the 1440s Gutenberg invented movable type and by the 1450s he printed the first Bibles on a printing press. His invention fed directly into the emergence of the Renaissance, the Reformation and the Scientific Revolution. Because of Gutenberg information was accessible by the masses, not just interpreted and handed down by kings and clerics.

Similarly, Google-enabled web sites and blogs have broken down the control of information by government, politicians and large corporations. Real-time publishing gives us immediate, primary-source information; hyperlinks help us begin to see patterns and dig deeper into issues faster than ever before.

So I find discussions about Google's impact on short and long-term memory, etc. to be speculative and a diversion from considering the real advantages of parsing and linking large volumes of seemingly unrelated information.

By Paul Murphy (not verified) on 05 Jul 2010

Does anyone have the actual reference for this much cited article in the current games and cognition discussion?

"comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention. "

Internet, video games & human cognition

Nice review of a book focusing on important cultural change. The digital world revolves around the Internet; I agree with Carr that video games, iPhones etc. are important components. I'm a neurologist and medical school teacher. A seminar with discussion from all students, rather than me lecturing, requires that everybody shuts off computers, iPhones etc., unless we agree that Sarah can check PubMed to find out exactly what the results of blank's study were - i.e., that we need to know this detail. I've learned this the hard way. If I want to give a test of thinking and understanding, rather than regurgitation, I must stand in the middle of the class and make sure that nobody uses a computer, iPhone, etc. to pull up information. This is a big change from 30 years ago. I can with effort design an "open book test" in which students must write two essays on complex questions without simple Internet answers and without downloadable essays available to those who pay - I must do multiple Google searches for variants of each question first. In the real world, doctors, engineers etc. can and usually should do Internet searches, but they should budget their time to allow limited time for those searches, and most important, they should be skeptical of what they get - that often means sleeping on a novel idea before promoting it to others, or using it on a patient. I found The Shallows worthwhile but overstated and too narrow. Carr is right that every cultural change (not just technological change) has multiple trade-offs. If we spend more time on videogames/Internet, we must spend less time on other things, including probably less time talking to our neighbors. There is no evidence that video games, brain training etc. (which are used to exploit many older people worried about Alzheimer's) improve executive function.

Practice has a big effect on our brains, even at age 80. It's not just that our brains are constantly changing at the cellular level, it's that sustained practice lets us do much more. However, sustained practice or constraint physical therapy after a stroke (the good arm is restrained for 12 hours a day) has limits. If the machinery for movement is totally destroyed, all the physical therapy in the world won't help. Likewise, practicing holding my breath may enable me to reach 5 or 6 minutes (if I hyperventilate first) but never fifteen minutes. What am I practicing when I surf the Internet?

I recommend these papers: Putting brain training to the test. Adrian Owen, et al. Nature 465, 775-778, 2010, from the Cambridge MRC Cognition and Brain Sciences Unit (CBU). This massive study had 11,430 participants. Like all other studies, it was short term and found practice effects - people improved at the tasks that they practiced. However, no transfer of training to other tasks was found.

The effects of video game playing on attention, memory, and executive control. Boot WR, et al. Acta Psychol (Amst). 2008;129:387-98, University of Illinois, Urbana-Champaign. Expert video game players usually outperform others on measures of attention and performance. They probably do better as controllers of drone aircraft and surgeons working with the Da Vinci robot, although this hasn't been rigorously tested. Is their better performance on rapidly changing tasks due to their video game experience, or are people with those skills more likely to spend lots of time on video games? Long-term experimental studies of video game or Internet use suffer from the same problem as long-term human dietary studies - people won't stay in the group to which they are assigned, and they often provide deceptive reports about their past diet or activities. We have only short-term studies whose application to the real world may be small. Some drugs may have opposite effects after years of use than they do when used for 6 weeks.

In terms of improving cognition in 80-year-olds, regular exercise will do more than any kind of brain training; there's no reason that you couldn't combine both. Electronic games can improve driving skills, which aren't trivial.

I'm interested in the complaint of some Chinese and Russian leaders that the US devised Twitter, Facebook etc. as cyber weapons to destabilize their governments. I think that these "weapons" will erode the functioning of any large society; they increase our tendency to respond to political memes such as "small business", "choice", "freedom", "personal responsibility", etc. as centrifugal forces. They facilitate assembly of crowds for political action, as we saw in the Iranian electoral crisis of 2009. Large governments will face increasing distrust of their people - this is easier to see in the US than Iran. We have more media access, but it will come to Iran and any country connected to the Internet. I'm more worried about the effects of the digital world on society than on the nervous system.

By bobsnodgrass (not verified) on 23 Sep 2010

I'm just a regular Joe who is spending a lot of time researching multitasking, because, like at least 97 or 97.5 percent of humans, I suck at it. The big difference between me and many other people is that I know that I suck at it.

In reference to the point made in the review regarding the study of the use of Google:

"Carr's argument also breaks down when it comes to idle Web surfing. "A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a "book-like text." Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn't making us stupid -- it's exercising the very mental muscles that make us smarter."

It seems to me that there should be more activity in this area, as when we do a Google search we have not yet made up our minds on the topic or chosen the material. There seems to me to be an amount of executive function that is still required, which shows up as additional activity and may not indicate an actual improvement or enhancement in ability.

Did this study show any improvements in the subjects' abilities to remember the material that they had googled, or to put it to meaningful use? Or did it simply show the activity associated with trying to decide what to focus on as a result of the search?

A friend just linked me to this; really interesting, thanks!

I really appreciate that you mention evidence that shows use of the internet to be beneficial. The whole criticism of social media has sprung up again (I'm going to link to this in the post I wrote yesterday: http://noodlemaz.wordpress.com/2011/01/24/antisocial-media/ ) and barely anyone seems to bother trying to find out what studies have been done and what they actually show.

Of course there are negative effects; I know that I myself procrastinate severely, while away hours online - but I do think I gain things from that as well. Both in terms of learning and social interaction.

Also interested that you feel you multitask more watching the TV; I find the opposite. I can do several things at once online (maintain a few conversations, writing, researching, running errands) but with the TV, either I'm watching it or I'm not. I think I may have lost my TV multitask ability, as I genuinely used to revise and watch programmes at the same time.

Anyway, off to write a little link to this in my post!

@patness said, "I'd be more easily convinced if this wasn't said about every single medium ever developed, that we later took for granted and built the world around..."

True enough as far as it goes. Looking a little farther, though, one must ask: Can we "build the world around" a population with short attention spans, drastic memory changes, etc.? That is to say, each transformation in human technology and consciousness causes changes, but what if the current technology causes changes that are not compatible with the type or complexity of civilization that we've reached?

Not as if it matters a whit - the changes happen and the river flows on; none of us can stand up and say "Stop, world - change your direction." In fiction it can be done; see the sf story The Marching Morons.
