We all know Twitter can be annoying, but is it really evil? During the past week, you may have heard that there is brand-new neuroscientific evidence proving exactly that. But the hype turns out to be just that: hype.
It all started with a press release from USC about an upcoming PNAS paper by Mary Helen Immordino-Yang and Antonio Damasio, entitled "Neural Correlates of Admiration and Compassion." The USC press release, which was picked up by EurekAlert and other outlets, says:
The finding, contained in one of the first brain studies of inspirational emotions in a field dominated by a focus on fear and pain, suggests that digital media culture may be better suited to some mental processes than others. ... The study raises questions about the emotional cost - particularly for the developing brain - of heavy reliance on a rapid stream of news snippets.
"If things are happening too fast, you may not ever fully experience emotions about other people's psychological states and that would have implications for your morality," Immordino-Yang said.
The next thing you know, the UK's Daily Mail is warning, "Twitter can make you immoral, claim scientists":
Social networks such as Twitter may blunt people's sense of morality, claim brain scientists.
New evidence shows the digital torrent of information from networking sites could have long-term damaging effects on the emotional development of young people's brains.
A study suggests rapid-fire news updates and instant social interaction are too fast for the 'moral compass' of the brain to process.
The danger is that heavy Twitters [sic] and Facebook users could become 'indifferent to human suffering' because they never get time to reflect and fully experience emotions about other people's feelings. (source)
Other outlets then started repeating the same spiel. The Telegraph's headline was "Twitter and Facebook could harm moral values, scientists warn." CNN said "Scientists warn of Twitter dangers."
Quite simply, this is another case of pop science journalism gone horribly awry. Damasio's PNAS paper did not conclude that Twitter can make you immoral. The paper didn't investigate the moral effects of Twitter (or other media technologies) at all. The word "Twitter" isn't even in the paper!
So how did the popular media get it so very wrong?
First, let's look at what the paper actually did. From the abstract:
In an fMRI experiment, participants were exposed to narratives based on true stories designed to evoke admiration and compassion in 4 distinct categories: admiration for virtue, admiration for skill, compassion for social/psychological pain, and compassion for physical pain.
Basically, 13 subjects (all right-handed Americans, if you'd like to make some more wild generalizations about morality) listened to stories intended to make them either admire or sympathize with others. Later, the subjects were placed in an MRI scanner, reminded of the stories, and asked to summon the emotions they had felt upon originally hearing them (participants were asked to become "as emotional as possible.")
What's the point? Well, the experimenters wanted to see which areas of the brain were most active (based on blood oxygen levels) during each type of emotion. They also wanted to know whether activity differences were apparent when the subjects' emotional response was based on a physical situation (physical pain or admirable physical skill) as opposed to a mental or social situation (social pain like grief, or admirable moral virtue).
The results? First, both feelings of admiration and feelings of sympathy correlated with increased brain activity in the posteromedial cortices (PMC). (That's why the paper is titled "Neural correlates of admiration and compassion": a physiological change in the brain associated with a specific mental function or state is called a "neural correlate.") Second, different parts of the PMC were active in conditions dealing with physical situations, as opposed to social situations. What does that mean? Well, it appears that slightly different regions of the cortex handle our appreciation of other people's physical experiences (like injury or motor skills) as opposed to their social/mental experiences (like altruism or suffering).
These differences may be based partly in how we think about our own bodies and minds. When we envision another person's emotional state, we often imagine ourselves in his or her place. It makes sense that parts of the brain involved in processing our own physical pain might be involved in sympathizing with the physical pain of others, while more interoceptive circuits could be activated by others' social pain. The authors say:
Overall, these results suggest that the processing of social emotions is organized less around the kind of emotional response, be it compassionate or admiring, than around the contents and context of the situation.
I know, I know: we still haven't gotten to Twitter!
It turns out that the Twitter brouhaha arose out of the paper's third conclusion, which talks about the relative speed of emotional processing. The authors note that brain activity in one region (the anterior insula) peaked more quickly for sympathy with physical pain than for the other conditions. They attribute this to a more efficient, direct mechanism for relating to physical pain than for the other conditions. This makes evolutionary and developmental sense. Physical pain is arguably the most "basic" of the situations being tested; animals and young children can both sympathize with physical pain in others, even if they can't understand altruism or relate to grief or angst. And that's where you get the title of the USC press release, "Nobler Instincts Take Time."
(One more point - given how the media spun this finding, it's worth noting the time scale here. The authors found that brain activity associated with compassion for physical pain peaked at 6 seconds. Brain activity associated with compassion for mental pain, and the two admiration conditions, peaked between 8 and 12 seconds. So it's not like compassion for physical pain is instantaneous; nor does mustering compassion for emotional pain require several minutes of deep thought.)
So where does Twitter come in? Well, as you've seen, the paper is cool, but not terribly thrilling to your average reader, who likely doesn't know the PMC from the DMV. Maybe it wasn't thrilling to the USC folks writing the press release, either, because they chose to build off of one speculative paragraph from the paper's discussion:
If replicated, this finding could have important implications for the role of culture and evolution in the development and operation of social and moral systems; in order for emotions about the psychological situations of others to be induced and experienced, additional time may be needed for the introspective processing of culturally shaped social knowledge. The rapidity and parallel processing of attention-requiring information, which hallmark the digital age, might reduce the frequency of full experience of such emotions, with potentially negative consequences.
Okay. That speculation relates to the results described in this paper, but it also reflects a more general, timely concern that new media technologies are placing unprecedented stress on our cortical systems. (We've addressed these concerns before on this blog, here, here, and most extensively here. The neurobiological jury is still out on whether Google really is making us stupid.)
Anyway, what Damasio and his co-authors are doing here is perfectly normal: using the discussion to toss out an interesting possible direction for further research. Note that they very responsibly qualify this idea with "if replicated" and "could have" and "may be" and "might."
But this paragraph, I imagine, is why the USC press release prominently emphasized a quote not from another MRI expert, but from USC Annenberg media scholar Manuel Castells, who said, "Damasio's study has extraordinary implications for the human perception of events in a digital communication environment." By focusing on the new media implications, the press team tapped into a widespread and familiar public concern - the possibly negative effect of new media on human nature and society - while avoiding jargon-heavy brain regions and tricky explanations of who was feeling what kind of pain.
While it pushes the new media implications of the work hard, the USC press release admits (to a careful reader) that neither Castells nor the authors are calling Twitter, or any other social media platform, dangerous:
Immordino-Yang did not blame digital media. "It's not about what tools you have, it's about how you use those tools," she said.
Castells said he was less concerned about online social spaces, some of which can provide opportunities for reflection, than about "fast-moving television or virtual games."
But the release has a decidedly mixed message. Right before those quotes, it asserts that
Clearly, normal life events will always provide opportunities for humans to feel admiration and compassion. But fast-paced digital media tools may direct some heavy users away from traditional avenues for learning about humanity, such as engagement with literature or face-to-face social interactions.
This is yet another case of poorly chosen headlines and misleading leads, as we were discussing last month. To make matters worse, the USC press release was edited at least once; you can't tell now, but the USC release originally referenced Twitter. (See Neurocritic for a detailed timeline of who said what when.)
Amidst the confusion, other reporters and editors apparently succumbed to a series of intuitive leaps, leading to headlines like "Twitter can make you immoral, claim scientists." These extrapolations seem plausible, but are based not on the actual paper by the Damasio lab - it was not yet released by PNAS - but on that USC press release. Apparently very few people thought it was necessary to check the facts.
It's reassuring to see that the outrageousness of the Twitter claim was a red flag to at least some writers. In addition to the Neurocritic's excellent posts on the topic, way back on Tuesday, PCMag's Mark Hachman called the link to Twitter "dubious":
What's odd about the press release is that Immordino-Yang's quotes are almost all confined to one issue: how an emotional response takes time to formulate. The author of the press release (report?), though, seems to be writing to an agenda. "Tweet this: Rapid-fire media may confuse your moral compass," the author, Carl Marziali, writes.
Oh well: Twitter, Twitter, Twitter. See, we've filled our quota now, too.
On Wednesday, blogger Josh Smith of WalletPop, who is not a scientist, contacted Damasio, who said "We were certainly NOT talking about Twitter. The claim that Twitter makes us immoral is NOT ours, and has nothing to do with our study." (Wow! Josh went back to the original source and asked for clarification! How innovative!)
Hopefully, the news cycle for the "Twitter is evil" claim is over now (at least with reference to the Damasio paper). But it's disturbing to look back at the last week and see how thoroughly the supposed results were oversold by an enthusiastic media (I hate it when science is oversold) in a kind of positive feedback loop that, not surprisingly, eventually made its way onto Twitter. (Note: 140 characters is NOT enough to accurately describe a scientific study.) Even those writers who portrayed the study accurately, like msnbc's Alan Boyle, played the "Is Twitter Evil?" card, and the misrepresented conclusion worked its way onto solid science blogs like the Intersection:
I don't think I'm the only one out there lately who sense [sic] that just maybe, not every aspect of how the Internet affects the media-or our thinking-is an improvement. In fact, there's actually science on this: See "Rapid-fire media may confuse your moral compass," EurekaAlert's breakdown of a recent study from USC neuroscientist Antonio Damasio.
No, there isn't science on this, Chris. Not quite yet!
So is Twitter evil? Who knows. But the accumulation of alarmist anecdotes, no matter how emotionally powerful, does not scientific proof make. And while Sheril, Physioprof, and others may be perfectly justified in eschewing the kingdom of Ashton and Oprah, neuroscience hasn't proven it's any worse than blogging.* For that level of scientific certainty, we have to wait until the Neurocritic's ingenious controlled Twitter experiment is funded.
Until then, expect more Twitstorms in the mainstream media.
*Full disclosure: yes, I'm on Twitter; I've found it useful in some respects but also exhausting. I don't even try to keep up with it. Basically, I wanted to experience it first hand before writing about it.
PNAS: "Neural correlates of admiration and compassion." (forthcoming)
Argh - forgot that two links automatically send a comment into the spam folder....
Ben Goldacre's consistently excellent site, Badscience, also covers this
I love it when you write up this kind of thing. How ridiculous . . . yet how typical.
Thanks for writing this. Twitter is just a communication tool and can be used or not used for a variety of purposes, many of which don't have much to do with rendering moral judgments.
BioE, you are objectively pro-twitter.
As long as you don't call me a twit, I'm okay with it. ;)
Tweet from rick_vosper: Real scientist shows how media screwed the pooch with "Twitter Could Harm Moral Values, Scientists Warn" headlines.
Why am I not surprised that Chris fell for this one.
Thanks for covering this! Sometimes you need more than 140 characters. :) It's just amazing how people make those leaps from "speed of processing" to "moral character".
You clearly have strongly felt emotions about the misrepresentation of scientific findings in the media. However, overexposure to social media has rendered me indifferent to your suffering. Sorry!
I hope I'm not to blame for the "twitter is evil" meme being associated with the Damasio study. My post on "why Twitter is evil" appeared shortly before the USC press release was issued, and attracted a ridiculous amount of attention, even though it was mostly tongue in cheek stuff.
Which only supports the thesis that Twitter is a low-signal medium.
That paper may not claim that twitter is evil, but dammit - Twitter is Teh Evile!!!
(or more accurately, seems rather exceedingly silly to me)
(but I am perfectly capable of arguing minutia for hours, knowing that even if I convince the person I am arguing with of my point, it won't change their opinion on the underlying issue being argued - so I may not be the best judge of silly)
linked to you here.
No, I don't think you are, James. :) I think you were just riding slightly ahead of the curve on this one, or prescient, or something. Nice job! :D
ROTFLMAO michael5000! You are too cruel!!!
I'm on Facebook and since the recent changes to the update format there I'm convinced I don't need Twitter. FB tried to buy up T and were unsuccessful. I hated the new Twittery (so I'm told) format but now I've gotten used to it I don't mind so much. I've found FB to be an excellent networking tool.
BTW, Language Log has another great debunking here.
Those darn right-handed Americans...
From wiki/trigeminal nerve/sensations:
"There are two basic types of sensation: touch/position and pain/temperature. They are distinguished, roughly speaking, by the fact that touch/position input comes to attention immediately, whereas pain/temperature input reaches the level of consciousness only after a perceptible delay. Think of stepping on a pin. There is immediate awareness of stepping on something, but it takes a moment before it starts to hurt.
In general, touch/position information is carried by myelinated (fast-conducting) nerve fibers, whereas pain/temperature information is carried by unmyelinated (slow-conducting) nerve fibers."
So there's a temporal, step-by-step pattern: 1) physical sensation, 2) pain sensation awareness, 3) motor reaction, 4) emotional compassion/empathy (which I think strongly correlates with mimicry and social species behavior).
I've no idea how that links to twitter tweets, though my dad's nickname was 'twit', my grandma's nickname was 'byrde' and an ex's nickname was 'tweety'. hmm, all right-handed Americans...