Things to read, Weekend Edition

Turns out DC has, or once did have, a hidden subterranean labyrinth - and you thought it was just a plot device from last fall's South Park election special! Even better, it was dug by a lepidopterist. Take that, you engineers!

ONE of the oddest hobbies in the world is that of Dr. H. G. Dyar, international authority on moths and butterflies of the Smithsonian Institution, who has found health and recreation in digging an amazing series of tunnels beneath his Washington home.

The New York Times revealed its 50 most looked-up words, and Nieman Journalism Lab had commentary:

"All of the 25-cent words I used in the lede of this post are on the list. The most confusing to readers, with 7,645 look-ups through May 26, is sui generis, the Latin term roughly meaning "unique" that's frequently used in legal contexts. The most ironic word is laconic (#4), which means "concise." The most curious is louche (#3), which means "dubious" or "shady" and, as Corbett observes in his memo, inexplicably found its way into the paper 27 times over 5 months. (A Nexis search reveals that the word is all over the arts pages, and Maureen Dowd is a repeat offender.)

"Corbett also notes that some words, like pandemic (#24), appear on the list merely because they are used so often. Along those lines, feckless (#17) and fecklessness (#50) appear to be the favorite confounding words of Times opinion writers. The most looked-up word per instance of usage is saturnine (#5), which Dowd wielded to describe Dick Cheney's policy on torture."

Tom Vanderbilt explained in the New York Times Magazine why all those photos and Tweets and blog comments you upload are probably housed in giant featureless boxes in the middle of bean fields:

Power looms larger than space in the data center's future -- the data-center group Afcom predicts that in the next five years, more than 90 percent of companies' data centers will be interrupted at least once because of power constrictions. As James Hamilton of Amazon Web Services observed recently at a Google-hosted data-center-efficiency summit, there is no Moore's Law for power -- while servers become progressively more powerful (and cheaper to deploy) and software boosts server productivity, the cost of energy (as well as water, needed for cooling) stays constant or rises. Uptime's Brill notes that while it once took 30 to 50 years for electricity costs to match the cost of the server itself, the electricity on a low-end server will now exceed the server cost itself in less than four years -- which is why the geography of the cloud has migrated to lower-rate areas.
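A quick back-of-the-envelope calculation shows how that break-even point works. The numbers below are purely illustrative assumptions on my part (a roughly $1,000 low-end server, 250 W average draw, a PUE of 2.0 to cover cooling overhead, and $0.10 per kWh), not figures from the article:

```python
# Back-of-the-envelope: how long until cumulative electricity cost exceeds
# the purchase price of the server? All inputs are illustrative assumptions,
# not figures from the article.

server_cost = 1000.0     # USD, hypothetical low-end server
avg_draw_watts = 250.0   # average power draw of the server itself
pue = 2.0                # power usage effectiveness: facility overhead (cooling, etc.)
rate_per_kwh = 0.10      # USD per kWh

hours_per_year = 24 * 365
kwh_per_year = avg_draw_watts * pue * hours_per_year / 1000.0
electricity_per_year = kwh_per_year * rate_per_kwh
breakeven_years = server_cost / electricity_per_year

print(f"Electricity: ${electricity_per_year:.0f}/year; break-even in {breakeven_years:.1f} years")
```

With those assumptions the electricity bill overtakes the server's sticker price in a bit over two years, and the break-even scales directly with the local electricity rate -- which is exactly why the cloud keeps migrating toward cheap power.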

In Newsweek, Sharon Begley indicted academia for slowing the translation of science into clinically useful interventions.

Not all scientists put career second. One researcher recently discovered a genetic mutation common in European Jews. He has enough to publish in a lower-tier journal but is holding out for a top one, which means identifying the physiological pathway by which the mutation leads to disease. Result: at least two more years before genetic counselors know about the mutation and can test would-be parents and fetuses for it.

Bloggers Tim Kreider at Science-Based Medicine and scibling Orac responded.

Mike Jay imagined "The Day Pain Died" for the Boston Globe Ideas Section:

What the great moment in the Ether Dome really marked was something less tangible but far more significant: a huge cultural shift in the idea of pain. Operating under anesthetic would transform medicine, dramatically expanding the scope of what doctors were able to accomplish. What needed to change first wasn't the technology - that was long since established - but medicine's readiness to use it.

Ronald Musto complained in the Chronicle Review that Google Books doesn't make the historian's job any easier when it "mutilates" the books it's ostensibly making more available:

In its frenzy to digitize the holdings of its partner collections, in this case those of the Stanford University Libraries, Google Books has pursued a "good enough" scanning strategy. The books' pages were hurriedly reproduced: No apparent quality control was employed, either during or after scanning. The result is that 29 percent of the pages in Volume 1 and 38 percent of the pages in Volume 2 are either skewed, blurred, swooshed, folded back, misplaced, or just plain missing. A few images even contain the fingers of the human page-turner. (Like a medieval scribe, he left his own pointing hand on the page!) Not bad, one might argue, for no charge and on your desktop. But now I'm dealing with a mutilated edition of a mutilated selection of a mutilated archive of a mutilated history of a mutilated kingdom -- hardly the stuff of the positivist, empirical method I was trained in a generation ago.

And for some more pessimism, Don Tapscott predicted the impending demise of the university.

Countering Nicholas Carr's proposition that Google is making us stupid, Tapscott writes:

"My research suggests these critics are wrong. Growing up digital has changed the way their minds work in a manner that will help them handle the challenges of the digital age. They're used to multi-tasking, and have learned to handle the information overload. They expect a two-way conversation. What's more, growing up digital has encouraged this generation to be active and demanding enquirers. Rather than waiting for a trusted professor to tell them what's going on, they find out on their own on everything from Google to Wikipedia."

Maybe - but my question is, do they know how to tell if what they find on Google or read on Wikipedia is accurate? I don't think so. Information overload may be my generation's bane, but I think the next generation may lack the skeptical instincts they need to navigate their information-saturated environment. What do you think? (Either that or they'll run out of power to cool their servers and we'll all be back in the pre-industrial age. C'est la vie.)

At any rate, that should be enough reading for now... given our state of information overload and all. I'm off to look for those subterranean tunnels full of butterflies.

"Maybe - but my question is, do they know how to tell if what they find on Google or read on Wikipedia is accurate? I don't think so. Information overload may be my generation's bane, but I think the next generation may lack the skeptical instincts they need to navigate their information-saturated environment. What do you think?"

I disagree. "Skeptical instincts" tend to be the result of being burned by lies a few times, and everyone growing up now is going to encounter that plenty of times. If anything, it's previous generations, used to getting information only from authorities, who you'd expect to lack skepticism. Heck, even the standard 'rickroll' prank is an exercise in being skeptical.

That's right, MPL. I started learning skepticism in primary school, when I found I had to unlearn so much of what my Dad had told me about history and the sciences. He used to love talking to me while I listened like a dutiful daughter. But once I'd been at school for a few years, I realised I couldn't trust a word he was saying. Some of what he'd told me was right, some was based on reality but with the details wrong, and some was wrong and totally unrealistic. I've basically given him the benefit of the doubt and decided he just likes talking and was unwilling to let facts stand in the way of a good story. But since I couldn't tell what was right and what wasn't, I quickly learned to just let him talk and to treat everything he said as a 'story' until I had proof of its accuracy. It was quite easy to extend that attitude to other questionable sources.

By Katkinkate (not verified) on 13 Jun 2009

Yeah, now that we've got Google, who needs those motherfucking lazy shit professor assholes anyway. Let those med students just google that physiology shit right the fuck up!

Let me go into more detail.

As far as I can see, there are three main problems right now with accuracy and the web: adults ("digital immigrants") who lack comfort with the internet and don't know how to read its social cues (which is why grandma and her educated but internet-naive friends are prone to fall for the Nigerian scam); kids ("digital natives") who are confronted with a glut of information at far younger ages than kids used to be, and simply don't yet have the developmental cognitive strategies to sort through it (I can't even imagine being able to Google at 7 or 8 and get heaven knows what); and finally, the victims of the digital divide - kids who aren't getting the kind of practice on the internet that their peers are, and are growing up as unfamiliar with it as their parents (and having very different social experiences than other kids their age, for whom the online world is an important part of their friend networks).

Individuals can learn to be skeptical, sure. But there are some data to suggest that as a group, the way we evaluate websites is very different than the way we evaluate a book or a newspaper. It's all still in flux, and I have no doubt that eventually we'll adjust to the environment we've made for ourselves and develop good cognitive strategies for coping with it. But in the meantime, I'm left to wonder why some of the smartest kids I taught in college were so credulous about things they read online, and so unquestioning of the source of the information, when they sure as heck questioned *me*!