We respond differently to babies' faces within 150 milliseconds

It's hard to resist flirting with babies. Even if a baby has been screaming her head off for hours on end in the seat behind you on a transatlantic flight, if she giggles and smiles when you're deplaning, you'll probably smile back. What is it about babies that makes our hearts melt almost instantaneously when we see them? Is it their cuteness, their happiness, or just their babyness?

A team led by Morten Kringelbach showed photos of babies and adults to twelve volunteers while their brains were being scanned with a MEG (magnetoencephalography) scanner. The key to the study was the control of the photos. Ninety-five judges had previously rated these pictures for the emotion they displayed and the attractiveness of the faces. Adult and baby photos alike were rejected if they were rated as too attractive or too unattractive: only middling pictures were chosen for use in the study. Each of the individuals -- 13 adults and 13 babies -- was depicted with a happy, sad, and neutral expression, and again all these photos were selected to have equivalent levels of emotional expression, so the viewers each saw 78 different pictures.
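As a quick check on those numbers, here is a tiny sketch in Python of how that stimulus set breaks down. The labels are invented for illustration (the real stimuli are photographs), and this is not the authors' code:

```python
from itertools import product

# 13 adult and 13 infant identities, each shown with a happy, sad, and neutral
# expression: 26 x 3 = 78 pictures. The identifiers below are made up.
identities = [f"adult_{i:02d}" for i in range(1, 14)] + \
             [f"infant_{i:02d}" for i in range(1, 14)]
expressions = ["happy", "sad", "neutral"]

stimuli = [
    {"identity": ident, "expression": expr}
    for ident, expr in product(identities, expressions)
]
assert len(stimuli) == 78   # every volunteer saw all 78 pictures
```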

The MEG scanner, unlike fMRI, measures actual neuronal activity, and it responds extremely rapidly, allowing precise measurements within milliseconds after the cells in the brain are activated. Viewers were told to look at a small red cross on the screen and press a button when it changed from red to green. They were told to ignore the pictures, which were flashed for about a third of a second between appearances of the red/green cross. Of course, the researchers were actually interested in the brain activity while the pictures were being viewed, and they discarded the data from trials when the cross actually changed color, which happened about 15 percent of the time. Here's what they were really interested in:

[Figure: MEG maps of brain activation for infant versus adult faces]

The images show the difference in brain activation at each point in time compared to the 250 milliseconds before the pictures appeared. As you can see, the pattern is different when viewers saw infant faces compared to adult faces. The most dramatic difference in brain activity occurred in the medial orbitofrontal cortex. This graph shows how activity in that region diverges when viewing baby faces compared to adult faces (a rough sketch of this baseline-and-contrast step appears below the graph):

[Graph: medial orbitofrontal cortex activity over time for infant versus adult faces]
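To make the baseline logic concrete, here is a minimal sketch, in Python with numpy, of what "activation relative to the 250 milliseconds before the picture" and the infant-vs.-adult contrast might look like on epoched MEG data. Every variable name, the epoch window, and the sensor-cluster shortcut are assumptions for illustration; this is not the authors' analysis pipeline, which localizes activity to cortical sources rather than reading raw sensors:

```python
import numpy as np

# Assumed epoch layout: (n_trials, n_sensors, n_timepoints), one sample per
# millisecond, each epoch running from -250 ms to +500 ms around picture onset.
times = np.arange(-250, 500)    # milliseconds relative to picture onset (assumed)
baseline_mask = times < 0       # the 250 ms before the picture appeared

def keep_non_catch(epochs, catch_flags):
    """Drop the ~15% of trials on which the fixation cross changed color."""
    return epochs[~np.asarray(catch_flags)]

def baseline_corrected_evoked(epochs):
    """Average over trials, then subtract each sensor's mean pre-stimulus activity."""
    evoked = epochs.mean(axis=0)                      # (n_sensors, n_timepoints)
    baseline = evoked[:, baseline_mask].mean(axis=1, keepdims=True)
    return evoked - baseline

def infant_minus_adult(infant_epochs, adult_epochs, mofc_sensors):
    """Time course of the infant-minus-adult difference over an assumed mOFC cluster."""
    diff = baseline_corrected_evoked(infant_epochs) - baseline_corrected_evoked(adult_epochs)
    return diff[mofc_sensors].mean(axis=0)   # the kind of trace the graph above shows
```

Treat this only as a picture of the discard-catch-trials, baseline, and contrast steps; the real analysis is considerably more involved.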

This area of the brain has been shown to be activated in a similar pattern when people see masked drawings -- drawings they don't actually remember seeing because they are flashed so briefly. So almost immediately after seeing infant faces, adults show a dramatically different response compared to equivalently emotional and attractive adult faces, a response they may not even be aware of.

Kringelbach's team speculates that their research might be applicable in treating postpartum depression, where mothers seem emotionally unaffected by their new babies. It's possible, they say, that this area of the brain responds differently in mothers suffering from postpartum depression.

Kringelbach, M.L., Lehtonen, A., Squire, S., Harvey, A.G., Craske, M.G., Holliday, I.E., Green, A.L., Aziz, T.Z., Hansen, P.C., Cornelissen, P.L., Stein, A., Fitch, T. (2008). A Specific and Rapid Neural Signature for Parental Instinct. PLoS ONE, 3(2), e1664. DOI: 10.1371/journal.pone.0001664


Activity in the medial orbitofrontal cortex has been linked to learning and remembering the reward value of reinforcers (DOI 10.1016/j.pneurobio.2004.03.006). What can we hypothesize about the relationship between this activity and its implications for evolutionary biopsychology?

"Kringelbach's team speculates that their research might be applicable in treating postpartum depression,"

So how would that work in practice? Also, Dave, thanks to this line:

"It's hard to resist flirting with babies."

you're probably now on some Government watchdog list ;-)

How would this research be applied to postpartum depression? At this point, it's all speculation, but I suppose you could repeat this study on mothers with postpartum depression and see if their brains react differently than those of unafflicted adults. If so, then doctors could focus on treatments that target these regions of the brain (perhaps easier said than done, I agree).

MEG has high temporal resolution, but nontrivial signal localization problems. This is especially true for areas like the orbitofrontal cortex, which usually lies far from any detectors. That they find so much of the brain significantly active also makes me less confident about the task localization. The signal difference at 150 ms is interesting, but that's probably the only thing I'd take away from this paper.

Assuming the imaging data are reliable, the difference in brain activity when looking at babies vs. adults seems to be consistent with LeDoux's (1996, 2000) discovery that there are two neural pathways projecting from the thalamus (the centre of the brain image).

The direct pathway (thalamus-->amygdala/visual cortex) involves automated emotional reactions to sensory input. This looks consistent with the brain activity when looking at adults.

The indirect pathway (thalamus-->frontal cortex-->amygdala/visual cortex) concerns thoughtful evaluation of sensory input before emotional reactions in the amygdala are activated. This looks consistent with the brain activity when looking at babies.

So, if I were to hazard a guess, these data show that when (presumably adult) evaluators respond to adult faces, their emotional responses are more automated than when responding to baby faces, probably because adults have more interactions with other adults than with children.

It would be interesting to do this study again, but this time the raters should be adults who spend a lot of time with children, such as pediatricians, nurses in a maternity ward, or daycare workers.

By Tony Jeremiah (not verified) on 27 Feb 2008

Hmm,

I wonder, though, if the parent vs. non-parent distinction would in fact be sufficient to demonstrate a divergence in the two processing pathways. The reason is that spending time with (presumably) one's own infants may not be enough experience for automatic reactions to different types of infants to develop. One might have to have experience with many different types of infants before the difference arises.

So, persons who are frequently exposed to many different types of infants (not just their own) might produce brain activity similar to that seen when looking at adult faces.

In other words, does experience diversity (e.g., spending significant time with children of various races and temperaments) lead to automatization? If we can rule out experience diversity as a possible explanation for the difference, then I think we can begin to suggest that there is indeed some evolutionary basis for the difference, as Alvin (@1) suggests.

By Tony Jeremiah (not verified) on 27 Feb 2008

Did they try pictures of Mickey Mouse? I recall reading an SJ Gould essay comparing The Mouse's features to babies.

How about puppies vs. babies? Some find puppies far more appealing.

How about hand-drawn or computer-generated images of baby faces? Does it matter if the observer can tell that the baby in the image really exists?

Could this have something to do with the shape of the face? A child's face is circular, whereas an adult's is either oval or squarish.

The art community will tell you all about the different responses to shape; maybe it's just the difference in our brain activity when looking at round vs. oblate shapes.

Just an idea, but better to ask than to be quiet.

By joewanderlust (not verified) on 06 Mar 2008