Single memory training task improves overall problem-solving intelligence

Blogging on Peer-Reviewed Research
Forget 'smart drugs' or brain-training video games. According to new research, a deceptively simple memory task can do what no drug or game has done before - it can boost your 'fluid intelligence', your ability to adapt your powers of reasoning to new challenges. Fluid intelligence doesn't rely on previous knowledge, skills or experience. It's at work when we solve new problems or puzzles, when we draw inferences and spot patterns, and when we test ideas and design experiments. To see what I mean, try testing yours.

Fluid intelligence appears to be strongly influenced by inherited genetic factors and is closely related to success in both the classroom and the workplace. The ability plays such a central role in our lives that it raises an obvious question: is there any way of improving your fluid intelligence through training?

Video game manufacturers would like you to think so. Games like Dr Kawashima's Brain Training and Big Brain Academy are suggestively marketed as ways of improving your brain's abilities through the medium of number problems, Sudoku and word puzzles. As a result, your brain will allegedly become younger. And look, Nicole Kidman likes them. The pitch is certainly a successful one - these games are bestsellers and are increasingly joined by a swarm of imitators. Last year, the US brain-training market alone was estimated to be worth about $80 million.

Whether these products actually work is open to debate but there is certainly no strong evidence that they do anything beyond improving performance at specific tasks. That seems fairly obvious - people who repeatedly practice the same types of tests, such as number sequences, will become better at them over time but may not improve in other areas, like memory or spatial awareness. But acquiring Jedi-level skills in one specific task is a far cry from increasing your overall fluid intelligence; it'd be like saying that you're a better musician because your scales are second-to-none.

Nonetheless, Susanne Jaeggi from the University of Michigan has developed a training programme involving a challenging memory task, which does appear to improve overall fluid intelligence. The trainees do better in intelligence tests that have nothing to do with the training task itself, and the more training they receive, the higher their scores.

Get smarts

Jaeggi's work was inspired by Graeme Halford from the University of Queensland, who suggested that the limits of our reasoning abilities are very similar to the limits of our working memory. The term refers to our ability to temporarily hold and manipulate pieces of information, as you do when you add up prices on a bill. Reasoning and working memory aren't identical, but they both involve holding pieces of information in a sort of mental notepad, and even seem to rely on similar networks of neurons. The idea is that both are constrained by the processing power of our brains or our ability to focus our attention.


Jaeggi recruited 70 young students and set half of them on a challenging training regime, involving the so-called "n-back task". These trainees watched a series of screens where a white square appeared in various positions on a black background. Each screen appeared for half a second, with a 2.5 second gap before the next one flashed up. While this happened, the trainees also heard a series of letters that were read out at the same rate.

At first, their job was to say if either the screen or the letter matched those that popped up two cycles ago but the number of cycles increased or decreased depending on how good the students were at the task. Boffins had to compare the current pair with those many cycles ago, while dunces only had to remember fairly recent ones. The students sat through about half an hour of training a day for either 8, 12, 17 or 19 days, and were tested on their fluid intelligence before and after the regimen using the German Bochumer-Matrizen Test.
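The core of this dual n-back task is easy to sketch in code. Below is a minimal, hypothetical Python scorer (the function name and response format are mine, not from the study): given the visual and auditory stimulus streams and a participant's yes/no responses, it checks each trial against the item n steps back in both streams.

```python
def score_dual_nback(positions, letters, responses, n=2):
    """Score one block of a dual n-back task.

    positions: sequence of square positions shown on screen.
    letters:   sequence of letters heard at the same time.
    responses: per-trial (visual_pressed, audio_pressed) booleans,
               True meaning the participant claimed a match.
    Returns the fraction of correct judgements across both streams.
    """
    correct = 0
    total = 0
    for i in range(len(positions)):
        if i < n:
            continue  # no item n steps back yet; nothing to judge
        visual_match = positions[i] == positions[i - n]
        audio_match = letters[i] == letters[i - n]
        visual_pressed, audio_pressed = responses[i]
        # A judgement is correct if the button state matches reality
        correct += (visual_pressed == visual_match)
        correct += (audio_pressed == audio_match)
        total += 2
    return correct / total
```

Note that the two streams are scored independently: a trial can contain a visual match, an auditory match, both, or neither, which is part of what makes the task so demanding.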


On the whole, the trainees did significantly better on the fluid intelligence test than their peers who didn't receive any training. The control group did improve slightly, as you might expect from people who had done similar tests in the past, but the trainees still outperformed them. And their degree of improvement depended on the extent of their training. Those who were trained for the longest time - 19 days - showed the biggest improvements, while those who were only trained for 8 days didn't get significantly better.


Jaeggi noted that the test didn't just hone the abilities of the naturally intelligent people in the group, as low-performers benefited from the training programme just as much as high-performers did, if not more so. Nor did those who already had powerful working memories enjoy greater benefits, which suggests that the training doesn't simply work by improving this specific skill.

Jaeggi thinks that this task worked where others have failed because it remained challenging. The students were never allowed to get comfortable with the task - as soon as they improved, it became accordingly more difficult. Faced with the combination of two info streams and shifting difficulty levels, they couldn't develop simple strategies or switch to autopilot. The task was also demanding in itself: to succeed, students had to remember old items, constantly update the memories they were keeping, block out irrelevant ones, and manage two tasks at the same time using both sound and sight.
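The adaptive rule described above - difficulty rising as soon as performance improves - amounts to a simple staircase. Here is an illustrative Python version; the accuracy thresholds are my own assumptions, not the criteria Jaeggi's team actually used.

```python
def adjust_level(n, accuracy, up=0.9, down=0.7):
    """Adaptive staircase for the n-back level.

    Raise n after a strong block, lower it after a weak one
    (never below 1), and hold steady in between.
    Thresholds `up` and `down` are illustrative placeholders.
    """
    if accuracy >= up:
        return n + 1
    if accuracy < down:
        return max(1, n - 1)
    return n
```

Run block by block, a rule like this keeps every participant working at the edge of their own working-memory capacity, which is exactly the property Jaeggi credits for the transfer effect.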

The results seem promising and the prospect of a single training exercise that can improve a whole suite of mental abilities certainly seems like a good thing. But for now, the study leaves behind many unanswered questions. How exactly does the training programme lead to better fluid intelligence? At what point will the benefits of extra training start to level off? And how long will it take for the programme's effects to wear off, if they ever do? The answers to these questions will help to decide if the findings are indeed "highly relevant to applications in education" as the authors claim.

And speaking of education, perhaps readers who are more familiar with the literature on intelligence can enlighten me on this: Jaeggi claims that fluid intelligence is fairly unchangeable in the face of education, which seems quite shocking. That would imply that our education system improves our knowledge and skills, but not our innate ability to solve problems or draw inferences. Is that really the case?

Reference: Jaeggi, S.M., Buschkuehl, M., Jonides, J., Perrig, W.J. (2008). Improving fluid intelligence with training on working memory. Proceedings of the National Academy of Sciences DOI: 10.1073/pnas.0801268105

Images: from PNAS

Comments

Man, I completely blew that test. Probably not a good idea to take it at work....

By longsmith (not verified) on 29 Apr 2008 #permalink

54% ... Not bad, considering the iPod was on, and I'm supposed to be doing productive, paid-for tasks :-) Note that the percentile is how many people you did better than, not how many you got right....

Still, I'm very interested in the whole "training" area, but towards overall intelligence, not just towards specific tasks....

So the idea is that training that requires an ever increasing amount of working memory (increasing at the edge of one's abilities) might increase fluid intelligence? Seems like a number of different training "regimes" could do that. I'd love to see more studies attempting to verify that - as well as explore the limits of the increases.

I actually got 96% when I first did that test. Even though it was about 1 in the morning.

My conclusion: writing about fluid intelligence improves your fluid intelligence. Rock!

93% and I'm dead tired, that can't be right.

ya 96% as well, and i don't think i was thinking clearly either

98% Yay. Thank you for the diversion from real work.

...Jaeggi claims that fluid intelligence is fairly unchangeable in the face of education, which seems quite shocking. That would imply that our education system improves our knowledge and skills, but not our innate ability to solve problems or draw inferences. Is that really the case?

If 'education system' is defined as use of the traditional objectivist instructional method (e.g., lecture), then it is most likely accurate to say that it enhances crystallized and not fluid intelligence, because it is an approach that places great emphasis on knowledge acquisition. However, if 'education system' is defined as primary use of a constructivist instructional method (e.g., discussion, lab, or any other such form of interactive learning), it is possible that it may impact fluid intelligence. However, this is only in the most general sense of the concept of learning transference (i.e., the ability to transfer what one learns in a prior learning situation to a new one). Additionally, interactive methods do not ensure knowledge transference, as in your posting about how teaching math from a real-world perspective does not transfer to a new situation. Instead, teaching abstract rules seems to produce the transference. Presumably, the experiments described in this posting are tapping into the same general principle of teaching abstractions that generalize to particular tasks.

Perhaps the tasks mentioned in both postings (which appear to enhance learning transference effects), are a form of implicit metacognitive instruction, which has been known to enhance learning capacity.

By Tony Jeremiah (not verified) on 29 Apr 2008 #permalink

waitaminit, I went to bookmark that 'test' page for later, hit 'back' and ended up at:

New Advanced Personality Test Facebook App

Site Features: Enneagram Personality Tests, 16 Type Jung Personality Tests*, Big Five Personality Tests, Personality Disorder Test, Compatibility Test, Career Test, Eysenck Personality Test, Word Association Test, Ask The Oracle, Famous Leader Test, Intelligence Tests, Search Minds (Find others who score like you), Personality Type Articles, Discussion Forum, Test Links


96%. Yeah!

I wish the control had been spending the same amount of time doing some other sort of brain work (crosswords? reading? chess?), rather than nothing. I'd really like to be smarter, but 25 minutes a day is a lot to give up if I could get the same benefit from reading scientific papers outside my specialties, for example. Actually, a comparison with spending 25 minutes a day exercising would have been interesting.


Also, is there any way we can get a hold of this program to use it? Or any plans to develop this into a product?

I was a bit apprehensive to try this since I pulled an almost-all-nighter (worked until 4:30, to bed a bit before 5, and then up shortly after 6:30) and I feel like crap.

I got the 96th percentile on the first attempt, which was a little bit higher than I expected given my current sorry state.

Uhh yeah.. I just got mind f***!

By Anonymous (not verified) on 08 Jul 2008 #permalink

I'm not exactly a rocket scientist.. And apparently "brain age" didn't help me much on the test..

But what's up with these people saying..

"Uh yeah.. Its 1am in the morning, and I'm so dead tired... But I got a 96%"

What.. you guys think you can do better? LOL

By Anonymous (not verified) on 08 Jul 2008 #permalink

I was a soldier since i was 12 year old and i like to learn a intelligence business to help my people to investigate some problem back home in Sudan

By thoal r luak (not verified) on 12 Feb 2010 #permalink