Brain Training 2009

We’ve talked previously about brain fitness programs, including a review of them back in 2007 and some digging into the research backing for them last …

9 Comments to “Brain Fitness Expands, But Research Still Lags”


  1. Hi John,
    I totally agree;
    Lots of people still do not understand the importance of regular exercise and healthy eating.
    I suppose we can only keep trying to convince them. I will use your free simple tips as a reference.
    Many thanks
    Regards
    Dawn Pugh
    http://www.everytherapist.com

  2. Agreed. Physical exercise and proper diet are definitely part of an overall brain fitness program. However, there are now documented, peer-reviewed studies showing that *certain* brain exercises or tasks can in fact improve one’s working memory and fluid intelligence. Check out the research done by Susanne Jaeggi and Martin Buschkuehl. Brain games such as the one provided online for free by Brainworkshop use this methodology.

  3. Sure, if you’re an 80-year-old adult and do such tasks every day (the study looked at thirty-two 80-year-old participants). The study is also valuable for what it did not find: no improvement in episodic memory, no improvement on the digit span task, and no improvements that lasted at a one-year follow-up.

    An N of 32 is no basis for broad conclusions about the general population’s ability to benefit from such brain games. And something that works in an 80-year-old, when memory is already often an issue, is not proven in any manner whatsoever to help a 40-year-old who has no memory problems (or who is looking to keep his brain “elastic”).
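
    To put rough numbers on that objection, here is a minimal power-calculation sketch (my own illustration, not anything from the study, assuming purely for the sake of argument a “medium” standardized effect of d = 0.5 split across two arms of 16):

    ```python
    # Illustration only: statistical power of a two-arm comparison with ~16
    # people per group, assuming (hypothetically) a medium effect size d = 0.5.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    power = analysis.solve_power(effect_size=0.5, nobs1=16, alpha=0.05)
    print(f"Power with 16 per group at d = 0.5: {power:.2f}")   # roughly 0.3

    # Per-group sample size needed to reach the conventional 80% power:
    n_needed = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
    print(f"Per-group n for 80% power: {n_needed:.0f}")         # roughly 64
    ```

    In other words, under those assumptions a study this size has well under a coin-flip’s chance of detecting even a medium-sized effect, let alone of supporting claims about the general population.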

  4. John, you are simply misinformed, and therefore misinforming.

    “Since we last wrote on this topic, little has changed. There’s been no new definitive studies showing that brain training programs have any specific benefits to people without a brain disease, such as Alzheimer’s.”

    and

    “Sure, if you’re an 80-year-old adult and do such tasks every day”

    Please take the time to read the literature. You seem to have a personal vendetta against computerized cognitive training, which prevents you from understanding what is going on and helping your readers separate reality from hope from hype.

    A couple of examples that directly contradict “Since we last wrote on this topic, little has changed. There’s been no new definitive studies showing that brain training programs have any specific benefits to people without a brain disease, such as Alzheimer’s.”

    1) The study that GaryD refers to above was based on healthy adults in their 20s (not in their 80s, as you say). See Jaeggi, S. M., Buschkuehl, M., Jonides, J., & Perrig, W. J. (2008). Improving Fluid Intelligence With Training on Working Memory. Proceedings of the National Academy of Sciences of the United States of America, 105(19), 6829-6833

    2) A remarkable recent study by Kramer et al. showed all kinds of transfer for healthy adults 60 and over. Basak, C., et al. “Can training in a real-time strategy video game attenuate cognitive decline in older adults?” Psychol Aging 2008; DOI: 10.1037/a0013494.

    And we should discuss multiple prior studies, such as ACTIVE, or the literature on cognitive training for driving and flying, whose significance you simply choose to ignore.

    I am not responsible for what developers say. I am only responsible for what I say: the best approach for cognitive health, based on what we know today, is multi-pronged, based on balanced nutrition, cardiovascular exercise, stress management and mental stimulation. Computerized cognitive training can play an important role inside the mental stimulation pillar, bringing novelty, variety and challenge, and specific cognitive benefits.

    New tools can be useful when used appropriately; they are not magic cures or general solutions. Is this really so surprising?

  5. Just one more thought: I am the first person to emphasize that there is much confusion in the field, so we do our best to inform consumers and professionals. A couple of examples:

    We published this 10-question Checklist to evaluate any product making brain-related claims:
    http://www.sharpbrains.com/resources/10-question-evaluation-checklist/

    and we also offer a series of in-depth interviews with leading scientists:
    http://www.sharpbrains.com/resources/neuroscience-interview-series/

    Let me also suggest you read Torkel Klingberg’s recent book, The Overflowing Brain: Information Overload and the Limits of Working Memory. It may help open your eyes.

  6. Thanks for stopping by as always, Alvaro, and plugging your site in this fledgling industry. I have nothing personal, but the studies you cite are typical of the studies that have demonstrated nothing much new of value. There’s a lot of money to be had here, selling people on the newest snake oil that purports to help people in cognitive tasks, while the research shows otherwise.

    Let’s look first at Kramer’s 34-person study (18 people in the active game arm) of 70-year-olds (mean age 69.89, SD = 5) from Illinois. Let’s keep in mind before we begin that 18 people from Illinois are not generalizable to the general population in ANY study.

    These 18 people participated in 15 training sessions of 1.5 hours each, over the course of 7 to 8 weeks, to achieve their results. The researchers measured performance on a set of 10 cognitive tasks typically used to assess memory and perception (usually in people who have impairments in these areas).

    Out of the 10 tests, only 5 showed a significant difference between the gamers and the controls. Two of the remaining 5 showed mixed results (the N-back task and the Raven’s showed no main effect, so the researchers went digging), and another looked at only one component of the measure (accuracy but not speed).

    Overall, it’s an interesting study. But the only conclusion one can draw from it is that playing a specific video game (Rise of Nations: Gold Edition) can help people with some executive function tasks, specifically task switching and working memory. But you’d have to play *this game*, and play it at least 3-4 hours/week, to see these effects. And there are no data on whether the effects are long-lasting.

    In fact, it’s pretty clear that the researchers did not find as much support for a difference between these two groups as they had intended. So is it truly “remarkable” that if one repeatedly performs a task that emphasizes task switching and swapping out active memory, one improves on measures of task switching and swapping out active memory? The real question not answered by this study is whether any of this matters to a person’s everyday functioning in the real world. And would the effects hold up after playing the video game for 7 to 8 months (or even years) rather than weeks? Who knows.
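
    And about the “digging”: here is a toy illustration of why running 10 outcome measures at the usual threshold invites spurious positives (it assumes, unrealistically simply, independent tests and no true effects, and it is not a claim about what Kramer’s team actually did):

    ```python
    # Illustration only: family-wise false-positive risk when 10 separate outcome
    # measures are each tested at alpha = 0.05, assuming independent tests and no
    # true underlying effects (a deliberately simplified scenario).
    alpha, n_tests = 0.05, 10

    p_any_false_positive = 1 - (1 - alpha) ** n_tests
    print(f"P(at least one spurious 'significant' result): {p_any_false_positive:.2f}")  # ~0.40

    # A Bonferroni-style correction keeps the family-wise rate near 5%:
    print(f"Per-test threshold after Bonferroni: {alpha / n_tests:.3f}")  # 0.005
    ```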

    Sorry, the Jaeggi study I found was:

    Buschkuehl, Martin; Jaeggi, Susanne M.; Hutchison, Sara; Perrig-Chiello, Pasqualina; Däpp, Christoph; Müller, Matthias; Breil, Fabio; Hoppeler, Hans; Perrig, Walter J. (2008). Impact of working memory training on memory performance in old-old adults. Psychology and Aging, 23(4), 743-753.

    The study you are referring to is this one:

    http://www.pnas.org/content/105/19/6829.full?sid=447036e5-b1ec-46ac-bb79-b0b223e61136

    70 participants, mean age 25.6 years, with 34 in the active training group. The active training group was broken into 4 subgroups that received 8, 12, 17 or 19 sessions of training. Both the control groups and the “active training” groups showed significant improvement on the IQ test (because of test/re-test effects).
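
    That is exactly why the comparison that matters is gain versus gain, not pre versus post. Here is a toy simulation (every number is invented for illustration; none of it is the study’s data) of separating a practice effect shared by everyone from an extra effect of training:

    ```python
    # Illustration only: both groups improve on retest (practice effect), so the
    # question is whether the trained group's gain exceeds the controls' gain.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 35                  # hypothetical group size
    practice_gain = 2.0     # assumed retest/practice effect common to everyone
    training_gain = 1.0     # assumed extra gain from training (hypothetical)

    control_gains = rng.normal(practice_gain, 3.0, n)
    trained_gains = rng.normal(practice_gain + training_gain, 3.0, n)

    t_stat, p_value = stats.ttest_ind(trained_gains, control_gains)
    print(f"Gain-score comparison: t = {t_stat:.2f}, p = {p_value:.3f}")
    ```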

    This study is indeed interesting for the effects shown, but it’s hampered by the measure used to test for “fluid intelligence”: a German IQ test. It’s difficult to evaluate a study’s measures when they aren’t in your ordinary textbooks, because you can’t determine whether they are robust, reliable measures of what they purport to measure. My efforts to learn more about the BOMAT were unsuccessful, as standard literature searches turned up little. This is my lack of knowledge more than anything, yet it does give me brief pause, since knowing your measures is an important part of understanding and interpreting research.

    However, Sternberg, in a commentary, noted the limitations of the study better than I could:

    http://www.pnas.org/content/105/19/6791.full

    And you know this is a “valuable study” because the makers of brain software are already citing it as proof that their own programs are “scientifically proven”:

    http://www.mindsparkebrainfitnesspro.com/

    And there’s the rub. These studies are being done in laboratories, for brief periods of time, with specific protocols and tests. As we said in the original article, nobody has actually researched the brain programs themselves (which may or may not be anything like the cognitive tests given), nor has anyone done any type of longitudinal study showing that these effects are anything more than temporary.

    Jaeggi’s study is indeed more interesting than most because of its demonstration of a transfer effect from performing a specific type of task to improving one’s fluid intelligence. But since this study goes against years of prior understanding about intelligence, it really needs to be replicated, expanded, and tested with more traditional IQ measures before we can start generalizing from its results. And of course, that connection to the real world is still missing (does this actually *help* people in their daily lives, or does it only improve scores on a theoretical construct?).

    I have nothing against cognitive research showing advances in “brain training” if people don’t then turn them around to market their own unproven software. My main concern is when people manipulate research findings to make broad statements and generalizations about all people when these studies have — to date — been done on very small groups of people.

    I think the research in this area is exciting and holds a lot of potential. To date, however, most of that potential has been hyped far beyond what the data show, and for that reason, we remain skeptical about the benefits of most “brain training” programs on the market today.

  7. Hello John,

    Let me start with 3-4 points where it seems we agree:

    1) “I think the research in this area is exciting and holds a lot of potential”

    2) “To date, however, most of that potential has been hyped far beyond what the data show”. And the main party responsible for this hype, in my view, has been NPR and its fundraising pledge based on a promise to “rejuvenate your brain 10 years”.

    3) A good number of developers are also trying to benefit from a “halo”-effect without doing their own research.

    4) We probably also agree on this: the concept of “brain age” makes no sense. And the reason I find it so misleading, when presented as science, is that it implies the interventions are a) universal, for everyone, and b) general solutions, for all the goals people have. We have tried to debunk this, including in articles such as
    http://www.sharpbrains.com/blog/2008/06/24/brain-age-posit-science-and-brain-training-topics/

    Now, here are the points where we seem to disagree, and where, in my perception, you are basing very strong claims on your personal prejudices and not on published evidence:

    5) Are you really saying cognitive training is snake oil? You seem to, which would probably be the least sound statement I have ever read on PsychCentral.

    6) When you talk about a “new industry that appears virtually out of nowhere” you basically show that you are less informed about neuropsychology, human factors and cognitive science in general than you believe you are.

    7) You opt to pooh-pooh Kramer’s paper by suggesting he and his team went on a fishing expedition. Kramer is a highly respected scientist who has published some of the best meta-analyses on the impact of physical exercise. You seem to take many liberties with people’s motivations and integrity when you don’t agree with them, as I have experienced myself in the past.

    8) Where is the evidence to support your very strong claim that for “you”, no matter who “you” are and what “you” already do (so here you are stating something universal and general), going out for a walk is the single best thing you can do and everything else is a waste of money and time? Citations, please? Just to show the absurdity of your claim: would you say this to a professional athlete? You are basically falling into the same NPR trap.

    9) Similarly, where is the evidence to support your claim that all you need in terms of mental stimulation is to do one more crossword puzzle, or to read the paper one more day? This seems to contradict so many things we know that I don’t even know where to start the education process.

    In short, I agree that, as in any new field, we should differentiate reality from hope from hype.

    You choose to scream “Fire!”, “Snake oil!”, “Stop the thieves!”

    I choose to provide information, and to proactively help the field become more rational, better structured, more research-based, and more transparent to consumers and professionals. I suggest once more that you read Torkel Klingberg’s book; it could be a great start for you.

  8. Sadly, Alvaro, we’re going to have to just agree to disagree once again. If all you provided was “information,” your site would give a balanced view of new research in this area and ask challenging (or heck, I’d even settle for intellectually interesting) questions of the researchers you interview.

    You do provide a kind of “information,” but it is biased in favor of the industry (which, again, is not surprising). I’m happy to provide an alternative view to your unbridled optimism about the mainstream software being marketed to people under the guise of “scientifically proven” when, in fact, it is not.

  9. My criticism of Kramer’s research is a perfect example of information you didn’t mention (or glossed over) in your discussion of his study. How can you draw any generalized conclusions from a study of 34 people, where half the measurements they used showed no effect, and half of the remaining measures were decidedly mixed? You can’t, unless you have few critical thinking skills or don’t understand how to analyze a research study’s actual data (and not just the authors’ conclusions).

  10. And you don’t have to just take my word on it… Researchers at Lifespan hospitals found a similar hype for these products:

    http://psychcentral.com/news/2009/02/10/are-brain-exercises-mainly-hype/4009.html

  11. I find it amusing that this page has 2 boxes with 12 links to the Lumosity Brain Training Games sales page, right there in the right sidebar, under the big IMPROVE YOUR BRAIN heading…
