The Major Flaw In Brain Training Studies

People in the control group always realize they're just playing Tetris for hours.

In randomized drug trials, placebos are straightforward: some participants get the drug, others get a sugar pill. Since no one knows which they received, everyone has the same expectation that their condition will improve.

Psychological studies are trickier. In a study of whether something like a brain-training exercise improves cognition, participants can tell which treatment they're receiving and adjust their expectations of benefit accordingly. This has implications for research on the effects of brain training and other psychological interventions, notes a study published in the July issue of Perspectives on Psychological Science.

Psychologists Walter Boot and Daniel Simons looked at how much people expect their cognitive and perceptual abilities (in areas like visual processing, attention, and task switching) to improve after watching videos of certain video games. One group watched a video of Unreal Tournament, an action game; the other watched videos of Tetris or The Sims, two slower-paced games often used as controls in video-game training studies. The participants who watched Unreal Tournament were more likely to believe the experience could improve their visual processing abilities than those who watched The Sims or Tetris. Their expectations for improvement matched the improvements in perception and cognition found in other published video-game studies.

“Although our example singled out video-game interventions, the placebo problem is pernicious and pervasive, affecting most cognitive interventions in psychology,” the researchers write. As they note in their Q&A on the topic on the Open Science Framework, part of the problem is that when funding is short, control groups are expensive: “Including an active control condition effectively doubles the cost of an intervention compared to just testing the experimental group.” And even when there is a control group, this expectation factor could be falsely inflating results, since both groups are aware of which treatment they are receiving.

Some harsh words for the psychological community follow later in the paper:

Although some have claimed that placebo control groups in psychological interventions, such as ones examining the effect of game play on cognition, are impossible, that limitation does not excuse researchers from the requirement to account for expectation effects before inferring that an intervention was effective. There are methods to measure and account for the influence of differential expectations and demand characteristics. These include explicitly assessing expectations, carefully choosing outcome measures that are not influenced by differential expectations, and using alternative designs that manipulate and measure expectation effects directly.

They put together this handy little flowchart for determining whether a psychological intervention (like a video game or brain-training exercise) actually caused improvement:

“We don’t want to recommend new therapies, change school curricula, or encourage the elderly to buy brain-training games if the benefits are just due to expectations for improvement,” Simons said in a press statement. “Only by using better active controls that equate for expectations can we draw definitive conclusions about the effectiveness of any intervention.”

So it's best to be wary before throwing your energy into a brain-training game that touts scientific backing. Without a truly effective control group, those results could be skewed by study subjects' expectations.