An article today in the NY Times by Patricia Cohen summarizes some research conclusions that suggest that the way to preserve your brain into old age is to get more education. The problem is that the main study being cited appears from the article to be what we call a ‘correlational study,’ which may prove nothing of the kind.
Correlational studies are done by rounding up a bunch of people who, say, all went to college and comparing them to people who didn’t, and then comparing their “brain fitness” test results in middle age. These studies can be extremely valuable and, as my old grad school prof Tom Bouchard used to teach us, they are often the best way to study the really really interesting things in science, because they may provide a tremendous wealth of information.
The problem with such studies is that they can suggest, but not necessarily prove, that one thing causes another. A mantra that all undergraduate psych students memorize is that “correlation does not equal causation.” Just because two things tend to happen together, like education and healthier brains, doesn’t prove that the one thing caused the other. It’s equally likely, say, that people with healthier brains were the ones who were both motivated to pursue, and who could succeed at completing, additional years of education. (You might be more “motivated” to go to college just because for you, it’s easier. If it’s easier, you will enjoy it more. On the other hand, kids with reading or attention problems, say, or who just aren’t as sharp, may find it harder and more painful to study, so the thought of four more years of slogging through books and tests may seem much less pleasant for them.)
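The confound described above can be made concrete with a toy simulation (all numbers invented purely for illustration): a hidden “baseline brain health” factor drives both years of education and late-life test scores, while education itself has zero causal effect in the model. The two measured variables still end up strongly correlated.

```python
import random

random.seed(0)

def pearson(xs, ys):
    # Plain Pearson correlation coefficient, computed from scratch.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hidden confounder: baseline ability/health, never measured directly.
baseline = [random.gauss(0, 1) for _ in range(10_000)]

# More baseline ability -> more schooling (plus noise); no reverse arrow.
education = [b + random.gauss(0, 1) for b in baseline]

# Late-life score depends ONLY on baseline, not on education at all.
late_score = [b + random.gauss(0, 1) for b in baseline]

r = pearson(education, late_score)
print(round(r, 2))  # a clear positive correlation despite zero causal link
```

A correlational study that measured only `education` and `late_score` would see the association and might headline it as “education protects the brain,” even though, by construction, it does nothing of the sort here. Only the randomized experiment described below, which breaks the arrow from baseline ability into education, could tell the two stories apart.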
Cohen’s main lapse is failing to explain or discuss this caveat. What she doesn’t say is that we really don’t know whether the older folks with the more resilient brains got that way because they went to college, or whether they chose and managed to get through college because they had better-functioning or healthier brains in the first place. If someone had set up the proper experiment fifty years ago, randomly assigning some people to go to college and keeping others out, we might have seen very different results when testing their smarts in old age. We might have found some difference between the two groups, but perhaps not the same size of difference, in which case big headlines about “save your brain by going to college” would have seemed overly optimistic and simplistic.
There is evidence that going to college, particularly decades ago when fewer people did so and admissions were more selective, tended to require at least somewhat better, probably “inborn,” ability levels. We also have evidence that late-life problems like Alzheimer’s may be predictable even in one’s teens. A study of elderly nuns showed that the ones who eventually developed Alzheimer’s had, as young women applying to a religious order, a very different style of writing: they used few or no complex sentences (the kinds with commas in them and so on), while the other girls did. The researchers were able to run a sort of “experiment” by seeing how well reviewers could predict who would eventually develop Alzheimer’s, simply by rating the young women’s essays on whether they used “linguistically dense” sentences, the kind packing in multiple ideas, or even just commas. It turned out the predictive value of that kind of rating was weirdly high: the researchers could almost always match the eventual health outcome to a woman’s writing style from decades earlier.
While it is probably true that going to college develops a stronger brain, and there are certainly studies (some cited in her story) suggesting that mental exercise in general makes the brain healthier (though jogging probably helps more), these results are tentative and shouldn’t be oversimplified. Almost all of the studies showing that people who study French or go to college have “better brains” are correlational; it’s at least as likely that people with better-functioning brains simply enjoy using them for a wider range of mental tasks.
It’s good when readers recognize that reporters are jumping too hard on squishy findings. Sloppy reasoning gets tedious to read over and over again. Reporters and publications know that they can attract “mindshare” with dramatic but misleading headlines and stories based on poorly done or poorly reported half-truths (e.g., “Air Force Still Denies Finding Alien Bodies at Area 51”). But this kind of bad reporting is not just misleading; it can also have negative real-world consequences. People make poorer decisions when those decisions are based on poor or faulty information: they change their lifestyles, adopt new diets and exercise regimens, and waste blood and treasure on things that, in the long run, just don’t matter. They may also do bad things to each other based on half-truths and this kind of ignorance.
When I was in college we were assigned a little workbook for honing our skills at spotting false conclusions drawn from research. The earliest articles were the easiest to figure out: a piece by someone allegedly named “Pileous Lupus Swarthy” was about determining whether werewolves could be “diagnosed” by their response to a “silver allergy” test, which entailed firing a silver bullet into the brains of people suspected of being werewolves. (Since it was “well known” that werewolves are allergic to silver… well, you get the idea.) The article concluded that the diagnostic test worked perfectly, since 100% of the test subjects who were shot in the head died, proving that they were all werewolves.
Which is silly, but not so much when you remember that identical reasoning was used in the Middle Ages to kill thousands of women suspected of witchcraft, by binding them hand and foot and throwing them into ponds. The “proof” that they were innocent of witchcraft was that they drowned. “Real” witches were the women who managed to float; they would be burned alive.