
Thoughts on the APA leadership’s role in torture

(Note: these comments were just inserted into a discussion — a very worthwhile read — on the American Psychological Association’s website, dealing with the recent revelations on the leadership’s collaborative role in the Bush torture/war crimes business. More information is available on the Psych Central website.)

As I struggle, as do many psychologists, with the decision whether to resign from the APA, I keep having a few troubling thoughts. One occurred to me as I read through comments on an APA division’s listserv, which initially contained posts by a lot of angry and upset psychologists about the APA leadership’s support of torture.  But then a sort of patronizing attitude surfaced in the posts of one or two of the writers… and a number of others joined in. (Many of these people were consultants to government agencies, corporations, and the like.) The basic message was, in effect, “well, a bit of catharsis is good, but now let’s have an adult perspective about all this.” This was followed by a sort of faux-mature, “bygones be bygones” attitude, a sort of realpolitik-y notion that of course we have to be involved in helping all the people who have to handle the “real world adult responsibilities” of torturing and maiming and droning. It felt dismissive, of course, but more troubling to me was that it felt like exactly the kind of rhetorical fraud that is always used to rationalize the worst of war crimes, genocide, slavery, and so on.

This impression was linked in my mind with another. Recently reading a piece on relational networks online (someone making the point that the British could easily have pre-empted Paul Revere’s ride if they had had mapping data on who was connected to whom in the colonies, since he had zillions of connections to rebellion leaders – e.g. also here), I realized that of course, many of these writers likely had links to the APA leaders whose names emerge as part of this unholy mess. And that really, the leaders whom we want to resign would all, on a “network map,” have far, far more connections to unnamed sympathizers than we at first imagine.

In other words, we should be careful of our implicit mental model in diagnosing this kind of problem. We tend, often, to use a sort of “tumor” model — that if we can only excise a few bad cells, the body will then be healthy. In fact, a better analogy is a network — the individuals whom we see as the offenders are actually central nodes in a vast network of people, many of whom are no doubt “riding this cathartic storm out” in preparation for later emergence as leaders who imagine that they will make “sanity prevail” after us kids get done with our tantrums.

My final thought concerns what this is all really about. APA feels to me more like a slick corporation that incidentally gets dues from a bunch of subscribers to whom it markets itself, but whose real agenda is more self-aggrandizement than service. It has its Very Important Leaders, its Very Important DC Headquarters (why aren’t we headquartered in Berkeley or Harvard Square if we are actually a group of scientists and health care experts?), its lobbying arm, to which we (clinicians) paid dues for years while being told that doing so was a requirement of membership (it wasn’t – there’s a settlement)… all while dissenting voices against its collaboration with the Bush war criminals were silenced.

We know a few truisms from social and organizational psychology: organizations eventually exist primarily to perpetuate themselves; and organizational cultures are generally highly resistant to change unless you do the full forty years in the desert so the bearers of the infection can all die off or disappear. I’m not optimistic about change happening in APA. It makes it hard to hold on.


Filed under Psychology, Psychology profession, Public affairs

November 22nd


Today is the 51st anniversary of the assassination of JFK.  And I’m old enough to remember the day.

The first hint of something big happening occurred when we caught glimpses of teachers standing together, heads bent close so they could hear something on a portable transistor radio (look it up… they were quite big at the time). A short time later, a teacher came into the room and I remember her announcement. (Odd that I can’t remember who she was, but her words are still lodged in my head.) “Children, we have some very sad news…”

They sent us home from school early. And I remember the whole long weekend of grief, the families being together, the funeral on TV, an entire nation in mourning. I think the children learned how to feel from the adults, and what we mainly learned was that in this case, the adults didn’t seem to know how to feel either. Except that they were in shock. Something we did not understand, nor did they, but it got under our skin and stayed lodged in our minds, a critical dimension of the event. “Even our parents were crying!”

It was my own first experience of grief, of something this powerful. I don’t think I ever got over it, or that many of us did. Later catastrophes, the ones “everyone remembers where they were” for, pale in comparison. Perhaps after the first really bad one, your capacity for that kind of numbing shock is altered. Because it’s the first one that teaches you something you never knew before: your world can alter suddenly, irrevocably, even horribly.

After that time, you just know it. That awareness is forever part of you.


Filed under Psychology, Public affairs

Weekly book review: The Swerve

Every so often a book comes along that helps to organize a great deal of what we’ve learned about the world. Such books show us how news events, historical shifts, wars, peace, technological changes, and literature all occur and influence each other, showing that what Walter Cronkite used to call “the way it is” is actually a pretty complicated and interconnected network of ever-shifting things. I took a course in college co-taught by faculty from various departments, including history, literature, and philosophy, and they taught us how the science, art, and politics of the Elizabethan world all influenced each other. Steven Johnson’s work, particularly his virtual trilogy of The Ghost Map, The Invention of Air, and Where Good Ideas Come From (I reviewed the latter book here), is another example of that kind of integrative work that makes one gasp for air in delight.

Stephen Greenblatt’s The Swerve: How the World Became Modern (2011) is that kind of book. It is beautifully written, erudite, and informative. Most important, it’s very relevant to our own polarized times, when “Christian” fundamentalists seemingly want to wage permanent war against every other kind of fundamentalism and science, in a time when, faced with cataclysmic climate change and other urgent problems, we are once again witnessing a stupid, time-wasting War of the World Views.

The Swerve is the story of how an Italian scholar and humanist, Gian Francesco Poggio Bracciolini, managed to find and get back into circulation a famous ancient book that had disappeared from history, Lucretius’s On the Nature of Things. Greenblatt shows how this long lost book, a philosophical treatise on the physical world and the implications for how one lives, became, after its rediscovery, a key source of ideas that propelled us into the modern world. How, in short, a book you’ve probably never read and never heard of, had a massive effect on not just the world you live in, but on how you view and understand that world — and yourself.

I enjoyed this a lot; I started reading it idly in spare moments on my iPhone’s Kindle app, and it soon forced its way past the other things I’d been reading and became the main thing on my mind. It’s the kind of book that you find yourself thinking about when you should be doing other things.

It wasn’t quite balanced the way I expected from the reviews, in that it’s largely a biography of Bracciolini, the guy who managed to find and copy and get Lucretius’s book back into circulation; but I later decided that understanding the intellectual, political, and practical world Bracciolini lived in was background I needed in order to grasp the importance of the book in his world and time. This was blended with a fascinating discussion of the clash between the outlook of Lucretius and the Epicureans and other established views, both in ancient times (culminating eventually in the suppression and deliberate destruction of such works by the early, and already intolerant and violent, Church) and during the entire period leading up to and through the Renaissance, the Reformation, and beyond.

Greenblatt’s book shows how the basic scientific concepts of the ancient philosophers led to the ethical and psychological implications of their world view.  For if “the nature of things” or “the way it is” is that all of reality, including us, is composed of atoms and nothing else; if even our “souls” (or we might say, “minds”) are made up of, and organized by, those ever-moving atoms, then the belief structures, ancient and modern, involving the fear of hell and fantasy gods have no basis in reality, and are both unnecessary and unnecessarily harmful. In which case, the only sane thing would be to ditch the scary old gods and repressive theologies, and to find ways to appreciate and enjoy our brief time alive. (I personally don’t believe one has to do without a mystical or religious view of reality to accept and learn from such a philosophy, but perhaps I shall indeed order that hot tub!)

Greenblatt shows how the Church’s blowback against Lucretius’s ideas helped fuel the fires of the Inquisition, and how the book and its outlook were a challenge to the Church, to the established political powers, and probably to the dark, fear-based world views of many ordinary people. He finally discusses the importance of the book in helping to launch the Renaissance, the Reformation, and the Enlightenment, all the way through to (in an astounding ending) the nearly certain reference to Lucretius and his Epicurean philosophy in the Declaration of Independence, where Jefferson (who counted On the Nature of Things among his favorite books) declared that a main purpose of a government is to assist its citizens in their “pursuit of Happiness.”  (That was not, and at the time of the writing of the Declaration could not possibly have been, a goal listed by a “Christian” Continental Congress, at least not one as fantasized by modern right-wing evangelicals.)

For me as a psychologist, this book represents the retrieval of an important chunk of my profession’s “intellectual history.” As a human being, I find it simply a wonderful tickle and challenge to one’s outlook if one grew up “churched” and still finds oneself pondering, as the world turns to desert and the darkness looms, about one’s life, what it all might mean, and whether leasing or buying a hot tub makes the most sense. At the very least, hot tub or not, it certainly inspires me to read the original source of all these fireworks, Lucretius’s Nature of Things.


Filed under Books, Climate change, Psychology, Public affairs, Writing and society

Another Science Reporter Fail: “Education and the Aging Brain”

An article today in the NY Times by Patricia Cohen summarizes research suggesting that the way to preserve your brain into old age is to get more education.  The problem is that the main study cited appears, from the article, to be what we call a “correlational study,” which can prove nothing of the kind.

Correlational studies are done by rounding up a bunch of people who, say, all went to college, comparing them to people who didn’t, and then looking at both groups’ “brain fitness” test results in middle age.  These studies can be extremely valuable and, as my old grad school prof Tom Bouchard used to teach us, they are often the best way to study the really really interesting things in science, because they may provide a tremendous wealth of information.

The problem with such studies is that they can suggest, but not necessarily prove, that one thing causes another.  A mantra that all undergraduate psych students memorize is that “correlation does not equal causation.”  Just because two things tend to happen together, like education and healthier brains, doesn’t prove that the one thing caused the other.  It’s equally likely, say, that people with healthier brains were the ones who were both motivated to pursue, and who could succeed at completing, additional years of education.  (You might be more “motivated” to go to college just because for you, it’s easier.  If it’s easier, you will enjoy it more.  On the other hand, kids with reading or attention problems, say, or who just aren’t as sharp, may find it harder and more painful to study, so the thought of four more years of slogging through books and tests may seem much less pleasant for them.)
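The confound described above is easy to demonstrate. Here is a minimal Python sketch, using a made-up toy model rather than the study’s actual data: “baseline” brain health drives both college attendance and late-life test scores, while education itself has zero causal effect, yet the two still correlate.

```python
import random

random.seed(42)

# Hypothetical toy model (invented numbers, not the cited study's data):
# a "baseline" brain-health factor influences BOTH the odds of finishing
# college AND late-life test scores. College has ZERO causal effect here.
n = 10_000
baseline = [random.gauss(0, 1) for _ in range(n)]

# Higher baseline ability -> more likely to complete college.
college = [b + random.gauss(0, 1) > 0 for b in baseline]

# Late-life score depends only on baseline (plus noise), not on college.
score = [b + random.gauss(0, 1) for b in baseline]

grads = [s for s, c in zip(score, college) if c]
others = [s for s, c in zip(score, college) if not c]
gap = sum(grads) / len(grads) - sum(others) / len(others)
print(f"college-group advantage: {gap:.2f}")  # positive gap, zero causation
```

Running this produces a sizable “college advantage” on the late-life test, manufactured entirely by the shared baseline factor — exactly the pattern a naive reader would take as proof that college preserves the brain.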

Cohen’s main lapse is in failing to explain or discuss this caveat.  What she doesn’t say is that we really don’t know if the older folks with the more resilient brains got that way because they went to college, or whether they chose and managed to get through college because they had better-functioning or healthier brains in the first place.  If someone had set up the proper experiment fifty years ago, by randomly assigning some people to go to college and keeping others out, we might have had very different results when testing their smarts when they were older.  We might have found some difference between the two groups, but maybe not the same level of difference, which might mean that big headlines about “save your brain by going to college” would have seemed overly optimistic and simplistic.

There is evidence that doing things such as going to college, particularly decades ago when fewer people did so and admissions were more selective, did tend to require at least somewhat higher, probably “inborn,” ability levels.  We also have evidence that late-life problems like Alzheimer’s might be predictable even in one’s teens.   A study of elderly nuns showed that the ones who eventually developed Alzheimer’s had, as young women applying to a religious order, a very different style of writing: they used few or no complex sentences, for instance (the kinds with multiple clauses and commas in them), while the other young women’s essays were full of them.  The researchers in that study were able to do a sort of “experiment” by seeing how well reviewers could predict who would eventually develop Alzheimer’s, simply by rating the young women’s essays on whether they used “linguistically dense” sentences containing things like multiple ideas, or even just commas.  It turned out the predictive value of that kind of thing was weirdly high — the researchers could almost always match the eventual health outcome with the women’s writing styles from decades earlier.

While it is probably true that going to college develops a stronger brain, and there are certainly studies (some cited in her story) suggesting that mental exercise in general makes the brain healthier (though jogging probably helps more), these results are tentative and shouldn’t be overly simplified.  Almost all of the studies showing that people who study French or go to college have “better brains” are correlational; it’s at least as likely that people with better-functioning brains just enjoy using them for a wider range of mental tasks.

It’s good if readers recognize when reporters are perhaps jumping too hard on squishy findings.  Sloppy reasoning gets tedious to read over and over again. Reporters and publications know that they can attract “mindshare” with dramatic but misleading headlines, stories based on poorly done or poorly reported half-truths.  (e.g., “Air Force Still Denies Finding Alien Bodies at Area 51.”)  But this kind of bad reporting is not just misleading — it can also have negative real-world consequences.  People make poorer decisions when those decisions are based on poor or faulty information — they change their lifestyles, adopt new diets and exercise regimens, waste blood and treasure on things that in the long run, just don’t matter.  They also may do bad things to each other based on half-truths and this kind of ignorance.

When I was in college we were assigned to read a little workbook for honing our skills at spotting false conclusions derived from research.  The earliest articles were the easiest to figure out — a piece by someone allegedly named “Pileous Lupus Swarthy” was about determining whether werewolves could be “diagnosed” based on their response to a “silver allergy” test that entailed firing a silver bullet into the brains of people suspected of being werewolves. (Since it was “well known” that werewolves are allergic to silver… well, you get the idea.)  The article concluded that the diagnostic test worked perfectly, since 100% of the test subjects who were shot in the head died, proving that they were all werewolves.

Which is silly, but not so much when you remember that the identical reasoning was used in the Middle Ages to kill thousands of women suspected of witchcraft, by binding them hand and foot and throwing them into ponds.  The “proof” that they were innocent of witchcraft would be that they drowned.  “Real” witches were the women who managed to float.  They would be burned alive.


Filed under Psychology, Writing and society