I’ve mentioned on these pages that, once my three books are complete, one of the things I’d really like to do next is to teach young people three things:

- The Truth About Nature
- Natural Enterprises
- Critical Thinking
Earlier this week I described how I’d like to teach The Truth About Nature, using Model Intentional Communities — because you teach people by showing them what to do, not telling them, and because you learn better when you participate than when you just read or observe. I’ve also written a lot about Natural Enterprises. This article is about Critical Thinking, and how I believe we could learn to be better at it.
First off, not all thinking is, or should be, critical. Reflective thinking and creative thinking, for example, each use very different processes.
There are many university courses that teach you how to think critically, even one that you can take online. My ‘minor’ in university was philosophy, so I took quite a few of them. I found them pretty academic, and unnecessarily hard and unintuitive to master. One of the best models I’ve found of the critical thinking process is the one from Dartmouth’s Composition Center that I’ve illustrated above. So, a young person visiting a Model Intentional Community, for example, would do her homework, observe and participate during the visit, consider both what she was told and shown (“this is a better way to live”) and what she was not told (“what’s the dropout rate?”), draw inferences (“they seem to be having fun and really believe in what they’re doing”; “having wilderness so close does seem healthy and inspiring”; “this is too radical a departure from the way I live for me to want to do personally”), challenge and evaluate her own and others’ assumptions (“maybe living in the city is the real ‘radical departure’”; “this model doesn’t appear to be scalable”), and form tentative opinions (“this is an important experiment, but I don’t think I could live this way”). That could be the end of it. Or she might have to report back to class on her visit, or might decide to talk to friends about it, so she would then develop supporting arguments for the tentative opinions she had come to, and challenge those arguments, and their refutations, in her own mind and in conversations with others.
Following such a process would prevent two opposing critical thinking failures: in this case, writing off the Intentional Community as a bunch of wackos (perhaps based on what others said to her before her visit), or becoming so enthralled that she becomes blind to the Community’s problems and refuses to go home. So critical thinking is always a balancing act. It acknowledges that things (and people) usually are the way they are for a valid reason, and at the same time that just because something is ‘common wisdom’ doesn’t mean it shouldn’t change, perhaps radically. Balance doesn’t always lead to middle-of-the-road opinions, but it does require continuous skepticism.
Our culture has its own biases, and one of them is that ‘rational’ thinking is ‘sounder’ and preferable to both emotional thinking and relying on one’s instincts when forming opinions or making decisions. I don’t share this view. There are times when we can over-rationalize a situation, and when drawing on our emotional intelligence (“she says she’s happy here, but you can see in her face that she isn’t”) or our intuition (“this place is unhealthy, though I can’t put my finger on how I know that”) leads to more useful opinions and decisions, as hard as they may be to defend in our logic-biased human language. But I don’t think this invalidates the Dartmouth model: even if the synthesis, the challenging and the analysis we do are subconscious or emotional, the process remains unchanged, and may actually be richer and more valuable for the inclusion of these ‘irrational’ elements.
A while ago I wrote an article on media ‘spin’ describing how, using techniques like selective emphasis, judgement-charged wording, and omission, a reader could be led to utterly invalid opinions and conclusions, and how sometimes neither the writer nor the reader is conscious of their role in the deception. Take a quick re-read of the study of the NYT coverage that I cited in that earlier post. How was the critical thinking process perverted in this article? The synthesis process (compiling and organizing the facts) was confounded because the writer was deceived about, and hence misreported, the facts. Some of the people (like President Clinton) that the writer quoted said things based on unsupported assumptions (perhaps based on political expediency). And the NYT writer’s own conclusion (that the 1998 bombing of the Sudan pharmaceutical plant was a justifiable anti-terrorist action), which was based on incomplete and erroneous information (and perhaps that writer’s faith that Clinton wouldn’t lie on something that important), allowed him to bias the reader by what he wrote, by what he didn’t write, and by how and in what order he wrote it. The result is that the vast majority of people in the West concluded, erroneously, that this devastating act, based on either a horrendous intelligence error or deliberate criminal deception, which caused untold and lasting horror for many Sudanese, was a justifiable and relatively harmless action.
It would take extraordinary critical thinking skills to have been able to come to an appropriate conclusion in this instance. I remember that, at the time, I was completely taken in. I was overly generous in my trust of Clinton, because he was being subjected to the outrageous Republican witch-hunt at the time. I had read about the government genocide in Sudan, so I was inclined to believe that they might disguise a bioweapons plant as a pharmaceutical plant. And there were no obvious clues in the NYT coverage of the event (or other, mostly derivative articles in the mainstream media) to make me skeptical. It was really only the fact that I read a lot of the alternative press, whose coverage did raise doubts in my mind about what had happened, that caused me to change my opinion. Until then, I just wasn’t thinking critically.
There was a very interesting study done in California in 1990 called Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. The study drew together about 60 leading thinkers on the subject. You can read more about it here. In essence it said that effective critical thinking requires a combination of three things:

- a set of cognitive skills (interpretation, analysis, evaluation, inference, explanation and self-regulation)
- the disposition (a ‘critical spirit’) to use those skills
- the intellectual rigour and discipline to apply them consistently
In other words, you need to acquire these skills, be disposed to use them, and apply them in a disciplined way. I think our educational system tends to teach, and even require, students to be passive, but there are many opportunities in life to exercise these cognitive skills, and in my experience they improve with practice, not classroom training. So I wouldn’t be inclined, in teaching these skills to young people, to do much more than give them some interesting exercises to practice them. I think we’re all naturally curious, and once students realize they have these intellectual muscles I think they’ll be self-aware enough to start exercising them. I’m not sure you can teach critical spirit or intellectual rigour — they tend to be attitudinal and contextual (for example, I care a lot about whether Bush is lying to us, but much less about whether he’s clinically psychopathic, so a discussion of the former will energize my critical spirit and intellectual rigour while a discussion of the latter probably won’t).
What I think really needs to be taught is critical thinking as a defensive skill. We all think logically, but we can be fooled, inadvertently or maliciously. If I were to design a Critical Thinking course, it would quickly cover the basic cognitive skills, provide some exercises for students to get these muscles working, and would then focus entirely on learning to challenge intellectual deception. It would be almost entirely case-study and exercise-based, and would focus on the two principal media of intellectual deception: (a) ‘political’ speeches and editorial writing, and (b) advertising. As citizens, we need to learn to think critically about what we’re told by those with a political axe to grind. Politicians, speech-makers and rhetoricians of every political stripe, editorial writers, lobby groups and lawyers, and those in their employ and under their control (like the major commercial media) all essentially make their living by spinning the truth, by deception and distortion. They are not interested in balance, so we need to learn to challenge and balance what they tell us. And as consumers, we need to learn to think critically about what we’re told by those with an economic interest in deceiving us. Corporations, advertising agencies, stock and real estate scam artists, brokers, Ponzi and pyramid schemers, and promoters also make their living by spinning the truth, to sell their product, so we likewise need to learn to challenge and balance what they tell us.
Success in such a program would mean students who could deconstruct an unfair editorial, an inflammatory stump speech, a talk-show diatribe, a real estate huckster’s come-on, an infomercial, a televangelist’s sermon, or any of the other products of those con artists who prey on our lack of critical thinking to separate us from our reason or our money. The last class in the course would be to dissect an infomercial — some of them are powerfully seductive, and use every trick in the book.
It’s a survival skill we all need.