December 1, 2004
Thesis Statement / Introduction
It’s called groupthink, and it’s defined as “a pattern of thought characterized by self-deception, forced manufacture of consent and conformity to group values and ethics.” It’s a phenomenon in which people seek unanimous agreement in spite of contrary facts pointing to another conclusion.
Groupthink corrupts the communication models we’ve studied, at the levels of both interpersonal and mass communication. Groupthink is not persuasion. It is a psychological game of peer pressure and fear of exclusion. And it can end in tragedy.
Failing to criticize one another’s ideas, to examine alternatives, or to seek expert opinion; gathering information selectively; neglecting contingency plans: these are some of the decision-making failures groupthink produces.
An illusion of invulnerability, rationalization of poor decisions, belief in the group’s inherent morality, pressure on dissenters, self-censorship, a pretense of unanimity, and “mindguards” who screen out negative information: these are some of the characteristics of groupthink.
Relation to Class Materials / Evidence
All three of our main materials from the class have, directly or indirectly, involved or discussed groupthink and its implications. I’ll begin with Bill Moyers, who focused on it in “The Truth About Lies.” Our most primitive need, he said, is a sense of belonging, which we fear losing if we have to admit the truth; we would rather conform to the group and be accepted than stand apart and keep our individual identity.
Moyers cited the Bay of Pigs invasion and the Challenger disaster as examples of what happens when the “popular vote” supersedes rational thinking. Nobody wants to be the one to raise the “unpopular side,” the ugly truth on the other side of the debate. Why? Because would-be dissenters fear being shunned by the group.
But we can turn to Vance Packard for probably the first indirect reference to groupthink, although the term had not yet been coined. In “Eight Hidden Needs,” Packard discusses a trend among advertisers: the appeal to acceptance, love and emotional security, among other things.
Later on, particularly in Chapter 21, “The Packaged Soul?”, Packard notes that the same trend is showing up in politics as well. He worries about the persuasive techniques of market research and their disturbing, “Orwellian” implications: “packaged communities,” depth probing, even the idea that human behavior could be controlled electronically like a machine. All of this, Packard warns, advances harmful values such as consumption and groupthink. He watches the trend of persuading us as consumers expand into persuading us as citizens, especially through the manipulation of emotions.
Communication Models (Newcomb; Westley-MacLean)
As I said before, groupthink affects both interpersonal and mass communication. For the former, look at the Columbia, the Challenger, the Bay of Pigs and Pearl Harbor. In each case, whether within the Kennedy administration and the CIA, among NASA decision-makers, or in other groups, there was a communication breakdown: dissenters and warnings were ignored, and feedback was suppressed.
On the mass-communication level, consider what Hitler managed to do to practically an entire country. During the Vietnam War, the public was afflicted with groupthink, and those opposing military action were hated and denounced as disloyal to the United States. The public had been given a premise for war based on questionable grounds (the domino theory); nobody was interested in hearing why war might not be a good idea. The media are affected too, in what they will report and how.
Implications / Conclusions
Hitler and Nazism. Pearl Harbor. Vietnam. The Bay of Pigs. The space shuttle disasters. None of these had to end in tragedy, but they did. I doubt anyone in the room, maybe anyone in the world, would refuse to spend five dollars to save billions of dollars and, more importantly, countless lives. It didn’t even cost five dollars for physicist Richard Feynman to buy the materials he used to show what went wrong with the Challenger in 1986: ice water, a rubber O-ring and a C-clamp. That was all it took to demonstrate that something had failed on the shuttle, and clearly among its decision-makers. If that’s all it takes to prove that NASA’s decision-makers did nothing to prevent a tragedy, something isn’t right. I can’t explain everything that went wrong in the events above, or how each tragedy could have been avoided, but human life is not a price we can afford to pay.
For another example, just weeks ago a Ukrainian television channel was under government pressure to slant its news coverage in favor of a political candidate.
Smart people can collectively make stupid decisions. Some who are in power have self-serving agendas, and people have to stand up to them, or the disasters will continue. Groupthink isn’t communication or persuasion; it holds out acceptance and threatens rejection. The Newcomb and Westley-MacLean models don’t work when fear carries the message. Whether we’re talking to two people or two billion, communication has to have truth, and the receiver’s trust. Without them, the message is lost, as so many lives have needlessly been.