Smart People Working Collectively Can Be Dumber Than the Sum of Their Brains
'Groupthink' Is 30 Years Old, and Still Going Strong
JOHN SCHWARTZ and MATTHEW L. WALD / The New York Times, March 9, 2003
HOUSTON — At NASA, it really is rocket science, and the decision makers really are rocket scientists. But a body of research that is getting more and more attention points to the ways that smart people working collectively can be dumber than the sum of their brains.
The issue came into sharp focus in Houston last week at the first public hearing of the board investigating the Columbia disaster last month. Henry M. McDonald, a former director of the NASA Ames Research Center, testifying before the board, said that officials at the space agency want to do the right thing, but cannot always get the facts they need.
Note: J. Robert Oppenheimer (1904-1967), scientific director of the Manhattan Project's Los Alamos laboratory, recalled of the atomic bomb the line from the Bhagavad Gita, "I am become Death, the Shatterer of Worlds." [Photo: Oppenheimer and Einstein]
* See notes below
Referring to the shuttle program manager, Ron D. Dittemore, he said, "I have no concern at all that people like Ron Dittemore, presented with the facts, will make the right decision." But, he said, "the concern is presenting him with the facts."
In fact, NASA's databases are out of date: the agency cannot easily collect its data on damage to the shuttle on previous flights and then search the material for trends and warning signs.
Investigators are also questioning the quick analysis by Boeing engineers that NASA used to decide early in the Columbia mission that falling foam did not endanger the shuttle, though the foam strike is now considered one of the leading candidate causes of the craft's breakup. The analysis satisfied important decision makers, but some engineers continued to discuss situations involving possible problems related to the impact, a routine process NASA calls "what-if-ing."
Because the engineers directly connected to the process were satisfied that the foam was not a risk, they did not pass the results of their discussions up the line, even though they suggested the material could potentially cause catastrophic damage. But other engineers who had been consulted became increasingly concerned and frustrated.
"Any more activity today on the tile damage, or are people just relegated to crossing their fingers and hoping for the best?" asked a landing gear specialist, Robert H. Daugherty, in a Jan. 28 e-mail message to an engineer at the Johnson Space Center, just days before the shuttle disintegrated on Feb. 1.
The shuttle investigation may conclude that NASA did nothing wrong. But if part of the problem turns out to be the culture of decision making at NASA, it could bring renewed attention to group dynamics and to words like groupthink, an ungainly term coined in 1972 by Irving L. Janis, a Yale psychologist and a pioneer in the study of social dynamics.
He called groupthink "a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action." It is the triumph of concurrence over good sense, and authority over expertise.
It would not be the first time the term has been applied to NASA. Professor Janis, who died in 1990, cited the phenomenon after the loss of Challenger and its crew in 1986.
The official inquiry into the Challenger disaster found that the direct cause was the malfunction of an O-ring seal on the right solid-rocket booster that caused the shuttle to explode 73 seconds after launching.
But the commission also found "a serious flaw in the decision-making process leading up to the launch." Worries about the O-rings circulated within the agency for months before the accident, but "NASA appeared to be requiring a contractor to prove that it was not safe to launch, rather than proving it was safe."
Groupthink, Professor Janis said, was not limited to NASA. He found it in the bungled Bay of Pigs invasion of Cuba and the escalation of the Vietnam War. It can be found, he said, whenever institutions make difficult decisions.
David Lochbaum, a nuclear engineer at the Union of Concerned Scientists, has studied nuclear plants where problems have gone uncorrected because of internal communications failures and poor oversight. His list includes the Davis-Besse plant near Toledo, Ohio, where in March 2002 technicians discovered that rust had eaten a hole the size of a football nearly all the way through the vessel head. Only luck prevented what might have become an American Chernobyl.
"As you go up the chain, you're generally asked harder and harder questions by people who have more and more control over your future," Mr. Lochbaum said. The group answering the questions then tend to agree upon a single answer, and to be reluctant to admit it when they don't have a complete answer.
Engineers, he said, can also become complacent in the face of a potential problem that has not gone badly wrong before.
"In the Challenger thing, where they had O-ring problems on previous flights, it got to be an annoyance, but not a symptom of a disaster," he said. Nuclear plants suffer from the same false security, he said; six plants had previously suffered minor corrosion, but none was discovered in a condition like Davis-Besse.
It is only common sense that large institutions should try to make sound decisions, said John Seely Brown, a former researcher at Xerox and a co-author of "The Social Life of Information." But it can be bewilderingly hard to do in practice.
"Often it takes tremendous skill in running a brainstorming session," Mr. Brown said. "Every once in a while, the random way-out idea needs to have more of a voice."
But giving the dissenting voice or voices greater influence turns out to be tricky. "You've got to figure out something in a finite amount of time," Mr. Brown said, or find yourself, as NASA is now, "swimming in a sea of hypotheses."
* Notes

Oppenheimer was made director of the Los Alamos lab, and in 1943 he gathered about 200 of the best scientists in the field to live and work there. They designed two bombs, one using uranium (called "Little Boy") and one using plutonium ("Fat Man"). By early 1945, the plants at Oak Ridge and Hanford had produced enough raw material for testing.

On July 13, 1945, at a site called Trinity 200 km southwest of Alamogordo, a plutonium bomb was assembled and brought to the top of a tower. The test was postponed by thunderstorms. On July 16, the bomb was detonated, producing an intense flash of light seen by observers in bunkers 10 km away and a fireball that expanded to 600 meters in two seconds. The cloud grew to a height of more than 12 kilometers, boiling up in the shape of a mushroom. Forty seconds later, the blast of air from the bomb reached the observation bunkers, along with a long and deafening roar. The explosive power, equivalent to 18.6 kilotons of TNT, was almost four times larger than predicted.
Some of the Los Alamos scientists had circulated a petition asking President Truman to give Japan a warning and a chance to surrender before using the bomb. Some signed, some didn't, but the project remained a secret until the end.
Twenty-one days after the test, the B-29 bomber Enola Gay dropped the uranium bomb on Hiroshima, Japan. Three days later the plutonium bomb was dropped on Nagasaki. The two bombs killed approximately 150,000 people. Earlier in the year, intense conventional bombing of Tokyo had killed about 100,000 people without causing Japan to surrender, but on August 15, 1945, Japan officially surrendered, bringing an end to World War II.