According to a 1997 report from Idaho, thousands of American deaths have been attributed to the chemical compound dihydrogen monoxide (DHMO), mainly through accidental inhalation. The substance can cause severe burns, among other unpleasant side effects, and “for those who have developed a dependency on DHMO, complete withdrawal means certain death.”
A 14-year-old student passed copies of this report to his classmates and asked, “What should we do about this dangerous substance?”
The result was overwhelming: 86 percent of respondents voted to ban DHMO. The punch line, of course, is that none of the students stopped to reason that dihydrogen monoxide (two atoms of hydrogen bonded to one atom of oxygen) is also known as H2O, or water, which can indeed kill: through drowning, through scalding, or, in its absence, through dehydration. The student presented his results as a science fair project under the all-too-apropos title “How Gullible Are We?”
While we might not all be as gullible as the participants in that well-crafted social experiment, we are, in fact, overwhelmed. Swamped by a whirling dervish of information and data, most people lack the time and resources to fully consider everything they are told, and most training and professional development programs fail to address this critical challenge.
When I first read about this experiment, it reminded me of a recent business simulation Regis facilitated in which two of the participating teams exhibited similar snap-judgment behavior. We told four teams of second-year MBA students that they would compete over several rounds to determine who could drive business transformation and make their imaginary company the most profitable. Results would be based on the decisions each team made about the hypothetical company and the global conditions it faced. As part of the simulation, we invited an “expert” to present during one of the rounds.
Each team received the same setup instructions and data. However, two teams also listened to a 10-minute executive summary from a qualified economist, who had been given the same financial reports and supporting information that the students had received. He distilled that information into a PowerPoint deck and presented it to those two teams separately.
To our surprise, the economist’s summary, while accurate and well-reasoned, did not help the teams at all. In fact, it proved detrimental to their performance throughout the program. One of the teams that did not hear the economist came in first, closely followed by the other. Both teams that heard the economist did poorly across the simulation rounds.
Our observations revealed that the teams that received the executive summary spent no time confirming or challenging the information. They accepted the summary at face value and made it their central focus. Like the students who voted to ban DHMO before fully understanding the situation, these MBA candidates acted without developing their own understanding of the levers at work.
Some interesting research in Daniel Kahneman’s book Thinking, Fast and Slow suggests why these participants so readily accepted information at face value. Kahneman, a recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology challenging the rational model of judgment and decision making, explains that our brains operate using two distinct systems.
System 1 is our fast brain. We can think of it as our automatic pilot, or subconscious brain. It gets us through common daily activities, such as driving to work or brushing our teeth, and it is also responsible for quick decisions and judgments. When we receive bits of information, it is System 1 that jumps to a conclusion, filling in the gaps along the way.
System 2 is our slow brain. It is responsible for analyzing information, assessing data, and weighing solutions. As Kahneman explains, we spend most of our time reacting to the world around us through the lens of System 1. System 2 tends to idle, kicking into gear only when System 1 cannot neatly fit a situation into a pre-established worldview. This sheds some light on why nicely packaged information is rarely challenged: we unconsciously prefer to slide by rather than exert the extra effort required to wake the Core Abilities of critical, creative, and systems thinking that System 2 engages.
These observations have since strongly influenced the way we at Regis design our business simulations and other training and professional development programs, focusing on the underlying beliefs and behaviors that drive learners’ actions. More on this topic in later posts.
Michael Vaughan is the CEO of The Regis Company, a global provider of business simulations and experiential learning programs. Michael is the author of the books The Thinking Effect: Rethinking Thinking to Create Great Leaders and the New Value Worker and The End of Training: How Business Simulations Are Reshaping Business.