During our recent presidential election, I was introduced to a great concept that I had been theretofore unaware of: “the illusion of explanatory depth.”

(And yes, I did intend to write “theretofore” in the opening. But, full disclosure, I’ve never used that word before and had to look it up to be sure it was a “real” word.)

The illusion of explanatory depth is a simple concept: We often don’t know as much as we think we know. As researchers Steven Sloman and Philip M. Fernbach explained recently in the New York Times, the illusion of explanatory depth is the cognitive bias wherein “We typically feel that we understand how complex systems work even when our true understanding is superficial. And it is not until we are asked to explain how such a system works — whether it’s what’s involved in a trade deal with China or how a toilet flushes — that we realize how little we actually know.” (NYT, October 19, 2012)

Now, the reason I heard about this concept during our recent election is that Sloman, Fernbach, and others hypothesized that the illusion of explanatory depth plays a significant role in enabling our extremely polarized political zeitgeist. “People often hold extreme political attitudes about complex policies. We hypothesized that people typically know less about such policies than they think they do … and that polarized attitudes are enabled by simplistic causal models.” (Fernbach, Rogers, Fox, Sloman, 2012)

They did the research (so you don’t have to) and found that they were right. They also found the following:

  • If you ask people who hold extreme views about a policy to explain mechanistically how that policy works, and they are then confronted with how little they actually know about the policy, they tend to moderate their views.
  • But, if you merely ask people their reasons for holding the view they hold (e.g., it aligns with my values, it’s the way it’s always been), this either has no effect on the strength of their view or makes it stronger.
  • And, finally, they determined that people whose positions have been moderated — by trying to explain a policy and realizing they don’t know as much about it as they thought they did — go on to change their behaviors as well.

At this point in this overlong post, you might be wondering why I’m telling you all this. At our most recent Transcribe Live in the Lab, we took on the topic of the illusion of explanatory depth and its relevance to our practice of applied collaboration. We asked: How will a better understanding of this concept help us help our clients collaborate to address complex business challenges?

The first answer is an awareness of context. Sloman, Fernbach, et al. focused on the political arena, whereas our work is primarily in the corporate world. As we discussed the topic, we realized that the competitive nature of much of corporate culture may in fact reinforce the illusion of explanatory depth. We see it time and again: the perception that it is not okay to not understand something, regardless of how complex a model, system, or process is. But you have to weigh in, give your considered opinion. And there’s a clock ticking. Don’t get in the way. Don’t slow us down. But please take a position. And tell us why.

The fact is, when you bring a group of stakeholders from various backgrounds together to collaboratively solve a complex problem, some significant percentage of the group will be affected by the illusion of explanatory depth, either about the challenge in question, or some component part or system.

Further, when there are significant hidden gaps in people’s understanding of something they are trying to fix or improve, they don’t have all the tools they need to arrive at an effective, enduring, creative solution. And discovering those hidden gaps over time, bit by bit, will likely result in inefficient solution development, rework, extended timelines, and increased costs.

With a better understanding of the illusion of explanatory depth, there are some things to keep in mind the next time you engage a diverse group of stakeholders, with strong and varied opinions, to solve a complex problem:

  • Recognize that corporate culture may be reinforcing the illusion and helping to hide critical gaps in understanding.
  • Make sure you build in adequate time for learning and level-setting, and be sure to ask people to “‘unpack’ complex systems” and explain how they work, rather than simply letting them express why they feel the way they do about them.
  • Don’t be afraid of complexity. A little clarity about how complex things are helps moderate people’s beliefs, and moderating people’s beliefs can “…increase their willingness to compromise and explore opposing views. … [and] could have implications for reducing impasse and increasing value creation…” (Fernbach, Rogers, Fox, Sloman, 2012)