I recently bought a book that I was really excited about. It's one of those books that's created a lot of buzz and it was highly recommended by someone I respect. The author's pedigree included Harvard, Stanford, McKinsey, and a career as a successful entrepreneur and CEO.
Yet about halfway in I noticed that he was choosing facts to fit his story and ignoring critical truths that would indicate otherwise, much like Malcolm Gladwell often does in his books. Once I noticed a few of these glaring oversights, I found myself unable to fully trust anything the author wrote and set the book aside.
Stories are important and facts matter. When we begin to believe false stories, we begin to make decisions based on them. When these decisions go awry, we're likely to blame other factors, such as ourselves, those around us or the surrounding context, rather than the false story. That's how so many businesses fail. They make decisions based on the wrong stories.
Go to just about any innovation conference and you will find some pundit on stage telling a story about a famous failure, usually Blockbuster, Kodak or Xerox. In each case, the reason given for the failure is colossal incompetence by senior management: Blockbuster didn't recognize the Netflix threat. Kodak invented, but then failed to market, a digital camera. Xerox PARC developed technology, but not products.
In each case, the main assertion is demonstrably untrue. Blockbuster did develop and successfully execute a digital strategy, but its CEO left the company due to a dispute and the strategy was reversed. Kodak's EasyShare line of digital cameras was a top seller, but couldn't replace the massive profits the company made developing film. The development of the laser printer at Xerox PARC actually saved the company.
None of this is very hard to uncover. Nevertheless, the author fell for two of these bogus myths (Kodak and Xerox), even after obviously doing significant research for the book. Most probably, he just saw something that fit his narrative and never bothered to question whether it was true, because he was too busy validating what he already knew to be true.
This type of behavior is so common that there is a name for it: confirmation bias. We naturally seek out information that confirms our existing beliefs. It takes significant effort to challenge our own assumptions, so we rarely do. To overcome that is hard enough. Yet that's only part of the problem.
In the 1950s, Solomon Asch undertook a pathbreaking series of conformity studies. What he found was that in small groups, people will conform to a majority opinion. The idea that people have a tendency toward conformity is nothing new, but the fact that they would give obviously wrong answers to simple and unambiguous questions was indeed shocking.
Now think about how hard it is for a more complex idea to take hold across a broad spectrum of people, each with their own biases and opinions. The truth is that majorities don't just rule, they also influence. More recent research suggests that the effect applies not only to people we know well, but that we are also influenced by second- and third-degree relationships.
We tend to accept the beliefs of people around us as normal. So if everybody believes that the leaders of Blockbuster, Kodak and Xerox were simply dullards who were oblivious to what was going on around them, then we are very likely to accept that as the truth. Combine this group effect with confirmation bias, and it becomes very hard to see things differently.
That's why it's important to step back and ask hard questions. Why did these companies fail? Did foolish and lazy people somehow rise to the top of successful organizations, or did smart people make bad decisions? Was there something else to the story? Given the same set of facts, would we act any differently?
The use of the term "paradigm shift" has become so common that most people are unaware that it started out having a very specific meaning. The idea of a paradigm shift was first established by Thomas Kuhn in his book The Structure of Scientific Revolutions, to describe how scientific breakthroughs come to the fore.
It starts with an established model, the kind we learn in school or during initial training for a career. Models become established because they are effective and the more proficient we become at applying a good model, the better we perform. The leaders in any given field owe much of their success to these models.
Yet no model is perfect and eventually anomalies show up. Initially, these are regarded as "special cases" and are worked around. However, as special cases proliferate, the model becomes increasingly untenable and a crisis ensues. At this point, a fundamental change in assumptions has to take place if things are to move forward.
The problem is that most people who are established in the field believe in the traditional model, because that's what most people around them believe. So they seek out facts to confirm these beliefs. Few are willing to challenge what "everybody knows" and those who do so are often put at great professional and reputational risk.
Now we can begin to see why not only businesses, but whole industries get disrupted. We tend to defend, rather than question, our existing beliefs and those around us often reinforce them. To make matters worse, by this time the idea has become so well established that we will often incur switching costs if we abandon it. That's why we fail to adapt.
Yet not everybody shares our experiences. Others, who have not grown up with the conventional wisdom, often do not have the same assumptions. They also don't have an existing peer group that will enforce those assumptions. So for them, the flaws are much easier to see, as are the opportunities to do things another way.
Of course, none of this has to happen. As I describe in Mapping Innovation, some companies, such as IBM and Procter & Gamble, have survived for over a century because they are always actively looking for new problems to solve, which forces them to look for new ideas and insights. It forces them to question what they think they know.
Getting stories right is hard work. You have to force yourself. However, we all have an obligation to get it right. For me, that means relentlessly checking every fact with experts, even for things that I know most people won't notice. Inevitably, I get things wrong, sometimes terribly wrong, and need to be corrected. That's always humbling.
I do it because I know stories are powerful. They take on a life of their own. Getting them right takes effort. As my friend Whitney Johnson points out, the best way to avoid disruption is to first disrupt yourself.