Cognitive Biases Inhibiting Innovation in Top Management Teams

The top management team of an organization is arguably the most important team for deciding and implementing innovation strategies. It typically decides which markets to enter, which markets to exit, and which new technologies to pursue. But decision making is fraught with biases – errors in judgment that affect the quality of decisions, sometimes with devastating results. In this post we will see how basic human psychology affects the decision making of top management teams.

Searching and scanning. Two of the most common functions of top management teams are to make sense of the environment and to decide which direction to take with the organization’s innovation efforts (Isaksen & Tidd, 2006). First, top management teams are responsible for searching and scanning the environment for new opportunities and markets, or for significant changes to the firm’s existing market. The team must take in vast amounts of information, often of a contradictory nature. Perhaps more often, the signals that the team is supposed to act upon are weak amid a certain amount of noise. This naturally poses a problem when judging which (weak and noisy) signals to act upon.

Strategic decision making and problem solving. Second, top management teams are responsible for deciding the direction of the organization’s innovation efforts. These decisions typically span which new markets to enter or exit, which new technologies to pursue, how to allocate resources to innovation projects, what kind of project portfolio to implement, and how risks and uncertainties should be managed.

Cognitive biases in top management teams’ decision making

Without being aware of the potential biases that may affect our decision making, we subject ourselves to an increased chance of making mistakes and inherently bad decisions (Kahneman, 2011). History holds many examples of failures to make adequate decisions about which direction an organization should take. Examples include Polaroid’s failure to recognize and move to the emerging digital imaging technology, resulting in the downfall of the company. In a similar vein, the Swedish company Hasselblad faced near demise when it reacted too late to the new digital technology. Another example is that of IBM, which in the 1990s received plenty of warning signals that technology was moving away from big mainframe computers toward more decentralized networked computing, but reacted too late and nearly lost its business. Thus, decisions about a firm’s innovation focus have tremendous consequences – and decision makers in organizations are subject to the same human biases as everybody else.

Common pitfalls in our thinking when solving problems and constructing scenarios include:

  • Sticking to an a priori hypothesis when approaching a particular problem or scenario, and disregarding contradictory information. If we hold strong preconceptions, we tend to see information as “evidence” that only supports our a priori idea. This is especially relevant when top management teams try to make sense of the contradictory and weak signals coming from the environment. This phenomenon is also called confirmation bias.
  • Illusion of predictability. When subjected to this bias, decision makers typically underestimate the uncertainty of the environment.
  • Narrowing the problem too early. Problems, especially complex ones involving strategy, often have multiple solutions and these need to be carefully considered before making decisions. It is usually a good idea to consider information from different sources that either strengthen or weaken a specific scenario or idea.
  • Overconfidence in a chosen strategy. Often, this notion is fueled by an illusion of control, and the decision makers fail to acknowledge alternatives to the strategy, or suitable exit points should the chosen strategy prove less successful than hoped.

(Isaksen & Tidd, 2006; Kahneman, 2011; Walsh, 1995)

A good example of someone who recognized the biases and errors in his top management team’s decision making, and consequently changed the way decisions were made, is former US president John F. Kennedy. His significant failure in the 1961 Bay of Pigs Invasion, where the US sent approximately 1,500 ground soldiers against the 225,000-strong Cuban army, was built on a number of preconceptions and decisions that would seem ludicrous to an outsider. The decision making in Kennedy’s team even gave rise to a now widely used concept, groupthink, when the research psychologist Irving Janis (1972) later analyzed the tape recordings of the meetings.

Groupthink is characterized by massive pressure to conform, illusions of invulnerability, and the suppression of contradictory information. Thus, in a team where groupthink reigns, decision making based on sound analysis of information and scenarios is heavily impaired. Kennedy realized this himself, however, and built a new administration in which opposing views and dissent were encouraged. Later, Kennedy and his team successfully navigated the razor-sharp political knife edge of the 1962 Cuban Missile Crisis.

By Leif Denti


References

Isaksen, S. & Tidd, J. (2006). Meeting the innovation challenge: Leadership for transformation and growth. Chichester: John Wiley & Sons.

Janis, I. L. (1972). Victims of Groupthink: a Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin.

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Walsh, J. P. (1995). Managerial and organizational cognition: notes from a field trip. Organization Science, 6, 1-41.

About the author


Leif Denti is pursuing his doctoral degree in Psychology at the University of Gothenburg, Department of Psychology. His main research focus is how project leaders stimulate creativity and innovation in their project teams (project name: Management for Sweden). Leif Denti is also involved in a research project at the School of Business, Economics and Law, University of Gothenburg, studying organizational factors that may influence problem solving in project teams. Leif Denti holds a licentiate degree in Psychology from the University of Gothenburg.

