How to control confirmation bias in your analysis
Confirmation bias is still a big problem in business analytics

A few months back I delivered a conference speech on how to run a world-class business analytics function. As is usually the case at these events, I was approached immediately afterwards by lots of people asking lots of questions. Although it can be quite exhausting to try to answer them all, it’s actually the part I enjoy the most, because it helps me keep a finger on the pulse of the everyday issues faced by people working in business analytics.

One memorable conversation was with someone who seemed genuinely dismayed about the pressure she was under to deliver ‘desirable’ analysis results. She described a situation several months prior in which she was put under undue pressure to find a fact — any fact — that would support an assertion being made by her client. The client made it clear that they only wanted to hear facts that helped them with their objective and had no interest in counter-arguments. Despite her best efforts to push back, she felt helpless, and in the end she relented and provided a supporting fact, even though there were numerous other facts available that did not support the assertion, simply to get the work off her plate and move on. She clearly felt pretty disgusted with herself for allowing it to happen.

This kind of confirmation bias is rife in the business world, and many will argue that it is omnipresent in academia also. It might not always manifest itself as extremely as in the example I’ve just given, but it frequently hides in plain sight. I am sure many readers can relate to situations where they have exclusively looked for facts to support an argument to the exclusion of all other relevant facts.

Why should we worry about confirmation bias?

Confirmation bias helps nobody, especially not the business or organization that suffers from it. A lack of openness to alternative perspectives increases the chance of disagreement among decision makers with entrenched points of view, which in turn increases the chance of erroneous decisions or no decisions at all (decision paralysis).

In an analytics context, I often see confirmation bias manifest itself in three ways:

  • At the beginning, it can manifest itself in the way that analytics is requested. Someone in the organization has an agenda or aim, and wants to use analytics to support it. So their questions will be loaded with confirmation bias: Do you have evidence to support that …? Find me data which shows…
  • In the middle, a common behavior is to ask for analysis to be redone on subgroups or over different timeframes until it eventually yields a palatable conclusion (the short simulation after this list illustrates why this is risky). What does this say if we just restrict it to the last 3 months? What about the US only?
  • At the end, the analysis provided by the analyst is recast in a way that looks like it supports a desired conclusion, often ignoring warnings from the analyst about this.
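
To make the second behavior concrete, here is a minimal simulation. It is my own illustration rather than anything from the original article, and the subgroup names, sample sizes and threshold are made up. It shows that if there is genuinely no effect in the data, re-slicing it across enough subgroups and timeframes will still turn up results that appear to confirm the desired story purely by chance:

    # Illustrative only: with no real effect in the data, testing enough
    # slices will still produce 'significant' results by chance alone.
    import random
    from statistics import mean, stdev

    random.seed(42)

    def fake_metric(n):
        """Random 'sales change' values with no underlying effect at all."""
        return [random.gauss(0, 1) for _ in range(n)]

    def looks_significant(sample, threshold=2.0):
        """Crude z-test against zero: |mean| / standard error > threshold."""
        se = stdev(sample) / len(sample) ** 0.5
        return abs(mean(sample)) / se > threshold

    # 10 hypothetical regions x 3 timeframes = 30 ways to slice the data
    slices = [f"region_{r}_last_{m}_months" for r in range(10) for m in (3, 6, 12)]
    hits = [s for s in slices if looks_significant(fake_metric(50))]

    print(f"Checked {len(slices)} slices; 'significant' by chance: {len(hits)}")
    print(hits)

With roughly a 5% false-positive rate per slice, one or two of the 30 slices will typically look ‘significant’ even though nothing real is going on, and that is exactly the result a motivated requestor will latch onto.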

In their intense focus on finding data that supports the desirable conclusion, individuals often fail to take a step back and realize that, by ignoring counter-arguments or presenting analysis in an unbalanced way, they may not be making the right decision for the business.

Controlling for confirmation bias

I am not a believer in debiasing. You can’t eliminate a bias. You can, however, control it. And the best way to do that is to introduce structured, consistent processes which reduce the chance that bias can play a role.

The best point in time to sniff out confirmation bias is at the point of the initial request for analytics. This is the point at which the requestor can be briefed about the neutral values and unbiased methods of the analytics group, and where the request can be phrased in a way that supports a balanced, evidence-based approach.

When you receive a request for analysis, and assuming it is not just a simple raw data pull, you can consider some of the following ideas to control confirmation bias:

  1. Write a service charter to share with clients at the outset. As well as containing commitments on turnaround times and the like, it can be used to make the evidence-based values of the team clear and to get the client’s agreement to work in a way that is consistent with those values: for example, to agree on an unbiased approach to the problem, and to commit not to edit the conclusions after the fact without consultation.
  2. Ensure the analytics request is phrased in the form of a neutral question and not an objective. Bad: We are looking for data that proves that sales have been declining because clients are now further away since our office move. Good: Based on our data, what possible reasons can be suggested for the recent decline in sales?
  3. Debrief the client on the results of the analysis and make a record of the results and the debrief. Debriefing with the client helps avoid misinterpretation of the results and can preempt further requests to dig deeper. Keeping a record means that there is recourse to intervene at a later point if the results are misinterpreted or misused.

One way to ensure that all this happens consistently is to set up a standard process for receiving and handling analytic requests. For example, you could create an analytics request form or problem statement that needs to be completed and agreed between the client and the analysts. Here’s an example I put together to illustrate this:

[Image: example analytics request form / problem statement]
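
Since the image is not reproduced here, the sketch below gives a rough idea of what such a request form or problem statement might capture. The field names are my own illustrative assumptions, not the template from the original image:

    # Illustrative sketch of an analytics request / problem statement record.
    # The fields are assumptions about what such a form might capture,
    # not a reproduction of the template in the original image.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AnalyticsRequest:
        requestor: str
        neutral_question: str            # phrased as a question, not an objective
        background: str                  # context: why the question matters now
        data_sources: List[str] = field(default_factory=list)
        in_scope: List[str] = field(default_factory=list)
        out_of_scope: List[str] = field(default_factory=list)
        agreed_by_client: bool = False   # client sign-off on the neutral framing
        debrief_planned: bool = True     # results to be debriefed and recorded

    request = AnalyticsRequest(
        requestor="Sales leadership",
        neutral_question="Based on our data, what possible reasons can be "
                         "suggested for the recent decline in sales?",
        background="Sales have declined since the office move and several "
                   "explanations are circulating.",
        data_sources=["CRM", "order history", "client distance data"],
    )
    print(request.neutral_question)

Whatever form it takes, capturing the request in a structured record that both sides agree to makes it much easier to spot when a ‘question’ is really an objective in disguise.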

If the role of a business analytics function is to support accurate decision making in organizations, then it is essential that it has a way to counter confirmation bias and can produce balanced analytic perspectives for its clients. By installing the right processes and encouraging clients to follow them, you can make a lot of progress towards this aim.

I lead McKinsey's People Analytics and Measurement function. Originally I was a pure mathematician; then I became a psychometrician. I am passionate about applying the rigor of both those disciplines to complex people questions. I'm also a coding geek and a massive fan of Japanese RPGs.

All opinions expressed are my own and not to be associated with my employer or any other organization I am connected with.



Keith McNulty

Leader in Technology, Science and Analytics | Mathematician, Statistician and Psychometrician | Author and Teacher | Coder, Engineer, Architect

5y

Elizabeth Arzadon makes an excellent point in this thread. I rarely see people approaching multiple teams for an analytic perspective on an issue. Where this is possible it is a very strong indication that the individual is genuinely looking for a balanced perspective. (Acknowledging that many organizations are not brimming with analytics resources).

Saud Al Zakwani

Head of Digital Transformation @ Petroleum Development Oman | Ai | DataScience | RPA

5y

It's exciting when the results defy your beliefs. If you are not excited, you are probably using analytics for your personal objective.

Keith Tully

Chaos Clearing Professional | Succession Analyst and Coach | Business Consultant | AAUN, OUAT, and Higher Mission Board Advisor | Author | KoC | Guild Keeper and Elder | 2-yr Coin Carrier

5y

I think this graph is too objective; you'd have to shade in much more of the 'what confirms your beliefs' circle for most people for the 'what you see' part to be correct.

Kim Walter-Chaplin

Total Rewards Solutions | Strategic Advisor | Innovating HR | Translating Insights into Actions

5y

My favorite analytics quote: "If we have data, let's look at the data. If all we have are opinions, then let's go with mine."

🥕Maynard Clark🌱

Advisory Board: Quantum Risk Analytics; Executive Director: Vegetarian Resource Center; Consultant; Editor; Wikipedian

5y

What might each of us believe about confirmation bias?
