By Christo Lute, Director of Advanced Analytics
In our reasoning and decision-making, cognitive biases and heuristics act like shortcuts: they can be effective and useful, but they also frequently cause errors in reasoning, leading us down the wrong paths. Thankfully, many biases and heuristics can be mitigated in a simple fashion: first notice that the bias is happening, then address it. Without a practice that methodically triggers that noticing, the bias will likely take effect. Of course, this first step of noticing is easier said than done.
In the context of data analytics, cognitive biases and heuristics are particularly interesting because working with data doesn’t appear to give people any additional capacity for avoiding common biases. Even someone with a strong data background and familiarity with sampling and other types of statistical errors can still be prone to making rationality errors.
Below are a few cognitive biases, along with simple practices that should help mitigate each one. If the recommendations seem basic, it’s because the solution to the problem is often obvious from the outside. From the inside, cognitive biases seem innocuous and almost imperceptible, which is why they are so persistent in the first place. The long-term resolution to dealing with biases isn’t to simply notice the errors in reasoning and correct them, but to design methods and techniques that prevent biases from influencing outcomes in the first place.
Confirmation Bias
Much of data science is about testing a hypothesis to assess its validity. If we place these pants next to those shirts on the rack, will we sell more shirts? If we purchase that new software, will we have an increase in productivity? These are critical practices for business, but the very act of creating a hypothesis opens you to the possibility of falling into confirmation bias.
In essence, once you have formed a hypothesis, you tend to assume its expected outcome is more likely, and that expectation may cause you to ignore alternative explanations for the data you see. By thinking about the viability of shirts sold on the same rack with pants, since that’s a normal sort of pairing, you may fail to consider pairing shirts with ice cream, which could end up being more profitable. By choosing to pair clothes with other clothes, you confirm your bias that clothes are best sold with clothes.
Practical mitigation: write down your hypothesis, and then write down additional potential hypotheses on the same data. Make sure some of these hypotheses go against your expectations. This widens your view of expected outcomes and forces you to test for the alternatives prior to seeing the results.
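The practice above can be sketched in code: register several competing hypotheses up front, including ones that contradict your expectation, and score them all against the same data. This is a minimal sketch with purely illustrative, simulated numbers (the pairing names and sales figures are assumptions, not real data).

```python
import random

random.seed(0)

# Hypothetical daily shirt sales under three candidate rack pairings.
# The figures are simulated for illustration; in practice they would
# come from an actual experiment such as an A/B test.
sales_by_pairing = {
    "shirts_with_pants":     [random.gauss(50, 5) for _ in range(30)],
    "shirts_with_ice_cream": [random.gauss(55, 5) for _ in range(30)],
    "shirts_alone":          [random.gauss(48, 5) for _ in range(30)],
}

def mean(xs):
    return sum(xs) / len(xs)

# Score every registered hypothesis, not only the expected one.
results = {name: mean(obs) for name, obs in sales_by_pairing.items()}
best = max(results, key=results.get)

for name, avg in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name}: mean daily sales = {avg:.1f}")
print("Best-performing pairing:", best)
```

The point of the structure, rather than the arithmetic, is that every alternative is committed to before the results are seen, so the unexpected pairing gets evaluated on equal footing.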
Selection Bias
Selection bias occurs when the data chosen for analysis is not properly randomized when it ought to be, so the sample fails to be representative of the population or data set.
Selection bias often arises in statistical analysis from sampling error or poor experimental design, but it can also occur when you report only the most favorable outcome of an analysis, or when you are picky about which data are selected for a meta-analysis. If you are analyzing the effectiveness of a new cancer drug with a meta-analysis, you should probably include all of the available literature, not only the studies that support your claim. When you are looking at sales data, you should probably include information from all departments, not only electronics.
Practical mitigation: When you are selecting data, have someone else select the data independently—ideally someone who does not share your goals for the analysis. If the two selections differ, discuss why your counterpart selected different data. Even if both parties are biased in their selections, the pair of biased selections and the discussion that follows will likely produce a less biased outcome.
The Dunning-Kruger Effect
Relatively unskilled people tend to rate their abilities higher than they really are, while experts tend to rate theirs lower. This bias occurs regularly in business. Underestimating themselves, competent workers can fail to sell effective solutions to company problems, while employees who don’t truly grasp the work will boldly offer poor solutions.
Practical mitigation: Have additional decision-makers take “for” and “against” positions to generate more discussion and critique of ideas. Requiring team members to play different roles and take different perspectives on the outcome of a given decision is likely to reduce the impact any single person has on the outcome, mitigate the consequences of unskilled decisions, and increase the likelihood that a good idea will rise out of the discussion.
The Availability Heuristic
The availability heuristic happens when we use the most immediate examples that come to mind instead of the most accurate or relevant ones. If a person gets into a traumatic car accident, they are likely to overestimate how frequently car accidents occur. In business, memories of failed projects may be easier to recall than successful ones, causing some otherwise excellent project ideas to be canceled before they begin.
Practical mitigation: Data science is all about conquering the availability heuristic with data. A decision-maker may imagine that they sell lots of soda because they recall selling it more often, but it may turn out that the most sold item is French fries. Use the data to combat (or confirm) assumptions.
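The soda-versus-fries example above amounts to replacing a remembered frequency with a counted one. A minimal sketch, using a small made-up transaction log (the item names and counts are illustrative assumptions):

```python
from collections import Counter

# Hypothetical transaction log; items and counts are illustrative only.
transactions = [
    "soda", "french_fries", "french_fries", "burger", "soda",
    "french_fries", "salad", "french_fries", "burger", "french_fries",
]

# A manager may *remember* selling soda most often,
# but counting the data gives the actual answer.
counts = Counter(transactions)
most_sold, n = counts.most_common(1)[0]
print(f"Most sold item: {most_sold} ({n} sales)")
# → Most sold item: french_fries (5 sales)
```

However vivid the memory of soda sales, the tally settles the question; the same pattern scales from a ten-item list to a full transaction database.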
The skills for analytics, cognitive bias mitigation, decision-making, and emotional intelligence can be developed and honed. Check out Analytics Guild's one-on-one coaching, our opportunities for developing your own analytics strategy, or our group resilience training.