From the things we say to the actions we take each day, our world, and that of business, comprises thousands of decisions, both big and small. How we come to make those decisions is the result of intuition and analysis and is, in most cases, influenced by biases that we may or may not be aware of.

We know about blind spots in decision making largely because of the work of Ethical Systems collaborators Max Bazerman and Ann Tenbrunsel. A recent graphic published in Business Insider Australia, and included below, depicts additional biases that all would be wise to learn about and attempt to guard against when analyzing ideas and programs.

20 Cognitive Biases That Screw Up Your Decisions

1. Anchoring bias: People are over-reliant on the first piece of information they hear. In a salary negotiation, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind.

2. Availability heuristic: People overestimate the importance of information that is available to them. A person might argue that smoking is not unhealthy because they know someone who lived to 100 and smoked three packs a day.

3. Bandwagon effect: The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink and is a reason why meetings are often unproductive.

4. Blind-spot bias: Failing to recognize your own cognitive biases is a bias in itself. People notice cognitive and motivational biases much more in others than in themselves.

5. Choice-supportive bias: When you choose something, you tend to feel positive about it, even if that choice has flaws. Like how you think your dog is awesome, even if it bites people every once in a while.

6. Clustering illusion: This is the tendency to see patterns in random events. It is key to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette wheel after a string of reds.

7. Confirmation bias: We tend to listen only to information that confirms our preconceptions, one of the many reasons it's so hard to have an intelligent conversation about climate change.

8. Conservatism bias: The tendency to favor prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.

9. Information bias: The tendency to seek information even when it does not affect action. More information is not always better; with less information, people can often make more accurate predictions.

10. Ostrich effect: The decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.

11. Outcome bias: Judging a decision based on the outcome rather than on how exactly the decision was made in the moment. Just because you won a lot in Vegas doesn't mean gambling your money was a smart decision.

12. Overconfidence: Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives. Experts are more prone to this bias than laypeople, since they are more convinced that they are right.

13. Placebo effect: When simply believing that something will have a certain effect on you causes it to have that effect. In medicine, people given fake pills often experience the same physiological effects as people given the real thing.

14. Pro-innovation bias: When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?

15. Recency: The tendency to weigh the latest information more heavily than older data. Investors often think the market will always look the way it looks today and make unwise decisions.

16. Salience: Our tendency to focus on the most easily recognizable features of a person or concept. When you think about dying, you might worry about being mauled by a lion, as opposed to what is statistically more likely, like dying in a car accident.

17. Selective perception: Allowing our expectations to influence how we perceive the world. An experiment involving a football game between students from two universities showed that one team saw the opposing team commit more infractions.

18. Stereotyping: Expecting a group or a person to have certain qualities without having real information about them. It allows us to quickly identify strangers as friends or enemies, but people tend to overuse and abuse it.

19. Survivorship bias: An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all those who failed.

20. Zero-risk bias: Sociologists have found that we love certainty, even if it's counterproductive. Eliminating risk entirely means there is no chance of harm being caused.

While Microsoft, Walmart, and Volkswagen have different operating and governance systems, they are alike in one key way: each is only as good as the people who run it. As we have seen, some are run more ethically than others, and people can let a variety of conscious and unconscious biases negatively affect their decisions.

Learning more about bias helps us recognize when we are being led down a path that may run counter to the most ethical course of action. The chart above can help identify these biases, making rational and ethical decision making easier and leading to greater organizational productivity, collaboration, and long-term success.

By: Jeremy Willinger, October 15, 2015.

This article is used by kind permission of Ethicalsystems.org.