Given the nature of our practice, we spend a great deal of time studying and thinking about how decisions are made, both good and bad. One of the things that influences our decision making, cognitive psychologists tell us, is our natural tendency to simplify information. Simpler information is easier to understand and remember. This can be a useful shortcut, but the same simplification can also distort the information and the decisions we base on it.
Here are a few key information quality errors (also called biases) to be aware of:
We tend to generalize, sometimes too much. Especially when dealing with large numbers of people or things, our minds help us organize and simplify by assigning a person or thing to a group or class we already know, a form of stereotyping. This lets us make quick guesses about the characteristics a person may possess, based on their similarity to the group. It becomes a problem when those assumed characteristics don’t accurately reflect the individual, the group, or both.
We use what is at hand, even when it is unwise. In making decisions, we tend to rely on the information immediately in front of us for snap analysis (developing a hypothesis or explanation), rather than asking what information a thoughtful decision actually requires. Psychologists call this the availability heuristic.
We get stuck in explanation ruts. Similarly, we often make an initial guess at an explanation for a situation and then, as new information becomes available, merely adjust from that starting point rather than rethinking the initial guess. This is known as anchoring.
We fill in the gaps, even when we shouldn’t. When given a list of related words (e.g., nurse, sick, medicine, health, hospital, office) and later asked to recall them, we will often insert a related word that was never on the list (in this case, “doctor”). This is a form of false memory.
We tend to blame people, not environments. Lacking good evidence one way or the other, we tend to blame internal rather than external causes for an event. For example, a person gets into a minor car accident. With no information about the cause, the human tendency is to assume the driver’s inattention or lack of skill caused the accident, rather than something like road conditions or a mechanical failure. This is the fundamental attribution error.
We think of ourselves first. When constructing an explanation for a situation in which one (or one’s company) is directly involved, there is a strong tendency to give one’s own role in the outcome (positive or negative) more weight than the evidence warrants. This is often called the self-serving bias.
We feel losses more than gains. When looking for explanations, and especially when making decisions with a material cost, the potential to lose something (e.g., money, status) tends to have an outsized impact on our thinking compared with the potential to gain something. We would rather avoid danger than pursue reward. This is known as loss aversion.
Understanding these and other mental shortcuts gives us the opportunity to rethink and re-examine our decision processes.