Thinking biases are also known as cognitive biases. They may prevent us from being rational.
We tend to be put off by individuals and groups that make us feel uncomfortable or insecure about our views — a discomfort known as cognitive dissonance.
This preference leads to the confirmation bias — seeking out only those perspectives that support our pre-existing views, while ignoring or dismissing opinions that threaten our world view, no matter how sincere or valid they are, or how strong the evidence provided.
It is important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).
Psychologists suggest that our cognitive biases help us process information more efficiently, especially in dangerous situations. However, they can also lead us to make grave mistakes.
We may be prone to such errors in judgment, but at least we can be aware of them and aim to avoid them.
Here are some important thinking biases to keep in mind.
Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same.
In-group bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
Out-group homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
Dunning-Kruger effect — “…when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, …they are left with the mistaken impression that they are doing just fine.” (See also the Lake Wobegon effect and the overconfidence effect.)
Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
Ultimate attribution error — A sub-type of the fundamental attribution error above, the ultimate attribution error occurs when negative behavior in one’s own group is explained away as circumstantial, but negative behavior among outsiders is believed to be evidence of flaws in character.
False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest.
Ludic fallacy (sometimes called ludic bias) — analyzing chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many real-world phenomena.
Confabulation or false memory — Remembering something that never actually happened.
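The ludic fallacy's point about non-Gaussian distributions can be made concrete with a small simulation (a minimal sketch, not from the article; the distributions, sample size, and threshold are illustrative assumptions). Under a Gaussian "game-like" model, an event six standard deviations out is essentially impossible; under a heavy-tailed model, comparable extremes are routine.

```python
import random

random.seed(42)
N = 100_000

# "Game-like" assumption: outcomes follow a thin-tailed Gaussian (mean 0, sd 1).
gaussian = [random.gauss(0, 1) for _ in range(N)]

# Heavy-tailed alternative: Pareto draws with shape alpha = 2,
# a stand-in for real-world quantities like wealth or market moves.
heavy = [random.paretovariate(2) for _ in range(N)]

# Count "extreme" events beyond 6 units.
extreme_gauss = sum(1 for x in gaussian if abs(x) > 6)
extreme_heavy = sum(1 for x in heavy if x > 6)

print(extreme_gauss, extreme_heavy)
```

With these assumptions, the Gaussian sample produces no extremes at all in 100,000 draws (a 6-sigma event has probability on the order of one in a billion), while the Pareto sample produces thousands, since P(X > 6) = 6⁻² ≈ 2.8% for this shape parameter. Judging the second process by the rules of the first is exactly the miscalculation the fallacy describes.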
Understanding how these biases operate may help you make better decisions, both at home and at work, and better understand your own mind.