Unconscious Bias and how it can affect you

If I were to meet you, I would immediately make a huge number of assumptions about you. They would be based on what I 'know' about you and on my experiences of people I had met or heard about who I felt might be similar to you. We are susceptible to an anchoring bias, where first impressions set the scene for everything that follows. Having made all these judgements about you, I would then look to confirm my biases about you, either in a positive or a negative way. You would do the same to me. We may be right about some aspects of each other, but we will certainly be wrong about others.

I will be much better at spotting your biases than my own. I may dismiss you as stupid or lazy because you clearly don't understand things. This then becomes the lens through which I view you, and I will continue to confirm my bias about you, and you about me. If your ideas align with mine, we will probably find ourselves susceptible to in-group bias; if they differ, to out-group bias.

We are not rational beings, although we may be predictable. We all have biases we are not aware of that influence our decisions. A bias is different to a logical fallacy: we can learn to improve our logical thinking with practice and by following routines, but our biases are often hidden from us. No one considers themselves to be prejudiced. 'I'm not sexist/racist, but…' is usually followed by 'evidence' that justifies the stance.

We need to make assumptions in order to survive and function. We rapidly take logical shortcuts (heuristics) that for the most part meet our needs. The problems arise when our unconscious biases lead to poor decision making and to others being discriminated against. Dealing with our bias is fraught with difficulty. Consider this quote from the Nobel Prize winner Daniel Kahneman in Thinking, Fast and Slow:

 “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognise situations in which errors are likely….And I have made much more progress in recognising the errors of others than my own.”

You can test your own biases using the Harvard Project Implicit website. Note the warnings that you may encounter an interpretation that doesn't fit with your view of yourself.

I have used the IAT with several groups I have delivered training to, and have had angry people telling me, without a hint of irony, that it isn't accurate because they don't agree with its findings! Note that it doesn't claim to be valid, nor can it ever be proven to be or not to be. We are not great at judging ourselves; have a read of You Are Not So Smart.

Research suggests we reject information that we are uncomfortable with. Support for a political candidate does not appear to be reduced when that candidate's statements are shown by a fact checker to be unverified. Research on Donald Trump reveals worrying trends. The question goes beyond whether or not something is true to whether it even matters if it is true.

It seems reasonable that we should be able to change someone's opinions with rationality, yet being presented with information that goes against a belief system doesn't change that belief system. See the work of Haidt in The Righteous Mind. Numerate people, when presented with data, should interpret it in the same way as each other, and this is generally the case if the subject is not emotive. For example, when asked to judge whether or not a new soap causes rashes, most people agree that the evidence suggests it does. Give them exactly the same data on an emotive issue such as gun control, immigration or Brexit, however, and the interpretation becomes biased: we are more likely to get the analysis correct if it supports our existing belief. The same effect happens for those who were for, or against, Brexit. Possibly the most depressing research ever if you hope for a rational future! Graph taken from https://www.onlineprivacyfoundation.org/opf-research/psychological-biases/psychology-and-the-eu-referendum/

Redlawsk, Civettini, and Emerson (2010) looked into this further in "The Affective Tipping Point: Do Motivated Reasoners Ever 'Get It'?", published in the journal Political Psychology, 31(4). They note that "Recent research has convincingly shown that emotions play an important part in most decision-making realms" (p. 564), and that those with an emotional bias (motivated reasoners) toward an issue may even become more strident in their beliefs as more negative information is presented.

There is a limit to how far we will go, though. As the graph below indicates, someone with emotional support for a person or cause will use motivated reasoning to strengthen their opinion in the light of negative information. At the affective tipping point, rationality seems to kick in and the criticism starts to take effect. Eventually, the weight of evidence leads to a negative perception.

So why should we care about biases if we can’t really prevent them?

If we understand that we might not be considering all the facts, then we are less likely to be arrogant in our assumptions. We can learn to be more humble in our assertions. Applying scientific thinking and trying to remove emotion can help us make better, more rational decisions.

I have come to the conclusion that I actually know nothing, really!

A handy list of biases is below!

Taken from http://mentalfloss.com/article/68705/20-cognitive-biases-affect-your-decisions
