Being part of a group means adopting many of the ideas it holds. This is the foundation of collaboration and of society itself. However, it is also something that can cloud your judgement. An attack on the beliefs of the group can be perceived as an attack on your identity, and so your defence mechanisms and biases kick in. You may disregard sound evidence simply because it contradicts the beliefs of your group. I avoid identifying with any particular ideology because I respect the possibility that I, like most other humans, will develop biases that ultimately cloud my judgement of new information – and thus of truth.
Have you ever been wrong about something? Of course, you have. And before you realised you were wrong, did you think that you were right? Yes.
You changed your mind because you learned something new. This means we should always be prepared to change our minds – there is almost always more knowledge out there. Take the temperature at which water boils as an example. You may boil water a thousand times, using various instruments, travelling around Europe boiling water everywhere, using water drawn from different lakes. In every experiment you try, water boils at 100 degrees Celsius. That is, unless you decide to climb Mont Blanc (4,810 m), where you would find that water boils at around 84 degrees.
And so we must stay humble and expect that our views of reality are incomplete. Regardless of how many years you have studied at university, and no matter how many years of experience you have in the business, I believe this ethos remains true. As the saying often attributed to Mark Twain goes: “What gets us into trouble is not what we know for sure, but what we think we know that just ain’t so.” The real danger lies in being too confident, because that is when you stop looking for your blind spots. And there will always be blind spots.
I believe in openly and regularly expressing one’s views – and subsequently changing them in light of new information and insights. As the philosopher Charles Sanders Peirce wrote:
“The rationality of the scientific method does not depend on the certainty of its conclusions, but on its self-corrective character: by continued application of the method science can detect and correct its own mistakes, and thus eventually lead us to the discovery of truth”.
Being scientific in one’s worldview lies not in always “using the facts”, but rather in how one approaches the truth. To be scientific is to constantly test one’s currently held ideas, to seek out evidence that would disqualify rather than reaffirm those beliefs, and to wholeheartedly embrace evidence that contradicts them. John Maynard Keynes, when accused by a political opponent of changing his opinion on an economic issue, is said to have responded: “When the facts change, I change my mind – what do you do, sir?”
But this stance becomes increasingly difficult to hold the fonder I am of my own ideas. This is why I try to maintain a “healthy distance” from the ideas I hold. Take the CEO of a company, for example. The CEO shouldn’t get too close to his workers, because he may one day have to fire some of them. If the day comes when he is forced to lay off a number of employees, it is better that he has maintained a professional relationship with them, so that he can properly assess whom to let go. I am the CEO of a company whose main purpose is truth. My workers are various ideas about the nature of reality, and so I must not allow myself to become attached to any one of them.
This detachment certainly does not come easier if you choose to label yourself a “leftist”, “pro-market”, “green” or “conservative”. I am not claiming that people’s beliefs can’t be categorised; I am questioning the usefulness of doing so. I do believe it is wise to avoid identifying too strongly with any particular dogma. We humans tend to conform to the beliefs held by the groups we belong to. That is how “groupthink” is born, and I believe many of the greatest mistakes in human history can be attributed to such dynamics. I recognise that I am most likely no different from the majority of humanity: I too will tend to conform to the ideas and beliefs of whatever group I belong to. Therefore I will avoid ideological labels of all kinds for as long as I possibly can. The man blinded by his love for hammers sees nails everywhere.
To summarise: the quest for truth requires us to stay humble about the possibility that we are wrong. There may always be new information out there that would alter our beliefs. But staying objective and receptive to this new information is difficult. We have biases that attach us to our current beliefs, and many of them are unavoidable. However, the biases arising from belonging to, and self-identifying with, a particular ideological group are avoidable. If a piece of information is a lethal attack on the group (ideology) to which you subscribe, and membership of that group is part of your identity, then your judgement of that information will be clouded. The very act of labelling myself as “X” or “Y” is part of the identity formation that can eventually lead to this clouding of judgement. So even though other people may categorise me as “X” or “Y”, I will avoid doing so myself, because that would make “X” or “Y” part of my identity. And anything that is part of my identity becomes difficult to renounce in light of new evidence.