In January 1995, a Pittsburgh man confidently robbed two banks in succession, in broad daylight, wielding a semi-automatic handgun – and wearing no disguise. Despite this, he was genuinely stunned when police, having clearly identified him from the camera footage, arrested him at his home. "But I wore the juice!", he yelped incredulously. He reasoned that if lemon juice can be used as invisible ink, then surely smearing it all over his face would let him pull off the crimes incognito.
Oh, how fragile the bubble of ignorant bliss! The story caught the attention of David Dunning, a Cornell professor of social psychology, who wondered: if this man was too ignorant to be a competent bank robber, perhaps he was also too ignorant to know that he was too ignorant to be a bank robber.
Dunning and his graduate student, Justin Kruger, published a paper on the phenomenon, postulating that confidence is inversely related to competence – the less you know, the more you think you know, and vice versa. This has become known as the Dunning-Kruger effect, and it is far more prevalent than any of us would like to believe.
Could the things we don't know hurt us? It seems so. But in the information age, where knowledge sits at our fingertips, the choice is ours: to know or not to know?
Hear no evil, speak no evil
Then again, the more we learn, the more we tend to realise how little we know. Too much knowledge can hamper our ability to act – we've all heard of paralysis by analysis. When presented with all the choices, theories, gaps and shortcomings on a particular subject, it can seem impossible to make the best decision and move forward.
Winston Churchill's maxim "Nothing avails but perfection" may be spelt shorter: "paralysis". Companies that exclusively or excessively use the analytical approach to problem solving face the danger of getting stuck in an endless circle of planning with little to no action. "Cognitive over-stimulation", "data smog", "infobesity", "information overload" – these are all modern-day terms for the double-edged sword that is big data.
FOMO's shady twin, FOFO (the Fear of Finding Out), has lately been making its appearance everywhere, from the medical, financial and marketing spheres all the way into private homes. It is a psychological barrier that stops people from investigating a potential problem because they're afraid of what they might discover.
It is where ever-expanding knowledge meets decision fatigue and the real psychological pain of having to deal with information you would rather ignore. It is, in essence, being afraid of finding out the truth, good or bad, because you don't want to deal with the potential consequences.
The hunch that you have a medical condition, the nagging feeling that your finances are not really in order, or that you'll receive negative feedback on a new product in development – and the dread that it might be confirmed. Some marketing departments can be particularly haunted by the feeling that the statistics might expose their strategies as nothing but hot air (when in fact research shows the opposite to be true). We live in a world where we have access to all the right answers – but are we brave enough to ask the right questions?
The emperor has no clothes. So what?
Another name given to such behaviour by business ethics professors is "motivated blindness". When it is in someone's interest (or so they think) to remain ignorant, they will see only what they choose to see – much like confirmation bias. Research has found that people will choose to ignore or avoid bad news even when they know it could help them make better decisions. It's not that they have an aversion to the truth; it's that they simply don't want it enough to pay the psychological and emotional costs of hearing it.
"The truth can hurt, but candour strengthens while ignorance weakens." The main cost of hearing the truth is often psychological when, in general, it is less of a threat than ignorance. Sometimes the cost can involve ostracism and boycotts, as well as ripple effects on your standing, autonomy and access to resources. Yet not speaking up usually only prolongs the inevitable.
Luckily, engineers are curious folk by nature, and their university training only sharpens their bloodhound instinct for uncovering problems. FOFO, however, is often less about the courage to speak up and more about the humility and bravery to listen up. When leaders allow their ego to dominate decision-making, the truth may get quashed lest it reveal any failures or lapses on their part.
As an antidote to this fear, the law can take you as far as compliance. Yet honesty and a free flow of information will go much further in securing company success and personal and professional growth.
Where there's smoke
When it comes to sifting through all the noise to get to the core of things, it helps to delegate and seek trusted, expert counsel. In the words of Steve Jobs: "It doesn't make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do." Managing information is a learned skill, and senior executives need to find the Goldilocks point on the information curve: the sweet spot of having neither too little nor too much information to make the best possible decision.
When the boards and leaders of organisations are afflicted with FOFO, it can have dire consequences. Take Volkswagen, for example. It had the expertise: driven, intelligent leaders and capable employees. But it also had a "no-failure" culture – a psychologically unsafe space where people couldn't speak their minds to the powers that be. Before Dieselgate, the company suffered under leaders who threatened to fire managers and engineers if they couldn't deliver world-class products in impossible time frames. Cheating and cover-ups were just waiting to happen.
Turkey is another painful example. The head of the Chamber of Civil Engineers had warned that construction amnesties would turn Turkish cities into graveyards before a residential building in Istanbul collapsed in 2019. Many believe the problem was largely ignored because it would have been too expensive and unpopular to address.
Thinking about thinking
In organisations with a culture of avoiding (perceived) negative feedback, you'll probably also find employees who lack drive and motivation. Where openness and trust are reduced, so are creativity, innovation and productivity – along with low morale and high staff turnover.
Risk-averse organisations with rigid, multi-tiered hierarchies are the most likely to suffer from FOFO, particularly among upper and senior leadership. On the flip side, creating an environment where employees feel comfortable asking questions and safe to challenge the status quo helps foster a culture of inquiry and transparency, which leads to better decision-making, better problem-solving and increased employee engagement.
The power of peer review is even reaching into primary schools nowadays. Children are encouraged to learn from each other, challenging the cancel-culture mentality and learning to take criticism on the chin. This type of learning builds collaboration, communication and metacognition – thinking about your own thinking. A most valuable skill for anyone to have. The fear of discovering the truth is often worse than the truth itself.
Sometimes there is just no other choice but to take the red pill. After all: you can douse your face in lemon juice all you want but, in the end, it will only sting your eyes – and maybe get you jail time.