Clear thinking in HRM

Summary: In this article, Robin Chater argues that our ability to think clearly derives not primarily from our innate abilities, but from the largely cultural environment within which we grow up. This ability is also strongly dependent upon our sense of identity. Unless we understand who we are, we cannot relate to the world around us and make full sense of it. The problem is that the human cultural environment (social mores, family, religion, fashion and so on) is filled with a huge number of ready-made solutions, and these serve to cloud our thinking. Very few people are able to break free from the bonds of culture and function at their full potential.

**************

In our corporate lives, we all wish to think clearly, because we want to make wise decisions. Such decisions are good both for the business we work within and for our own careers. After all, we strongly desire to be seen by those around us as someone whose judgement is sound.

However, if we step back a little, we all know that things are not quite that simple.

▶ There could be several acceptable outcomes from available courses of action. Our own choice may just be one of them.

▶ A wise decision is not always an acceptable one. A decision may be the best option available, but colleagues may not agree with it for a number of reasons – their own self-interest, risk aversion, prevailing prejudices, or a belief that the option would not be acceptable to someone in a more senior position.

▶ We do not always achieve recognition for our wise decisions.

But let us step further back from such considerations for a while and just focus on clear thinking itself. To put it philosophically, “clear thinking” is a necessary, but not sufficient, condition of sound decision making. Without it, decisions would be haphazard, inconsistent and probably unfair. But thinking clearly does not guarantee that an outcome will be successful. It is, however, so critical that it is worth devoting a lot of time to developing and applying it.

❖ The development of identity

The reader is probably familiar with “Plato’s Cave”, an allegory used by the Greek philosopher to demonstrate that most people do not perceive reality, but only dim shadows of the truth. It is as true today as it was more than two millennia ago.

Let us start with the “who” that is at the heart of “clear thinking”. Being smart does not just happen by itself, nor does it have any meaning dissociated from the person being smart. What is at its heart is identity – your identity.

Contrary to much modern thinking, a person’s identity does not essentially arise from an individual’s ancestry, or their DNA. Neither does it become visible through a personality test. It may seem trite, but each of us is genuinely unique. We are not some predefined construct, put together by a psychologist who devises a test. In the same way, none of us is predefined by our family and all the people in our ancestral line. The paradox is that distant ancestors are all strangers to us – but if we go back far enough we are all related.

An illustration of why identity is different from DNA is the fact that Jewish people have a widely mixed set of DNA through intermarriage, yet they have a very strong and identifiable collective identity. Identity is cultural, social and, above all, psychological. When we are born we have a number of innate characteristics that give us an aptitude to play an instrument, or climb mountains. But most of the things that define “who” we are are shaped by the broader functioning of our brains and the experience that tests and helps shape this functioning.

We generally believe, for instance, that eyesight is something people either have or lack, on a continuum running down through poor vision to blindness. This is not so. We are all essentially blind; it is just that some of us are less blind than others. At one time, it was thought that vision was all processed in one part of the brain. We now know this is simply not true. Everything that comes into our brain from our eyes is first split. One copy goes to our “old brain” for processing in a small blob just above the top of our spinal cord. This is the primal processor. It tells us if something is just about to pounce on us, or presents a potential danger. It does not identify or explain what that thing is. We process this unconsciously and cannot account for what is happening to our reflexes. The other, parallel stream of data goes to some 30 places in our newer, upper brain, which start to interpret it. There are many interconnections and attempts to recognise the items from previous brain functioning that has been retained. This is the clever, but also highly fallible, part of our mental processing. Some people have great inter-connectivity, others far less. But even where there is a gap or malfunction, the brain tries to fill the gap with what it can. All of us are imperfect, but some less than others. The important point, however, is that the brain can be trained to improve its functioning.

All our senses work in a similar way to vision, with receptors sending signals along neurons that bring all the sensory perceptions together.

Perception is the first building block of identity. We are what we perceive. Beyond that, we develop our identity along one dimension. We either let our mental faculties take in ready-made solutions from outside us – traditions, habits, sayings, stories, morals, beliefs and a sense of ourselves – or we work autonomously with our perceptions to build up our sense of identity relative to the world around us. Of course, for most people there is no chance that they will be left to build up their own picture of the world and make sense of it. They are given guidance by parents, schools, religions, friends, political parties, news media, social media and advertisers. They are fed ready-made solutions that numb their sense of inner identity.

Although identity starts to take shape from an early age, it usually takes a personal revelation for it to adopt a distinctive shape. These moments in a person’s life are profound. The first is the “existential moment”, the sudden realization that we are all mortal; the second is the “self-awareness moment”, when we realize that our knowledge and values define us, sense our existence and its relation to the world around us and become, at last, a free-standing being. The existential moment can take place as early as infancy, whilst self-awareness often arrives at around age 18, on going to university or taking up a job for the first time. However, for many people neither moment ever arises. As the two are reliant on each other, denial of the “existential moment” can blunt an individual completely. The most common way this happens is through exposure to a religious belief system at an early age, where the concept of an “afterlife” fills the gap that should otherwise be filled by the illuminating shock of mortality.

There is also the well-known phenomenon of “imposter syndrome”. This is common in high achievers, but can arise in almost anyone. It consists of a sense that we are not where we are through merit, that all we do is a fraud and that it is only a matter of time before we are found out. This phenomenon often arises amongst intelligent people who have resisted the influences of accepted thinking. They may have had a working-class upbringing, so they have had to break loose from their surroundings; or they may have grown up in a devout Catholic family, but decided to take an atheist course. They have thus put themselves in the position of an outsider. The outsider’s position is often a huge advantage in senior management, but it leaves the individual alone to steer, without a clear social or cultural rudder.

Another crisis of identity that arises for those who try to form their own identities is one of moral flux. If there is no God, or ultimate authority figure, to define a proper course of action, the result can be confusion, or just a huge void. Faced with a dilemma, such as whether to own up to the mistaken transfer of a large sum of money into their bank account, they think only about whether they will be “found out”, rather than what the consequences of dishonesty will indicate about themselves to their inner sense of being, or how much damage it could do to others and to society itself.

Finally, there are those who suffer from such severe malfunctions of their system of perception and broader mental functioning that it affects their own sense of self in a fundamental way.

Thus, we can see that a sense of identity is not, in any way, predetermined, or something that can be put down to ancestry or classification into a particular human race or type. It starts with two things – the functioning of human perception and the degree to which someone is influenced by the wider social and cultural environment around them. Just because someone is brought up to be “middle class” or “Chinese”, or in a family of criminals, does not mean that they will necessarily develop internally in any particular way. It is ultimately their ability to perceive, distinguish and resist outside influences that counts. That is why it is much easier, and probably preferable, to be neglected as a child than to be brought up in a family that offers blind devotion to any fundamental faith or political creed.

❖ Language – growth and degeneration

Language is an integral part of our culture and one reason why Homo sapiens has been able to develop its knowledge of the Universe and become so advanced and highly civilized. A large vocabulary helps us to think more accurately and subtly about phenomena. It also helps us to share ideas and think creatively. Just as the “Flynn effect” charted the growth of IQ scores over the 20th century, the wider availability of education has helped successive generations to improve their language capabilities. But, as with IQ, this trend appears to be going into reverse in the present century. The language used by “Generation Y” and those growing up after them is becoming more limited, uglier and far less sophisticated. Thus we have the emergence of “like” in almost every sentence, “sort of”, “you know”, “as I said”, “ta” instead of “to” and “fur” instead of “for”, “I was sat”, and the turning of verbs into nouns – as in “that was a big ask”.

The effect of these new, more casual and highly repetitive language forms is to reduce communication to a far more superficial activity, with little care for aesthetics or novelty of expression. These language forms are also spreading between generations, assisted by their take-up in the media – such as the BBC – and by the fact that they are now the lingua franca of almost all sporting communities.

❖ Cognitive bias

The greatest ongoing threat to clear thinking is cognitive bias. This exists because an individual’s sense of identity is seriously malfunctioning – either because of the way that their perceptions are being formed, or because of the overall way their brain works to save effort when thinking about problems.

Faced with an overload of information coming at us, we all develop coping strategies, and these usually take the form of mental shortcuts. We start noticing things that we are culturally tuned to pick out – like news about a favourite football team. We also remember unusual things more readily than familiar things – such as the record price for the transfer of a footballer whose name we do not even know. We like to select things that reinforce our own prejudices, and see faults in others more than in ourselves, or those close to us.

Into the deluge of data coming at us we therefore try to discover meaning. This is not always possible, and many of us are content to allow huge holes to exist in our mental frameworks of meaning. This is particularly true in areas such as politics, geography, science and art. We are content to let what we do not know just sit outside our sphere of interest. We do not see this lack of interest as a problem, because these subjects are not seen as critical or relevant to our everyday lives. We also get by with avoiding specifics and being content with generalizations, stereotypes or vague notions about time and place. We also instinctively believe in probabilities derived from “common sense” that serve everyday needs, even though they are entirely wrong – such as believing the probability of death in an aircraft accident is greater than in a car. Moreover, things that affect people close to us will generally lead to the filling of a knowledge gap, even though the new information gained will be bereft of context, so its true meaning or significance will be lost to us.

We also generally believe (contrary to those with imposter syndrome) that we know more than we actually do – especially about others. This is reinforced by popular journalists who simplify all ideas and events into pre-digested thoughts. We also have a tendency to want to finish things we are doing, even though they may have been overtaken by events, or have objectives that were badly defined at the outset. One way to avoid having to change course is to set aims and tasks that are low risk. It then does not matter if they fail to answer the key problems of the moment. It is, for instance, good enough to say that the matter has been looked into – even though, on close inspection, it has not.

We are also often under great pressure to come up with rapid solutions. This encourages mental short-cuts and a tendency to tackle obvious and immediate problems, rather than the more fundamental phenomena that underlie them. Furthermore, we live in a constant state of uncertainty about what to take away from a human experience. We prune everything back to mental bullet points and then, on reflection, add details about things that did not come from that experience at all. Days later, these merge into the memory as if they actually happened. In the end, our memory persists only in respect of the stories we tell and the way events and information relate directly to us. Although we do this, to a large extent, all the time, we ultimately sift all reality through the filter of our own identity.

❖ First steps

So, what hope is there for wisdom to emerge, for clear thinking to guide our decision making? Well, for sure, it is not going to happen unless we spend a lot of time and effort working on skills such as logic, objectivity and proportionality, and avoid the numerous fallacies from which most of humanity suffers.

Here are the twelve most common paradigm paralyses – mistakes we can all make when thinking about a problem. It is necessary to keep these constantly in mind in order to cultivate clear thinking.

1. The Availability Heuristic: This was first identified by Tversky and Kahneman in 1973. Put simply, it refers to the tendency most of us have to evaluate something in terms of whatever comes most easily to mind. If a doctor, for instance, has recently diagnosed two people with a rare disease, they will tend to diagnose it again in subsequent patients with similar symptoms. Tversky and Kahneman ran the “K” test on a number of groups – asking them whether there were more words that began with “k” or more that had “k” as the third letter. The majority of people chose the first-letter option, as they could immediately think of “kitchen”, “kite” and so on. However, there are actually around twice as many words in the English language with “k” as the third letter than as the first.
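The claim is easy to test for yourself. Below is a minimal sketch in Python that counts both cases from a plain-text word list; it assumes a file such as /usr/share/dict/words (present on many Unix systems, and a stand-in here for any one-word-per-line list), and the exact ratio will vary with the list used.

```python
# Minimal sketch of the "K" test: count dictionary words with "k" as the
# first letter versus the third letter. The word-list path is an assumption;
# any plain-text file with one word per line will do.
WORDLIST = "/usr/share/dict/words"  # assumed path - adjust for your system

first = third = 0
with open(WORDLIST, encoding="utf-8") as f:
    for line in f:
        word = line.strip().lower()
        if len(word) >= 3 and word.isalpha():
            if word[0] == "k":
                first += 1
            if word[2] == "k":
                third += 1

print(f"'k' as first letter: {first}")
print(f"'k' as third letter: {third}")
```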

2. The Egocentric Construction of Reality: For most people the World is not nearly 25,000 miles in circumference, or thronged with nearly 8 billion people. It consists almost certainly of a single country, perhaps a city and maybe even just a single street. It is also made up essentially of immediate family and close friends, and normally fewer than one hundred colleagues, cousins and acquaintances in an outer circle. Events can penetrate this world, but unless they endanger it, or greatly benefit it, they will be considered to have little fundamental importance. Thus reality consists of the values and phenomena that inhabit this narrowly personal world. Abstract concepts sit uncomfortably within its confines and all forms of negativity are generally avoided, or gossiped out of existence. If someone within the circle, or a closely allied circle, is – for instance – attacked on the street by someone of a different race, then the experience will be magnified into a general prejudice. Similarly, tragedies affecting others will only really matter if they occur within the circle and not outside it. The egocentric reality is therefore always narrowly defined, distorted and never reliable in any objective sense, and it can be highly dangerous if it is allowed to determine the fate of others in the wider World around it. This is, for instance, the primary weakness of democracy.

3. The Immediacy Impulse: It is a well-known phenomenon that middle-class parents have fostered what has become known as “delayed gratification” in their children. This is the ability to see beyond the immediate time, exam hurdles and privations along the way to a better time in the further distance. In earlier times, leading educational institutions also bred into the offspring of the rich a notion of the “greater good” – that some otherwise unacceptable things had to be tolerated so that a “higher outcome” could be achieved, or sustained. These values ended with the Millennial generation. Unfortunately, this also meant that there was no longer a need to be concerned with such things as historical context, the slow adaptability of people to change, or even strategic considerations – used by or against us. Things are thus judged in terms of immediate payback and narrow interests. One example of this is the reaction of many people to the 2020 pandemic lockdowns in Michigan, USA – leading to violent protests and threats in spite of the high number of deaths from the virus justifying the lockdown.

4. The Naturalistic Fallacy: This is a common tendency, rooted in the “is–ought” problem described by the philosopher David Hume and later named the naturalistic fallacy by G. E. Moore. It underpins many of the assumptions in religious morality, sociology and conventions of all kinds. The fallacy consists of assuming that what “is” the case “should” be the case. In other words, if most people liked yellow roses, it might be assumed that yellow roses were superior in some way. In fact, the mere fact that something exists does not mean that it has some innate moral or practical superiority. An example of this is the continued importance given to job interviews, even though other individual methods of recruitment selection actually produce better overall outcomes. The fact that every employer interviews does not mean that the interview is some kind of imperative for HR professionals.

5. Determinism: There is a pseudo-scientific assumption that the whole world is subject to “cause and effect”. However, even the most well-proven scientific discoveries still contain degrees of uncertainty. If existence were deterministic, then the fact that a child was abused, or lived in great poverty, could readily be used to explain their phobias or criminality in later life. Such outcomes would be inescapable. But there are numerous cases of perfectly psychologically stable and honest people who had difficult childhoods. To accept determinism is, in fact, to reject “free will”: everything anyone did could then be ascribed to its inevitability, rather than to the responsibility of the individual. Instead of determinism it is far more effective to think in terms of probability. This allows for certain factors to be involved in producing an outcome, but only to a partial extent and virtually never in some single causal chain. In all chains of causality, moreover, there is going to be a confidence limit for every relationship. Even probability, however, can produce some odd results. The fact that, for instance, someone in Boston died of measles every time it rained in Denver would not mean there was any link between those variables at all. For a link to exist there must be some pre-existing logical connection between the events. This may not be immediately obvious, but on investigation it can generally be found.
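To see why coincidence alone cannot establish a link, consider how easily chance produces strong-looking correlations. The sketch below (an illustration, not from the article) generates a thousand random series that have nothing to do with a random “target” series, yet some of them still correlate noticeably with it; the numbers and the numpy dependency are assumptions for the demonstration.

```python
# Spurious correlation sketch: among many unrelated random series, some will
# correlate with a target purely by chance, which is why a logical connection
# must exist before a statistical link means anything.
import numpy as np

rng = np.random.default_rng(0)
target = rng.normal(size=50)              # e.g. monthly "deaths in Boston"
candidates = rng.normal(size=(1000, 50))  # 1,000 unrelated series, e.g. rainfall records

# Correlation of each unrelated series with the target
corrs = np.array([np.corrcoef(c, target)[0, 1] for c in candidates])

print(f"Strongest spurious correlation: {np.abs(corrs).max():.2f}")
print(f"Series with |r| > 0.3 by chance alone: {(np.abs(corrs) > 0.3).sum()}")
```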

6. Zero-Sum: Once again, a favourite mistake of “concrete thinkers”, such as those in the trade union movement, who like to divide all human experience into rigid categories – good/bad, practical/impractical, socialist/capitalist, us/them. A famous example of this thinking was that adopted by officials and MEPs during the drafting and debate of the EU Working Time Directive. It was argued that long working hours were correlated with stress and ill health at work. But no actual studies were undertaken to determine this, and a maximum weekly working time figure was chosen not by reference to scientific evidence, but from an ancient ILO figure that was itself plucked out of the air. Another favourite zero-sum approach to a problem is to declare that if something does not work perfectly, then it is better not to use it at all. A case in point is the use of the temperature check during the 2020 coronavirus pandemic. Because the check does not pick up all cases of infection, it was declared useless in many national guidelines on a return to work. However, if it picks up some cases, it is logically more use than if no check at all were used. The fact that it produces false negatives – missing infected people who have no fever – might be a slight argument against relying on it, but it would still pick up some people with a serious temperature-inducing illness and would therefore be of value.
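A rough back-of-the-envelope calculation makes the point. The figures below are purely hypothetical (they are not from the article): even a check that catches well under half of infectious arrivals still intercepts several people who would otherwise have walked straight in.

```python
# Imperfect screening sketch: an imperfect check still beats no check at all.
def expected_cases_caught(people_screened, prevalence, sensitivity):
    """Expected number of infectious people detected at the door."""
    infectious = people_screened * prevalence
    return infectious * sensitivity

# Hypothetical numbers: 2,000 staff, 1% infectious, and a temperature check
# that flags only 40% of cases (many infections cause no fever).
caught = expected_cases_caught(2000, 0.01, 0.40)
missed = expected_cases_caught(2000, 0.01, 1.00) - caught

print(f"Expected cases intercepted: {caught:.0f}")   # ~8
print(f"Expected cases still missed: {missed:.0f}")  # ~12 - imperfect, not useless
```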

7. Disproportionality: We have all felt irritated at times over small things in our lives, especially when what we intended backfires on us, or when we act while tired. But there is a more fundamental block to clear thinking that occurs when people lack a sense of which values, outcomes, means, methods, hopes and fears are truly important and which are relatively unimportant. This is so often the traditional administrator’s weakness – caring more about whether a sub-clause is in place than whether the whole document is drafted correctly. It is equally what the philosopher Bertrand Russell meant when he said that many schools cared more about whether a pupil swore than whether they committed a cruel act. An important skill for the HR Manager is knowing which aspects of their job are key to the success of the business and which merely have ritual meaning, or are simply “safe” because they conform. The modern HR professional is, above all, the one person in the management structure of a company (apart from the Chairman) who can step back and take a wider view. They are thus frequently the conscience of the organisation, and they are able to be this because they maintain a finely tuned sense of proportionality.

8. The Gambler’s Perspective: Virtually all gamblers lose – some more than others. But in many ways, we are all gamblers – we take out insurance against lost luggage or ill health, and decide one course of action rather than another by recalling how we last acted in similar situations. We know there is no guarantee of success, but we trust that experience helps to produce a desired outcome. However, the most common mistake people make when they bet in a casino, at the racetrack or in daily business life is to assume that a run of chance outcomes cannot be sustained. For instance, if we roll a die and a six comes up three consecutive times, the gambler will assume that the chance of a six appearing a fourth consecutive time is reduced. It is, in fact, exactly the same as on every previous roll. Each roll is a distinct event, and the chances of a particular number coming up are the same at each throw.

We all fall for the gambler’s fallacy in everyday life when we choose from available options. We usually do so in a totally unconscious way – so it does not feel like a fallacy. Of course, the converse is also true – when people claim they have a “lucky number”. But actually, in most cases, that is less of a fallacy than the sequencing assumption – unless someone else knows your “lucky number” tendency and feeds that choice intentionally, or gambles against it. Curiously, the world’s most favoured numbers are 7, 3 and 8, but jointly they account for just under a quarter of personal favourites. So that is not much help around a roulette – or collective bargaining – table.
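The independence of each roll is easy to verify by simulation. The sketch below (an illustration, not from the article) rolls a die a few million times and checks the chance of a six immediately after three consecutive sixes; it comes out at roughly one in six, the same as the unconditional chance.

```python
# Gambler's fallacy sketch: the chance of a six after three consecutive sixes
# is the same as the chance of a six on any roll.
import random

random.seed(1)
rolls = [random.randint(1, 6) for _ in range(5_000_000)]

after_triple = 0
sixes_after_triple = 0
for i in range(3, len(rolls)):
    if rolls[i - 3] == rolls[i - 2] == rolls[i - 1] == 6:
        after_triple += 1
        if rolls[i] == 6:
            sixes_after_triple += 1

print(f"Runs of three sixes observed: {after_triple}")
print(f"P(six | three sixes just rolled) ~ {sixes_after_triple / after_triple:.3f}")
print(f"Unconditional P(six) ~ {rolls.count(6) / len(rolls):.3f}")
```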

9. The Berkson Paradox: This is a common error amongst professionals who falsely believe they are operating in an objective and scientific way. It is the detection of a false negative correlation between two factors – for instance, that trade union activism is negatively correlated with career progress. It would seem to make sense, because a person denied an opportunity may adopt negative attitudes, or develop career aspirations in another institution (i.e. a trade union). However, this overlooks the trade union activists who lack career aspirations and are perfectly satisfied with the substance of their job. Moreover, there are many variables with a negative correlation that may seem to signify something, but actually mean very little. For instance, if a company wishes to use recruitment consultants that are both inexpensive and high quality, it may find that it can only have one criterion met – cheap consultants appear to be low quality. But the review may be distorted because it excludes the consultants that are both expensive and high quality, and those that are both cheap and poor quality. Including these may lead the company to reposition its thresholds.
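A small simulation shows how a selection effect alone can manufacture such a negative correlation. In the sketch below (an illustration, not from the article), affordability and quality are generated completely independently, yet once attention is restricted to a shortlist of consultants who score reasonably well on at least one of the two, the shortlist shows a clear negative correlation; the threshold and the numpy dependency are assumptions.

```python
# Berkson effect sketch: independent variables appear negatively correlated
# once we condition on a selection (here, making a shortlist).
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
affordability = rng.normal(size=n)   # higher = cheaper
quality = rng.normal(size=n)         # higher = better

# Shortlist: only consultants who are reasonably cheap OR reasonably good
shortlisted = (affordability + quality) > 1.0

r_all = np.corrcoef(affordability, quality)[0, 1]
r_short = np.corrcoef(affordability[shortlisted], quality[shortlisted])[0, 1]

print(f"Correlation in the full market: {r_all:+.2f}")      # ~0
print(f"Correlation within the shortlist: {r_short:+.2f}")  # clearly negative
```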

10. The Numerical Evidence Fallacy: Put simply, this works from the assumption that the more attestations there are that something happened, the more weight should be given to the claim that it is true. Thus, an HR manager may receive seven complaints from separate female employees about claimed sexual harassment implicating one male manager. The prima facie evidence would immediately look strong. However, the fact that all the complaints are made at the same time might raise suspicions. Could these complaints be concerted? Could they be retaliatory? Might they relate to actions that would not really fall into the category of harassment? In another example, a study of employee engagement may indicate a low level in a particular department; the department concerned is known to have high employee turnover and low productivity. Everything points to the relevant manager’s competence. But what if the manager had encouraged staff to be “honest” in the survey? What if the staff they developed into high flyers left for opportunities elsewhere, and the productivity problems arose from high vacancy levels and poor-quality residual staff who could not provide effective cover? In such a case we have a totally different picture of the manager concerned.

11. The “My Dog Never Bites” Syndrome: It is a virtually universal claim of dog owners that their pet never bites. But people are regularly treated by doctors for dog bites, so they cannot all be right. This is the very same phenomenon as the bad driver who is never the one at fault in any accident, or the criminal who is always innocent. The vast majority of people believe that what they assert must be regarded as true, and that others are obliged to trust them however they behave. Of course, we should not, and we must be brave enough to distrust those who betray trust – and to do so in respect of all their subsequent actions.

12. The AUTHOR Complex: This is a predominant force in people’s lives, because it is reinforced by our primal fears. AUTHOR stands for “Aversion to Uncomfortable THOughts or Reactions”. Curiously, the three primary causes of aversion all begin with the letter “D” – death, debt and divorce. We are all mortal and run a high risk of debt or divorce, but none of us likes to dwell on any of them. That is why our wariness of the things that can readily lead to them is also so blunted. We happily mix in close proximity to other people, even though they represent a significant threat to us through negligence, malicious intent or the transmittable diseases they pass on. We use our credit cards without giving sufficient thought to our ability to pay the debts back if our circumstances change, and we happily enter into marital relationships – even though there is a 50/50 chance that they will end in a painful divorce.

So few of us really contemplate what financiers call “downside risk”, because we want, ironically, to insulate ourselves from “uncomfortable” realities. Thus we conspire against ourselves and make things worse. We also fear our own inner reactions to events and try to suppress them. We learn about injustice, torture, cruelty to children and so on, and something wells up within us to control the hate we naturally feel towards the perpetrators. Such control is necessary, but it also stunts strong reactions to things in our own immediate experience, preventing us from challenging others, or even speaking up when necessary. Clear thinking allows us to deal with such problems in a detached and objective way and to channel our reactions. It gives us the opportunity to come up with ways to achieve the order and contentment that we yearn for by less damaging means. The starting point for this is to stop the aversion to uncomfortable thoughts and reactions in the first place.

There is also a whole set of biases that many of us fall for – such as “confirmation bias”, where an individual only takes seriously facts that confirm their existing prejudices. A common weakness exploited by advertisers is the “framing bias”, whereby we are easily swayed by an advert which says “85% fat free”, even though it means that the product contains 15% fat. Those who become involved in projects often get caught by the “sunk cost bias”, where spending continues apace even after it has been established that the project is a failure. Many health experts were encouraged by the mapping of DNA to hold onto the belief that, if health risks could be related to an individual’s DNA and the individual informed about those risks, then behaviour would change for the better. However, the studies which followed this approach proved to have a neutral outcome: people did not stop smoking or drinking when they were told that they were especially prone to cancer.

❖ Conclusion

We have explored the development of individual identity, shown how important it is in helping us to make sense of the world, and pointed out how often it is limited by external influences and by the common fallacies ever present in the way we think.

At this point we also come to a deeper understanding of existence characterised by what the Japanese call “miyabi”, a word almost impossible to translate into English, but which comes close to the concept of mental refinement. Central to the achievement of miyabi is the presence of “ma” – often translated as “negative space”, but more accurately described as the space that defines things, like the gaps between the spokes that define a wheel. It is from this point that all which has become one’s identity can reach out and make sense of everything – a point at which wise decisions are truly possible.