Coaching in an uncertain and complex future


Kim Stephenson

This article examines coaching and its future development in the light of recent (last few decades) research into uncertainty, the impact of chaos and complexity, and the role of the unconscious in decision making and rationalisation. Findings in social psychology, evolutionary biology and psychology, and the neurological implications of matters such as neuro-efficiency are discussed. Reference is made to the phenomenon of cognitive diversity. Four predictions are made about the types and directions of future change.


What’s the future for coaching and mentoring? That was the brief for this piece and, as a scientist who knows that it’s impossible to predict the future, I’m on a hiding to nothing. If I get it wrong, I prove that I’m human and can’t predict (no news there), and if I get it right, (a) who will remember? and (b) how do I prove it’s not a fluke?

So I’ll look at the future using the best science I have and see where we get to.

How do I know the future is uncertain? Well, chaos theory illustrates, among other things, that apparently identical situations can evolve to produce very different results (Gleick, 1997). It also shows that events can be non-linear: results can be substantially less, or exponentially more, than the sum of the parts, which makes it hard (or impossible) to determine causality (for example, Sugihara et al., 2012).
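As a concrete sketch of that sensitivity (my own illustration, not taken from the works cited), consider the logistic map, a textbook borderline-chaotic system: two starting points that agree to six decimal places soon produce trajectories that bear no relation to each other.

```python
# Logistic map: x_next = r * x * (1 - x). At r = 4 it is fully chaotic,
# so "apparently identical situations" evolve to very different results.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0, returning every value visited."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)   # one starting condition...
b = logistic_trajectory(0.400001)   # ...and an "identical" one

gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.6f}, largest later gap: {max(gaps):.3f}")
```

A difference of one part in a million roughly doubles at every step, so within a few dozen iterations the two runs are completely uncorrelated, which is exactly why long-range prediction of such systems fails.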

Likewise, complexity theory includes the phenomenon of “emergence”, whereby amazing organisation can emerge from seeming chaos (Gribbin, 2005; Beinhocker, 2007). Examples include apparently random termite activity orienting nests exactly to the cardinal points of the compass, or the simple and independent activity of millions of individual investors producing a sustained stock market boom (or bust!).

Prediction of volcanic activity or of weather has improved massively in recent decades (for example, Roulstone & Norbury, 2013), but still not to the point where we can predict very accurately more than hours or minutes ahead, if that. Those are examples of complex systems “on the edge of chaos” – where “edge of chaos” is a technical term for a borderline chaotic system, not an expression of despair! Human societies, such as business organisations, are other examples of “edge of chaos” systems, with the added complication that human societies are “self-organising”. They provide feedback cycles that can damp down or exaggerate outcomes. So human societies can produce feedback spikes – like a rock star, but even less controlled – or can eliminate feedback entirely. Sometimes those contrasting effects can happen in the same system (see, for example, Gladwell, 2001). So real life predictions of the behaviour of human societies, such as the future for coaching and mentoring, are tricky!
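To make the damping/exaggerating point concrete, here is a deliberately toy sketch (my own, not a model of any real society): the same small shock fed through a feedback loop, with only the feedback gain changed.

```python
# Toy feedback loop (illustrative only): each step, the feedback multiplies
# the current deviation by a fixed gain. Gain < 1 damps a shock away;
# gain > 1 exaggerates it into a spike.

def run_feedback(gain, shock=0.01, steps=10):
    """Return the deviation at each step after an initial shock."""
    level = shock
    history = [level]
    for _ in range(steps):
        level *= gain
        history.append(level)
    return history

damped = run_feedback(gain=0.5)   # negative feedback: the shock disappears
spiked = run_feedback(gain=1.5)   # positive feedback: the "rock star" spike

print(f"after 10 steps: damped to {damped[-1]:.6f}, amplified to {spiked[-1]:.2f}")
```

The same system, with a small change to one parameter, either erases the shock or turns it into something fifty times its original size: the “self-organising” complication in miniature.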

Statistics don’t help

Of course, we can use statistics. When we learn psychology, get trained in psychometric instruments, are involved in finance or business plans, we become familiar with statistical methods being useful for prediction. We can reason from the statistics, and build potential futures, such as looking at how a different business model will work, or how a person with a particular profile will react to news phrased in a particular way.

In theory, the reasoning and the statistics work reliably, every time. In a casino, this is true. We know the odds, the payoff if we win, and the rules about what is allowed and when bets are paid. So the statistics work and we can make better decisions (to the point where, if you’re seen card counting, casinos will bar you, because you might be able to predict events well enough to win!). But in life and business we don’t know the odds (similar situations can end up very differently), and we don’t know what the payoff will be (it’s non-linear). We don’t know the “rules” or when the game “ends”, because businesses are self-organising systems; the starting conditions may differ subtly (they are borderline chaotic); and there are multiple “actors” in the scenario, whose actions affect the outcome in interlinked ways (it’s a complex system). So there are multiple cause-effect links, too complex to calculate. And, of course, a lot of real life distributions aren’t “normal” (for example, stock markets, volcanoes and weather), and the probabilities don’t work properly with non-normal distributions (Buchanan, 2007).
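A quick simulation (my own sketch; the specific distributions are illustrative assumptions, not taken from Buchanan) shows why the probabilities “don’t work properly”: under a normal distribution, extreme events are vanishingly rare, while a fat-tailed, power-law distribution keeps producing them.

```python
# Compare extreme events in a normal world vs a fat-tailed world.
import random

random.seed(42)  # fixed seed so the run is reproducible
N = 100_000

normal_moves = [random.gauss(0, 1) for _ in range(N)]
fat_moves = [random.paretovariate(1.5) for _ in range(N)]  # heavy right tail

# A "6-sigma" event is essentially impossible under the normal assumption...
extreme_normal = sum(1 for x in normal_moves if x > 6)

# ...but the fat-tailed world routinely produces moves many times its
# typical (median) size.
typical = sorted(fat_moves)[N // 2]
extreme_fat = sum(1 for x in fat_moves if x > 6 * typical)

print(f"normal world, moves beyond 6 sigma: {extreme_normal} of {N}")
print(f"fat-tailed world, moves beyond 6x typical: {extreme_fat} of {N}")
```

In the normal world you will almost certainly see zero such events in a hundred thousand draws; in the fat-tailed world you see thousands. A risk model built on the first assumption, applied to the second kind of world, is the statistical version of the banks’ problem.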

That’s why I know it’s hard to make predictions – but in a way, that hasn’t mattered up until relatively recently.

Human beings have only tried to make those sorts of rules, to use statistics and predict the future mathematically and scientifically, for about 500 years. Prior to that, for 100,000 or so years, we used our intuition. That worked pretty well. We weren’t always accurate, but most of the time we were good enough to find food, shelter and a mate – which is why we’re all here. If we hadn’t got it substantially right most of the time, humans would have gone the way of the dodo. In the process we developed the human brain, the most complex single thing in the known universe, which is pretty good at doing this sort of thing (see, for example, Barkow et al., 1992). The trouble is that it doesn’t do it the way we usually think it does.

All this said, I’d like to hazard some predictions!

Prediction # 1

It’s going to be difficult for coaches and mentors to continue to rely on “the answers are all inside you”, or to be unstintingly supportive rather than challenge clients’ egos, judgment and rationality.

Of course, some do provide critical perspectives, but many do not, and I regularly read material about helping clients to “derive their own answers”, which I think will become problematic. That’s because the answers clients have may not be adequate and, more importantly, their reasoning about what to do or where to find answers is not necessarily going to work and may not really be reasoning at all.

Let’s look at the science behind that.

Our lazy brains

Life was hard during most of the evolutionary period when humans were developing mostly effective ways to deal with the uncertainty of life. One of the uncertainties was the lack of guaranteed 24-hour pizza availability: you had to go out and hunt or gather food, and that took a lot of energy. And because the brain is so powerful, it needs a lot of fuel to provide that energy, fuel which is then not available for hunting or gathering. The brain only makes up about 2% of our body weight but uses about 15% of the blood flow, 20% of the oxygen and 25% of the glucose used by the whole body (Attwell and Laughlin, 2001). Naturally, our brain evolved to process events and make predictions as efficiently as possible, so that we could save the fuel that was so hard to get. That means not taxing the brain’s limits, and hence not thinking hard if we don’t have to.

So humans are cognitively economical (lazy!). We like the familiar. When we have a solution, we tend to stick with it – so, given a metaphorical hammer, lots of problems come to resemble metaphorical nails, even if they are actually metaphorical left hand thread screws. We’ll come back to that later, in the light of chaos theory and an example.

But our brain is made up of many connections. There are (much simplified) models of the brain with two hemispheres or systems, three parts (reptilian, limbic, cortex), eight parts (cerebellum etc.), and various numbers up to the billions of connections between neurons – all representations of this incomprehensibly complex system.

The pros and cons of switching to auto-pilot

One representation also distinguishes the conscious mind: unsurprisingly, the bit that we’re conscious of, which makes up only between 0.1% and 5% of our mind (depending on which neuroscientist, brain surgeon or other expert you ask). The remainder is below our conscious awareness (however many parts there are, some of them have elements we’re conscious of as well). The unconscious is where the habits of thought, the intuitions and the pattern recognition (“I know this situation”) reside that allow us to operate most of the time on auto-pilot. We consciously believe we’re thinking hard, but actually we’re letting our “existing wiring” run things. That works well most of the time (though sometimes we end up in the wrong place when we’re driving, because we went onto auto-pilot) and it allows us to be very cognitively economical.

Our conscious mind can step in when we need it, when we have decisions to make that require more conscious analysis, when our usual behaviour won’t be enough (Kahneman, 2011). That’s one reason why unfamiliar situations, meeting strangers, and change generally tend to meet with resistance – they take us out of our comfort zone where the auto-pilot works well and unconsciously, we don’t like wasting the energy.

And the thing is that while our conscious mind can step in, it doesn’t mean that it does.

What usually happens is that we make decisions on auto-pilot and, when we’re asked why, give a logical-sounding rationale for what we’ve done that actually has nothing to do with the way we decided (which is not rational at all, but is often effective; for example, Gigerenzer et al., 1999; Klein, 1999; Klein, 2004). There are some fascinating (and entertaining) examples of split-brain patients (in, for example, Gazzaniga, 2002), and lots of evidence from decision research (for example, Meehl, 1954, and subsequent reviews) and from the analysis of disasters such as financial crises and the two big space shuttle disasters.

Common to all of them is that the people involved can give you detailed and logical explanations of how they came to their decision, which can be demonstrated to be irrelevant, post-hoc justifications. They made the decisions in all sorts of ways: they may have ignored all the information in favour of personal bias, given too much prominence to some data, ignored viewpoints that didn’t agree with their own or the majority view, or thought they were including information they’d actually ignored. The only common feature is that what they say they did has nothing to do with what they actually did, and what they say they did is a more or less coherent story, manufactured to make it look as if their conscious mind was in control and was omniscient. And people genuinely believe their self-justification. They do this even when shown the evidence that what they claim was a rational decision wasn’t rational at all and that their story of their decision process is a complete fabrication. They still insist they were logical and that that is the way they made the decision. That self-delusion helped us as a species to survive: we could be “cognitively economical” – and if it all went pear shaped we had a logical excuse that kept our self-esteem intact.

But as we come to know more about how the brain works, it’s going to get more difficult to justify propping those explanations up. Coaches are going to have to challenge more. It’s changing already because, although psychology has not been terribly successful over the years in showing that this is reality, modern technology is allowing neuroscience to show evidence that we don’t think the way that much theory and mathematical prediction from coaching, psychometrics and intuitive psychology says, or the way that we think we do. It’s showing that our expressed reasoning often has nothing to do with our actual reasoning, and that our actual reasoning is not rational at all. Why neuroscience is more persuasive, I don’t know. Maybe it’s the cool T-shirt designs that you can get from brain scans – we’re not logical, remember.

Prediction # 2

Coaches, and particularly mentors, will find it hard to rely on their experience, of the “I’ve been where you are” kind, and to offer solutions based on what worked for them.

Again, there’s science behind that. We don’t know what is going to happen. We make predictions unconsciously, and we can justify those predictions in all sorts of ways, but actually the justifications have little to do with the predictions. One problem we’ve faced throughout evolution is the complexity of life: the multitude of interacting factors that affect outcomes, the literally chaotic situation where apparently similar situations can have extraordinarily different outcomes. This is the classic “butterfly effect”, where a butterfly flapping its wings (or not) can mean a hurricane blows out in the Atlantic, or destroys New Orleans, or a market or product takes off or crashes.

We can’t keep track of all the cause-effect links, the exact details of every last variable (Sugihara et al., 2012), and that would still be true even if we knew what they were, which we don’t. We can’t deal with that; it’s too complex, too scary. The result is that we simplify. We actually have no idea why what we have done worked, or even, in many cases, whether it has worked. For example, the banks thought their policies were low-risk and very successful – and all the “experts” agreed with them – until 2008, whereupon they lost more in six months than they’d made in twenty years. Did those two decades of policies work? If so, why (or why not)? And why did they fail exactly when they did, and not at some time in the previous decade?

The case of the banks

The banks provide a nice example of how uncertainty about the future, and about the causes and effects of the past, makes the future of coaching and mentoring complex. Obviously, there were coaches and mentors around in the banking industry in 2004-7, the period when “experts” were saying that bank policies were wonderful, to the extent that Fred Goodwin became European Banker of the Year and, later, Sir Fred (Stephenson, 2011).

Bear in mind that, when a risk analyst suggested to the HSBC board in 2007 that perhaps the current investment policy carried higher risk than they were estimating, he was fired. Bear in mind that, at the time, nobody suggested that Fred (of RBS) was “greedy, a gambler and a bully” (a description later applied to him by several commentators). Bear in mind that none of the “experts” challenged the universal banking policy of investing in securitised mixes of triple-A and junk loans (don’t ask!), and that all of the banks followed similar policies. Bear in mind that people like Alan Greenspan, the head of the Federal Reserve (like the Governor of the Bank of England, but backed with real money), who personally had over 40 years’ experience of finance, didn’t see any problem, and relaxed regulations to allow the banks more scope, supported by his colleagues with several hundred years of experience between them. Remember, the situation is a metaphorical left hand thread screw – so our metaphorical hammer from experience is the ideal tool.

Coaches as massagers of egos?

Now, I think that the coaches involved probably didn’t challenge the “greedy bully” Fred, or the “four decades of experience” guru, Greenspan, or ask them to question their judgment and examine their (and other bankers’) rationale. I suspect coaches fed their egos, told them they were experts doing a great job, and that they knew that the “answers were inside them”. And I suspect that, even if those two personally didn’t have a coach, anybody involved who was coached was encouraged to believe that they knew the answers, that they were right, that they knew best. Bob Diamond (of Barclays) was still saying years later that they had done a great job and that it was time for regulators, the public and the media to “lay off” bankers and let them go back to doing their jobs. They weren’t challenged, asked to dig down into their rationale, or asked to question their judgment and their support for policies that, afterwards, people like Robert Peston said were insanely risky. Because, of course, Peston was also around in 2004-2007 and never voiced any doubt about idolising a man who would later lose his knighthood and be vilified. Nor did I notice Peston, in the book (Peston, 2012) that explained how the crash was “inevitable” and how he (Peston) understood it all now, ever explaining why he had failed to see this “inevitable” event coming, or how he could be certain he understood it in retrospect when he (presumably) knows that the system is chaotic, complex and so on, and that nobody can possibly be sure why what happened did happen, or why it happened when it did and not at another time.

The collusion of mentors?

Similarly, I’m fairly sure that any mentoring of bankers (or of Peston, or of any self-appointed “expert”) was done by people who valued their experience (such as that of Greenspan and his colleagues) – but we know from chaos theory that small differences can make huge differences in the future. The suggestion that “I’ve seen this before” might look right, and people might genuinely believe it – but it’s not true. Sometimes the experience is valuable, sometimes it isn’t, and there is no reliable way to tell which. And sometimes the experience is a disaster, because it makes human beings confident: like Fred; like Gordon Brown, who recommended him for the knighthood and the banking award; like Peston, who didn’t notice anything wrong at the time. That confidence is illusory, but the human capacity for self-deception won’t allow anybody to admit that they were totally wrong. They’ll always have a rational explanation – it’s part of what helped us survive as a species.

Prediction # 3

Coaches (and mentors) will begin, far more, to challenge the simplistic assumptions, the obviously false statements, the self-serving beliefs and the questionable rationales. Increasingly, they will question the relevance of their own experience, challenge their own biases and assumptions, and depart from automatic routines of thought. They’ll start to distinguish between where experience is useful and where it isn’t (Kahneman and Klein, 2009; Gigerenzer and Selten, 2001).

It gives us all a warm feeling of being right if somebody says we have the answers, we are the expert, we are right. We all like to have a mentor who says confidently, “I’ve seen this; what you need to focus on is…”

But it’s actually dangerous in the real world. Because we say (self-delusion again), “I welcome feedback, my coach is great like that”. But do we really welcome feedback that says, “You’re OK, but you could be better if you let go of some of your patently nonsensical but self-serving beliefs”? Or do we actually want feedback that says, “You’ve reached the top in your profession by being good, you know what to do, just be yourself; all the answers are within you, I’m not going to add anything, I’m just drawing out your inner genius”? Or do we want somebody who tells us that they can chart a path through uncertainty, because they’ve seen it before, so that we can be safe (phew!) and don’t have to think too hard (cognitive economy), following their lead and not questioning their rationale?

Prediction # 4

To accommodate change, coaches will need to grow. They’ll still need all the current skills, such as building rapport (the relationship seems to be the biggest variable in coaching, as it is in counselling). They’ll still need to question and listen effectively, and do plenty of other things. But they’ll also have to know their own limitations and learn about the application, to themselves and their clients, of the psychology of decision making. They’ll also have to learn the neuroscience, the reality of uncertainty, chaos and complexity, the dangers of simplified thinking and cognitive economy, and the difference between useful intuitive experience and unjustified over-confidence.

Eventually, I think, all coaches will have to be prepared to learn, to challenge, and to help clients through the pain of growth – because eventually there won’t be an easy option. The coachees whose egos won’t let them hire anyone other than yes-men will go the way of Fred Goodwin and the other “shining star” bankers, into obscurity. And the function of coaching will increasingly be recognised as helping clients benefit from cognitive diversity (see, for example, Page, 2008). The performance bar will be raised for both executives and coaches.

References

Attwell, D. & Laughlin, S.B. (2001). An energy budget for signaling in the grey matter of the brain. Journal of Cerebral Blood Flow & Metabolism, 21, 1133-1145.
Barkow, J.H., Cosmides, L. & Tooby, J. (Eds.) (1992). The Adapted Mind. Oxford University Press.
Beinhocker, E. (2007). The Origin of Wealth: Evolution, Complexity, and the Radical Remaking of Economics. Random House Business Books.
Buchanan, M. (2007). Power Laws & the New Science of Complexity Management. Available at http://www.optimalenterprise.com/docs/Power%20Laws%20-%20Complexity%20Mgmt%20sb34_04107.pdf (accessed 25/11/14).
Gazzaniga, M.S. (2002). The split brain revisited. Scientific American, Special Editions.
Gigerenzer, G., Todd, P.M. & the ABC Research Group (1999). Simple Heuristics That Make Us Smart. New York: Oxford University Press.
Gigerenzer, G. & Selten, R. (Eds.) (2001). Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press.
Gladwell, M. (2001). The Tipping Point: How Little Things Can Make a Big Difference. Abacus.
Gleick, J. (1997). Chaos: Making a New Science. Vintage.
Gribbin, J. (2005). Deep Simplicity: Chaos, Complexity and the Emergence of Life. Penguin Books.
Kahneman, D. (2011). Thinking, Fast and Slow. Macmillan.
Kahneman, D. & Klein, G.A. (2009). Conditions for intuitive expertise: a failure to disagree. American Psychologist, 64(6), 515-526.
Klein, G.A. (1999). Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press.
Klein, G.A. (2004). The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency.
Meehl, P. (1954). Clinical Versus Statistical Prediction: A Theoretical Analysis and a Review of the Evidence. University of Minnesota Press.
Mirollo, R.E. & Strogatz, S.H. (1990). Synchronization of pulse-coupled biological oscillators. SIAM Journal on Applied Mathematics, 50(6).
Page, S.E. (2008). The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies. Princeton University Press.
Peston, R. (2012). How Do We Fix This Mess? The Economic Price of Having It All and the Route to Lasting Prosperity. Hodder & Stoughton.
Roulstone, I. & Norbury, J. (2013). Invisible in the Storm: The Role of Mathematics in Understanding Weather. Princeton University Press.
Stephenson, K. (2011). Taming the Pound: Making Money Your Servant, Not Your Master. Matador.
Sugihara, G., May, R., Ye, H., Hsieh, C-h., Deyle, E., Fogarty, M. & Munch, S. (2012). Detecting causality in complex ecosystems. Science, 338(6106), 496-500.

Acknowledgement

I’d like to thank the editorial team, particularly Bob MacKenzie and Pauline Willis for their help in focussing the article (from a long list to a short list to a final choice), positive critiques of drafts of the article content and the clarification of my somewhat tortured prose style! And also, albeit indirectly, the influence of the great researchers and intellects (many directly credited in the references) who helped shape my thinking.

About the author

Kim Stephenson is the only person in the world qualified in, and with experience of, both professional psychology and financial advice. In finance, he’s a regular media contributor, an Associate of the Chartered Insurance Institute, on the CII and PFS “subject matter expert” list on financial risk and the psychology of financial decision making, author of Taming the Pound and co-author of the forthcoming “Finance is Personal – making your money work for you in college and beyond”, published in the US by Praeger. He operates a free website to help people with financial decisions. In other parts of his portfolio career, Kim is a writer, speaker and consultant on the psychology of risk, finance, leadership teams and team development, as well as executive selection, development and coaching.
You can contact Kim via the Taming the Pound site, at kim@stephenson-consulting.co.uk or on 01344 421199
