It is known as the automation paradox: the skills you automate, you lose. Including managing and predicting your own future.
AI at the core. Data-driven enterprises. Autonomous business. It sounds incredibly interesting and terrifying at the same time. Particularly once you read up on AI.
We are messy
The idea that AI could predict the future just does not sit well. It feels a bit like some of the thinking in transhumanism, where people are regarded as too unpredictable and messy. AI can predict behaviour, recognise patterns, and play Go and chess better than anyone in the world, but it cannot be as messy as we are. These predictive systems are frequently wrong, as when they recommend a book I already own. Those flaws are trivial because the recommendation is so cheap to produce and easy to ignore. But when determining who should have access to social services or healthcare, or who might commit crimes, such errors carry far more weight and warrant far more caution.
Uncharted: How to Map the Future
I picked up Margaret Heffernan's book “Uncharted: How to Map the Future” for a perspective on trend watching. Instead, it became a journey about humanity. We have moved from a complicated world to a complex one. The two aren’t the same – and complexity isn’t just complicated on steroids. Complicated environments are linear, follow rules and are predictable; like an assembly line, they can be planned, managed, repeated and controlled.
Complex environments are non-linear and fluid, where very small effects may produce disproportionate impacts. A volatile environment of shifting opacity where a lone individual with a cell phone could tip the balance. Complex global systems incorporate a multitude of factors, each influencing others but controlled by no one person or nation. Apple’s iPhone may have been ‘designed in California’, but making it depends on raw materials and suppliers from Ireland, the Philippines, China, Taiwan, Japan, Austria, Korea, Singapore, Thailand, Germany, the UK, the Netherlands, Indonesia, Puerto Rico, Brazil, Malaysia, Israel, the Czech Republic, Mexico, Vietnam, Morocco, Malta, Belgium and most of the United States. But those dependencies expose Apple (and similar phone manufacturers) to natural disasters, labour disputes, economic volatility, social turmoil, religious strife, trade wars and political discontent:
We’re so dazzled by the ornate complexity of such manufacturing systems that we forget, or prefer to deny, that contingencies have multiplied, fragility has proliferated, accurate prediction has become harder. We are overconnected. Data can't help you in such a world. A large dataset might describe a group or neighbourhood of voters well, but still be unable to predict with certainty how an individual will vote next time.
We have a choice
We risk falling into a trap: more need for certainty, more dependency on technology; less skill, more need. We become addicted to the very source of our anxiety. Technology aims to solve the so-called problem of human complexity by force-fitting a predetermined model onto the surprising variety of human existence. At a time when we are deluged with propaganda undermining human talents in favour of the perfection of machines, the sheer creativity of human interaction has never been more critical. Our choice is not between false certainty and ignorance; it is between surrender and participation. That is the book in a nutshell.
The first problem is the models. Economic models have never been able to predict any crisis. Today’s technology accommodates vastly more data, but the intrinsic difficulty of models remains: the more data is compressed, the more its predictive power is compromised.
The second problem lies in agendas. Algorithms are, as the mathematician Cathy O’Neil once said, opinions encoded in numbers. Models are profoundly susceptible to the beliefs of the people who design and run them; they aren’t and cannot be morally neutral, because economics requires moral choices. Also, do not forget that forecasting businesses are commercial enterprises.
You are not an aggregation of data
Our desire for certainty leaves us susceptible to other people’s agendas and business imperatives. This is never truer than in competitive and turbulent times when the difference between winning and losing is so marked. The rhetoric flowing from Silicon Valley casually assumes that a person is simply an aggregation of data. We are much more than that. This is about humanity. This is about all the things that a GDP cannot capture. Does anyone seriously propose a standardised measurement of how much I love my partner or my children? Beauty, passion, empathy, compassion, altruism, joy, fun, laughter are outside the scope. As Sunil Prashara of PMI says, a robot cannot say sorry.
You need to experience life
The survivors of existential crises have tremendous wisdom, won at a high cost, about what we need to endure when the unexpected arrives. You need to live and experience life to predict the future. What gives life meaning is the rich and constant interplay between past, present and future. We create models of the future by recruiting our memories of the past. Mentally, we are all time travellers. The less we engage with the here and now, the more impoverished our scenarios of the future become. Life is not a mathematical model. We are seduced into submission by the beautiful, frictionless fantasy. The point of predictions should not be to surrender to them but to use them to broaden and map your conceptual, imaginative horizons. Don’t fall for them – challenge them. All forecasts are probabilities, not absolutes.
Here are some questions to ask the next time you see a prediction
- If there is a vested interest, where does it lie?
- What’s at stake?
- What am I being sold?
- Is this propaganda, bad science, careerism or entertainment?
- Does the hypothesis emanate from individuals working alone or from a team?
- What’s their track record?
- How far out are they looking?
- What are the counterarguments?
- How wide is the range of opinion?
- What’s the agenda?
The fallacy of accuracy
Data scientists know that, with a large enough dataset, projecting trends with gross accuracy is easy, but it is near impossible to reduce from that to pinpoint accuracy for an individual: the last mile of data. We live in a world with a great deal of uncertainty, and the time horizon for accurate forecasts is dauntingly small: the Good Judgment Project, for example, found that most forecasters were accurate only about 150 days out, and even its super-forecasters weren’t confident beyond 400 days. Organisations that invest time and effort into intricate five-year or thirty-year plans cannot seriously believe that they represent any kind of absolute truth. Predicting the future is impossible, and automating prediction merely speeds up bias, errors, short-term thinking and flaky assumptions. The temptation to simplify complex systems is how you get them wrong. An informal audit of three DNA-testing companies showed completely different results for the same DNA sample, largely because each firm uses different sets of reference data.
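The gap between group-level and individual-level accuracy is easy to demonstrate. Here is a minimal sketch in Python; the 60% voting rate and district size are invented for illustration. Even a model that knows the true aggregate rate exactly pins down the group outcome while staying wrong about four in ten individuals.

```python
import random

random.seed(42)

# Invented example: every voter in a district votes "yes" with probability
# 0.6, and our model knows that aggregate rate exactly.
P_YES = 0.6
N = 100_000
votes = [random.random() < P_YES for _ in range(N)]

# Group-level forecast: the observed rate lands almost exactly on 0.6.
observed_rate = sum(votes) / N

# Individual-level forecast: the best possible strategy is to predict the
# majority outcome ("yes") for everyone, yet it is wrong for ~40% of people.
predictions = [True] * N
individual_accuracy = sum(p == v for p, v in zip(predictions, votes)) / N

print(f"group rate: {observed_rate:.3f}")
print(f"per-person accuracy: {individual_accuracy:.3f}")
```

The aggregate estimate tightens as the dataset grows, but the per-person error floor never moves: that is the last mile of data.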
Master your fate
You are the captain of your soul, the master of your own destiny. You will become what you think. Everything is contextual. What got you here won’t get you there. History is not a predictor of the future, and generals always fight the battles of the last war. Social skills, emotional intelligence, strategic thinking and creativity can determine life outcomes. Experience changes us; flukes, imagination and accidents change us. The complex interplay of personality, work, illness, education, friends, family, history, success and failure will alter who we are. What you expect will form you. Read “Psycho-Cybernetics”.
Technology is not neutral
Technology is not neutral. All technology effectively outsources work from humans to machines, promising efficiency gains at the expense of individual knowledge and learning. You are not a robot that can be programmed, assessed, profiled or classified. You are unique. You are more than a data set.
The truth about complex systems is that trying to simplify them doesn’t guarantee that they become more effective; it risks making them less effective. The big danger in confusing a complex system (like science) for a complicated process (like loading bags on planes) is that, in striving for efficiency and predictability, the robustness of the system, its creativity and ability to withstand adversity, is killed off. Understanding such environments and managing them requires the capacity to distinguish between what can and cannot be predicted, and tolerance for the ambiguity and uncertainties that lie in between. The drive for efficiency risks inhibiting innovation, marginalising underrepresented ideas and discouraging new and multi-disciplinary fields. Serendipity is like the butterfly wings that make it rain in the Amazon. There are black swans everywhere, and in a complex system they can have a cascading effect.
Impossible to predict
Predicting anything in that context is impossible. I always prefer Burrus’s classification of hard and soft trends as one way of looking at the future. You also need to consider all the alternative points of view in order to assess multiple possible realities. That is why the intelligence services attach probability ratings to their briefings: they know that their forecasts, however scrupulous, are riddled with contingencies, accidents, luck and chance. Never imagine that you know what will happen.
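Those probability ratings can be sketched as a small lookup from a numeric forecast to hedged briefing language. The bands below are an assumption, loosely modelled on published estimative-language guidance (such as the US ICD 203); the exact cut-offs and labels are illustrative, not an official standard.

```python
# Hypothetical probability bands: (upper bound, label). The cut-offs are
# illustrative assumptions, not an official intelligence-community standard.
BANDS = [
    (0.05, "almost no chance"),
    (0.20, "very unlikely"),
    (0.45, "unlikely"),
    (0.55, "roughly even chance"),
    (0.80, "likely"),
    (0.95, "very likely"),
    (1.00, "almost certain"),
]

def estimative_label(p: float) -> str:
    """Translate a numeric forecast into hedged briefing language."""
    for upper, label in BANDS:
        if p <= upper:
            return label
    return "almost certain"

print(estimative_label(0.7))  # "likely"
```

The point of the translation is the hedge itself: a briefing that says “likely” rather than “will happen” keeps the contingency visible to the reader.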
This is why intelligence organisations frequently use multiple teams to examine and analyse the same data: the same data can tell more than one story. That variety might look inefficient, but it imparts adaptability and responsiveness – essential qualities when you know you can’t predict what the future holds. The author describes how difficult leaders find that to accept. It is inefficient, messy and indeed unpredictable. The language of efficiency is unmistakable: life is much simpler, and more predictable, if the complexity, anomalies and flukes can be managed out of it.
Eliminating uncertainty is a business model
Over the past 100 years, we have turned to models, data, history, profiling and DNA to satisfy our hunger for certainty. We have now reached a stage where, instead of coming to terms with uncertainty, companies intervene to eliminate it. That is the thinking behind Amazon’s anticipatory shopping patent: instead of customers making their own decisions, Amazon decides for them, sending what they want before they know they want it. Pervasive monitoring devices – smartphones, wearables, voice-enabled speakers and smart meters – allow companies to track and manage consumer behaviour. The goal is to change people’s actual behaviour at scale. When you read “social efficiency both as a design goal and a metric for the design of social network systems”, you need to worry. Efficient friendship? The appeal for large corporations is that doing so is hugely profitable.
Surrender to machines
The price of certainty is high: surrendering to a limited experience of life, designed by individuals and corporations who do not know us, whose interests are not ours. They are reluctant to spell out the risks, which are not just conformity, injustice and authoritarianism but the loss of social connection and diversity. In this picture of predictable lives, there are no flukes, no happy (or unhappy) accidents. There is no evolution either. The more time we spend visiting places that others have described, the more we follow the paths others have made, reading what we’re told, seeing what the algorithm recommends, listening to what crowds admire and eating what’s already been photographed, tasted, marketed and measured, the less capacity we have to see what we didn’t expect, to hear what we weren’t told about or to ask questions that haven’t already been answered.
Embrace the mess
While we can never render complexity simple, we could embrace it as an adventure, calling us to investigate the infinite permutations of life that it contains. Surrendering agency, action and adventure for convenience is a miserable bargain. Life is to be experienced. You could say that experiments are how we learn everything. We try to stand up, fall over, recalibrate, and next time find we can teeter for a second or two. Keep at it, and mastery emerges. That life, work and love are more complex than rules-bound games makes experiments more valuable, not less. It is characteristic of complex systems that small actions can make a disproportionate impact. You just don’t know, won’t know, until you try.
Decades of strategic planning have left companies paralysed, unable to explore without a map, incontrovertible data or rock-solid guarantees. Are firms now like travellers who daren’t stray outdoors without a GPS, afraid they might lose their way? Experiments are pragmatic ways to test out the future. Or try keeping it simple. Thuiszorg has one rule: do what’s right for the patient. Leave the rest to the people doing the work. Separating the ineradicably complex (patient care) from the merely complicated (assigning nurses to patients) radically reduced costs and enhanced motivation and meaning in work. Though both aspects of the work are necessary, they don’t have to be run the same way.
Experiments are how we explore ecosystems, feeling out their contours and boundaries. If we want to map the future, we start by acknowledging that we don’t know all it holds, that everyone can contribute, but no one knows what we will find. With simple language, an absence of power and entrenched interests, alertness to weak signals and small insights, we start to delineate the contours of what lies ahead. And then you write the stories. Stories derived from verifiable facts that are consistent and plausible. That means that the scenarios can be explained. Each has to be considered as seriously and thoroughly as any other, which is why Shell’s team has always rejected assigning probabilities to them. Soft data (cultural differences in different markets, for example) is as important as hard data.
Scenario planning originated with the recognition that much in the world is too complex to be predictable and that the future is too malleable to be revealed by hard data alone. It was pioneered after the Second World War by the Rand Corporation and the American military, and it is increasingly used as a tool to develop strategy.
What can we learn from an artist?
Executives in corporations crave certainty like an addiction, and they are so afraid of being wrong that they have lost the capacity to think freely. We can learn from artists. Artists have the capacity to make work that defies time. Instead of trying to force-fit a predetermined idea of the future, they have learned to live productively with ambiguity, to see it as a rich source of discovery and exploration. We may not all be artists, but we can learn from their habits. Artists use any number of words to describe the process between collection and making: gestation, filtering, percolating, simmering, mulling, distilling, digesting, waiting. No one I’ve ever talked to or worked with can explain how or why clarity emerges; they simply trust that it will. In practical terms, this means that artists wait for meaning to emerge. Mind-wandering. Diffuse but intense attention. Travel without an agenda. Non-linear. Undetermined. Unplanned. Open to reflection, accident and discovery. Inefficient. Inconsistent.
Visionary insight machines
Artists start without waiting to be asked. They have to begin. The future isn’t something to be nailed down, defined and programmed. The only way to influence it is to keep noticing. While an efficient mindset prizes predictability and continuity, an artist’s passion for exploration develops the capacity for change. That is what many organisational strategists yearn to emulate. The billions of dollars spent on digital transformation programmes start with the dream of turning hierarchical, bureaucratic, data-driven organisations into visionary insight machines. If only everyone could think and act like an artist. Artists think for themselves. In doing so, they claim the right to influence the future of their own lives, of their work, and of anyone who witnesses it. These aren’t the stale, frightened minds of executives who can’t imagine even participating in a different scenario but the highly adaptive minds that seize, in uncertainty and ambiguity, the freedom required for adaptation, variation and change.
The future is here
Suddenly the future is here. However prepared we are to make the future, it sometimes arrives too soon, too fast, and threatens to make or break us. Companies have existential crises: defining moments when, if they aren’t to die, they must forge a new future for themselves. In the book’s example, the problems weren’t secrets, but the company’s culture was so polite and hierarchical that nobody talked about them. The lesson: sit back at your peril; what you can’t be is patient or passive. Release the rebels.
Our institutions, corporations and organisations need reform to be prepared, to maintain trust and to stay relevant. This calls for a new kind of leadership. Those who rise to that challenge will be outstanding convenors, better chosen for their scepticism than their confidence. Collecting voices, structuring exploration, keen listening and synthesising success and failure will be the focus of their work. They will need to be excellent interrogators of the ecosystems in which they reside, aware of where they fit and the impact of their decisions on others. They will need the moral authority to be honest about sacrifices, and they will have to resist the rhetorical allure of over-simplified fantasies. The need for legitimacy doesn’t render expertise useless or trivial, but it places on experts the responsibility to explore humbly all the consequences and ramifications of their work on everyone. None of the challenges we face will be solved by expertise alone. It is only together that legitimate boundaries for technology can be defined.
The future is personal
The efficiency of the gig economy, the splintering of communities into competitive individuals, and our dependence on technology all undermine this, eroding the lifesaving power of loyalty and friendship that these crises demand. Staying human, not just a user or a consumer, is the first way to prepare for an unpredictable future. Accumulating memories with which to plot a journey forward, full of options and deviations. We aren’t all artists, but we can all allow our minds to wander off the predictable path to explore what lies beyond it. The richer our knowledge and experience, the more easily we can identify where the opportunity for experiments lies.
Man versus machine
The automation paradox is a physical reality. The more we surrender to the authority of devices, the less independent and imaginative our minds become, just when we need them most. Extravagant claims for artificial intelligence don’t just denigrate real intelligence but mask a determination to force the world into predictable patterns. But we become digital slaves only if we think of ourselves as no more than data. Every time we use technology – to nudge us through decisions or to interpret how other people feel – we outsource to machines what we could and can do ourselves.
Proud to be unique
Imagination, creativity, compassion, generosity, variety, meaning, faith and courage: the qualities that make the world unpredictable are also the strengths that make each of us unique and human. What we need to be is human. The future will always be uncharted, but it is made by those active enough to explore it, with the stamina and imagination not to give up on themselves or each other.