Clients increasingly ask us how to “future-proof” their business. We then talk about technology trends, VUCA, speed and understanding exponential change. It doesn’t always resonate. Too much science fiction.
Last week I had the pleasure of talking to Brett King, the author of “Bank 3.0” and “Breaking Banks”, and he did a fantastic job grounding technology change in historical patterns, starting with the textile industry and the Luddites. Every time a disruptive technology arrives, there is resistance, followed by the inevitability of the change taking hold. He calls that a modality shift.
Friction is suicide
His conclusion, and this does not just apply to banks, is that friction is suicide. And that is just the beginning. What are Watson and AI going to do to your business model? Computers have learned how to learn. Watson is reportedly 90% accurate on cancer diagnosis; the best surgeon is only 50% accurate. Who would you prefer as your doctor?
The ghost in the machine
It is getting scary. The ghost is in the machine. Apparently Audi has two cars that have learned how to drive themselves, and they have two distinctly different driving styles. Nobody knows why…
History and patterns
History and patterns are important, particularly in ICT. Hence the reason to pick up Walter Isaacson’s “The Innovators” and read about the history of ICT. From the computer to programming. From programming to transistors. From transistors to the microchip. From the microchip to video games. From video games to the internet. From the internet to the personal computer. Software. Online. And finally artificial intelligence (AI).
Covering innovators such as Turing, Aiken, Atanasoff, Von Neumann, Moore, Fairchild, Metcalfe, Gates, Wozniak, Jobs, Torvalds, Andreessen, Page and IBM.
What is immediately striking is that change and tech are not as fast as you think. Here are some historical highlights:
1890 First computer
1956 First artificial intelligence conference.
1962 Doug Engelbart publishes “Augmenting Human Intellect”.
1965 Moore’s Law predicts microchips will double in power each year or so.
1972 Nolan Bushnell creates Pong at Atari with Al Alcorn.
1976 Steve Jobs and Steve Wozniak launch the Apple I.
1983 Microsoft announces Windows.
1991 Linus Torvalds releases first version of Linux kernel.
1991 Tim Berners-Lee announces the World Wide Web.
1997 IBM’s Deep Blue beats Garry Kasparov in chess.
1998 Larry Page and Sergey Brin launch Google
2011 IBM’s computer Watson wins Jeopardy!
Some interesting lessons
- First and foremost is that creativity is a collaborative process.
- The digital age may seem revolutionary, but it was based on expanding the ideas handed down from previous generations.
- Physical proximity is beneficial.
- You need to pair visionaries, who can generate ideas, with operating managers, who can execute them. Visions without execution are hallucinations.
- Government should undertake projects, such as the space program and interstate highway system, that benefited the common good.
- A combination of government, market, and peer sharing is stronger than favouring any one of them.
- The most successful endeavours in the digital age were those run by leaders who fostered collaboration while also providing a clear vision.
- Conversely, collaborative groups that lacked passionate and wilful visionaries failed.
- Most of the successful innovators and entrepreneurs had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.”
The last chapter is about AI
I am fascinated by AI and where it could go. This is from Michio Kaku’s “The Future of the Mind”:
Although it occupies only 2 percent of the body’s weight, the brain has a ravenous appetite, consuming fully 20 percent of our total energy (in newborns, the brain consumes an astonishing 65 percent of the baby’s energy), while fully 80 percent of our genes are coded for the brain. There are an estimated 100 billion neurons residing inside the skull, with a vastly larger number of neural connections and pathways. That is about the same number of neurons as there are stars in the Milky Way galaxy.
The mechanical brain
To build a copy of the brain with the current technology would not just need a single Blue Gene computer but thousands of them, which would fill up not just a room but an entire city block. The energy consumption would be so great that you would need a thousand-megawatt nuclear power plant to generate all the electricity. And then, to cool off this monstrous computer so it wouldn’t melt, you would need to divert a river and send it through the computer circuits.
It is remarkable that a gigantic, city-size computer is required to simulate a piece of human tissue that weighs three pounds, fits inside your skull, raises your body temperature by only a few degrees, uses twenty watts of power, and needs only a few hamburgers to keep it going.
But Moore’s law will ensure that in the coming decades the power of neuroscience will become explosive. Current research is on the threshold of new scientific discoveries that will likely leave us breathless. One day, we might routinely control objects around us with the power of the mind, download memories, cure mental illness, enhance our intelligence, understand the brain neuron by neuron, create backup copies of the brain, upload ourselves into computers and communicate with one another telepathically. The world of the future will be the world of the mind.
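The energy gap described above can be put into rough numbers. Here is a back-of-the-envelope sketch: the wattage figures are taken loosely from the passage above, and the 18-month doubling period is my assumption about Moore’s law, not a claim from the book.

```python
import math

# Energy gap between the human brain (~20 W) and a hypothetical
# city-block brain simulator powered by a thousand-megawatt plant
# (figures loosely from the passage above).
brain_watts = 20
simulator_watts = 1_000 * 1e6  # 1,000 megawatts expressed in watts

gap = simulator_watts / brain_watts  # roughly a 50-million-fold gap
doublings = math.log2(gap)           # doublings in efficiency needed to close it
years = doublings * 1.5              # assuming one doubling every 18 months

print(f"gap: {gap:,.0f}x, doublings: {doublings:.1f}, years: {years:.0f}")
```

Under those assumptions it takes around 26 doublings, or on the order of four decades, which is consistent with the book’s “coming decades” framing.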
However, some theories would suggest that the brain is a quantum device, constantly dealing with alternate universes. And it will take a while before we can build that.
Then there are the ethics. AI has no conscience, moral code or ethics. Skynet, or Asimov’s Three Laws of Robotics?
And despite all of the proclamations of artificial intelligence engineers and Internet sociologists, digital tools have no personalities, intentions, or desires. They are what we make of them.
As IBM’s research director John Kelly says: “People will provide judgment, intuition, empathy, a moral compass, and human creativity.”
If we mortals are to uphold our end of the human-computer symbiosis, if we are to retain a role as the creative partners of our machines, we must continue to nurture the wellsprings of our imagination and originality and humanity. That is what we bring to the party.
It raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership.
C. P. Snow was right about the need to respect both of “the two cultures”, science and the humanities.