Narratives around AI have gone mainly in one direction: Doomsday. Fear sells, even though AGI / ASI technology is impossibly far off. Why then do doomsday narratives still matter now?
Watch the talk on the power of storytelling and narratives on AI scenarios here:
We have to talk about how we talk about AI.
Is AI going to make society better or worse? More pleasant, fair, healthy, prosperous?
How you answer that question depends on your Narrative – the big-picture story you buy into.
I just gave a talk for AI+X Summit, Switzerland’s largest event around Artificial Intelligence. It was organised by three universities: the University of Zurich, ZHAW, and ETH (Einstein’s university).
My topic: “New narratives for our future with AI”.
We decided that was a better title than the alternative: “How to save the world”.
Why do we need New Narratives around AI?
For two years, headlines and our feeds have been filled with warnings.
Extinction. Artificial intelligence could surpass human control and lead to catastrophic consequences, e.g. by deciding the way to ‘save the planet’ is ‘eradicate humans’.
Enslavement. If macro-evolution is ultimate reality and might makes right, why not?
(un)Employment. 1-3 corporations will own the robots and rule the world.
Entertainment. But the good news is we get free drugs and VR headsets! (Panem et circenses)
Conversations about this with my host, AI professor Thilo Stadelmann, led to the audience’s most popular slide (most photographed):
Current Narratives around AI suck.
Imagine: 90% of the population are deemed a “useless class” and are given drugs and entertainment to pacify them. To keep them distracted from their lack of economic and societal purpose. Preventing unrest in a world where their contributions are no longer needed.
What I described is not the plot of a sci-fi movie.
It’s warnings from leading thinkers on important stages.
But that’s impossible!
Yes, or at least, AI researchers think it is impossibly far off, given current technology. Some think AGI (Artificial General Intelligence) is altogether impossible.
And AGI is still a long way from ASI (Superintelligence).
So until then… what’s the big deal?
Why worry?
Let the headlines vomit forth their doomsday warnings! We know better. We can chuckle. Right?
Wrong.
Vision determines Direction.
AI is a revolution in knowledge. So I want to take us back to another knowledge revolution – the printing press. Because there is no clearer example of how competing Narratives can lead to completely different results of the same technology.
Companies and whole countries are currently going through a kind of AI arms race. Scrambling to outdo each other for competitive advantage.
Now imagine having a 500-year head start on a revolutionary technology.
Rivals would never catch up!
That’s how far ahead the Chinese were. They invented the printing press a thousand years ago – 500 years before Gutenberg. A 500-year head start on the ability to mass disseminate knowledge. The ability to educate an entire population.
And you know what happened?
Nothing.
Why? Because of their Narrative.
Books – or bookshelves? Narrative determines.
It wasn’t a lack of books. The Chinese had plenty of books. In fact, as early as AD 823, Chinese monasteries had so many books that they invented rotating bookcases.
In the twelfth century, a Buddhist monk named Yeh Meng-te travelled through eastern China and reported that “in seven out of ten temples, one can hear the sound of the wheels of the revolving cases turning day and night.”
So you’d think these were highly studious people. But in actuality, the bookcases were not turning due to diligent scholarship…
As hardcore Buddhists, the monks had no incentive to study or add to the body of knowledge.
Because they did not believe in reality.
They thought rationality was what bound them to the illusion that this world is real. So the last thing they wanted to do was study this reality with their rationality. They wanted to silence their intellect and destroy their personality to experience Nirvana.
And that’s why they were turning the bookcases.
They weren’t reading the books.
They were meditating on the sound of the revolving bookcases!
Narrative = Your belief about Reality
A narrative is a big-picture story.
An explanation.
A big idea.
And because ideas have consequences, narratives give direction.
And this is why we should be concerned with doomsday narratives around AI.
But before we go there… Why was Europe different?
Why did the printing press spark a knowledge revolution in Europe?
Because Europeans had bought into a different narrative.
From Eden to Aristotle
The ancient Greeks believed in a Cosmos – ordered understandable reality – as opposed to a Chaos. The Bible gave Europe an even clearer narrative. In the Genesis creation account, God brought order out of chaos.
Human beings were made in God’s image and were called to do the same.
So instead of emptying our minds to escape reality, we should study reality so as to shape it. Earth is a garden. We are gardeners.
This leads us to what I am calling “Civilisation-Building Principle 1”: The universe is ordered (Reality is Reliable).
That’s why the printing press actually changed European society. We wanted to know what was in those books because it would help us make a difference in the world. This was one of a few key (narrative) developments that created what we call the modern world.
And behold, 500 years later, the modern world is grappling with the consequences of AI.
The ABC of Narratives: Why Doomsday Matters
As we saw with the printing press, the narrative is as important as, if not more important than, the invention. Inventions by themselves do not change society. The direction technology takes us depends on the direction we take technology.
(Case in point: social media.)
How likely is a negative AI scenario? What determines its likelihood? There’s an objective and a subjective component. We can differentiate between Assumptions and Conditions.
An assumption is a subjective belief about what is happening or might happen.
But what would have to be true for that assumption to happen? That’s the objective condition.
Assumption: AI is going to steal our jobs or kill/enslave us
Condition: ASI (Artificial Super-Intelligence)
So, everyone might believe that ASI is coming for their job or life (assumption), but we’re still a long way from even AGI. The technological condition is not met. So the subjective belief has nothing to do with reality. That’s what we call hype.
If you’re worried about an extinction scenario: Good news! AGI is impossibly far off.
But between the A and the C, there’s one more element of narratives… B for Buy-in. That is, how people embrace and act on the assumption (rather than the objective condition).
Assumption: AI is going to steal our jobs or kill/enslave us
Buy-in: centralization and surveillance
Condition: ASI (not met, but…)
So, the bad news?
Even if Terminator never arrives, we could still end up in a technological dystopia.
Because how people respond to the idea of a technology can be just as powerful as the technology itself.
Narratives can become Self-fulfilling prophecies
This leads us to D for Destination: authoritarian policies that cause the very societal harms they (allegedly) sought to avoid. (Enforced with AI.)
Ladies and gents, this is what we call a self-fulfilling prophecy.
The belief in Evil ASI can lead to a tech dystopia, not because of the technology itself, but because fear drives preemptive actions that create dystopian conditions.
Narratives matter because they are a vision.
This is why we urgently need Positive New Narratives for Our Future with AI.
In my talk, I proposed a core theme for this new narrative: Relational Individualism™, which combines the best of modern western individualism (human rights, freedoms) and traditional concern for community (relationships and responsibilities).
After all, aren’t we reading those history books?
—
Jyoti Guptara is a bestselling author and global keynote speaker. His Story-based consulting and training programs help leaders experience more success with less stress.
Invite Jyoti for an inspirational keynote talk at your next milestone event.