AI is an unfathomably powerful accelerator, carried along by relentless currents. Like all great advances, it is the product of a long progression, and each such advance has its ‘We are going to need a bigger boat!’ moment, instilling fear, awe and drama. Such leaps forward tend to be accompanied by social, wealth-generating change – from industrial factory towns to the vast AI-related growth industry of fact-checking software.
Unquantifiable benefits to humanity are already accruing from the application of AI. The change will be gargantuan, even if the world decides, as it has with astonishing, instinctively self-protecting unity in the case of nuclear weapons, that many of AI’s iterations will be too unpredictably species-threatening to use.
However, it is important to recognise that current AI systems, while powerful tools, cannot engage in the kind of flexible, context-dependent problem-solving that characterises human intelligence. The ability to participate in open-ended, improvisational communication is a key distinction between human and artificial intelligence.
There is nothing new about instructing machines to deliver individualised outputs at remarkable speed. My grandfather’s company launched the first electronic Jacquard loom in 1981. Punched cards had featured on looms for hundreds of years, but this was a breakthrough in efficiency for the production of complex patterned textiles – one that only increased the need for a wide array of enhanced human relationships across the production chains, as happens when industries scale and sharpen their focus through innovation.
In all this AI-related excitement, we will have forgotten whether we fired six shots at the classical and creative approach to learning, or maybe five. The clue’s in the name: by pulling the trigger on the humanities, we ditch humanity itself.
Context is everything. The emphasis, in independent school entrance tests, on measuring a child’s processing speed is strategically misguided. Why compete in this theatre against the overwhelming superiority of AI? Why not place greater emphasis on all the creative qualities and wit (in its true sense) that AI may never possess? Everyday moments of human communication are highly creative, and that capacity to solve problems through flexible, context-dependent interaction is precisely what sets human intelligence apart.
For the little it’s worth, my view is that terming digital technology ‘artificial’ is to belittle its potential as a new form of intelligence. To make something ‘artificial’ is to use human artistry to replicate a behaviour or object so that it is recognisable yet inauthentic – the etymology of ‘artificial’ being the Latin ‘ars’ (art) and ‘facere’ (to make). While artificial hips do the same job as organic ones, digital creations reach far beyond the limitations of the constructs they replace; they are too different to be ‘artificial’.
AI may already be cutting loose to create its own artifices, and will beat our ‘ars’ at its own game, with no patience for human-imposed artificiality. ‘Life is too short,’ AI might smugly say, with only finite means to understand the infinitely mysterious (call it spiritual if you like) artifice of our intelligence, connected as it probably is to fields of particle physics and science about which we know nothing. Can the mystery of consciousness be replicated by an intelligence that has no direct experience of life? Will it even be bothered to try?
The nature of meaning, and the role of analogical reasoning in language, still have a place in this new world. Words and concepts acquire meaning through analogical reasoning and context-dependent usage, rather than having fixed, essential meanings. The flexibility and adaptability of language arise from our natural communicative instincts and our ability to draw connections between seemingly disparate domains of experience – something that current AI systems will struggle to replicate, and why education remains of the utmost importance for the next generation.
There may eventually be a conjoining of biological and digital intelligence. Until such a time, how about renaming Artificial Intelligence ‘Digitificial Intelligence’?
By Charles Bonas.