
Saturday, 29 August 2020

GapOfTheGods

I am aware of the God-of-the-gaps nature of definitions of intelligence, whereby something that a computer becomes able to do successfully, like chess, is removed from the canon of intelligent ability.  By this process intelligence becomes a melting iceberg, drifting towards the Equator, with a smaller and smaller area upon which we humans may stand.

Despite that, I would like to propose a new definition of intelligence:


Intelligence is the ability to moderate impulses by deliberation.


By impulses I mean, in human terms, emotions.  But also, at a lower level, I mean such phenomena as a single-celled organism swimming up a chemical gradient towards food.

Let me start by considering systems that are entirely emotional and that do not deliberate: computers.  Consider what happens when you run a Google search.  The Google machine is completely unable to resist its impulse to respond.  If you were to ask it, "What is the best way to subvert the Google search engine?" it would return you a list of websites that would be its very best effort to answer your query correctly.  All computer systems, including all current AI systems, are entirely driven by their irresistible emotional need to respond to input.

If you type something at Generative Pre-trained Transformer 3 (GPT-3) it will respond with coherent and rational text that may well be indistinguishable from human composition.  In that regard it is on its way to passing the Turing Test for intelligence.  But it cannot resist its emotional need to respond; the one thing you can guarantee is that, whatever you type at it, you will never get silence back.

But now suppose someone asked you, "What would be the best way for me to murder you?"  You would hesitate before answering and - if free to do so - would not answer at all.  And under compulsion you would frame a considered lie.

Everything that responds to input or circumstances, from a thermostat, through a computer, a single-celled organism, to a rat, then a person, has an impulse to respond in a certain way.  But the more intelligent the responder, the more the response is mediated by prior thought and mental modelling of outcomes.  The degree of modification of the response depends both on the intensity of the immediate emotion with which the response starts, and the intelligent ability of the responder to model the situation internally and to consider alternatives to what the emotion is prompting them to do.  If you picked up a hot poker, the emotional impulse to drop it would be well-nigh impossible to resist.  But if someone held a gun to your head you would be able to grit your teeth and to retain your grip.  However, the single-celled organism swimming towards food would not be able to resist, no matter what danger lay ahead.
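The trade-off described above - impulse intensity weighed against the deliberated cost of acting - can be caricatured in a few lines of code. This is only an illustrative sketch of the idea, not anything from the post itself; the function, its parameters, and the 0-to-1 scales are all invented for the example:

```python
def respond(impulse_strength, modelled_cost):
    """Act on an impulse, or suppress it after deliberation.

    impulse_strength -- intensity of the immediate urge, 0 to 1
                        (dropping a hot poker is near 1.0)
    modelled_cost    -- deliberated cost of acting, 0 to 1
                        (a gun held to your head while you grip it)
    """
    # A thermostat or a single-celled organism does no modelling:
    # its modelled_cost is effectively always 0, so its impulse
    # always wins and its response is always the same.
    if modelled_cost > impulse_strength:
        return "suppress impulse"
    return "act on impulse"

print(respond(0.9, 0.1))   # hot poker, no threat
print(respond(0.9, 0.95))  # hot poker, gun to head
```

On this caricature, the "intelligence" of the responder lives entirely in how well it can estimate `modelled_cost` before acting - which is precisely what today's systems are never asked to do.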

Today's AI systems are far cleverer than people in almost every specialised area in which they operate, in just the same way that a mechanical digger is better than a person with a shovel.  Computers are better than people at translating languages, playing Go or poker, or - of course - looking up information and references.  But once we see how they work, we know with complete certainty that such systems are not intelligent in the way that we are; even a near-Turing-Test-passing program like GPT-3 is not thinking in the same way that we do, because it cannot resist its impulse to do what it does.

We are not used to regarding the physics that drives computers to do exactly what they are programmed or taught to do as an emotion, but that is what it is.  If you see someone whom you find sexually attractive, you know it immediately, emotionally, and certainly; that is your computer-like response.  But what actions you take (if any) when prompted by that emotion are neither certain nor immutable. 

Note that I am not saying that computers are deterministic and we are not.  Nor am I saying that we have "free will" and they do not, because "free will" is a meaningless concept.  There is no reason to suppose that an AI system such as the current ones that work by machine learning could not be taught to moderate impulses in the same way that we do.

But so far that has not been done at all.

Finally, let me say that this idea makes evolutionary sense.  If our emotions were perfect guides to behaviour in all circumstances we would not need intelligence, nor even consciousness, with the considerable energy consumption that both of those require.  But both (using my definition of intelligence) are needed if an immediate emotional response to a situation is not always optimal and can be improved upon by thinking about it.  
