On Artificial Intelligence and Machine Men
By Everett A Warren
March 14, 2004
Presented as part one of sixteen
[Edited to add: the entirety of this article can now be read in one place, over on Green Man Envy: here! The links referenced in the essay were surprisingly intact given almost a decade has passed since this was posted... there are some dead ones, and they're fixed over on GME!]
~ ~ ~
To get the obvious out of the way, and to make sure it is so for all readers: Artificial Intelligence, or AI, is not just the title of a Steven Spielberg film. It is a term for a computer system, or other machine, that can think for itself. As in the aforementioned production, AI is often associated with robotics, particularly with robots of humanoid structure – an effort to anthropomorphise machines in both appearance and behaviour.
Of course, recent computing journals have carried articles claiming that AI is a misnomer1 – after all, the argument goes, the computer can't truly possess intelligence; it is merely programmed to take some action in response to a particular occurrence. Whether or not that proves true – and I will shortly detail a related common misconception that may shed some light on it – the claim is worthy of some thought and analysis. As evidence of how far the lofty goal of AI has fallen, one might point to the common marketing of the simplistic algorithms behind computer-controlled opponents in gaming software as "enemy AI". Do not think for a moment that a modern-day video foe can think one whit more than the pixelated opponents of the early games – it merely has more data points and more complex looping behaviour, all of which amount to a set of coded rules it must follow.
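To make that point concrete, here is a hypothetical "enemy AI" sketched as a plain rule table – not taken from any particular game, just an illustration of what the marketing term usually hides:

```python
# A hypothetical "enemy AI": a fixed rule table, nothing more.
# The foe never thinks; it checks its state and follows a coded rule.

def enemy_action(distance_to_player, health):
    """Pick an action from fixed rules -- no learning, no reasoning."""
    if health < 20:
        return "flee"      # rule 1: low health -> run away
    if distance_to_player > 50:
        return "patrol"    # rule 2: player far off -> wander a route
    if distance_to_player > 10:
        return "chase"     # rule 3: player visible -> close in
    return "attack"        # rule 4: player in range -> attack

print(enemy_action(distance_to_player=60, health=100))  # patrol
print(enemy_action(distance_to_player=5, health=15))    # flee
```

More data points and deeper nesting make such a foe look cleverer, but every branch is still a rule someone coded out in advance.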
"Speak - it is our only hope."
Computers are digital; human minds are analogue. This has been stated many times, and presented as pure fact by far too many people to be untrue. In loose terms, analogue means "many states" and digital means "two states" – in other words, something analogue has a smooth range of possibilities, while digital, at least as computers use the word, is either on or off. Many insist that the difference is best expressed as the difference between a dimmer switch and a common wall switch. Computers, of course, speak binary – zeros and ones, the language of a two-state digital system. Humans can speak all manner of languages, pitches, and so forth: there's an upper and a lower limit, but we can pretty much sweep the whole range in between. Digital and analogue, yes?
No, not really.
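Before going further, the dimmer-versus-wall-switch contrast from the previous paragraph can be put into a few lines of code – an illustrative sketch, not a claim about any real circuit:

```python
# The dimmer switch vs. the wall switch, in code (an illustrative sketch).

def dimmer(level):
    """Analogue: any level in a smooth range from 0.0 to 1.0 survives."""
    return max(0.0, min(1.0, level))

def wall_switch(level):
    """Digital: the same input is forced into one of two states."""
    return 1 if level >= 0.5 else 0

print(dimmer(0.37))       # 0.37 -- the in-between value survives
print(wall_switch(0.37))  # 0    -- the in-between value is lost
```

The wall switch throws away every in-between possibility; the dimmer keeps them all. That is the whole analogue/digital distinction as it is usually told.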
Look closer at the two-state system of computers and you may realise that a supply of 5 volts or 12 volts or some such must be maintained, and that a transistor will happily let current flow at many different levels depending on its triggering voltage – it is an artificial constraint that requires only one of two voltages, either the high or the low, be applied to the switch. The logic systems the technology relies on require that these zeros and ones be maintained through force – in some regards, the system would much rather be a dimmer switch. After all, the power that feeds it arrives as alternating current, which doesn't sit still: it swings between one extreme and the other in 60 hertz cycles. Before a modern microchip can do anything, that untamed wave has to be rectified, clipped, and strapped down into the steady voltage levels that translate to an on or off state.
The human mind, of course, is purely analogue, yes? No, to that as well? The mind, like a computer, is made up of a great many components that react to electrical signals. Although it's not a neat and clean analogy, consider the neuron to be the transistor of the mind. Each neuron is either on or off – a neuron doesn't fire half-way, or part-way – so it would seem that the neuron is the naturally digital one, and that the transistor, which can pass current at any level, actually starts with the analogue advantage. With a transistor, you have to artificially control the input "switch" so that it gets either a high or a low blast of voltage, and thus behaves in a binary fashion. The neuron, however, natively offers only two states. The analogue effect that we experience is achieved through the quantity of neurons that fire in response to an event.
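That last idea – a smooth, graded response built out of strictly on-or-off parts – can be sketched as well. This is a gross simplification of real neuroscience, with every number invented for illustration: a pool of binary "neurons", each with its own random trigger threshold, whose collective firing rate varies smoothly with the stimulus.

```python
import random

# Population coding, crudely sketched: each "neuron" is strictly on or off,
# yet the population's response to a stimulus is smoothly graded --
# an analogue effect built from digital parts.

random.seed(42)
NEURONS = 1000
thresholds = [random.random() for _ in range(NEURONS)]  # each neuron's trigger level

def population_response(stimulus):
    """Fraction of all-or-nothing neurons that fire for a stimulus in [0, 1]."""
    return sum(1 for t in thresholds if stimulus >= t) / NEURONS

for s in (0.1, 0.5, 0.9):
    print(s, population_response(s))  # response climbs smoothly with the stimulus
```

No single neuron ever fires "a little bit", yet the population as a whole sweeps a near-continuous range – which is the sense in which the analogue experience rides on two-state components.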
Keep that in mind, we're going to jump around a bit.
~ ~ ~
1. Neville Holmes – Artificial Intelligence: Arrogance or Ignorance? – IEEE Computer Magazine – November 2003 (Vol. 36 No. 11) – pp. 120, 118-119 – The Profession
~ Questions and discussion welcome! ~
~ ~ ~