the machines : limits

Relating mostly to Agents, this is a general view of the limitations of machines in terms of intelligence and computational ability. The basic theory is that machines are more or less like us: whatever enables them to be sentient also gives them abilities and restraints similar to ours. In short, machines are very human... created in humanity's image, after all. Also related: Postulates.



1. Agents miss. A very long discussion can stem from this, but the view of this project is that Agents do not have the capacity for instant "computer-like" ballistics calculations any more than humans do. They miss Trinity, Neo while he is sitting, and Morpheus.

2. Agents use language. More specifically, with each other. They do not have instantaneous transmission of ideas or a more efficient machine language; instead they use the relatively slow and imprecise human language.

3. Machines make mistakes. Entire crops were lost in the first, paradise Matrix, for example. The Agents make several mistakes themselves; they can hesitate or be surprised.

4. Machines have emotions. Smith clearly does; Jones and Brown flee from Neo, and The Animatrix makes it even clearer. Emotion can be an advantage as well as a limitation, depending on how much it controls you.

5. Machines are creative. They can hold differing views, which means they don't always reach a single discrete solution to an issue. Such creativity is powerful, but it introduces ambiguity as well. Once subjective interpretation is involved, there is no guarantee that purely rational actions will always result.



If you could be a cold, purely logical state machine, why inherit the weaknesses of human sentience? Good question, but it comes down to what you were created to do. A Doc Bot or Harvester does well as a state machine. However, an Agent, who interacts regularly with humans, must have a neural-net-type mind to pass the Turing Test. The machines at the top of the hierarchy probably required creativity to some extent, so it's doubtful they were "dumb" state machines either.
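To make the contrast concrete, here is a minimal sketch (in Python, with hypothetical states and events, not anything from the films) of the rigid kind of state machine a Doc Bot or Harvester could be: every situation maps to exactly one response, with no room for interpretation.

```python
# A minimal finite state machine: each (state, event) pair maps to
# exactly one next state -- no interpretation, no creativity.
# The states and events are hypothetical illustrations.

TRANSITIONS = {
    ("idle", "pod_alert"): "inspect",
    ("inspect", "pod_ok"): "idle",
    ("inspect", "pod_failure"): "flush",
    ("flush", "done"): "idle",
}

def step(state, event):
    """Advance the machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["pod_alert", "pod_failure", "done"]:
    state = step(state, event)

print(state)  # always ends in a single, predictable state: "idle"
```

Given the same sequence of events, this machine always lands in the same state — which is exactly why, per point 5 above, it could never produce the ambiguous, subjective behavior an Agent needs.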

Another misconception is the belief that a being made of code MUST be able to make instantaneous calculations. This is like saying that we, as chemical machines, should be able to as well. Sentient beings are more than the sum of their parts, which means they do not necessarily have access to the lower-level processes that enable their sentience. A human analogy would be to say that we should be able to spontaneously split into two copies just as our cells can.