by Susan Stepney

The mathematician Stanislaw Ulam is famously quoted as saying that using a term like nonlinear science is like referring to the bulk of zoology as the study of non-elephant animals (Campbell, Farmer, Crutchfield, Jen. "Experimental Mathematics: the role of computation in nonlinear science". Comms. ACM 28(4):374-384, 1985). Unconventional, or non-classical, computation is a similar term: the study of non-Turing computation. But how big is its associated field? Whereas zoology has the advantage of studying many kinds of non-elephants, non-classical computer scientists are currently restricted to studying a few relatively primitive, or dimly seen, or merely hypothesised, devices.

The elephant in the room here is the classical Turing machine, with its 70-plus years of detailed study. Some maintain that there are only elephants, and that the rest of us are like the blind men in the Indian parable, claiming we have found snakes, and ropes, and walls instead. However, I prefer the reversed parable, illustrated with a cartoon of several blind men groping at said snake, and rope, and wall, and confidently declaring, "yes, it’s an elephant!"

The classical Turing machine was developed as an abstraction of how human "computers", clerks following pre-defined and prescriptive rules, calculated various mathematical tables. The abstraction has proved extremely powerful, yet, as an abstraction, it omits certain features.

Physics
Despite being formulated post-relativity and post-quantum mechanics, classical Turing computation is essentially rooted in classical physics. This is understandable given the source of its abstraction, but today's nano-scale computer components are embodied in the quantum world, and considerable effort is expended to make them behave classically again. The study of quantum computing and quantum communication takes quantum weirdness seriously, building devices simply impossible to achieve in the classical Turing model. For example, John von Neumann noted that "Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin" (John von Neumann. "Various Techniques Used in Connection With Random Digits." In Householder et al., Monte Carlo Method, National Bureau of Standards, 1951), yet quantum systems are inherently probabilistic. And the features of General Relativity, where observers, and different parts of the computational device, may experience radically different proper times, have barely been addressed.
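
Von Neumann's point about arithmetical methods is easy to see in code. The following is a minimal illustrative sketch in Python (the particular generator and seed are arbitrary choices, not tied to any real system): a classical pseudo-random generator is completely determined by its seed, so its "random" digits can be replayed exactly, whereas a quantum measurement offers no seed to replay.

    import random

    def arithmetic_digits(seed, n=10):
        """Arithmetic pseudo-randomness: entirely determined by the seed."""
        rng = random.Random(seed)
        return [rng.randint(0, 9) for _ in range(n)]

    # The same seed always reproduces the same "random" digits -- von Neumann's
    # state of sin. A measurement on a quantum superposition has no such replay.
    assert arithmetic_digits(42) == arithmetic_digits(42)
    print(arithmetic_digits(42))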

Parallelism
Weird physics isn't the only property of the real world not addressed by the classical model. The real world is massively parallel, yet in the classical Turing model parallel computers are no more powerful than sequential ones. A single human clerk, following rules, can calculate exactly the same numerical tables as a whole army of clerks. Of course, the army can do it faster, but in the Turing model this advantage is swallowed up by simple space versus time tradeoffs. When we consider real-time embedded computer systems, interacting with the real world, issues of time become crucial, and parallelism has to be used to meet hard deadlines. One reason parallelism is still a relatively niche study is the enormous success in manufacturing computers: Moore's law (Gordon E. Moore. "Cramming more components onto integrated circuits". Electronics 38(8):114-117, 1965) has enabled sequential machine performance to grow fast enough to cater to our growing everyday needs, so only specialists have needed to go parallel. But recently, Moore's law has taken a new direction: instead of individual chips getting faster, they are going multi-core. If Moore's law continues in this direction, we will soon have kilo-core chips, and we now have to take this everyday parallelism seriously. This will benefit not just our everyday devices, but also massively parallel cluster-based computation, up to vastly parallel Avogadro-scale computation.
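
The equivalence of the army and the single clerk can be sketched in a few lines of Python (the local update rule below is a made-up placeholder): many workers each updating one cell of a table, and one worker visiting every cell in turn, produce exactly the same next table; the difference the Turing model ignores is how long each takes.

    from concurrent.futures import ThreadPoolExecutor

    def rule(state, i):
        """Illustrative local rule (an assumption for this sketch): each cell
        becomes the sum of itself and its right-hand neighbour."""
        return state[i] + state[(i + 1) % len(state)]

    def army_of_clerks(state):
        """Many workers, one cell each, all reading the same old table."""
        with ThreadPoolExecutor() as pool:
            return list(pool.map(lambda i: rule(state, i), range(len(state))))

    def single_clerk(state):
        """One worker visiting every cell in turn: same answer, N times the visits."""
        return [rule(state, i) for i in range(len(state))]

    state = [1, 2, 3, 4]
    assert army_of_clerks(state) == single_clerk(state)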

Interaction
The classical Turing model is "ballistic": we give the clerk a set of instructions, shut them in a room, and wait for the answer. Interactive computing is a "guided" model: there is a continual interactive feedback loop between the computation and the real world. The computation here is an open system of closely coupled reciprocal causation, constantly receiving input and changing its behaviour based on that input, constantly providing output and affecting its environment, and hence its own subsequent inputs, through that output.
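
The contrast can be sketched with two toy Python fragments (the room and thermostat below are hypothetical placeholders, not anyone's actual system): the ballistic version computes its answer from the input it was handed at the start, while the guided version loops, each output changing the environment and hence the computation's own next input.

    def ballistic(program, task):
        """Classical model: hand the clerk the task, shut the door, collect the answer."""
        return program(task)

    class Room:
        """Toy environment: cools by half a degree unless the heater is on."""
        def __init__(self, temperature=10.0):
            self.temperature = temperature
        def sense(self):
            return self.temperature
        def act(self, heater_on):
            self.temperature += 1.0 if heater_on else -0.5

    def guided(env, target=20.0, steps=50):
        """Interactive model: a continual feedback loop between computation and world."""
        for _ in range(steps):
            reading = env.sense()        # input from the environment
            env.act(reading < target)    # output back into the environment
        return env.sense()

    print(ballistic(sum, [1, 2, 3]))     # answer fixed once the input is given
    print(guided(Room()))                # answer emerges from the ongoing interaction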

Nature
The Turing model was inspired by a single paradigm: the actions of human clerks. Unconventional computation can be inspired by the whole of wider nature. We can look to physics (from general relativity and quantum mechanics, to annealing processes, analogue computing, and on to fully embodied computation), to chemistry (reaction-diffusion systems, complex chemical reactions, DNA binding), and to biology (bacteria, flocks, social insects, evolution, growth and self-assembly, immune systems, neural systems), to mention just a few.

Even if it turns out that the theoretical power of unconventional computers is no greater than that of classical ones, their practical power has enormous potential. New problems require new perspectives, new formulations, new conceptual frameworks, new mechanisms, new approaches. Unconventional computation offers these novel opportunities.

Susan Stepney
Department of Computer Science, and Centre for Complex Systems Analysis
University of York.
