Hypercomputing concerns hypothetical extensions of the capabilities of traditional Turing machines: how to go beyond Turing computability; how to perform infinitely many steps within a finite amount of time (these are supertasks, see e.g. the Thomson lamp); and how to solve a mathematical problem that is unsolvable by Turing machines. Obviously, these are mainly thought experiments, but the main point is to exceed certain limits of Turing computation.

But we can turn in another direction and examine the limits of physically existing computers. A Turing machine is an abstract construction with two original aims: to model how a "human computer" solves a problem (when Turing's paper on Turing machines was published in 1936, computational processes were performed by humans with paper and pen), and to create a mathematical construction describing that process. This construction was adapted to electromechanical/digital computers around the end of WWII. In other words, it was based on an abstract conception of mathematics.

According to this approach, all Turing machines capable of solving a given problem are equal, since they give the same answer using the same logic, and it is indifferent whether one of them is a super-fast computer while the other is a slow mechanical model built from Tinkertoys.

But, unlike physical reality, mathematics does not contain time as a parameter, and this explains why one of the central problems of Turing-machine theory is the halting problem. The question is whether the machine ever halts on a given input at all; it is irrelevant whether we have to wait a second or a trillion years.

Penrose makes a distinction between hardware and software (The Emperor's New Mind, p. 24). The former is "the actual machinery involved in a computer" (circuits, processors, etc.), while the software "refers to the various programs which can be run on the machine". So Turing machines are equal with regard to their software; after all, they are based on the same logic. But they are not equal in the real world, since it is not the same to wait a second for a result as to wait longer than the Earth will exist. From a practical point of view, it is indifferent whether we do not know the answer because a Turing machine is unable to solve the problem or because the computation takes too long.
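This practical indistinguishability of "never halts" and "halts too late" can be sketched with a step-budgeted computation. The sketch below (function name and the use of the Collatz iteration are my illustrative choices, not from the text; whether Collatz always halts is itself an open problem) returns an answer only if the process finishes within the budget:

```python
def collatz_steps(n, budget):
    """Iterate the Collatz map from n; return the number of steps
    needed to reach 1, or None if the step budget runs out.

    A None result is ambiguous by design: we cannot tell "never
    halts" apart from "halts, but later than we could wait".
    """
    steps = 0
    while n != 1:
        if steps >= budget:
            return None
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27, 10))    # None: budget exhausted, no verdict
print(collatz_steps(27, 1000))  # 111: the same input, a larger budget
```

With a budget of 10 steps, the input 27 is indistinguishable from a non-halting computation; with a budget of 1000 it halts after 111 steps. A real machine always operates under some such budget.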

But making a distinction between the abstract mathematics of computing and real, physically existing machines makes it possible to introduce a neologism: undercomputing. Somewhat similarly to the concept of the Landauer number, it is about the limits determined by physical reality: while hypercomputing aims to exceed the traditional Turing machine, undercomputing takes into consideration the physical limits of every physically existing computer.

Bennett and Landauer [The Fundamental Physical Limits of Computation. Scientific American 253(1), 1985] studied some fundamental physical limits of information processing, pointing out, for example, that π does not have an exactly computed value in reality, since it cannot be calculated to a last decimal. So we can interpret computers as machines whose performing capacity is influenced by their hardware, which in turn is determined by physical laws.
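The point that any physical computation of π must stop after finitely many digits can be illustrated with a short sketch. The function name and the choice of Machin's formula are my own illustrative assumptions, not from the article; the parameter n is the hard cutoff that every real machine imposes somewhere:

```python
def pi_digits(n):
    """Return floor(pi * 10**n), i.e. pi truncated to n decimal
    places, using only exact integer arithmetic (Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239))."""
    guard = 10                       # extra digits absorbing truncation error
    scale = 10 ** (n + guard)

    def arctan_inv(x):
        # scale * arctan(1/x) via the alternating Taylor series
        total = 0
        term = scale // x            # scale / x**(2k+1) for k = 0
        k = 0
        while term:
            q = term // (2 * k + 1)
            total = total - q if k % 2 else total + q
            term //= x * x
            k += 1
        return total

    pi_scaled = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    return pi_scaled // 10 ** guard

print(pi_digits(20))  # 314159265358979323846
```

However large we make n, the computation halts after finitely many digits: the "last decimal" of π is never reached, only a cutoff chosen by the available hardware, time, and energy.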
