Delvenne, Jean-Charles
[UCL]
"Dynamics" is very roughly the study of how objects change in time; for instance whether an electrical circuit goes to equilibrium, due to thermal dissipation. By "information", we mean how helpful it is to observe an object in order to know it better, for instance how many binary digits we can acquire on the value of a voltage by an appropriate measure. A "computation" is a physical process, e.g. the flow of current into a complex set of transistors, that after some time eventually gives us the solution of a mathematical problem (such as "Is 13 prime?"). We are interested to various relations between these concepts.
In the first chapter, we unify several arguments from the literature to show that a whole class of quantities associated with dynamical systems is uncomputable, for instance the topological entropy of tilings and of Turing machines.
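For reference, a standard definition that such results concern (not spelled out in the abstract): for a one-dimensional symbolic system X, the topological entropy is the exponential growth rate of the number of admissible words,

    h(X) = lim_{n→∞} (1/n) log |B_n(X)|,

where B_n(X) is the set of words of length n appearing in X. The uncomputability claim is that no algorithm can compute this quantity from a finite description of such a system.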
In the second chapter, we propose a precise meaning for the statement "This dynamical system is a computer", at least for symbolic systems such as cellular automata. We also show, for instance, that a "computer" must be dynamically unstable, and can even be chaotic.
In the third chapter, we compare how complicated it is to control a system depending on whether we can acquire information about it ("feedback") or not ("open loop"). We are specifically interested in finite-state systems.
In the last chapter, we show how to control a scalar linear system when only a finite amount of information can be acquired at each time step.
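A minimal sketch of the kind of problem involved, under assumptions of our own (scalar dynamics x_{t+1} = a*x_t + u_t and a uniform quantizer; the control scheme studied in the thesis may differ): the state can be driven to zero as long as the number of bits acquired per step is large enough relative to log2|a|.

    # Toy illustration (our own, not the thesis's scheme): stabilize
    # x_{t+1} = a*x_t + u_t when the controller only receives a quantized
    # version of the state, i.e. a finite number of bits per time step.

    a = 2.0        # unstable open-loop gain (|a| > 1)
    bits = 2       # bits of information acquired per step; need 2**bits > |a|
    levels = 2 ** bits

    def quantize(x, box):
        """Index of the sub-interval of [-box, box] that contains x."""
        x = max(-box, min(box, x))
        idx = int((x + box) / (2 * box) * levels)
        return min(idx, levels - 1)

    def decode(idx, box):
        """Midpoint of the sub-interval with the given index."""
        return -box + (2 * box) * (idx + 0.5) / levels

    x, box = 5.0, 10.0                  # true state, known bound |x| <= box
    for t in range(20):
        idx = quantize(x, box)          # controller receives 'bits' bits
        x_hat = decode(idx, box)        # its estimate of the state
        u = -a * x_hat                  # cancel the estimated growth
        x = a * x + u                   # true (unobserved) dynamics
        box = a * box / levels          # uncertainty shrinks iff 2**bits > |a|
        print(t, round(x, 6), round(box, 6))

In this sketch the interval known to contain the state shrinks by a factor a/2^bits = 1/2 per step; with only one bit per step it would not shrink at all, which gives the flavour of the information-rate constraint at stake.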
Bibliographic reference: Delvenne, Jean-Charles. Dynamics, information and computation / Dynamique, information et calcul. Supervisor: Blondel, Vincent.
Permanent URL: http://hdl.handle.net/2078.1/5166