An Experimental Approach to Algorithmic Information Theory
H. Zenil, F. Soler-Toscano, J.-P. Delahaye and N. Gauvrit, 'Evaluating Kolmogorov Complexity: An Alternative to Compression Algorithms', Springer, forthcoming mid 2013.
Selected EAIT Papers
- F. Soler-Toscano, H. Zenil, J.-P. Delahaye and N. Gauvrit, 'Calculating Kolmogorov Complexity from the Frequency Output Distributions of Small Turing Machines', submitted.
- J.-P. Delahaye and H. Zenil, 'Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness', Applied Mathematics and Computation, vol. 219, pp. 63-77, 2012. DOI:10.1016/j.amc.2011.10.006.
- H. Zenil, J.-P. Delahaye and C. Gaucherel, 'Image Characterization and Classification by Physical Complexity', Complexity, vol. 17, no. 3, pp. 26-42, 2012 (online 2011). DOI:10.1002/cplx.20388.
- H. Zenil and J.-P. Delahaye, 'On the algorithmic nature of the world', in M. Burgin and G. Dodig-Crnkovic (eds), Information and Computation, World Scientific, 2010.
- J.-P. Delahaye and H. Zenil, 'On the Kolmogorov-Chaitin complexity for short sequences', in C.S. Calude (ed), Randomness and Complexity: From Leibniz to Chaitin, World Scientific, 2007.
- H. Zenil, 'Compression-based investigation of the dynamical properties of cellular automata and other systems', Complex Systems, vol. 14, no. 2, 2010.
- H. Zenil and J.-P. Delahaye, 'An algorithmic information-theoretic approach to the behaviour of financial markets', submitted by invitation to the themed issue on Nonlinearity, Complexity and Randomness, Journal of Economic Surveys, 2011.
- J.J. Joosten, F. Soler-Toscano and H. Zenil, 'Program-size versus Time Complexity: Slowdown and Speed-up Phenomena in the Micro-cosmos of Small Turing Machines', International Journal of Unconventional Computation, special issue on Physics & Computation, 2011.
What is EAIT (or the Algorithmic Nature Program)?
We have coined the term Experimental Algorithmic Information Theory (EAIT) to emphasize the experimental character of this research. EAIT is a multidisciplinary research program seeking to connect theoretical computer science to the real world by way of computer experiments.
Andrei Kolmogorov, together with Ray Solomonoff, Leonid Levin and Gregory Chaitin, created the field of Algorithmic Information Theory (AIT), sometimes referred to as program-size, algorithmic or simply Kolmogorov complexity. Chaitin has been a particularly forceful advocate of a quasi-empirical approach to mathematics, one that involves performing experiments and running programs, aiming to initiate a paradigm shift in the way mathematics is done; he calls this approach meta-mathematics. This approach is consonant with the one proposed in Stephen Wolfram's A New Kind of Science (NKS), which spurred the exploration of what Wolfram has called the Computational Universe. Within this framework, we have been developing a hands-on approach to Algorithmic Information Theory.
Our research program is focused on:
- The calculation and evaluation of program-size complexity for finite strings, particularly short strings, which are too short to be handled by compression algorithms, and the provision of a stable definition of program-size complexity that is as independent as possible of additive constants.
- The evaluation of the logical depth of certain strings, particularly short ones, by exhaustive means and independently of additive constants, and the provision of a stable general definition of logical depth.
- An information-content approach to the notions of "naturalness" and "exactness" for a model of computation (or actual machine implementation), namely a programming language or a universal Turing machine.
- Trade-offs between complexity measures, particularly program-size and computational time complexity.
- The investigation of all forms of irreducibility results for deterministic systems, from the halting problem to intractability, nonlinear phenomena, classical unpredictability and quantum uncertainty, in connection with the problem of induction and epistemology in general.
- The thorough study of the output distributions of certain classes of computation, including runtimes.
- The comparison of output probability distributions across as many models of computation as possible (Turing machines, cellular automata, Post tag systems, rewriting systems, SK combinators, register machines, finite automata, etc.) and physical sources of information such as random images, sounds and DNA sequences.
- The study of possible statistical correlations and probable mathematical convergence of output distributions.
- The devising of actual applications, from providing universal distribution tables for compression and other purposes to providing useful Bayesian prior distributions.
- Real-world applications of the theory of algorithmic complexity, from the behavior of financial markets to the characterization and classification of images.
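The coding-theorem idea behind the first items above can be illustrated with a small sketch. The toy interpreter below is an assumption made for illustration only, not the actual enumeration of small Turing machines used in the papers listed here: it enumerates every short program for a fixed toy machine, tallies how often each output string is produced, and estimates K(s) as -log2 of that frequency. This is exactly the regime where compression-based approximations fail, since a compressor's header overhead swamps a string of a few bits.

```python
from collections import Counter
from itertools import product
from math import log2

def run(program):
    """Toy interpreter: read the program two bits at a time.
    00 -> write 0, move right; 01 -> write 1, move right;
    10 -> move left; 11 -> halt. Output is the written tape."""
    tape, pos = {}, 0
    for i in range(0, len(program), 2):
        op = program[i:i + 2]
        if op == "00":
            tape[pos] = "0"
            pos += 1
        elif op == "01":
            tape[pos] = "1"
            pos += 1
        elif op == "10":
            pos -= 1
        else:  # "11": halt
            break
    if not tape:
        return ""
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "0") for i in range(lo, hi + 1))

# Exhaustively enumerate every program of even length up to 10 bits
# and tally the frequency of each output string.
counts = Counter()
total = 0
for n in range(2, 11, 2):
    for bits in product("01", repeat=n):
        counts[run("".join(bits))] += 1
        total += 1

def K_estimate(s):
    """Coding-theorem estimate: K(s) ~ -log2 m(s), where m(s) is the
    fraction of programs producing s."""
    return -log2(counts[s] / total)

# Simpler strings are produced by more programs, hence lower estimated K.
print(K_estimate("0"), K_estimate("010"))
```

The same recipe, run over genuine enumerations of small Turing machines, is what yields the frequency-based evaluations of Kolmogorov complexity described above; the output distribution doubles as an empirical approximation to a universal distribution.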
Who is involved in EAIT?
Hector Zenil started the project as a graduate student of Jean-Paul Delahaye at the computer science laboratory of the University of Lille. The research program formally began with Hector Zenil's PhD thesis in 2006. Delahaye and Zenil's main inspiration and theoretical basis came from the work of Leonid Levin, Greg Chaitin, Ray Solomonoff, Cristian Calude, Ming Li, Paul Vitanyi, Walter Kirchherr and Stephen Wolfram.
Background and motivation
Since the days of Newton and Leibniz, scientists have been constructing elaborate views of the world. In the past century, quantum mechanics revolutionized our understanding of physical reality at atomic scales, while general relativity did the same for our understanding at large scales. Beyond the fact that these two general theories have not been reconciled with each other, mainly due to the linearity of the former as opposed to the nonlinearity found in dynamics at all other scales, our current understanding of quantum mechanics suggests that randomness is an intrinsic property of physical reality (at least at some scale), while the study of nonlinear dynamics has shown that even classical mechanics has unpredictability at its core.
Some contemporary scientists have recently come up with theories in which nature is seen as processing information, computing the laws of physics and everything we see around us, including all sorts of complex things like life. In this view, the universe would be computing itself, and our computers would therefore be doing nothing but reprogramming a part of the universe to make it compute what we want to compute.
Some of these views support the idea that nature is, or can be better studied if seen as, performing (only) digital computations. Fredkin, for example, assumes that the universe is describable by digital information, and that it can therefore be conceived either as the output of some computer program or as some sort of vast digital (Turing-complete) machine. Wolfram, on the other hand, claims that the world may turn out to be produced by a set of very simple rules, computed over a long time, capable of generating the great complexity we see around us. If so, Wolfram suggests, one can go hunting for our universe, that is, for the very program that generates it.
John Archibald Wheeler's "It from bit" and the following remark by Feynman illustrate the early anticipation of this digital shift toward understanding physics in terms of information.
"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time ... So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequerboard with all its apparent complexities."
Richard Feynman in 1964,
from one of the six Messenger lectures he delivered at Cornell University
"It is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer. It from bit. Otherwise put, every 'it'—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. 'It from bit' symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe."
John Archibald Wheeler in 1990,
from his paper "Information, physics, quantum: The search for links".
There are also those who believe that nature performs not merely digital computations but quantum computations (e.g. David Deutsch, Charles Bennett, Seth Lloyd), namely because our world is meant to carry out quantum processes according to the current model of physics, with digital computation being the handicapped part (after decoherence) of a quantum computation, in Bennett's words. Some theories under development attempt to unify these views based on the work of 't Hooft, Susskind and Smolin on quantum gravity (e.g. Computational Loop Quantum Gravity).
These contemporary views of a computational universe are deeply related, through the concept of information, to a contemporary mathematical field of research called algorithmic information theory (AIT). Researchers in AIT (e.g. Greg Chaitin, Leonid Levin) think that the true nature of nature can only be explained through the study of randomness. The study of randomness leads to definitions of complexity and information in algorithmic terms.
What we think, and what our research program claims, is that if the world is truly computing, or is itself a computer (digital or quantum), it should follow the same algorithmic laws that computers do, such as producing output in accordance with the frequencies that algorithmic probability predicts.
We are interested in the question of whether nature is, or behaves as, such a computer, as an information processor. We think the question is currently within the scope of modern scientific research and it is therefore a falsifiable statement. We think there may be evidence to be had from a comparison between the outcome of the world itself and the expected output of purely digital computation.
Along the way, we have been connecting the theory to the real world, from experimental approaches to the theory, such as the evaluation of the program-size complexity of strings by exhaustive means, to the algorithmic study of the behavior of financial markets, to applications of the concept of logical depth to image characterization and classification.
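The compression-based side of these applications can be sketched with the normalized compression distance (NCD) of Li and Vitanyi, cited below. The snippet uses zlib as a rough, assumed stand-in for whatever compressor a given application would employ; it is an illustration of the measure, not the exact method of the papers above.

```python
import zlib
import random

def C(b: bytes) -> int:
    """Approximate the Kolmogorov complexity of b by its compressed length."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: close to 0 for similar data,
    close to 1 for unrelated data."""
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

random.seed(0)
periodic = b"ABAB" * 200
shifted = b"BABA" * 200                          # same underlying pattern
noise = bytes(random.getrandbits(8) for _ in range(800))

print(ncd(periodic, shifted))  # small: the two strings share structure
print(ncd(periodic, noise))    # larger: nothing is shared
```

A distance of this kind can be fed to any standard clustering algorithm to group sequences, market signals or images by shared structure; in the same spirit, Bennett's logical depth can be approximated by decompression time rather than compressed length.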
Other online resources
- Chaitin, G.J., Algorithmic Information Theory. Third Printing, Cambridge University Press, 1990.
- Calude, CS. (ed), Randomness and Complexity, from Leibniz to Chaitin, World Scientific, Singapore, 2007.
- Calude, CS. Information and Randomness. An Algorithmic Perspective, 2nd Edition, Revised and Extended, Springer-Verlag, 2002.
- C. S. Calude, M. J. Dinneen. Exact approximations of omega numbers, Int. Journal of Bifurcation & Chaos, 17, 6 (2007), 1937-1954.
- Turing, A.M., On Computable Numbers, with an Application to the Entscheidungsproblem. Proceedings of the London Mathematical Society, ser. 2. vol. 42 (1936-7), pp.230-265; corrections, Ibid, vol 43 (1937) pp. 544-546.
- "Turing Machine", The Stanford Encyclopedia of Philosophy (Spring 2002 Edition), Edward N. Zalta (ed.).
- Hilbert, D., Mathematical Problems: Lecture Delivered before the International Congress of Mathematicians at Paris in 1900, translated by Mary Newson, Bulletin of the American Mathematical Society 8 (1902), pp. 437-479; HTML version by D. Joyce available online.
- Gödel, K., On Formally Undecidable Propositions of Principia Mathematica and Related Systems. Monatshefte für Mathematik und Physik, 38 (1931), pp. 173-198. Translated by Martin Hirzel.
- Copeland, B. Jack, "The Church-Turing Thesis", The Stanford Encyclopedia of Philosophy (Fall 2002 Edition), Edward N. Zalta (ed.).
- Hodges, Andrew, "Alan Turing", The Stanford Encyclopedia of Philosophy (Summer 2002 Edition), Edward N. Zalta (ed.).
- Minsky, M., "Size and Structure of Universal Turing Machines," Recursive Function Theory, Proc. Symposia in Pure Mathematics, vol. 5, American Mathematical Society, 1962, pp. 229-238.
- Ming Li and Paul M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications (Texts in Computer Science), Springer, 1997.
- Rogozhin, Y. "Small Universal Turing Machines." Theoretical Computer Science 168 iss. 2, 215-240, 1996.
- Wolfram, S. A New Kind of Science. Champaign, IL: Wolfram Media, pp. 706-711, 2002.
The Experimental Algorithmic Information Theory Project
by Hector Zenil
Hector Zenil <hectorz[at]alumni.cmu.edu>
Laboratoire d'Informatique Fondamentale de Lille and IHPST (Paris 1)
WWW URL: "http://zenil.mathrix.org"