A bit of cryptography, password strength, and sysadmins of savings banks

This recent webcomic from xkcd, Password Strength, relates to what I have been investigating lately: the question of whether there need be a tradeoff between password strength and ease of recall, approached, as usual, through my knowledge of finite algorithmic information theory. While my conclusions will be presented in a paper in the next few months or years, the webcomic […]
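The comic's point can be checked with a back-of-the-envelope entropy calculation. The sketch below is illustrative only: the 2048-word list and the 94-symbol alphabet are assumptions for the comparison, not figures from the post.

```python
import math

# A password drawn uniformly from a space of N possibilities carries
# log2(N) bits of entropy.

# (a) a passphrase of four words chosen at random from a 2048-word list
words_entropy = 4 * math.log2(2048)   # 4 * 11 = 44 bits

# (b) an 8-character password over ~94 printable ASCII symbols; taking
# the full alphabet is an upper bound, since real choices follow
# predictable substitution habits ("P@ssw0rd") and are far less random
chars_entropy = 8 * math.log2(94)     # ~52.4 bits at most

print(f"random 4-word passphrase:        {words_entropy:.1f} bits")
print(f"8-char password (upper bound):   {chars_entropy:.1f} bits")
```

The passphrase is close to the unattainable upper bound of the short password while remaining far easier to recall, which is the tradeoff the post questions.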

An alternative method (to compression) for approximating the algorithmic complexity of strings

The method introduced in my doctoral dissertation was featured in Pour La Science, the French edition of Scientific American, in its July 2011 issue No. 405, under the title Le défi des faibles complexités (the challenge of low complexities). Jean-Paul Delahaye points out that: "Like very short durations or lengths, low complexities are delicate to evaluate. Paradoxically, the […]"

“The World is Either Algorithmic or Mostly Random” awarded a 3rd Place Prize in this year’s FQXi contest

Based on the combined ratings of the contest community and the panel of expert reviewers appointed by the FQXi, which included members of the institute, I was awarded a 3rd Place Prize for my work The World is Either Algorithmic or Mostly Random in this year's FQXi contest on the topic Is Reality Digital […]

New book: “Lo que cabe en el espacio” on Philosophy of Space and Geometry

I am excited to announce the publication of my new book, written in Spanish, Lo que cabe en el espacio, on the philosophy of space in connection with our reality, and what we can or cannot do with it and in it. The book, under the title “Lo que cabe en el espacio: La geometría […]

The Lighthill Parliamentary Debate on General Purpose Artificial Intelligence

In 1973, Lucasian professor at Cambridge, James Lighthill, was asked by the British Parliament to evaluate the state of AI research in the United Kingdom. His report, now called the Lighthill report, criticized the failure of AI to achieve its grandiose objectives. He specifically mentioned the problem of “combinatorial explosion” or “intractability” of the discourse […]

Is Faster Smarter? IBM’s Watson Search Engine Approach to Beat Humans

IBM’s computer named “Watson” has beaten (human) Jeopardy! contestants in a series of games this month. IBM has a long history of innovations (watch this other 100th anniversary documentary, featuring Gregory Chaitin and Benoit Mandelbrot among others, here). Not everybody was impressed by Watson, though. According to Gavin C. Schmitt, who interviewed Noam Chomsky, the […]

Randomness Through Computation

The official announcement of the book I edited has been released: RANDOMNESS THROUGH COMPUTATION Some Answers, More Questions edited by Hector Zenil 450pp (approx.) ISBN: 978-981-4327-74-9 The book will be available next month (February 2011) from Amazon, Borders and other large online bookstores, as well as on the publisher’s (World Scientific and Imperial College Press) […]

Compression-based Investigation of Cellular Automata, A Phase Transition Coefficient and a Conjecture Related to Universal Computation

In a recent paper, forthcoming in the journal Complex Systems, vol. 19, I present a method for studying the qualitative behavior of cellular automata and other abstract computing machines, based on approximating their program-size complexity with a general lossless compression algorithm. I show that the compression-based approach classifies cellular automata (CA) into clusters according to their heuristic behavior, and that these clusters correspond to Wolfram’s main classes of systemic behavior. I also present a Gray code-based numbering scheme for initial conditions that is optimal for this kind of investigation, and a compression-based method for estimating a characteristic exponent, in the form of a phase transition coefficient measuring the resiliency or sensitivity of a system to its initial conditions. I also conjecture that universal systems have large transition coefficients.
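The core idea can be sketched in a few lines. This is a minimal toy version, not the paper's actual procedure: it uses zlib as the general-purpose lossless compressor and a handful of illustrative elementary CA rules, with the compressed length of the space-time diagram standing in for program-size complexity.

```python
import zlib

def eca_evolution(rule, width=256, steps=256):
    """Evolve an elementary cellular automaton from a single black cell
    and return its space-time diagram as a byte string of 0s and 1s."""
    table = [(rule >> i) & 1 for i in range(8)]
    row = [0] * width
    row[width // 2] = 1
    out = bytearray()
    for _ in range(steps):
        out.extend(row)
        # Wolfram numbering: neighborhood (left, center, right) indexes the rule table
        row = [table[(row[(i - 1) % width] << 2) | (row[i] << 1) | row[(i + 1) % width]]
               for i in range(width)]
    return bytes(out)

def compressed_size(rule):
    """Proxy for program-size complexity: length of the compressed diagram."""
    return len(zlib.compress(eca_evolution(rule), 9))

# Simple (class 1/2) rules compress far better than complex (class 3/4) ones:
for rule in (0, 250, 90, 30, 110):
    print(f"rule {rule:3d}: {compressed_size(rule)} bytes")
```

Clustering the rules by these compressed sizes already separates trivial behavior (rule 0) from the pseudorandom rule 30 and the complex rule 110, which is the separation the paper studies systematically.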

Classifying objects by complexity

We present a method for estimating the complexity of an image based on the concept of Bennett’s logical depth. We use this measure to classify images by their information content. The method provides a means for evaluating and classifying objects by way of their visual representations.
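Bennett's logical depth is, roughly, the time a near-shortest description takes to reproduce its object. A crude stand-in, under the assumption that a compressed file approximates a short description, is to time a general-purpose decompressor; the paper works with images, while this toy uses byte strings and zlib.

```python
import os
import time
import zlib

def decompression_time(data, repeats=50):
    """Toy proxy for logical depth: average time for a general-purpose
    decompressor to reproduce the object from its compressed form."""
    comp = zlib.compress(data, 9)
    start = time.perf_counter()
    for _ in range(repeats):
        zlib.decompress(comp)
    return (time.perf_counter() - start) / repeats

# Both extremes should come out shallow: a trivially regular object
# (short description, quick to unfold) and a random-looking one
# (its shortest description is essentially itself, nothing to unfold).
shallow = bytes(100_000)            # all zeros
random_like = os.urandom(100_000)   # incompressible-looking
print(decompression_time(shallow), decompression_time(random_like))
```

Organized, structured objects are the ones expected to score high on such a measure, which is why it can separate "complex" images from both blank and noisy ones where plain compressed size cannot.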

Comments on Turing’s very first Universal machine, approaching Turing’s 100th birthday

The idea that a machine could perform the tasks of any other machine is the description of a Universal (Turing) machine. Its invention is considered by many to have been one of the major landmarks giving rise to the field of computer science. ‘Universal’ means that one can ‘program’ a general-purpose machine to perform the […]

Stephen Hawking: A brief examination of the recent warning over alien civilizations

Stephen Hawking asserts that while aliens almost certainly exist, humans should avoid making contact. The original story published by BBC News can be found here. He claims: “We only have to look at ourselves to see how intelligent life might develop into something we wouldn’t want to meet.” Stephen Hawking’s recent assertion looks like an […]

On the Algorithmic Nature of the World

In a new paper I’ve coauthored with Jean-Paul Delahaye, we propose a test based on the theory of algorithmic complexity and an experimental evaluation of Levin’s universal distribution to identify evidence in support of or in contravention of the claim that the world is algorithmic in nature. To this end we have undertaken a statistical […]
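The flavor of such an experimental evaluation can be sketched as follows. This is an illustrative stand-in, not the paper's setup: it runs every elementary cellular automaton rule (rather than small Turing machines) from the simplest initial condition and tallies the output strings, the frequency of a string across all these small programs serving as an empirical approximation of its algorithmic probability under Levin's universal distribution.

```python
from collections import Counter

def eca_step(row, rule):
    """One step of an elementary cellular automaton (Wolfram numbering)."""
    table = [(rule >> i) & 1 for i in range(8)]
    n = len(row)
    return [table[(row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]]
            for i in range(n)]

# Run all 256 rules from a single black cell for a few steps and record
# the 8-cell window around the origin; strings produced by many small
# programs get high empirical frequency, i.e. high algorithmic probability.
counts = Counter()
for rule in range(256):
    row = [0] * 17
    row[8] = 1
    for _ in range(8):
        row = eca_step(row, rule)
    counts["".join(map(str, row[4:12]))] += 1

for s, c in counts.most_common(3):
    print(s, c)
```

The resulting ranking (simple strings first, random-looking strings rare) is the kind of distribution one can then compare statistically against frequencies of patterns found in empirical data.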

Evaluating the complexity of a living organism by its algorithmic complexity

One of the greatest scientific achievements of the last century was the understanding of life in terms of information. We know today that the information for synthesizing the molecules that allow organisms to survive and replicate is encoded in the DNA. In the cell, DNA is copied to messenger RNA, and triplet codons in the […]

Physics-like computation, Wolfram’s PCE and Church’s thesis

The lack of correspondence between the abstract and the physical world seems sometimes to suggest that there are profound incompatibilities between what can be thought and what actually happens in the real world. One can ask, for example, how often one faces undecidable problems. However, the question of undecidability has been considered to be better […]

The Shortest Universal Turing Machine Implementation Contest

The Shortest Universal Turing Machine Implementation Contest ANNOUNCEMENT, 23 Dec 2008 http://www.complexitycalculator.com/experimentalAIT/TuringMachine.html Contest Overview In the spirit of the busy beaver competition, though related to program-size complexity, we are pleased to announce the “Shortest Universal Turing Machine Implementation Contest”. The contest is open-ended and open to anyone. To enter, a competitor must submit a […]

The 2008 Midwest NKS Conference: What is computation? How does nature compute?

2008 Midwest NKS Conference: Call for Papers and/or Participation GENERAL ANNOUNCEMENT AND CALL FOR PAPERS What is computation? (How) does nature compute? 2008 Midwest NKS Conference Fri Oct 31 – Sun Nov 2, 2008 Indiana University — Bloomington, IN http://www.cs.indiana.edu/~dgerman/2008midwestNKSconference/ In 1964, in one of the six Messenger lectures he delivered at Cornell University (later […]

Misguided research on Artificial Intelligence

The field of strong AI, sometimes referred to as artificial general intelligence or AGI, is defined as intelligence capable of the same intellectual functions as a human being, including self-awareness and consciousness. I will call the assumed object of study of AGI simply agi, in lowercase, not only to […]