Archive for July, 2006

NKS Cooking

Posted in New Ideas, Recreation on July 31st, 2006 by Hector Zenil

A novel approach to cooking

The algorithm:

  1. Consider the space of all possible recipes.
  2. Enumerate the ingredients per food style (Mexican, Chinese, French, Italian, and so on).
  3. Take the subsets of the ingredients per food-style set.
  4. Begin with a random choice of initial ingredients.
  5. Mix them by following a set of simple cooking rules like baking, sprinkling or heating.
  6. Season with salt and pepper to taste.
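The algorithm above can be sketched in a few lines of Python. The ingredient lists, the rule names, and the seeding are of course hypothetical placeholders of my own; only the structure of the steps follows the recipe:

```python
import random

# Step 2: hypothetical ingredient sets per food style
INGREDIENTS = {
    "Mexican": ["chili", "corn", "beans", "lime"],
    "Italian": ["tomato", "basil", "pasta", "cheese"],
    "French":  ["butter", "cream", "wine", "herbs"],
}

# Step 5: a set of simple cooking rules
RULES = ["bake", "sprinkle", "heat"]

def random_recipe(style, rng):
    """Steps 3-6: pick a random non-empty subset of the style's
    ingredients and apply a randomly chosen rule to each."""
    pool = INGREDIENTS[style]
    k = rng.randint(1, len(pool))            # step 4: random initial choice
    chosen = rng.sample(pool, k)
    steps = [(rng.choice(RULES), ing) for ing in chosen]
    steps.append(("season", "salt and pepper"))  # step 6
    return steps

rng = random.Random(0)
recipe = random_recipe("Mexican", rng)
```

Enumerating the full recipe space would mean iterating over every subset and every rule sequence instead of sampling; the sketch only draws one random point from it.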

One could use a robotic device and see what results. One would also have to set up an evaluation scheme to decide which dishes are good and which are bad: either a human tester or a reliable sensor such as a modern “artificial nose” (essentially a microarray-like object with a number of different receptors).

However, the testing procedure is undecidable in general: for an arbitrary recipe, neither a human nor an automatic device can decide in advance whether or not the end result will taste good.

It would also be of interest to find out which “cooking rules” lead to which patterns of smell and taste. For instance, one would expect any dish involving sugar, butter, and flour to taste reasonably good provided the rules do not yield a “burnt offering”. Of course, to make things really interesting, one should introduce additional ingredients, including ingredients which aren’t traditionally combined with the basic ones.

For French cuisine, butter would be a primitive; for American food, ketchup; for Mexican, chili; for Chinese, rice; for Italian, tomato. French and Italian food share cheese as a primitive, so cooking “au gratin” would always be a reasonably safe bet. Primitives matter: you cannot cook pancakes au gratin, and ketchup would be the safer choice if you wanted your dish to meet with the approval of the American tester.

An ingredient is a primitive in the sense that any dish in a given style can use more or less of it and still retain its character as an exemplar of that style. Primitives can be used at will, including in emergencies or when the rules happen to yield nasty results.

One might investigate reducible shortcuts, such as preparing something faster or skipping steps. However, most of the recipes are irreducible, making it impossible to take shortcuts without going through the whole preparation.

With some effort one can also prove that there exists a universal algorithm which, given a recipe and the initial ingredients, is able to reproduce any other recipe.

Concerning time/space complexity, it remains an open question whether the preparation time can always be reduced to a polynomial.

One can also ask about the algorithmic complexity of a dish, since one can measure its information content: the number of ingredients involved, the number of rules and steps applied, and whether the rules were iterated, nested or random-looking. However, since, as noted above, the procedure is undecidable, the only known way to approach the question of the complexity of a dish is to compress it into a piece of universal tupper-ware and see what happens.
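In the same playful spirit, one can approximate the algorithmic complexity of a (textual) recipe by lossless compression, a standard proxy for Kolmogorov complexity, with zlib playing the role of the universal tupper-ware. The sample recipes below are made-up illustrations:

```python
import zlib

def compressed_size(recipe: str) -> int:
    """Length in bytes of the zlib-compressed recipe text: a rough
    upper bound on the recipe's algorithmic information content."""
    return len(zlib.compress(recipe.encode("utf-8"), level=9))

simple = "heat tortilla. " * 20  # an iterated rule: highly repetitive
mixed = "bake flour, sprinkle sugar, heat butter, add chili, stir rice."

# Compression ratio: small for repetitive (simple) recipes,
# close to (or above) 1 for short random-looking ones.
ratio_simple = compressed_size(simple) / len(simple)
ratio_mixed = compressed_size(mixed) / len(mixed)
```

The repetitive recipe compresses far better than the mixed one, matching the intuition that iterated rules carry little information per step.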

Another open question: given a compressed ingredient, how many other ingredients can one compress? It is believed that this yields a classification of food types, defining a threshold between those in a lower/trivial class and more sophisticated items.

Experimental Metamathematics

Posted in Foundations of Math on July 30th, 2006 by Hector Zenil

Metamathematics is to mathematics what metaphysics is to philosophy. It is about stepping outside the discipline and attempting to grasp it as a single entity/object of study; it is an investigation into the foundations of mathematics.

Some mathematicians have begun to practice a quasi-empirical brand of mathematics and insist on results for which there is experimental evidence. This development has been facilitated by the fact that massive calculations are possible nowadays. This new brand of mathematics is related to experimental mathematics in being a type of mathematical research in which computation is used to investigate mathematical structures and identify their fundamental properties and patterns.

A few researchers have devoted themselves to what we could call experimental metamathematics, doing metamathematical research with the help of today’s vast computational resources, which were unavailable not too long ago. Among such researchers are Stephen Wolfram and Gregory Chaitin, both of whom have formulated rich questions and made remarkable discoveries and contributions to the field (an NKS on Math sample video lecture by Wolfram is available here).

Such efforts have made it possible to explore and arrive at important results regarding particular spaces of possible programs, including precise results perfectly consonant with traditional mathematics. For instance, by considering the space of all propositional logic expressions with up to six NANDs and two variables, Wolfram arrived at a single axiom, written using only NAND (and, dually, NOR), equivalent to the set of all axioms of Boolean algebra. He found that none of the 16896 possible axiom systems of this kind explored by him worked even up to 3-valued operators. But with 6 NANDs and 3 variables, he found that 296 of the 288684 possible axiom systems worked up to 3-valued operators, and 100 worked up to 4-valued operators (NKS Book, Wolfram 2002, p. 809). He then found the smallest, proving that there was none smaller, and arrived at the following axiom: (((x \[Nand] y) \[Nand] z) \[Nand] (x \[Nand] ((x \[Nand] z) \[Nand] x))) == z.
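The axiom itself can be checked over the two-valued Booleans by brute force. The short Python verification below confirms that the identity holds in the standard Boolean model (this only checks soundness over {0, 1}; it says nothing about the much harder completeness result):

```python
from itertools import product

def nand(a, b):
    """Sheffer stroke on {0, 1}."""
    return 1 - (a & b)

def axiom_holds(x, y, z):
    """Wolfram's single axiom for Boolean algebra:
    ((x Nand y) Nand z) Nand (x Nand ((x Nand z) Nand x)) == z"""
    lhs = nand(nand(nand(x, y), z), nand(x, nand(nand(x, z), x)))
    return lhs == z

# Exhaustively check all 8 Boolean assignments of (x, y, z).
all_hold = all(axiom_holds(x, y, z)
               for x, y, z in product((0, 1), repeat=3))
```

Since the axiom contains only six NANDs and three variables, the exhaustive check is trivial; the hard part, finding it and proving minimality, required the search over hundreds of thousands of candidate systems described above.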

Similar research on first-order logic would be worth undertaking, considering that it is powerful enough to express most or all of our mathematical theories.

The image below is a graph of what I call the universe of mathematical theories (or rather of axiom systems, since they are not interpreted under any particular structure and have no semantic content). Each column represents a formula generated under an enumeration that eventually goes through all possible first-order formulas. Each cell of the matrix is colored: gray if the formula of the column is false in the axiom system of the row, white if it is true. Some cells are black; these represent formulas that are neither false nor true, in other words syntactically underivable within the axiom system in question.

The project was developed in a pre-release Mathematica Version 6 using Waldmeister, an automatic theorem prover embedded in the new Mathematica and based on Knuth-Bendix completion. Some patterns in the image can be explained by the chosen enumeration of the formulas and of the axiom systems (which are subsets of the same formulas). The question, however, is whether a pattern can always be so explained, and whether patterns arise only from syntactical facts or whether they can shed light on the general properties of classes of axiom systems. I have explored these issues through to the beginning of the mathematical universes of lengths 5 and 6, but the process is clearly intractable, so I am beginning to generate samples of universes of higher degree, and to translate mathematical theories into purely syntactical axioms and axiom systems into theories under arbitrary interpretations. One way or another this procedure will touch one of our current mathematical theories; the question is which number it will be, and how many times it could be repeated under different interpretations.

Mathematical Universe of First-Order Axiom Systems of Length 4

The image above shows a sample with the first proofs for all 161 raw formulas of length 4 (involving 4 bound variables) and the first 1000 axiom systems (obtained by applying the subset operation to the 161 raw formulas of length 4). Since the size (total number of axiom systems) of the universe of length 4 is 2.9×10^48, this is only the very first part of it, with the first 161000 proofs obtained using the automated theorem prover. The problem is evidently intractable, but a sampling method can be conceived in order to explore higher universes, to mix formulas between universes, and to find possible unrecognized (not easily syntactically explained) structures.
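The quoted size follows directly from counting subsets: each axiom system is a subset of the 161 raw formulas, so the universe of length 4 contains 2^161 systems. A one-line check:

```python
# Each axiom system is a subset of the 161 raw formulas of length 4,
# so the universe of length 4 contains 2^161 axiom systems.
size = 2 ** 161
print(f"{size:.1e}")  # prints 2.9e+48
```

The first 1000 rows of the matrix therefore cover a vanishing fraction of the universe, which is why sampling rather than exhaustive enumeration is needed for anything beyond the very beginning.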

All equational axiom systems can be represented with two binary operators f and p, where p is a pairing function. A constant is specified using an axiom of the form ForAll[a, ForAll[b, f[a, a] == f[b, b]]], and then the ith constant is c[i] = Nest[p[f[a, a], #] &, f[a, a], i]. We can then use f[c[i], p[c[1], x]] as the ith function of x. Note that x may be a list built from pairs. Of course, this assumes an infinite number of individuals in the most general case.

All axioms can be given in Skolemized form, hence there is no need to use existential quantifiers in axioms.
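Schematically, Skolemization replaces each existentially quantified variable by a fresh function of the universally quantified variables in whose scope it occurs, e.g.:

```latex
\forall x\,\exists y\; P(x, y) \quad\rightsquigarrow\quad \forall x\; P(x, g(x))
```

The Skolemized form is equisatisfiable with the original, which is why purely universal (equational) axioms suffice for the prover.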

The mathematical universes created by this method were checked for consistency (in an inconsistent system one can prove a == b for any a and b; in other words, anything equals anything) and for internal independence (the set of axioms is minimal: no axiom ai in a system S can be derived from another axiom aj or from a subset of {a1,..,an}, where n is the length, i.e. the number of axioms, of S). Some interesting results (to be published in the Journal of Complex Systems) have been obtained, such as the proportion of consistent to inconsistent axiom systems in this enumeration (which is not arbitrary but lexicographic), and of internally independent to dependent ones. The purpose of the project is to shed some light on questions like the following:

- Euclid described axioms as self-evident truths. From the perspective of modern mathematics it is quite evident that it would be very difficult to find axioms that are self-evident truths, and even more difficult to claim that the axiom of choice or the continuum hypothesis is a self-evident truth. So the problem is: where do we get axioms from?
- What is evident is that mathematical theories begin from something assumed to be true, a starting point adopted for pragmatic reasons rather than because of its intrinsic truth value.
- How special is our mathematics (are we being “mathcentric” in the Copernican sense)?
- How far along are current mathematical theories in the enumeration of all possible axiom systems (they are enumerable)?
- Given an enumeration, how arbitrary is it and what role will it have in placing our current mathematics?
- Is there a shorter axiom system (in terms of the number of axioms or their length) defining current mathematical theories? (My claim: there may be several copies of mathematical theories throughout the whole enumeration, but I wonder whether they become syntactically simpler while becoming semantically more complicated. For example, the same operations may need more complicated interpretations in order to capture an equivalent theory in fewer or shorter axioms. S. Wolfram was surprised at how close to the beginning of his enumeration of non-quantified systems the axiom systems for propositional logic turned out to be when working on his NKS book. Could the same be true for arbitrary mathematical theories, and if so, what would that mean?)
- Is there a single type of math? In other words, could a hypothetical alien civilization have a different mathematics? It seems that there is a single physics since, presumably, we live in a single universe. However, we have several models of that universe, not all of them mutually compatible, and they are known a priori to be approximations. So the question could be split in two: perhaps there is a single physics but we are not able to access it assuming real numbers, i.e. by traditional methods. However, it seems unclear how we might argue that there is a single type of mathematics.
- Is there a minimal body of mathematics shared by all possible hypothetical civilizations? Probably at least arithmetic, i.e. the theories closest to physical reality.
- Assuming a discrete universe, and given the role of mathematics and physics in determining what we call “real”, the enumeration of possible universes would be feasible. What would it mean if our universe were very close to the beginning of the enumeration of all possible universes? Of course that depends heavily on the chosen enumeration, but could the enumeration be in some way independent, or at least less arbitrary? To what degree?
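The internal-independence check described above can be illustrated in miniature: an axiom is independent of a set S when some model satisfies S but falsifies the axiom. Over a two-element domain all 16 binary operations can be enumerated exhaustively. The toy axioms below (associativity and commutativity) are my own illustration, not the systems of the project; the sketch exhibits a witness showing that commutativity is not derivable from associativity:

```python
from itertools import product

# All 16 binary operations on the domain {0, 1}, each given by its
# truth table t = (op(0,0), op(0,1), op(1,0), op(1,1)).
OPS = [lambda x, y, t=t: t[2 * x + y] for t in product((0, 1), repeat=4)]

def associative(op):
    return all(op(op(x, y), z) == op(x, op(y, z))
               for x, y, z in product((0, 1), repeat=3))

def commutative(op):
    return all(op(x, y) == op(y, x) for x, y in product((0, 1), repeat=2))

# A model satisfying associativity but not commutativity witnesses
# that commutativity is independent of (underivable from) associativity.
witnesses = [op for op in OPS if associative(op) and not commutative(op)]
```

The two witnesses found are the left and right projections, op(x, y) = x and op(x, y) = y: both are trivially associative yet not commutative. The real project replaces this brute-force model search with proofs produced by the automated theorem prover, but the logical shape of the independence question is the same.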