Fix a programming language and consider, for each n, the proportion r_n of programs of length n which halt. What can be said about the behaviour of r_n? The question is not very interesting for "real-life" programming languages (a random piece of code is very likely to be syntactically incorrect) but becomes much deeper when one considers an optimal programming language (= optimal Turing machine), whose existence is the starting point of the theory of Kolmogorov complexity. We will present several results when the above question is understood in this setting. [Joint work with Damien Desfontaines and Alexander Shen]
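As a toy illustration (not part of the talk, which concerns optimal machines), one can estimate r_n empirically for a small made-up language. The instruction set and the step bound below are assumptions of mine, and since halting is undecidable a step bound only yields a lower approximation of r_n:

    # Hedged sketch: empirical lower approximation of r_n for a toy counter
    # language.  The language, the step bound and everything else here are
    # illustrative assumptions, not the setting of the abstract.
    from itertools import product

    INSTRUCTIONS = "+-jh"   # +/-: change a counter, j: jump back 2 cells if counter > 0, h: halt

    def runs_and_halts(prog, max_steps=1000):
        counter, pc = 0, 0
        for _ in range(max_steps):
            if pc >= len(prog):          # falling off the end counts as halting
                return True
            op = prog[pc]
            if op == "h":
                return True
            if op == "+":
                counter += 1
            elif op == "-":
                counter = max(0, counter - 1)
            elif op == "j" and counter > 0:
                pc = max(0, pc - 2)
                continue
            pc += 1
        return False                     # did not halt within the bound

    def r(n):
        progs = ["".join(p) for p in product(INSTRUCTIONS, repeat=n)]
        return sum(runs_and_halts(p) for p in progs) / len(progs)

    for n in range(1, 6):
        print(n, r(n))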
TBD
Some finitely presented groups have an uncomputable word problem.
However f.p. *simple* groups have a computable word problem.
Some subshifts of finite type have an uncomputable language.
However *minimal* subshifts of finite type have a computable language.
For some finitely axiomatizable theories, the set of provable formulas
is uncomputable.
However the set of provable formulas of finitely axiomatizable
*complete* theories is computable.
There is a pattern here. In fact, a general statement can be obtained,
using for example the language of universal algebras or
quasivarieties, and indeed such a statement can be found in the
note by Kuznetsov proving the first result about groups.
Here we will give a proof of a similar statement using the vocabulary
of topology.
The framework is actually a bit more general. We will prove for
example that the word problem of any simple group (which might not be
finitely presented, or even recursively presented) exhibits a very
peculiar property (which coincides with "being computable" for
finitely presented and recursively presented groups).
The result will be formulated using a concept known as enumeration reducibility.
All results and proofs in the talk will be elementary.
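To make the group-theoretic case above concrete, here is a hedged Python sketch of Kuznetsov's classical argument that a finitely presented simple group has decidable word problem. Words are strings over two generators a, b and their inverses A, B; the two-generator alphabet, the depth-based dovetailing and the naive enumeration are my own illustrative choices, and the search is hopelessly inefficient, but the logic is the point: one semi-decision procedure for w = 1, one for w != 1, and simplicity guarantees that one of them answers.

    # Hedged sketch of Kuznetsov's argument (mine, not the talk's formulation).
    from itertools import count, product

    GENS = "abAB"
    INV = {"a": "A", "A": "a", "b": "B", "B": "b"}

    def reduce_word(w):
        out = []
        for c in w:
            if out and out[-1] == INV[c]:
                out.pop()
            else:
                out.append(c)
        return "".join(out)

    def inverse(w):
        return "".join(INV[c] for c in reversed(w))

    def consequences(relators, depth):
        """Freely reduced products of at most `depth` conjugates u r^{+-1} u^{-1},
        with conjugators of length < depth.  The union over all depths is the set
        of words equal to 1 in the presented group."""
        rels = list(relators) + [inverse(r) for r in relators]
        conjugators = ["".join(p) for k in range(depth) for p in product(GENS, repeat=k)]
        pieces = {reduce_word(u + r + inverse(u)) for u in conjugators for r in rels}
        words = {""}
        for _ in range(depth):
            words |= {reduce_word(w + p) for w in words for p in pieces}
        return words

    def word_problem(relators, witness, w):
        """Is w = 1 in the simple group presented by `relators` (a list of words)?
        `witness` is any fixed word known to be != 1 in the group."""
        w = reduce_word(w)
        for depth in count(1):                          # dovetail both searches
            if w in consequences(relators, depth):
                return True                             # w = 1 follows from the relators
            if witness in consequences(relators + [w], depth):
                return False                            # adding w = 1 kills the whole group, so w != 1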
A $\mathbb{Z}$-subshift is effective if there is a Turing machine which can enumerate a set of forbidden patterns defining it. We present two natural ways to extend this definition to arbitrary finitely generated groups and, for any such group $G$, we construct a strongly aperiodic subshift whose forbidden patterns can be generated by a Turing machine with access to an oracle for the word problem of $G$. This construction shows that a finitely generated group admits a strongly aperiodic effective subshift if and only if its word problem is decidable.
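As a toy illustration of effectiveness over $\mathbb{Z}$ (my own example, unrelated to the construction of the talk), here is an enumerator of forbidden patterns for the effective subshift of binary sequences in which any two consecutive 1s are separated by a power-of-two number of 0s:

    # Hedged illustration: a Turing-machine-style enumerator of forbidden
    # patterns for an effective Z-subshift.  The example subshift is an
    # assumption chosen because it is effective but not of finite type.
    from itertools import count

    def is_power_of_two(n):
        return n >= 1 and (n & (n - 1)) == 0

    def forbidden_patterns():
        """Enumerate a defining set of forbidden words, one per step."""
        for n in count(0):
            if not is_power_of_two(n):
                yield "1" + "0" * n + "1"

    gen = forbidden_patterns()
    print([next(gen) for _ in range(4)])   # ['11', '10001', '1000001', '10000001']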
A Turing machine is topologically transitive if every partial configuration (that is, a state, a head position and a finite portion of the tape) can reach any other partial configuration, provided that it is completed into a proper configuration. We characterize topological transitivity and study its computational complexity in the dynamical system models of Turing machines with moving head, with moving tape, and for the trace subshift. We further study minimality, the property that every configuration reaches every partial configuration.
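As a hedged sketch of the notion of reachability between partial configurations (my own toy setup, not the talk's constructions), the following bounded search tries all completions of a fixed width around the source window; the example transition table, the binary tape alphabet, the completion width and the step bound are all illustrative assumptions:

    from itertools import product

    # delta: (state, read symbol) -> (new state, written symbol, move)
    delta = {("s", "0"): ("s", "1", +1),
             ("s", "1"): ("t", "0", -1),
             ("t", "0"): ("s", "0", +1),
             ("t", "1"): ("t", "1", -1)}

    def step(state, head, tape):
        new_state, sym, move = delta[(state, tape[head])]
        return new_state, head + move, tape[:head] + sym + tape[head + 1:]

    def reaches(src, dst, pad=5, steps=30):
        """Can partial configuration src reach dst for some completion of the tape?
        A partial configuration is (state, window), head on the window's first cell;
        we only try completions of width `pad` on each side."""
        (s_state, s_win), (d_state, d_win) = src, dst
        for left, right in product(product("01", repeat=pad), repeat=2):
            state, head, tape = s_state, pad, "".join(left) + s_win + "".join(right)
            for _ in range(steps):
                if not (0 <= head < len(tape)):
                    break                    # fell off the finite completion we chose
                if state == d_state and tape[head:head + len(d_win)] == d_win:
                    return True
                state, head, tape = step(state, head, tape)
        return False

    print(reaches(("s", "0"), ("t", "0")))   # True for this little machine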
Ramsey's theorem (RT^k_n) states that every coloring of the k-tuples of integers with at most n colors admits an infinite monochromatic subset. Whenever k >= 3, RT^k_n is equivalent to the arithmetic comprehension axiom (ACA). On the other hand, RT^2_n is strictly weaker than ACA and incomparable with weak König's lemma. Finally, RT^1_n is provable over RCA. Thus, in reverse mathematics, the provability strength of RT^k_n does not depend on its number of colors. However, there is an asymmetry in the proof of the equivalence between RT^k_n and RT^k_m whenever m and n differ. In this talk, we shall highlight this difference by using the notion of computable reduction which, unlike provability over RCA, cares about the number of applications of the considered principle.
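For reference, RT^k_n can be stated formally as follows, writing $[X]^k$ for the set of $k$-element subsets of $X$ (a standard formulation, not specific to the talk):

    \[
      \mathrm{RT}^k_n:\quad
      \forall f\colon [\mathbb{N}]^k \to \{0,\dots,n-1\}\
      \exists H\subseteq\mathbb{N}\ \text{infinite}\
      \forall s,t\in[H]^k,\ f(s)=f(t).
    \]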
Signal machines form a geometric computational model with continuous space and time and discrete events. We will present three simple constructions about signal machines in connection with arithmetic. We first construct a signal machine which can serve as an oracle and involves continued fractions. Then, we construct a signal machine which semi-decides the algebraicity of a real number. This machine looks like a parallel billiard, and its construction shows how, in this model, speed is stronger than space or time. Finally, we will see how to slightly modify the signal machine model so that it becomes computationally equivalent to the (full) Blum-Shub-Smale model of computation over the reals.
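As a minimal sketch of the model (my own toy fragment, not one of the three constructions of the talk), signals can be represented by (position, speed) pairs and the machine's only events are pairwise collisions:

    # Hedged sketch: the basic event loop of a signal machine.
    def next_collision(signals):
        """Earliest meeting time of two signals, or None; signals = list of (pos, speed)."""
        best = None
        for i, (x1, v1) in enumerate(signals):
            for x2, v2 in signals[i + 1:]:
                if v1 != v2:
                    t = (x2 - x1) / (v1 - v2)
                    if t > 0 and (best is None or t < best):
                        best = t
        return best

    def advance(signals, t):
        return [(x + v * t, v) for x, v in signals]

    signals = [(0.0, 1.0), (4.0, -1.0), (10.0, 0.0)]
    t = next_collision(signals)        # the first two signals meet at time 2.0, position 2.0
    print(t, advance(signals, t))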
In 1941, Claude Shannon introduced a model of the Differential Analyzer, called the General Purpose Analog Computer (GPAC). Originally it was presented as a model based on circuits, in which several units performing basic operations (e.g. sums, integration) are interconnected. However, Shannon himself realized that the functions computed by a GPAC are nothing more than the solutions of a special class of differential equations of the form y' = p(y), where p is a vector of polynomials. Analog computers have since been replaced by their digital counterparts. Nevertheless, one can wonder whether the GPAC could in theory have super-Turing computational power. A few years ago, it was shown that the Turing-based paradigm and the GPAC have the same computational power. So, switching to analog computers would not enable us to solve the halting problem or any other uncomputable exotic thing, but we can nonetheless compute everything a Turing machine does (given enough resources), and a return to analog technology would at least not mean a loss of computational power. However, this result did not shed any light on what happens at the computational complexity level: can an analog computer (GPAC) solve a problem faster (modulo polynomial reductions) than a digital computer (Turing machine)? I will present some elements showing that some reasonable restrictions of the GPAC are actually equivalent to P (polynomial time) and, going even further, that one can show an equivalence with polynomial time in computable analysis. This gives an elegant, analog definition of polynomial-time computable functions over the real numbers.
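To make the class y' = p(y) concrete, here is a small, hedged illustration: the textbook polynomial system y1' = y2, y2' = -y1, whose solution from (0, 1) is (sin t, cos t). The explicit Euler integrator and the step size are my own illustrative choices, not part of the GPAC model:

    import math

    def euler(p, y0, t_end, h=1e-4):
        # Naive explicit Euler integration of the polynomial ODE y' = p(y).
        y = list(y0)
        for _ in range(int(t_end / h)):
            dy = p(y)
            y = [yi + h * di for yi, di in zip(y, dy)]
        return y

    p = lambda y: [y[1], -y[0]]                # polynomial right-hand side
    y = euler(p, (0.0, 1.0), 1.0)
    print(y, (math.sin(1.0), math.cos(1.0)))   # the two pairs agree to a few decimals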
In computability theory and computable analysis, finite programs can compute infinite objects. Presenting a computable object via any program for it provides at least as much information as presenting the object itself, written on an infinite tape. What additional information do programs provide? We characterize this additional information to be any upper bound on the Kolmogorov complexity of the object. Hence we identify the exact relationship between Markov-computability and Type-2-computability. We then use this relationship to obtain several results characterizing the computational and topological structure of Markov-semidecidable sets. We also prove analogues of the Rice and Rice-Shapiro theorems for subrecursive classes of functions, such as the primitive recursive functions or the polynomial-time computable functions, obtaining an exact characterization of the properties of functions that are decidable or semi-decidable given a program for the function. Again, Kolmogorov complexity is at the core of the characterization.
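A toy reading of the main point, under the illustrative assumption (mine) that "programs" are Python source strings: a program gives you the object digit by digit, plus its own length as an upper bound on the object's Kolmogorov complexity, up to the additive constant of the chosen universal language:

    # Hedged toy illustration, not the talk's formal setting.
    program = "lambda n: n * n % 2"       # computes the n-th bit of 0101010101...

    def present(program_text):
        f = eval(program_text)             # the object itself: bit n is f(n)
        bound = len(program_text)          # the extra information: a complexity upper bound
        return f, bound

    f, bound = present(program)
    print([f(n) for n in range(8)], "K <= about", bound)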
While is a simple imperative programming language whose data structures are binary trees. It is Turing-complete. Neil Jones considered the programs which do not use the tree constructor cons; this sublanguage is While\cons. It was shown to be LOGSPACE-complete, which makes it relatively powerful computationally speaking. We have continued the study of this language. In the language, there is an expression (t == u) testing the equality of two trees t and u. In this talk, we consider a more atomic operation where one only tests whether a tree is nil. The two forms of tests are equivalent within While. But what happens in While\cons? We show that equality is actually not computable there. The result is all the more surprising given that tree isomorphism is LOGSPACE-complete, which looks paradoxical.
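As a hedged sketch of the data model (the names and the Python encoding are mine, not the While syntax): trees built from nil and cons, the atomic nil test, and the usual recursive equality. One natural way to program this recursion in full While simulates the call stack with cons, which is exactly what While\cons forbids:

    nil = None
    def cons(t, u): return (t, u)
    def head(t): return t[0]
    def tail(t): return t[1]
    def is_nil(t): return t is None        # the atomic test kept in While\cons

    def equal(t, u):                       # full tree equality
        if is_nil(t) or is_nil(u):
            return is_nil(t) and is_nil(u)
        return equal(head(t), head(u)) and equal(tail(t), tail(u))

    a = cons(nil, cons(nil, nil))
    b = cons(nil, cons(nil, nil))
    print(is_nil(a), equal(a, b))          # False True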