\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage{amsmath}
\usepackage{amsthm}
\usepackage{fullpage}
\title{Universal Sequential Search Problems}
\author{L. A. Levin}
\date{}
\newtheorem{theorem}{Theorem}
\newtheorem{lemma}{Lemma}
\newtheorem{definition}{Definition}
\begin{document}
\maketitle
\begin{abstract}
The article examines several well-known problems of the ``sequential search type'' and proves that these problems can be solved only in the time within which one can solve arbitrary problems of the specified type.
\end{abstract}
\section{Introduction}
After the concept of an algorithm was made precise, the algorithmic unsolvability of a number of classical problems was proved (for example, the identity problem for group elements, the homeomorphism problem for manifolds, the solvability of Diophantine equations, and others). This removed the question of finding a practical way to solve them. However, the existence of algorithms for solving other problems does not remove the analogous question for them, because of the fantastically large amount of work these algorithms prescribe. This is the situation with the so-called sequential search problems: minimization of Boolean functions, search for proofs of bounded length, determining graph isomorphism, and others. All these problems are solved by trivial algorithms that enumerate all possibilities. However, these algorithms require exponential running time, and mathematicians have come to believe that simpler algorithms for them are impossible. A number of serious arguments have been obtained in favor of this belief (see [1, 2]), but no one has succeeded in proving the statement. (For example, it has not yet been proved that finding mathematical proofs requires more time than verifying them.)
However, if we assume that there exists some (even artificially constructed) problem of the sequential search type that cannot be solved by simple algorithms (with respect to the volume of computation), then it can be shown that many ``classical'' sequential search problems (including the minimization problem, the proof search problem, and others) have this property as well. These statements constitute the main results of the article.
\section{Definitions and Problems}
Functions $f(n)$ and $g(n)$ will be called comparable if for some $k$:
\[f(n) \leq (g(n)+2)^k \text{ and } g(n) \leq (f(n)+2)^k.\]
The term ``less than or comparable to'' is understood analogously.
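For example, any two polynomials in $n$ are comparable: $n^{2}$ and $n^{5}$ satisfy
\[ n^{2} \leq (n^{5}+2)^{1} \quad\text{and}\quad n^{5} \leq (n^{2}+2)^{3}. \]
On the other hand, $n$ and $2^{n}$ are not comparable, since $2^{n} > (n+2)^{k}$ for every fixed $k$ and all sufficiently large $n$. Comparability thus identifies functions of polynomially related growth.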
\begin{definition}
A sequential search type problem (or simply a sequential search problem) is a problem of the form ``given $x$, find some $y$ of length comparable to the length of $x$ such that $A(x, y)$ holds'', where $A(x, y)$ is a property verifiable by an algorithm whose running time is comparable to the length of $x$. (Here an algorithm can be understood as, for example, a Kolmogorov--Uspensky algorithm, a Turing machine, or a normal algorithm; $x$ and $y$ are binary words.) The problem of determining whether such a $y$ exists will be called a quasi-sequential search problem.
\end{definition}
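In modern terms, such a problem is specified by a cheaply checkable predicate. The sketch below illustrates the definition with one concrete property $A(x, y)$; the clause encoding is an illustrative assumption of this sketch, not part of the article.

```python
# A sequential search problem as a polynomial-time checkable predicate
# A(x, y).  Encoding assumed here for illustration: x is a list of
# clauses over variables 1..n, each clause a list of signed literals
# (+i for variable i, -i for its negation); y is a truth assignment,
# y[i-1] being the value of variable i.

def A(x, y):
    """Return True iff assignment y satisfies every clause of x.
    Runs in time linear in the size of the instance."""
    def lit_true(lit):
        return y[lit - 1] if lit > 0 else not y[-lit - 1]
    return all(any(lit_true(lit) for lit in clause) for clause in x)

# The sequential search problem: given x, find y with A(x, y).
# The quasi-sequential search problem: decide whether such a y exists.
x = [[1, 2], [-1, 2], [-2, 3]]    # (v1 or v2) and (not v1 or v2) and (not v2 or v3)
print(A(x, [False, True, True]))  # -> True
```

The asymmetry the article turns on is visible here: checking a proposed $y$ is fast, while nothing in the definition says how to find one faster than by enumeration.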
We will consider six problems of these types. The objects considered in them are encoded naturally in the form of binary words. The choice of natural encoding is not essential, as they all give comparable code lengths.
\begin{enumerate}
\item Given a list of a finite set and its cover by 500-element subsets. Find a subcover of a given cardinality (or determine if it exists).
\item A partial Boolean function is given in tabular form. Find a disjunctive normal form of a given size that implements this function in the domain of definition (or determine if it exists).
\item Determine whether a given propositional calculus formula is derivable or refutable. (Or, equivalently, whether a given Boolean formula is equal to a constant.)
\item Given two graphs. Find a homomorphism from one to the other (determine its existence).
\item Given two graphs. Find an isomorphism from one to the other (or to its part).
\item We consider matrices of integers from 1 to 100 and some condition about which numbers can be adjacent vertically and which horizontally. Numbers are given on the border, and it is required to continue them to the entire matrix while observing the condition.
\end{enumerate}
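Each of these problems comes with an easily checkable notion of a solution. For Problem 1, for instance, a candidate subcover can be verified as follows; the encoding of instances and solutions is an assumption of this sketch, not fixed by the article.

```python
# Checking a candidate solution to Problem 1: a subcover of bounded
# cardinality.  Encoding assumed here: the universe is an iterable of
# elements, the cover a list of sets, and a candidate a list of
# indices of chosen subsets.

def is_subcover(universe, cover, k, chosen):
    """Verify in polynomial time that the chosen subsets cover the
    whole universe while using at most k of them."""
    if len(chosen) > k:
        return False
    covered = set()
    for i in chosen:
        covered |= cover[i]           # accumulate covered elements
    return covered >= set(universe)   # superset test: all covered?

cover = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {1, 4}]
print(is_subcover(range(6), cover, 2, [0, 2]))   # -> True
```

The same pattern holds for the other five problems: verifying a candidate is cheap, while the only known general way of finding one is enumeration over exponentially many candidates.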
\section{Main Results}
Let $f(n)$ be a monotonic function.
\begin{theorem}
If there exists any problem of sequential search (quasi-sequential search) type that cannot be solved in time less than $f(n)$ for arguments of length comparable to $n$, then problems 1--6 also have this property.
\end{theorem}
The idea of the proof is that problems 1--6 are ``universal sequential search problems''.
\begin{definition}
Let $A(x, y)$ and $B(x, y)$ define sequential search problems $A$ and $B$ respectively. We say that problem $A$ reduces to $B$ if there are three algorithms $r(x)$, $p(y)$, and $s(y)$, working in time comparable to the length of the argument, such that $A(x, p(y)) \Leftrightarrow B(r(x), y)$ and $A(x, y) \Leftrightarrow B(r(x), s(y))$ (i.e., from an $A$-problem $x$, it's easy to construct an equivalent $B$-problem $r(x)$). A problem to which any sequential search problem reduces is called ``universal''.
\end{definition}
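The point of the definition is that a reduction transfers solvers: if $A$ reduces to $B$, then an algorithm for $B$ yields one for $A$ at comparable cost. A minimal sketch of this transfer follows; the toy problems in it are purely illustrative and not from the article.

```python
# Using a reduction, per the definition: with maps r and p satisfying
# A(x, p(y)) <=> B(r(x), y), any solver for B solves A as well.
# The toy problems below are illustrative, not from the article.

def solve_A_via_B(x, r, p, solve_B):
    """Solve the A-instance x by solving the equivalent B-instance r(x)
    and mapping the B-solution back through p."""
    y = solve_B(r(x))                   # a B-solution, or None
    return None if y is None else p(y)

# Toy A: given a list of integers, find the index of a zero entry.
# Toy B: given a string, find the index of the letter 'z'.
r = lambda x: ''.join('z' if v == 0 else 'a' for v in x)   # instance map
p = lambda y: y                                            # solution map
solve_B = lambda s: s.index('z') if 'z' in s else None

print(solve_A_via_B([3, 1, 0, 5], r, p, solve_B))   # -> 2
```

The third algorithm $s(y)$ of the definition plays the symmetric role: it maps solutions of the $A$-instance to solutions of the equivalent $B$-instance, so the two problems are interchangeable up to comparable cost in both directions.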
Thus, the essence of the proof of Theorem 1 consists in the following lemma.
\begin{lemma}
Problems 1--6 are universal sequential search problems.
\end{lemma}
The method described apparently makes it easy to obtain results like Theorem 1 and Lemma 1 for most interesting sequential search problems. However, there remains the problem of proving the hypothesis appearing in this theorem. Numerous attempts in this direction have long been made, and a number of interesting results have been obtained (see, for example, [3, 4]). Nevertheless, the universality of various sequential search problems can be established without resolving this question. In the system of Kolmogorov--Uspensky algorithms, the following can also be proved:
\begin{theorem}
For an arbitrary sequential search problem $A(x, y)$, there exists an algorithm that solves it in time optimal up to multiplication by a constant and addition of a quantity comparable to the length of $x$.
\end{theorem}
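The algorithm behind Theorem 2 can be pictured as dovetailing: all candidate procedures are run in parallel, the $i$-th receiving roughly a $2^{-i}$ share of the steps, and every answer produced is checked with the verifier $A$. The sketch below replaces the enumeration of all programs by a finite list of generators, an assumption needed to make it runnable; it also assumes some candidate eventually succeeds.

```python
# A sketch of the dovetailing behind Theorem 2.  `procedures` is a
# finite stand-in for an enumeration of all programs (an assumption of
# this sketch); the loop does not terminate if no candidate succeeds.

def universal_search(x, A, procedures):
    """Interleave candidate solvers, giving the i-th roughly a 2**-i
    share of the steps; return the first y that the verifier accepts."""
    gens = [proc(x) for proc in procedures]
    round_no = 0
    while True:
        round_no += 1
        for i, g in enumerate(gens):
            steps = 2 ** round_no // 2 ** i   # share proportional to 2**-i
            for _ in range(steps):
                try:
                    y = next(g)
                except StopIteration:
                    break                     # this candidate is exhausted
                if A(x, y):                   # every answer is verified
                    return y

# Toy use: find a nontrivial divisor of x.
A = lambda x, y: 1 < y < x and x % y == 0
brute = lambda x: iter(range(2, x))                      # exhaustive solver
lucky = lambda x: iter([x // 2] if x % 2 == 0 else [])   # fast heuristic
print(universal_search(91, A, [lucky, brute]))           # -> 7
```

If some candidate solves the problem within time $t$, the dovetailing finds a verified solution within roughly $c \cdot t$ steps plus verification overhead, where the constant $c$ depends on the candidate's position in the enumeration; this is the sense in which the algorithm of Theorem 2 is optimal up to a multiplicative constant.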
\section*{Acknowledgements}
The author expresses sincere gratitude to A. N. Kolmogorov, B. A. Trakhtenbrot, Ya. M. Barzdin, Yu. I. Albryton, and M. I. Degtyar for valuable discussions.
\begin{thebibliography}{4}
\bibitem{yablonsky} Yablonsky, S. V. On algorithmic difficulties in the synthesis of minimal contact circuits. In: ``Problems of Cybernetics'', vol. 2. Moscow, Fizmatgiz, 1959, pp. 75--121.
\bibitem{zhuravlev} Zhuravlev, Yu. I. Set-theoretic methods in Boolean algebra. In: ``Problems of Cybernetics'', vol. 8. Moscow, Fizmatgiz, 1962, pp. 5--44.
\bibitem{trakhtenbrot} Trakhtenbrot, B. A. Optimal computations and Yablonsky's frequency phenomenon. Seminar. Novosibirsk, ``Nauka'', Siberian Branch, 1965, vol. 4, no. 5, pp. 79--93.
\bibitem{degtyar} Degtyar, M. I. On the impossibility of eliminating complete enumeration when computing functions relative to their graphs. Reports of the USSR Academy of Sciences, 1969, vol. 189, no. 4, pp. 748--751.
\end{thebibliography}
\end{document}