The concept of predictive complexity is a natural development of the theory of prediction with expert advice. Predictive complexity bounds the ability of any algorithm that tries to predict the elements of a given sequence. In other words, it is a quantitative measure of "learnability" as an inherent property of an object, and it indicates the limits of machine learning.
Predictive complexity imposes a lower bound on the loss suffered by any prediction algorithm. This bound holds up to an additive constant, in a similar fashion to the theory of Kolmogorov complexity. The total loss suffered by an algorithm is determined by a loss function, which measures the deviation between guesses and actual outcomes. Different loss functions define games with different properties and, therefore, different types of predictive complexity.
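As a concrete illustration, the two loss functions discussed below can be written out for binary outcomes as follows (the function and variable names are our own, chosen for this sketch):

```python
import math

def square_loss(outcome, prediction):
    """Square loss: the squared deviation of the prediction from the outcome."""
    return (outcome - prediction) ** 2

def log_loss(outcome, prediction):
    """Logarithmic loss for a binary outcome in {0, 1} and a prediction in (0, 1)."""
    return -math.log(prediction if outcome == 1 else 1 - prediction)

# The total loss of a prediction strategy is the sum of its per-trial losses.
outcomes = [1, 0, 1, 1]
predictions = [0.8, 0.3, 0.6, 0.9]
total_square = sum(square_loss(y, p) for y, p in zip(outcomes, predictions))
total_log = sum(log_loss(y, p) for y, p in zip(outcomes, predictions))
```

Predictive complexity of the sequence, for a given game, lower-bounds such total losses up to an additive constant.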
The concept of predictive complexity was introduced in the paper [Vovk, 97]. Generally speaking, the value of predictive complexity does not correspond to the loss of an algorithm; it is defined as an optimal function in a special class. The paper [Vovk and Watkins, 98] introduces a method that allows one to prove the existence of such optimal functions for many reasonable games. This method is based upon the Aggregating Algorithm and works for all so-called mixable games.
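To give a flavour of how the Aggregating Algorithm merges expert predictions, here is a minimal sketch for the logarithmic-loss game, where the algorithm with learning rate 1 reduces to a Bayes mixture: each expert's weight is multiplied by the probability it assigned to the actual outcome. The function name and the uniform prior are our own choices, not taken from the papers:

```python
import math

def aggregating_algorithm_log_loss(expert_predictions, outcomes):
    """Aggregating Algorithm for the (mixable) logarithmic-loss game.

    expert_predictions: list of per-expert lists of probabilities of outcome 1.
    outcomes: binary sequence in {0, 1}.
    Returns the cumulative log loss of the merged ("master") predictions.
    """
    n = len(expert_predictions)
    weights = [1.0 / n] * n          # uniform prior over experts
    master_loss = 0.0
    for t, y in enumerate(outcomes):
        preds = [expert_predictions[i][t] for i in range(n)]
        # For log loss the weighted average of expert predictions is a
        # valid master prediction (this is where mixability is used).
        gamma = sum(w * p for w, p in zip(weights, preds))
        master_loss += -math.log(gamma if y == 1 else 1 - gamma)
        # Multiply each weight by the expert's likelihood of the outcome.
        weights = [w * (p if y == 1 else 1 - p) for w, p in zip(weights, preds)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return master_loss
```

The standard guarantee for this game is that the master's total loss exceeds the best expert's total loss by at most ln n, the logarithm of the number of experts.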
The most important loss functions are the logarithmic and square-loss ones. Both of them specify mixable games, and therefore logarithmic complexity and square-loss complexity exist. Logarithmic complexity turns out to coincide with a variant of Kolmogorov complexity, namely the negative logarithm of Levin's a priori semimeasure. That is why predictive complexity may be regarded as a generalisation of Kolmogorov complexity. At the same time, square-loss complexity is of particular importance because the square loss function corresponds to the squared error, which is widely used in statistics.
The paper [Vovk and Gammerman, 99] proposes the main application of predictive complexity, the Complexity Approximation Principle (CAP). The idea of this principle is to employ upper estimates of predictive complexity for solving the hypothesis selection problem. In the case of logarithmic complexity, CAP reduces to the well-known Minimum Description Length principle. In [Kalnishkan, 00] an attempt was made to apply CAP to the case of square-loss complexity.
The papers [Kalnishkan, 99A] and [Kalnishkan, 99B] investigate the relations between different variants of predictive complexity. The former deals with a special case, namely the relations between square-loss and logarithmic complexity, while the latter contains a more general result.
In [Vovk and Kalnishkan, 00] the problem of the existence of predictive complexity is considered, and some necessary and sufficient conditions are proved for special cases. The paper also investigates the expectations of predictive complexity. They turn out to be given by the Legendre transformation, and this leads to a kind of uniqueness theorem: a particular kind of predictive complexity specifies a unique (up to a parametrisation) loss function.
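For reference, the Legendre (or Legendre-Fenchel) transformation of a function f, in its standard general form (the precise setting used in the paper may differ), is

```latex
f^{*}(p) \;=\; \sup_{x}\bigl(\,p\,x - f(x)\,\bigr),
```

where, for a differentiable strictly convex f, the supremum is attained at the point x with f'(x) = p.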
The following people at CLRC are interested in the theory of predictive complexity:
[Vovk, 97] V. Vovk. Probability Theory for the Brier Game. In M. Li and A. Maruoka, editors, Algorithmic Learning Theory, volume 1316 of Lecture Notes in Computer Science, pages 323-338, 1997. To appear in Theoretical Computer Science.
[Vovk and Watkins, 98] V. Vovk and C. J. H. C. Watkins. Universal Portfolio Selection. In Proceedings of the 11th Annual Conference on Computational Learning Theory, pages 12-23, 1998.
[Vovk and Gammerman, 99] V. Vovk and A. Gammerman. Complexity Approximation Principle. The Computer Journal, 42(4), pages 318-322, 1999.
[Kalnishkan, 99A] Y. Kalnishkan. Linear Relations between Square-Loss and Kolmogorov Complexity. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, pages 226-232. Association for Computing Machinery, 1999.
[Kalnishkan, 99B] Y. Kalnishkan. General Linear Relations among Different Types of Predictive Complexity. In Algorithmic Learning Theory, 10th International Conference, ALT'99, volume 1720 of Lecture Notes in Artificial Intelligence, pages 323-334. Springer-Verlag, 1999.
[Kalnishkan, 00] Y. Kalnishkan. Complexity Approximation Principle and Rissanen's Approach to Real-Valued Parameters. In Machine Learning: ECML 2000, 11th European Conference on Machine Learning, volume 1810 of Lecture Notes in Artificial Intelligence, Springer-Verlag, 2000.
[Vovk and Kalnishkan, 00] Y. Kalnishkan and V. Vovk. The existence of predictive complexity and the Legendre transformation. Technical report TR-00-04, Computer Learning Research Centre, Royal Holloway College, May 2000. Presented at TAI 2000, Fourth French Days on Algorithmic Information Theory.
The theory of predictive complexity combines techniques from two different branches of computer science, namely Computational Learning Theory and the theory of Kolmogorov complexity. Below you can find several links to web sites related to these topics.