Algorithmic complexity bounds on future prediction errors
References
A. Chernov, M. Hutter, Monotone conditional complexity bounds on future prediction errors, in: Proc. 16th Int. Conf. Algorithmic Learning Theory (ALT’05), LNAI, vol. 3734, Springer, Berlin, Singapore, 2005, pp. 414–428, http://arxiv.org/abs/cs.LG/0507041.
R.J. Solomonoff, A formal theory of inductive inference: Part 1 and 2, Inf. Control 7 (1964) 1–22, 224–254, https://doi.org/10.1016/S0019-9958(64)90223-2.
R.J. Solomonoff, Complexity-based induction systems: comparisons and convergence theorems, IEEE Trans. Inf. Theory IT-24 (1978) 422–432, https://doi.org/10.1109/TIT.1978.1055913.
J. Schmidhuber, The Speed Prior: a new simplicity measure yielding near-optimal computable predictions, in: Proc. 15th Annual Conf. Computational Learning Theory (COLT 2002), LNAI, Springer, Berlin, Sydney, Australia, 2002, pp. 216–228.
M. Hutter, Universal Artificial Intelligence: Sequential Decisions based on Algorithmic Probability, Springer, Berlin, 2005, 300 p., http://www.idsia.ch/~marcus/ai/uaibook.htm.
M. Li, P.M.B. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, second ed., Springer, 1997.
R. Cilibrasi, P.M.B. Vitányi, Clustering by compression, IEEE Trans. Inf. Theory 51 (2005) 1523–1545, https://doi.org/10.1109/TIT.2005.844059.
M. Hutter, New error bounds for Solomonoff prediction, J. Comput. Syst. Sci. 62 (4) (2001) 653–667, http://arxiv.org/abs/cs.AI/9912008.
M. Hutter, Convergence and loss bounds for Bayesian sequence prediction, IEEE Trans. Inf. Theory 49 (8) (2003) 2061–2067, http://arxiv.org/abs/cs.LG/0301014.
M. Hutter, Optimality of universal Bayesian prediction for general loss and alphabet, J. Mach. Learn. Res. 4 (2003) 971–1000, http://arxiv.org/abs/cs.LG/0311014.
M. Hutter, General loss bounds for universal sequence prediction, in: Proc. 18th Intl. Conf. Machine Learning (ICML-2001), 2001, pp. 210–217, http://arxiv.org/abs/cs.AI/0101019.
M. Hutter, Convergence and error bounds for universal prediction of nonbinary sequences, in: Proc. 12th European Conf. Machine Learning (ECML-2001), 2001, pp. 239–250, http://arxiv.org/abs/cs.LG/0106036.
A.K. Zvonkin, L.A. Levin, The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms, Russ. Math. Surveys 25 (1970) 83–124, https://doi.org/10.1070/RM1970v025n06ABEH001269.
M. Hutter, A.A. Muchnik, Universal convergence of semimeasures on individual random sequences, in: Proc. 15th Int. Conf. Algorithmic Learning Theory (ALT’04), LNAI, vol. 3244, Springer, Berlin, Padova, 2004, pp. 234–248, http://arxiv.org/abs/cs.LG/0407057.
J. Schmidhuber, Algorithmic theories of everything, Report IDSIA-20-00, IDSIA, Manno (Lugano), Switzerland, 2000, http://arxiv.org/abs/quant-ph/0011122.
J. Schmidhuber, Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit, Int. J. Foundations Comput. Sci. 13 (2002) 587–612, https://doi.org/10.1142/S0129054102001291.
M. Hutter, Sequence prediction based on monotone complexity, in: Proc. 16th Annual Conf. Learning Theory (COLT'03), LNAI, vol. 2777, Springer, Berlin, 2003, pp. 506–521, http://arxiv.org/abs/cs.AI/0306036.
M. Hutter, Sequential predictions based on algorithmic complexity, J. Comput. Syst. Sci. 72 (2006) 95–117, http://arxiv.org/abs/cs.IT/0508043.
N.K. Vereshchagin, A. Shen, V.A. Uspensky, Lecture Notes on Kolmogorov Complexity, unpublished manuscript, 2005, http://lpcs.math.msu.su/~ver/kolm-book.
V.A. Uspensky, A. Shen, Relations between varieties of Kolmogorov complexities, Math. Systems Theory 29 (1996) 271–292, https://doi.org/10.1007/BF01201280.
J. Poland, M. Hutter, Convergence of discrete MDL for sequential prediction, in: Proc. 17th Annual Conf. Learning Theory (COLT’04), LNAI, vol. 3120, Springer, Berlin, Banff, 2004, pp. 300–314, http://arxiv.org/abs/cs.LG/0404057.