Abstract:
The word "martingale" has related, but different, meanings in
probability theory and theoretical computer science. In computational
complexity and algorithmic information theory, a martingale is
typically a function d on strings such that E(d(wb) | w) = d(w) for
all strings w, where the conditional expectation is computed over all
possible values of the next symbol b. In modern probability theory a
martingale is typically a sequence
ξ_0, ξ_1, ξ_2, ... of random variables such that E(ξ_{n+1} | ξ_0, ..., ξ_n) = ξ_n for all n.
This paper elucidates the relationship between these notions and proves that the latter notion is too weak for many purposes in computational complexity, because under this definition every computable martingale can be simulated by a polynomial-time computable martingale.
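To make the first (string-indexed) definition concrete, here is a small illustrative sketch, not taken from the paper: a hypothetical betting strategy d on binary strings, together with a numerical check of the averaging condition E(d(wb) | w) = d(w) under the uniform measure on the next bit.

```python
# A minimal sketch (illustrative, not from the paper): a martingale d on
# binary strings must satisfy d(w) = (d(w0) + d(w1)) / 2, i.e.
# E(d(wb) | w) = d(w) when the next bit b is unbiased.

from itertools import product

def d(w: str) -> float:
    """Hypothetical betting strategy: start with capital 1 and, before each
    bit, wager half the current capital that the next bit is '1'."""
    capital = 1.0
    for bit in w:
        stake = capital / 2                      # bet half the capital on '1'
        capital = capital - stake + (2 * stake if bit == "1" else 0.0)
    return capital

# Verify the fairness (averaging) condition on all strings of length <= 4.
for n in range(5):
    for bits in product("01", repeat=n):
        w = "".join(bits)
        assert abs(d(w) - (d(w + "0") + d(w + "1")) / 2) < 1e-9

print("E(d(wb) | w) = d(w) holds on all tested strings")
```

The second (sequence-indexed) definition would instead track the capital process ξ_n = d(w restricted to its first n bits) along a single infinite bit sequence; the paper's point is that this weaker, sequence-level requirement does not retain the structure that resource-bounded measure needs.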