Information source (mathematics)

In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ and having a stationary distribution; that is, the joint distribution of any finite block of variables is invariant under time shifts.

The uncertainty, or entropy rate, of an information source is defined as

H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n \mid X_0, X_1, \dots, X_{n-1})

where

X_0, X_1, \dots, X_n

is the sequence of random variables defining the information source, and

H(X_n \mid X_0, X_1, \dots, X_{n-1})

is the conditional entropy of X_n given all of the preceding variables in the sequence. Equivalently, one has

H\{\mathbf{X}\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \dots, X_{n-1}, X_n)}{n+1}.
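For a stationary source the two limits coincide: by the chain rule, H(X_0, X_1, \dots, X_n) is the sum of the conditional entropies H(X_k \mid X_0, \dots, X_{k-1}), so the second expression is the Cesàro mean of the first, and under stationarity the conditional entropies form a non-increasing, convergent sequence.

For example, for a memoryless (i.i.d.) source the conditioning has no effect and the entropy rate is simply H(X_0), while for a stationary Markov information source with transition matrix P = (p_{ij}) and stationary distribution μ the limit evaluates to the closed form

H\{\mathbf{X}\} = -\sum_{i} \mu_{i} \sum_{j} p_{ij} \log p_{ij}.

The following minimal sketch illustrates the block-entropy form of the definition numerically, assuming a hypothetical two-state Markov source; the transition matrix P and all variable names below are illustrative choices, not taken from this article.

import numpy as np
from itertools import product

# Hypothetical two-state Markov source; the transition probabilities
# are illustrative values, not from the article.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution mu: the left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
mu = np.real(evecs[:, np.argmax(np.real(evals))])
mu = mu / mu.sum()

# Closed-form entropy rate of a stationary Markov source, in bits.
h_markov = -sum(mu[i] * P[i, j] * np.log2(P[i, j])
                for i in range(2) for j in range(2))

# Block-entropy estimate H(X_0, ..., X_n) / (n + 1): enumerate every
# word of length n + 1 and accumulate -p * log2(p).
n = 12
h_block = 0.0
for word in product(range(2), repeat=n + 1):
    p = mu[word[0]]                    # probability of the first symbol
    for a, b in zip(word, word[1:]):   # multiply in each transition
        p *= P[a, b]
    h_block -= p * np.log2(p)

print(h_markov)           # ~0.5694 bits per symbol
print(h_block / (n + 1))  # ~0.581, approaching the value above as n grows

As n increases, the per-symbol block entropy decreases toward the conditional-entropy limit, which is one reason the entropy rate is interpreted as the per-symbol uncertainty of the source.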

See also

  • Markov information source
  • Asymptotic equipartition property


