The Source Coding Theorem: A source with entropy rate $H$ can be coded with arbitrarily small probability of coding failure at any rate $R$ (bits/source output) if $R > H$. However, if $R < H$, the failure probability is bounded away from zero (no matter how hard you try).
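As a quick numerical illustration (a sketch; the binary source and its probabilities are my own example, not from the notes), the entropy of a memoryless source sets the achievable rate:

```python
from math import log2

def entropy(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A hypothetical memoryless binary source with P(0) = 0.9, P(1) = 0.1
H = entropy([0.9, 0.1])
print(H)  # about 0.469 bits/source letter

# Any rate R > H bits/source letter is achievable with vanishing
# failure probability; any R < H cannot be.
```

So for this source, a code spending just over 0.47 bits per letter can succeed with probability as close to 1 as desired.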
Aside
This makes the code for (III) from before OPTIMAL! You can't do any better.
Why, you ask?
Look at a long sequence of $n$ source letters and only code the "typical" ones.
If a sequence $\mathbf{x} = (x_1, \ldots, x_n)$ is emitted, then by the LLN it is typical with high probability, because typical sequences will have (on average) $np_j$ occurrences of letter $j$, where $p_j$ is the probability of source letter $j$.
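The LLN claim about letter counts is easy to check empirically (a hypothetical three-letter source; the probabilities are made up for illustration):

```python
import random

random.seed(0)

# Hypothetical memoryless source over letters {0, 1, 2}
p = [0.5, 0.3, 0.2]
n = 100_000
seq = random.choices(range(3), weights=p, k=n)

# By the LLN, letter j appears about n * p[j] times,
# i.e., its empirical frequency is close to p[j]
for j in range(3):
    print(j, seq.count(j) / n)
```

With $n = 100{,}000$ draws, each empirical frequency lands within about a percent of $p_j$.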
Well, the probability of any particular typical sequence is then

$P(\mathbf{x}) \approx \prod_j p_j^{np_j} = 2^{n \sum_j p_j \log_2 p_j} = 2^{-nH(X)}$

So (HAND WAVE) the probability that a sequence is not typical is negligible for large $n$.
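The claim that a typical sequence has probability about $2^{-nH(X)}$ can be checked numerically (a Bernoulli source with $p = 0.1$, chosen purely for illustration):

```python
import random
from math import log2

random.seed(1)

p, n = 0.1, 10_000
H = -p * log2(p) - (1 - p) * log2(1 - p)  # about 0.469 bits

# Draw one long i.i.d. sequence and compute its log-probability
x = [1 if random.random() < p else 0 for _ in range(n)]
logp = sum(log2(p) if b else log2(1 - p) for b in x)

# -log2 P(x) / n concentrates around H(X), i.e. P(x) is about 2^{-nH(X)}
print(-logp / n, H)
```

The per-letter log-probability of the drawn sequence lands within a few hundredths of a bit of $H(X)$, which is exactly the AEP statement being hand-waved above.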
Therefore, since the typical sequences are roughly equiprobable and carry almost all the probability, the number of typical sequences is approximately $2^{nH(X)}$.
So code each of the typical sequences with $nH(X)$ bits AND ignore the others!
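Counting sequences directly agrees with this (a binary source with illustrative parameters $p = 0.1$, $n = 1000$): the sequences with the typical number of ones number about $2^{nH(X)}$, so $nH(X)$ bits per block suffice.

```python
from math import comb, log2

def H(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p, n = 0.1, 1000
k = round(n * p)  # a typical sequence has about np ones

# log2 of the number of length-n binary sequences with exactly k ones,
# compared against n * H(X); they differ only by O(log n) bits
print(log2(comb(n, k)), n * H(p))
```

Here both numbers come out near 469 bits for a 1000-letter block, i.e., about 0.47 bits per source letter rather than the 1 bit a naive code would spend.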
Christopher Rose
1999-02-24