Next: Entropy Rate
Up: Intro to Information Theory
Previous: Thought experiment
Think of entropy as a measure of disorder:
- 1. If you have no idea what should happen, the entropy should be maximal.
- 2. If you know exactly what's going to happen, it should be zero.
- 3. If the events are independent, the entropy should be the sum of the individual entropies.
For the sources of the thought experiment:
- I: H(X) = log2(M) bits
- II: H(X) = 1 bit
- III:
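These three requirements can be checked numerically against the formula H(X) = -sum p log2 p used below; a small Python sketch, with made-up distributions:

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability vector (terms with p = 0 contribute 0)."""
    return -sum(q * log2(q) for q in p if q > 0)

# 1. No idea what will happen: uniform over M symbols gives the maximum, log2(M) bits.
M = 8
print(entropy([1 / M] * M))          # log2(8) = 3.0

# 2. Outcome known exactly: a degenerate distribution gives zero bits.
print(entropy([1.0, 0.0, 0.0]))

# 3. Independent events: entropies add (product distribution).
px, py = [0.5, 0.5], [0.25, 0.75]
pxy = [a * b for a in px for b in py]
print(entropy(pxy), entropy(px) + entropy(py))  # equal, up to rounding
```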
Idea
Entropy might have something to do with how compactly you can code the output of a source.
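This idea can be previewed with a sketch: for a hypothetical source whose symbol probabilities are powers of two, a prefix code that gives symbol j a codeword of length -log2 p_j achieves an average length exactly equal to the entropy.

```python
from math import log2

# Hypothetical dyadic source: all probabilities are powers of two.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# A prefix-free code whose codeword lengths are -log2(p).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(entropy, avg_len)  # both 1.75 bits/symbol
```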
Def.
The self-information of source digit a_j is I(a_j) = -log2 p(a_j) bits.
Def.
Entropy: H(X) = -sum_j p(a_j) log2 p(a_j) bits, the average self-information E[I(a_j)].
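As a quick check of the definition, a sketch with a made-up three-symbol source, computing each self-information and the entropy as their average:

```python
from math import log2

# Hypothetical source alphabet and probabilities.
p = {"a1": 0.5, "a2": 0.25, "a3": 0.25}

def self_info(pj):
    """Self-information I(a_j) = -log2 p(a_j), in bits."""
    return -log2(pj)

# Entropy is the expected self-information.
H = sum(pj * self_info(pj) for pj in p.values())
print({a: self_info(pj) for a, pj in p.items()})  # {'a1': 1.0, 'a2': 2.0, 'a3': 2.0}
print(H)  # 1.5 bits
```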
Def.
Joint entropy: H(XY) = -sum_x sum_y p(x,y) log2 p(x,y)
In general, H(XY) <= H(X) + H(Y).
Show, using ln z <= z - 1, that H(XY) <= H(X) + H(Y), etc.
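The inequality can be checked numerically; a sketch with a hypothetical correlated joint distribution:

```python
from math import log2

def H(dist):
    """Entropy in bits of a dict of probabilities (terms with p = 0 contribute 0)."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Hypothetical correlated pair (X, Y): joint probabilities indexed by (x, y).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals, obtained by summing the joint over the other variable.
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 1: 0.5}

print(H(pxy), H(px) + H(py))  # H(XY) < H(X) + H(Y) since X and Y are correlated
```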
Def.
Conditional entropy: H(Y|X) = -sum_x sum_y p(x,y) log2 p(y|x)
and from the decomposition p(x,y) = p(x) p(y|x),
H(XY) = H(X) + H(Y|X).
In general, H(Y|X) <= H(Y), which implies that H(XY) <= H(X) + H(Y).
Usually seen as
H(XY)=H(X)+H(Y|X)=H(Y)+H(X|Y)
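The decomposition can be verified numerically; a sketch with a hypothetical joint distribution, where the marginals and conditional entropies are computed from the joint:

```python
from math import log2

# Hypothetical joint distribution of (X, Y).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    """Entropy in bits of a dict of probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Sum the joint over the other coordinate to get a marginal."""
    m = {}
    for key, p in joint.items():
        m[key[axis]] = m.get(key[axis], 0.0) + p
    return m

def cond_entropy(joint, given_axis):
    """H(other | given) = -sum_{x,y} p(x,y) log2 p(other | given)."""
    m = marginal(joint, given_axis)
    return -sum(p * log2(p / m[key[given_axis]]) for key, p in joint.items() if p > 0)

px, py = marginal(pxy, 0), marginal(pxy, 1)
print(H(pxy))                        # H(XY)
print(H(px) + cond_entropy(pxy, 0))  # H(X) + H(Y|X)
print(H(py) + cond_entropy(pxy, 1))  # H(Y) + H(X|Y) -- all three agree
```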
Notice that if X is independent of Y, then
H(XY)=H(X)+H(Y). Why? Independence gives p(y|x) = p(y), so H(Y|X) = H(Y).
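A numerical check: for a product distribution (hypothetical marginals), H(Y|X) equals H(Y), so the chain rule reduces to H(XY) = H(X) + H(Y).

```python
from math import log2

# Hypothetical independent pair: p(x, y) = p(x) p(y).
px, py = [0.7, 0.3], [0.25, 0.75]
pxy = {(x, y): px[x] * py[y] for x in range(2) for y in range(2)}

# H(Y|X) = -sum p(x,y) log2 p(y|x); with independence p(y|x) = p(y).
HY_given_X = -sum(p * log2(p / px[k[0]]) for k, p in pxy.items())
HY = -sum(q * log2(q) for q in py)
print(HY_given_X, HY)  # equal, up to rounding
```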
Christopher Rose
1999-02-24