Let $X$ and $Y$ be Random Variables. The mutual information is
$$I(X;Y) := H(X) - H(X \mid Y),$$
i.e. the amount of information about $X$ conveyed by $Y$
(where $H$ is the Mathematical Entropy).
Note $I(X;Y) \ge 0$, with equality iff $X$ and $Y$ are independent.
Also
$$I(X;Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y),$$
so in particular $I(X;Y) = I(Y;X)$.
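For example (an illustrative check of the definition): if $X$ is a uniform bit and $Y = X$, then $H(X \mid Y) = 0$, so $I(X;Y) = H(X) = 1$ bit; if instead $X$ and $Y$ are independent, then $H(X \mid Y) = H(X)$ and $I(X;Y) = 0$.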
Lemma

For Random Variables $X$ and $Y$,
let $p_X$ and $p_Y$ be their distributions and $p_{X,Y}$ their joint distribution.
Then the following holds
$$I(X;Y) = D\!\left(p_{X,Y} \,\middle\|\, p_X \otimes p_Y\right),$$
where $D$ is the Relative Entropy.
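A one-line expansion (written here for the discrete case) shows why, and, since the Relative Entropy is nonnegative and vanishes iff its arguments coincide, it also proves the Note above:
$$D\!\left(p_{X,Y} \,\middle\|\, p_X \otimes p_Y\right) = \sum_{x,y} p_{X,Y}(x,y) \log \frac{p_{X,Y}(x,y)}{p_X(x)\, p_Y(y)} = H(X) + H(Y) - H(X,Y) = I(X;Y).$$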

Remark

This measures the distance of $p_{X,Y}$ to $p_X \otimes p_Y$, i.e. how dependent $X$ and $Y$ are.
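Continuing the bit example from above: if $Y = X$ for a uniform bit $X$, then $p_{X,Y}$ puts mass $\tfrac12$ on each of $(0,0)$ and $(1,1)$, while $p_X \otimes p_Y$ is uniform on all four pairs, so
$$D\!\left(p_{X,Y} \,\middle\|\, p_X \otimes p_Y\right) = 2 \cdot \tfrac12 \log_2 \frac{1/2}{1/4} = 1 \text{ bit},$$
matching $I(X;Y) = 1$ bit computed from the definition.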

Lemma

Let $X$ and $Y$ be Random Variables.
Then