Operational Channel Capacity = Information Channel Capacity
We prove a series of propositions leading up to this result.
We only actually prove the equality for a Binary Symmetric Channel (BSC).

Proposition

For a Discrete Memoryless Channel (DMC), reliable transmission at rate $R$ is possible only if $R \leq C$, where $C = \max_{p(x)} I(X;Y)$ is the Information Channel Capacity.

Proof

Let $C = \max_{p(x)} I(X;Y)$ be the Information Channel Capacity.
Suppose we can Transmit Reliably at rate $R > C$,
i.e. there is a sequence of codes $(\mathcal{C}_n)$ with $\mathcal{C}_n$ of length $n$ and size $\lceil 2^{nR} \rceil$
such that $\hat{e}(\mathcal{C}_n) \to 0$ as $n \to \infty$.
The average error probability, defined as

$$\bar{e}(\mathcal{C}_n) = \frac{1}{|\mathcal{C}_n|} \sum_{c \in \mathcal{C}_n} \mathbb{P}(\text{error} \mid c \text{ sent}),$$

is clearly $\leq \hat{e}(\mathcal{C}_n)$ (i.e. smaller than the max error probability), so $\bar{e}(\mathcal{C}_n) \to 0$ as well.

Take Random Variable $X$ to be the input, uniformly distributed over $\mathcal{C}_n$,
and $Y$ to be the output when $X$ is transmitted, with $\hat{X}(Y)$ the decoded codeword.
So, say $\mathbb{P}(\hat{X} \neq X) = \bar{e}(\mathcal{C}_n)$.
Then by Fano's Inequality we can write:

$$nR \leq \log_2 |\mathcal{C}_n| = H(X) = I(X;Y) + H(X \mid Y) \leq nC + 1 + \bar{e}(\mathcal{C}_n) \log_2 |\mathcal{C}_n|,$$

using that $n$ uses of the DMC give $I(X;Y) \leq nC$. Dividing by $n$ and letting $n \to \infty$ gives $R \leq C$, contradicting $R > C$.
Thus the sequence of codes doesn't exist.
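For concreteness, the BSC capacity formula $C = 1 - H(p)$ used in the next Proposition can be checked against the definition $C = \max_{p(x)} I(X;Y)$ numerically. A minimal sketch (the choice $p = 0.1$ and the sweep granularity are arbitrary, not from the notes):

```python
import math

def H(q):
    """Binary entropy in bits, with H(0) = H(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def I_bsc(a, p):
    """I(X;Y) for a BSC(p) with input distribution P(X = 1) = a:
    I(X;Y) = H(Y) - H(Y|X) = H(a(1-p) + (1-a)p) - H(p)."""
    return H(a * (1 - p) + (1 - a) * p) - H(p)

p = 0.1
# Sweep input distributions; the maximum is attained at a = 1/2.
best = max(I_bsc(a / 1000, p) for a in range(1001))
print(best, 1 - H(p))   # both ~ 0.531, i.e. C = 1 - H(p)
```

The sweep confirms that the uniform input distribution maximises $I(X;Y)$ for the BSC.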

Proposition

Consider a Binary Symmetric Channel (BSC) with error probability $p$. Let $R < C = 1 - H(p)$.
Then there is a sequence of codes $(\mathcal{C}_n)$ of length $n$ and size $\lceil 2^{nR} \rceil$ s.t. $\bar{e}(\mathcal{C}_n) \to 0$ as $n \to \infty$.

Proof

WLOG $p < \frac{1}{2}$, so there is some $\epsilon > 0$ s.t. $R < 1 - H(p + \epsilon)$ and $p + \epsilon \leq \frac{1}{2}$ (possible since $R < 1 - H(p)$ and $H$ is continuous).
We use the minimum distance Decoding rule.
Let $m = \lceil 2^{nR} \rceil$.
We pick an $(n, m)$ Binary Code at random
(i.e. each codeword chosen independently and uniformly from $\{0,1\}^n$, each string with probability $2^{-n}$).
Say $\mathcal{C}_n = \{c_1, \dots, c_m\}$.
Choose $c_i$ at random (i.e. each with probability $\frac{1}{m}$).
We send $c_i$ through the channel and get output $y$.

It suffices to show that $\mathbb{P}(\text{error}) \to 0$, averaged over the random choice of code and codeword: then for each $n$ some fixed code achieves average error at most this expectation, giving the required sequence.

Let $r = \lfloor n(p + \epsilon) \rfloor$ and split the error event:

$$\mathbb{P}(\text{error}) \leq \mathbb{P}(d(c_i, y) > r) + \mathbb{P}(\exists j \neq i : d(c_j, y) \leq r).$$

Now consider those two terms separately.

If $d(c_i, y) > r$, then more than $n(p + \epsilon)$ bits were flipped; each bit is flipped independently with probability $p < p + \epsilon$, so by the Law of Large Numbers this probability $\to 0$ as $n \to \infty$.
For the second term: each $c_j$ with $j \neq i$ is uniform on $\{0,1\}^n$ and independent of $y$, so $\mathbb{P}(d(c_j, y) \leq r) \leq 2^{nH(p+\epsilon)} / 2^n$ (the Hamming ball of radius $r$ contains at most $2^{nH(p+\epsilon)}$ points when $p + \epsilon \leq \frac{1}{2}$). By the union bound the second term is at most

$$m \cdot 2^{-n(1 - H(p+\epsilon))} \leq 2^{n(R - 1 + H(p+\epsilon)) + 1} \to 0,$$

since $R < 1 - H(p + \epsilon)$.
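The random-coding argument can be illustrated empirically. The sketch below (hypothetical code, not part of the notes) draws a fresh random code each trial, sends a uniformly random codeword through a simulated BSC, and decodes by minimum Hamming distance; at fixed $R < C$ the estimated average error probability shrinks as $n$ grows:

```python
import random

def avg_error(n, R, p, trials=1000, seed=0):
    """Monte Carlo estimate of the average error probability of a
    random binary code of length n and size ~2^(nR) on a BSC(p),
    decoded by minimum Hamming distance."""
    rng = random.Random(seed)
    m = 2 ** round(n * R)                              # code size ~ 2^(nR)
    errors = 0
    for _ in range(trials):
        code = [rng.getrandbits(n) for _ in range(m)]  # uniform codewords
        i = rng.randrange(m)                           # uniform message
        noise = sum(1 << k for k in range(n) if rng.random() < p)
        y = code[i] ^ noise                            # BSC output
        # minimum-distance decoding (ties broken by lowest index)
        decoded = min(range(m), key=lambda j: bin(code[j] ^ y).count("1"))
        errors += decoded != i
    return errors / trials

p, R = 0.1, 0.3          # R = 0.3 < C = 1 - H(0.1) ~ 0.53
print(avg_error(10, R, p), avg_error(30, R, p))  # error drops as n grows
```

The block lengths, rate, and trial count here are illustrative choices; the decay with $n$ is what the Proposition guarantees.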

Proposition

Consider a Binary Symmetric Channel (BSC) with error probability $p$.
Let $R < C = 1 - H(p)$.
Then there is a sequence of codes $(\mathcal{C}_n')$ with $\mathcal{C}_n'$ of length $n$, size $\lceil 2^{nR} \rceil$,
and $\hat{e}(\mathcal{C}_n') \to 0$ as $n \to \infty$.

Proof

Pick $R'$ s.t. $R < R' < C$.
By the previous Proposition, we construct a sequence of codes $(\mathcal{C}_n)$
with $\mathcal{C}_n$ of length $n$ and size $\lceil 2^{nR'} \rceil$, and $\bar{e}(\mathcal{C}_n) \to 0$ as $n \to \infty$.
Throwing out the worst half of the codewords in $\mathcal{C}_n$ gives a code $\mathcal{C}_n'$ with

$$\hat{e}(\mathcal{C}_n') \leq 2 \bar{e}(\mathcal{C}_n)$$

(at most half the codewords can have error probability above twice the average),
so $\hat{e}(\mathcal{C}_n') \to 0$ as $n \to \infty$.
Note $\mathcal{C}_n'$ has length $n$ and size $\geq \frac{1}{2} 2^{nR'} \geq 2^{nR}$ for $n$ sufficiently large.
So we can replace $\mathcal{C}_n'$ by a subcode of size $\lceil 2^{nR} \rceil$
and still get $\hat{e} \to 0$ as $n \to \infty$, since removing codewords cannot increase the maximum error probability.
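A small numerical illustration of the expurgation step (the per-codeword error probabilities below are made up, not from the notes): by Markov's inequality at most half the codewords can have error probability above twice the average, so every survivor of "throwing out the worst half" satisfies $e(c) \leq 2\bar{e}$.

```python
def expurgate(per_codeword_error):
    """Drop the worst half of the codewords (highest error probability).
    Markov's inequality guarantees every survivor has error probability
    at most twice the average of the original code."""
    ranked = sorted(per_codeword_error)
    return ranked[: len(ranked) // 2]

# Hypothetical per-codeword error probabilities for a code of size 8.
errs = [0.01, 0.02, 0.02, 0.03, 0.10, 0.20, 0.30, 0.32]
avg = sum(errs) / len(errs)     # average error = 0.125
kept = expurgate(errs)
print(max(kept), 2 * avg)       # max over survivors (0.03) <= 2 * avg (0.25)
```

Note the bound $\hat{e}' \leq 2\bar{e}$ holds for every possible list of error probabilities, not just this example; the few high-error codewords that dominate the maximum are exactly the ones removed.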