Saturday, August 20, 2011

Channel Capacity

Channel Capacity, C, is defined as "the maximum mutual information I(X; Y) in any single use of the channel, where the maximization is over all possible input probability distributions {p(xj)} on X."
C is measured in bits/channel-use, or bits/transmission.
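
The maximization in this definition can be carried out numerically. Below is a minimal Python sketch, not from the original post, that recovers the capacity of a binary symmetric channel by sweeping over input distributions; the crossover probability eps = 0.1 and the grid resolution are assumptions chosen for illustration.

import numpy as np

def mutual_information(p_x, p_y_given_x):
    # I(X; Y) in bits, for input distribution p_x and channel matrix p(y|x)
    p_xy = p_x[:, None] * p_y_given_x        # joint p(x, y)
    p_y = p_xy.sum(axis=0)                   # output marginal p(y)
    prod = p_x[:, None] * p_y[None, :]       # product p(x) p(y)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask]))

# Binary symmetric channel with assumed crossover probability eps
eps = 0.1
channel = np.array([[1 - eps, eps],
                    [eps, 1 - eps]])         # rows: p(y | x)

# Sweep p(x=0) over a fine grid; the maximum of I(X; Y) is C
grid = np.linspace(1e-6, 1 - 1e-6, 1001)
C = max(mutual_information(np.array([q, 1 - q]), channel) for q in grid)
print(f"C = {C:.4f} bits/channel-use")       # ~0.5310, i.e. 1 - H(0.1)

For the binary symmetric channel the sweep confirms the known closed form C = 1 - H(eps), attained by the uniform input distribution.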

Information Capacity Theorem: "The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by

C = B log2(1 + P/(N0 B)) bits per second,

where P is the average transmitted power."
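
As a quick numeric check, the formula is easy to evaluate in Python. The bandwidth, noise density, and 30 dB SNR below are assumed values for a telephone-grade channel, not figures from the post.

import math

def information_capacity(B, P, N0):
    # Shannon-Hartley: C = B log2(1 + P / (N0 B)), in bits per second
    return B * math.log2(1 + P / (N0 * B))

B = 3000.0             # channel bandwidth in Hz (assumed)
N0 = 1e-9              # noise PSD is N0/2, so N0 in W/Hz (assumed)
P = 1000 * N0 * B      # average power chosen so P/(N0 B) = 1000, i.e. 30 dB SNR

print(f"C = {information_capacity(B, P, N0):.0f} bits/second")   # ~29901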

Assumptions:
1. The channel is band-limited and power-limited, with additive white Gaussian noise.
2. A zero-mean stationary process X(t), band-limited to B hertz, is sampled at the Nyquist rate of 2B samples per second.
3. These samples are transmitted in T seconds over a noisy channel, also band-limited to B hertz.
The number of samples, K, is therefore
K = 2BT
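
Combining K = 2BT with the capacity formula shows how many bits each Nyquist sample can carry: dividing C by the sample rate 2B gives (1/2) log2(1 + SNR) bits per sample. The short sketch below reuses the assumed 3 kHz, 30 dB example from above.

import math

B, T = 3000.0, 1.0       # bandwidth in Hz and observation interval in s (assumed)
SNR = 1000.0             # P / (N0 B), assumed 30 dB as above

K = 2 * B * T                        # Nyquist samples available in T seconds
C = B * math.log2(1 + SNR)           # capacity in bits per second
bits_per_sample = C * T / K          # equals 0.5 * log2(1 + SNR)

print(f"K = {K:.0f} samples, C*T = {C * T:.0f} bits, "
      f"{bits_per_sample:.2f} bits/sample")   # 6000 samples, ~4.98 bits/sample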
