Faculty of Engineering
Department of Electrical Engineering
Communication ECE 407
Assignment No.1
For the given binary channel, the input probabilities are P(x1) = 1/3 and P(x2) = 2/3, and the channel transition probabilities are P(y1|x1) = 2/3, P(y2|x1) = 1/3, P(y1|x2) = 1/10, P(y2|x2) = 9/10.
Calculate:
1. The output probabilities.
2. Source entropy H(x).
3. Destination entropy H(y).
4. Conditional entropy H(x/y).
5. System entropy H(x, y).
6. Average mutual information I(x; y).
7. Gain of information of x2 due to the reception of y2.
1. The output probabilities,

P(y1) = P(y1|x1)·P(x1) + P(y1|x2)·P(x2) = (2/3)(1/3) + (1/10)(2/3) = 13/45

P(y2) = P(y2|x1)·P(x1) + P(y2|x2)·P(x2) = (1/3)(1/3) + (9/10)(2/3) = 32/45
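As a quick numerical check (not part of the original hand calculation), the same output probabilities follow from a few lines of Python using the input and transition probabilities above:

    # Output probabilities of the binary channel, P(yj) = sum_i P(xi) * P(yj|xi)
    from fractions import Fraction as F

    p_x = [F(1, 3), F(2, 3)]                    # P(x1), P(x2)
    p_y_given_x = [[F(2, 3), F(1, 3)],          # row i: P(y1|xi), P(y2|xi)
                   [F(1, 10), F(9, 10)]]

    p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(2)) for j in range(2)]
    print(p_y)                                  # [Fraction(13, 45), Fraction(32, 45)]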
2. Source entropy,

H(X) = (1/3)·log2(3) + (2/3)·log2(3/2) = log2(3) − 2/3 ≈ 0.918
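The same value follows from a direct evaluation (a small check, not part of the original solution):

    # H(X) = sum_i P(xi) * log2(1/P(xi))
    from math import log2

    p_x = [1/3, 2/3]
    H_X = sum(p * log2(1 / p) for p in p_x)
    print(H_X)                                  # ≈ 0.9183 bits, i.e. log2(3) - 2/3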
3. Destination entropy,

H(Y) = (13/45)·log2(45/13) + (32/45)·log2(45/32) ≈ 0.8673
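Again, a quick check of the arithmetic:

    # H(Y) = sum_j P(yj) * log2(1/P(yj))
    from math import log2

    p_y = [13/45, 32/45]
    H_Y = sum(p * log2(1 / p) for p in p_y)
    print(H_Y)                                  # ≈ 0.8673 bits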
4. Conditional entropy,

P(x1|y1) = P(y1|x1)·P(x1) / P(y1) = (2/3)(1/3) / (13/45) = 10/13

P(x2|y1) = P(y1|x2)·P(x2) / P(y1) = (1/10)(2/3) / (13/45) = 3/13

P(x1|y2) = P(y2|x1)·P(x1) / P(y2) = (1/3)(1/3) / (32/45) = 5/32

P(x2|y2) = P(y2|x2)·P(x2) / P(y2) = (9/10)(2/3) / (32/45) = 27/32

H(X|Y) = (10/13)(13/45)·log2(13/10) + (3/13)(13/45)·log2(13/3)
       + (5/32)(32/45)·log2(32/5) + (27/32)(32/45)·log2(32/27) ≈ 0.66974
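The conditional entropy can be verified numerically from the posterior probabilities computed above (a check, not part of the original solution):

    # H(X|Y) = sum_j P(yj) * sum_i P(xi|yj) * log2(1/P(xi|yj))
    from math import log2

    p_y = [13/45, 32/45]
    p_x_given_y = [[10/13, 3/13],               # P(x1|y1), P(x2|y1)
                   [5/32, 27/32]]               # P(x1|y2), P(x2|y2)

    H_X_given_Y = sum(p_y[j] * p_x_given_y[j][i] * log2(1 / p_x_given_y[j][i])
                      for j in range(2) for i in range(2))
    print(H_X_given_Y)                          # ≈ 0.6697 bits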
5. System entropy,
H(X, Y) = H(Y) + H(X|Y) ≈ 0.8673 + 0.66974 ≈ 1.537
6. Average mutual information,
I(X; Y) = H(X) − H(X|Y) ≈ 0.918 − 0.66974 ≈ 0.24826
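Both the system entropy and the mutual information can also be computed directly from the joint distribution P(xi, yj) implied by the given probabilities; a short numerical check (not part of the original solution):

    # H(X,Y) and I(X;Y) from the joint probabilities P(xi, yj) = P(xi) * P(yj|xi)
    from math import log2

    p_xy = [[(1/3) * (2/3), (1/3) * (1/3)],     # row x1: (x1,y1), (x1,y2)
            [(2/3) * (1/10), (2/3) * (9/10)]]   # row x2: (x2,y1), (x2,y2)

    p_x = [sum(row) for row in p_xy]
    p_y = [sum(p_xy[i][j] for i in range(2)) for j in range(2)]

    H_XY = sum(p * log2(1 / p) for row in p_xy for p in row)
    I_XY = sum(p_xy[i][j] * log2(p_xy[i][j] / (p_x[i] * p_y[j]))
               for i in range(2) for j in range(2))
    print(H_XY, I_XY)                           # ≈ 1.537 bits, ≈ 0.2483 bits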
7. Gain of information of x2 due to the reception of y2.
I(x2; y2) = log2[ P(x2|y2) / P(x2) ] = log2[ (27/32) / (2/3) ] ≈ 0.33985
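A one-line check of this value (again, not part of the original solution):

    # I(x2; y2) = log2( P(x2|y2) / P(x2) )
    from math import log2
    print(log2((27/32) / (2/3)))                # ≈ 0.3399 bits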
[Figure: cascade of two channels, X → (1st Channel) → Y → (2nd Channel) → Z]
Show that,
H(X|Z) ≥ H(X|Y)

By definition,

H(X|Y) = Σ_{x,y} P(x, y) · log[ 1 / P(x|y) ]

H(X|Z) = Σ_{x,z} P(x, z) · log[ 1 / P(x|z) ]
We can bring the third variable into each sum by summing over it as well, which changes nothing:
H(X|Y) = Σ_{x,y,z} P(x, y, z) · log[ 1 / P(x|y) ]

H(X|Z) = Σ_{x,y,z} P(x, y, z) · log[ 1 / P(x|z) ]

Subtracting,

H(X|Z) − H(X|Y) = Σ_{x,y,z} P(x, y, z) · log[ P(x|y) / P(x|z) ]

Because the channels are cascaded, X → Y → Z is a Markov chain and P(x|y) = P(x|y, z), so

H(X|Z) − H(X|Y) = Σ_{x,y,z} P(x, y, z) · log[ P(x|y, z) / P(x|z) ] ≥ 0

since the last sum is an averaged relative entropy, which is never negative. Hence

H(X|Z) ≥ H(X|Y),
with equality if and only if P(x|z) = P(x|y).
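The inequality can also be spot-checked numerically for a cascade of two binary channels; the input distribution and transition matrices below are hypothetical values chosen only for illustration:

    # Check H(X|Z) >= H(X|Y) for a cascade X -> Y -> Z of two binary channels.
    from math import log2
    from itertools import product

    p_x = [0.3, 0.7]
    Q1 = [[0.9, 0.1], [0.2, 0.8]]               # first channel,  P(y|x)
    Q2 = [[0.7, 0.3], [0.4, 0.6]]               # second channel, P(z|y)

    p_xyz = {(x, y, z): p_x[x] * Q1[x][y] * Q2[y][z]
             for x, y, z in product(range(2), repeat=3)}

    def cond_entropy(target, given):
        # H(variable `target` | variable `given`); indices 0, 1, 2 stand for x, y, z.
        H = 0.0
        for key, p in p_xyz.items():
            joint = sum(q for k, q in p_xyz.items()
                        if k[target] == key[target] and k[given] == key[given])
            marg = sum(q for k, q in p_xyz.items() if k[given] == key[given])
            H += p * log2(marg / joint)         # p * log2(1 / P(target|given))
        return H

    print(cond_entropy(0, 1), cond_entropy(0, 2))   # H(X|Y) <= H(X|Z)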
A signal amplitude X is a random variable uniformly distributed over the range (-1, 1).
The output Y is also a random variable, uniformly distributed over the range (-2, 2).
Find,
A) Source entropy.
B) Destination entropy.
For the source entropy, the variable is uniformly distributed over the given range, so the normalization property of the probability density gives f(x) = 1/2:
H(X) = ∫ from −1 to 1 of (1/2)·log2(2) dx = 1
For the destination, the same reasoning applies, with f(y) = 1/4:

H(Y) = ∫ from −2 to 2 of (1/4)·log2(4) dy = 2
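Both results are instances of the general fact that a density uniform on (−a, a) has differential entropy log2(2a); a minimal sketch in Python:

    # Differential entropy of a uniform density on (-a, a): h = log2(2a).
    from math import log2

    def uniform_entropy(a):
        # h(X) = integral over (-a, a) of (1/(2a)) * log2(2a) dx = log2(2a)
        return log2(2 * a)

    print(uniform_entropy(1))                   # source:      log2(2) = 1 bit
    print(uniform_entropy(2))                   # destination: log2(4) = 2 bits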
For a continuous random variable X constrained over the range (-M, M), show that the
entropy is maximum if X is uniformly distributed over the range (-M, M). Then, find
the corresponding maximum entropy.
H(X) = −∫ from −M to M of P(x)·log2 P(x) dx
Applying the calculus of variations (taking the functional derivative with respect to P(x)) and working with the natural logarithm, which only shifts the Lagrange multiplier λ by a constant, we get
−ln P(x) − 1 + λ = 0

P(x) = e^(λ − 1)
We still have the normalization of probability to determine λ:
∫ from −M to M of P(x) dx = ∫ from −M to M of e^(λ − 1) dx = 1

2M·e^(λ − 1) = 1

e^(λ − 1) = 1/(2M), and therefore P(x) = 1/(2M)
Which means that the variable X is uniformly distributed over the interval (−M, M). The corresponding maximum entropy is then

H(X) = ∫ from −M to M of (1/(2M))·log2(2M) dx = log2(2M)
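As a numerical illustration of this result (not part of the derivation above), comparing the uniform density on (−M, M) with a triangular density on the same support shows that the uniform one has the larger differential entropy:

    # Differential entropies of two densities on (-M, M), by simple numerical integration.
    from math import log2

    M, N = 1.0, 100_000
    dx = 2 * M / N
    xs = [-M + (k + 0.5) * dx for k in range(N)]

    def diff_entropy(f):
        return sum(-f(x) * log2(f(x)) * dx for x in xs if f(x) > 0)

    uniform  = lambda x: 1 / (2 * M)
    triangle = lambda x: (M - abs(x)) / M**2    # triangular density, also normalized

    print(diff_entropy(uniform))                # log2(2M) = 1 bit
    print(diff_entropy(triangle))               # ≈ 0.721 bits < 1 bit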
Find the input distribution P(xi) that will achieve maximum capacity, and then find the capacity.
For a symmetric channel, the capacity is the mutual information evaluated with equiprobable inputs, P(x0) = P(x1) = 1/2:
C = I(X; Y)|_{P(xi) = 1/2} = Σ_{i=0}^{1} Σ_{j=0}^{1} P(xi)·P(yj|xi)·log2[ P(yj|xi) / P(yj) ]
We have,
P(yj) = Σ_{i=0}^{1} P(xi)·P(yj|xi) = (1/2)·( p + (1 − p) ) = 1/2

where p is the crossover probability of the channel, so that

C = p·log2[ p / (1/2) ] + (1 − p)·log2[ (1 − p) / (1/2) ]
Further simplification would yield,
C = log2(2) + p·log2(p) + (1 − p)·log2(1 − p) = 1 + p·log2(p) + (1 − p)·log2(1 − p)
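The resulting capacity, C = 1 + p·log2(p) + (1 − p)·log2(1 − p), is easy to evaluate numerically; the crossover probability used below is only a sample value:

    # Capacity of a binary symmetric channel with crossover probability p.
    from math import log2

    def bsc_capacity(p):
        if p in (0.0, 1.0):                     # noiseless or fully inverted channel
            return 1.0
        return 1 + p * log2(p) + (1 - p) * log2(1 - p)

    print(bsc_capacity(0.1))                    # ≈ 0.531 bits per channel use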