Proofs of the Relations Among Source Entropies in Information Theory
…… (this average value) is the average self-information of the source; it is generally called the information entropy of the source, also known as the source entropy or Shannon entropy, and is denoted H(X).

1.7 Conditional entropy: the mathematical expectation of the conditional self-information over the joint symbol set XY; it is denoted H(X/Y).

1.8 Joint entropy (also called common entropy): the mathematical expectation of the joint self-information of each element a_i b_j of the joint discrete symbol set XY; it is denoted H(XY).

2. Basic formulas

2.1 Self-information: I(a_i) = −log2 p(a_i).

2.2 Joint self-information: I(a_i b_j) = −log2 p(a_i b_j). When X and Y are mutually independent, p(a_i b_j) = p(a_i) p(b_j), and therefore
I(a_i b_j) = −log2 p(a_i b_j) = −log2 [p(a_i) p(b_j)] = −log2 p(a_i) − log2 p(b_j) = I(a_i) + I(b_j).

2.3 Conditional self-information: I(a_i/b_j) = −log2 p(a_i/b_j), and likewise I(b_j/a_i) = −log2 p(b_j/a_i).

2.4 Mutual information: I(a_i; b_j) = log2 [p(a_i/b_j) / p(a_i)]   (i = 1, 2, …, n; j = 1, 2, …, m).

Conditional entropy as an expectation:
ⅰ. H(X/Y) = E[I(a_i/b_j)] = ∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) I(a_i/b_j) = −∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) log2 p(a_i/b_j);
ⅱ. given the random variable X, the conditional entropy H(Y/X) of the random variable Y is
H(Y/X) = E[I(b_j/a_i)] = ∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) I(b_j/a_i) = −∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) log2 p(b_j/a_i).

2.7 Joint entropy: H(XY) = ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) I(a_i b_j) = −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i b_j).

2.8 Basic probability formulas:
∑_{i=1}^{n} p(a_i) = 1,  ∑_{j=1}^{m} p(b_j) = 1,  ∑_{i=1}^{n} p(a_i/b_j) = 1,  ∑_{j=1}^{m} p(b_j/a_i) = 1,
∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) = 1,  ∑_{i=1}^{n} p(a_i b_j) = p(b_j),  ∑_{j=1}^{m} p(a_i b_j) = p(a_i),
p(a_i b_j) = p(a_i) p(b_j/a_i) = p(b_j) p(a_i/b_j).

3. Relations among the entropies

3.1 Unconditional entropy

3.1.1 H(X) = H(XY) − H(Y/X).
Proof: H(X) = −∑_i p(a_i) log2 p(a_i)
= −∑_i [∑_j p(b_j) p(a_i/b_j)] log2 [p(a_i b_j) / p(b_j/a_i)]
= −∑_i ∑_j p(a_i b_j) log2 p(a_i b_j) − [−∑_i ∑_j p(a_i b_j) log2 p(b_j/a_i)]
= H(XY) − H(Y/X),
where ∑_j p(b_j) p(a_i/b_j) = ∑_j p(a_i b_j) = p(a_i), and inside the logarithm p(a_i) = p(a_i b_j) / p(b_j/a_i).
Similarly: H(Y) = H(XY) − H(X/Y).

3.1.2 H(X) = H(X/Y) + I(X;Y) ≥ H(X/Y).
Proof: ① H(X) = −∑_i p(a_i) log2 p(a_i)
= −∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) log2 [p(a_i) p(a_i/b_j) / p(a_i/b_j)]
= ∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) log2 [p(a_i/b_j) / p(a_i)] − ∑_{j=1}^{m} ∑_{i=1}^{n} p(a_i b_j) log2 p(a_i/b_j)
= I(X;Y) + H(X/Y).
② Conversely, H(X/Y) + I(X;Y) = −∑_i [∑_j p(b_j) p(a_i/b_j)] log2 p(a_i) = H(X),
where ∑_j p(b_j) p(a_i/b_j) = ∑_j p(a_i b_j) = p(a_i).
Since I(X;Y) ≥ 0, it follows that H(X) ≥ H(X/Y).
Similarly: H(Y) = H(Y/X) + I(X;Y) ≥ H(Y/X).

3.2 Conditional entropy: H(X/Y) = H(XY) − H(Y) = H(X) − I(X;Y).

3.2.1 H(X/Y) = H(XY) − H(Y).
Proof: H(X/Y) = −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i/b_j)
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i b_j) / p(b_j)]
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i b_j) + ∑_{j=1}^{m} [∑_{i=1}^{n} p(a_i b_j)] log2 p(b_j)
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i b_j) + ∑_{j=1}^{m} p(b_j) log2 p(b_j)
= H(XY) − H(Y), where ∑_{i=1}^{n} p(a_i b_j) = p(b_j).

3.2.2 H(X/Y) = H(X) − I(X;Y).
Proof: H(X/Y) = −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i/b_j)
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i) · p(a_i/b_j) / p(a_i)]
= −∑_{i=1}^{n} [∑_{j=1}^{m} p(a_i b_j)] log2 p(a_i) − ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i/b_j) / p(a_i)]
= H(X) − I(X;Y), where ∑_{j=1}^{m} p(a_i b_j) = p(a_i).

3.3 Joint entropy: H(XY) = H(X) + H(Y/X) = H(Y) + H(X/Y)
= H(X) + H(Y) − I(X;Y)
= H(X/Y) + H(Y/X) + I(X;Y).

3.3.1 H(XY) = H(X) + H(Y/X) = H(Y) + H(X/Y).
Proof: H(XY) = −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i b_j)
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i) p(b_j/a_i)]
= −∑_{i=1}^{n} [∑_{j=1}^{m} p(a_i b_j)] log2 p(a_i) − ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(b_j/a_i)
= H(X) + H(Y/X), where ∑_{j=1}^{m} p(a_i b_j) = p(a_i).
Similarly: H(XY) = H(Y) + H(X/Y).

3.3.2 H(XY) = H(X) + H(Y) − I(X;Y).
Proof: H(XY) = −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i) p(b_j/a_i)]
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i) p(b_j) · p(b_j/a_i) / p(b_j)]
= −∑_{i=1}^{n} [∑_{j=1}^{m} p(a_i b_j)] log2 p(a_i) − ∑_{j=1}^{m} [∑_{i=1}^{n} p(a_i b_j)] log2 p(b_j) − ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(b_j/a_i) / p(b_j)]
= H(X) + H(Y) − I(X;Y),
where ∑∑ p(a_i b_j) log2 [p(b_j/a_i) / p(b_j)] = I(X;Y), since p(b_j/a_i) / p(b_j) = p(a_i/b_j) / p(a_i).

3.4.1 I(X;Y) = H(X) − H(X/Y) (this is 3.2.2 rearranged). Similarly: I(X;Y) = H(Y) − H(Y/X).

3.4.2 I(X;Y) = H(XY) − H(X/Y) − H(Y/X).
Proof: I(X;Y) = ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i/b_j) / p(a_i)]
= ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 [p(a_i/b_j) p(b_j/a_i) · 1/p(a_i b_j)]   (since p(a_i) = p(a_i b_j) / p(b_j/a_i))
= −∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i b_j) + ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(a_i/b_j) + ∑_{i=1}^{n} ∑_{j=1}^{m} p(a_i b_j) log2 p(b_j/a_i)
= H(XY) − H(X/Y) − H(Y/X).
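All of the identities proved in section 3 can also be checked numerically on a concrete finite joint distribution. The sketch below (Python, standard library only; the 2×3 joint probability table is an invented example, not taken from the text) computes each entropy directly from its defining sum in section 2 and asserts the relations 3.1.1 through 3.4.2.

```python
import math

# Hypothetical joint distribution p(a_i b_j): rows are X = a_i, columns are Y = b_j.
# Entries are strictly positive and sum to 1.
P = [[0.10, 0.20, 0.10],
     [0.25, 0.05, 0.30]]

n, m = len(P), len(P[0])
p_a = [sum(P[i][j] for j in range(m)) for i in range(n)]  # p(a_i) = sum_j p(a_i b_j)  (formula 2.8)
p_b = [sum(P[i][j] for i in range(n)) for j in range(m)]  # p(b_j) = sum_i p(a_i b_j)  (formula 2.8)

def H_X():           # H(X) = -sum_i p(a_i) log2 p(a_i)
    return -sum(p * math.log2(p) for p in p_a)

def H_Y():           # H(Y) = -sum_j p(b_j) log2 p(b_j)
    return -sum(p * math.log2(p) for p in p_b)

def H_XY():          # joint entropy, formula 2.7
    return -sum(P[i][j] * math.log2(P[i][j]) for i in range(n) for j in range(m))

def H_X_given_Y():   # H(X/Y) = -sum p(a_i b_j) log2 p(a_i/b_j), with p(a_i/b_j) = p(a_i b_j)/p(b_j)
    return -sum(P[i][j] * math.log2(P[i][j] / p_b[j]) for i in range(n) for j in range(m))

def H_Y_given_X():   # H(Y/X) = -sum p(a_i b_j) log2 p(b_j/a_i)
    return -sum(P[i][j] * math.log2(P[i][j] / p_a[i]) for i in range(n) for j in range(m))

def I_XY():          # I(X;Y) = sum p(a_i b_j) log2 [p(a_i/b_j)/p(a_i)]
    return sum(P[i][j] * math.log2((P[i][j] / p_b[j]) / p_a[i])
               for i in range(n) for j in range(m))

def close(x, y, tol=1e-12):
    return abs(x - y) < tol

assert close(H_X(), H_XY() - H_Y_given_X())                    # 3.1.1
assert close(H_X(), H_X_given_Y() + I_XY())                    # 3.1.2
assert H_X() >= H_X_given_Y()                                  # 3.1.2, inequality part
assert close(H_X_given_Y(), H_XY() - H_Y())                    # 3.2.1
assert close(H_XY(), H_X() + H_Y_given_X())                    # 3.3.1
assert close(H_XY(), H_X() + H_Y() - I_XY())                   # 3.3.2
assert close(I_XY(), H_XY() - H_X_given_Y() - H_Y_given_X())   # 3.4.2
print("all identities verified")
```

Replacing `P` with any other strictly positive joint table that sums to 1 leaves every assertion true, since the proofs rely only on the probability identities of formula 2.8.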