2 Information Measures

Shannon's information measures refer to entropy, conditional entropy, mutual information, and conditional mutual information. They are the most important measures of information in information theory. In this chapter, we introduce these measures and establish some basic properties they possess. The physical meanings of these measures will be discussed in depth in subsequent chapters. We then introduce the informational divergence, which measures the "distance" between two probability distributions, and prove some useful inequalities in information theory. The chapter ends with a section on the entropy rate of a stationary information source.

2.1 Independence and Markov Chains

We begin our discussion in this chapter by reviewing two basic concepts in probability: independence of random variables and Markov chains. All the random variables in this book, except for those in Chapters 10 and 11, are assumed to be discrete unless otherwise specified.

Let X be a random variable taking values in an alphabet X. The probability distribution for X is denoted as {p_X(x), x ∈ X}, with p_X(x) = Pr{X = x}. When there is no ambiguity, p_X(x) will be abbreviated as p(x), and {p(x)} will be abbreviated as p(x). The support of X, denoted by S_X, is the set of all x ∈ X such that p(x) > 0. If S_X = X, we say that p is strictly positive. Otherwise, we say that p is not strictly positive, or that p contains zero probability masses. All the above notations naturally extend to two or more random variables. As we will see, probability distributions with zero probability masses are very delicate, and they need to be handled with great care.

Definition 2.1. Two random variables X and Y are independent, denoted by X ⊥ Y, if

    p(x, y) = p(x) p(y)    (2.1)

for all x and y (i.e., for all (x, y) ∈ X × Y).

For more than two random variables, we distinguish between two types of independence.

Definition 2.2 (Mutual Independence). For n ≥ 3, random variables X_1, X_2, ..., X_n are mutually independent if

    p(x_1, x_2, ..., x_n) = p(x_1) p(x_2) ··· p(x_n)    (2.2)

for all x_1, x_2, ..., x_n.

Definition 2.3 (Pairwise Independence). For n ≥ 3, random variables X_1, X_2, ..., X_n are pairwise independent if X_i and X_j are independent for all 1 ≤ i < j ≤ n.