Machine Learning — Srihari
Information Theory
Sargur N. Srihari

Topics
1. Entropy as an Information Measure
   – Discrete variable definition; relationship to code length
   – Continuous variable: differential entropy
2. Maximum Entropy
3. Conditional Entropy
4. Kullback-Leibler Divergence (Relative Entropy)
5. Mutual Information

Information Measure
• How much information is received when we observe a specific value of a discrete random variable x?
• The amount of information is the degree of surprise
  – A certain event conveys no information
  – There is more information when the event is unlikely
• It depends on the probability distribution p(x), through a quantity h(x)
• If there are two unrelated events x and y, we want h(x,y) = h(x) + h(y)
• Thus we choose h(x) = -log2 p(x)
  – The negative sign ensures that the information measure is non-negative
• The average amount of information transmitted is the expectation with respect to p(x), referred to as the entropy:
  H(x) = -Σx p(x) log2 p(x)

Usefulness of Entropy
• Uniform distribution
  – Random variable x has 8 possible states, each equally likely
  – We would need 3 bits to transmit a state
  – Also, H(x) = -8 × (1/8) log2(1/8) = 3 bits
• Non-uniform distribution
  – If x has 8 states with probabilities (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64), then H(x) = 2 bits
• The non-uniform distribution has smaller entropy than the uniform one
• Entropy has an interpretation in terms of disorder

Relationship of Entropy to Code Length
• Take advantage of a non-uniform distribution by using shorter codes for the more probable events
• If x has 8 states (a, b, c, d, e, f, g, h) with probabilities (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64), we can use the codes 0, 10, 110, 1110, 111100, 111101, 111110, 111111
  – Average code length = (1/2)×1 + (1/4)×2 + (1/8)×3 + (1/16)×4 + 4×(1/64)×6 = 2 bits
• This is the same as the entropy of the random variable
• A shorter code string is not possible, due to the need to disambiguate the string into its component parts
  – e.g., 11001110 is uniquely decoded as the sequence cad

Relationship between Entropy and Shortest Coding Length
• Noiseless coding theorem of Shannon
  – Entropy is a lower bound on the number of bits needed to transmit the state of a random variable
• Natural logarithms are used in relating entropy to other topics
  – The units are then nats instead of bits

History of Entropy: Thermodynamics to Information Theory
• Entropy is the average amount
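The entropy values quoted above (3 bits for the uniform case, 2 bits for the non-uniform case) are easy to verify numerically. A minimal sketch in Python; the function name `entropy_bits` is illustrative, not from the slides:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(x) = -sum_x p(x) log2 p(x), in bits (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [1/8] * 8
nonuniform = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]

print(entropy_bits(uniform))     # 3.0
print(entropy_bits(nonuniform))  # 2.0
```

All probabilities here are powers of two, so the results are exact in floating point: the uniform distribution gives 3.0 bits and the non-uniform one 2.0 bits, matching the slides.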
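The code-length argument can likewise be checked directly: the prefix code below reproduces the 2-bit average length, and a greedy left-to-right decoder recovers `cad` from `11001110`. The codeword table follows the slides, with the four 6-bit codewords 111100–111111 assumed for the four 1/64-probability states; the helper names are illustrative:

```python
# Prefix code for the 8-state distribution from the slides.
code = {'a': '0', 'b': '10', 'c': '110', 'd': '1110',
        'e': '111100', 'f': '111101', 'g': '111110', 'h': '111111'}
probs = {'a': 1/2, 'b': 1/4, 'c': 1/8, 'd': 1/16,
         'e': 1/64, 'f': 1/64, 'g': 1/64, 'h': 1/64}

# Expected code length = sum over states of p(x) * len(codeword).
avg_len = sum(probs[s] * len(code[s]) for s in code)
print(avg_len)  # 2.0

def decode(bits):
    """Greedy left-to-right decoding; unambiguous because the code is prefix-free."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)

print(decode('11001110'))  # prints cad
```

Because no codeword is a prefix of another, the decoder never needs backtracking: it emits a symbol the moment its buffer matches a codeword, which is exactly the disambiguation property the slide appeals to.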