Attention & Memory

55 pages · 2019-07-31

Attention & Memory
集智俱乐部 · 钟翰廷

Outline
● RNN and dynamics in RNN
  ○ From vanilla RNN to LSTM/GRU
  ○ Unitary Evolution RNN
  ○ Delay & Hierarchies to Reach Farther
  ○ Dynamics in RNN
● Attention Mechanism
  ○ Introduction
  ○ Attention in Image Caption/Generation
  ○ Attention in NLP
  ○ Better Attention Mechanism
● Memory Network
  ○ Neural Turing Machine and Memory Network
  ○ Dynamic Memory Network
  ○ Hierarchical Memory Network

Part I: RNN and dynamics in RNN
● From vanilla RNN to LSTM/GRU
● Unitary Evolution RNN
● Delay & Hierarchies to Reach Farther
● Dynamics in RNN

From vanilla RNN to LSTM/GRU
Basic structure

From vanilla RNN to LSTM/GRU
BPTT: Back-Propagation Through Time
Vanishing gradient problem
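The vanishing-gradient problem on these slides can be illustrated numerically. A minimal sketch (my own construction, not from the deck): unroll a vanilla tanh RNN, then run BPTT and watch the back-propagated gradient norm collapse when the recurrent matrix is contractive.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H = 50, 16                                        # sequence length, hidden size
W_hh = 0.25 * rng.standard_normal((H, H)) / np.sqrt(H)  # spectral norm well below 1
W_xh = rng.standard_normal((H, H)) / np.sqrt(H)

# Forward pass of a vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1})
h = np.zeros(H)
states = []
for _ in range(T):
    x = rng.standard_normal(H)
    h = np.tanh(W_xh @ x + W_hh @ h)
    states.append(h)

# BPTT: the Jacobian of one step is diag(1 - h_t^2) @ W_hh, so the
# back-propagated gradient is multiplied by it once per time step.
grad = np.ones(H)                                    # arbitrary terminal gradient dL/dh_T
norms = []
for h_t in reversed(states):
    grad = W_hh.T @ ((1.0 - h_t ** 2) * grad)        # one step of backprop through time
    norms.append(np.linalg.norm(grad))

print(f"after 1 step: {norms[0]:.3e}, after {T} steps: {norms[-1]:.3e}")
```

With eigen/singular values of the recurrent Jacobian below 1, the norm decays roughly geometrically in the number of steps, which is exactly why long-range credit assignment fails for the vanilla RNN.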

From vanilla RNN to LSTM/GRU
Vanishing gradient problem

From vanilla RNN to LSTM/GRU
Pascanu, Razvan, Tomas Mikolov, and Yoshua Bengio. "On the difficulty of training recurrent neural networks." ICML (3) 28 (2013): 1310-1318.
● Use a regularization term that represents a preference for parameter values such that back-propagated gradients neither increase nor decrease in magnitude
● From the dynamical-systems perspective we can see that preventing the vanishing gradient problem implies that we are pushing the model towards the boundary of the current basin of attraction

From vanilla RNN to LSTM/GRU
LSTM / GRU

Unitary Evolution Recurrent Neural Networks
● When the eigenvalues of the hidden-to-hidden weight matrix deviate from absolute value 1, optimization becomes difficult
● uRNN learns a unitary weight matrix, with eigenvalues of absolute value exactly 1
● uRNN uses ReLU as the activation function

uRNN
Copying memory problem

uRNN
Adding problem

Delay & Hierarchies to Reach Farther
The temporal dependencies are structured hierarchically

Delay & Hierarchies to Reach Farther
● Delays and multiple time scales: Elhihi & Bengio, NIPS 1995; Koutnik et al., ICML 2014
● Hierarchical RNNs: Sordoni et al., CIKM 2015; Mehri & Soroush, ICLR 2017

Dynamics in RNN
LSTM: A Search Space Odyssey
● The forget gate and the output activation function are the critical components of the LSTM block
● Certain modifications, such as coupling the input and forget gates or removing peephole connections, simplify the LSTM without significantly hurting performance
● Learning rate and network size are the most crucial tunable LSTM hyperparameters. Surprisingly, the use of momentum was found to be unimportant
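The uRNN premise above (a unitary hidden-to-hidden matrix keeps every eigenvalue at absolute value exactly 1, so the linear part of the recurrence neither shrinks nor amplifies the state) can be checked numerically. The sketch below builds a random unitary matrix via QR decomposition; the actual uRNN composes structured unitary factors (diagonal phases, reflections, permutations, Fourier transforms) for efficiency, so this is an illustration, not the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)
H = 8

# A random unitary matrix: Q from the QR decomposition of a complex Gaussian.
A = rng.standard_normal((H, H)) + 1j * rng.standard_normal((H, H))
U, _ = np.linalg.qr(A)

eigvals = np.linalg.eigvals(U)
print(np.round(np.abs(eigvals), 12))     # every eigenvalue has modulus 1

# Applying U repeatedly preserves the hidden-state norm (up to floating
# point), so gradients through the linear part of the recurrence neither
# vanish nor explode, even over very long horizons.
h = rng.standard_normal(H) + 0j
n0 = np.linalg.norm(h)
for _ in range(1000):
    h = U @ h
print(n0, np.linalg.norm(h))
```

Contrast this with the vanilla-RNN case: there the repeated Jacobian product shrinks or grows geometrically, while here 1000 applications of U leave the norm unchanged.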

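One of the LSTM simplifications mentioned above, coupling the input and forget gates, amounts to tying the input gate to 1 - f so that what the cell keeps and what it writes always sum to one. A hedged sketch of a single cell step under that coupling (names, shapes, and the absence of peephole connections are my illustrative choices, not code from the study):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def coupled_lstm_step(x, h, c, params):
    """One LSTM step with the input gate tied to the forget gate (i = 1 - f)."""
    Wf, Wo, Wg, bf, bo, bg = params
    z = np.concatenate([x, h])         # input and previous hidden state, stacked
    f = sigmoid(Wf @ z + bf)           # forget gate
    o = sigmoid(Wo @ z + bo)           # output gate
    g = np.tanh(Wg @ z + bg)           # candidate cell state
    c_new = f * c + (1.0 - f) * g      # coupled gates: keep-vs-write weights sum to 1
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(2)
X, H = 4, 3                            # illustrative input / hidden sizes
params = (
    rng.standard_normal((H, X + H)),   # Wf
    rng.standard_normal((H, X + H)),   # Wo
    rng.standard_normal((H, X + H)),   # Wg
    np.zeros(H), np.zeros(H), np.zeros(H),
)
h, c = np.zeros(H), np.zeros(H)
for _ in range(5):
    h, c = coupled_lstm_step(rng.standard_normal(X), h, c, params)
print(h, c)
```

The coupling removes one gate's worth of parameters; per the findings quoted above, such simplifications did not significantly hurt performance in the study's benchmarks.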
(Only the first five pages of the original document are available in this preview; the deck continues beyond this point.)
