2016 Firat, Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism


Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism

Orhan Firat (Middle East Technical University, orhan.firat@ceng.metu.edu.tr)
Kyunghyun Cho (New York University)
Yoshua Bengio (University of Montreal, CIFAR Senior Fellow)

arXiv:1601.01073v1 [cs.CL] 6 Jan 2016

Abstract

We propose multi-way, multilingual neural machine translation. The proposed approach enables a single neural translation model to translate between multiple languages, with a number of parameters that grows only linearly with the number of languages. This is made possible by having a single attention mechanism that is shared across all language pairs. We train the proposed multi-way, multilingual model on ten language pairs from WMT'15 simultaneously and observe clear performance improvements over models trained on only one language pair. In particular, we observe that the proposed model significantly improves the translation quality of low-resource language pairs.

1 Introduction

Neural Machine Translation. It has been shown that a deep (recurrent) neural network can successfully learn a complex mapping between variable-length input and output sequences on its own. Some …

… into a point in a continuous vector space, resulting in a fixed-dimensional context vector. The other recurrent neural network, called a decoder, then generates a target sequence, again of variable length, starting from the context vector. This approach, however, has been found to be inefficient in (Cho et al., 2014a) when handling long sentences, due to the difficulty in learning a complex mapping between an arbitrarily long sentence and a single fixed-dimensional vector. In (Bahdanau et al., 2014), a remedy to this issue was proposed by incorporating an attention mechanism into the basic encoder-decoder network. The attention mechanism in the encoder-decoder network frees the network from having to map a sequence of arbitrary length to a single, fixed-dimensional vector. Since this attention mechanism was introduced to the encoder-decoder network for machine translation, neural machine translation, which is purely based on neural networks to perform full end-to-end translation, has become competitive with the existing phrase-based statistical machine translation in many language pairs (Jean et al., 2015; Gulcehre et al., 2015; Luong e…
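The attention step described above can be summarized in a few lines. This is a generic sketch, not the authors' code: it scores each encoder annotation with a simple dot product against the decoder state (an assumption for brevity; the paper's attention uses a small learned scoring network), then takes a softmax-weighted average to form the context vector.

```python
import numpy as np

def attention_context(encoder_states, decoder_state):
    """One attention step (sketch): score each source annotation against
    the current decoder state, normalize, and average.

    encoder_states: (T, d) array, one annotation per source position.
    decoder_state:  (d,) array, the decoder's current hidden state.
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state   # (T,) relevance of each position
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states        # (d,) weighted average of annotations
    return context, weights

# Toy example: 4 source positions, 3-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
s = rng.normal(size=3)
context, weights = attention_context(H, s)
```

Because the context vector is recomputed at every decoding step from all source annotations, the network never has to squeeze an arbitrarily long sentence into one fixed vector, which is exactly the bottleneck the attention mechanism removes.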
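The linear parameter growth claimed in the abstract can be made concrete with a back-of-the-envelope component count (a sketch counting model components, not actual parameter totals): a separate model per directed language pair needs a number of encoder-decoder pairs that grows quadratically in the number of languages, whereas the multi-way setup keeps one encoder and one decoder per language plus a single shared attention mechanism.

```python
def pairwise_components(n_languages: int) -> int:
    # One dedicated encoder-decoder pair per directed language pair:
    # grows quadratically, n * (n - 1).
    return n_languages * (n_languages - 1)

def multiway_components(n_languages: int) -> int:
    # One encoder and one decoder per language, plus a single attention
    # mechanism shared across all pairs: grows linearly, 2n + 1.
    return 2 * n_languages + 1

for n in (2, 5, 10):
    print(n, pairwise_components(n), multiway_components(n))
```

With ten languages this is 90 dedicated encoder-decoder pairs versus 21 multi-way components, and adding an eleventh language adds only one encoder and one decoder.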
