www.sciencemag.org/cgi/content/full/313/5786/504/DC1

Supporting Online Material for
"Reducing the Dimensionality of Data with Neural Networks"
G. E. Hinton* and R. R. Salakhutdinov

*To whom correspondence should be addressed. E-mail: hinton@cs.toronto.edu

Published 28 July 2006, Science 313, 504 (2006)
DOI: 10.1126/science.1127647

This PDF file includes:
Materials and Methods
Figs. S1 to S5
Matlab Code

Supporting Online Material

Details of the pretraining: To speed up the pretraining of each RBM, we subdivided all datasets into mini-batches, each containing 100 data vectors, and updated the weights after each mini-batch. For datasets whose size is not divisible by the mini-batch size, the remaining data vectors were included in the last mini-batch. For all datasets, each hidden layer was pretrained for 50 passes through the entire training set. The weights were updated after each mini-batch using the averages in Eq. 1 of the paper with a learning rate of …. In addition, … times the previous update was added to each weight, and … times the value of the weight was subtracted to penalize large weights. Weights were initialized with small random values sampled from a normal distribution with zero mean and standard deviation of ….
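Written out, the per-weight update described in this paragraph takes the following form. The symbol names are ours: epsilon is the learning rate, alpha the momentum coefficient, and lambda the weight-decay coefficient (their numerical values are the ones given, but not recovered, in the text above); the angle brackets are the data and reconstruction averages from Eq. 1 of the paper.

\Delta w_{ij}^{(t)} \;=\; \alpha \, \Delta w_{ij}^{(t-1)}
\;+\; \epsilon \left( \langle v_i h_j \rangle_{\mathrm{data}} - \langle v_i h_j \rangle_{\mathrm{recon}} \right)
\;-\; \lambda \, w_{ij},
\qquad
w_{ij} \leftarrow w_{ij} + \Delta w_{ij}^{(t)}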
The Matlab code we used is available at http://www.cs.toronto.edu/hinton/MatlabForSciencePaper.html
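To make the mini-batch update loop concrete, here is a minimal sketch of one pretraining epoch for a single binary RBM layer in Matlab/Octave. This is our own illustrative code, not the released code at the URL above: the variable names, the number of hidden units, the hyperparameter values, and the use of one-step contrastive divergence are assumptions based on the description in the text, and biases are omitted for brevity.

% Minimal sketch of one pretraining epoch for a single binary RBM layer.
% Synthetic binary data stand in for a real dataset; biases are omitted.
% The hyperparameter values below are placeholders, not the paper's values.
batchdata = double(rand(100, 784, 10) > 0.5);   % 10 mini-batches of 100 data vectors
[numcases, numdims, numbatches] = size(batchdata);
numhid     = 500;     % number of hidden units (assumed)
epsilon    = 0.1;     % learning rate (placeholder)
momentum   = 0.5;     % fraction of the previous update that is re-applied (placeholder)
weightcost = 1e-4;    % weight-decay coefficient (placeholder)
vishid     = 0.1*randn(numdims, numhid);   % small random initial weights
vishidinc  = zeros(numdims, numhid);       % previous update, kept for momentum

for batch = 1:numbatches
  vdata = batchdata(:, :, batch);
  % Positive phase: hidden probabilities and sampled states given the data.
  hprobs   = 1 ./ (1 + exp(-vdata * vishid));
  hstates  = double(hprobs > rand(numcases, numhid));
  posprods = vdata' * hprobs;                   % ~ <v_i h_j>_data, summed over cases
  % Negative phase: one-step reconstruction (contrastive divergence).
  vrecon   = 1 ./ (1 + exp(-hstates * vishid'));
  hrecon   = 1 ./ (1 + exp(-vrecon * vishid));
  negprods = vrecon' * hrecon;                  % ~ <v_i h_j>_recon, summed over cases
  % Update: momentum term + Eq. 1 gradient estimate - weight-decay penalty.
  vishidinc = momentum*vishidinc ...
            + epsilon*(posprods - negprods)/numcases ...
            - weightcost*vishid;
  vishid = vishid + vishidinc;
end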
Details of the fine-tuning: For the fine-tuning, we used the method of conjugate gradients on larger mini-batches containing 1000 data vectors. We used Carl Rasmussen's "minimize" code (1). Three line searches were performed for each mini-batch in each epoch. To determine an adequate number of epochs and to check for overfitting, we fine-tuned each autoencoder on a fraction of the training data and tested its performance on the remainder. We then repeated the fine-tuning on the entire training set. For the synthetic curves and hand-written digits, we used 200 epochs of fine-tuning; for the faces we used 20 epochs, and for the documents we used 50 epochs. Slight overfitting was observed for the faces, but there was no overfitting for the other datasets. By overfitting we mean that, towards the end of training, the reconstructions were still improving on the training set but were getting worse on the validation set. We experimented with various values of the learning rate, momentum, and weight-decay.
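As a structural sketch of the fine-tuning loop just described, the Matlab/Octave fragment below runs three conjugate-gradient line searches per 1000-vector mini-batch in each epoch. Here theta is assumed to be the flattened weight vector of the unrolled autoencoder, CG_RECON an assumed helper returning the reconstruction error and its gradient, and minimize a conjugate-gradient optimizer called in the style of Rasmussen's minimize.m; these names and call signatures are our assumptions, not the authors' released interfaces.

% Structural sketch of fine-tuning with conjugate gradients (assumed helpers).
% traindata (N x numdims matrix) and theta (flattened autoencoder weights)
% are assumed to exist already.
batchsize  = 1000;     % larger mini-batches for conjugate gradients
numepochs  = 200;      % e.g. curves/digits; 20 for faces, 50 for documents
numbatches = floor(size(traindata, 1) / batchsize);
for epoch = 1:numepochs
  for batch = 1:numbatches
    data = traindata((batch-1)*batchsize + 1 : batch*batchsize, :);
    % Three conjugate-gradient line searches on this mini-batch, starting
    % from the current weight vector theta; CG_RECON returns [f, df].
    [theta, fX] = minimize(theta, 'CG_RECON', 3, data);
  end
end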