
Investigating the NIN Structure

In this thesis, the NIN artificial neural network architecture proposed by Min Lin et al. in 2014 is investigated by varying the number of stages and the layer depth within each stage. This yields ten different networks, including the original NIN. The ten networks are tested on a preprocessed version of the CIFAR10 dataset for a maximum of 150,000 iterations. The results show that the number of stages generally affects NIN performance more than layer depth does. The network with three stages and a layer depth of two performs best, with a top accuracy of 87.44%. This is below Min Lin et al.'s results, which is likely due to overfitting and to the lack of detail about their training methods. The thesis concludes that studies of different types of micro-networks in the NIN are required, as are studies of deeper NINs trained on larger datasets to prevent the overfitting observed in the results. Larger datasets could be obtained by data augmentation. Furthermore, the results suggest that less complicated NIN implementations (that is, implementations whose stages have less depth) are more accurate than deeper ones.
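As a rough illustration of the design space described above, the sketch below assembles a NIN-style network from a configurable number of mlpconv stages, each containing a configurable number of 1x1 convolution layers (the micro-network), followed by global average pooling instead of a fully connected classifier. This is a minimal sketch assuming PyTorch; the channel widths, kernel sizes, and pooling choices are illustrative assumptions and not the exact configuration used in the thesis or in Min Lin et al.'s paper.

import torch
import torch.nn as nn

def mlpconv_stage(in_ch, out_ch, kernel_size, depth, pool=True):
    # One NIN stage: a spatial convolution followed by `depth` 1x1
    # convolutions, then optional max pooling. Kernel sizes and channel
    # counts are illustrative assumptions, not the thesis configuration.
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2),
              nn.ReLU(inplace=True)]
    for _ in range(depth):  # vary `depth` to change the layer depth per stage
        layers += [nn.Conv2d(out_ch, out_ch, kernel_size=1),
                   nn.ReLU(inplace=True)]
    if pool:
        layers.append(nn.MaxPool2d(kernel_size=3, stride=2, padding=1))
    return nn.Sequential(*layers)

def build_nin(num_stages=3, depth=2, num_classes=10):
    # Assemble a NIN-style network with `num_stages` mlpconv stages.
    # The last stage outputs `num_classes` channels, which global average
    # pooling reduces to class scores (no fully connected classifier).
    stages, in_ch = [], 3
    widths = [192, 192, 192, 192][:num_stages]  # hypothetical widths, up to four stages
    for i, w in enumerate(widths):
        last = (i == num_stages - 1)
        out_ch = num_classes if last else w
        stages.append(mlpconv_stage(in_ch, out_ch,
                                    kernel_size=5 if i == 0 else 3,
                                    depth=depth, pool=not last))
        in_ch = out_ch
    stages += [nn.AdaptiveAvgPool2d(1), nn.Flatten()]
    return nn.Sequential(*stages)

model = build_nin(num_stages=3, depth=2)   # the best-performing configuration reported above
scores = model(torch.randn(8, 3, 32, 32))  # a CIFAR10-sized input batch
print(scores.shape)                        # torch.Size([8, 10])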

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:kth-186368
Date: January 2016
Creators: Holmér, Viktor, Lundmark, Lukas
Publisher: KTH, Skolan för datavetenskap och kommunikation (CSC)
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess
