
Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation

Neural networks are making headlines every day as the tool of the future, powering artificial intelligence programs and supporting technologies never seen before. However, training a neural network can take days or even weeks for larger networks, and achieving state-of-the-art results in academia and industry typically requires supercomputers and GPUs. This thesis discusses employing selective measures to determine when to backpropagate and when to forward propagate, in order to reduce training time while maintaining classification performance. These new algorithms are tested on the MNIST and CASIA datasets, with successful results for both algorithms on both datasets. The selective backpropagation algorithm reduces the number of backpropagations completed by up to 93.3%, and the selective forward propagation algorithm reduces forward propagations and backpropagations completed by up to 72.90%, compared to baseline runs that always forward propagate and backpropagate. This work also discusses applying the selective backpropagation algorithm to a modified dataset in which some classes are disproportionately under-represented relative to others.

Master of Science

Neural networks are some of the most commonly used and best performing tools in machine learning. However, training them to perform well is a tedious task that can take days or even weeks, since bigger networks perform better but take substantially longer to train. What can be done to reduce training time? Imagine a student studying for a test. The student likely solves practice problems covering the different topics that may appear on the test, then evaluates which topics he/she knew well, and forgoes extensive practice and review on those in favor of focusing on the topics he/she missed or was less confident on. This thesis discusses following a similar approach in training neural networks in order to reduce the training time needed to achieve desired performance levels.
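The core idea from the abstract can be illustrated with a short sketch: forward propagate every example, but trigger the backward pass only for examples whose loss is still high. This is a minimal illustration assuming a PyTorch-style training loop; the function name, the loss_threshold value, and the loss-based selection criterion are hypothetical stand-ins for the thesis's actual selective measures, not the author's implementation.

```python
import torch
import torch.nn.functional as F

def train_epoch_selective(model, loader, optimizer, loss_threshold=0.1):
    """Forward propagate every batch, but backpropagate only on the
    examples the network still misclassifies or is unconfident about.
    Returns the fraction of backpropagations avoided."""
    skipped, total = 0, 0
    for inputs, targets in loader:
        logits = model(inputs)  # always forward propagate
        # Per-example loss, so we can decide example by example.
        per_example = F.cross_entropy(logits, targets, reduction="none")
        hard = per_example > loss_threshold  # examples still worth learning from
        skipped += int((~hard).sum())
        total += targets.numel()
        if hard.any():
            optimizer.zero_grad()
            # Backpropagate only the hard examples' losses.
            per_example[hard].mean().backward()
            optimizer.step()
    return skipped / total
```

In this sketch the savings come from the backward pass alone; the thesis's selective forward propagation variant goes further, also skipping forward propagations for examples the network has already mastered.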

Identifier: oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/83714
Date: 22 June 2018
Creators: Bendelac, Shiri
Contributors: Electrical and Computer Engineering, Ernst, Joseph M., Huang, Jia-Bin, Wyatt, Chris L., Headley, William C.
Publisher: Virginia Tech
Source Sets: Virginia Tech Theses and Dissertation
Detected Language: English
Type: Thesis
Format: ETD, application/pdf
Rights: In Copyright, http://rightsstatements.org/vocab/InC/1.0/