
Improving Capsule Networks using zero-skipping and pruning

Capsule Networks are the next generation of image classifiers. Although they
have several advantages over conventional Convolutional Neural Networks (CNNs),
they remain computationally heavy. Since inference on Capsule Networks is time-consuming, their usage is limited to tasks in which latency is not essential.
Approximation methods in Deep Learning help networks shed redundant parameters, increasing speed and lowering energy consumption.
In the first part of this work, we examine an algorithm called zero-skipping.
More than 50% of the values in trained CNNs are zero or small enough to be treated as zero. Since multiplication by zero is a trivial operation, the zero-skipping
algorithm can deliver a substantial speedup throughout the network. We
investigate the suitability of Capsule Networks for this algorithm on two different
datasets. Our results suggest that Capsule Networks contain enough zeros in their
Primary Capsules to benefit from this algorithm.
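As a rough illustration of the idea (a sketch, not the thesis's implementation), the snippet below accumulates a dot product while skipping the multiply-accumulate whenever an input value is zero or near zero; the function name, the eps threshold, and the example data are hypothetical.

import numpy as np

def zero_skipping_dot(activations, weights, eps=1e-6):
    # Accumulate a dot product, skipping the multiplication whenever the
    # activation is zero (or close enough to zero to be treated as such).
    total = 0.0
    skipped = 0
    for a, w in zip(activations, weights):
        if abs(a) <= eps:
            skipped += 1          # no multiply-accumulate is issued
            continue
        total += a * w
    return total, skipped

# With roughly 50% zeros, about half of the multiplications are avoided.
acts = np.array([0.0, 0.7, 0.0, 0.3, 0.0, 0.9])
wts = np.arange(6, dtype=float)
result, skipped = zero_skipping_dot(acts, wts)
print(result, skipped)   # 3 of the 6 products are skipped

In practice the skip is performed by the accelerator or a sparse kernel rather than a Python loop, but the arithmetic saved is the same.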
In the second part of this thesis, we investigate pruning, one of the most popular
Neural Network approximation methods. Pruning is the act of finding and removing
neurons that have little or no impact on the output. We run experiments on four
different datasets. Pruning Capsule Networks results in the loss of redundant Primary
Capsules. The results show a significant increase in speed with a minimal drop in
accuracy. We also discuss how dataset complexity affects the pruning strategy.
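As a hedged sketch of one common way to realize this idea (magnitude-based pruning, not necessarily the exact criterion used in the thesis), the snippet below ranks primary capsules by the length of their output vectors and discards the weakest; the function name, the keep_ratio parameter, and the capsule shapes are illustrative assumptions.

import numpy as np

def prune_primary_capsules(capsule_outputs, keep_ratio=0.5):
    # capsule_outputs: (num_capsules, capsule_dim) pose vectors averaged over a
    # calibration batch. A capsule's vector length serves as a proxy for how
    # much it contributes to the output; the shortest vectors are treated as
    # redundant and removed.
    lengths = np.linalg.norm(capsule_outputs, axis=1)
    num_keep = max(1, int(keep_ratio * len(lengths)))
    keep_idx = np.sort(np.argsort(lengths)[::-1][:num_keep])  # strongest capsules, original order
    return keep_idx, capsule_outputs[keep_idx]

# Example: keep the most active half of 32 eight-dimensional primary capsules.
caps = np.random.rand(32, 8)
kept_idx, kept_caps = prune_primary_capsules(caps, keep_ratio=0.5)
print(kept_caps.shape)   # (16, 8)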

Identifier: oai:union.ndltd.org:uvic.ca/oai:dspace.library.uvic.ca:1828/13501
Date: 15 November 2021
Creators: Sharifi, Ramin
Contributors: Baniasadi, Amirali; Gulliver, T. Aaron
Degree Level: Graduate
Source Sets: University of Victoria
Language: English
Detected Language: English
Type: Thesis
Format: application/pdf
Rights: Available to the World Wide Web
