
Expanding multilayer perceptrons with a brain-inspired activation algorithm: Experimental comparison of the performance of an activation-enhanced multilayer perceptron

Machine learning is a field inspired by how humans, and by extension the brain, learn. The brain consists of a biological neural network whose neurons are either active or inactive. Modern-day artificial intelligence is loosely based on how biological neural networks function. This paper investigates whether a multilayer perceptron that utilizes inactive/active neurons can reduce the number of active neurons during the forward and backward passes while maintaining accuracy. This is done by implementing a multilayer perceptron in a Python environment and building a neuron activation algorithm on top of it. Results show that it is possible to reduce the number of active neurons by around 30% with a negligible impact on test accuracy. Future work includes algorithmic improvements and further testing of whether the total number of mathematical operations can also be reduced in other neural network architectures with a larger computational overhead.
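As a rough illustration of the idea described in the abstract (not the thesis's actual algorithm), the following Python/NumPy sketch trains a small multilayer perceptron in which each hidden neuron carries an active/inactive flag, and inactive neurons are masked out of both the forward and backward pass. The deactivation criterion used here (switching off neurons with persistently small mean activation), as well as all names and hyperparameters, are illustrative assumptions.

# Minimal sketch (assumption, not the authors' implementation): an MLP whose
# hidden neurons carry an active/inactive flag. Inactive neurons are masked out
# of both the forward and backward pass, which is the general idea the thesis
# evaluates. The deactivation rule below is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D inputs, binary labels.
X = rng.normal(size=(256, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

n_hidden = 32
W1 = rng.normal(scale=0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

active = np.ones(n_hidden, dtype=bool)  # all hidden neurons start active
lr = 0.1

for epoch in range(200):
    # Forward pass: only active neurons contribute.
    h = np.maximum(0.0, X @ W1 + b1) * active      # ReLU, masked
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output

    # Backward pass: gradients of inactive neurons are masked to zero,
    # so their weights are neither used nor updated.
    d_out = out - y                                # BCE gradient w.r.t. logits
    d_h = (d_out @ W2.T) * (h > 0) * active
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

    # Illustrative deactivation rule (assumption): every 50 epochs, switch off
    # neurons whose mean activation is close to zero.
    if epoch % 50 == 49:
        active &= h.mean(axis=0) > 1e-3

acc = ((out > 0.5) == y).mean()
print(f"active neurons: {active.sum()}/{n_hidden}, train accuracy: {acc:.2f}")

Note that a dense NumPy implementation like this only zeroes out the masked neurons; realizing actual savings in mathematical operations would require skipping the masked rows and columns entirely.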

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:his-21532
Date January 2022
Creators Wajud Abdul Aziz, Karar, Gripenberg, Kim Emil Leonard
Publisher Högskolan i Skövde, Institutionen för informationsteknologi
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess