Convolutional neural networks are equivariant to translations; they are not, however, equivariant to other symmetries, so the class output may vary with the input's orientation. To mitigate this, the training data can be augmented, at the cost of increased redundancy in the model. Another solution is to build an equivariant neural network, thereby extending equivariance to a larger symmetry group. In this study, two convolutional neural networks and their respective equivariant counterparts are constructed for the symmetry groups D4 and C8 to explore how removing or adding batch normalisation and data augmentation affects performance. The results suggest that data augmentation is redundant for an equivariant model and that equivariance to a larger symmetry group can slightly improve accuracy. The convolutional neural networks rely heavily on batch normalisation, whereas the equivariant models achieve high accuracy without it, although lower than when batch normalisation is present.
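The abstract does not state how the equivariant counterparts were implemented. As one possible illustration, the sketch below uses the open-source e2cnn library for steerable CNNs, where D4 corresponds to gspaces.FlipRot2dOnR2(N=4) and C8 to gspaces.Rot2dOnR2(N=8). The block structure, channel counts, and the use_batch_norm toggle are illustrative assumptions, not the thesis's actual architecture.

```python
import torch
from e2cnn import gspaces
from e2cnn import nn as enn

# Choice of symmetry group studied in the thesis:
# D4 (90-degree rotations + reflections) or C8 (45-degree rotations).
gspace = gspaces.FlipRot2dOnR2(N=4)   # D4; use gspaces.Rot2dOnR2(N=8) for C8

# Input: one grey-scale channel, transforming trivially under the group.
in_type = enn.FieldType(gspace, [gspace.trivial_repr])
# Hidden features: copies of the regular representation, so feature maps
# permute consistently when the input is rotated or reflected.
hidden_type = enn.FieldType(gspace, 8 * [gspace.regular_repr])

use_batch_norm = True  # assumed toggle, mirroring the with/without comparison

layers = [enn.R2Conv(in_type, hidden_type, kernel_size=5, padding=2)]
if use_batch_norm:
    layers.append(enn.InnerBatchNorm(hidden_type))
layers.append(enn.ReLU(hidden_type))
block = enn.SequentialModule(*layers)

# Wrap a raw tensor as a GeometricTensor so the layer knows how it transforms.
x = torch.randn(1, 1, 29, 29)  # dummy input batch
y = block(enn.GeometricTensor(x, in_type))
print(y.tensor.shape)  # (1, 8 * |D4|, 29, 29) = (1, 64, 29, 29)
```

Because the hidden features live in regular representations, rotating or reflecting the input by a group element produces the correspondingly transformed output, which is what makes augmenting the data with those transformations redundant for such a model.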
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:uu-507213 |
Date | January 2023 |
Creators | Bylander, Karl |
Publisher | Uppsala universitet, Avdelningen Vi3 |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |
Relation | UPTEC X ; 23004 |