More efficient training using equivariant neural networks
Bylander, Karl, January 2023
Convolutional neural networks are equivariant to translations; they are not, however, equivariant to other symmetries, so the class output may vary with the input's orientation. To mitigate this, the training data can be augmented, at the cost of increased redundancy in the model. Another solution is to build an equivariant neural network, thereby extending equivariance to a larger symmetry group. In this study, two convolutional neural networks and their respective equivariant counterparts, equivariant to the symmetry groups D4 and C8, are constructed to explore the impact on performance of removing and adding batch normalisation and data augmentation. The results suggest that data augmentation is irrelevant to an equivariant model and that equivariance to more symmetries can slightly improve accuracy. The convolutional neural networks rely heavily on batch normalisation, whereas the equivariant models achieve high accuracy even without it, although lower than with batch normalisation present.
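The idea behind equivariance to a group such as D4 can be illustrated with a minimal sketch (this is an illustration only, not the architecture used in the thesis, which builds full group-equivariant layers): averaging a filter over its D4 orbit (four rotations, each optionally flipped) yields a filter whose convolution commutes with 90° rotations of the input, so the network's response no longer depends on that part of the input's orientation.

```python
import numpy as np

def d4_orbit(k):
    """All 8 versions of a filter under the dihedral group D4
    (4 rotations by 90 degrees, each with an optional flip)."""
    rots = [np.rot90(k, i) for i in range(4)]
    return rots + [np.fliplr(r) for r in rots]

def d4_symmetrize(k):
    """Average a filter over its D4 orbit; the result is D4-invariant."""
    return np.mean(d4_orbit(k), axis=0)

def conv2d_valid(x, k):
    """Plain 'valid'-mode 2D cross-correlation, written out explicitly."""
    kh, kw = k.shape
    out = np.empty((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))          # toy square input
k = d4_symmetrize(rng.standard_normal((3, 3)))

# Equivariance check: rotating the input rotates the output,
# because the symmetrized filter is unchanged by 90-degree rotation.
lhs = conv2d_valid(np.rot90(x), k)
rhs = np.rot90(conv2d_valid(x, k))
assert np.allclose(lhs, rhs)
```

Symmetrizing the filter in this way is the simplest route to rotation equivariance but discards orientation information; the group-convolution approach studied in the thesis instead carries all orientations through the network, which is what allows data augmentation to become redundant.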