NWU Institutional Repository

Impact of batch normalization on convolutional network representations

dc.contributor.author: Potgieter, Hermanus L.
dc.contributor.author: Mouton, Coenraad
dc.contributor.author: Davel, Marelie H.
dc.date.accessioned: 2025-05-05T08:56:56Z
dc.date.available: 2025-05-05T08:56:56Z
dc.date.issued: 2025-02-14
dc.description.abstract: Batch normalization (BatchNorm) is a popular layer normalization technique used when training deep neural networks. It has been shown to improve the training speed and accuracy of deep learning models. However, the mechanics by which BatchNorm achieves these benefits are an active area of research, and different perspectives have been proposed. In this paper, we investigate the effect of BatchNorm on the resulting hidden representations, that is, the vectors of activation values formed as samples are processed at each hidden layer. Specifically, we consider the sparsity of these representations, as well as their implicit clustering: the creation of groups of representations that are similar to some extent. We contrast image classification models trained with and without batch normalization and highlight consistent differences observed. These findings indicate that BatchNorm's effect on representational sparsity is not a significant factor affecting generalization, while the representations of models trained with BatchNorm tend to show more advantageous clustering characteristics.
dc.identifier.citation: Potgieter, H.L. et al. Impact of batch normalization on convolutional network representations. arXiv:2501.14441v2 [cs.LG], 13 Feb 2025.
dc.identifier.uri: http://hdl.handle.net/10394/42872
dc.language.iso: en
dc.title: Impact of batch normalization on convolutional network representations
dc.type: Article
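
As context for the abstract above, the following is a minimal PyTorch sketch, not the paper's code, of how hidden-representation sparsity might be compared between convolutional blocks with and without BatchNorm. The architecture and the zero-activation-fraction sparsity measure are illustrative assumptions.

```python
# Minimal sketch (assumptions: PyTorch; a tiny untrained conv block; sparsity
# measured as the fraction of exactly-zero activations after ReLU). The paper's
# actual models, training setup, and sparsity definition may differ.
import torch
import torch.nn as nn

def conv_block(use_batchnorm: bool) -> nn.Sequential:
    layers = [nn.Conv2d(3, 16, kernel_size=3, padding=1)]
    if use_batchnorm:
        layers.append(nn.BatchNorm2d(16))  # per-channel normalization over the batch
    layers.append(nn.ReLU())
    return nn.Sequential(*layers)

def activation_sparsity(block: nn.Module, x: torch.Tensor) -> float:
    # Fraction of zero entries in the hidden representation (post-ReLU).
    with torch.no_grad():
        h = block(x)
    return (h == 0).float().mean().item()

x = torch.randn(32, 3, 32, 32)  # a batch of CIFAR-sized inputs
for use_bn in (False, True):
    block = conv_block(use_bn).eval()
    print(f"BatchNorm={use_bn}: zero-activation fraction = {activation_sparsity(block, x):.3f}")
```

In the setting the abstract describes, this statistic would be recorded at each hidden layer of fully trained classification models, rather than at initialization as in this sketch.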

Files

Original bundle

Name: Hermanus, L.P. et al..pdf
Size: 726.43 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.61 KB
Format: Item-specific license agreed to upon submission
