Equivariance in CNNs: how generalising the weight-sharing property increases data-efficiency
Marysia Winkels
Convolutional neural networks have become the methodology of choice for image-related tasks, but why exactly is that?
In this talk, we will explore the theory behind the weight-sharing property of the convolutional layer, which leads to translational equivariance (a shift in the input image leads to a corresponding shift in the output feature map), and how this property can be exploited and generalised to equivariance towards other types of transformations, such as rotation and reflection.
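The translational equivariance of a convolutional layer can be checked directly with a few lines of numpy. The sketch below (not part of the talk's code) implements plain 'valid' cross-correlation by hand and verifies that shifting the input shifts the feature map by the same amount:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Plain 'valid' cross-correlation: the core operation of a conv layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
kernel = rng.standard_normal((3, 3))

# Shift the input one pixel down (circular shift, for a clean comparison) ...
shifted = np.roll(image, 1, axis=0)

# ... and the feature map shifts by the same amount:
# conv(shift(x)) == shift(conv(x)), up to the boundary row that
# 'valid' convolution discards, which we drop on both sides.
a = conv2d_valid(shifted, kernel)[1:, :]
b = np.roll(conv2d_valid(image, kernel), 1, axis=0)[1:, :]
assert np.allclose(a, b)
```

Because weights are shared across all spatial positions, the layer cannot respond differently to the same pattern at two different locations — which is exactly the prior that makes CNNs data-efficient for images.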
The Python library GrouPy implements a new type of convolutional layer that ensures equivariance towards transformations beyond just translation. This builds prior knowledge about our data (e.g. that orientation should not influence the prediction) into the network itself, so that it no longer needs to be learned through data augmentation. In addition, these new networks achieve a significant reduction in sample complexity and a notable increase in performance; they generally converge faster and prove far more effective in cases of class imbalance. The GrouPy convolution layer can simply be used as a drop-in replacement for regular convolutions.
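To give a flavour of what such a layer does, here is a minimal numpy sketch of a "lifting" group convolution for the group p4 (translations plus 90-degree rotations). This is an illustration of the idea, not the GrouPy API: the image is correlated with all four rotations of a single filter, and pooling over the resulting orientation channels yields an output that rotates along with the input.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Plain 'valid' cross-correlation, as inside a standard conv layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def lift_p4(image, kernel):
    """Sketch of a p4 'lifting' convolution (hypothetical helper, not GrouPy):
    correlate the image with all four 90-degree rotations of one filter,
    producing one feature map per orientation."""
    return np.stack([conv2d_valid(image, np.rot90(kernel, r)) for r in range(4)])

rng = np.random.default_rng(1)
image = rng.standard_normal((8, 8))
kernel = rng.standard_normal((3, 3))

# Rotating the input permutes the orientation channels and rotates each map,
# so after pooling over orientations the feature map simply rotates with
# the input: pool(lift(rot(x))) == rot(pool(lift(x))).
pooled = lift_p4(image, kernel).max(axis=0)
pooled_rot = lift_p4(np.rot90(image), kernel).max(axis=0)
assert np.allclose(pooled_rot, np.rot90(pooled))
```

Note the filter is stored only once; the rotated copies are derived from it, which is the same weight-sharing trick that ordinary convolution applies across translations — now extended across rotations.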
As a use case, we study lung nodule detection: finding suspicious lesions in the lung, visible on a 3D chest CT scan, that may be indicative of lung cancer. We show that networks using these new convolutions achieve an increase in performance even when trained on 10x less data.
Marysia Winkels
Affiliation: Aidence
Marysia studied Artificial Intelligence at the University of Amsterdam. She then joined Aidence, an Amsterdam-based start-up that applies deep learning to medical image analysis, as a machine learning engineer, and has worked there ever since to help artificial intelligence improve healthcare.
She is also co-chair of PyData Amsterdam.