Exploiting cyclic symmetry in convolutional neural networks

Abstract

Many image modalities exhibit rotational symmetry. Convolutional neural networks are sometimes trained with data augmentation to exploit this, but the networks are then still required to learn the rotation equivariance properties from the data. Encoding these properties into the network architecture, as we are already used to doing for translation equivariance by using convolutional layers, could result in a more efficient use of the parameter budget by relieving the model of the need to learn them.

We introduce four operations that can be inserted into neural network models as layers and combined to make these models partially equivariant to rotations. They also enable parameter sharing across different orientations. We compare models modified in this way with unmodified baselines and demonstrate improved performance and a regularizing effect on several datasets with different rotation equivariance properties.
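
To make the flavor of such operations concrete, the following is a minimal NumPy sketch of two of them: a slicing operation that replicates a minibatch across the four planar rotations, and a pooling operation that merges the resulting orientation copies back together. This is an illustration under assumed conventions, not the paper's reference implementation; the function names, the (N, C, H, W) tensor layout, and the choice of mean pooling are assumptions made for the example.

```python
import numpy as np

def cyclic_slice(x):
    # Replicate a minibatch across the four 90-degree rotations,
    # stacking the copies along the batch axis:
    # (N, C, H, W) -> (4N, C, H, W).
    rotations = [np.rot90(x, k, axes=(2, 3)) for k in range(4)]
    return np.concatenate(rotations, axis=0)

def cyclic_pool(x):
    # Merge the four orientation copies produced by cyclic_slice back
    # into one output per original example: (4N, ...) -> (N, ...).
    # Mean pooling is an assumption here; other reductions also work.
    n = x.shape[0] // 4
    return x.reshape(4, n, *x.shape[1:]).mean(axis=0)

# Usage: slice at the input, run a convolutional stack on the enlarged
# batch (so its weights are shared across orientations), then pool.
batch = np.random.randn(8, 3, 32, 32).astype(np.float32)
sliced = cyclic_slice(batch)         # shape (32, 3, 32, 32)
features = sliced.mean(axis=(2, 3))  # stand-in for a conv network
pooled = cyclic_pool(features)       # shape (8, 3)
```

Because every orientation copy passes through the same layers, the network's parameters are shared across the four rotations, which is how operations of this kind provide partial rotation equivariance without enlarging the parameter budget.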