Multigrid-in-Channels Architectures for Wide Convolutional Neural Networks.

Jonathan Ephrath, Lars Ruthotto, Eran Treister

Research output: Working paper/Preprint

Abstract

We present a multigrid approach that combats the quadratic growth of the number of parameters with respect to the number of channels in standard convolutional neural networks (CNNs). It has been shown that standard CNNs are redundant, as networks with much sparser convolution operators can yield similar performance to full networks. The sparsity patterns that lead to such behavior, however, are typically random, hampering hardware efficiency. In this work, we present a multigrid-in-channels approach for building CNN architectures that achieves full coupling of the channels, and whose number of parameters is linearly proportional to the width of the network. To this end, we replace each convolution layer in a generic CNN with a multilevel layer consisting of structured (i.e., grouped) convolutions. Our examples from supervised image classification show that applying this strategy to residual networks and MobileNetV2 considerably reduces the number of parameters without negatively affecting accuracy. Therefore, we can widen networks without dramatically increasing the number of parameters or operations.
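The contrast the abstract draws can be made concrete with a small parameter-count sketch (not the authors' code; the group-size choice of 8 is an illustrative assumption). A standard convolution with equal input and output width c has c·c·k² weights, growing quadratically in c, whereas a grouped convolution with a fixed group size has (c/g)·c·k² weights with g proportional to c, which grows only linearly:

```python
def conv_params(channels, kernel=3, groups=1):
    """Weight count of a 2D convolution with `channels` input and output
    channels, a `kernel` x `kernel` filter, and `groups` channel groups.
    Each output channel only sees channels // groups input channels."""
    assert channels % groups == 0
    return (channels // groups) * channels * kernel * kernel

for c in (64, 128, 256):
    full = conv_params(c)                    # dense conv: grows like c**2
    grouped = conv_params(c, groups=c // 8)  # fixed group size 8: grows like c
    print(f"width {c}: full={full}, grouped={grouped}")
```

Doubling the width quadruples the dense count but only doubles the grouped one; the multigrid hierarchy of such grouped layers is what restores full coupling across all channels.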
Original language: English
State: Published - 2020
