Input-dependably feature-map pruning

Atalya Waissman, Aharon Bar-Hillel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation


Deep neural networks are an accurate tool for solving, among other tasks, vision problems. However, the computational cost of these networks is often high, preventing their adoption in many real-time applications, so there is a constant need for computational savings in this research domain. In this paper we suggest trading accuracy for computation using a gated version of Convolutional Neural Networks (CNNs). The gated network selectively activates only a portion of its feature maps, depending on the example to be classified. The network’s ‘gates’ indicate which feature maps are necessary for the task and which are not. Specifically, entire feature maps are considered for omission, enabling computational savings in a manner compliant with GPU hardware constraints. The network is trained using a combination of back-propagation for the standard weights, minimizing an error-related loss, and reinforcement learning for the gates, minimizing a loss related to the number of feature maps used. We trained and evaluated a gated version of DenseNet on the CIFAR-10 dataset [1]. Our results show that, with only a slight impact on network accuracy, a potential acceleration of up to 3× can be obtained.
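The core mechanism described in the abstract — an input-dependent gate that switches whole feature maps on or off — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the use of global average pooling to compute gate logits, and the hard-threshold decision are assumptions made for clarity (the paper trains the gates with reinforcement learning).

```python
import numpy as np

def gate_feature_maps(x, gate_weights, threshold=0.5):
    """Sketch of input-dependent feature-map gating (illustrative, not the
    paper's exact method).

    x: feature maps of shape (C, H, W) for one example.
    gate_weights: (C, C) matrix mapping pooled channel statistics to gate logits.
    Returns the gated maps and the binary gate vector.
    """
    pooled = x.mean(axis=(1, 2))                  # global average pool -> (C,)
    logits = gate_weights @ pooled                # per-map gate logits, input-dependent
    probs = 1.0 / (1.0 + np.exp(-logits))         # sigmoid
    gates = (probs > threshold).astype(x.dtype)   # hard on/off decision per map
    # Zeroing entire maps (rather than individual weights) is what allows the
    # corresponding convolutions to be skipped wholesale on GPU hardware.
    return x * gates[:, None, None], gates
```

Because the gate depends on the pooled input, different examples activate different subsets of feature maps, which is what distinguishes this from static (input-independent) pruning.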

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2018 - 27th International Conference on Artificial Neural Networks, 2018, Proceedings
Editors: Vera Kurkova, Barbara Hammer, Yannis Manolopoulos, Lazaros Iliadis, Ilias Maglogiannis
Publisher: Springer Verlag
Number of pages: 8
ISBN (Print): 9783030014179
State: Published - 1 Jan 2018
Event: 27th International Conference on Artificial Neural Networks, ICANN 2018 - Rhodes, Greece
Duration: 4 Oct 2018 to 7 Oct 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11139 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 27th International Conference on Artificial Neural Networks, ICANN 2018


Keywords

  • Acceleration
  • Conditional computation
  • Feature-map
  • Neural networks
  • Pruning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)


