Evolution of activation functions for deep learning-based image classification.

Raz Lapid, Moshe Sipper

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, yet it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population, co-evolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets (MNIST, FashionMNIST, KMNIST, and USPS), coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
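To make the search setting concrete, the sketch below shows the general idea of evolving an activation function by evolutionary selection over a candidate pool. This is a minimal illustration, not the paper's three-population co-evolutionary algorithm: the candidate set, the toy fitness function (closeness to ReLU on sample points, standing in for validation accuracy), and the random-search loop are all illustrative assumptions.

```python
import math
import random

# Illustrative pool of candidate activation functions (an assumption,
# not the search space used in the paper).
CANDIDATES = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "identity": lambda x: x,
}

def fitness(af, xs=(-2.0, -1.0, 0.0, 1.0, 2.0)):
    """Toy fitness: negative squared error against ReLU on sample points.
    In the real algorithm, fitness would be a trained network's
    validation accuracy with the candidate AF plugged in."""
    return -sum((af(x) - max(0.0, x)) ** 2 for x in xs)

def evolve(generations=30, seed=0):
    """Keep the best candidate seen so far under the toy fitness."""
    rng = random.Random(seed)
    names = list(CANDIDATES)
    best = rng.choice(names)
    for _ in range(generations):
        challenger = rng.choice(names)
        if fitness(CANDIDATES[challenger]) > fitness(CANDIDATES[best]):
            best = challenger
    return best

print(evolve())
```

The real method co-evolves three populations rather than searching a fixed pool, but the fitness-driven selection loop is the common skeleton.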
Original language: English
Title of host publication: Proceedings of the Genetic and Evolutionary Computation Conference Companion (GECCO 2022)
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery, Inc.
Number of pages: 9
ISBN (Electronic): 9781450392686
State: Published - 19 Jul 2022
Event: 2022 Genetic and Evolutionary Computation Conference, GECCO 2022 - Virtual, Online, United States
Duration: 9 Jul 2022 - 13 Jul 2022


Conference: 2022 Genetic and Evolutionary Computation Conference, GECCO 2022
Country/Territory: United States
City: Virtual, Online


  • Computing methodologies
  • Computer graphics
  • Image manipulation
  • Image processing
  • Machine learning
  • Discrete optimization
  • Mathematical optimization
  • Design and analysis of algorithms
  • Theory of computation
  • Optimization with randomized search heuristics

