Feature selection and learning curves of a multilayer perceptron chromosome classifier

B. Lerner, H. Guterman, I. Dinstein, Y. Romem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

A multilayer perceptron (MLP) neural network (NN) was used for human chromosome classification. The significance of relevant chromosome features to the classification procedure was evaluated using a feature selection mechanism. This showed the benefit of using only a subset of the available features: performance close to the ultimate performance was reached when classifying chromosomes of 5 types. Only 10-20 examples were required for the MLP NN classifier to reach its best performance, regardless of the number of features used. Furthermore, the empirical entropic error of the classifier was found to closely follow the 1/t function, a universal learning curve.
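The pipeline the abstract describes (score candidate features, keep the most discriminative subset, then train an MLP classifier) can be sketched as follows. This is a hypothetical illustration, not the authors' code: it uses synthetic data in place of chromosome features, scikit-learn's `SelectKBest`/`MLPClassifier` in place of the original implementation, and 5 classes to mirror the 5 chromosome types.

```python
# Hypothetical sketch of feature selection + MLP classification.
# Synthetic data stands in for the chromosome features used in the paper.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 5 classes (mirroring 5 chromosome types), 20 candidate features,
# of which only 8 carry class information.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def accuracy_with_k_features(k):
    """Keep the k features with the highest ANOVA F-score, then train an MLP."""
    selector = SelectKBest(f_classif, k=k).fit(X_tr, y_tr)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                        random_state=0)
    clf.fit(selector.transform(X_tr), y_tr)
    return clf.score(selector.transform(X_te), y_te)

# Accuracy as a function of how many features are retained: in the spirit of
# the paper's finding, a subset of features often comes close to using them all.
for k in (4, 8, 20):
    print(f"k={k:2d} features -> test accuracy {accuracy_with_k_features(k):.2f}")
```

The same loop over training-set sizes (instead of feature counts) would trace the empirical learning curve that the paper compares against the 1/t form.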

Original language: English
Title of host publication: Proceedings of the 12th IAPR International Conference on Pattern Recognition - Conference B
Subtitle of host publication: Pattern Recognition and Neural Networks, ICPR 1994
Publisher: Institute of Electrical and Electronics Engineers
Pages: 497-499
Number of pages: 3
ISBN (Electronic): 0818662700
State: Published - 1 Jan 1994
Event: 12th IAPR International Conference on Pattern Recognition - Conference B: Pattern Recognition and Neural Networks, ICPR 1994 - Jerusalem, Israel
Duration: 9 Oct 1994 - 13 Oct 1994

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2
ISSN (Print): 1051-4651

Conference

Conference: 12th IAPR International Conference on Pattern Recognition - Conference B: Pattern Recognition and Neural Networks, ICPR 1994
Country/Territory: Israel
City: Jerusalem
Period: 9/10/94 - 13/10/94

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
