Extreme Learning Machines for Multiclass Classification: Refining Predictions with Gaussian Mixture Models

Emil Eirola, Andrey Gritsenko, Anton Akusok, Kaj-Mikael Björk, Yoan Miche, Dušan Sovilj, Rui Nian, Bo He, Amaury Lendasse

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

7 Citations (Scopus)

Abstract

This paper presents an extension of the well-known Extreme Learning Machines (ELMs). The main goal is to provide probabilities as outputs for multiclass classification problems. Such information is more useful in practice than traditional crisp classification outputs. In summary, Gaussian Mixture Models are used as post-processing of ELMs. In that context, the proposed global methodology keeps the advantages of ELMs (low computational time and state-of-the-art performance) and adds the ability of Gaussian Mixture Models to deal with probabilities. The methodology is tested on 3 toy examples and 3 real datasets. As a result, the global performance of ELMs is slightly improved and the probability outputs are seen to be accurate and useful in practice.
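To make the idea concrete, the following is a minimal Python sketch of one way such a pipeline could look: a plain ELM trained by least squares on one-hot targets, followed by class-conditional Gaussian mixtures fitted in the ELM output space and combined through Bayes' rule to produce posterior probabilities. The tanh activation, the use of scikit-learn's GaussianMixture, the single-component mixtures, and all function and class names are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical sketch of an ELM with GMM post-processing; not the authors' code.
class ELM:
    """Basic Extreme Learning Machine: random hidden layer + least-squares output weights."""

    def __init__(self, n_hidden=100, seed=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)            # random hidden-layer activations
        self.beta = np.linalg.pinv(H) @ T            # output weights via pseudo-inverse
        return self

    def decision_function(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta  # raw (crisp) class scores


def fit_class_gmms(scores, y, classes, n_components=1):
    """Fit one GMM per class in the ELM output space (one component assumed here)."""
    gmms, priors = {}, {}
    for c in classes:
        gmms[c] = GaussianMixture(n_components=n_components).fit(scores[y == c])
        priors[c] = float(np.mean(y == c))
    return gmms, priors


def predict_proba(scores, gmms, priors, classes):
    """Bayes' rule: p(c | s) proportional to p(s | c) * p(c), densities from class GMMs."""
    log_post = np.column_stack(
        [gmms[c].score_samples(scores) + np.log(priors[c]) for c in classes]
    )
    log_post -= log_post.max(axis=1, keepdims=True)   # numerical stability
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)


# Usage sketch (X_train, y_train, X_test are assumed to exist):
# elm = ELM(n_hidden=100, seed=0).fit(X_train, y_train)
# gmms, priors = fit_class_gmms(elm.decision_function(X_train), y_train, elm.classes_)
# proba = predict_proba(elm.decision_function(X_test), gmms, priors, elm.classes_)
```

Under these assumptions the ELM itself is unchanged, so its low training cost is preserved; the probabilistic layer is added entirely after training by modelling the distribution of ELM scores for each class.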
Original language: English
Title of host publication: International Work-Conference on Artificial Neural Networks: IWANN 2015: Advances in Computational Intelligence
Number of pages: 12
Place of publication: Cham
Publisher: Springer
Publication date: 06.06.2015
Pages: 153-164
ISBN (Print): 978-3-319-19221-5
ISBN (Electronic): 978-3-319-19222-2
DOIs
Publication status: Published - 06.06.2015
MoE publication type: A4 Article in conference proceedings

Publication series

Name: Lecture Notes in Computer Science (LNCS)
Volume: 9095

Keywords

  • 512 Business and Management
  • Classification
  • Machine learning
  • Neural network
  • Extreme learning machines
  • Gaussian mixture models
  • Multiclass classification
  • Leave-one-out cross-validation
  • PRESS statistics
  • Parental control
  • Internet security

