
Classification Performance of Multilayer Perceptrons with Different Risk Functionals

File: ART_JorgeSantos_LEMA_2014.pdf (795.12 KB, Adobe PDF)

Abstract(s)

In the present paper we assess the performance of information-theoretic inspired risk functionals in multilayer perceptrons with reference to the two most popular ones, Mean Square Error and Cross-Entropy. The recently proposed information-theoretic inspired risks are: HS and HR2, respectively the Shannon and quadratic Rényi entropies of the error; ZED, a risk reflecting the error density at zero errors; and EXP, a generalized exponential risk able to mimic a wide variety of risk functionals, including the information-theoretic ones. The experiments were carried out with multilayer perceptrons on 35 public real-world datasets, all performed according to the same protocol. The statistical tests applied to the experimental results showed that the ubiquitous Mean Square Error was the least interesting risk functional for multilayer perceptrons: it never achieved a significantly better classification performance than the competing risks. Cross-Entropy and EXP were the risks found by several tests to be significantly better than their competitors. Counts of significantly better and worse risks also showed the usefulness of HS and HR2 for some datasets.
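As a rough illustration of the risks compared in the abstract (not the authors' code), the following sketch gives sample estimates of the Mean Square Error, Cross-Entropy, and quadratic Rényi entropy of the error; the Gaussian Parzen-window estimator and the kernel width `sigma` are assumptions, and the HS, ZED, and EXP risks are omitted for brevity:

```python
import numpy as np

def mse(e):
    # Mean Square Error risk over error samples e = y - yhat
    return np.mean(e ** 2)

def cross_entropy(y, p, eps=1e-12):
    # Cross-Entropy risk for binary targets y in {0, 1}
    # and predicted probabilities p (clipped for numerical safety)
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def renyi2_entropy(e, sigma=0.5):
    # Quadratic Renyi entropy of the error (HR2), estimated with a
    # Gaussian Parzen window; the standard closed form is
    # HR2 = -log( (1/N^2) * sum_ij G(e_i - e_j; 2*sigma^2) )
    diffs = e[:, None] - e[None, :]
    var = 2.0 * sigma ** 2
    kernel = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return -np.log(np.mean(kernel))
```

Minimizing the entropy of the error concentrates the error distribution, so a network with all its errors near zero scores a lower HR2 than one with spread-out errors.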

Keywords

Neural Networks; Risk Functionals; Classification; Multilayer Perceptrons

Publisher

World Scientific Publishing Company
