PlumX Metrics

Proving the efficacy of complementary inputs for multilayer neural networks

Proceedings of the International Joint Conference on Neural Networks, Page: 2062-2066
2011
  • Citations: 0
  • Usage: 24
  • Captures: 0
  • Mentions: 0
  • Social Media: 0

Conference Paper Description

This paper proposes and discusses a backpropagation-based training approach for multilayer networks that counteracts the tendency of typical backpropagation-based training algorithms to favor examples with large input feature values. This problem can occur in any real-valued input space and can skew the learned decision surface to a surprising degree, even with relatively simple training sets. The proposed method modifies the original input feature vectors in the training set by appending complementary inputs, which essentially doubles the number of inputs to the network. The paper proves that this modification does not increase network complexity by showing that the network with complementary inputs can be mapped back into the original feature space. © 2011 IEEE.
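The input-augmentation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes features are normalized to a known range and that "complementary" means the range maximum minus the feature value; the paper's exact definition may differ. The function name `append_complementary_inputs` is hypothetical.

```python
import numpy as np

def append_complementary_inputs(X, feature_max=1.0):
    """Append a complement for every feature, doubling the input width.

    Assumes features lie in [0, feature_max] and that the complement of a
    feature x is feature_max - x (an assumption for illustration).
    """
    X = np.asarray(X, dtype=float)
    # Each row [x1, ..., xn] becomes [x1, ..., xn, c1, ..., cn].
    return np.hstack([X, feature_max - X])

# A 2-feature training set becomes a 4-feature one:
X = np.array([[0.2, 0.9],
              [0.7, 0.1]])
X_aug = append_complementary_inputs(X)  # shape (2, 4)
```

With this augmentation, examples with uniformly large feature values no longer dominate: wherever an original feature is large, its appended complement is small, so the total input magnitude presented to the network is balanced across examples.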
