The general approximation theorem

Citation data:

1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227), Vol. 2, pp. 1271-1274

Publication Year:
1998
Usage: 88
Downloads: 85
Abstract Views: 3
Citations: 2
Citation Indexes: 2
Repository URL:
http://scholarsmine.mst.edu/ele_comeng_facwork/1909; https://works.bepress.com/donald-wunsch/338
DOI:
10.1109/ijcnn.1998.685957
Author(s):
Wunsch, Donald C.; Gorban, Alexander N.
Publisher(s):
Institute of Electrical and Electronics Engineers (IEEE)
Tags:
Stone Theorem; Approximation Theory; Function Approximation; General Approximation Theorem; Mathematics Computing; Neural Nets; Neural Networks; Neuron Activation Function; Electrical and Computer Engineering
Description (conference paper):
A general approximation theorem is proved. It uniformly covers both the classical Stone theorem and the approximation of functions of several variables by superpositions and linear combinations of functions of one variable. The theorem is interpreted as a statement about the universal approximating power ("approximating omnipotence") of arbitrary nonlinearity. For neural networks, the result states that the neuron activation function must be nonlinear, and nothing more is required.
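The claim in the abstract can be illustrated numerically (this sketch is not from the paper): a single-hidden-layer network whose only essential property is a nonlinear activation (tanh is a hypothetical choice here; the theorem asserts any nonlinearity suffices) can be fit by plain gradient descent to approximate a function of one variable, here f(x) = sin(x). The hidden-layer width, learning rate, and step count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

H = 20  # hidden units (illustrative size)
W1 = rng.normal(scale=1.0, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 1))
b2 = np.zeros(1)

def forward(X):
    # The nonlinear activation is the essential ingredient per the theorem.
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
mse0 = float(np.mean((pred0 - y) ** 2))  # error before training

lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                       # gradient of squared error w.r.t. pred
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)     # backpropagate through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
print(mse0, mse)  # the fit error drops well below the initial error
```

Replacing tanh with any other nonconstant, nonlinear activation leaves the scheme workable, which is exactly the "approximating omnipotence" the abstract describes; a purely linear activation, by contrast, could never fit sin beyond its best straight-line approximation.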