The general approximation theorem
 Citation data:

1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227), Vol. 2, pp. 1271-1274
 Publication Year:
 1998

 Repository URL:
 http://scholarsmine.mst.edu/ele_comeng_facwork/1909
 DOI:
 10.1109/ijcnn.1998.685957
 Tags:
 Stone Theorem; Approximation Theory; Function Approximation; General Approximation Theorem; Mathematics Computing; Neural Nets; Neural Networks; Neuron Activation Function; Electrical and Computer Engineering
Conference paper description
A general approximation theorem is proved. It uniformly envelops both the classical Stone theorem and the approximation of functions of several variables by superpositions and linear combinations of functions of one variable. This theorem is interpreted as a statement about the universal approximating capability ("approximating omnipotence") of arbitrary nonlinearity. For neural networks, our result states that the neuron activation function must be nonlinear, and nothing else.
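The claim "the activation must be nonlinear, and nothing else" can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's construction) fits a one-hidden-layer network with random inner weights and least-squares outer weights to sin(x): with a nonlinear activation (tanh) the fit is close, while with the identity activation the network collapses to an affine function and cannot approximate the target.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_feature_fit(x, y, activation, width=200):
    """Fit f(x) ~ sum_k c_k * activation(a_k * x + b_k), with random
    inner weights (a_k, b_k) and least-squares outer weights c_k.
    (Illustrative random-feature scheme; weight scales are arbitrary.)"""
    a = rng.normal(scale=3.0, size=width)
    b = rng.uniform(-3.0, 3.0, size=width)
    H = activation(np.outer(x, a) + b)          # hidden-layer outputs
    c, *_ = np.linalg.lstsq(H, y, rcond=None)   # optimal outer weights
    return H @ c

x = np.linspace(0.0, np.pi, 200)
y = np.sin(x)                                   # target function

err_tanh = np.max(np.abs(random_feature_fit(x, y, np.tanh) - y))
err_linear = np.max(np.abs(random_feature_fit(x, y, lambda z: z) - y))

print(err_tanh, err_linear)
```

With the identity activation every hidden unit is affine in x, so the whole network is affine and the error stays large; any genuinely nonlinear activation lets the same architecture drive the error down, which is the point of the theorem.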