JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS | Volume: 494
Negative results for approximation using single layer and multilayer feedforward neural networks
Article
Almira, J. M. [1]; Lopez-de-Teruel, P. E. [1]; Romero-Lopez, D. J. [1]; Voigtlaender, F. [2]
[1] Univ Murcia, Dept Ingn & Tecnol Comp, Murcia 30100, Spain
[2] Catholic Univ Eichstatt Ingolstadt, Dept Sci Comp, D-85072 Eichstatt, Germany
Keywords: Lethargy results; Rate of convergence; Approximation by neural networks; Ridge functions; Rational functions; Splines
DOI: 10.1016/j.jmaa.2020.124584
Source: Elsevier
【 Abstract 】
We prove a negative result for the approximation of functions defined on compact subsets of $\mathbb{R}^d$ (where $d \geq 2$) using feedforward neural networks with one hidden layer and arbitrary continuous activation function. In a nutshell, this result claims the existence of target functions that are as difficult to approximate using these neural networks as one may want. We also demonstrate an analogous result (for general $d \in \mathbb{N}$) for neural networks with an arbitrary number of hidden layers, for activation functions that are either rational functions or continuous splines with finitely many pieces. © 2020 Elsevier Inc. All rights reserved.
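For orientation, the shallow-network result has the general shape of a lethargy-type statement. The sketch below is for illustration only: the symbol $\Sigma_n^{\sigma}$ (functions computed by networks with one hidden layer of $n$ neurons and activation $\sigma$), the use of the uniform norm on $C(\Omega)$, and the exact hypotheses on the compact set $\Omega \subset \mathbb{R}^d$ are assumptions made here; the precise formulation is the one given in the paper.

\[
\text{For every sequence } \varepsilon_1 \geq \varepsilon_2 \geq \cdots > 0 \text{ with } \varepsilon_n \to 0,
\ \exists\, f \in C(\Omega) \ \text{such that}\quad
\inf_{g \in \Sigma_n^{\sigma}} \,\| f - g \|_{\sup,\Omega} \;\geq\; \varepsilon_n
\quad \text{for all } n \in \mathbb{N}.
\]

In words: no matter how slowly the prescribed tolerances $\varepsilon_n$ decay, some continuous target function is approximated by $n$-neuron shallow networks no better than $\varepsilon_n$, for every $n$.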
【 License 】
Free
【 Preview 】
Files | Size | Format | View
---|---|---|---
10_1016_j_jmaa_2020_124584.pdf | 376KB | PDF | download