Convergence rates of support vector machines regression for functional data
Journal of Complexity, ISSN: 0885-064X, Vol: 69, Page: 101604
2022
- 3 Citations
- 5 Captures
Article Description
Support vector machines regression (SVMR) is an important part of statistical learning theory. The main difference between SVMR and classical least squares regression (LSR) is that SVMR uses the ϵ-insensitive loss rather than the quadratic loss to measure the empirical error. In this paper, we consider the SVMR method in the field of functional data analysis under the framework of reproducing kernel Hilbert spaces. The main tool used in our theoretical analysis is concentration inequalities for suprema of appropriate empirical processes. As a result, we establish explicit convergence rates of the prediction risk for SVMR, which coincide with the minimax lower bound recently obtained in the literature for LSR.
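The contrast between the two losses mentioned in the abstract can be sketched as follows (a minimal illustration of the ϵ-insensitive and quadratic losses; the function names and the default ϵ value are illustrative, not taken from the paper):

```python
import numpy as np

def squared_loss(y_true, y_pred):
    # Classical least squares loss: every deviation is penalized quadratically.
    return (y_true - y_pred) ** 2

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # ϵ-insensitive loss used in SVM regression: deviations inside the
    # ϵ-tube around the target incur zero loss; larger deviations are
    # penalized linearly by their excess over ϵ.
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)
```

Under the squared loss, small residuals still contribute to the empirical error, whereas the ϵ-insensitive loss ignores residuals of magnitude at most ϵ, which is one source of the different analysis required to obtain convergence rates for SVMR.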
Bibliographic Details
http://www.sciencedirect.com/science/article/pii/S0885064X21000595
http://dx.doi.org/10.1016/j.jco.2021.101604
http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85114676556&origin=inward
https://linkinghub.elsevier.com/retrieve/pii/S0885064X21000595
https://dx.doi.org/10.1016/j.jco.2021.101604
Elsevier BV