Convergence Rate for $l^{q}$-Coefficient Regularized Regression With Non-i.i.d. Sampling
Many learning algorithms use hypothesis spaces that are trained from samples, but little theoretical work has been devoted to the study of these algorithms. In this paper, we show that the mathematical analysis of kernel-based coefficient least squares regression with an l^q-regularizer, 1 ≤ q ≤ 2, is essentially different from that of algorithms whose hypothesis spaces are independent of the sample or depend only on the sample size. ...
Main Authors: Qin Guo, Peixin Ye, Binlei Cai
Format: Article
Language: English
Published: IEEE, 2018-01-01
Series: IEEE Access
Subjects: Coefficient-based regularized regression; drift error; learning rate; mixing sequence; uniform concentration inequality
Online Access: https://ieeexplore.ieee.org/document/8319967/
DOI: 10.1109/ACCESS.2018.2817215
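For readers unfamiliar with the scheme named in the title, the display below gives the standard form of coefficient-based $l^{q}$-regularized least squares as it appears in the learning-theory literature; the paper's exact normalization of the penalty may differ slightly, so take this as a sketch rather than the authors' precise definition.

```latex
% Coefficient-based l^q-regularized least squares (standard form; the
% paper's exact normalization of the penalty may differ).
% Given samples z = {(x_i, y_i)}_{i=1}^m and a kernel K, the estimator is
\[
  f_{z} = \sum_{i=1}^{m} \alpha_{i}^{z}\, K(\cdot, x_{i}), \qquad
  \alpha^{z} = \operatorname*{arg\,min}_{\alpha \in \mathbb{R}^{m}}
  \left\{ \frac{1}{m} \sum_{i=1}^{m}
    \Big( \sum_{j=1}^{m} \alpha_{j} K(x_{i}, x_{j}) - y_{i} \Big)^{2}
    + \lambda \sum_{i=1}^{m} |\alpha_{i}|^{q} \right\},
  \quad 1 \le q \le 2.
\]
% The hypothesis space spanned by {K(., x_i)} is built from the sample
% itself, which is why the analysis differs from the classical setting
% of a sample-independent hypothesis space.
```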
id
doaj-766fb3cc54e14d209d67e67b4fca8d93
doi
10.1109/ACCESS.2018.2817215
container
IEEE Access, vol. 6, pp. 18804-18813 (2018), article 8319967
author_affiliations
Qin Guo (ORCID: 0000-0002-5355-5857), School of Mathematical Sciences and LPMC, Nankai University, Tianjin, China
Peixin Ye, School of Mathematical Sciences and LPMC, Nankai University, Tianjin, China
Binlei Cai, School of Computer Science and Technology, Tianjin University, Tianjin, China
collection
DOAJ
language
English
format
Article
author
Qin Guo, Peixin Ye, Binlei Cai
title
Convergence Rate for $l^{q}$-Coefficient Regularized Regression With Non-i.i.d. Sampling
publisher
IEEE
series
IEEE Access
issn
2169-3536
publishDate
2018-01-01
description
Many learning algorithms use hypothesis spaces that are trained from samples, but little theoretical work has been devoted to the study of these algorithms. In this paper, we show that the mathematical analysis of kernel-based coefficient least squares regression with an l^q-regularizer, 1 ≤ q ≤ 2, is essentially different from that of algorithms whose hypothesis spaces are independent of the sample or depend only on the sample size. The error analysis is carried out under the assumption that the samples are drawn from a non-identical sequence of probability measures and satisfy a β-mixing condition. We use drift error analysis and the independent-blocks technique to handle the non-identical and the dependent aspects of the setting, respectively. When the sequence of marginal distributions converges exponentially fast in the dual of a Hölder space and the sampling process is polynomially β-mixing, we obtain capacity-dependent error bounds for the algorithm. As a byproduct, we derive a significantly faster learning rate that can be arbitrarily close to the best rate O(m^{-1}) achievable with independent and identically distributed samples.
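As a concrete illustration of the algorithm analyzed in the description, here is a minimal numerical sketch of coefficient-based l^q-regularized least squares with a Gaussian kernel. The kernel choice, the Powell solver, and all parameter values are assumptions made for the demo, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def gaussian_kernel(X1, X2, sigma=0.5):
    """Gram matrix of a Gaussian kernel (an assumed kernel choice)."""
    sq = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-sq / (2.0 * sigma**2))

def lq_coefficient_regression(X, y, q=1.5, lam=1e-3, sigma=0.5):
    """Minimize (1/m) * sum((K @ alpha - y)^2) + lam * sum(|alpha_i|^q), 1 <= q <= 2."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)

    def objective(alpha):
        r = K @ alpha - y
        return r @ r / m + lam * np.sum(np.abs(alpha) ** q)

    # Warm start from the kernel ridge regression solution.
    alpha0 = np.linalg.solve(K + lam * m * np.eye(m), y)
    # Powell is derivative-free, so it tolerates the non-smooth penalty
    # (|.|^q is not differentiable at 0 when q < 2).
    res = minimize(objective, alpha0, method="Powell")
    alpha = res.x
    # Return the fitted function f_z(x) = sum_i alpha_i K(x, x_i).
    return lambda Xnew: gaussian_kernel(Xnew, X, sigma) @ alpha

# Toy usage: noisy samples of a smooth target function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(60)
f = lq_coefficient_regression(X, y, q=1.5)
print(float(np.mean((f(X) - y) ** 2)))  # training error of the fitted estimator
```

For q = 1 the penalty promotes sparse coefficient vectors, while for q = 2 the problem is smooth and can be solved directly; intermediate q interpolates between the two regimes.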
topic
Coefficient-based regularized regression; drift error; learning rate; mixing sequence; uniform concentration inequality
url
https://ieeexplore.ieee.org/document/8319967/
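The description credits the independent-blocks technique for handling dependent (β-mixing) samples. The toy sketch below shows only the combinatorial core of that technique, partitioning a sample sequence into alternating blocks so that every other block can be coupled with an independent copy; the block length and all names are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def alternating_blocks(m, a):
    """Split indices 0..m-1 into consecutive blocks of length a and return
    the even-numbered ("kept") and odd-numbered ("buffer") blocks.
    In the independent-blocks argument the kept blocks are approximated by
    mutually independent copies, at a cost controlled by the beta-mixing
    coefficient beta(a) of the sampling process."""
    idx = np.arange(m - (m % a)).reshape(-1, a)  # drop the ragged tail
    return idx[0::2], idx[1::2]

kept, buffer = alternating_blocks(m=20, a=3)
print(kept)    # blocks whose joint law is compared with an independent product
print(buffer)  # buffer blocks that provide the gap driving the mixing decay
```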