Density Problem and Approximation Error in Learning Theory

We study the density problem and approximation error of reproducing kernel Hilbert spaces for the purpose of learning theory. For a Mercer kernel K on a compact metric space (X, d), a characterization is given for the generated reproducing kernel Hilbert space (RKHS) H_K to be dense in C(X). As a corollary, we show that the density always holds for convolution-type kernels. Some estimates for the rate of convergence of interpolation schemes are presented for general Mercer kernels. These are then used to establish, for convolution-type kernels, a quantitative analysis of the approximation error in learning theory. Finally, we show by the example of Gaussian kernels with varying variances that the approximation error can be improved when the kernel parameter is changed adaptively. This confirms the practice, common in applications of learning theory, of choosing varying parameters.
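
As background for the abstract, the short LaTeX sketch below recalls the standard definitions of a Mercer kernel, a convolution-type kernel, and the Gaussian kernel with a variance parameter. The notation (K, X, d, sigma) is conventional and chosen here for illustration; it is not quoted from the article, which should be consulted for the precise statements.

% Illustrative standard definitions; notation chosen here, not quoted from the article.
% A Mercer kernel on a compact metric space (X, d) is a continuous, symmetric,
% positive semidefinite function K : X \times X \to \mathbb{R}, meaning that for any
% points x_1, ..., x_m in X and real coefficients c_1, ..., c_m,
\[
  \sum_{i=1}^{m}\sum_{j=1}^{m} c_i c_j K(x_i, x_j) \;\ge\; 0 .
\]
% A convolution-type kernel on a subset X of \mathbb{R}^n is one of the form
% K(x, y) = k(x - y); the Gaussian kernel with variance parameter \sigma > 0
% is the prototypical example:
\[
  K_\sigma(x, y) \;=\; \exp\!\left( -\frac{\lVert x - y \rVert^{2}}{\sigma^{2}} \right).
\]
% The density problem asks when the RKHS H_K generated by K is dense in C(X),
% the space of continuous functions on X equipped with the uniform norm.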

Bibliographic Details
Main Author: Ding-Xuan Zhou
Format: Article
Language: English
Published: Hindawi Limited 2013-01-01
Series: Abstract and Applied Analysis
Online Access: http://dx.doi.org/10.1155/2013/715683
ISSN: 1085-3375, 1687-0409