The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations; for example, some numerals are rendered as "XNUMX".
Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Hisashi KASHIMA, Tsuyoshi IDE, Tsuyoshi KATO, Masashi SUGIYAMA, "Recent Advances and Trends in Large-Scale Kernel Methods," IEICE TRANSACTIONS on Information and Systems,
vol. E92-D, no. 7, pp. 1338-1353, July 2009, doi: 10.1587/transinf.E92.D.1338.
Abstract: Kernel methods such as the support vector machine are one of the most successful algorithms in modern machine learning. Their advantage is that linear algorithms are extended to non-linear scenarios in a straightforward way by the use of the kernel trick. However, naive use of kernel methods is computationally expensive since the computational complexity typically scales cubically with respect to the number of training samples. In this article, we review recent advances in the kernel methods, with emphasis on scalability for massive problems.
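To make the abstract's two technical points concrete, here is a minimal sketch (not code from the paper itself; all names and parameters are illustrative) of kernel ridge regression: the kernel trick replaces explicit non-linear feature maps with pairwise kernel evaluations, and naive training builds an n-by-n Gram matrix and solves a linear system, which costs O(n³) with a direct solver.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel. This is the "kernel trick":
    # pairwise kernel values stand in for an explicit non-linear feature map.
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=1.0):
    # Naive training: form the n x n Gram matrix K and solve
    # (K + lam * I) alpha = y. The direct solve scales as O(n^3) in the
    # number of training samples -- the bottleneck the survey addresses.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    # Prediction is a kernel expansion over the training samples.
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: fit y = sin(x) with a non-linear kernel model.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0])
alpha = kernel_ridge_fit(X, y, lam=1e-6, gamma=2.0)
pred = kernel_ridge_predict(X, alpha, X, gamma=2.0)
```

The surveyed large-scale methods replace this exact solve with cheaper approximations (e.g. low-rank or sampling-based) so that training scales to massive sample sizes.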
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E92.D.1338/_p
@ARTICLE{e92-d_7_1338,
author={Hisashi KASHIMA and Tsuyoshi IDE and Tsuyoshi KATO and Masashi SUGIYAMA},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Recent Advances and Trends in Large-Scale Kernel Methods},
year={2009},
volume={E92-D},
number={7},
pages={1338-1353},
abstract={Kernel methods such as the support vector machine are one of the most successful algorithms in modern machine learning. Their advantage is that linear algorithms are extended to non-linear scenarios in a straightforward way by the use of the kernel trick. However, naive use of kernel methods is computationally expensive since the computational complexity typically scales cubically with respect to the number of training samples. In this article, we review recent advances in the kernel methods, with emphasis on scalability for massive problems.},
keywords={},
doi={10.1587/transinf.E92.D.1338},
ISSN={1745-1361},
month={July},
}
TY - JOUR
TI - Recent Advances and Trends in Large-Scale Kernel Methods
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1338
EP - 1353
AU - Hisashi KASHIMA
AU - Tsuyoshi IDE
AU - Tsuyoshi KATO
AU - Masashi SUGIYAMA
PY - 2009
DO - 10.1587/transinf.E92.D.1338
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E92-D
IS - 7
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - July 2009
AB - Kernel methods such as the support vector machine are one of the most successful algorithms in modern machine learning. Their advantage is that linear algorithms are extended to non-linear scenarios in a straightforward way by the use of the kernel trick. However, naive use of kernel methods is computationally expensive since the computational complexity typically scales cubically with respect to the number of training samples. In this article, we review recent advances in the kernel methods, with emphasis on scalability for massive problems.
ER -