The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations.
We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from the viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. We then discuss the novelty of quasi-linear kernels relative to multiple kernel learning. Finally, we explain the contributions of our work [2].
Bo ZHOU
Xi'an Jiaotong University
Benhui CHEN
Dali University
Jinglu HU
Waseda University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Bo ZHOU, Benhui CHEN, Jinglu HU, "Authors' Reply to the Comments by Kamata et al." in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 11, pp. 1446-1449, November 2023, doi: 10.1587/transfun.2023EAL2006.
Abstract: We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from a viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. Then we provide a discussion on the novelty of quasi-linear kernels against multiple kernel learning. Finally, we explain the contributions of our work [2].
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2023EAL2006/_p
@ARTICLE{e106-a_11_1446,
  author={Bo ZHOU and Benhui CHEN and Jinglu HU},
  journal={IEICE TRANSACTIONS on Fundamentals},
  title={Authors' Reply to the Comments by Kamata et al.},
  year={2023},
  volume={E106-A},
  number={11},
  pages={1446-1449},
  doi={10.1587/transfun.2023EAL2006},
  ISSN={1745-1337},
  month={November}
}