We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from a viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. Then we provide a discussion on the novelty of quasi-linear kernels against multiple kernel learning. Finally, we explain the contributions of our work [2].
Bo ZHOU
Xi'an Jiaotong University
Benhui CHEN
Dali University
Jinglu HU
Waseda University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Bo ZHOU, Benhui CHEN, Jinglu HU, "Authors' Reply to the Comments by Kamata et al." in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 11, pp. 1446-1449, November 2023, doi: 10.1587/transfun.2023EAL2006.
Abstract: We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from a viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. Then we provide a discussion on the novelty of quasi-linear kernels against multiple kernel learning. Finally, we explain the contributions of our work [2].
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2023EAL2006/_p
@ARTICLE{e106-a_11_1446,
author={Bo ZHOU and Benhui CHEN and Jinglu HU},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Authors' Reply to the Comments by Kamata et al.},
year={2023},
volume={E106-A},
number={11},
pages={1446-1449},
abstract={We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from a viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. Then we provide a discussion on the novelty of quasi-linear kernels against multiple kernel learning. Finally, we explain the contributions of our work [2].},
keywords={},
doi={10.1587/transfun.2023EAL2006},
ISSN={1745-1337},
month={November},}
TY - JOUR
TI - Authors' Reply to the Comments by Kamata et al.
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1446
EP - 1449
AU - Bo ZHOU
AU - Benhui CHEN
AU - Jinglu HU
PY - 2023
DO - 10.1587/transfun.2023EAL2006
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E106-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - November 2023
AB - We thank Kamata et al. (2023) [1] for their interest in our work [2], and for providing an explanation of the quasi-linear kernel from a viewpoint of multiple kernel learning. In this letter, we first give a summary of the quasi-linear SVM. Then we provide a discussion on the novelty of quasi-linear kernels against multiple kernel learning. Finally, we explain the contributions of our work [2].
ER -