The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations. For example, some numerals may appear as "XNUMX".
Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Lihua GUO, Lianwen JIN, "Laplacian Support Vector Machines with Multi-Kernel Learning" in IEICE TRANSACTIONS on Information,
vol. E94-D, no. 2, pp. 379-383, February 2011, doi: 10.1587/transinf.E94.D.379.
Abstract: The Laplacian support vector machine (LSVM) is a semi-supervised framework that uses manifold regularization for learning from labeled and unlabeled data. However, the optimal kernel parameters of LSVM are difficult to obtain. In this paper, we propose a multi-kernel LSVM (MK-LSVM) method using multi-kernel learning formulations in combination with the LSVM. Our learning formulations assume that a set of base kernels are grouped, and employ l2 norm regularization for automatically seeking the optimal linear combination of base kernels. Experimental testing reveals that our method achieves better performance than the LSVM alone using synthetic data, the UCI Machine Learning Repository, and the Caltech database of Generic Object Classification.
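For readers unfamiliar with the formulation, the following is a minimal sketch of the kind of objective the abstract describes, built from the standard manifold-regularization (LSVM) objective of Belkin et al. with an assumed multi-kernel extension; the paper's exact formulation, symbols, and constraints may differ.

\[
\min_{\mathbf{d}\ge 0,\; f\in\mathcal{H}_{K(\mathbf{d})}}
\frac{1}{l}\sum_{i=1}^{l}\max\bigl(0,\,1-y_i f(x_i)\bigr)
+\gamma_A\,\lVert f\rVert_{K(\mathbf{d})}^{2}
+\frac{\gamma_I}{(l+u)^{2}}\,\mathbf{f}^{\top}L\,\mathbf{f}
+\lambda\,\lVert\mathbf{d}\rVert_{2}^{2},
\qquad
K(\mathbf{d})=\sum_{m=1}^{M} d_m K_m,
\]

where \(l\) and \(u\) are the numbers of labeled and unlabeled points, the hinge loss runs over the labeled points only, \(L\) is the graph Laplacian built from all \(l+u\) points, \(\mathbf{f}=(f(x_1),\dots,f(x_{l+u}))^{\top}\), the \(K_m\) are the base kernels, and \(\mathbf{d}=(d_1,\dots,d_M)\) are their combination weights. The penalty \(\lambda\lVert\mathbf{d}\rVert_2^2\) corresponds to the "l2 norm regularization" mentioned in the abstract; \(\gamma_A\), \(\gamma_I\), and \(\lambda\) are hyperparameters assumed here for illustration only.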
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E94.D.379/_p
@ARTICLE{e94-d_2_379,
author={Lihua GUO and Lianwen JIN},
journal={IEICE TRANSACTIONS on Information},
title={Laplacian Support Vector Machines with Multi-Kernel Learning},
year={2011},
volume={E94-D},
number={2},
pages={379-383},
abstract={The Laplacian support vector machine (LSVM) is a semi-supervised framework that uses manifold regularization for learning from labeled and unlabeled data. However, the optimal kernel parameters of LSVM are difficult to obtain. In this paper, we propose a multi-kernel LSVM (MK-LSVM) method using multi-kernel learning formulations in combination with the LSVM. Our learning formulations assume that a set of base kernels are grouped, and employ l2 norm regularization for automatically seeking the optimal linear combination of base kernels. Experimental testing reveals that our method achieves better performance than the LSVM alone using synthetic data, the UCI Machine Learning Repository, and the Caltech database of Generic Object Classification.},
keywords={},
doi={10.1587/transinf.E94.D.379},
ISSN={1745-1361},
month={February},}
TY - JOUR
TI - Laplacian Support Vector Machines with Multi-Kernel Learning
T2 - IEICE TRANSACTIONS on Information
SP - 379
EP - 383
AU - Lihua GUO
AU - Lianwen JIN
PY - 2011
DO - 10.1587/transinf.E94.D.379
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E94-D
IS - 2
JA - IEICE TRANSACTIONS on Information
Y1 - February 2011
AB - The Laplacian support vector machine (LSVM) is a semi-supervised framework that uses manifold regularization for learning from labeled and unlabeled data. However, the optimal kernel parameters of LSVM are difficult to obtain. In this paper, we propose a multi-kernel LSVM (MK-LSVM) method using multi-kernel learning formulations in combination with the LSVM. Our learning formulations assume that a set of base kernels are grouped, and employ l2 norm regularization for automatically seeking the optimal linear combination of base kernels. Experimental testing reveals that our method achieves better performance than the LSVM alone using synthetic data, the UCI Machine Learning Repository, and the Caltech database of Generic Object Classification.
ER -