Masashi SUGIYAMA, "Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting" in IEICE TRANSACTIONS on Information,
vol. E93-D, no. 10, pp. 2690-2701, October 2010, doi: 10.1587/transinf.E93.D.2690.
Abstract: Kernel logistic regression (KLR) is a powerful and flexible classification algorithm, which possesses an ability to provide the confidence of class prediction. However, its training--typically carried out by (quasi-)Newton methods--is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by the log-linear combination of kernel functions and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs the linear combination of kernel functions and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically just by solving a regularized system of linear equations in a class-wise manner. Thus LSPC is computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is faster than that of KLR by two orders of magnitude, with comparable classification accuracy.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E93.D.2690/_p
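Below is a minimal sketch in Python/NumPy of the class-wise least-squares posterior fitting summarized in the abstract: the class-posterior probability is modeled as a linear combination of kernel functions, and each class's coefficient vector is obtained analytically by solving one regularized linear system. The Gaussian kernel, its width sigma, the regularization strength lam, and the helper names (gaussian_kernel, lspc_fit, lspc_predict_proba) are illustrative assumptions for this sketch, not the paper's exact formulation or tuned settings.

# Sketch of least-squares posterior fitting: model p(y=c|x) as a linear
# combination of kernels and solve a regularized linear system per class.
# Kernel choice, sigma, and lam are assumptions, not the paper's settings.
import numpy as np

def gaussian_kernel(X, C, sigma):
    # Pairwise Gaussian kernel values between rows of X and centers C.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lspc_fit(X, y, sigma=1.0, lam=0.1):
    # Learn one coefficient vector per class; no iterative optimization.
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = len(X)
    classes = np.unique(y)
    Phi = gaussian_kernel(X, X, sigma)          # n x n design matrix
    H = Phi.T @ Phi / n                         # cross-moment matrix shared by all classes
    A = H + lam * np.eye(n)                     # regularized system matrix
    Theta = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        h_c = Phi[y == c].sum(axis=0) / n       # class-wise right-hand side
        Theta[:, j] = np.linalg.solve(A, h_c)   # analytic class-wise solution
    return classes, X, sigma, Theta

def lspc_predict_proba(model, X_test):
    # Evaluate the linear-in-kernel model, clip negatives, and normalize
    # so each row can be read as class-posterior probabilities.
    classes, centers, sigma, Theta = model
    q = gaussian_kernel(np.asarray(X_test, dtype=float), centers, sigma) @ Theta
    q = np.maximum(q, 0.0)
    q /= np.maximum(q.sum(axis=1, keepdims=True), 1e-12)
    return classes, q

Because the system matrix is shared across classes and only the right-hand side changes per class, a single factorization can be reused for every class, which is where the computational advantage over iteratively trained kernel logistic regression comes from.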
@ARTICLE{e93-d_10_2690,
author={Masashi SUGIYAMA},
journal={IEICE TRANSACTIONS on Information},
title={Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting},
year={2010},
volume={E93-D},
number={10},
pages={2690-2701},
abstract={Kernel logistic regression (KLR) is a powerful and flexible classification algorithm, which possesses an ability to provide the confidence of class prediction. However, its training--typically carried out by (quasi-)Newton methods--is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by the log-linear combination of kernel functions and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs the linear combination of kernel functions and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically just by solving a regularized system of linear equations in a class-wise manner. Thus LSPC is computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is faster than that of KLR by two orders of magnitude, with comparable classification accuracy.},
keywords={},
doi={10.1587/transinf.E93.D.2690},
ISSN={1745-1361},
month={October},}
TY - JOUR
TI - Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
T2 - IEICE TRANSACTIONS on Information
SP - 2690
EP - 2701
AU - Masashi SUGIYAMA
PY - 2010
DO - 10.1587/transinf.E93.D.2690
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E93-D
IS - 10
JA - IEICE TRANSACTIONS on Information
Y1 - October 2010
AB - Kernel logistic regression (KLR) is a powerful and flexible classification algorithm, which possesses an ability to provide the confidence of class prediction. However, its training--typically carried out by (quasi-)Newton methods--is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by the log-linear combination of kernel functions and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs the linear combination of kernel functions and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically just by solving a regularized system of linear equations in a class-wise manner. Thus LSPC is computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is faster than that of KLR by two orders of magnitude, with comparable classification accuracy.
ER -