Goki YASUDA (Waseda University)
Tota SUKO (Waseda University)
Manabu KOBAYASHI (Waseda University)
Toshiyasu MATSUSHIMA (Waseda University)
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Goki YASUDA, Tota SUKO, Manabu KOBAYASHI, Toshiyasu MATSUSHIMA, "Asymptotic Evaluation of Classification in the Presence of Label Noise" in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 3, pp. 422-430, March 2023, doi: 10.1587/transfun.2022TAP0013.
Abstract: In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.
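The abstract defines the paper's risk function as the Kullback-Leibler divergence between the predictive distribution and the true distribution. As a rough illustration only (not the paper's actual asymptotic evaluation), the discrete KL divergence can be computed as follows; the example distributions are invented for demonstration.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    p: true distribution, q: predictive distribution (both sum to 1).
    Terms where p is zero contribute 0 by convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical 3-class example: true label distribution vs. a
# predictive distribution learned from noisy labels.
true_dist = [0.7, 0.2, 0.1]
pred_dist = [0.6, 0.3, 0.1]
risk = kl_divergence(true_dist, pred_dist)  # > 0; equals 0 iff p == q
```

The divergence is zero exactly when the predictive distribution matches the true one, which is why it serves naturally as a risk measure for a classifier's predictive distribution.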
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2022TAP0013/_p
@ARTICLE{e106-a_3_422,
author={Goki YASUDA and Tota SUKO and Manabu KOBAYASHI and Toshiyasu MATSUSHIMA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Asymptotic Evaluation of Classification in the Presence of Label Noise},
year={2023},
volume={E106-A},
number={3},
pages={422-430},
abstract={In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.},
doi={10.1587/transfun.2022TAP0013},
ISSN={1745-1337},
month=mar,
}
TY - JOUR
TI - Asymptotic Evaluation of Classification in the Presence of Label Noise
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 422
EP - 430
AU - YASUDA, Goki
AU - SUKO, Tota
AU - KOBAYASHI, Manabu
AU - MATSUSHIMA, Toshiyasu
PY - 2023
DO - 10.1587/transfun.2022TAP0013
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E106-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 2023/03//
AB - In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.
ER -