Hiroki ISHIGURO
University of Tokyo
Takashi ISHIDA
University of Tokyo, RIKEN
Masashi SUGIYAMA
University of Tokyo, RIKEN
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Hiroki ISHIGURO, Takashi ISHIDA, Masashi SUGIYAMA, "Learning from Noisy Complementary Labels with Robust Loss Functions" in IEICE TRANSACTIONS on Information and Systems, vol. E105-D, no. 2, pp. 364-376, February 2022, doi: 10.1587/transinf.2021EDP7035.
Abstract: It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDP7035/_p
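As background for the abstract above: a complementary label names a class that an instance does not belong to, and a noisy complementary label may occasionally name the true class by mistake. The short Python sketch below is illustrative only; the helper names and the simple -log(1 - p) style loss are assumptions made here for exposition, not the robust loss functions whose conditions are derived in the paper.

import numpy as np

rng = np.random.default_rng(0)
num_classes = 10

def draw_complementary_label(true_label, noise_rate=0.0):
    # A complementary label names a class the instance does NOT belong to.
    # With probability noise_rate the annotation is corrupted and may name
    # any class, possibly the true one (a "noisy" complementary label).
    if rng.random() < noise_rate:
        return int(rng.integers(num_classes))
    other_classes = [c for c in range(num_classes) if c != true_label]
    return int(rng.choice(other_classes))

def complementary_loss(probs, comp_label):
    # One simple loss built from a complementary label: push down the
    # predicted probability of the complementary class, -log(1 - p_comp).
    # Illustrative choice only; not the robust losses studied in the paper.
    return -np.log(1.0 - probs[comp_label] + 1e-12)

# Toy usage with a random softmax output over 10 classes.
logits = rng.normal(size=num_classes)
probs = np.exp(logits) / np.exp(logits).sum()
comp = draw_complementary_label(true_label=3, noise_rate=0.2)
print(comp, complementary_loss(probs, comp))

In this sketch, raising noise_rate lets some complementary labels point at the true class, which is the kind of corruption the paper's conditions on the loss function are meant to tolerate.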
@ARTICLE{e105-d_2_364,
author={Hiroki ISHIGURO and Takashi ISHIDA and Masashi SUGIYAMA},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Learning from Noisy Complementary Labels with Robust Loss Functions},
year={2022},
volume={E105-D},
number={2},
pages={364-376},
abstract={It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.},
keywords={},
doi={10.1587/transinf.2021EDP7035},
ISSN={1745-1361},
month={February},
}
TY - JOUR
TI - Learning from Noisy Complementary Labels with Robust Loss Functions
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 364
EP - 376
AU - Hiroki ISHIGURO
AU - Takashi ISHIDA
AU - Masashi SUGIYAMA
PY - 2022
DO - 10.1587/transinf.2021EDP7035
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E105-D
IS - 2
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - February 2022
AB - It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.
ER -