The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Masashi SUGIYAMA, Hidemitsu OGAWA, "Incremental Construction of Projection Generalizing Neural Networks" in IEICE TRANSACTIONS on Information, vol. E85-D, no. 9, pp. 1433-1442, September 2002.
Abstract: In many practical situations in NN learning, training examples tend to be supplied one by one. In such situations, incremental learning seems more natural than batch learning in view of the learning methods of human beings. In this paper, we propose an incremental learning method in neural networks under the projection learning criterion. Although projection learning is a linear learning method, achieving the above goal is not straightforward since it involves redundant expressions of functions with over-complete bases, which is essentially related to pseudo biorthogonal bases (or frames). The proposed method provides exactly the same learning result as that obtained by batch learning. It is theoretically shown that the proposed method is more efficient in computation than batch learning.
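The abstract's central claim, that incremental learning can reproduce the batch result exactly, can be illustrated with ordinary least squares as a stand-in (a minimal sketch only; the paper's projection learning criterion and its treatment of over-complete bases are different). A recursive least-squares update processes examples one by one via the Sherman-Morrison identity and matches the batch pseudoinverse solution to numerical precision:

```python
import numpy as np

def batch_fit(Phi, y):
    # Batch learning: solve min ||Phi w - y||^2 via the pseudoinverse.
    return np.linalg.pinv(Phi) @ y

def incremental_fit(Phi, y):
    # Incremental learning: one example at a time, recursive least squares.
    # P approximates the inverse Gram matrix; the large initial value acts
    # as a nearly uninformative prior (tiny ridge term).
    d = Phi.shape[1]
    w = np.zeros(d)
    P = np.eye(d) * 1e6
    for phi, t in zip(Phi, y):
        k = P @ phi / (1.0 + phi @ P @ phi)   # gain vector
        w = w + k * (t - phi @ w)             # correct the prediction error
        P = P - np.outer(k, phi @ P)          # Sherman-Morrison rank-1 update
    return w

rng = np.random.default_rng(0)
Phi = rng.standard_normal((50, 5))            # 50 examples, 5 basis functions
y = Phi @ rng.standard_normal(5) + 0.01 * rng.standard_normal(50)

w_batch = batch_fit(Phi, y)
w_inc = incremental_fit(Phi, y)
print(np.allclose(w_batch, w_inc, atol=1e-5))
```

The incremental pass never revisits earlier examples, yet its final weights agree with the batch solution; the efficiency argument in the paper is of the same flavor, but for projection learning rather than plain least squares.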
URL: https://global.ieice.org/en_transactions/information/10.1587/e85-d_9_1433/_p
@ARTICLE{e85-d_9_1433,
  author={Masashi SUGIYAMA and Hidemitsu OGAWA},
  journal={IEICE TRANSACTIONS on Information},
  title={Incremental Construction of Projection Generalizing Neural Networks},
  year={2002},
  volume={E85-D},
  number={9},
  pages={1433--1442},
  month={September},
  abstract={In many practical situations in NN learning, training examples tend to be supplied one by one. In such situations, incremental learning seems more natural than batch learning in view of the learning methods of human beings. In this paper, we propose an incremental learning method in neural networks under the projection learning criterion. Although projection learning is a linear learning method, achieving the above goal is not straightforward since it involves redundant expressions of functions with over-complete bases, which is essentially related to pseudo biorthogonal bases (or frames). The proposed method provides exactly the same learning result as that obtained by batch learning. It is theoretically shown that the proposed method is more efficient in computation than batch learning.},
}
TY - JOUR
TI - Incremental Construction of Projection Generalizing Neural Networks
T2 - IEICE TRANSACTIONS on Information
SP - 1433
EP - 1442
AU - Masashi SUGIYAMA
AU - Hidemitsu OGAWA
PY - 2002
VL - E85-D
IS - 9
JA - IEICE TRANSACTIONS on Information
Y1 - September 2002
AB - In many practical situations in NN learning, training examples tend to be supplied one by one. In such situations, incremental learning seems more natural than batch learning in view of the learning methods of human beings. In this paper, we propose an incremental learning method in neural networks under the projection learning criterion. Although projection learning is a linear learning method, achieving the above goal is not straightforward since it involves redundant expressions of functions with over-complete bases, which is essentially related to pseudo biorthogonal bases (or frames). The proposed method provides exactly the same learning result as that obtained by batch learning. It is theoretically shown that the proposed method is more efficient in computation than batch learning.
ER -