The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations; for example, some numerals are rendered as "XNUMX".
Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Kumiko MAEBASHI, Nobuo SUEMATSU, Akira HAYASHI, "Component Reduction for Gaussian Mixture Models" in IEICE TRANSACTIONS on Information,
vol. E91-D, no. 12, pp. 2846-2853, December 2008, doi: 10.1093/ietisy/e91-d.12.2846.
Abstract: The mixture modeling framework is widely used in many applications. In this paper, we propose a component reduction technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning using the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.
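The paper's EM-based reduction algorithm is not reproduced on this page. As a generic illustration of what "collapsing" components means, the sketch below merges two weighted one-dimensional Gaussian components by moment matching (a standard alternative technique, not the authors' method); the function name and the 1-D restriction are assumptions made for this example only.

```python
# Generic illustration of Gaussian mixture component reduction by
# moment matching -- NOT the paper's EM-based algorithm.  Two weighted
# 1-D Gaussian components are collapsed into a single component that
# preserves the pair's total weight, mean, and variance.

def merge_components(w1, mu1, var1, w2, mu2, var2):
    """Collapse two weighted 1-D Gaussians into one Gaussian (w, mu, var)."""
    w = w1 + w2
    mu = (w1 * mu1 + w2 * mu2) / w
    # Merged variance = weighted second moment minus squared merged mean.
    var = (w1 * (var1 + mu1 ** 2) + w2 * (var2 + mu2 ** 2)) / w - mu ** 2
    return w, mu, var

w, mu, var = merge_components(0.5, 0.0, 1.0, 0.5, 2.0, 1.0)
print(w, mu, var)  # 1.0 1.0 2.0
```

Repeatedly merging the closest pair of components in this fashion is one simple way to shrink a mixture, whereas the paper instead derives the reduced mixture jointly via an extended EM procedure with an analytical approximation.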
URL: https://global.ieice.org/en_transactions/information/10.1093/ietisy/e91-d.12.2846/_p
@ARTICLE{e91-d_12_2846,
author={Kumiko MAEBASHI and Nobuo SUEMATSU and Akira HAYASHI},
journal={IEICE TRANSACTIONS on Information},
title={Component Reduction for Gaussian Mixture Models},
year={2008},
volume={E91-D},
number={12},
pages={2846-2853},
abstract={The mixture modeling framework is widely used in many applications. In this paper, we propose a component reduction technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning using the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.},
keywords={},
doi={10.1093/ietisy/e91-d.12.2846},
ISSN={1745-1361},
month={December},}
TY - JOUR
TI - Component Reduction for Gaussian Mixture Models
T2 - IEICE TRANSACTIONS on Information
SP - 2846
EP - 2853
AU - Kumiko MAEBASHI
AU - Nobuo SUEMATSU
AU - Akira HAYASHI
PY - 2008
DO - 10.1093/ietisy/e91-d.12.2846
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E91-D
IS - 12
JA - IEICE TRANSACTIONS on Information
Y1 - December 2008
AB - The mixture modeling framework is widely used in many applications. In this paper, we propose a component reduction technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning using the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.
ER -