Keisuke IMOTO
Ritsumeikan University
Seisuke KYOCHI
University of Kitakyushu
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Keisuke IMOTO, Seisuke KYOCHI, "Sound Event Detection Utilizing Graph Laplacian Regularization with Event Co-Occurrence" in IEICE TRANSACTIONS on Information and Systems,
vol. E103-D, no. 9, pp. 1971-1977, September 2020, doi: 10.1587/transinf.2019EDP7323.
Abstract: A limited number of types of sound events occur in an acoustic scene and some sound events tend to co-occur in the scene; for example, the sound events “dishes” and “glass jingling” are likely to co-occur in the acoustic scene “cooking.” In this paper, we propose a method of sound event detection using graph Laplacian regularization with sound event co-occurrence taken into account. In the proposed method, the occurrences of sound events are expressed as a graph whose nodes indicate the frequencies of event occurrence and whose edges indicate the sound event co-occurrences. This graph representation is then utilized for the model training of sound event detection, which is optimized under an objective function with a regularization term considering the graph structure of sound event occurrence and co-occurrence. Evaluation experiments using the TUT Sound Events 2016 and 2017 datasets and the TUT Acoustic Scenes 2016 dataset show that the proposed method improves the performance of sound event detection by 7.9 percentage points compared with the conventional CNN-BiGRU-based detection method in terms of the segment-based F1 score. In particular, the experimental results indicate that the proposed method enables the detection of co-occurring sound events more accurately than the conventional method.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2019EDP7323/_p
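The abstract above describes the approach at a high level: a co-occurrence graph over event classes, and a training objective with a graph Laplacian regularization term. As a rough illustration only, the sketch below shows one way such a regularizer could be combined with a frame-wise multi-label detection loss in PyTorch. The function names, the unnormalized-Laplacian form, and the exact placement of the regularizer are assumptions made for this sketch, not the authors' implementation.

import torch

# Hypothetical sketch: build a co-occurrence graph from multi-hot event
# labels of shape (N, K), where K is the number of event classes. Nodes
# correspond to event classes; edge weights count how often two events
# are active in the same segment (self-loops removed).
def co_occurrence_matrix(labels: torch.Tensor) -> torch.Tensor:
    adjacency = labels.T @ labels      # (K, K) co-occurrence counts
    adjacency.fill_diagonal_(0)        # keep only cross-event edges
    return adjacency

def graph_laplacian(adjacency: torch.Tensor) -> torch.Tensor:
    # Unnormalized graph Laplacian L = D - A.
    degree = torch.diag(adjacency.sum(dim=1))
    return degree - adjacency

def regularized_loss(logits, targets, laplacian, lam=0.1):
    # Frame-wise binary cross-entropy for multi-label detection, plus a
    # Laplacian penalty sum_t p_t' L p_t, which is small when strongly
    # co-occurring events receive similar activations p at each frame.
    bce = torch.nn.functional.binary_cross_entropy_with_logits(logits, targets)
    probs = torch.sigmoid(logits).reshape(-1, logits.shape[-1])   # (B*T, K)
    reg = torch.einsum('nk,kl,nl->', probs, laplacian, probs) / probs.shape[0]
    return bce + lam * reg

In such a setup, the co-occurrence matrix would be computed once from the training annotations (the paper's graph additionally encodes event occurrence frequencies on the nodes, which this sketch omits), and the weight lam would be tuned on development data.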
@ARTICLE{e103-d_9_1971,
author={Keisuke IMOTO and Seisuke KYOCHI},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Sound Event Detection Utilizing Graph Laplacian Regularization with Event Co-Occurrence},
year={2020},
volume={E103-D},
number={9},
pages={1971-1977},
abstract={A limited number of types of sound events occur in an acoustic scene and some sound events tend to co-occur in the scene; for example, the sound events “dishes” and “glass jingling” are likely to co-occur in the acoustic scene “cooking.” In this paper, we propose a method of sound event detection using graph Laplacian regularization with sound event co-occurrence taken into account. In the proposed method, the occurrences of sound events are expressed as a graph whose nodes indicate the frequencies of event occurrence and whose edges indicate the sound event co-occurrences. This graph representation is then utilized for the model training of sound event detection, which is optimized under an objective function with a regularization term considering the graph structure of sound event occurrence and co-occurrence. Evaluation experiments using the TUT Sound Events 2016 and 2017 datasets and the TUT Acoustic Scenes 2016 dataset show that the proposed method improves the performance of sound event detection by 7.9 percentage points compared with the conventional CNN-BiGRU-based detection method in terms of the segment-based F1 score. In particular, the experimental results indicate that the proposed method enables the detection of co-occurring sound events more accurately than the conventional method.},
keywords={},
doi={10.1587/transinf.2019EDP7323},
ISSN={1745-1361},
month={September},}
TY - JOUR
TI - Sound Event Detection Utilizing Graph Laplacian Regularization with Event Co-Occurrence
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1971
EP - 1977
AU - Keisuke IMOTO
AU - Seisuke KYOCHI
PY - 2020
DO - 10.1587/transinf.2019EDP7323
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E103-D
IS - 9
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - 2020/09//
AB - A limited number of types of sound events occur in an acoustic scene and some sound events tend to co-occur in the scene; for example, the sound events “dishes” and “glass jingling” are likely to co-occur in the acoustic scene “cooking.” In this paper, we propose a method of sound event detection using graph Laplacian regularization with sound event co-occurrence taken into account. In the proposed method, the occurrences of sound events are expressed as a graph whose nodes indicate the frequencies of event occurrence and whose edges indicate the sound event co-occurrences. This graph representation is then utilized for the model training of sound event detection, which is optimized under an objective function with a regularization term considering the graph structure of sound event occurrence and co-occurrence. Evaluation experiments using the TUT Sound Events 2016 and 2017 datasets and the TUT Acoustic Scenes 2016 dataset show that the proposed method improves the performance of sound event detection by 7.9 percentage points compared with the conventional CNN-BiGRU-based detection method in terms of the segment-based F1 score. In particular, the experimental results indicate that the proposed method enables the detection of co-occurring sound events more accurately than the conventional method.
ER -