A sound event detection based on hybrid convolution neural network and random forest


Full description

Bibliographic Details
Published in: IAES International Journal of Artificial Intelligence
Main Authors: Afendi M.A.S.M.; Yusoff M.
Format: Document
Language: English
Published: Institute of Advanced Engineering and Science 2022
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85125871813&doi=10.11591%2fijai.v11.i1.pp121-128&partnerID=40&md5=44b0989f691b5f5ada55fa7dae5f3b85
Physical Description
Summary: Sound event detection (SED) assists in detecting and detaining intruders. In recent decades, several SED methods, such as support vector machines (SVM), K-means clustering, principal component analysis, and convolutional neural networks (CNN), have been developed for urban sound. Advanced work on SED for rare sound events is challenging because such events have seen limited exploration, especially for surveillance in a forest environment. This research provides an alternative method that uses informative features of sound event data from a natural forest environment and evaluates the detection performance of CNNs. A hybrid of a CNN and a random forest (RF) is proposed to exploit distinctive sound patterns. Feature extraction is based on log mel energies. The detection process includes parameter refinement and post-processing threshold determination to reduce the false alarm rate. The proposed CNN-RF and custom CNN-RF models have been validated on three types of sound events, and the results of the suggested approach have been compared with well-regarded sound event algorithms. The experimental results demonstrate the superiority of the CNN-RF, with a remarkable improvement in performance: an F1 score of up to 0.82 with a minimum false alarm rate of 10%. The performance shows a functional advantage over previous methods. © 2022, Institute of Advanced Engineering and Science. All rights reserved.
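The log mel energy features mentioned in the summary can be illustrated with a minimal single-frame sketch in plain NumPy. This is not the authors' implementation; the sample rate, FFT size, and number of mel bands are assumed values chosen for illustration only.

```python
import numpy as np

def hz_to_mel(f):
    # Standard HTK-style Hz-to-mel conversion
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    # Inverse of hz_to_mel
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def log_mel_energies(frame, sr=16000, n_mels=40, eps=1e-10):
    """Log mel energies of one audio frame (assumed parameters)."""
    n_fft = frame.size
    # Power spectrum of the Hann-windowed frame
    power = np.abs(np.fft.rfft(frame * np.hanning(n_fft))) ** 2
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / sr)
    # Triangular mel filterbank: n_mels filters spanning 0..sr/2
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    hz_pts = mel_to_hz(mel_pts)
    fbank = np.zeros((n_mels, freqs.size))
    for i in range(n_mels):
        lo, center, hi = hz_pts[i], hz_pts[i + 1], hz_pts[i + 2]
        rising = (freqs - lo) / (center - lo + eps)
        falling = (hi - freqs) / (hi - center + eps)
        fbank[i] = np.clip(np.minimum(rising, falling), 0.0, None)
    # Per-band energy, then log compression
    return np.log(fbank @ power + eps)

# Example: features for one 32 ms frame of a synthetic 1 kHz tone
t = np.arange(512) / 16000.0
features = log_mel_energies(np.sin(2 * np.pi * 1000.0 * t))
```

In a detection pipeline of the kind the summary describes, a sequence of such per-frame vectors would form the input to the CNN feature extractor, whose learned representations the RF classifier then consumes.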
ISSN: 2089-4872
DOI: 10.11591/ijai.v11.i1.pp121-128