Gammatone Wavelet features for sound classification in surveillance applications

Xavier Valero, Francesc Alías

Research output: Book chapter › Conference contribution › peer-review

21 Citations (Scopus)

Abstract

Sound can deliver highly informative data about the environment, which is of particular interest for home tele-assistance and surveillance purposes. In the sound event recognition process, signal parameterisation is a crucial aspect. In this work, we propose Gammatone-Wavelet features (GTW) by merging Wavelet analysis, which is well suited to representing the characteristics of surveillance-related sounds, with Gammatone functions, which model the human auditory system. An experimental evaluation, consisting of classifying a set of surveillance-related sounds with Support Vector Machines, has been conducted under different SNR conditions. Compared to typical Wavelet analysis with the Daubechies mother function (DWC), the GTW features show superior classification accuracy in both noiseless and noisy conditions at almost every SNR level. Finally, it is observed that the combination of DWC and GTW yields the highest classification accuracies.
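The Gammatone function at the core of the proposed features is a standard auditory-filter model: a 4th-order Gammatone has impulse response g(t) = t^3 e^{-2πbt} cos(2πf_c t), with bandwidth b usually tied to the ERB scale. The sketch below (an illustration only, not the paper's actual GTW feature extractor) shows how such kernels can be used to derive a simple per-band log-energy feature vector from a signal; the function names and the choice of centre frequencies are the author's assumptions:

```python
import numpy as np

def gammatone(fc, fs, duration=0.05, order=4):
    """4th-order Gammatone impulse response (auditory filter model).
    Bandwidth follows the ERB scale: b = 1.019 * (24.7 + 0.108 * fc)."""
    t = np.arange(int(duration * fs)) / fs
    b = 1.019 * (24.7 + 0.108 * fc)           # ERB-based bandwidth (Hz)
    g = t ** (order - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * fc * t)
    return g / np.sqrt(np.sum(g ** 2))        # unit-energy normalisation

def band_energies(x, fs, centre_freqs):
    """Log-energy of the signal in each Gammatone band (a crude feature vector)."""
    feats = []
    for fc in centre_freqs:
        y = np.convolve(x, gammatone(fc, fs), mode="same")
        feats.append(np.log(np.sum(y ** 2) + 1e-12))
    return np.array(feats)

fs = 16000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 1000 * t)              # 1 kHz test tone
feats = band_energies(x, fs, [250, 500, 1000, 2000, 4000])
print(np.argmax(feats))                       # band centred at 1 kHz dominates
```

Such per-band features would then feed a classifier such as the Support Vector Machine used in the paper's evaluation; the actual GTW scheme additionally embeds the Gammatone shape into a wavelet-style multiresolution analysis.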

Original language: English
Title of host publication: Proceedings of the 20th European Signal Processing Conference, EUSIPCO 2012
Pages: 1658-1662
Number of pages: 5
Publication status: Published - 2012
Event: 20th European Signal Processing Conference, EUSIPCO 2012 - Bucharest, Romania
Duration: 27 Aug 2012 - 31 Aug 2012

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491

Conference

Conference: 20th European Signal Processing Conference, EUSIPCO 2012
Country/Territory: Romania
City: Bucharest
Period: 27/08/12 - 31/08/12

Keywords

  • Ambient Assisted Living
  • Gammatone function
  • Wavelet analysis
  • audio classification
  • audio-based surveillance
  • feature extraction
