Underwater Acoustic Propagation and Immersive Soundscapes

Project: Internal grants / own calls, internal project grants

Project details

Description

Underwater Acoustic Propagation and Immersive Soundscapes explores how marine soundscapes can be measured, analyzed, and reconstructed in an immersive environment. The project uses extensive underwater recordings, including campaigns in the Black Sea and Bergen fjords, to investigate the physical processes that govern sound propagation underwater. These datasets, originally collected in the framework of international marine noise monitoring initiatives, provide a baseline to compare quiet ocean conditions with human-dominated acoustic environments.
The project begins with a curation and synchronization stage, merging underwater acoustic recordings with vessel tracking data (AIS), oceanographic and meteorological metadata, and the technical capabilities of the IASLAB immersive room. Extracting both baseline “quiet” recordings and noisy anthropogenic segments ensures that the contrast between natural and human-driven soundscapes can be rigorously analyzed.
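The synchronization step described above amounts to aligning each acoustic segment with the nearest vessel-tracking (AIS) ping in time. A minimal sketch, with hypothetical timestamps and purely illustrative variable names:

```python
import numpy as np

# Hypothetical timestamps (seconds since the start of a recording campaign):
# each acoustic segment carries a start time; AIS pings arrive irregularly.
# We attach to each segment the index of the closest AIS ping in time so
# that vessel positions can be joined to the audio.
segment_times = np.array([0.0, 30.0, 60.0, 90.0])     # acoustic segment starts
ais_times = np.array([2.0, 33.0, 58.0, 95.0, 120.0])  # AIS ping times (sorted)

# Candidate insertion point of each segment time in the sorted AIS times.
idx = np.searchsorted(ais_times, segment_times)
idx = np.clip(idx, 1, len(ais_times) - 1)

# Pick whichever neighbour (left or right) is closer in time.
left, right = ais_times[idx - 1], ais_times[idx]
nearest = np.where(segment_times - left <= right - segment_times, idx - 1, idx)

print(nearest.tolist())  # nearest AIS ping index per segment
```

In practice the same nearest-in-time join can be delegated to a time-series library, but the core logic is this binary search plus a left/right comparison.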
The central focus is on physical acoustics and underwater sound propagation. Using propagation models (ray theory, normal modes, parabolic equation methods [1][2]), the project will reconstruct how sound travels through seawater under diverse conditions of depth, salinity, stratification, and bathymetry. Parameters such as transmission loss (TL = 20 log₁₀ R + αR, i.e. spherical spreading plus absorption over range R, with α the frequency-dependent absorption coefficient), reverberation, and frequency-dependent absorption will be computed, validated with field data, and then used to drive immersive reconstructions. In this way, the IASLAB installation will reproduce not only the recordings themselves but also the underlying physics of underwater sound transmission.
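As an illustration, the transmission-loss expression can be evaluated using Thorp's empirical formula for frequency-dependent absorption in seawater (a standard low-frequency approximation; the function names below are ours, not from the project):

```python
import numpy as np

def thorp_absorption_db_per_km(f_khz):
    """Thorp's empirical absorption formula (dB/km, frequency in kHz)."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)
            + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2
            + 0.003)

def transmission_loss_db(range_m, f_khz):
    """TL = 20 log10(R) + alpha*R: spherical spreading plus absorption."""
    alpha_db_per_m = thorp_absorption_db_per_km(f_khz) / 1000.0
    return 20 * np.log10(range_m) + alpha_db_per_m * range_m

# Example: a 1 kHz source heard at 10 km range.
tl = transmission_loss_db(10_000.0, 1.0)  # ~80 dB spreading + ~0.7 dB absorption
```

At low frequencies spreading dominates; absorption grows rapidly with frequency, which is why distant shipping noise sounds low-pass filtered.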
To ensure scientific depth, automated event detection and relevance filtering will be applied. Machine learning techniques—unsupervised clustering, spectrogram classification, and anomaly detection [3][4]—will identify meaningful acoustic events such as vessel passages, biological signals, or extended periods of quiet.
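A minimal stand-in for the anomaly-detection stage is a robust threshold on short-time frame energy: frames dominated by a vessel passage stand out against the quiet-ocean background distribution. This sketch uses a synthetic signal and a median/MAD threshold; the real pipeline would operate on spectrogram features and learned models:

```python
import numpy as np

def detect_events(signal, frame_len=1024, k=10.0):
    """Flag frames whose log-energy exceeds median + k * MAD.

    A deliberately simple anomaly detector: robust statistics of the
    background make loud transient events (e.g. a vessel passage)
    easy to separate from quiet-ocean frames.
    """
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    log_energy = np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    med = np.median(log_energy)
    mad = np.median(np.abs(log_energy - med)) + 1e-12
    return np.where(log_energy > med + k * mad)[0]

# Synthetic test: quiet background noise with one loud burst.
rng = np.random.default_rng(0)
x = 0.01 * rng.standard_normal(16 * 1024)
x[8 * 1024:9 * 1024] += 0.5 * rng.standard_normal(1024)  # simulated passage
events = detect_events(x)
print(events.tolist())
```

The flagged frame indices would then be cut into the "noisy anthropogenic segments" contrasted with the baseline recordings.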
Within the IASLAB immersive environment, an interactive sonification and visualization platform will allow audiences to navigate an “acoustic map.” By moving within the room or selecting points on the map, visitors will trigger soundscapes reconstructed with accurate propagation effects, dynamically changing with vessel proximity or environmental conditions. Visual projections will complement the auditory experience by illustrating physical phenomena such as spreading, scattering, and masking.
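A minimal sketch of how listener position on the acoustic map could drive per-source playback level, assuming simple spherical spreading relative to a 1 m reference range (coordinates, units, and function names are illustrative, not the installation's actual API):

```python
import numpy as np

def source_gain(listener_xy, source_xy, ref_range_m=1.0):
    """Linear gain for a source recording as the listener moves on the map.

    Gain = 10^(-TL/20) with TL = 20 log10(R / ref_range): spherical
    spreading only; absorption is omitted in this low-frequency sketch.
    """
    dx, dy = np.subtract(source_xy, listener_xy)
    r = max(np.hypot(dx, dy), ref_range_m)   # clamp to the reference range
    tl_db = 20 * np.log10(r / ref_range_m)
    return 10 ** (-tl_db / 20)               # equivalent to ref_range / r

# A listener at the origin, a vessel source at 10 m and then 100 m away.
g_near = source_gain((0.0, 0.0), (10.0, 0.0))    # gain 0.1
g_far = source_gain((0.0, 0.0), (100.0, 0.0))    # gain 0.01
```

Recomputing these gains as the visitor moves yields the dynamic change with vessel proximity described above; a fuller renderer would add the frequency-dependent absorption and reverberation from the propagation models.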
The project culminates in public engagement and scenario building, offering narrative modes such as “quiet ocean” versus “human-dominated ocean.” Optional machine-learning personalization may adapt scenarios to visitor behavior or real environmental inputs. This dual approach—rigorous underwater acoustic modeling coupled with immersive dissemination—demonstrates how marine acoustics can advance both scientific research and public awareness.

Life cycle:
The project will formally run for one year, combining new field recordings with data synchronization, propagation modeling, event detection, and immersive reconstruction. The IASLAB will serve both as a research infrastructure to test data-driven rendering of underwater sound propagation, and as a public-facing platform to communicate scientific insights.
Expected outputs include scientific publications on underwater acoustic propagation and event detection, and an immersive installation open to audiences. This pilot will lay the groundwork for larger-scale initiatives combining acoustics, machine learning, and public engagement in the study of marine soundscapes.
References:
[1] Jensen, F. B., Kuperman, W. A., Porter, M. B., & Schmidt, H. (2011). Computational Ocean Acoustics. Springer.
[2] Etter, P. C. (2018). Underwater Acoustic Modeling and Simulation. CRC Press.
[3] Sethi, S. S., Jones, N. S., Fulda, N., Clink, D. J., & Klinck, H. (2020). "Characterizing soundscapes across diverse ecosystems using a universal acoustic feature set." PNAS, 117(29), 17049–17055.
[4] Ibrahim, A. K., Salamon, R., & Cohen, I. (2022). “Machine learning for underwater acoustic event detection: Challenges and opportunities.” Applied Acoustics, 190, 108638.
Status: Active
Effective start and end dates: 1/01/25 to 31/12/25
