Scan and paint: theory and practice of a sound field visualization method

Sound visualization techniques have played a key role in the development of acoustics throughout history. The development of measurement apparatus and techniques for displaying sound and vibration phenomena has provided excellent tools for building understanding about specific problems. Traditional methods, such as step-by-step measurements or simultaneous multichannel systems, have a strong tradeoff between time requirements, flexibility, and cost. However, if the sound field can be assumed time stationary, scanning methods allow us to assess variations across space with a single transducer, as long as the position of the sensor is known. The proposed technique, Scan and Paint, is based on the acquisition of sound pressure and particle velocity by manually moving a P-U probe (a combined pressure-particle velocity sensor) across a sound field whilst filming the event with a camera. The sensor position is extracted by applying automatic color tracking to each frame of the recorded video. It is then possible to visualize sound variations across space in terms of sound pressure, particle velocity, or acoustic intensity. In this paper, not only the theoretical foundations of the method but also its practical applications are explored, such as scanning transfer path analysis, source radiation characterization, operational deflection shapes, virtual phased arrays, material characterization, and acoustic intensity vector field mapping.
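The position-tracking step described in the abstract can be sketched in a few lines. The paper does not prescribe a particular implementation, so the snippet below is only a plausible illustration: it assumes OpenCV, a probe fitted with a saturated-colour marker, and placeholder HSV bounds (HSV_LO, HSV_HI) that would need tuning for a real recording.

```python
# Hedged sketch: per-frame colour tracking of a marker attached to the P-U probe.
# Assumptions (not from the paper): OpenCV is available, the probe carries a
# saturated-colour marker, and HSV_LO / HSV_HI are placeholders to be tuned.
import cv2
import numpy as np

HSV_LO = np.array([35, 80, 80])    # placeholder lower HSV bound (greenish marker)
HSV_HI = np.array([85, 255, 255])  # placeholder upper HSV bound

def track_marker(video_path):
    """Return a list of (frame_index, x, y) marker centroids, one per frame."""
    cap = cv2.VideoCapture(video_path)
    positions = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, HSV_LO, HSV_HI)   # binary mask of the marker colour
        m = cv2.moments(mask)
        if m["m00"] > 0:                          # marker visible in this frame
            x = m["m10"] / m["m00"]               # centroid in pixel coordinates
            y = m["m01"] / m["m00"]
            positions.append((frame_idx, x, y))
        frame_idx += 1
    cap.release()
    return positions
```

Each centroid time-stamps the probe position, so the simultaneously recorded pressure and particle-velocity signals can later be attributed to a point in space.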

Bibliographic Details
Main Authors: Fernandez Comesana, Daniel (Author), Steltenpool, Steven (Author), Carrillo Pousa, Graciano (Author), de Bree, Hans-Elias (Author), Holland, Keith R. (Author)
Format: Article
Language: English
Published: 2013-08-27.
Subjects:
Online Access: Get fulltext
LEADER 02066 am a22001813u 4500
001 356150
042 |a dc 
100 1 0 |a Fernandez Comesana, Daniel  |e author 
700 1 0 |a Steltenpool, Steven  |e author 
700 1 0 |a Carrillo Pousa, Graciano  |e author 
700 1 0 |a de Bree, Hans-Elias  |e author 
700 1 0 |a Holland, Keith R.  |e author 
245 0 0 |a Scan and paint: theory and practice of a sound field visualization method 
260 |c 2013-08-27. 
856 |z Get fulltext  |u https://eprints.soton.ac.uk/356150/1/ISRN_Mechanical_Engineering_Journal_Scan%2520and%2520Paint%2520principles.pdf 
520 |a Sound visualization techniques have played a key role in the development of acoustics throughout history. The development of measurement apparatus and techniques for displaying sound and vibration phenomena has provided excellent tools for building understanding about specific problems. Traditional methods, such as step-by-step measurements or simultaneous multichannel systems, have a strong tradeoff between time requirements, flexibility, and cost. However, if the sound field can be assumed time stationary, scanning methods allow us to assess variations across space with a single transducer, as long as the position of the sensor is known. The proposed technique, Scan and Paint, is based on the acquisition of sound pressure and particle velocity by manually moving a P-U probe (a combined pressure-particle velocity sensor) across a sound field whilst filming the event with a camera. The sensor position is extracted by applying automatic color tracking to each frame of the recorded video. It is then possible to visualize sound variations across space in terms of sound pressure, particle velocity, or acoustic intensity. In this paper, not only the theoretical foundations of the method but also its practical applications are explored, such as scanning transfer path analysis, source radiation characterization, operational deflection shapes, virtual phased arrays, material characterization, and acoustic intensity vector field mapping. 
540 |a other 
655 7 |a Article
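As a companion to the abstract in the 520 field above, the sketch below shows one plausible way to turn synchronised pressure and particle-velocity signals, together with the tracked positions, into a sound-intensity map. It is not the authors' implementation: the broadband time average I = <p(t)·u(t)>, the grid resolution, and all variable names (p, u, fs, fps, positions) are assumptions for illustration only.

```python
# Hedged sketch: mapping time-averaged sound intensity onto tracked positions.
# Assumptions (not from the paper): p and u are synchronised 1-D NumPy arrays of
# pressure and particle velocity, fs/fps are audio and video rates, and
# positions holds (frame_index, x, y) tuples from the colour-tracking step.
import numpy as np

def intensity_map(p, u, fs, positions, fps, grid_shape=(40, 30)):
    """Average active intensity I = <p(t)*u(t)> per grid cell; returns (map, counts)."""
    acc = np.zeros(grid_shape)
    cnt = np.zeros(grid_shape)
    xs = np.array([pos[1] for pos in positions], dtype=float)
    ys = np.array([pos[2] for pos in positions], dtype=float)
    # Normalise pixel coordinates onto the chosen grid.
    gx = ((xs - xs.min()) / (np.ptp(xs) + 1e-12) * (grid_shape[0] - 1)).astype(int)
    gy = ((ys - ys.min()) / (np.ptp(ys) + 1e-12) * (grid_shape[1] - 1)).astype(int)
    seg = int(round(fs / fps))                # audio samples per video frame
    for (frame_idx, _, _), ix, iy in zip(positions, gx, gy):
        start = frame_idx * seg
        stop = min(start + seg, len(p))
        if stop <= start:
            continue                          # frame falls outside the recording
        # Time-averaged active intensity over this frame's audio segment.
        acc[ix, iy] += np.mean(p[start:stop] * u[start:stop])
        cnt[ix, iy] += 1
    avg = np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)
    return avg, cnt
```

A frequency-resolved map could be built the same way by replacing the broadband time average with per-band cross-spectral estimates between pressure and particle velocity.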