Visually Indicated Sounds

Objects make distinctive sounds when they are hit or scratched. These sounds reveal aspects of an object's material properties, as well as the actions that produced them. In this paper, we propose the task of predicting what sound an object makes when struck as a way of studying physical interactions within a visual scene. We present an algorithm that synthesizes sound from silent videos of people hitting and scratching objects with a drumstick. This algorithm uses a recurrent neural network to predict sound features from videos and then produces a waveform from these features with an example-based synthesis procedure. We show that the sounds predicted by our model are realistic enough to fool participants in a "real or fake" psychophysical experiment, and that they convey significant information about material properties and physical interactions.
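As a toy illustration of the pipeline the abstract describes (a recurrent network predicting sound features from video features, followed by example-based waveform synthesis), here is a minimal NumPy sketch. All dimensions, names, and the untrained random weights are hypothetical stand-ins, not the authors' implementation: in the paper, video features come from a deep network and sound features are cochleagram-like envelopes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for per-frame video features.
T, VID_DIM, SND_DIM = 12, 8, 4           # frames, video-feat dim, sound-feat dim
video_feats = rng.normal(size=(T, VID_DIM))

# A minimal recurrent predictor: h_t = tanh(Wx x_t + Wh h_{t-1}), s_t = Wo h_t.
# Weights are untrained placeholders for illustration only.
H = 6
Wx = rng.normal(scale=0.1, size=(H, VID_DIM))
Wh = rng.normal(scale=0.1, size=(H, H))
Wo = rng.normal(scale=0.1, size=(SND_DIM, H))

def predict_sound_features(x):
    h = np.zeros(H)
    out = []
    for t in range(len(x)):
        h = np.tanh(Wx @ x[t] + Wh @ h)  # recurrent state update
        out.append(Wo @ h)               # per-frame sound features
    return np.stack(out)

pred = predict_sound_features(video_feats)   # shape (T, SND_DIM)

# Example-based synthesis: match each predicted feature vector to its
# nearest neighbour in an exemplar bank and splice in that exemplar's audio.
N_EX, SNIP = 20, 16                          # exemplars, samples per snippet
bank_feats = rng.normal(size=(N_EX, SND_DIM))
bank_audio = rng.normal(size=(N_EX, SNIP))

dists = np.linalg.norm(pred[:, None, :] - bank_feats[None, :, :], axis=2)
nearest = dists.argmin(axis=1)               # best exemplar per frame
waveform = np.concatenate([bank_audio[i] for i in nearest])

print(waveform.shape)                        # (T * SNIP,) = (192,)
```

The nearest-neighbour retrieval step is what makes the synthesis "example-based": rather than generating raw audio directly, predicted features select real recorded snippets, which keeps the output waveform plausible.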

Bibliographic Details
Main Authors: Isola, Phillip (Author), McDermott, Josh (Author), Adelson, Edward H. (Author), Freeman, William T. (Author), Torralba, Antonio (Contributor), Owens, Andrew Hale (Contributor)
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory (Contributor), Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (Contributor)
Format: Article
Language: English
Published: Institute of Electrical and Electronics Engineers (IEEE), 2017-12-08T17:59:29Z.
Online Access: Get fulltext
LEADER 02087 am a22003013u 4500
001 112659
042 |a dc 
100 1 0 |a Isola, Phillip  |e author 
100 1 0 |a Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory  |e contributor 
100 1 0 |a Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science  |e contributor 
100 1 0 |a Torralba, Antonio  |e contributor 
100 1 0 |a Owens, Andrew Hale  |e contributor 
700 1 0 |a McDermott, Josh  |e author 
700 1 0 |a Adelson, Edward H.  |e author 
700 1 0 |a Freeman, William T.  |e author 
700 1 0 |a Torralba, Antonio  |e author 
700 1 0 |a Owens, Andrew Hale  |e author 
245 0 0 |a Visually Indicated Sounds 
260 |b Institute of Electrical and Electronics Engineers (IEEE),   |c 2017-12-08T17:59:29Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/112659 
520 |a Objects make distinctive sounds when they are hit or scratched. These sounds reveal aspects of an object's material properties, as well as the actions that produced them. In this paper, we propose the task of predicting what sound an object makes when struck as a way of studying physical interactions within a visual scene. We present an algorithm that synthesizes sound from silent videos of people hitting and scratching objects with a drumstick. This algorithm uses a recurrent neural network to predict sound features from videos and then produces a waveform from these features with an example-based synthesis procedure. We show that the sounds predicted by our model are realistic enough to fool participants in a "real or fake" psychophysical experiment, and that they convey significant information about material properties and physical interactions. 
520 |a National Science Foundation (U.S.) (grant 6924450) 
520 |a National Science Foundation (U.S.) (grant 6926677) 
520 |a Shell Oil Company 
520 |a Microsoft Corporation 
546 |a en_US 
655 7 |a Article 
773 |t IEEE Conference on Computer Vision and Pattern Recognition, 2016. CVPR 2016