Sound Research WIKINDX

Gaver, W. W. (1993). How do we hear in the world? Explorations in ecological acoustics. Ecological Psychology, 5(4), 285–313. 
Added by: sirfragalot (08/24/2005 01:08:50 PM)   Last edited by: sirfragalot (04/25/2013 04:39:09 PM)
Resource type: Journal Article
BibTeX citation key: Gaver1993a
Categories: Sound Design, Typologies/Taxonomies
Keywords: Acoustic ecology, Acoustics, Caricature, Earcons & Auditory Icons, perception, Psychoacoustics, Semantic categorization, Sound objects
Creators: Gaver
Collection: Ecological Psychology
Abstract
"Everyday listening is the experience of hearing events in the world rather than sounds per se. In this article, I explore the acoustic basis of everyday listening as a start toward understanding how sounds near the ear can indicate remote physical events. Information for sound-producing events and their dimensions is investigated using physical analyses of events to inform acoustic analyses of the sounds they produce. The result is a variety of algorithms which enable the synthesis of sounds made by basic-level events such as impacts, scraping, and dripping, as well as by more complex events such as bouncing, breaking, spilling, and machinery. These algorithms may serve as instantiated hypotheses about the acoustic information for events. Analysis and synthesis work together in their development: Just as analyses of the physics and acoustics of sound-producing events may inform synthesis, so listening to the results of synthesis may inform analysis. This raises several issues concerning evaluation, specification, and the tension between formal and informal physical analyses. In the end, however, the fundamental test of these algorithms is in the sounds they produce: I describe them in enough detail here that readers may implement, test, and extend them."
Notes
Some useful information on, and arguments for, synthesizing sounds based on the events they represent rather than on traditional acoustic properties such as frequency, amplitude, and time. Traditional acoustic analysis (musical listening) may be adequate for analysing and synthesising musical instruments, but analysing and synthesising non-musical sound-producing objects requires everyday listening.

The second in a pair of articles (Gaver 1993).

Gaver, W. W. (1993). What in the world do we hear? An ecological approach to auditory perception. Ecological Psychology, 5(1), 1–29.
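The abstract mentions algorithms for synthesizing basic-level events such as impacts. A common reading of Gaver's event-based approach models a struck object as a sum of exponentially damped sinusoids, with decay rates suggesting material and frequencies suggesting size and configuration. A minimal sketch along those lines (the specific partial frequencies, amplitudes, and decay rates below are illustrative assumptions, not values from the article):

```python
import numpy as np

def impact(partials, duration=0.5, sr=44100):
    """Synthesize an impact as a sum of exponentially damped sinusoids.

    partials: list of (frequency_hz, amplitude, decay_rate) tuples.
    Roughly, decay rates suggest the material and frequencies the
    size/configuration of the struck object.
    """
    t = np.arange(int(duration * sr)) / sr
    wave = np.zeros_like(t)
    for freq, amp, decay in partials:
        wave += amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)
    peak = np.max(np.abs(wave))
    return wave / peak if peak > 0 else wave

# Illustrative presets: "wooden" hit decays fast with harmonic partials;
# "metallic" hit rings longer with inharmonic partials.
wood = impact([(220, 1.0, 30.0), (440, 0.5, 40.0), (880, 0.25, 60.0)])
metal = impact([(300, 1.0, 3.0), (731, 0.6, 4.0), (1403, 0.4, 5.0)])
```

Changing only the event-level parameters (decay, partial spacing) changes the heard material, which is the sense in which the synthesis is parameterized by the event rather than by raw acoustic variables.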
Quotes
p.310   "...material per se does not exist as for mechanical physics, but instead is separated into many other dimensions such as density, elasticity, and homogeneity. Nonetheless, people do seem to hear the material of a struck object, rather than these other properties (Gaver 1993)."

Gaver, W. W. (1993). What in the world do we hear? An ecological approach to auditory perception. Ecological Psychology, 5(1), 1–29.
p.311   Gaver's simplified algorithms for modelling and synthesising sound (simplified by discarding parts of the algorithm that have no perceptible effect) are likened by him to "cartoon sounds", analogous to visual cartoons, which "capture some defining features while leaving out incidental ones."
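The abstract also mentions more complex events, such as bouncing, built from basic-level ones. In the cartoon spirit of the p.311 quote, a bounce can be sketched as one impact repeated at geometrically shrinking intervals and amplitudes, leaving out the incidental physics of restitution. All parameter values below are illustrative assumptions, not taken from the article:

```python
import numpy as np

SR = 44100

def damped_sine(freq=440.0, decay=25.0, duration=0.2, sr=SR):
    """A single basic-level impact: one exponentially damped sinusoid."""
    t = np.arange(int(duration * sr)) / sr
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def bounce(hit, n_bounces=6, first_gap=0.3, ratio=0.7, sr=SR):
    """Cartoon bouncing: each repeat arrives sooner and softer.

    The defining feature kept is the geometric pattern of timing and
    level; everything else about real bouncing is discarded.
    """
    gaps = [first_gap * ratio ** i for i in range(n_bounces)]
    out = np.zeros(int(sum(gaps) * sr) + len(hit))
    pos = 0.0
    for i, gap in enumerate(gaps):
        start = int(pos * sr)
        out[start:start + len(hit)] += hit * ratio ** i
        pos += gap
    return out

ball = bounce(damped_sine())
```

The point of the sketch is that listeners can hear "bouncing" from this crude pattern alone, which is what makes the cartoon analogy apt.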
Paraphrases
pp.285-287   Suggests that acousticians and psychologists are mistaken when they assume that hearing a sound as an event must involve higher, independent thought processes drawing on memory and experience, on the grounds (as these researchers assume) that the sound itself does not carry such information.
Keywords:   perception