Sound Research WIKINDX

List Resources

Displaying 1 - 3 of 3 (Bibliography: WIKINDX Master Bibliography)
Parameters
Keyword:  Binding problem
Harvey, M. A., & Sanchez-Vives, M. V. (2005). The binding problem in presence research. Presence: Teleoperators and Virtual Environments, 14(5), 616–621.  
Last edited by: sirfragalot 09/04/2020 02:51:50 PM
      Explains why virtual environments can facilitate presence despite lacking some sensory modalities: it is not the absence of one modality or another but incongruity between them that breaks presence. The brain can fill in missing sensory data. In a virtual world displaying a rose, "it should be less disruptive for the rose to have no smell than the wrong smell."
Vroomen, J., & Keetels, M. (2010). Perception of intersensory synchrony: A tutorial review. Attention, Perception, & Psychophysics, 72, 871–884.  
Added by: sirfragalot 08/25/2021 10:08:14 AM
      "The perception of time, however, and, in particular, synchrony among the senses, is not straightforward, because no sense organ registers time on an absolute scale. Moreover, to perceive synchrony, the brain must deal with differences in physical and neural transmission times. Sounds, for example, travel through air much more slowly than does light (330 vs. 300,000,000 m/sec), whereas no physical transmission time through air is involved for tactile stimulation, which is usually presented directly at the body surface. The neural processing time also differs among the senses, being typically slower for visual stimuli than for auditory stimuli (approximately 50 vs. 10 msec, respectively), whereas, for touch, the brain may have to take into account where the stimulation originated, because the traveling time is longer from the toes to the brain than from the nose (the typical conduction velocity is 55 m/sec, which results in a ~30-msec difference between toe and nose for a distance of 1.60 m; Macefield, Gandevia, & Burke, 1989). Because of these physical and neural differences, it has been argued that auditory and visual information arrives synchronously at the primary sensory cortices only if the event occurs at a distance of approximately 10–15 m from the observer. This has been called the horizon of simultaneity (Pöppel, 1985; Pöppel, Schill, & von Steinbüchel, 1990), assuming that, arguably, synchrony is perceived at the primary sensory cortices. Sounds should thus appear to arrive before visual stimuli if the audio–visual event is within 15 m of the observer, whereas vision should arrive before sounds for events farther away. Surprisingly, however, despite these naturally occurring lags among the senses, observers perceive intersensory synchrony for most multisensory events in the external world and not only for those at 15 m. 
Only in exceptional circumstances, such as the thunder that is heard after the lightning, is a single multisensory event perceived as being separated in time."

Pöppel, E. (1985). Grenzen des Bewußtseins. Stuttgart: Deutsche Verlags-Anstalt. [Translated as Mindworks: Time and conscious experience. New York: Harcourt Brace Jovanovich, 1988.]

Pöppel, E., Schill, K., & von Steinbüchel, N. (1990). Sensory integration within temporally neutral systems states: A hypothesis. Naturwissenschaften, 77, 89–91. doi:10.1007/BF01131783
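The horizon-of-simultaneity arithmetic in the passage above can be reproduced directly. This is a minimal sketch using only the figures quoted from Vroomen & Keetels (330 m/s for sound, ~50 vs ~10 ms neural processing for vision vs audition); the function name and structure are illustrative, not from the source.

```python
# Sketch of the "horizon of simultaneity" arithmetic quoted from
# Vroomen & Keetels (2010). All numeric values come from the passage;
# the helper function itself is an assumption for illustration.

SPEED_OF_SOUND = 330.0           # m/s through air, as quoted
SPEED_OF_LIGHT = 300_000_000.0   # m/s, as quoted
NEURAL_VISUAL = 0.050            # s, approximate visual processing time
NEURAL_AUDITORY = 0.010          # s, approximate auditory processing time

def cortical_arrival_gap(distance_m):
    """Auditory minus visual arrival time at the primary sensory
    cortices for an audio-visual event at the given distance.
    Positive means the sound arrives later than the sight."""
    auditory = distance_m / SPEED_OF_SOUND + NEURAL_AUDITORY
    visual = distance_m / SPEED_OF_LIGHT + NEURAL_VISUAL
    return auditory - visual

# The gap crosses zero where sound's travel lag exactly cancels
# vision's slower neural processing (light travel time is negligible):
horizon = (NEURAL_VISUAL - NEURAL_AUDITORY) * SPEED_OF_SOUND
print(horizon)  # 13.2 m -- inside the 10-15 m range the passage cites
```

Closer than this distance the auditory signal reaches cortex first; farther away the visual signal does, matching the quoted claim that "sounds should thus appear to arrive before visual stimuli if the audio–visual event is within 15 m."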

      "the assumption of unity. It states that, as information from different modalities share more (amodal) properties, the more likely it is that the brain treats them as originating from a common object or source. . . Without doubt, the most important amodal property is temporal coincidence . . . From this perspective, one expects intersensory interactions to occur if, and only if, information from the different sense organs reaches the brain at around the same time; otherwise, separate events are perceived, rather than a single multisensory one."
Westerhoff, J. (2011). Reality: A very short introduction. Oxford: Oxford University Press.  
Added by: sirfragalot 07/20/2021 11:22:00 AM
      "The different information coming in from our senses, visual, auditory, tactile, olfactory, and gustatory sensations are processed in different regions of the brain. They have to travel different distances [...] and arrive at different times. The processing speed for different kinds of sensory information varies; visual stimuli take longer to process than other stimuli. (The difference is about 40 milliseconds. [...]) On the other hand, light travels much faster than sound. Putting together these different speeds means that sights and sounds from about 10 metres away are available to consciousness at about the same time; for everything closer or further away information about its sight or sound arrives at different times. In these cases, the apparent simultaneity of, for example, hearing a voice and seeing the speaker's lips move has to be constructed by our brain."
WIKINDX 6.4.12 | Total resources: 1102 | Username: -- | Bibliography: WIKINDX Master Bibliography | Style: American Psychological Association (APA)

