-
We __ events of our world through our sensory __. Events give rise to __ in different sensory channels; the brain stitches this __ __ together to __ events.
- sample
- apparatus
- energy
- multisensory information
- estimate
-
Give two reasons why multisensory processing improves perceptual estimates.
- 1. info from one sensory modality can be ambiguous by itself but is clarified and constrained by info from another modality (e.g. Sekuler et al., 1997, bouncing-ball illusion)
- 2. all sensory processing is subject to noise (e.g. photon noise, the stochastic nature of action potentials, and loss of info at higher cognitive levels). Having multiple modalities helps minimise variability.
-
Talk to me a bit about the modularity of the sensory systems. Why is it important?
- Sensory processing is, broadly speaking, initially modular
- followed by subsequent stages of combination
- It is important that the noise in the different modalities is independent, because the benefit of multisensory processing comes from this independent noise partly cancelling out when estimates are combined
- Basic organisation of the cerebral cortex suggests a division of labour and modularity (e.g. visual cortex, auditory cortex)
- HOWEVER: there is some debate as to the extent to which multisensory processing happens across the cortex
- Growing evidence (Ghazanfar & Schroeder, 2006) suggests multisensory processing in almost all cortical areas.
-
Does one sensory modality dominate the others, especially in relation to sound and vision?
- Some examples.
- Vision: ventriloquist effect - the movement of the puppet's mouth leads viewers to perceive it as the source of the sound
- Sound: the perceived number of visual events (flashes) is dominated by the number of auditory events (beeps) (Shams et al., 2000)
- Fairly widely-held idea - certain modalities dominate certain tasks because they are intrinsically better at signalling certain types of info, e.g. sound - temporal resolution; vision - spatial resolution (about 100:1 relative to sound)
-
Describe what the Shams et al. (2000) experiment was about.
- the two-flash illusion
- participant presented with single flash of light accompanied by two sound beeps
- they perceive two flashes
-
However, even if these intrinsic advantages exist, does this mean some modalities should 'capture' certain tasks?
- Computationally, it is not sensible to believe this happens
- e.g. in an echoey cave, we shouldn't rely on sound as much
- It does not make much sense to throw away one source of information
-
Instead of dominance, how else can the brain process two or more different sensory modalities?
- Combination
- eg. McGurk effect
- a film of a person visually saying /ga/ with a soundtrack of a person saying /ba/ is heard as /da/.
- Thus, interestingly, the perceptual interpretation of the event is something that does not correspond to either modality in isolation.
- But what actually governs when modalities are combined and when single modalities appear to dominate?
-
Estimates of the stimulus get more __ (have lower __) when sensory modalities are __. However, instead of simply averaging the different modalities, what is done that is more effective?
- reliable
- variance
- combined
- The estimate is the combination of the signals, weighted by the reliability of each.
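- (A sketch of the standard weighting rule; the symbols $\hat{S}_1, \hat{S}_2, \sigma_1^2, \sigma_2^2$ are my own labels for the two cues' estimates and variances, not notation from the original notes.)

$$\hat{S} = w_1 \hat{S}_1 + w_2 \hat{S}_2, \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}$$

- i.e. each weight is proportional to that cue's reliability (the inverse of its variance).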
-
What do we typically use to describe sensory estimates? What can it tell us?
- Gaussian model (Normal Distribution)
- Can tell us the variance across many different responses from the same modality, which shows how reliable that modality is.
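- (As a sketch, the Gaussian description of a single cue's estimate; $\hat{S}$ for the estimate, $S$ for the true stimulus value and $\sigma^2$ for the variance are my own labels.)

$$p(\hat{S} \mid S) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(\hat{S}-S)^2}{2\sigma^2}\right), \qquad \text{reliability}\; r = \frac{1}{\sigma^2}$$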
-
How would you combine the information from two modalities using Gaussian distributions?
- statistically optimal inference using Maximum Likelihood Estimation (MLE) - a weighted combination of the cues that minimises the variance of the combined estimate
- Multiply the two distributions together
- Resultant estimate - Gaussian that is centred between the two component estimates and has lower variance (higher reliability) than either modality alone.
- the mean of this estimate is a weighted average of the means of the individual cues (where each weight is determined by the reliability of that cue).
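- A minimal numerical sketch of this multiplication (Python; the function name and example numbers are mine): multiplying two Gaussian likelihoods gives a Gaussian whose mean is the reliability-weighted average of the two means and whose variance is lower than either cue's alone.

```python
def combine_gaussian_cues(mu1, var1, mu2, var2):
    """MLE combination of two independent Gaussian cue estimates.

    Weights are proportional to reliability (1 / variance); the combined
    variance is always lower than either individual variance.
    """
    r1, r2 = 1.0 / var1, 1.0 / var2          # reliabilities
    w1, w2 = r1 / (r1 + r2), r2 / (r1 + r2)  # normalised weights
    mu_combined = w1 * mu1 + w2 * mu2        # reliability-weighted mean
    var_combined = 1.0 / (r1 + r2)           # product of Gaussians shrinks variance
    return mu_combined, var_combined

# Illustrative example: a visual size estimate of 55 mm (variance 1.0) and a
# haptic estimate of 53 mm (variance 4.0). The combined estimate lies closer
# to the more reliable visual cue and has lower variance than either alone.
print(combine_gaussian_cues(55.0, 1.0, 53.0, 4.0))  # -> approximately (54.6, 0.8)
```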
-
Why is Maximum Likelihood Estimation (MLE) useful for understanding how different modalities are used?
- It seems to resolve the dominance-vs-combination debate
- Experimentally, results which seemed to show dominance may simply have been due to the fact that it is very difficult to separate a combined estimate from a single component when one modality's reliability is very high (see the limiting case below)
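- The limiting case makes this concrete (same weight formula as above, my own notation): when one cue's variance is far larger than the other's, the weights tend to 1 and 0, so optimal combination is experimentally indistinguishable from dominance.

$$\text{if } \sigma_2^2 \gg \sigma_1^2: \quad w_1 = \frac{1/\sigma_1^2}{1/\sigma_1^2 + 1/\sigma_2^2} \to 1, \quad w_2 \to 0, \quad \hat{S} \approx \hat{S}_1$$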
-
Describe the classic study in 2002 which provided evidence for optimal integration.
- Ernst & Banks (2002)
- Studied perception of object size from visual and haptic cues
- asked to judge the thickness of a horizontal bar based on 3 sources of info:
- 1. visual
- 2. haptic (grasping it)
- 3. visual and haptic
- Asked to compare a standard object with various others and judge which was smaller or larger
- Participants were better using visual cues, but crucially, visual+haptic yielded the best performance.
- Change of condition:
- add visual 'noise' by randomising the position of some of the elements that defined the object's size
- Result: as predicted by the MLE model, as the reliability of the visual cue was reduced, participants gave more weight to the haptic cues. Visual dominance --> haptic dominance
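- A small Python sketch of this prediction (the variance values are illustrative, not Ernst & Banks's measurements): as the visual variance grows, the weight on vision falls and the weight on haptics rises.

```python
# Reliability-based weights for vision vs haptics as visual noise increases.
# Haptic variance is held fixed; all numbers are illustrative only.
haptic_var = 4.0
for visual_var in (1.0, 4.0, 16.0):
    w_visual = (1 / visual_var) / (1 / visual_var + 1 / haptic_var)
    print(f"visual variance {visual_var:5.1f} -> visual weight {w_visual:.2f}")
# visual variance   1.0 -> visual weight 0.80  (vision dominates)
# visual variance   4.0 -> visual weight 0.50  (equal weighting)
# visual variance  16.0 -> visual weight 0.20  (haptics dominates)
```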
-
Describe the follow-up study to Ernst & Banks (2002) study.
- Alais and Burr (2004)
- Manipulation of azimuth of visual and auditory targets ('is this left or right of a reference?')
- Vision: spots of light on a computer display
- Sound: positions defined by interaural time difference (ITD)
- Judgements based on sound alone were much worse than with vision (bigger differences from the reference were needed).
- When both cues were combined and specified the same location, performance was better, again adding weight to the MLE model.
- Condition: when sound and vision were presented together but specified different locations
- --> judgements of position were pulled all the way towards the visual spatial position
- Ventriloquist effect
- Condition 2: blur the visual target (making it less reliable)
- --> judgements shifted towards being dominated by the auditory cue.
-
Final word about MLE model and combination of multiple senses.
- The MLE model has moved the understanding of multisensory combination away from earlier ad hoc accounts of how sensory info is combined.
- Evidence from Ernst & Banks, Alais & Burr, and many subsequent studies shows that the brain uses a process close to the statistically optimal one.
-
How is Bayes' rule important in understanding MLE and the combination of multisensory information?
- Bayes' rule states that an unknown probability of an event can be calculated using knowledge of related probabilities
- It can be thought of as a broader theory in which MLE sits
- Basically: the use of prior knowledge of the statistical likelihood of events when making inferences (e.g. about the position or shape of an object)
- Classic example of this: the interpretation of shading patterns
- a shading pattern can look like either a hole or a bump, but humans interpret it by assuming the pattern was produced by illumination from above the object.
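- In symbols (a standard statement of the rule; $s$ for the world state and $x$ for the sensory data are my own labels):

$$p(s \mid x) = \frac{p(x \mid s)\,p(s)}{p(x)} \;\propto\; p(x \mid s)\,p(s)$$

- i.e. posterior is proportional to likelihood times prior.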
-
We can think about __ __ in exactly the same way as we thought about sensory cues. It will have a __ __ and can be __ with sensory data using the same rules as for integrating different sensory modalities (__ __ __).
- prior information/knowledge
- probability distribution
- combined
- multiplying probability distributions
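- A minimal Python sketch of this idea (the prior and likelihood numbers are illustrative): the prior is treated as just another Gaussian and multiplied with the sensory likelihood exactly as two sensory cues are.

```python
# A broad Gaussian prior (e.g. "targets are usually near straight ahead")
# combined with a precise Gaussian sensory likelihood, using the same
# reliability-weighted rule as for two sensory cues. Numbers are illustrative.
prior_mu, prior_var = 0.0, 9.0    # prior: centred on 0, high variance (weak)
sense_mu, sense_var = 3.0, 1.0    # sensory estimate: 3, low variance (strong)
r_prior, r_sense = 1 / prior_var, 1 / sense_var
posterior_mu = (r_prior * prior_mu + r_sense * sense_mu) / (r_prior + r_sense)
posterior_var = 1 / (r_prior + r_sense)
print(posterior_mu, posterior_var)  # -> 2.7, 0.9 (pulled slightly towards the prior)
```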