ARVO Annual Meeting Abstract  |   June 2015
Induced Audio/Visual Cortical Remapping via Looming Stimulus
Author Affiliations & Notes
  • Mark H Myers
    Ophthalmology, University of Tennessee Health Science Center, Memphis, TN
  • Daniel Albarran
    Ophthalmology, University of Tennessee Health Science Center, Memphis, TN
  • Ally Dobbins
    Ophthalmology, University of Tennessee Health Science Center, Memphis, TN
  • Charlotte Joure
    Ophthalmology, University of Tennessee Health Science Center, Memphis, TN
  • Aaron Canales
    Psychology, University of Memphis, Memphis, TN
  • Gavin Bidelman
    School of Communication Sciences & Disorders, University of Memphis, Memphis, TN
  • Footnotes
    Commercial Relationships Mark Myers, None; Daniel Albarran, None; Ally Dobbins, None; Charlotte Joure, None; Aaron Canales, None; Gavin Bidelman, None
    Support None
Investigative Ophthalmology & Visual Science June 2015, Vol.56, 2929.

Mark H Myers, Daniel Albarran, Ally Dobbins, Charlotte Joure, Aaron Canales, Gavin Bidelman; Induced Audio/Visual Cortical Remapping via Looming Stimulus. Invest. Ophthalmol. Vis. Sci. 2015;56(7):2929.


Purpose: We hypothesize that induced audio/visual integration in non-synesthetes will elicit intraparietal (IPL) activity similar to the signature neural responses observed in individuals with synesthesia. These parietal activations may reflect involvement of a “multisensory nexus” that, via disinhibition, leads to synesthetic experiences. Audio/visual integration was measured using electroencephalography (EEG) neural markers.

Methods: Our study demonstrates audio/visual (AV) integration through the application of looming audio and visual stimuli in non-synesthetes. Participants were presented with combinations of looming, receding, or static audio and visual stimuli during EEG recording. AV stimuli were presented as audio looming (AL), visual looming (VL), or combined audio-looming/visual-looming (ALVL). Audio looms started at a low intensity and rapidly increased by ~10 dB over 1000 ms (AL); visual looms started small and rapidly enlarged within the visual field (VL); in the ALVL condition, both occurred simultaneously. Participants were required to rapidly judge whether each stimulus was looming or receding. Behavioral responses were programmatically time-locked to the EEG recordings so that stimuli could later be associated with cortical responses.
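The audio-looming (AL) parameters above, a ~10 dB intensity rise over 1000 ms, can be sketched as a simple amplitude ramp on a carrier tone. This is only an illustrative reconstruction: the sampling rate, carrier frequency, and sine carrier are assumptions, not details given in the abstract.

```python
import math

SAMPLE_RATE = 44100   # Hz (assumed; not stated in the abstract)
DURATION_MS = 1000    # loom duration, from the abstract
RAMP_DB = 10.0        # ~10 dB intensity increase, from the abstract
CARRIER_HZ = 440.0    # carrier tone frequency (assumed)

def audio_loom(n_samples=SAMPLE_RATE * DURATION_MS // 1000):
    """Generate a looming tone: a sine carrier whose level rises ~10 dB over 1000 ms."""
    samples = []
    for i in range(n_samples):
        t = i / SAMPLE_RATE
        frac = i / (n_samples - 1)              # 0 -> 1 across the loom
        gain = 10 ** ((frac * RAMP_DB) / 20.0)  # linear-in-dB amplitude ramp
        samples.append(gain * math.sin(2 * math.pi * CARRIER_HZ * t))
    return samples

loom = audio_loom()
```

A receding stimulus would use the same ramp reversed (gain falling ~10 dB), and the visual loom is the spatial analogue: a linearly increasing image scale over the same 1000 ms window.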

Results: Distributed current source density analysis using low-resolution electromagnetic tomography (LORETA) was applied to the EEGs of n = 4 subjects. Auditory looming (AL) stimuli containing only auditory cues primarily recruited cortical areas associated with auditory sensory encoding, including Heschl’s gyrus (HG) in the superior temporal plane. Similarly, visual-only looms activated primary visual centers in lateral occipital (LO) cortex. In contrast, looming stimuli combining auditory and visual cues (ALVL) engaged a multimodal area in intraparietal (IPL) cortex. These preliminary findings indicate that our novel stimulus paradigm induces cross-modal “hyperbinding” and integration of AV neural activity patterns like that observed in actual synesthetes.
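The Methods describe behavioral responses being programmatically coupled to the EEG so stimuli can later be associated with cortical responses. A minimal sketch of that coupling step is cutting fixed-length epochs around each stimulus marker, shown below for a single channel. The sampling rate and epoch window are assumptions; the study's actual LORETA source analysis is not reproduced here.

```python
FS = 500  # EEG sampling rate in Hz (assumed)

def extract_epochs(signal, event_samples, pre_ms=200, post_ms=1000):
    """Cut fixed-length epochs around each event marker.

    signal        : list of EEG samples (one channel)
    event_samples : sample indices where a stimulus (AL, VL, or ALVL) began
    """
    pre = FS * pre_ms // 1000    # samples of pre-stimulus baseline
    post = FS * post_ms // 1000  # samples of post-stimulus activity
    epochs = []
    for ev in event_samples:
        # keep only epochs that fit entirely within the recording
        if ev - pre >= 0 and ev + post <= len(signal):
            epochs.append(signal[ev - pre : ev + post])
    return epochs
```

Epochs grouped by condition (AL, VL, ALVL) would then feed the condition-wise source localization comparison reported above.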

Conclusions: These preliminary findings demonstrate the feasibility of the proposed stimulus paradigm and reveal perceptual and neurophysiological correlates of cross-modal “hyperbinding”.

