Crossmodal Interactions

  • Audio-visual semantic interactions in environmental event recognition
  • Art and cross-modality
  • Cross-modal attention:
    In everyday life, the adaptive control of behaviour requires the integration and coordination of information originating from different input modalities. When trying to follow a conversation in a noisy environment with distracting sounds, attending to relevant lip movements may be as important as attending to the speaker’s voice coming from the same location. This fact has important implications for mechanisms of attentional selectivity, which could involve spatial synergies (crossmodal links) in the attentional processing of information across sensory modalities. Until recently, most experimental investigations of spatial attention focused on spatially selective processing within single sensory modalities. Thus, the questions of whether there are crossmodal links in spatial attention between vision, audition, and touch; which mechanisms are involved in such links; and how these links affect the processing of information at attended and unattended locations have only now begun to be addressed systematically.
    - Olivetti Belardinelli, M. & Santangelo, V. (2005). The head-centered meridian effect: auditory attention orienting in conditions of impaired visual spatial information. Disability and Rehabilitation, 27: 761-768.
    - Santangelo, V., Van der Lubbe, R.H.J., Olivetti Belardinelli, M., & Postma, A. (in press). Spatial attention triggered by unimodal, crossmodal and bimodal exogenous cues: a comparison on reflexive orienting mechanisms. Experimental Brain Research.
    - Olivetti Belardinelli, M., Santangelo, V., & Botta, F. (submitted). Structural and functional interference between endogenous orienting mechanisms.
    - Santangelo, V., Van der Lubbe, R.H.J., Olivetti Belardinelli, M., & Postma, A. (in preparation). On the influence of multimodal integration on exogenous orienting: an ERP study.
  • Audio-Visuo-Motor integration:
    In agreement with recent data showing the existence of tri-modal (audio-visuo-motor) neurons in the prefrontal cortex (Kohler et al., 2002), multimodal integration mechanisms linking auditory, visual, and motor processing have become a crucial aspect of current neurocognitive research. The relevant issues addressed by our Lab are: a) the recognition of environmental unimodal/multimodal sounds, b) the time course, or temporal relation, of the perception of sensory stimuli, and c) time management in motor performance. The prefrontal cortex has recently been shown to play a key role in these processes (see Rubia and Smith, 2004, for a review), together with other well-known basic structures (the cerebellum and basal ganglia) responsible for motor timing control. More specifically, these investigations aim to examine more closely the temporal variables underlying the execution of motor acts related to audiovisual stimuli.
    - Rubia, K., & Smith, A. (2004). The neural correlates of cognitive time management: a review. Acta Neurobiologiae Experimentalis, 64(3), 329-340.
    - Kohler, E., Keysers, C., Umiltà, M. A., Fogassi, L., Gallese, V., & Rizzolatti, G. (2002). Hearing sounds, understanding actions: action representation in mirror neurons. Science, 297, 846-848.
