Research

Ongoing Research Projects

  • DFG project SH166/3-1: Dynamic prior and contextual calibration (with Stefan Glasauer, 2015-2018)
  • DFG project MU 773/16-1: Modeling predictive weighting systems in visual search (with Hermann J. Müller, 2015-2018)
  • DFG project TO 940/1: Adaptive attentional orienting to multisensory events (with Thomas Töllner, Hermann J. Müller, 2015-2018)
  • Bayerisches Hochschulzentrum für China (ProWA 2015-2016)

Topics of Research

  • Time perception and applications
  • Visual search
  • Visual-haptic integration and human-machine interaction
  • Audiovisual interaction
  • Motion perception

Time Perception

The sense of time, unlike the other senses, is not generated by a specific sensory organ. Rather, all events that stimulate the brain, regardless of sensory modality, contain temporal cues. Owing to the heterogeneous processing of sensory events, subjective time may differ significantly across modalities for a given duration. For example, an auditory event is often perceived as longer than a visual event of the same physical interval. Subjective time is also susceptible to temporal context, voluntary action, attention, arousal, and emotional states. In this research topic, we focus on the mechanisms underlying temporal processing and multisensory integration in time perception, using behavioral investigations and Bayesian modeling.

Bayesian optimization of time perception

Subjective time frequently departs from physical time, giving rise to various temporal context effects, such as the central tendency effect. In recent review papers (Shi, Church, & Meck, 2013; Shi & Burr, 2016), we suggested that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified within a Bayesian framework with information-processing models of timing; this framework is rooted in Helmholtz’s ‘perception as inference’.
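
As a minimal illustration, assume a Gaussian sensory likelihood and a Gaussian prior over durations; the posterior mean is then a reliability-weighted average of the two, which reproduces the central tendency effect (all numbers below are illustrative):

    def bayesian_duration_estimate(measurement, sigma_sensory, prior_mean, prior_sd):
        # Posterior mean under a Gaussian prior and a Gaussian likelihood:
        # a reliability-weighted average of the sensory measurement and the
        # prior mean. The estimate is pulled toward the prior, producing
        # the central tendency effect.
        w = prior_sd**2 / (prior_sd**2 + sigma_sensory**2)  # weight on the measurement
        return w * measurement + (1 - w) * prior_mean

    # Short durations are overestimated, long ones underestimated:
    for d in (0.4, 1.6):  # seconds
        print(d, round(bayesian_duration_estimate(d, 0.3, 1.0, 0.4), 3))
    # -> 0.4 0.616  and  1.6 1.384 (both pulled toward the 1.0 s prior mean)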

Optimal integration in auditory duration reproduction

Duration estimation is known to be far from veridical and to differ between sensory estimates and motor reproductions. The brain thus faces the challenge of integrating different sources of temporal information so as to enable accurate timing of multisensory or sensorimotor events. To investigate the sensorimotor integration of duration, Shi, Ganzenmüller and Müller (2013) compared three duration estimation tasks: perceptual comparison, motor reproduction, and auditory reproduction, the last being a combined perceptual-motor task. They found consistent overestimation in both the motor and the perceptual-motor (auditory) reproduction tasks. Using a Bayesian approach, Shi et al. (2013) showed that the prediction of the maximum-likelihood estimation (MLE) model is in good agreement with the behavioral results.
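
The MLE model weights each duration estimate by its reliability (inverse variance). A minimal sketch, assuming independent Gaussian estimates (the numbers in the example are made up):

    import numpy as np

    def mle_integration(estimates, variances):
        # Reliability-weighted (maximum-likelihood) fusion of independent
        # duration estimates: each estimate is weighted by its inverse
        # variance, and the fused variance is smaller than either input's.
        estimates = np.asarray(estimates, dtype=float)
        variances = np.asarray(variances, dtype=float)
        weights = (1.0 / variances) / np.sum(1.0 / variances)
        fused_mean = float(np.sum(weights * estimates))
        fused_var = float(1.0 / np.sum(1.0 / variances))
        return fused_mean, fused_var

    # Example: a perceptual estimate of 1.2 s (variance 0.04) fused with a
    # motor estimate of 1.0 s (variance 0.02):
    print(mle_integration([1.2, 1.0], [0.04, 0.02]))  # -> (~1.067, ~0.013)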

Influence of visual emotional pictures on tactile duration judgment

Judging the duration of emotional stimuli is known to be influenced by their valence and arousal values. However, whether and how perceiving emotion in one modality affects time perception in another modality is still unclear. In this study (Shi et al., 2012), we compared the influence of different types of emotional pictures (a threat, disgust, or neutral picture presented at the start of a trial) on temporal bisection judgments of the duration of a subsequently presented vibrotactile stimulus. We found a systematic modulation of perceived tactile duration by visual threat, suggesting that crossmodal links in the processing of emotions and emotion regulation are two main factors underlying crossmodal duration modulation.
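
In temporal bisection, the point of subjective equality (PSE), the duration judged ‘long’ on half of the trials, summarizes shifts of perceived duration. A hedged sketch of how such a bisection point could be estimated (the durations and response proportions below are invented for illustration):

    import numpy as np
    from scipy.optimize import curve_fit

    def p_long(duration, pse, slope):
        # Logistic psychometric function: probability of judging the
        # tactile duration as "long".
        return 1.0 / (1.0 + np.exp(-(duration - pse) / slope))

    # Hypothetical durations (ms) and proportions of "long" responses:
    durations = np.array([400, 600, 800, 1000, 1200, 1400])
    responses = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])

    (pse, slope), _ = curve_fit(p_long, durations, responses, p0=(900, 100))
    print(f"bisection point (PSE): {pse:.0f} ms")
    # A lower PSE after threat pictures would indicate that the tactile
    # durations were perceived as longer.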

Duration reproduction with sensory feedback delay

Research shows that voluntary action can attract subsequent events towards the action. However, whether and how sensorimotor delay affects duration reproduction is still unclear. Is the onset timing of the feedback important for duration reproduction? How does the feedback duration influence the motor reproduction? By injecting onset and offset delays into the reproduced feedback signal, Ganzenmüller and colleagues (Ganzenmüller, Shi, & Müller, 2012) found that the reproduced duration was lengthened in both the visual and the auditory onset-manipulation conditions, and the lengthening effect was evident immediately. In contrast, a shortening effect was found with feedback-signal offset delay, though this effect was weaker and manifested only in the auditory offset-delay condition. These findings indicate that participants mix the onsets of the action and of the feedback signal, the more so when the feedback is delayed, and rely heavily on motor-stop signals when terminating the reproduction.
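
One way to picture this interpretation (a toy model for illustration, not the analysis from the paper; the mixing weights are hypothetical):

    def press_duration(target, onset_delay=0.0, offset_delay=0.0,
                       w_on=0.6, w_off=0.1):
        # Toy model: the perceived onset of the reproduction mixes the
        # action onset (t = 0) and the delayed feedback onset; the offset
        # is dominated by the motor-stop signal, so the feedback offset
        # carries only a small weight (w_on and w_off are hypothetical).
        # perceived duration = press - w_on*onset_delay + w_off*offset_delay,
        # so matching the target requires holding the key for:
        return target + w_on * onset_delay - w_off * offset_delay

    print(press_duration(1.0, onset_delay=0.2))   # 1.12 s: lengthening
    print(press_duration(1.0, offset_delay=0.2))  # 0.98 s: weak shortening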

Selected publications:

  • Shi, Z.*, & Burr, D. (2015). Predictive coding of multisensory timing. Current Opinion in Behavioral Sciences. (in press)
  • Shi, Z., Church, R. M., & Meck, W. H. (2013). Bayesian optimization of time perception. Trends in Cognitive Sciences, 17(11), 556-564. doi:10.1016/j.tics.2013.09.009
  • Shi, Z., Ganzenmüller, S., & Müller, H. J. (2013). Reducing bias in auditory duration reproduction by integrating the reproduced signal. PLoS ONE, 8(4), e62065. doi:10.1371/journal.pone.0062065
  • Shi, Z., Jia, L., & Müller, H. J. (2012). Modulation of tactile duration judgments by emotional pictures. Frontiers in Integrative Neuroscience, 6:24. doi:10.3389/fnint.2012.00024
  • Ganzenmüller, S., Shi, Z., & Müller, H. J. (2012). Duration reproduction with sensory feedback delay: differential involvement of perception and action time. Frontiers in Integrative Neuroscience, 6:95, 1-11. doi:10.3389/fnint.2012.00095

Visual and Tactile Search

Spatially uninformative auditory cue and visual search

Spatially informative auditory cues have long been used to guide attention in visual search. A recent study by van der Burg et al. (2008) showed that spatially uninformative sounds can also enhance visual search when the sounds are synchronized with color changes of the visual target, a phenomenon referred to as the “pip-and-pop” effect. In this project, we are interested in how this “pip-and-pop” effect relates to changes in oculomotor scanning behavior induced by the sounds. In a recent study (Zou, Müller, & Shi, 2012), we showed that the sound events increase fixation durations upon their occurrence and decrease the mean number of saccades, suggesting that non-spatial sounds cause a general freezing effect on oculomotor scanning behavior, which in turn benefits visual search performance through temporally and spatially extended information sampling.

Contextual cueing effect and its application

Invariant spatial context can speed responses in visual search tasks, an effect termed the contextual cueing effect. On most mobile devices, application icons are arranged in a relatively fixed configuration. When the holding orientation of the device changes, however, all icons are shuffled and remapped to new locations in a linear position order. This may disrupt the learned spatial configuration, which in turn impedes search performance. In this project, we compare search performance, in terms of the contextual cueing effect, under different spatial remapping methods.
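
To make the remapping concrete, here is a minimal sketch of reading-order (linear) remapping under a grid-shape change, as on device rotation; the grid sizes and icon index are illustrative:

    def linear_remap(index, n_cols):
        # Map an icon's linear (reading-order) index to its (row, column)
        # position in a grid with n_cols columns.
        return divmod(index, n_cols)

    # Rotating from a portrait 5x4 grid to a landscape 4x5 grid keeps each
    # icon's reading-order index but changes its 2-D position, and hence
    # the spatial configuration that contextual cueing relies on:
    for n_cols in (4, 5):
        print(n_cols, "columns ->", linear_remap(6, n_cols))
    # -> icon #6 sits at (1, 2) with 4 columns but at (1, 1) with 5 columns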

Contextual cueing and tactile search

In this project, we investigate how contextual learning influences tactile search and its underlying spatial reference frames. Previous studies on contextual learning and search have been conducted exclusively in the visual domain. Here, we set up a vibrotactile search interface, analogous to a visual search display, that presents vibrotactile stimuli to participants’ fingers. Analogous to the visual contextual cueing paradigm, we compared search performance between repeated and non-repeated tactile search arrays and found that reaction times were faster for the repeated arrays, even though those arrays could not be explicitly recognized (Assumpção et al., 2015). In several ongoing studies, we are now examining how tactile spatial information is encoded in the brain.

Selected publications:

  • Assumpção, L., Shi, Z., Zang, X., Müller, H. J., & Geyer, T. (2015). Contextual cueing: implicit memory of tactile context facilitates tactile search. Attention, Perception, & Psychophysics. doi:10.3758/s13414-015-0848-y
  • Shi, Z., Zang, X., Jia, L., Geyer, T., & Müller, H. J. (2013). Transfer of contextual cueing in full-icon display remapping. Journal of Vision, 13(3):2, 1-10. doi:10.1167/13.3.2
  • Zang, X., Jia, L., Müller, H. J., & Shi, Z.* (2014). Invariant spatial context is learned but not retrieved in gaze-contingent tunnel-view search. Journal of Experimental Psychology: Learning, Memory, and Cognition. doi:10.1037/xlm0000060
  • Zou, H., Müller, H. J., & Shi, Z. (2012). Non-spatial sounds regulate eye movements and enhance visual search. Journal of Vision, 12(5):2, 1-18. doi:10.1167/12.5.2