Siyi Chen joined the lab in late 2018 to work on the crossmodal contextual learning project. In 2022, she was awarded funding from the German Research Foundation (DFG) for an independent research project investigating how uncertainty in target-context relations affects contextual guidance and suppression. She is currently a Junior Researcher in Residence at the LMU Center for Advanced Studies (2023/24).
Research interests
Neuropsychology, Cognitive Psychology, Computational Neuroscience, Visual Attention and Working & Long-term Memory, Multisensory Attention and Integration
Publications
2025
Context-based guidance versus context suppression in contextual learning: Role of un-/certainty in the target-context relations in visual search
Journal of Experimental Psychology: Human Perception and Performance
3 citations
2024
The impact of task measurements on sequential dependence: a comparison between temporal reproduction and discrimination tasks
Psychological Research
5 citations
Opposing Sequential Biases in Direction and Time Reproduction: Influences of Task Relevance and Working Memory
bioRxiv
3 citations
Contextual facilitation: Separable roles of contextual guidance and context suppression in visual search
Psychonomic Bulletin & Review
4 citations
ERPs and alpha oscillations track the encoding and maintenance of object-based representations in visual working memory
Psychophysiology
3 citations
2023
A study on the application of contrastive learning in the brain-computer interface of motor imagery
6th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE 2023)
2 citations
Statistical context learning in tactile search: Crossmodally redundant, visuo-tactile contexts fail to enhance contextual cueing
Frontiers in Cognition
2022
Multisensory Rather than Unisensory Representations Contribute to Statistical Context Learning in Tactile Search
Journal of Cognitive Neuroscience
2 citations
Cross-modal contextual memory guides selective attention in visual-search tasks
Psychophysiology
6 citations
Information Exchange between Cortical Areas: The Visual System as a Model
The Neuroscientist
1 citation
2021
Feedback from lateral occipital cortex to V1/V2 triggers object completion: Evidence from functional magnetic resonance imaging and dynamic causal modeling
Human Brain Mapping
12 citations
Multisensory visuo-tactile context learning enhances the guidance of unisensory visual search
Scientific Reports
14 citations
When visual distractors predict tactile search: The temporal profile of cross-modal spatial learning
Journal of Experimental Psychology. Learning, Memory and Cognition
5 citations
2020
Object-based grouping benefits without integrated feature representations in visual working memory
Attention, Perception, & Psychophysics
18 citations
2019
Crossmodal learning of target-context associations: When would tactile context predict visual search?
Attention, Perception, & Psychophysics
7 citations
Tracking the completion of parts into whole objects: Retinotopic activation in response to illusory figures in the lateral occipital complex
NeuroImage
9 citations
2018
Kanizsa-figure object completion determines attentional selection in time: Evidence from the attentional blink
Journal of Vision
Amodal Completion of a Target Template Enhances Attentional Guidance in Visual Search
i-Perception
6 citations
Surface Filling-In and Contour Interpolation Contribute Independently to Kanizsa Figure Formation
Journal of Experimental Psychology: Human Perception and Performance
13 citations
Kanizsa-figure object completion gates selection in the attentional blink
Quarterly Journal of Experimental Psychology
5 citations
Object maintenance beyond their visible parts in working memory
Journal of Neurophysiology
14 citations
2016
Amodal completion in visual working memory
Journal of Experimental Psychology: Human Perception and Performance
22 citations