Spatio-temporal visual statistical learning in context

Dominik Garber, József Fiser*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Visual Statistical Learning (VSL) is classically investigated in a restricted format, either as temporal or spatial VSL, and void of any effect or bias due to context. However, in real-world environments, spatial patterns unfold over time, leading to a fundamental intertwining between spatial and temporal regularities. In addition, their interpretation is heavily influenced by contextual information through internal biases encoded at different scales. Using a novel spatio-temporal VSL setup, we explored this interdependence between time, space, and biases by moving spatially defined patterns in and out of participants' views over time in the presence or absence of occluders. First, we replicated the classical VSL results in such a mixed setup. Next, we obtained evidence that purely temporal statistics can be used for learning spatial patterns through internal inference. Finally, we found that motion-defined and occlusion-related context jointly and strongly modulated which temporal and spatial regularities were automatically learned from the same visual input. Overall, our findings expand the conceptualization of VSL from a mechanistic recorder of low-level spatial and temporal co-occurrence statistics of single visual elements to a complex interpretive process that integrates low-level spatio-temporal information with higher-level internal biases to infer the general underlying structure of the environment.

Original language: English
Article number: 106324
Number of pages: 13
Journal: Cognition
Volume: 266
State: Published - Jan 2026

Keywords

  • Context dependent learning
  • Perceptual biases
  • Spatio-temporal visual information
  • Unconscious inference
