Lund University Publications

Neural Event Segmentation in Narrative Film: Constructing and Remembering Events Across Sensory Modalities

Johansson, Roger; Isberg, Karin; Rudling, Maja; Mårtensson, Johan and Holsanova, Jana (2025) International Conference on Cognitive Neuroscience (ICON2025)
Abstract
A central function of human cognition is the ability to perceive, understand, and remember life experiences as meaningful sequences of discrete events. This process, known as event segmentation, allows individuals to spontaneously construct and update event models and has been shown to engage a distributed network of brain regions that transiently increase in activity at event boundaries (e.g., Zacks et al., 2010, Front. Hum. Neurosci.), supporting narrative comprehension across sensory modalities (Zadbood et al., 2017, Cereb. Cortex).
However, it remains unclear how these neural dynamics are shaped by the sensory modality through which a narrative is experienced. In this study, we investigated how the presence or absence of visual input shapes neural event segmentation and its relationship to comprehension and memory.
We used 7-Tesla functional magnetic resonance imaging (fMRI) to measure brain activity while thirty participants experienced the short film The Red Balloon (cf. Zacks et al., 2010), either as a full audiovisual movie or as an audio-described version without visual input tailored for visually impaired audiences. After scanning, participants rated their comprehension and completed a verbal free recall of the film. Transient brain responses were modeled time-locked to key event boundaries identified independently by human raters. This analysis revealed activity in a network of regions consistently implicated in the construction and updating of event models, including the hippocampus, angular gyrus, precuneus, and posterior cingulate cortex. While both presentation formats engaged elements of this network, contrasts revealed modality-sensitive differences: audiovisual viewing more strongly recruited occipital and parietal regions involved in perceptual and spatial processing, whereas audio-described narration more prominently engaged temporal and frontal areas associated with semantic and linguistic processing. In two case studies of blind participants who experienced the audio-described format, prominent activity emerged primarily in the posterior cingulate cortex, possibly reflecting a greater reliance on internally driven event models in the absence of visual input.
By relating neural segmentation responses to participants’ comprehension and verbal free recall, we further elucidate how the dynamic interplay between sensory modality, brain activity, and memory shapes the way narrative events are constructed in the brain.
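For readers who want to prototype a comparable analysis, the sketch below shows one conventional way to model transient, boundary-locked brain responses with nilearn: event boundaries (here, hypothetical onset times standing in for the boundaries identified by independent human raters) are entered as zero-duration events, convolved with a haemodynamic response function, and tested in a subject-level GLM. This is an illustrative sketch under stated assumptions, not the authors' actual pipeline; the file name, repetition time, smoothing, and onsets are placeholders.

import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical event-boundary onsets in seconds (placeholders for the
# boundaries identified by independent human raters).
boundary_onsets = [12.0, 47.5, 93.0, 140.2, 198.7]

# Zero-duration events so each boundary is modeled as a brief impulse
# that nilearn convolves with the chosen HRF.
events = pd.DataFrame({
    "onset": boundary_onsets,
    "duration": [0.0] * len(boundary_onsets),
    "trial_type": ["event_boundary"] * len(boundary_onsets),
})

# Assumed acquisition and model parameters; the study's actual TR and
# preprocessing choices are not specified here.
glm = FirstLevelModel(t_r=1.0, hrf_model="spm", smoothing_fwhm=4.0)
glm = glm.fit("sub-01_task-movie_bold.nii.gz", events=events)

# Voxel-wise z-map of the transient response time-locked to boundaries.
boundary_zmap = glm.compute_contrast("event_boundary", output_type="z_score")

Between-format comparisons (audiovisual versus audio-described) would then typically be formed from such subject-level maps in a second-level analysis.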
Please use this URL to cite or link to this publication:
author
Johansson, Roger; Isberg, Karin; Rudling, Maja; Mårtensson, Johan and Holsanova, Jana
organization
publishing date
2025-09
type
Contribution to conference
publication status
published
subject
conference name
International Conference on Cognitive Neuroscience (ICON2025)
conference location
Porto
conference dates
2025-09-15 - 2025-09-20
project
Syntolkning och tillgänglig information/Audio description and accessible information
language
English
LU publication?
yes
id
299d139a-294c-4dfc-a791-e113eaaa5f77
date added to LUP
2025-09-09 12:34:03
date last changed
2025-09-11 11:18:49
@misc{299d139a-294c-4dfc-a791-e113eaaa5f77,
  abstract     = {{A central function of human cognition is the ability to perceive, understand, and remember life experiences as meaningful sequences of discrete events. This process, known as event segmentation, allows individuals to spontaneously construct and update event models and has been shown to engage a distributed network of brain regions that transiently increase in activity at event boundaries (e.g., Zacks et al., 2010, Front. Hum. Neurosci.), supporting narrative comprehension across sensory modalities (Zadbood et al., 2017, Cereb. Cortex). However, it remains unclear how these neural dynamics are shaped by the sensory modality through which a narrative is experienced. In this study, we investigated how the presence or absence of visual input shapes neural event segmentation and its relationship to comprehension and memory. We used 7-Tesla functional magnetic resonance imaging (fMRI) to measure brain activity while thirty participants experienced the short film The Red Balloon (cf. Zacks et al., 2010), either as a full audiovisual movie or as an audio-described version without visual input tailored for visually impaired audiences. After scanning, participants rated their comprehension and completed a verbal free recall of the film. Transient brain responses were modeled time-locked to key event boundaries identified independently by human raters. This analysis revealed activity in a network of regions consistently implicated in the construction and updating of event models, including the hippocampus, angular gyrus, precuneus, and posterior cingulate cortex. While both presentation formats engaged elements of this network, contrasts revealed modality-sensitive differences: audiovisual viewing more strongly recruited occipital and parietal regions involved in perceptual and spatial processing, whereas audio-described narration more prominently engaged temporal and frontal areas associated with semantic and linguistic processing. In two case studies of blind participants who experienced the audio-described format, prominent activity emerged primarily in the posterior cingulate cortex, possibly reflecting a greater reliance on internally driven event models in the absence of visual input. By relating neural segmentation responses to participants’ comprehension and verbal free recall, we further elucidate how the dynamic interplay between sensory modality, brain activity, and memory shapes the way narrative events are constructed in the brain.}},
  author       = {{Johansson, Roger and Isberg, Karin and Rudling, Maja and Mårtensson, Johan and Holsanova, Jana}},
  language     = {{eng}},
  month        = {{09}},
  title        = {{Neural Event Segmentation in Narrative Film: Constructing and Remembering Events Across Sensory Modalities}},
  url          = {{https://lup.lub.lu.se/search/files/227461542/ICON_poster_2025_Johansson.pdf}},
  year         = {{2025}},
}