Complex neuroimaging data can be explored by translating them into an audiovisual format (a video accompanied by a musical soundtrack), helping researchers interpret what is happening in the brain when certain behaviors are performed.
David Thibodeaux and colleagues from Columbia University, US, present this technique in the open access journal PLOS ONE. Examples of these magnificent “brain films” are included below.
Recent technological advances have made it possible to record several components of waking brain activity in real time. Scientists can now observe, for example, what happens in a mouse’s brain when it performs a specific behavior or receives a certain stimulus. However, such research produces large amounts of data that can be difficult to explore intuitively, which makes it harder to understand the biological mechanisms behind patterns of brain activity.
Previous research has shown that some brain imaging data can be translated into sound representations. Building on such approaches, Thibodeaux and his colleagues developed a flexible toolkit that enables the translation of different types of brain imaging data – and associated video recordings of laboratory animal behavior – into audiovisual representations.
The researchers then demonstrated the new technique in three different experimental settings, showing how audiovisual representations can be prepared with data from various brain imaging approaches, including 2D wide-field optical mapping (WFOM) and 3D swept confocally aligned planar excitation (SCAPE) microscopy.
The toolkit was applied to previously collected WFOM data that captured both neuronal activity and changes in cerebral blood flow in mice engaging in different behaviors, such as running or grooming.
The neural data were represented by piano sounds played in time with spikes in brain activity, with the volume of each note indicating the magnitude of the activity and its pitch indicating where in the brain the activity occurred.
Meanwhile, blood flow data were represented by violin sounds. The piano and violin, played together in real time, convey the coupled relationship between neuronal activity and blood flow. By watching the accompanying video of the mouse, the viewer can discern which patterns of brain activity correspond to different behaviors.
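To make the idea concrete, here is a minimal sketch of this kind of sonification. It is not the authors’ published code: the frame rate, the event threshold, the 16-note pitch scale, and the synthetic “neural”, “region”, and “flow” signals are all illustrative assumptions. Sparse neural events become short decaying tones whose loudness tracks event magnitude and whose pitch tracks a region index, while the slower blood-flow signal drives the loudness of a sustained second voice, so that coupled events can be heard together.

```python
# Minimal sonification sketch (illustrative only, not the authors' toolkit).
import numpy as np
import wave

SR = 44100  # audio sample rate (Hz)

def tone(freq, dur, amp, decay=6.0):
    """Short plucked tone: a sine with exponential decay (piano-like stand-in)."""
    t = np.linspace(0, dur, int(SR * dur), endpoint=False)
    return amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def sonify(neural, regions, flow, frame_rate=20.0):
    """neural: (T,) event magnitudes in [0,1]; regions: (T,) ints mapped to pitch;
    flow: (T,) slow signal in [0,1]. frame_rate is the assumed imaging frame rate."""
    dur_total = len(neural) / frame_rate
    audio = np.zeros(int(SR * dur_total) + SR)  # extra second so the last tone fits

    # Neural events -> short tones; pitch from region index, loudness from magnitude.
    scale = 220.0 * 2 ** (np.arange(16) / 12.0)      # 16 semitone steps from A3
    for i, (mag, reg) in enumerate(zip(neural, regions)):
        if mag > 0.1:                                # simple event threshold (assumption)
            start = int(SR * i / frame_rate)
            note = tone(scale[int(reg) % len(scale)], 0.4, mag)
            audio[start:start + len(note)] += note

    # Blood flow -> continuous second voice whose loudness follows the slow signal.
    t = np.arange(len(audio)) / SR
    env = np.interp(t, np.arange(len(flow)) / frame_rate, flow, left=0, right=0)
    audio += 0.3 * env * np.sin(2 * np.pi * 330.0 * t)   # fixed pitch for simplicity

    audio /= max(1e-9, np.abs(audio).max())             # normalise to avoid clipping
    return audio

def write_wav(path, audio):
    with wave.open(path, "w") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SR)
        f.writeframes((audio * 32767).astype(np.int16).tobytes())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T = 200                                              # 10 s of synthetic data at 20 fps
    neural = rng.random(T) * (rng.random(T) > 0.85)      # sparse "spikes"
    regions = rng.integers(0, 16, T)                     # fake region index per frame
    flow = np.clip(np.convolve(neural, np.ones(20) / 20, mode="same") * 3, 0, 1)
    write_wav("sonification_demo.wav", sonify(neural, regions, flow))
```

The resulting audio track could then be muxed with a behavior video using any standard video tool; the key design point is that each data stream keeps its own timbre, so the ear can follow both at once.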
The authors note that their toolkit does not replace quantitative analysis of neuroimaging data. Still, it could help scientists examine large data sets to detect patterns that would otherwise have gone unnoticed and merit further analysis.
The authors add: “Listening to and seeing representations of (brain activity) data is an immersive experience that can tap into our ability to recognize and interpret patterns (think of the online security feature that asks you to ‘select the traffic lights in this image’ – a challenge beyond most computers, but trivial for our brains)… (It) is almost impossible to look at and focus on both the time-varying data (brain activity) and the behavioral video at once; our eyes need to go back and forth to catch things happening together.”
“You usually have to replay clips over and over again to understand what happened at any given moment. Having an auditory representation of the data makes it much easier to see (and hear) when exactly things are happening at the same time.”
More information:
Audiovisualization of real-time neuroimaging data, PLOS ONE (2024). DOI: 10.1371/journal.pone.0297435. journals.plos.org/plosone/arti …journal.pone.0297435
Provided by Public Library of Science
Citation: “Movies” with color and music visualize brain activity data in beautiful detail (February 21, 2024), retrieved February 21, 2024 from