

Movies, music and pictures can train synthetic brain
A new AI-based technology developed by Cornell researchers will help gain new insights into how our brains respond to external stimuli. Image: Denise Jans, Unsplash.



ITHACA, NY.- A new AI-based technology developed by Cornell researchers will help gain new insights into how our brains respond to external stimuli.

Meenakshi Khosla, a doctoral student in the field of electrical and computer engineering, her adviser Mert Sabuncu, associate professor of electrical and computer engineering in the College of Engineering, and colleagues at Weill Cornell Medicine are authors of a new paper published in Science Advances, “Cortical response to naturalistic stimuli is largely predictable with deep neural networks.”

“Major discoveries in the field of sensory neuroscience have been driven by controlled experiments that present animals with carefully designed artificial visual and auditory stimuli,” Khosla said. “In this study, we present an alternate way to expedite neuroscientific discovery.”

This alternate method involves collecting neural responses to rich, complex stimuli that mimic natural conditions, namely movies, and training computational models with different inductive biases to predict the evoked responses. The study uses neural data from the Human Connectome Project database, collected while subjects passively watched movies, including clips from commercial films such as “Home Alone,” “Star Wars” and “Inception.”
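
As a rough illustration of what such a neural encoding model involves, the Python sketch below (using PyTorch) maps features extracted from movie frames to predicted voxel responses and fits them to measured activity. The architecture, feature dimensions and data here are placeholder assumptions for illustration, not the models or data used in the study.

# Illustrative sketch (not the authors' code): a voxel-wise neural encoding
# model that maps features extracted from movie frames to predicted fMRI
# responses. Feature sizes, architecture and data are hypothetical.
import torch
import torch.nn as nn

N_FEATURES = 512    # assumed size of a stimulus feature vector (e.g., from a pretrained vision network)
N_VOXELS = 10000    # assumed number of cortical voxels being predicted

class EncodingModel(nn.Module):
    """Simple linear-readout encoding model: stimulus features -> voxel responses."""
    def __init__(self, n_features: int, n_voxels: int):
        super().__init__()
        self.readout = nn.Linear(n_features, n_voxels)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.readout(features)

# Toy training loop on synthetic data, standing in for (stimulus, fMRI) pairs.
model = EncodingModel(N_FEATURES, N_VOXELS)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

features = torch.randn(64, N_FEATURES)   # placeholder movie-frame features
responses = torch.randn(64, N_VOXELS)    # placeholder measured brain responses

for _ in range(100):
    optimizer.zero_grad()
    predicted = model(features)
    loss = loss_fn(predicted, responses)  # fit predictions to measured activity
    loss.backward()
    optimizer.step()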




“We developed an AI-based technology that can reliably predict the activation patterns across the brain of a person watching a movie, listening to an audio track or looking at a picture,” said Sabuncu, also an assistant professor of electrical engineering research in radiology at Weill Cornell Medicine. “We can view this technology as a synthetic brain that can allow researchers to gain new insights into how our brains respond to external stimuli.”

Such a synthetic brain can lead to novel neuroscientific findings and to brain-computer interface tools for assisting or augmenting cognitive, sensory and motor functions.

Amy Kuceyeski, an adjunct associate professor of computational biology in the College of Agriculture and Life Sciences and an associate professor of mathematics in radiology at Weill Cornell Medicine, is a co-author and collaborator on the work.

“Studying how the brain processes external stimuli is an important area of neuroscience research because it allows a window into how the brain works,” Kuceyeski said. “Knowing how the brain processes images and sound may provide ideas on how to mirror these mechanisms in artificial systems, such as computer vision.”

The researchers’ proposed approach can replicate findings from a large number of prior studies in neuroscience, spanning research on multisensory integration, cortical temporal hierarchy and functional selectivity within the brain.

Taken together, the paper states, the findings underscore the potential of neural encoding models as a powerful tool for studying brain function in ecologically valid conditions.
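
For context, the predictability that gives the paper its title is typically quantified by comparing a model's predicted responses with held-out measured responses, voxel by voxel. The sketch below computes a per-voxel Pearson correlation with NumPy; it is a generic illustration of that common practice, not necessarily the exact metric or data used in the paper.

# Illustrative only: score an encoding model by the per-voxel Pearson
# correlation between predicted and measured responses on held-out data.
# Array shapes here are hypothetical placeholders.
import numpy as np

def voxelwise_correlation(predicted: np.ndarray, measured: np.ndarray) -> np.ndarray:
    """Return the Pearson correlation for each voxel.

    predicted, measured: arrays of shape (n_timepoints, n_voxels).
    """
    pred_c = predicted - predicted.mean(axis=0)
    meas_c = measured - measured.mean(axis=0)
    numerator = (pred_c * meas_c).sum(axis=0)
    denominator = np.sqrt((pred_c ** 2).sum(axis=0) * (meas_c ** 2).sum(axis=0))
    return numerator / denominator

# Toy usage with random placeholder data (100 timepoints, 10,000 voxels).
rng = np.random.default_rng(0)
predicted = rng.standard_normal((100, 10000))
measured = rng.standard_normal((100, 10000))
scores = voxelwise_correlation(predicted, measured)
print(scores.mean())  # average predictive accuracy across voxels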






