
A mouse’s brain activity can give an indication of what it’s seeing
EPFL/Hillary Sanctuary/Alain Herzog/Allen Institute/Roddy Grieves
A black-and-white movie was decoded almost perfectly from mouse brain signals using an artificial intelligence tool.
Mackenzie Mathis at the École Polytechnique Fédérale de Lausanne (EPFL) and colleagues examined brain activity data from about 50 mice as they watched a 30-second video clip nine times. The researchers then trained an AI to link that data to the 600-frame clip, in which a man runs towards a car and opens its trunk.
The data was previously collected by other researchers who inserted metal probes, which record electrical impulses from neurons, into the mice's primary visual cortex, the brain region that processes visual information. Some of the brain activity data was also collected by imaging the mice's brains with a microscope.
Next, Mathis and her team tested the ability of their trained AI to predict the order of frames in the clip using brain activity data collected from the mice as they watched the movie for the tenth time.
This revealed that the AI could identify which frame a mouse was viewing, to within one second of the correct one, 95% of the time.
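The article doesn't describe the model itself, but the basic setup, train on nine viewings and test on the held-out tenth, can be illustrated with a toy sketch. Everything below except the 600 frames, the nine training repeats and the one-second tolerance is an assumption: a simple nearest-centroid classifier stands in for the team's AI, and the neural activity is simulated rather than recorded.

```python
# Minimal sketch of frame decoding from neural activity (illustrative only).
# A nearest-centroid classifier replaces the study's actual model, and the
# "recordings" are synthetic stand-ins for real spike and imaging data.
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(0)

N_FRAMES = 600          # frames in the 30-second clip (20 fps, so 20 frames = 1 s)
N_NEURONS = 200         # hypothetical number of recorded neurons
N_TRAIN_REPEATS = 9     # viewings used for training; the 10th is held out

# Give each frame a fixed neural "signature", then simulate noisy viewings.
signatures = rng.normal(size=(N_FRAMES, N_NEURONS))

def simulate_viewing():
    """One viewing of the clip: one noisy activity vector per frame."""
    return signatures + rng.normal(scale=0.5, size=signatures.shape)

# Training set: activity vectors from nine viewings, labelled by frame index.
X_train = np.vstack([simulate_viewing() for _ in range(N_TRAIN_REPEATS)])
y_train = np.tile(np.arange(N_FRAMES), N_TRAIN_REPEATS)

clf = NearestCentroid().fit(X_train, y_train)

# Test on the held-out tenth viewing. A prediction counts as correct if it
# falls within one second (20 frames) of the true frame.
X_test = simulate_viewing()
pred = clf.predict(X_test)
accuracy = np.mean(np.abs(pred - np.arange(N_FRAMES)) <= 20)
print(f"frames decoded to within one second: {accuracy:.0%}")
```

The one-second window corresponds to 20 frames here because the 600-frame clip spans 30 seconds; the real study scores decoding against recorded, not simulated, activity.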
Other AI tools designed to reconstruct images from brain signals work best when trained on the brain data of the individual mouse they are making predictions for.
To test whether this applied to their AI, the researchers also trained it on brain data from individual mice. Trained this way, it predicted the viewed movie frames with only 50 to 75% accuracy.
“Training the AI on data from multiple animals makes predictions more robust, so you don’t need to train the AI on data from specific individuals to make it work for them,” says Mathis.
By uncovering links between patterns of brain activity and visual inputs, the tool could point to ways of generating visual sensations in people with low vision, Mathis says.
“You can imagine a scenario where you might actually want to help a visually impaired person see the world in an interesting way by playing into neural activity that gives them that sense of seeing,” she says.
This breakthrough could be a useful tool for understanding the neural codes that underlie our behaviour, and the approach should be applicable to human data, says Shinji Nishimoto at Osaka University in Japan.