The Phenomenon: Stimuli (movie scenes, naturalistic clips, music, etc.) evoke attention and potentially similar brain activity across viewers.
The Data:
- Study Forrest Movie Transcript
- Study Forrest Eye Gaze and Neuroimaging
- Human Connectome Project Neuroimaging + Story Q/A
The Question: Can audience preference be linked to reliable neural representations of stimuli (scenes, naturalistic clips, music, etc.)? Do more preferred or attention-grabbing stimuli produce shared neural signals across audiences? In other words, can we predict sentiment (plus attention) from fMRI functional connectivity metrics?
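One standard way to quantify "shared neural signals across audiences" is inter-subject correlation (ISC). Below is a minimal sketch, assuming ROI-averaged BOLD time series have already been extracted into an array of shape (n_subjects, n_timepoints, n_rois); the synthetic data is only a placeholder.

```python
import numpy as np

def inter_subject_correlation(bold):
    """Leave-one-out ISC: correlate each subject's ROI time series
    with the average time series of all remaining subjects.

    bold: array of shape (n_subjects, n_timepoints, n_rois)
    returns: array of shape (n_subjects, n_rois) of Pearson r values
    """
    n_subj, n_tp, n_roi = bold.shape
    isc = np.zeros((n_subj, n_roi))
    for s in range(n_subj):
        others = np.delete(bold, s, axis=0).mean(axis=0)  # (n_tp, n_roi)
        for r in range(n_roi):
            isc[s, r] = np.corrcoef(bold[s, :, r], others[:, r])[0, 1]
    return isc

# Synthetic stand-in: 10 subjects, 200 TRs, 50 ROIs (pure noise -> ISC near 0).
rng = np.random.default_rng(0)
fake_bold = rng.standard_normal((10, 200, 50))
print(inter_subject_correlation(fake_bold).mean(axis=0)[:5])
```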
Approach & Goal: Predict sentiment from functional connectivity in the brain. Use NLP-derived sentiment metrics as a proxy for audience preference (and eye-gaze fixation/saccade metrics for attention), and test whether shared preferences evoke shared brain patterns.
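As a sketch of the "NLP-derived sentiment metrics" idea, one option (an assumption, not a settled choice) is NLTK's VADER analyzer applied to transcript lines, yielding a per-scene sentiment time course. The scene snippets here are hypothetical; in practice they would come from the StudyForrest movie transcript, aligned to scene onsets.

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

# Hypothetical transcript snippets standing in for transcript segments.
scenes = [
    "My mama always said life was like a box of chocolates.",
    "Run, Forrest, run!",
]

sia = SentimentIntensityAnalyzer()
# The compound score in [-1, 1] serves as a simple per-scene sentiment metric.
sentiment_timecourse = [sia.polarity_scores(text)["compound"] for text in scenes]
print(sentiment_timecourse)
```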
Neuroscience: Can we use the BOLD fMRI functional connectivity matrix to predict sentiment or stimulus features, or the other way around?
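A minimal sketch of the connectivity-to-sentiment direction, assuming per-subject ROI time series and one sentiment score per subject/run (hypothetical shapes; ridge regression is just a placeholder model, and the data here is synthetic).

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

def fc_features(roi_timeseries):
    """Pearson functional connectivity matrix, vectorized to its upper triangle.

    roi_timeseries: array of shape (n_timepoints, n_rois)
    returns: 1-D feature vector of length n_rois * (n_rois - 1) / 2
    """
    fc = np.corrcoef(roi_timeseries.T)   # (n_rois, n_rois)
    iu = np.triu_indices_from(fc, k=1)   # drop diagonal and duplicate entries
    return fc[iu]

# Synthetic stand-in: 20 subjects, 300 TRs, 30 ROIs, one sentiment score each.
rng = np.random.default_rng(42)
X = np.stack([fc_features(rng.standard_normal((300, 30))) for _ in range(20)])
y = rng.uniform(-1, 1, size=20)          # e.g. mean compound sentiment per run

# Cross-validated prediction of sentiment from connectivity features.
scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=5, scoring="r2")
print("R^2 per fold:", scores)
```

Swapping X and y (predicting stimulus/sentiment features from connectivity vs. the reverse) only changes which array is the target, so the same scaffold covers "the other way around".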
nl-procesors
: code base for deep learning components
data
: datasets to play with... currently trying to reproduce existing work.