Modeling Language Representation in the Brain: from Single Words to Passages

PNC First Year Milestone Presentation
Center for the Neural Basis of Cognition (CNBC)

Mariya Toneva
October 8, 2015 - 10:30am

Abstract: Models of language processing in the brain often take a bag-of-words approach that considers each word independently of the others. In contrast, the current work views language processing in the brain as a time series and accounts for the word-to-word dynamics during reading. The aim of this work is to investigate the importance of incorporating context into models of language representation in the brain, and to propose a model that explicitly leverages context. We propose a linear Gaussian-noise model that can exactly infer the most likely sequence of semantic and context vectors underlying previously unseen brain activity. We present results on one synthetic and two magnetoencephalography data sets. The results hint at the importance of incorporating context representations into models of how the brain processes passages and in-context words. Future work extending the proposed model to include latent states of unspecified form may further improve the passage and in-context-word classification accuracies reported here.
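
The abstract does not spell out the model's exact form, but "a linear Gaussian-noise model that can exactly infer the most likely sequence" of latent vectors is the setting of a linear-Gaussian chain, where exact MAP inference coincides with the posterior mean computed by a Kalman filter and RTS smoother. Below is a minimal sketch under that assumption; the dimensions and the matrices A, B, Q, R are hypothetical stand-ins, not the presentation's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: d-dim latent semantic/context state, k-dim MEG signal.
d, k, T = 4, 10, 20

# Assumed linear-Gaussian chain (illustrative, not the talk's specification):
#   x_t = B x_{t-1} + w_t,  w_t ~ N(0, Q)   (word-to-word context dynamics)
#   y_t = A x_t     + v_t,  v_t ~ N(0, R)   (brain activity from current state)
B = 0.9 * np.eye(d)
Q = 0.1 * np.eye(d)
A = rng.standard_normal((k, d))
R = 0.5 * np.eye(k)

# Simulate latent states and brain responses for one word sequence.
x = np.zeros((T, d))
y = np.zeros((T, k))
for t in range(T):
    prev = x[t - 1] if t > 0 else np.zeros(d)
    x[t] = B @ prev + rng.multivariate_normal(np.zeros(d), Q)
    y[t] = A @ x[t] + rng.multivariate_normal(np.zeros(k), R)

# Kalman filter (forward pass): p(x_t | y_1..t).
mu_f, P_f = np.zeros((T, d)), np.zeros((T, d, d))
mu, P = np.zeros(d), Q.copy()
for t in range(T):
    if t > 0:
        mu, P = B @ mu, B @ P @ B.T + Q        # predict from previous word
    S = A @ P @ A.T + R                        # innovation covariance
    K = P @ A.T @ np.linalg.inv(S)             # Kalman gain
    mu = mu + K @ (y[t] - A @ mu)              # update with observed activity
    P = (np.eye(d) - K @ A) @ P
    mu_f[t], P_f[t] = mu, P

# RTS smoother (backward pass): p(x_t | y_1..T). In a linear-Gaussian chain
# the smoothed posterior mean is also the exact MAP state sequence.
mu_s = mu_f.copy()
for t in range(T - 2, -1, -1):
    P_pred = B @ P_f[t] @ B.T + Q
    G = P_f[t] @ B.T @ np.linalg.inv(P_pred)
    mu_s[t] = mu_f[t] + G @ (mu_s[t + 1] - B @ mu_f[t])

print("smoothed-vs-true mean abs error:", np.abs(mu_s - x).mean())
```

In this toy setup, the smoothed estimates recover the simulated latent sequence more accurately than the filtered ones, since each word's state estimate borrows information from both earlier and later brain activity, which mirrors the abstract's argument for modeling reading as a time series rather than a bag of words.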