Saturday, February 20, 2010: 3:30 PM-5:00 PM
Room 2 (San Diego Convention Center)

The past decade has seen an explosion of research on music and the brain. It is clear that music engages much of the brain and coordinates a wide range of cognitive processes. This naturally raises the question of how music cognition relates to other complex cognitive abilities. Language is an obvious candidate, since (like music) it relies on interpreting complex acoustic sequences that unfold in time. Whether music and language cognition share basic brain mechanisms has only recently begun to be studied empirically. An exciting picture is emerging: there are more connections between the domains than might be expected on the basis of dominant theories of musical and linguistic cognition. Furthermore, these connections have real-world implications for the study and treatment of disorders of speech and language. This symposium explores music-language relations from three different perspectives that combine behavioral and brain imaging methods: how speech is encoded by brainstem auditory structures; how “melodic intonation therapy” helps patients with non-fluent aphasia recover some of their spoken language fluency; and how syntactic processing operates in both domains.
Aniruddh D. Patel, Neurosciences Institute