Sunday, February 19, 2012: 8:00 AM-9:30 AM
Room 110 (VCC West Building)

Communication, language, performance, and cognition are all shaped in varying ways by our embodiment (our physicality, including brain and body) and our embeddedness (our place in the world: physical, social, and cultural). The real-time production of spoken and signed language involves the dynamic control of speech articulators, limbs, face, and body, and the coordination of movement and gesture, by and between individuals.

Increases in computing power and the recent emergence of ubiquitous, flexible sensing and measurement technologies, from inexpensive digital video and other devices to higher-end tools, are beginning to make it possible to capture these complex activities more easily and in greater detail than ever before. We are on the cusp of a revolution in sign, gesture, and interactive communication studies. New computational and statistical tools and visualization techniques are also helping us to quantify and characterize these behaviors and, in certain instances, use them to control and synthesize speech, gesture, and musical performance.

This symposium brings together experts spanning linguistics, computer science, engineering, and psychology to describe new developments in related areas of inquiry, including coordination and synchrony during spoken and signed language, gestural control of musical performance, physiologically and acoustically realistic articulatory speech synthesis, and cognitive and linguistic development.
Philip Rubin, Haskins Laboratories
Eric Vatikiotis-Bateson, University of British Columbia