BCS Roger Needham Lecture 2016
Title: ‘Language learning in humans and machines: making connections to make progress’
Lecturer: Dr Sharon Goldwater
Date and Time: 21 November 2016 @ 18:30
Venue: The Royal Society, London
Sharon Goldwater is a Reader in the Institute for Language, Cognition and Computation at the University of Edinburgh’s School of Informatics. She received her PhD from Brown University, supervised by Mark Johnson, and spent two years as a postdoctoral researcher at Stanford University before moving to Edinburgh. Her research interests include probabilistic machine learning approaches to natural language processing (especially unsupervised approaches), computer modelling of language acquisition in children, and computational studies of language use. Dr Goldwater holds a Scholar Award from the James S McDonnell Foundation for her work on “Understanding synergies in language acquisition through computational modelling”. Outside of research, Dr Goldwater enjoys DIY, sewing, baking, and Argentine tango.
Computer processing of speech and language has advanced enormously in the last decade, with many people now using applications such as automatic translation, voice-activated search, and even language-enabled personal assistants. Yet these systems still lag far behind human capabilities, and the success they do have relies on machine learning methods that learn from very large quantities of human-annotated data (for example, speech recordings with transcriptions, or text labelled with syntactic parse trees). Because these methods are so resource-intensive, effective technology is available for only a tiny fraction of the world’s 5000 or more languages, mainly those spoken in large, wealthy countries.
This talk will argue that in order to solve this problem, we need a better understanding of how humans learn and represent language in their minds, and we need to consider how human-like learning biases can be built into computational systems. I will illustrate these ideas using examples from my own research. I will discuss why language is such a difficult problem, say a bit about what we know about human language learning, and then show how my own work has taken inspiration from that knowledge to develop better methods for computational language learning.
Further information on the speaker and lecture can be found here: http://academy.bcs.org/content/roger-needham-lecture