Oregon State University

Event Details

PhD Final Oral Examination – James Cross


Thursday, December 8, 2016 3:00 PM - 5:00 PM

Parsing with Recurrent Neural Networks
Machine learning models for natural language processing have traditionally relied on large numbers of discrete features, built up from atomic categories such as word forms and part-of-speech labels, which are considered completely distinct from each other. Recently, however, the advent of dense feature representations combined with deep learning techniques has led to powerful new models which can automatically learn to exploit various dimensions of implicit similarity between such discrete linguistic entities. This work extends that line of research as it applies to syntactic parsing, particularly by introducing recurrent network models which can encode the entirety of a sentence in context and by proposing novel parsing systems to take advantage of such models.
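The idea of encoding a whole sentence in context can be sketched with a pair of recurrent networks run in opposite directions, so that each position's representation depends on both its left and right context. The sketch below is a minimal illustration with a plain tanh RNN and random weights, not the dissertation's architecture; the `span_feature` difference-of-states construction is one hypothetical way to represent a span from such encodings.

```python
import numpy as np

def rnn_encode(embeddings, W, U, b):
    """Simple tanh RNN over a sequence; returns the hidden state at each step."""
    h = np.zeros(U.shape[0])
    states = []
    for x in embeddings:
        h = np.tanh(W @ x + U @ h + b)
        states.append(h)
    return np.array(states)

# Hypothetical tiny setup: 5 words, 4-dim embeddings, 3-dim hidden states.
rng = np.random.default_rng(0)
d_in, d_h, n = 4, 3, 5
emb = rng.normal(size=(n, d_in))
Wf, Uf, bf = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)
Wb, Ub, bb = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)

fwd = rnn_encode(emb, Wf, Uf, bf)               # left-to-right states
bwd = rnn_encode(emb[::-1], Wb, Ub, bb)[::-1]   # right-to-left states

# Each word is now represented with full-sentence context on both sides.
word_repr = np.concatenate([fwd, bwd], axis=1)  # shape (n, 2 * d_h)

def span_feature(i, j):
    """Represent span (i, j) as differences of forward and backward states
    (an illustrative choice of span feature, not the thesis's exact one)."""
    return np.concatenate([fwd[j - 1] - (fwd[i - 1] if i > 0 else 0),
                           bwd[i] - (bwd[j] if j < n else 0)])
```

Because each state already summarizes one side of the sentence, a span's feature is a fixed-size vector regardless of span length, which is what lets a parser consult full-sentence context at every decision.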

Syntactic parsing is an inherently difficult problem in natural language processing because of the ambiguous and highly compositional nature of language itself. Perfect agreement is not possible even among expert human annotators. Statistical and machine-learning prediction of the syntactic structure of sentences has been the subject of decades of study. Recent advances in applying deep neural models to language problems, however, have led to rapid strides in this domain, with models that automatically exploit a whole new realm of hidden regularities in language, without relying on painstaking and imperfect human feature engineering. We continue this trend with feature-learning recurrent networks that model entire sentences, allowing the parser to draw on full-sentence context for every decision. We also introduce new parsing paradigms designed explicitly to leverage this representational power, including a state-of-the-art transition-based constituency parser, the first constituency parser to achieve competitive results with greedy decoding.
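For readers unfamiliar with transition-based parsing, the greedy decoding loop can be sketched as a shift-reduce system: the parser maintains a stack of partial constituents and a buffer of unread words, and at each step takes the single highest-scoring legal action. The scorer below is a stand-in for a learned model (in the actual work, a recurrent network); the transition inventory here is deliberately minimal and hypothetical.

```python
# Minimal greedy shift-reduce constituency parsing sketch.
# "shift" moves the next word onto the stack; "reduce" combines the top
# two stack items into a new constituent under a (dummy) label.

def greedy_parse(words, score_actions):
    stack, buffer = [], list(words)
    while buffer or len(stack) > 1:
        legal = []
        if buffer:
            legal.append(("shift", None))
        if len(stack) >= 2:
            legal.append(("reduce", "X"))  # single dummy nonterminal label
        # Greedy decoding: commit to the best-scoring legal action.
        action, label = max(legal, key=score_actions(stack, buffer))
        if action == "shift":
            stack.append(buffer.pop(0))
        else:
            right, left = stack.pop(), stack.pop()
            stack.append((label, left, right))
    return stack[0]

# Toy scorer that always prefers shifting, yielding a right-branching tree.
prefer_shift = lambda stack, buffer: (lambda a: 1.0 if a[0] == "shift" else 0.0)

tree = greedy_parse(["a", "b", "c"], prefer_shift)
# tree == ("X", "a", ("X", "b", "c"))
```

Greedy decoding makes one pass with no backtracking, which is what makes the competitive accuracy reported here notable: each decision is final, so the quality of the sentence-level features carries the full burden.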

We also introduce a straightforward dynamic oracle for the aforementioned constituency parsing system, and show that it is optimal in both label recall and precision. This is the first provably optimal dynamic oracle for a transition-based constituency parser. In addition to its optimality, our dynamic oracle is computable in amortized constant time per step, a dramatic improvement over its forerunners for arc-standard dependency parsing, which required worst-case cubic time per step.
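A dynamic oracle answers, from any parser state (including one reached through mistakes), which actions still lead to the best tree reachable from there. The core subproblem is deciding which gold brackets remain derivable. The sketch below illustrates one simplified reachability test for a bottom-up shift-reduce setting; the specific conditions are illustrative assumptions, not the dissertation's formal characterization or its amortized-constant-time algorithm.

```python
def reachable_brackets(stack_boundaries, front, gold_brackets):
    """Gold brackets still derivable from a (possibly erroneous) state.

    stack_boundaries: split points currently preserved on the stack.
    front: index of the next unread word (the frontier).
    Simplified rule: a bracket (i, j) can still be built iff its right
    endpoint lies at or beyond the frontier, and its left endpoint is
    either a preserved stack boundary or also still in the future.
    """
    return {
        (i, j)
        for (i, j) in gold_brackets
        if j >= front and (i in stack_boundaries or i >= front)
    }

# Example: 6-word sentence; the stack holds spans (0,2) and (2,3), so the
# preserved split points are {0, 2, 3} and the frontier is position 3.
gold = {(0, 6), (1, 3), (2, 5), (3, 5)}
still = reachable_brackets({0, 2, 3}, 3, gold)
# (1, 3) is lost: position 1 was merged inside span (0, 2) and can never
# again serve as a constituent boundary.
```

An oracle built on such a test would mark an action as correct exactly when taking it does not shrink the reachable set, which is how training can continue sensibly after exploration or errors.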

Extending the optimality proof for that dynamic oracle, we show the surprising result that the entire space of possible parser states for a sentence of length $n$ can be reduced to $O(n^2)$ using a further simplified feature space. This simplification could substantially benefit future search-based or globally optimized training methods.

Finally, we extend our parsing model still further by applying it to morphologically rich languages, using continuous embeddings over previously predicted morphological features. We achieve highly competitive results across a range of languages with no language-specific architectural or hyperparameter tuning, including the best reported parsing results on the French Treebank.

Major Advisor: Liang Huang
Committee: Xiaoli Fern
Committee: Prasad Tadepalli
Committee: Alan Fern
GCR: Brett Tyler


Location: Kelley Engineering Center, Room 1007
Contact: Nicole Thompson, School of Electrical Engineering and Computer Science
Phone: 1 541 737 3617
Email: Nicole.Thompson at oregonstate.edu