News from the MG frontier

🕑 3 min • 👀 Aniello De Santo • 📆 June 24, 2019 in Discussions • 🏷 MGs, parsing, NLP

True to my academic lineage, I'm a big fan of Minimalist grammars (MGs): they are a pretty malleable formalism, their core mechanisms are very easy to grasp on an intuitive level, and they are close enough to current minimalist syntax to allow for interesting computational insights into mainstream syntax. However, I often find that MGs' charms don't work that well on my more NLP-oriented colleagues, especially when compared to some very close cousins like TAGs or CCGs. There are very practical reasons for this, of course, but two in particular come to mind right away: the lack of any large MG corpus (and/or automatic ways to generate such corpora) and, relatedly, the lack of efficient, state-of-the-art, probabilistic parsers.

This is why I'm very excited about this upcoming paper by John Torr and co-authors (henceforth TSSC), on a (the first ever?) wide-coverage MG parser. The parser is implemented by smartly adapting the \(A^*\) search strategy developed by Lewis and Steedman (2014) for CCGs to MGs (basically, a CKY chart + a priority queue), and coupling it with a complex neural network supertagger trained on an MG treebank.
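The details are in the paper, but the basic shape of the search is easy to convey. Below is a minimal Python sketch of \(A^*\) chart parsing, assuming a hypothetical two-word lexicon, made-up supertag costs, and a single toy combination rule standing in for MG merge: an agenda (priority queue) orders items by inside cost plus an admissible outside estimate computed from the cheapest supertag of every word outside the span, in the spirit of Lewis and Steedman (2014). This is an illustration of the strategy, not TSSC's actual implementation.

```python
import heapq
from collections import defaultdict

# Hypothetical supertag costs (negative log-probabilities) per word.
SUPERTAGS = {
    "Mary": {"D": 0.1},
    "left": {"V": 0.2},
}

# Toy binary rule table standing in for MG merge: (left cat, right cat) -> result.
COMBINE = {("D", "V"): "S"}

def a_star_parse(words, goal="S"):
    n = len(words)
    # Admissible outside estimate: the cheapest supertag cost of every
    # word outside the span [i, j), precomputed via prefix sums.
    cheapest = [min(SUPERTAGS[w].values()) for w in words]
    prefix = [0.0]
    for c in cheapest:
        prefix.append(prefix[-1] + c)

    def outside(i, j):
        return prefix[i] + (prefix[n] - prefix[j])

    agenda = []                 # priority queue of (priority, inside, i, j, cat)
    chart = defaultdict(dict)   # chart[(i, j)][cat] = best inside cost
    for i, w in enumerate(words):
        for cat, cost in SUPERTAGS[w].items():
            heapq.heappush(agenda, (cost + outside(i, i + 1), cost, i, i + 1, cat))

    while agenda:
        _, inside, i, j, cat = heapq.heappop(agenda)
        if cat in chart[(i, j)]:
            continue            # a cheaper derivation of this item was popped earlier
        chart[(i, j)][cat] = inside
        if (i, j, cat) == (0, n, goal):
            return inside       # first goal popped is optimal: heuristic is admissible
        # Combine the new item with adjacent items already in the chart.
        for (k, l), cats in list(chart.items()):
            for other, other_cost in cats.items():
                if l == i and (other, cat) in COMBINE:  # [k, l) + [i, j)
                    cost = other_cost + inside
                    heapq.heappush(
                        agenda, (cost + outside(k, j), cost, k, j, COMBINE[(other, cat)])
                    )
                if k == j and (cat, other) in COMBINE:  # [i, j) + [k, l)
                    cost = inside + other_cost
                    heapq.heappush(
                        agenda, (cost + outside(i, l), cost, i, l, COMBINE[(cat, other)])
                    )
    return None

print(a_star_parse(["Mary", "left"]))  # -> ~0.3, the cost of the only parse
```

Because the outside estimate never overestimates the true completion cost, the first goal item popped off the agenda is guaranteed to be the cheapest parse, which is what makes \(A^*\) so attractive compared to exhaustively filling the chart.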

