## Circumambient patterns in syntax

🕑 12 min • 👤 Thomas Graf • 📆 November 11, 2019 in Discussions • 🏷 subregular, syntax, phonology, tone plateauing, movement, complementizer agreement

Last week I gave an invited talk at UMass on the subregular program and the computational parallels it reveals between syntax and phonology. If you’re curious, the slides are on my website. The talk went over a lot better than I expected, and there were lots of great questions. UMass has a tradition of letting students ask questions before the faculty get to chime in, and the students were relentless, in a good way. I think there were only five minutes left for faculty questions at the end. It was a great experience, and probably the best question period I’ve ever been on the receiving end of.

Anyways, after the colloquium Brian Dillon asked a few questions about more complex movement cases, and those are very interesting because they’re yet another instance of computational parallelism between phonology and syntax: tone plateauing = movement-driven complementizer agreement.

## Shellshock

🕑 8 min • 👤 Thomas Graf • 📆 November 06, 2019 in Discussions • 🏷 syntax, Larsonian shells, constituency, CCG, Minimalist grammars

This semester I am teaching a seminar on computational syntax. It’s mostly on subregular syntax, but I started out with a discussion of CCG. CCG is noteworthy because it is a theory-rich approach that has managed to make major inroads into NLP. It would be cool if we could replicate this with MGs, but in order to do that you need a killer app. Subregular complexity might just be that: CCG doesn’t have a regular backbone, so it can’t have a subregular one either (more on that in a future post). CCG’s killer app was flexible constituency and a one-to-one mapping from syntax to semantics. Combine that with a corpus (CCGbank) and an efficient parsing algorithm (e.g. supertagging with A* parsing), and you have something that is both linguistically sophisticated and sufficiently fast and robust for practical applications. Anyways, this post collects some of my thoughts on flexible constituency and how it could be emulated in MGs. Spoiler: shells, lots and lots of shells.

## Locality, suppletion, and cognitive parallelism

🕑 11 min • 👤 Thomas Graf • 📆 October 23, 2019 in Tutorials • 🏷 suppletion, locality, prosody, subregular, syntax

Oh boy, this month is killing me. I know I promised you one more detailed discussion of subregular complexity before we finally get back to the topic that started it all, Omer Preminger’s post on listedness. But in the interest of time I’ll merge those two posts into one. That means some points will be a bit more hazy than I’d like, but I think it will work just fine nevertheless (and for those of you sick of this series of posts, we’ll get to something new sooner). Alright, with that out of the way, here’s the basic question: why aren’t PF and LF more alike?

## Syntax as phonology: Syntactic constraints as string constraints

🕑 12 min • 👤 Thomas Graf • 📆 September 29, 2019 in Tutorials • 🏷 subregular, syntax, locality, c-command, constraints, islands

The previous post in this series discussed the lay of the (is)land from the perspective of TSL (I’m so disappointed in myself for not making this pun last time; better late than never). I mentioned that the TSL view cannot handle all island constraints. Sometimes, we need an alternative approach. But this alternative approach doesn’t come out of nowhere. It is also what we need for all kinds of licensing constraints, and it also handles restrictions on movement that are not island constraints.

## Islands are unexpectedly expected

🕑 6 min • 👤 Thomas Graf • 📆 September 23, 2019 in Tutorials • 🏷 subregular, syntax, locality, Move, islands

In the previous post we saw that Merge is SL-2 over dependency trees and that Move is TSL-2. For every movement feature f we project a separate tier that contains only the lexical items that have a licensor feature f+ or a licensee feature f-. A tier is well-formed iff every lexical item with a licensee feature has a mother with a licensor feature, and every lexical item with a licensor feature has exactly one lexical item among its daughters that carries a licensee feature. It’s a pretty simple system. Despite that simplicity, it predicts a fundamental aspect of movement: island effects!

## The subregular complexity of Merge and Move

🕑 9 min • 👤 Thomas Graf • 📆 September 18, 2019 in Tutorials • 🏷 subregular, syntax, locality, strictly local, tier-based strictly local, Minimalist grammars, Merge, Move

Alright, syntax. Things are gonna get a bit more… convoluted? Nah, interesting! In principle we’ll see a lot of the same things as in phonology, and that’s kind of the point: phonology and syntax are actually very similar. But syntax isn’t quite as exhibitionist as phonology: it doesn’t show off its subregular complexity in the open for the world to see. So the first thing we’ll need is a suitable representation. Once we have that, it’s pretty much phonology all over again, but now with trees.

## Some musings on corpora

🕑 9 min • 👤 Thomas Graf • 📆 September 05, 2019 in Discussions • 🏷 syntax, corpus linguistics, Minimalist grammars, Combinatory categorial grammar

Pro tip: Don’t start a multi-part series of posts on locality right before the beginning of the semester, when you also have a pile of papers to review. On the upside, this will give you guys some extra time to digest all the concepts in the three previous posts. In the meantime, here’s a quick and dirty post on corpus linguistics and why it should be part of our syntax curriculum. I didn’t even proofread it, so beware.

## KISSing syntax

🕑 7 min • 👤 Thomas Graf • 📆 July 12, 2019 in Discussions • 🏷 methodology, syntax

Here’s a question I first heard from Hans-Martin Gärtner many years ago. I don’t remember the exact date, but I believe it was in 2009 or 2010. We both happened to be in Berlin, chowing down on some uniquely awful sandwiches. Culinary cruelties notwithstanding, the conversation was very enjoyable, and we quickly got to talking about linguistics as a science, at which point Hans-Martin offered the following observation (not verbatim):

> It’s strange how linguistic theories completely lack modularity. In other sciences, each phenomenon gets its own theory, and the challenge lies in unifying them.

Back then I didn’t share his sentiment. After all, phonology, morphology, and syntax each have their own theory, and eventually we might try to unify them (an issue that’s very dear to me). But the remark stuck with me, and the more I’ve thought about it in the last few years the more I have to side with Hans-Martin.

## More observations on privative features

🕑 7 min • 👤 Thomas Graf • 📆 June 17, 2019 in Discussions • 🏷 features, privativity, phonology, syntax, transductions

In an earlier post I looked at privativity in the domain of feature sets: given a collection of features, what conditions must be met by their extensions in order for these features to qualify as privative? But that post concluded with the observation that looking at the features in isolation might be a case of the dog barking up the wrong tree. Features are rarely of interest on their own; what matters is how they interact with the rest of the grammatical machinery. This is the step from a feature set to a feature system. Naively, one might expect that a privative feature set gives rise to a privative feature system. But that’s not at all the case. The reason for that is easy to explain yet difficult to fix.

## Some observations on privative features

🕑 9 min • 👤 Thomas Graf • 📆 June 11, 2019 in Discussions • 🏷 features, privativity, phonology, syntax

One topic that came up at the feature workshop is whether features are privative or binary (aka equipollent). Among mathematical linguists it’s part of the general folklore that there is no meaningful distinction between the two. Translating from a privative feature specification to a binary one is trivial. If we have three features $$f$$, $$g$$, and $$h$$, then the privative bundle $$\{f, g\}$$ is equivalent to $$[+f, +g, -h]$$. In the other direction, we can make binary features privative by simply interpreting the $$+$$/$$-$$ as part of the feature name. That is to say, $$-f$$ isn’t a feature $$f$$ with value $$-$$, it’s simply the privative feature $$\text{minus-}f$$. Some arguments add a bit of sophistication to this, e.g. the Boolean algebra perspective in Keenan & Moss’s textbook Mathematical Structures in Language. So far, so unsatisfactory.
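The folklore translation is easy enough to spell out in code. Here is a minimal sketch; the function names and the set/dict encoding are my own, purely for illustration:

```python
# A privative bundle is a set of features; a binary bundle assigns "+"
# or "-" to every feature in the inventory. Encodings are invented for
# illustration of the folklore translation.

INVENTORY = {"f", "g", "h"}

def privative_to_binary(bundle, inventory=INVENTORY):
    # features absent from the bundle are filled in with "-"
    return {feat: "+" if feat in bundle else "-" for feat in inventory}

def binary_to_privative(bundle):
    # fold the value into the feature name: -h becomes the privative
    # feature "minus-h", +f becomes "plus-f"
    return {("plus-" if val == "+" else "minus-") + feat
            for feat, val in bundle.items()}

# the bundle {f, g} over {f, g, h} corresponds to [+f, +g, -h]
binary = privative_to_binary({"f", "g"})
```

Here `binary` comes out as `{"f": "+", "g": "+", "h": "-"}`, matching the $$[+f, +g, -h]$$ example above, and `binary_to_privative` turns it back into a plain set of renamed privative features.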