The subregular complexity of Merge and Move

🕑 9 min • 👤 Thomas Graf • 📆 September 18, 2019 in Tutorials • 🏷 subregular, syntax, locality, strictly local, tier-based strictly local, Minimalist grammars, Merge, Move

Alright, syntax. Things are gonna get a bit more… convoluted? Nah, interesting! In principle we’ll see a lot of the same things as in phonology, and that’s kind of the point: phonology and syntax are actually very similar. But syntax isn’t quite as exhibitionist as phonology; it doesn’t show off its subregular complexity in the open for the world to see. So the first thing we’ll need is a suitable representation. Once we have that, it’s pretty much phonology all over again, but now with trees.


Continue reading

A final stroll through the complexity zoo in phonology

🕑 8 min • 👤 Thomas Graf • 📆 September 09, 2019 in Tutorials • 🏷 subregular, phonology, locality, strictly piecewise, strictly local, tier-based strictly local, typology, learnability

After a brief interlude, let’s get back to locality. This post will largely act as a recap of what has come before and provide a segue from phonology to syntax. That also makes it a good time to look at the bigger picture, which goes beyond putting various phenomena in various locality boxes just because we can.


Continue reading

The subregular locality zoo: SL and TSL

🕑 10 min • 👤 Thomas Graf • 📆 August 19, 2019 in Tutorials • 🏷 subregular, phonology, locality, strictly local, tier-based strictly local

Omer has a recent post on listedness. I have a post coming up that expands on my comments there, but it isn’t really fit for consumption without prior knowledge of subregular complexity and how it intersects with the linguistic concept of locality. So I figured I’d first lead in with a series of posts as a primer on some of the core concepts from subregular complexity. I’ll start with phonology — for historical reasons, and because the ideas are much easier to grok there (sorry phonologists, but it’s a playground compared to syntax). That will be followed by some posts on how subregular complexity carries over from phonology to syntax, and then we’ll finally be in a position to expand on Omer’s post. Getting through all of this will take quite a while, but I think it provides an interesting perspective on locality. In particular, we’ll see that the common idea of “strict locality < relativized locality < non-local” is too simplistic.

With all that said, let’s put on our computational hats and get going, starting with phonology. Or to be more specific: phonotactics.


Continue reading

KISSing semantics: Subregular complexity of quantifiers

🕑 9 min • 👤 Thomas Graf • 📆 July 26, 2019 in Discussions • 🏷 subregular, strictly local, tier-based strictly local, monotonicity, quantifiers, semantics, typology

I promised, and you shall receive: a KISS account of a particular aspect of semantics. Remember, KISS means that the account covers a very narrowly circumscribed phenomenon, makes no attempt to integrate with other theories, and instead aims for being maximally simple and self-contained. And now for the actual problem:

It has been noted before that not every logically conceivable quantifier can be realized by a single “word”. Those are very deliberate scare quotes around word as that isn’t quite the right notion — if it can even be defined. But let’s ignore that for now and focus just on the basic facts. We have every for the universal quantifier \(\forall\), some for the existential quantifier \(\exists\), and no, which corresponds to \(\neg \exists\). English is not an outlier: these three quantifiers are very common across languages. But there seems to be no language with a single word for not all, i.e. \(\neg \forall\). Now why the heck is that? If language is fine with stuffing \(\neg \exists\) into a single word, why not \(\neg \forall\)? Would you be shocked if I told you the answer is monotonicity? Actually, the full answer is monotonicity + subregularity, but one thing at a time.
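If you want to see the monotonicity half made concrete before clicking through, here is a minimal Python sketch that brute-forces the monotonicity profile of each of the four quantifiers over small finite models. It only unpacks the standard generalized-quantifier definitions above; the names (DOMAIN, QUANTIFIERS, monotone) are my own, and this is not the account the post develops, which combines monotonicity with subregularity.

```python
# Illustration only: compute monotonicity profiles of every/some/no/not-all
# by brute force over all subsets of a small domain. Not the post's account.
from itertools import chain, combinations

DOMAIN = frozenset(range(4))

def subsets(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c)
            for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

# A quantifier maps a restrictor A and a scope B to a truth value.
QUANTIFIERS = {
    "every":   lambda A, B: A <= B,        # ∀
    "some":    lambda A, B: bool(A & B),   # ∃
    "no":      lambda A, B: not (A & B),   # ¬∃
    "not all": lambda A, B: not (A <= B),  # ¬∀
}

def monotone(q, arg, direction):
    """Is q upward/downward monotone in its restrictor or its scope?"""
    for A in subsets(DOMAIN):
        for B in subsets(DOMAIN):
            for X in subsets(DOMAIN):
                # grow the chosen argument and compare truth values
                small = q(A, B)
                large = q(A | X, B) if arg == "restrictor" else q(A, B | X)
                if direction == "up" and small and not large:
                    return False
                if direction == "down" and large and not small:
                    return False
    return True

for name, q in QUANTIFIERS.items():
    profile = []
    for arg in ("restrictor", "scope"):
        if monotone(q, arg, "up"):
            profile.append(f"{arg} ↑")
        elif monotone(q, arg, "down"):
            profile.append(f"{arg} ↓")
        else:
            profile.append(f"{arg} neither")
    print(f"{name:7s}  {'  '.join(profile)}")
```

Running it prints the familiar profiles (every: ↓ restrictor, ↑ scope; some: ↑ ↑; no: ↓ ↓; not all: ↑ ↓). How monotonicity and subregularity then conspire to rule out a single word for \(\neg \forall\) is what the full post is about.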


Continue reading

Features and the power of representations

🕑 13 min • 👤 Thomas Graf • 📆 June 06, 2019 in Discussions • 🏷 features, constraints, representations, generative capacity, subregular, strictly local, transductions

As you might have gleaned from my previous post, I’m not too fond of features, but I haven’t really given you a reason for that. It is actually straightforward: features lower complexity. By itself, that is a useful property. Trees lower the complexity of syntax, and nobody (or hardly anybody) uses that as an argument that we should use strings. Distributing the workload between representations and operations/constraints over these representations is considered a good thing. Rightfully so, because factorization is generally a good idea.

But there is a crucial difference between trees and features. We actually have models of how trees are constructed from strings — you might have heard of them, they’re called parsers. And we have some ways of measuring the complexity of this process, e.g. asymptotic worst-case complexity. We lack a comparable theory for features. We’re using an enriched representation without paying attention to the computational cost of carrying out this enrichment. That’s no good; we’re just cheating ourselves in this case. Fortunately, listening to people talk about features for 48h at the workshop gave me an epiphany, and I’m here to share it with you.


Continue reading

A song of middles and suffixes

🕑 3 min • 👤 Thomas Graf • 📆 May 21, 2019 in Discussions • 🏷 phonology, subregular, strictly local, fun allowed

Am I the only one who’s worn out by the total lack of fun and playfulness in all public matters? Everything is serious business, everything is one word away from a shit storm, everybody has to be proper and professional all the time, no fun allowed. It’s the decade of buzzkills, killjoys, and sourpusses. Linguistics is very much in line with this unfortunate trend. Gone are the days of Norbert Hornstein dressing up as Nim Chimpsky. It is unthinkable to publish a paper under the pseudonym Quang Phuc Dong (that’s at least a micro-aggression, if not worse). Even a tongue-in-cheek post on Faculty of Language is immediately under suspicion of being dismissive. Should have added that /s tag to spell it out.

Compared to other fields, linguistics has never been all that playful, perhaps because we’re already afraid of not being taken seriously by other fields. But we’ve had one proud torch bearer in this respect: Geoff Pullum. His Topic… Comment column should be mandatory grad school reading. Formal Linguistics Meets the Boojum is a classic for the ages (and did, of course, get a very proper and professional response). My personal favorite is his poetic take on the halting problem. So I figured instead of complaining I’d lead by example and inject some fun of my own. To be honest, I’m probably better at complaining, but here we go…


Continue reading