Surprising theorems

🕑 4 min • 👤 Thomas Graf • 📆 June 08, 2019 in Discussions • 🏷 history, literature, formal language theory

Time for a quick break from the ongoing feature saga. A recent post on the Computational Complexity blog laments that theorems in complexity theory have become predictable. Even when a hard problem is finally solved after decades of research, the answer usually goes in the expected direction. Gone are the days of results that come completely out of left field. This got me wondering whether mathematical linguistics still has surprising theorems to offer.


Continue reading

Features and the power of representations

🕑 13 min • 👤 Thomas Graf • 📆 June 06, 2019 in Discussions • 🏷 features, constraints, representations, generative capacity, subregular, strictly local, transductions

As you might have gleaned from my previous post, I’m not too fond of features, but I haven’t really given you a reason for that. It is actually straightforward: features lower complexity. By itself, that is a useful property. Trees lower the complexity of syntax, and nobody (or barely anybody) uses that as an argument that we should use strings. Distributing the workload between representations and operations/constraints over these representations is considered a good thing. Rightfully so, because factorization is generally a good idea.

But there is a crucial difference between trees and features. We actually have models of how trees are constructed from strings — you might have heard of them, they’re called parsers. And we have some ways of measuring the complexity of this process, e.g. asymptotic worst-case complexity. We lack a comparable theory for features. We’re using an enriched representation without paying attention to the computational cost of carrying out this enrichment. That’s no good; we’re just cheating ourselves in this case. Fortunately, listening to people talk about features for 48h at the workshop gave me an epiphany, and I’m here to share it with you.
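To see the worry in miniature, here is a toy sketch in Python (the constraint, the segment names, and all function names are my own illustration, not anything from the post): a ban on two occurrences of a segment anywhere in a string is not checkable by looking at single positions of the plain string, but once every position carries a "seen it already" feature, the ban becomes a purely local check. The catch is that computing that feature takes a pass over the whole string, so the complexity has not disappeared; it has merely been moved into the representation.

```python
# Toy illustration: enriching a representation can make a non-local
# constraint look local, but the enrichment itself does the real work.
# All names here are made up for the sake of the example.

def violates_plain(string, seg="s"):
    """Non-local check over plain strings: no two occurrences of `seg`."""
    return string.count(seg) >= 2

def enrich(string, seg="s"):
    """Annotate each position with a feature: has `seg` occurred before it?"""
    seen = False
    enriched = []
    for symbol in string:
        enriched.append((symbol, seen))  # (symbol, seen-before feature)
        if symbol == seg:
            seen = True
    return enriched

def violates_enriched(enriched, seg="s"):
    """Local check: a single enriched position suffices to spot a violation."""
    return any(sym == seg and seen for sym, seen in enriched)

word = "sashas"
print(violates_plain(word))             # True
print(violates_enriched(enrich(word)))  # also True, but enrich() scanned the string
```

Any fair comparison between a feature-rich analysis and a feature-free one would have to count the cost of something like `enrich()` as well, which is exactly the accounting that seems to be missing.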


Continue reading

Omnivorous number and Kiowa inverse marking: Monotonicity trumps features?

🕑 10 min • 👤 Thomas Graf • 📆 May 31, 2019 in Discussions • 🏷 features, monotonicity, morphosyntax, hierarchies, omnivorous number, inverse marking, Kiowa

I just came back from a workshop in Tromsø on syntactic features, organized by Peter Svenonius and Craig Sailor — thanks for the invitation, folks! Besides yours truly, the invited speakers were Susana Béjar, Daniel Harbour, Michelle Sheehan, and Omer Preminger. I think it was a very interesting and productive meeting with plenty of fun. We got along really well, like a Justice League of feature research (but who’s Aquaman?).

In the next few weeks I’ll post on various topics that came up during the workshop, in particular privative features. But for now, I’d like to comment on one particular issue regarding the feature representation of number and how it matters for omnivorous number and Kiowa inverse marking. Peter has an excellent write-up on his blog, and I suggest that the main discussion about features should be kept there. This post will present a very different point of view that basically says “suck it, features!” and instead uses hierarchies and monotonicity.


Continue reading

Underappreciated arguments: Underlying representations

🕑 4 min • 👤 Thomas Graf • 📆 May 28, 2019 in Discussions • 🏷 phonology, morphology, underlying representations, abstractness, bimorphisms, T-model

Time for another entry in the Underappreciated arguments series. This post will be pretty short as it is a direct continuation of the previous entry on how the inverted T-model emerges naturally from the bimorphism perspective. You see, the very same argument also gives rise to underlying representations in phonology and morphology.


Continue reading

Beeing a linguist

🕑 1 min • 👤 Thomas Graf • 📆 May 22, 2019 in Discussions • 🏷 fun allowed

Continuing yesterday’s theme of having fun, here’s a highly, highly accurate typology of our field in picture form.


Continue reading

A song of middles and suffixes

🕑 3 min • 👤 Thomas Graf • 📆 May 21, 2019 in Discussions • 🏷 phonology, subregular, strictly local, fun allowed

Am I the only one who’s worn out by the total lack of fun and playfulness in all public matters? Everything is serious business, everything is one word away from a shit storm, everybody has to be proper and professional all the time, no fun allowed. It’s the decade of buzzkills, killjoys, and sourpusses. Linguistics is very much in line with this unfortunate trend. Gone are the days of Norbert Hornstein dressing up as Nim Chimpsky. It is unthinkable to publish a paper under the pseudonym Quang Phuc Dong (that’s at least a micro-aggression, if not worse). Even a tongue-in-cheek post on Faculty of Language is immediately under suspicion of being dismissive. Should have added that /s tag to spell it out.

Compared to other fields, linguistics has never been all that playful, perhaps because we’re already afraid of not being taken seriously by other fields. But we’ve had one proud torch bearer in this respect: Geoff Pullum. His Topic… Comment column should be mandatory grad school reading. Formal Linguistics Meets the Boojum is a classic for the ages (and did, of course, get a very proper and professional response). My personal favorite is his poetic take on the Halting problem. So I figured instead of complaining I’d lead by example and inject some fun of my own. To be honest, I’m probably better at complaining, but here we go…


Continue reading

Two dimensions of talking past one another

🕑 8 min • 👤 Thomas Graf • 📆 May 20, 2019 in Discussions • 🏷 theory, methodology, Minimalism

This post will be a bit of a mess since it’s mostly me trying to systematize some issues I’ve been grappling with for a while now. I have discussed my research with people who come from very different backgrounds: theoretical linguists, computer scientists, molecular biologists, physicists, and so on. Many of these discussions have involved a fair amount of talking past one another. To some extent that’s unavoidable without a large shared common ground. But ironically, most of the talking past one another actually didn’t occur with, say, biologists, but with theoretical linguists, in particular Minimalists. The rest of this post presents my personal explanation for what might be going on there. I believe there are two factors at play. Both concern the horribly fuzzy notion of a linguistic theory.


Continue reading

Underappreciated arguments: The inverted T-model

🕑 9 min • 👤 Thomas Graf • 📆 May 15, 2019 in Discussions • 🏷 syntax, transductions, bimorphisms, T-model

There are many conceptual pillars of linguistics that are, for one reason or another, considered contentious outside the field. These include the competence/performance split, the grammar/parser dichotomy, underlying representations, and the inverted T-model. These topics have been discussed to death, but they keep coming up. Since it’s tiring to hear the same arguments over and over again, I figure it’d be interesting to discuss some little-known ones that are rooted in computational linguistics. This will be an ongoing series, and its inaugural entry is on the connection between the T-model and bimorphisms.


Continue reading

Leaving the field

🕑 3 min • 👤 Thomas Graf • 📆 April 26, 2019 in Discussions • 🏷 academia, buck the trend

While being a social media Luddite has many perks, it does mean occasionally missing out on an interesting thing until a resident of those walled gardens points it out to you. Most recently this was a post by Hadas Kotek about her decision to leave the field after several years in temp positions. She gives a detailed account of how she reached that decision, and I’m happy to see that it got a lot of positive feedback. However, there’s one thing that rubs me the wrong way about this whole incident, and that’s the implicit assumption that leaving the field is something that needs to be justified. If anything, it should be staying in the field that needs justification!


Continue reading

Computational Phonology Workshop 3

🕑 2 min • 👤 Jeffrey Heinz • 📆 April 21, 2019 in Discussions • 🏷 phonology, subregular, Stony Brook, IACS, photos

Yesterday at Stony Brook, we concluded an informal workshop on computational phonology, which focused on theoretical, logical, model-theoretic, and automata-theoretic aspects of phonology (and some syntax). Here ‘informal’ means the workshop has no advance schedule of talks, nor are there any talks at all apart from the co-located Linguistics Colloquium and Frontiers series talks. Instead, we list topics we are interested in presenting, and presenters lead discussion, using the whiteboard and as much time as they want, or until the group becomes restless. We take breaks when we want and have plenty of time to ask questions, talk with each other, and see what everyone is working on. Personally, I find it refreshingly different from national and international conferences, which (perhaps necessarily) come with a planned schedule of tightly timed talks, Q&A, and so on.

I wanted to take a moment to summarize some of the big-picture issues that emerged for me over the past couple of days.


Continue reading