Against math: When sets are a bad setup

🕑 11 min • 👤 Thomas Graf • 📆 April 06, 2020 in Discussions • 🏷 methodology, syntax, set theory, Merge, linearization

Last time I gave you a piece of my mind when it comes to the Kuratowski definition of ordered pairs, and why we should stay away from it in linguistics. The thing is, that was a conceptual argument, and those tend to fall flat with most researchers. Just like most mathematicians weren’t particularly fazed by Gödel’s incompleteness results because they didn’t impact their daily work, the average researcher doesn’t care about some impurities in their approach as long as it gets the job done. So this post will discuss a concrete case where a good linguistic insight got buried under mathematical rubble.
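For readers who haven’t seen it, here’s a quick gloss (mine, not part of the post): Kuratowski’s trick encodes the ordered pair as a pure set,

$$\langle a, b \rangle := \{\{a\}, \{a, b\}\}$$

so that the first coordinate can be recovered as the element common to both inner sets. The encoding is perfectly serviceable in set theory; the question is what, if anything, it buys us in syntax.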


Continue reading

Against math: Kuratowski's spectre

🕑 8 min • 👤 Thomas Graf • 📆 March 30, 2020 in Discussions • 🏷 methodology, syntax, set theory, Merge, linearization

As some of you might know, my dissertation starts with a quote from My Little Pony. By Applejack, to be precise, the only pony that I could see myself having a beer with (and I don’t even like beer). You can watch the full clip, but here’s the line that I quoted:

Don’t you use your fancy mathematics to muddy the issue.

Truer words have never been spoken. In light of my obvious mathematical inclinations this might come as a surprise to some of you, but I don’t like using math just for the sake of math. Mathematical formalization is only worth it if it provides novel insights.


Continue reading

Trees for free with tree-free syntax

🕑 5 min • 👤 Thomas Graf • 📆 March 06, 2020 in Discussions • 🏷 syntax, strings, derivation trees, phrase structure trees

Here’s another quick follow-up to the unboundedness argument. As you might recall, that post discussed a very simple model of syntax whose only task was to adjudicate the well-formedness of a small number of strings. Even for such a limited task, and with such a simple model, it quickly became clear that we need a more modular approach to succinctly capture the facts and state important generalizations. But once we had this more modular perspective, it no longer mattered whether syntax is actually unbounded. Assume unboundedness or deny it, the overall nature of the approach does not hinge on whether we incorporate an upper bound on anything. Well, something very similar also happens with another aspect of syntax that is beyond doubt in some communities and highly contentious in others: syntactic trees.
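To make the lookup-table-versus-modular contrast concrete, here’s a toy sketch of my own (not code from either post; the tiny agreement fragment is entirely made up):

```python
# The simplest possible "syntax": a finite lookup table of strings.
GRAMMATICAL = {
    "the dog barks",
    "the dogs bark",
}

def well_formed(sentence: str) -> bool:
    """Adjudicate well-formedness by brute table lookup."""
    return sentence in GRAMMATICAL

# The modular alternative: state subject-verb agreement once and let
# well-formedness follow from it instead of listing every string.
SUBJECTS = {"the dog": "sg", "the dogs": "pl"}
VERBS = {"barks": "sg", "bark": "pl"}

def well_formed_modular(sentence: str) -> bool:
    """Same verdicts, but derived from a reusable agreement generalization."""
    for subject, number in SUBJECTS.items():
        if sentence.startswith(subject + " "):
            return VERBS.get(sentence[len(subject) + 1:]) == number
    return False

assert well_formed_modular("the dogs bark")
assert not well_formed_modular("the dog bark")
```

The point carries over unchanged whether the table is finite or infinite: the generalization, not the bound, is doing the work.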


Continue reading

Unboundedness, learning, and POS

🕑 6 min • 👤 Thomas Graf • 📆 February 26, 2020 in Discussions • 🏷 learnability, poverty of stimulus, lattices

Ignas Rudaitis left a comment under my unboundedness post that touches on an important issue: the interaction of unboundedness and the poverty of the stimulus (POS). My reply there had to be on the short side, so I figured I’d fill in the gaps with a brief follow-up post.


Continue reading

Unboundedness is a red herring

🕑 13 min • 👤 Thomas Graf • 📆 February 20, 2020 in Discussions • 🏷 syntax, methodology, competence, performance

Jon’s post on the overappreciated Marr argument reminded me that it’s been a while since the last entry in the Underappreciated arguments series. And seeing how the competence-performance distinction showed up in the comments section of my post about why semantics should be like parsing, this might be a good time to talk about one of the central tenets of this distinction: unboundedness. Unboundedness, and its corollary that natural languages are infinite, is one of the first things that we teach students in a linguistics intro, and it is one of the first things that psychologists and other non-linguists will object to. But the dirty secret is that nothing really hinges on it.


Continue reading

Hey syntax, where's my carrot?

🕑 5 min • 👤 Thomas Graf • 📆 January 31, 2020 in Discussions • 🏷 syntax, textbooks, teaching

Last week I blogged a bit about syntax textbooks. One question I didn’t ask there, for fear of completely derailing the post, is what should actually be in a syntax textbook. There’s a common complaint I hear from students about syntax classes: that syntax courses are one giant bait-and-switch. They’re right, and the same holds for syntax textbooks.


Continue reading

Syntax textbook recommendations

🕑 6 min • 👤 Thomas Graf • 📆 January 22, 2020 in Discussions • 🏷 syntax, textbooks, teaching

Recently I found in my inbox an inquiry from a student who wants to pick up syntax on their own and would like to get some textbook recommendations. I happily complied, but I’m actually not all that qualified to give such recommendations. It’s been a long time since I’ve read a syntax textbook. I have never taught a grad-level syntax introduction. At the undergraduate level I got to teach syntax twice at the very beginning of my Stony Brook career, and never again since then — I can only surmise that what I did in those courses was too radical for the linguistic deep state to tolerate. But the prestigious Outdex readership includes at least some syntacticians, so why not crowdsource the recommendations?


Continue reading

Physics cranks = syntax cranks?

🕑 2 min • 👤 Thomas Graf • 📆 January 15, 2020 in Discussions • 🏷 physics, syntax, Minimalism

Congratulations, you’re reading the shortest Outdex post yet! Peter Woit of Not Even Wrong, reacting to Sabine Hossenfelder of Lost in Math fame, has another post on the lack of progress in high-energy particle physics. If you’ve been following the debate about string theory and supersymmetry in recent years, nothing in the post will shock you. What I find fascinating is that you could replace “particle physics” with “Minimalism” in this debate and the whole thing would all of a sudden look very familiar.


Continue reading

Semantics: Corrections and further thoughts

🕑 6 min • 👤 Thomas Graf • 📆 January 08, 2020 in Discussions • 🏷 semantics, donkey sentences, parsing

This is a follow-up to my previous post on semantics. It has been pointed out to me that that post contains several inaccuracies and grave omissions. Some of them are in the summary of Lucas’ talk, and they would probably have been noticed earlier if I had provided a link to the slides or the paper. Thanks to Lucas for sending me those by email and for walking me through the account again. I’ll briefly explain some of the misleading points later on in this post.

But the much bigger issue is that I failed to point out that Lucas wasn’t just presenting his own work. He made it very, very clear that this was joint work with Dylan Bumford (UCLA) and Robert Henderson (UArizona). I’m really upset with myself about that one; in some sense, giving partial credit is even worse than giving no credit at all, and the latter is already a dick move. My sincerest apologies to Dylan and Robert.

If I had run the post past Lucas before publishing it, a lot of this could have been avoided, so I’ll make that a priority for future posts that discuss work I’m not well-acquainted with. Alright, so let’s talk a bit about what I got wrong and how that affects the central message of the previous post.


Continue reading

Semantics should be like parsing

🕑 5 min • 👤 Thomas Graf • 📆 December 28, 2019 in Discussions • 🏷 semantics, donkey sentences, parsing

I spent a few days before Christmas at the Amsterdam Colloquium, which exposed me to a much heavier dose of semantics than I’m used to. I’ve always had a difficult relationship with semantics. On the one hand, I like that it has its fair share of KISS theories, and generalized quantifier theory is aesthetically very pleasing to me. On the other hand, most of semantics is pretty dull, and I think that’s because semanticists put way too much stuff in their theories that has nothing to do with natural language semantics. I’ve previously had a hard time putting this into concrete terms, but Lucas Champollion’s invited talk on donkey sentences finally presented me with a specific example.
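To gesture at the appeal (my gloss, not anything from the talk): in generalized quantifier theory a determiner denotes nothing more than a relation between sets, e.g.

$$[\![\text{every}]\!](A)(B) = 1 \iff A \subseteq B,$$

and familiar properties such as *every* being downward entailing in its first argument and upward entailing in its second fall out of that one line.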


Continue reading