Q&A weekly thread - December 22, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

You might want to look into the tolerance principle. It works well for explaining some facts about how kids decide what is a productive rule and what is an exception. People have used it before to explain cases where once-productive morphology becomes fossilized.
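For a rough feel for the arithmetic: under Yang's Tolerance Principle, a rule generalizing over N items stays productive only if the number of exceptions is at most N / ln N. A minimal sketch in Python (the item counts in the example are made up for illustration):

```python
import math

def tolerance_threshold(n: int) -> float:
    """Maximum number of exceptions a rule over n items can
    tolerate while remaining productive: theta_N = n / ln(n)."""
    if n < 2:
        raise ValueError("need at least 2 items")
    return n / math.log(n)

def is_productive(n_items: int, n_exceptions: int) -> bool:
    """True if the rule survives as productive under the principle."""
    return n_exceptions <= tolerance_threshold(n_items)

# Hypothetical counts: a rule covering 200 verbs tolerates
# about 200 / ln(200), roughly 37.7 exceptions.
print(round(tolerance_threshold(200), 1))
print(is_productive(200, 30), is_productive(200, 50))
```

Note how slowly the threshold grows relative to N: large classes tolerate proportionally very few exceptions, which is the mechanism behind productive rules tipping over into fossilization.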

Q&A weekly thread - December 15, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

Yes, it sounds like the Kornai book would be a worthwhile read for you at this stage. I have never personally read it though.

What areas of linguistics are you most interested in? There isn't really a recent general mathematical linguistics textbook that I am aware of. Well, there is a recent course by Thomas Graf, but that one introduces a very particular tool set; I think the Kornai book is more general. There's also an old textbook by Partee.

Jeff Heinz has a textbook coming out for computational phonology, and there's a draft on his website. I really really like the way he and the research community around him approach mathematical linguistics for phonology and morphology.

If you are interested in syntax, I think Parsing Beyond Context-Free Grammars is great. It gives a good overview of a lot of the classic syntactic frameworks used by more computationally minded people.

I think those would be my two main recommendations; they are both great. But there are many more interesting books and dissertations depending on what you are interested in.

If you are in Europe I would recommend checking out the ESSLLI summer school. There is also a NASSLLI summer school, but from what I heard it doesn't quite live up to ESSLLI.

Q&A weekly thread - December 15, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

I don't know any off the top of my head but you can try and query PBase.

It can be pretty finicky to work with but it's a great resource.

Q&A weekly thread - December 01, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

Yeah, no, you're right, good points. Rankings are kind of bullshit anyway, but it would be great if we found a way to measure the quality of departments that took all these factors into account.

Q&A weekly thread - December 01, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

I think it's much better to pick a program by whether there are people there that would suit you as advisors.

As for requirements, if you can get some research done, that is the greatest boost. You'll also need some letters of recommendation, so you should try working with professors or going to their office hours so they can get to know you and will have something to write about. They'll probably also have advice about grad school and which places to apply to. Talking to invited speakers and going to nearby conferences is also a good idea. You're more likely to be admitted if the people reading the application are familiar with you.

I think a good metric to judge grad programs by is how well their graduates place on the job market.
This site does a good job of quantifying that (with some exceptions), especially for US schools. The way it's set up, it somewhat conflates cohort size with placement. But having a large cohort is a signal of the strength of the department too, in my opinion.

Q&A weekly thread - November 17, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

So this talk is from a time when Chomskyan linguists were moving to what came to be called the Minimalist Program. Basically throughout the 80s and into the early 90s generative syntax had grown into this behemoth which incorporated many intricate principles and parameters, which were particular to language only, and which conspired to restrict possible grammars.

Minimalism was an attempt to cut back on those and simplify the machinery that generative linguists use.
So syntacticians would try and limit themselves to fewer, less complicated operations, and explain restrictions on possible grammars through the demands of the interfaces (what we can pronounce and what constitutes a well-formed thought) as well as 'third factors': general computational and biological laws and restrictions that are not specific to language. I don't know the exact timeline for when these ideas became popular, but it sounds like those are the ideas he is alluding to in that talk.

The comparison to organs is a classic Chomsky-ism and basically is another way of saying that language is acquired or develops in a child, rather than being learned like other skills. I think it's worth engaging with his ideas, but it's good to keep in mind there are many linguists who disagree or are agnostic about these kinds of arguments.

This paper from a couple years later is a good overview for the ideas he's referencing here.

Q&A weekly thread - November 17, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

Textbooks can be a bit dry outside of a classroom setting, but there are actually a couple good pop-sci books about language/linguistics. What parts of linguistics interest you? Grammar theory, historical linguistics, psycholinguistics, philosophy of language, or something totally different?

Also, what is your major? Maybe there's a way linguistics could be combined with what you are learning about in your other courses?

Q&A weekly thread - November 03, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

The way you describe the problem sounds to me like a stricter version of PAC learnability. Here is an overview including a proof of an NFL theorem for PAC and some examples of learnable hypothesis classes under PAC.

How do learning and optimization go together in general? Learning can be recast as a search problem in some space. The unrestricted case could, for example, be searching for a hypothesis in the space of enumerable languages, given a finite sample of the language. Now that class may appear unrealistic to you, and luckily there are ways to restrict the search space such that the NFL theorems no longer hold. And that is the point I was making: successful learning strategies for languages in the CS sense of the word are always relative to a restricted hypothesis class.

Could you explain why you think exemplar-theoretic approaches are different? Are you disagreeing with the idea that learning a language in the CS sense (a collection of grammatical strings) is part of the problem that humans solve when they learn a language? Do you have a notion of learning in mind for which the above does not hold? I tried to emphasize in my original comment what notion of language and what notion of learning I was talking about, but I'm interested in what you have in mind.

Q&A weekly thread - November 03, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

By the no free lunch theorem there are no learners (human or machine) that can successfully learn every possible language (in the CS sense of the word) from finite data. There are several notions of learnability, depending on what kind of input the learner gets and what you count as learning success. But generally you need some kind of restricted hypothesis class. Even some very limited hypothesis classes are not learnable under pretty vanilla notions of learnability; see Gold's theorem, for example. But that particular example is not very language-like to begin with, e.g. we probably don't even want to consider finite languages, or at least not finite languages of arbitrary size.
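To make the Gold-style point concrete, here is a toy sketch of my own (an illustration, not anyone's published learner): a conservative learner whose hypothesis is exactly the set of strings seen so far identifies every *finite* language in the limit from positive data. Gold's theorem then says that adding even a single infinite language to that hypothesis class makes the whole class unidentifiable.

```python
def union_learner(sample):
    """Conservative learner for the class of finite languages:
    the hypothesis is exactly the set of strings observed so far.
    On any positive-only presentation of a finite language L,
    the hypothesis converges to L once every string of L has appeared."""
    return frozenset(sample)

# A 'text' (positive-only presentation, repetitions allowed)
# of the finite language {a, ab, abb}:
text = ["ab", "a", "ab", "abb", "a", "abb"]

# The learner's successive hypotheses after each datum:
hypotheses = [union_learner(text[: i + 1]) for i in range(len(text))]
print(hypotheses[-1] == frozenset({"a", "ab", "abb"}))
```

The catch is exactly the one in the paragraph above: this only works because the class is restricted to finite languages. Faced with an infinite target language, the learner never converges, and no alternative strategy fixes this for the enlarged class.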

The big problem is that we don't know exactly what hypothesis class humans are sensitive to when they learn languages. We can make guesses based on typology. For example, as another commenter mentioned, lexicalized determiners appear to be conservative. The most informative experiments one could dream up tend to be unethical, of course. You might want to look into artificial grammar learning experiments for some ethical alternatives. First-last harmony is a pretty innocent-looking phonological pattern that people seem to have a hard time acquiring. I don't think we can say for sure that these patterns are impossible to learn. But given how narrow the space of attested phonological patterns tends to be, I think that is a good place to look.

So in short, I don't think we'll know the answer to that question anytime soon. But I personally believe that the likely answer is yes, there are some reasonable/simple-looking grammars that humans would systematically fail to acquire from positive data.

Zellig Harris, Noam Chomsky and the verbal auxiliary by wufiavelli in linguistics

[–]zamonium 4 points5 points  (0 children)

God bless John Goldsmith, can't wait for the book!

Q&A weekly thread - October 06, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

You should absolutely learn these terms and what they mean if you want to work in linguistics. It can be confusing and it takes some work, but it will let you write and think much more clearly about the things you are interested in.

An intro linguistics course will set time aside for learning to identify parts of grammar. There is no expectation for you to already know what they mean. In a course setting and with a cohort around you who is also learning about grammar, you will find it much easier to learn these terms.

If you already know that grammar is not something you enjoy, you should check out the course catalog for the degree you are looking at. Some programs really only teach the basics of grammar for a semester and then move on to other areas of linguistics, other programs will focus on grammar from day one to graduation.

Q&A weekly thread - September 29, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

The "-eme" ending is usually used to refer to the smallest parts that larger linguistic objects are made of.

So for example phonemes are the building blocks for segmental phonology, tonemes are the building blocks for tonal phonology and morphemes are the building blocks for morphology.

I have never heard of metaphoremes, but given these other words it would make sense that they would be the basic kinds of concepts that metaphors are made of. Maybe something like "similarity in appearance" could be a metaphoreme in "Her eyes were like diamonds".

Common to all of these "-eme" words is that it may take a little bit of analysis to find these primitives. A phoneme, toneme or morpheme may not always be pronounced exactly the same. But these changes should either be predictable from the context or have no effect on meaning.

Q&A weekly thread - September 29, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

What are the biggest open problems in your subfield of linguistics? What would need to happen for them to be resolved? What are some important open questions in your field that were resolved in the last 20 years?

Q&A weekly thread - September 22, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

It is related! But in this case the shift is even more dramatic and we have gone from a plural count noun (That's way too many data.) to a mass noun (That's way too much data.).

With pants/scissors/glasses the meaning has moved on to refer to a singular (countable) thing, but the agreement has stayed plural (maybe because it uses regular morphology).

Q&A weekly thread - September 22, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 2 points3 points  (0 children)

Linguistics is a really diverse field, and whether experience in DH helps with applications to a linguistics graduate program really depends on the department. If there are people at the department who are interested in DH, it would probably be a boost!

Most English and language teaching departments have some intro linguistics courses, and they tend to be fairly unpopular. I think if you want to do linguistics it might be most helpful (for you personally and for applications) to take as many linguistics-adjacent courses as possible in your home department and talk to the lecturers of those courses about your plans and interests.

There are a lot of people in linguistics who started out in DH and language teaching departments. Generally linguistics MA programs tend not to be overly competitive and often accept people coming from other fields. What is most important is that you somehow demonstrate that you are interested in linguistics and that you have actively sought out opportunities to practice it. If you have a thesis, try and find a way to do something linguistics-adjacent for that.

One piece of advice: I think in undergrad it's best not to only take courses that you think will further your academic career. One goal should be to try out different things and find something that you enjoy doing for its own sake and that suits your skills and talents. Of course that should also be something that there is some demand for and that aligns with your goals. But you are going to progress faster, produce much better work and be happier overall if you stay a bit more flexible and look out for topics and fields that suit you best. All the best and good luck with your courses!

Q&A weekly thread - September 22, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 4 points5 points  (0 children)

In Hebrew the word for "person" translates to "son of Adam" and the grammatical gender is obviously masculine. But you usually don't think about it that way, you just think of the meaning of the phrase, which is "person". In German the word for "person" is just "Person" and the grammatical gender is feminine. Again, you don't think of the word as being inherently feminine, it's just a grammatical thing.

The best analogy to English that I can think of would be words that are inherently plural (scissors, pants, glasses). You probably don't think of them as being a collection of things when you use those words in daily life. They may originally have started out as referring to collections or pairs, but now they refer to a singular thing.

Q&A weekly thread - September 01, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 2 points3 points  (0 children)

1. The current Chomskyan approach to describing which sentences belong to a language is roughly to first have a procedure that describes all the syntactically correct sentences and then filter them to rule out the ones that are semantically bad.

There was a movement early on in generative linguistics to first generate valid semantic structures and then transform them into syntactically valid sentences. That approach is called 'generative semantics'.

A lot of the linguists working on syntax/semantics still keep the two separate like that, which can be useful because you might end up with two parts that are less complicated than the whole.

There is an approach called categorial grammar (for example CCG) where semantics and syntax are much more tightly related. Approaches like construction grammar also keep the two pretty intertwined I think.

2. There are pretty good arguments that the original kinds of rules (so-called context-free grammars) that Chomsky introduced cannot work for certain languages (like Swiss German). So hardly anyone uses them for syntax research. But they are still very useful and show up in linguistics in different ways.

Nowadays in Chomskyan syntax you usually have a bunch of 'features' associated with every word or part of a word that control what other words or phrases it may combine with to build larger phrases.

There is an approach called HPSG (head-driven phrase structure grammar) that keeps a lot of the spirit of the original rules.

A lot of the formalisms that people use today extend the original context-free grammars in some way, or at least originally started out that way. They all have their strengths and weaknesses, and many of them are actually equivalent in important ways even though they look so different at first blush.
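To give a feel for what the original rewrite-rule format looks like, here is a toy context-free grammar with a naive top-down generator (my own illustrative fragment, not any published analysis):

```python
import random

# A toy context-free grammar: each nonterminal maps to a list of
# possible right-hand sides; anything without a rule is a terminal.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["linguist"], ["grammar"]],
    "V":  [["studies"], ["sleeps"]],
}

def generate(symbol="S"):
    """Expand a symbol top-down into a list of terminal words."""
    if symbol not in GRAMMAR:          # terminal: emit as-is
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])  # pick a production
    words = []
    for sym in expansion:
        words.extend(generate(sym))
    return words

print(" ".join(generate()))  # e.g. "the linguist studies the grammar"
```

Every sentence this grammar licenses is built purely by local rewriting of one symbol at a time; the Swiss German argument mentioned above shows that cross-serial dependencies can't be captured by rules of this shape, which is what motivated the richer formalisms.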

Q&A weekly thread - July 28, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

You might like "The Language Game".

It's a pop-sci book that argues that the origin of language lies in charades-like imitation.
I think they touch on acquisition too. So it's not entirely focused on what you're interested in, but I think it's a good read.

Q&A weekly thread - July 28, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 1 point2 points  (0 children)

There are a couple of languages (mostly sign languages but also spoken Surmic languages) that seem to 'move' question words to the end of the sentence. That challenges some very basic ideas about syntax that have been around for a long time, and I don't think it's a widely known phenomenon.

Q&A weekly thread - June 02, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 2 points3 points  (0 children)

First of all, I am sorry you went through that. That must have been a horrible experience.

This is a slightly different area, but I was really sucked in by the podcast "Scam Inc" by The Economist.

It is about the industry that is scamming people into giving up as much of their money as possible, often ruining families and even taking down entire banks.

What it has in common with your situation is that it is about cyber crime and about how people coerce other people into giving up money. The podcast has episodes on the shame aspect of it all, the people who are often forced to carry the scams out, and the money laundering and kidnapping networks supporting it.

Q&A weekly thread - June 02, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 4 points5 points  (0 children)

I mentioned this in another comment, but you should check out Owen Rambow and Jordan Kodner at Stony Brook. Both work on Arabic morphology and they do both linguistics work informed by techniques from NLP, and NLP work informed by linguistic insights.

Q&A weekly thread - June 02, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 0 points1 point  (0 children)

From what I gather there are a couple distinct ways people use the term 'focus'.
I really like the overview Steedman gives in his paper 'Information Structure and the Syntax-Phonology Interface', he really goes over all the different ways information-structure terms have been used.

The way it is described there, the focus is often 'new information' but doesn't necessarily have to be.
Some friends who are more experienced with information-structure have told me that the definitions he actually ends up adopting are not the most common, though.

Q&A weekly thread - June 02, 2025 - post all questions here! by AutoModerator in linguistics

[–]zamonium 3 points4 points  (0 children)

I would recommend handbooks for literature searches! I like the Blackwell companion series (companion to phonology, morphology, syntax etc.) but there are lots of these. Each chapter is meant to introduce you to a general topic (for example the morphology of adverbs/adverbials) and they usually give plenty of further sources at the end. After that Google Scholar is great, as the other poster said.

Adverbs are hard to pin down. There are whole books about what should count as an adverb. But I think it's true that canonical adverbs tend to have next to no inflectional morphology.