Posted on my class discussion forum, in response to the following prompt:
‘The problem of which word-classes (or “lexical categories”) should be set up for an unfamiliar language can be approached in two different ways. One can start out by asking either of the two questions in (1)-(2):
1. Which categories allow the most elegant description of the language?
2. Can the language be described in terms of a universal set of categories, hypothesized on the basis of other languages?’
[…] What are the pros and cons of the two approaches?
The key difference I see between approaches (1) and (2) is that the researcher adopting approach (2) has the benefit of appealing to an established theory. At the same time, both researchers need tools that enable some sort of formal analysis if their research is to avoid being entirely abstract or impressionistic.
One advantage that (1) might seem to have over (2) is that it is freer of preexisting assumptions. A typical objection to approach (2) is that the standard theory may have been established on the basis of extensive hypothesis-testing, but the next counterexample may simply not have been discovered yet. In approaching an unfamiliar language, then, approach (1) might seem less prone to confirmation bias than approach (2).
Confirmation bias, however, runs both ways: in my view, one is as apt to posit that a language has novel features if one is looking for them as one is to affirm that the standard theory applies. (This seems to be something Chung is reacting to.)
As such, approach (2) would appear to be the more powerful, simply in virtue of the resources (descriptive tools, previously analyzed cases) it affords the researcher.
The problem is that applying approach (2) rigorously is difficult. I take Chung’s analysis of Chamorro as an exemplar. On first encountering an unfamiliar language, would it be possible to bring to bear all the analytical resources that Chung manages? I would imagine that in the field the advantages of (2) are not as readily realized, and the danger of finding what one is looking for becomes more difficult to manage. In such a situation, approaches (1) and (2) become less distinguishable.
A better picture of my experience, from our first few sessions with our consultant, is this: I employ my shallow knowledge of established theory to make certain conjectures, which I then attempt to verify through elicitation. I find I am sometimes correct, and sometimes completely mystified. In the latter case, I am unsure whether the problem is my lack of knowledge of the theory or of other languages, or whether the theory as I happen to understand it is broken.
In approaching the mystery, then, I like to think that my instinct is to look only at patterns in the distribution, temporarily suspending the categorical schemes that seem to apply. In reality, though, I suspect that suspension of assumptions is not actually happening. The cognitive work performed – pattern recognition – is essentially the same, however ‘much’ or ‘little’ I think I might be assuming.
So much for my thoughts on these approaches. I think the question Chung asks about whether grammar is an ‘independent system’ is super interesting. Is syntax Ideal, or is it in the head? (This is a hard question, not least because we’re not sure how our heads comprehend ideal forms in the first place.) I’m not disinclined towards the project of searching for the Ideal in syntax.
This brings us to another problem, however: Chung also considers the extent to which ‘human cognition or human interaction’ shapes language, as opposed to Platonic constraints. The difficulty is that what linguistics defines as language doesn’t exist outside the human brain (at least in my understanding), so it seems inevitable to me that ‘canonical use’ (a question of pragmatics) and ‘canonical meaning’ (something of semantics) will need to be appealed to in justifying Conversion of some assumed-to-be-basic elements of syntax (nouns, verbs, and adjectives). Such an approach seems predetermined by the paradigm. The relevant circularity here emerges not so much from the formalism as from the definition of ‘language’ itself. To address the circularity at this level would mean going outside the discipline. (I’m thinking of biosemiotics, and how it imagines biopragmatics, biosemantics, and biosyntax; and I’m sure there are other things.)