Saturday, March 24, 2012

Chomsky, the Pirahã, and turduckens of the Amazon

A seething dispute has burst back into life with the publication of Language: The Cultural Tool (Economist, New York Times, The Chronicle of Higher Education). I’ve yet to read the book, though I’m pretty sure I will. I admire its author, Dan Everett: as an ex-missionary who saw the light in those on whom he had sought to foist salvation, as a fieldworker whose time in the wild is something I can only dream of, and as a defender of one tribe's right to continue its traditions against the depredations of modernity. And I admire his detractors, amongst them my professor from PhD days, David Pesetsky, and my one-time neighbour at MIT, and now near neighbour at UCL, Andrew Nevins.

The Pirahã, and their language, are, Everett believes, different from other groups and other grammars, so different as to threaten Chomsky’s theory of natural language. I wouldn’t describe myself as a card-carrying Chomskian—or as a card-carrying anything. However, my work is inconceivable without the program of research he initiated, and, so, potential Pirahã problems interest me.

Before you venture into foreign terrain, you set your bearings. If all you can see is rainforest, with no clear line of sight to the horizon, then it’s easy to forget your general direction and get caught up in undergrowth and bogged down in mud. So, before I open Everett’s book (or any other), I ask myself what the argument would need to look like to make me reevaluate where my research is headed and why. I’ve found Pirahã very helpful for my work (it’s cited in my last five(?) papers). Does it really destroy the edifice I’m building?

The Pirahã maelstrom has had two vortices: recursion, and the language–culture connection. Recursion promises/threatens to slay Chomsky, who argues that much of grammar is innate. The language–culture connection promises/threatens to resurrect Whorf, who argued that language shapes how we think. I work on the latter and I’ve written (here, here) on why I choose not to work on the former. For now, I’ll concentrate on Chomsky and recursion, because, truth be told, the “Pirahã slays Chomsky” headlines seem to me like errors in elementary reasoning. In other words, the kind of “because” abuse that this blog is named after.

Recipes for recursion

Recursion means sticking something you made earlier into something else. So, preparing pirogi (I’m in Kraków just now) isn’t culinary recursion—you’ve just put filling in pastry and left it there—but making borsht with dumplings is—you put something in something to make pirogi and then put your pirogi into your soup. The ultimate in culinary recursion would be turducken, a chicken stuffed inside a duck stuffed inside a chicken (stuffed inside a person).

“Turducken recursion” and “dumpling-borsht recursion” are different and both are found in human language. In “turducken recursion”, you take two things of the same type and put one inside the other—a sentence inside a sentence (Pawel ate the pirogi his mother wanted to sell) or a noun inside a noun (Pawel’s mother’s pirogi). In “dumpling-borsht” recursion, you put something inside something different—a noun inside a verb phrase (ate pirogi) inside a full sentence (Pawel ate pirogi).
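
For anyone who prefers types to poultry, here is the same distinction as a minimal sketch (hypothetical type names of my own, not any linguist's actual formalism):

    from dataclasses import dataclass

    # "Dumpling-borsht" recursion: different types nested inside one another.
    @dataclass
    class Noun:
        word: str                       # pirogi

    @dataclass
    class VerbPhrase:
        verb: str
        complement: Noun                # a noun inside a verb phrase: ate pirogi

    @dataclass
    class Sentence:
        subject: Noun
        predicate: VerbPhrase           # a verb phrase inside a sentence

    # "Turducken" recursion: a type nested inside itself.
    @dataclass
    class Possessive:
        possessor: "Possessive | Noun"  # a noun phrase inside a noun phrase
        noun: str

    # Pawel ate pirogi: dumpling-borsht nesting only.
    s = Sentence(Noun("Pawel"), VerbPhrase("ate", Noun("pirogi")))

    # Pawel's mother's pirogi: the same type stuffed inside itself.
    np = Possessive(Possessive(Noun("Pawel"), "mother"), "pirogi")
    print(np)

The point of the sketch: Noun-in-VerbPhrase-in-Sentence nesting bottoms out by construction, while Possessive can contain itself indefinitely. That unbounded self-embedding is what makes turducken the flashier kind of recursion.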

Why does recursion matter to Chomsky? Well, one of the ways to think about what he is up to (and how I explain my work at dinner parties) is to pretend the brain is a kind of computer, like an iphone. (Sorry for mixing metaphors, computers with food. But, well, iphones are apples.) Obviously, we share lots of our brain hardware with other animals. But other animals apparently don’t have anything like human languages, not even the vocal, gregarious, communicative ones, not even primates formally schooled in sign languages by eager experimenters. In co-authored work, Chomsky has suggested that the crucial difference might be that our hardware at some point became capable of recursion.*

Which recursion does Everett think Pirahã lacks, and should Chomsky care? (And why all this talk of “putting inside”, rather than “putting next to”? E.g., why is Pawel’s “inside”, not “next to”, mother’s pirogi?)

Turduckens, iphones, and irrelevance

Everett says that there’s no turducken amongst the Pirahã. No sentence-in-sentence or noun-in-noun. You have duck (Pawel ate pirogi) and you have chicken (His mother wanted their sale), but you’re not getting to “ducken”, with one inside the other.

But what Everett, and the media, lack in turducken, they make up for in hoopla. Everett’s claim has been portrayed as a Chomsky-slayer of a fact.** In truth, though, the turducken hunt is a red herring.

My iphone tags my photos to say where I took them. It tells me what’s nearby (cafés, restaurants, museums, shops, ...). It shows me where I am on maps, how to get to where I want to go, and which of my friends are nearby. My mother never tags her photos, she never wants to know what’s around her, she never displays herself as a blip on a map, and, if she wants to know if you’re nearby, she calls. But that’s a fact about how my mother has set her iphone up: all location services are off. It doesn’t mean her iphone can’t provide that information. The computational capacity of our phones is the same; her configuration is just different.

If the Pirahã don’t have turducken sentences, that’s a fact about how their language is configured. It’s not a fact about their hardware. If you kidnapped a Pirahã child (a practice inflicted on numerous indigenous communities) and raised it speaking Portuguese—or, less horrifically, if you exposed it to enough Portuguese for it to grow up bilingual—you’d expect it to be just as capable of learning Portuguese as any other child it was raised with. When Chomsky is concerned with recursion, he is concerned with hardware. The claim is about what brains can learn, not what a particular brain has learned. A dearth of turduckens of the Amazon just doesn’t matter.

Six degrees of separation

Here’s a different path to the same conclusion. “Six degrees of separation”. It’s every Chomskian’s favourite game. You pick some humdrum language, like English, and, with just a tweak here and a tweak there, you get yourself up the Amazon without a turducken. The point is to show which exotic delicacies are just familiar fare in fancy sauce. Here’s a pertinent example inspired by the first paper I read in grad school.

What’s the difference between ask and wonder? A normal person will say it’s something about their meanings. Fair enough. But a linguist will say it’s about recursion. You can ask what the time is or ask the time. Both are fine. Not so with wonder: you might wonder what the time is, but you can’t wonder the time. Such differences are widespread and don’t appear to depend on meaning. After all, ask and inquire are near synonyms, but I can’t inquire the time. I’m limited to inquiring what the time is.
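
As a toy model of the linguist’s claim (a hypothetical three-verb lexicon, invented purely for illustration), selection is just a per-verb list of permitted complement types:

    # Hypothetical lexicon: each verb lists the complement categories it selects.
    # NP = a noun ("the time"); CP = a sentence-like question ("what the time is").
    LEXICON = {
        "ask":     {"NP", "CP"},  # ask the time / ask what the time is
        "inquire": {"CP"},        # *inquire the time / inquire what the time is
        "wonder":  {"CP"},        # *wonder the time / wonder what the time is
    }

    def licenses(verb, category):
        """Does this verb select a complement of the given category?"""
        return category in LEXICON.get(verb, set())

    print(licenses("ask", "NP"))      # True
    print(licenses("inquire", "NP"))  # False: near-synonyms, different selection

The tweak in the next paragraph then amounts to deleting “CP” from every entry.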

And now we tweak. Imagine we go, verb by verb, taking everything like ask and inquire and restricting it to combining with a noun (a “concealed” question, like the time), never with a sentence-like complement (like what the time is). By the time we’re done, English would be on its way to being Pirahã: verbs would no longer be the aperture through which you can turducken one sentence inside another.

Sure, there’d still be relative clauses (Pawel’s mother wanted to sell the pirogi that Pawel ate). But they depend on there being words like that. We could get rid of them too.

Yet, none of this would mean that “English-ish” speakers’ brains had become incapable of recursion. They’d just have turned off their iphones’ location services. No change in hardware, just change in use. So, again, finding a language without turducken recursion is, simply, irrelevant to deciding whether recursion is the crucial component of hardware that makes us computationally competent for language.

Back to borsht and dumplings

We have an expression in English, “to string words together”. This probably reflects what most people think sentences are. Words strung together. One of the major insights of early work by Chomsky & Co is that real generalizations about sentences aren’t phrased in terms of strings. To characterize what is a possible sentence in a language and how possible sentences are related to each other, you don’t talk about which bit follows which other bit. You talk about which two bits were combined first, and about which other bit their combination was combined with next, and so on. Language, in the computery, grammatical sense relevant to Chomsky, is not about strings; it’s about structures.
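
Here is the contrast in miniature (a toy notation of my own, not any theory’s):

    # The same sentence, first as a string, then as a structure.
    string = ["Pawel", "ate", "the", "pirogi"]

    # The structure records which bits combined first: "the" with "pirogi",
    # then "ate" with the result, then "Pawel" with that.
    structure = ("Pawel", ("ate", ("the", "pirogi")))

    def depth(node):
        """Count successive combinations, a property strings cannot express."""
        if isinstance(node, str):
            return 0
        return 1 + max(depth(child) for child in node)

    print(len(string))       # 4: all a string offers is which bit follows which
    print(depth(structure))  # 3: how the sentence was built, step by step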

If Pirahã’s lack of turducken recursion is irrelevant to Chomsky’s claims, can Everett make hoopla from dumpling-borsht recursion instead? Does Pirahã make us think that its sentences are not built up by this kind of recursion? Three things make me sceptical that Language: The Cultural Tool can show this.

First, the rhetoric. If Pirahã has such sentences, then it is remarkable for what it has, not for what is missing. It would have sentence types that can’t come from recursion, rather than merely not having certain types that do. Yet, all discussion I’ve seen has focused exclusively on what Pirahã lacks. Quite a U-turn.

Second, a point-by-point rebuttal of Everett’s earlier formulations has indicated where Pirahã is similar to Chinese, German, and Hebrew, amongst others. So, if Chinese and the rest don’t have sentence types irreconcilable with recursion, then Pirahã probably doesn’t either.

Third, it is no small task to show what a theory cannot do. For all I admire Everett for his post-missionary enlightenment, his time in the field, his advocacy of indigenous rights, and his desire to go beyond descriptive work and engage with the foundations of cognition—something I share—I have to recognize that the discussion of turduckens and iphones suggests that Everett isn’t the person to do this. The task requires good basic logic and an understanding of the fundamentals of the theories you’re trying to engage with. Turduckens and iphones may seem like silly metaphors, but they reveal, in familiar and concrete terms, Everett’s errors in logic (arguing from irrelevant data) and in understanding (the crucial distinction between linguistic hardware and its use in a given language). This does not bode well.

Ready for incursion

Not everyone is good at everything. My mother can’t use location services on her iphone, and, frankly, I’m not a fan of the thought of turducken. But my mother doesn’t preach the downfall of Apple and I don’t portray myself as an expert turduckenist.

There’s more to Language: The Cultural Tool than recursion (watch for future posts). But then why all this hoopla about its impact on Chomsky’s view of our mental hardware, if the main subject of the book in all likelihood has nothing of relevance to contribute? Having marshalled my thoughts, I will open the book with a heavy heart. I’ve seen a laudable fieldworker produce lamentable theory before.

**************************************

*So all you need to get Dostoyevski is a monkey brain and recursion? No. When Chomsky talks about language, he’s talking only about our computational hardware. Your hardware needs a set of concepts to compute over and, once there are enough people who can also compile their concepts into beautifully articulated thoughts, it’s pretty handy to have some way of getting thoughts out of your head and into theirs: communication—which potentially raises new computational questions, about flattening thoughts built by recursion into sequences sayable/signable one bit at a time. For the curious, I should add that I focus as much on the store of concepts as on building by recursion, if not more. So, I’m not as deeply involved in these issues as some.
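
To make the flattening point concrete, here is a toy continuation of the structure sketch from the main post (again my own notation, not anyone’s theory):

    def flatten(node):
        """Linearize a recursively built structure into a flat run of words."""
        if isinstance(node, str):
            return [node]
        words = []
        for child in node:
            words.extend(flatten(child))
        return words

    thought = ("Pawel", ("ate", ("the", "pirogi")))
    print(" ".join(flatten(thought)))  # Pawel ate the pirogi; the brackets are gone

    # Communication then poses the reverse problem: rebuilding the structure
    # from a string received one bit at a time.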

**There’s been debate about whether it is a fact at all: he previously found turducken in his neck of the Amazon, and there’s disagreement about whether he’s been able to reanalyze the examples at stake as separate sentences standing side by side.

46 comments:

  1. It is a nice post, and I very much agree with the main points. Some metaphors are unclear, though. To what extent is Chomsky concerned "with hardware"? Or better, to what extent is recursion hardware? I thought that the "hardware" part of the brain was the particular configuration of the neurons that makes cognition possible, and that cognition was on the "software" side—not "software" in the sense of "Windows" or "Excel", but perhaps in the sense of a programming language, or even deeper, the executable machine code to which the results of programming languages are compiled.

    1. To nitpick on your metaphor a bit, the "executable machine code" would just be the configuration of neurons - i.e. the hardware. The brain is not a computer in the same sense as your desktop, where there's a processor and memory registers, and the processor executes instructions. The brain is rather a massive state machine. The distinction between "executable machine code" and "the hardware" is not as clear as it is in a PC. My own opinion is that Chomsky is properly concerned with language as software - as a program in a high-level programming language - but that he is aware that it can only be studied as manifest in the brain - i.e. in light of the reality that it is "compiled" into a particular configuration of physical neurons. It is in this latter sense that he speaks of our "biological endowment," and in the former sense that he speaks of the need to distinguish between competence (the program) and performance (realities of the machine that may or may not be related to the program). The fact that it's sometimes hard to decide whether Chomsky is talking about a program or a hardware configuration is an inevitable consequence of the fact that there is no clean separation between the hardware and the software in the kind of computer the brain is.

    2. True, but, taking the Fortran/Algol comparison below, I think it makes some sense to say that the brain's 'firmware' is like an Algol interpreter, on which the Pirahã have chosen to run a program that would also run on a Fortran interpreter (interpreters feel like a less bad metaphor than compilers).

    3. I was using the hardware/software distinction sloppily. It would have been better to talk of what iphones (or humans) are capable of when you take them out of the box (or womb). Location services (or turducken recursion) are there to be used but nothing forces their uptake.

  2. Pirahã clearly has b&d recursion, since it has NP structure and multi-NP sentences, but the lack of NP-within-NP recursion is I think worse for UG than you portray, since the suppression of prenominal possessor NP recursion in German is too messy to be handled with a simple on-off parameter, and why should such a thing exist anyway? It's an 'epicycle', suppressing what looks like a basic consequence of the theory that if you can say "John's canoe" and "John's brother", you ought to be able to say "John's brother's canoe".

    1. Off the top of my head, there are two reasons I can think of as to why a language might have John’s canoe and John’s brother but not John’s brother’s canoe.

      (1) Maybe the possessors are prepositional (or applicatives), not DP’s with genitive case. So, the two licit structures combine PP with NP (dumpling in borscht), but the illicit structure embeds PP in PP (bird in bird, like turducken) and it might be a lexical property of Pirahã P’s that they can’t select other P’s.

      (2) There may be an issue with case. In recursive possessives in Scottish Gaelic, for instance, only the lowest possessor has genitive case; all the others have nominative. So, distinct case assignment strategies are available crosslinguistically for possessors of possessors. If the availability of any one given strategy depends on a particular parametric setting, then the availability of none means that the language in question has none of the parameter settings in question.

      Both of these are UG-compatible lines of explanation, and there are probably more. This isn’t an area I’ve ever thought about. So, the hypotheses aren’t at my fingertips.

    2. I'm having a problem with (1) - I'd expect the PP to contain an NP that could then contain a PP. And, with both (1) and (2), there's reliance on invisible machinery that would presumably require some fancy parameter-setting triggering to get it to work.
      (shifted from where it got posted by mistake a few days ago)

      The basic phenomenon appears to be that possessors don't branch (Adj+N and Det+N don't occur either, just checked with Everett), so a 'less UG' alternative is that they are just N, not NP, learned that way because it fits the data better (stuff you'd expect people to say if they had an NP in that position doesn't get said).

    3. Oh, so it’s the same kind of restriction as on Mohawk noun incorporation (as opposed to Tiwa, which incorporates modified nouns and compounds). If so, then still no boat rocking, just variation along usual lines. (I wonder whether it could be a prosodic constraint. Again, not my area of expertise.)

    4. Not quite usual, due to the lack of simple monoclausal alternative ways of formulating it. But indeed possibly not so amazing from the point of view of formal syntax, although there's still the question of how people learn these sorts of restrictions.

    5. Agreed. For all the flak that Bayesian approaches get from theoretical linguists, I quite like them as a means of evaluating different i-grammars. Take incorporation. Suppose the child has “figured out” that objects sometimes huddle next to the verb. Given what it has already learned about its ambient language, the child will have several means of generating object-next-to-verb structures. It favours the one or ones that generate outputs most closely matched to its inputs. If one yields a very exact match, then all rival grammars are deactivated, leading to a restriction on production. Just a hunch, but that’s how I think about things informally. Not what is usually meant by evaluation metrics, I think. Let me know if you have any hunches.
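
      To put the hunch in toy computational terms (invented numbers and two made-up rival grammars, nothing like a real acquisition model):

        import math

        # Each rival grammar assigns probabilities to the outputs it can generate.
        GRAMMARS = {
            "restricted":   {"saw buffalo": 0.5, "buffalo saw": 0.5},
            "unrestricted": {"saw buffalo": 0.25, "buffalo saw": 0.25,
                             "saw the buffalo": 0.25, "the buffalo saw": 0.25},
        }

        def log_likelihood(grammar, corpus):
            """How closely a grammar's outputs match what the child hears."""
            return sum(math.log(grammar.get(utterance, 1e-9)) for utterance in corpus)

        # A child who only ever hears bare objects huddled next to the verb:
        corpus = ["saw buffalo"] * 10

        for name, grammar in GRAMMARS.items():
            print(name, round(log_likelihood(grammar, corpus), 2))
        # restricted -6.93, unrestricted -13.86: the grammar that wastes no
        # probability on unheard outputs wins, and its rivals are deactivated.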

    6. My hunch is pretty close to that: intuitively, one of the things learners do is try to match the statistics of how meanings get expressed. So if you learn only indefinitely interpretable objects expressed next to the verb with intonation that goes with single words, you learn that incorporation is restricted to generic/novel entities rather than things you've already been introduced to, since the restricted grammar gives you fewer ways of expressing things like 'I saw the buffalo'.

      I had fantasized about a Bayesian notion of fit whereby the data was pairs of the form (U,M) and the learner is trying to balance the complexity of the grammar against the probability that the utterance will be U given that the meaning is M, but some discussion with more statistically knowledgeable people indicates that I've got a lot to learn before I can formulate this properly.

      Psychologists would probably also complain about the idea of children figuring out all the different ways that their current grammar could express the meanings of the stuff they hear.

    7. "Off the top of my head, there are two reasons I can think of as to why a language might have John’s canoe and John’s brother but not John’s brother’s canoe."

      This is exactly what happens in Slavic languages. You can say John’s brother’s canoe, but you have to use another construct - not the possessive adjective (they can be formed only of a single noun) but a "genitive chain". From POSS + N you have to switch to N + GEN + GEN.

  3. Just to be clear: a Turducken is a chicken in a duck in a turkey, not another chicken. (I think what you're describing would be a "Chiducken", not a Turducken.) So, a real turducken isn't actually recursive in the way you're describing.

    1. Ah, I should have been clearer. I was thinking of turduckens as bird-in-bird-in-bird recursion.

  4. I'm not sure the iphone metaphor is really that appropriate. What the linguist who examines a language in fact does is more like a piece of reverse engineering: we found a device, and by studying what it can and cannot do, we try to get a picture of its internal structure. Suppose we in fact found no evidence at all of there being a gps-receiver inside the device. That might mean either of the following:
    (a) there is no gps-receiver in the device, or
    (b) there is one, but for some reason the device does not use it in any way, so that we do not detect it.
    Same with absence of recursion: suppose we don't find recursion in a language, this might be interpreted as providing evidence for (a) or (b):
    (a) the language has no recursion, or
    (b) the language has the possibility for using recursion but does not make use of it.
    While granting that (b) is a theoretical possibility, I must confess that I find it extremely unlikely, and if we were indeed to find such a language (which, by the way, I do not believe we have), the much more plausible conclusion to my mind would be (a).

    By this, I don't want to be understood as saying that I'm convinced by Everett's argument for the absence of recursion in Piraha. I actually think Nevins, Pesetsky & Rodrigues make quite a convincing case that Piraha does have recursion (in fact, a simple possessive nominal of the type [[Bill]'s canoe] already has NP-recursion). But I don't think the discovery of a language without recursion, should we ever find one, can be dismissed as irrelevant by the proponents of UG (which I consider myself to be).

    1. So far, the only recursion that Pirahã might lack is turducken recursion (though this is debated for negated verbs of saying: “I don’t order you to make arrows” means pretty well the opposite of “I don’t order you” plus the imperative “Make arrows!”); and this might be a simple lexical fact (about selection). If the rest of the language looks like dumpling-borsht recursion, then that’s recursion enough for me (and UG). As you’re suggesting, I think, it would be genuinely fascinating to find a real string language, one that demonstrably could not be thought of as borscht and dumplings.

  5. It seems to me that there are four different questions that are being conflated.

    Question 1 is whether humans are "genetically programmed" to be good at using grammars. I think that everyone agrees that the answer to this is "Yes, humans are natural grammarians."

    Question 2 is whether we're programmed to be good at using Chomsky Type 2 (recursive) grammars.

    Question 3 is whether all human languages use recursive grammars (or better).

    Question 4 is whether any conceivable human language must necessarily use a recursive grammar (or better) - whether or not any _actual_ human grammar is non-recursive.

    It's not clear that the answers to questions 2 through 4 need logically be the same. For example, my best personal guess, for whatever that's worth, is:

    2. yes, humans are naturally good at recursive grammars
    3. Pirahã might be a counter-example of a non-recursive human language, but the jury is still out.
    4. However, even if Pirahã does have some elements that are recursive, I believe that it's sufficient proof that non-recursive human languages could perfectly well exist.

    1. I don’t think a plausible case has been made yet to justify the “if” bit of point 4, i.e., that we can start an argument with the premise “even if Pirahã does have some elements that are recursive”: so far, everything looks recursive, just with a scant supply of turduckens. Check out NPR and let me know if you disagree, but it seems to me that Pirahã doesn’t present any structures unlike those seen in more familiar languages. Indeed, the constellation of grammatical/lexical properties that Everett draws attention to looks similar to constellations found in Australian languages, which have been happily and productively studied by generative linguists since the earliest days of the discipline.

  6. I'd also like to point out that it wouldn't have occurred to anyone until fairly recently to call b&d recursion 'recursion' at all, since the capacity to produce a phrase of type X inside of one of type X was for a long time considered to be an especially striking feature of human language, and is what programmars would call recursion, as introduced in Algol but not found in Fortran, which had no stacks for parameters and return locations. The current Chomskian use is an innovation, and possibly not a very well conceived one, for the purposes of communicating with people in related fields.

    1. Is “programmars” a really cool typo, or a word I don’t know?

      You may be right about what’s innovative and what’s not. But innovation happens. And words get used different ways in different disciplines.

      Oh, are you suggesting that Everett has fallen victim to misunderstanding because he applied some ”foreign” definition of recursion? I don’t see that as much of a defence. Once people have gotten used to the idea that words have theory- or field-specific meanings, they should know well enough to check whether they need to change meanings when they change fields.

    2. Typo, unfortunately. Poking around a bit in sources, I find that Everett's concept of recursion seems to me to be identical to the one in Carnie's 2007 textbook, where it gets a reasonable amount of airtime as an important fact about language, so I am still puzzled as to why people have given him so much grief about it.

    3. This comment has been removed by the author.

    4. Ooh, really? I don’t know the book. But, as a QMer, I, of course, recommend my colleague’s Core Syntax. Maybe if Carnie were spending as much time and energy on the error as Everett is, their grief quotients would begin to equalize.

    5. Error? It is the original usage of the term, in linguistics, as well as neighboring fields (and not everybody doing syntax buys into the MP - C himself presents it as highly speculative, or at least used to). The difference between situations where you need a return stack vs those where you don't is still interesting, regardless of how it is treated in detail.

    6. Yep, error. I once met a Hindu who was all buoyed by the fact that Einstein endorsed “cosmic religion”, a term that she also used to describe her beliefs. She thought she was of one mind with a big scientist, or that science had given (her variety of) Hinduism the seal of approval. But she was way off: Einstein’s cosmic religion is quite different from what she believed in (which he rather disparaged). Hinduism’s use of the term surely predates Einstein’s. But the mistake was hers. Same term, different meaning.

    7. In case it's helpful here, Dan has said in a number of places that his original use of the term recursion and the arguments he deployed in using it were not well matched. He was confused between rule and system recursion. See, for example, his 2006 reply to criticism in Current Anthropology. There are also still unanswered questions about modality, given his Principle of Cultural Immediacy.

  7. Chomsky is an intellect far beyond the prosaic.
    However, when he Emailed me to say that he "Couldn’t" watch me wee video, because of “Time restrictions,”
    [It's 01:11 in duration]
    I disagreed with his statement and added
    “It’s not because of time you do not watch but rather that it ain’t Jesuit approved.”
    Here it (“Hey! Hey! We’re The Humans”) is, if’in ye care to view.
    http://www.youtube.com/watch?v=2LubuSAgB5s

    SOGS,
    Tor

    1. I wonder how he reacted to the phrase “Jesuit approved”…

  8. Daniel, I have tried to email you about this but your MIT Alum email address is bouncing. Eugenie

  9. Nice post, I think you explain nicely why this book won't convert that many Chomskians. A question, though:

    'I ask myself what the argument would need to look like to make me reevaluate where my research is headed and why'

    So what would it need to look like?

    1. Formal linguistics, to my mind, is about atoms and algorithms. I tend to concentrate primarily on the atoms (features) and secondarily on the algorithms (that structure and interpret them). Others concentrate much more on the algorithms. So, a general question, for me, is how much of my work is dissociable from generative linguistics and belongs to (internalist) cognitive science in general. Probably a lot.

      But suppose you’re more interested in the algorithms, as Chomsky is. The position put forward in Hauser, Chomsky & Fitch suggests that the properties of human languages will follow from two things: what’s in “narrow syntax”, and what the systems that narrow syntax has to work with (gains its inputs from, feeds its output to) force its inputs and outputs to look like.

      Consequently, we’re looking at a program of research, not a single theory. There are two ways to kill off programs: give them enough rope to hang themselves, or show that the whole enterprise was ill conceived to begin with. Either approach would be enough to convince me to massively reconceive my work. It’s impossible to say in advance precisely what shape those arguments will take though, as creating them will take originality.

    2. Both approaches have been tried, both in the early days of generative grammar and more recently. Early whole-program critics were mostly philosophers. My background, prior to linguistics, was in (mathematics and) philosophy, and I was won over from the philosophical side precisely because I found that the arguments on the generative side were stronger and the program, more promising.

      Recent whole-program critiques have often been more computational (taking Bayesian methods from what strikes me as a legitimate place—the evaluation of rival grammars during acquisition—and trying to show that they are enough in and of themselves). They misreckon, though, both the facts and the generative accounts of them, frequently ignoring past findings about linguistic structure and repeating errors of reasoning that, to my mind, died in the pages of Aspects of the Theory of Syntax.

      Of course, you can never argue that a convincing whole-program critique won’t come along. It would be nice, though, for such critiques to take into account what’s actually known (Searle’s, for instance, struck me as absolutely devastating and, as soon as I find someone who does what he devastated, I’ll let them know of their devastation; I can’t see that what he was talking about, though, had anything to do with what I do).

      For the rope-to-hang strategy to work, you let people who believe in the program get on with it and then you come back a few hundred thousand “thought hours” (link) later. If no progress has been made, then the program has failed. Personally, I don’t think that this, this and here) and I’ve published elsewhere about Evans and Levinson’s (here). In my assessment, these critiques make obvious mistakes in how they represent what they’re attacking and how they handle their own data. If the best attacks are logically flawed, maybe logic is on the side of what they’re attacking.

      Even if these attacks are all wrong, I find them valuable. They make me return to the philosophical fundamentals of what I’m doing and that, in turn, makes me try to ground my work directly on to its conceptual foundations.

    3. Gesplunn. Some text went missing in the second last paragraph. Oh well. Let me know if you want me to reinvent it…

  10. I don't think it's on to say Everett is guilty of elementary logical howlers or couldn't spot recursion if it walked up to him in a bar. That's a merely ad hominem attack, for one thing, and recursion is not a very difficult idea to understand, for another. It's pretty much the first thing you'd check if you were a researcher in Everett's position. Hopefully, Everett's data will be independently checked in the near future and there'll be a consensus one way or the other.

    For what it's worth, I don't think the elementary logical howler criticism will stick. You think the details of Piraha don't bear on the veracity of Chomsky's position because Chomsky is taking a position about language faculties (UG, language "hardware"), whereas the purported refutation concerns the structure of a specific language ("software"). That's rewriting history a little.

    If I know my history correctly, UG was advocated as giving the best explanation of the existence of linguistic universals (and other things, like the so-called 'poverty of stimulus' argument). It seems from my outsider's perspective that the list of linguistic universals has pretty much shrunk to just recursion as Chomsky has tinkered with his theory over the years. It only takes one counter-example to show that something is not universal, so if Everett has discovered a language without recursion, then there may be no linguistic universals for UG to explain.

    It may well be that there is a section of the human brain that specifically deals with recursive structures, and nothing in the Piraha case could show there isn't (assuming Piraha children can learn Portuguese), but we sure as hell don't need to postulate UG to posit a centre for recursive thinking. The battle was always whether languages showed the need for a language-specific faculty in the brain as opposed to general intelligence and general principles of learning. If there are no universals, as the Piraha case, if correct, appears to show, then the case for UG is to that extent somewhat undermined (there could be other arguments for UG, like the poverty of stimulus one mentioned, although I think that particular one has been discredited by people like Geoffrey Pullum and, especially, Geoffrey Sampson).

  11. Thanks for your comments. Good thing I didn’t say ”Everett is guilty of elementary logical howlers or couldn’t spot recursion if it walked up to him in a bar” then. I agree that you’d expect definitions of recursion to be the first thing a researcher in this area would check. When I wrote the post, I veered towards the view that the errors more likely arose from press coverage than from Everett himself, and I’m pretty sure that my tone is more “ad mediam” than ad hominem. But since the original posting, a number of people who were at Everett’s early presentations of his material (at the Linguistics Association of Great Britain, and then, later, a special workshop on recursion) have come away absolutely convinced that what you and I take to be basic groundwork, viz. definition checking, had not been done. It looks like you and I might both have been wrong: you for a misplaced confidence, me for misplaced caution.

    But the substantial point you raise concerns whether I might be rewriting linguistic history. The history isn’t my concern, though. The science is. And, to be frank, I find the paragraph “If I know my history correctly…” to bear no relation to what I think of as sensible research. It repeats several well-worn errors.

    In brief, yes, surface universals are killed off by a single counterexample. But generative grammar isn’t about finding surface universals. The whole program would proceed perfectly happily if there weren't a single one. (UG isn’t Greenbergian typology, which is, indeed, a Titanic that can be sunk by a lone iceberg.) The universals that UG is concerned with are universals of the computational system. Those universals haven’t ”shrunk” to recursion (unless all scientific success is to be characterized as “shrinkage”: the Einsteinian shrinkage of physics overshrunk the Newtonian shrinkage). Rather, there has been an attempt, compelling in my view, to reduce various statements about linguistic structure to just two things: (a) properties of the modules that syntax must interact with, and (b) recursion [though not of the turducken type]. If Pirahã structure violates all aspects of (a) and (b)—and this includes the structure dependence of semantic interpretation—then it’s problematic. But if it’s just a dearth of turduckens, then it’s all hoopla: no serious theory of UG guarantees turduckens.

    1. (I tried posting a reply to this earlier, but it's gone AWOL...)

      History may not be important but begging the question should be. Linguistic universals were first put forward to sell the need for dedicated linguistic hardware, UG. Now you, as a Chomskian in the broad sense (generative grammarian etc), say that Everett's contentions are irrelevant because they only relate to properties of particular languages ('language software'). They don't show there isn't recursion in the language hardware, because now Chomsky no longer makes predictions about which aspects of actual human languages are determined by the essential structure of UG. Even Everett agrees that the absence of recursion in Piraha (if it is absent) wouldn't show their brains are incapable of recursion or recursive thought. He even says they do have recursion in their stories, so obviously their 'hardware' supports recursion. But you beg the earlier question against Everett when you say his results are irrelevant. They are NOT irrelevant to the original question of whether we need dedicated linguistic hardware as opposed to all-purpose intelligence. If he's right, then one of the original arguments for UG falls by the wayside. That's what makes it relevant.

      Of course, your own work might not be affected. You could just say "assuming we have a dedicated language module, ..." and carry on as before.

  12. Completely agree with Daniel. UG was never about giving an explanation for linguistic universals of the Greenbergian type (although these may provide evidence for the nature of UG). And the Poverty of Stimulus arguments are as sharp as they ever were (see the recent Berwick et al. paper in Cognitive Science 2011). There is no explanation of the structure dependence of syntactic rules that doesn't assume domain-specific unacquired information.

    1. I find structure dependence convincing for UG as not learned (even if it could be learned, the typological restrictions on apparent string- rather than structure-based operations (some cases of P2 clitics) seem too strict for it to be plausible that it actually is learned).

      But not convincing for UG as task or species specific, since structure dependence could plausibly be a general constraint on operations involving structured representations, which might be useful for many animals getting around in their environments. For example, Marr's 1982 ideas about shape recognition involved a lot of hierarchical structure.

  13. I agree the poverty of stimulus arguments are as sharp as they ever were. Structure dependence is parsimoniously explained in Geoff Sampson's application of Herbert Simon's work on complexity. A hierarchical structure is the overwhelmingly statistically likely outcome of a process of historical development, as the evolution of language from protolanguage presumably was. See Geoffrey Sampson, Making Sense, Oxford University Press, 1980: 133-165.

    1. I suspect that progress in machine learning theory makes arguments from typology more compelling than the classic learnability arguments.

      So, structure dependence can be learned, but the apparently extreme restrictions on which string-but-not-structure dependent operations show up in languages (basically, sandhi-allomorphy such as a/an, and positioning of postpositives after first words (words only, not multi-word constituents, which are best handled by putting them in utterance-initial discourse positions in front of the postpositive, cf. Legate's work on Warlpiri)) are not plausibly explained by simple conservative inheritance from an original language.

  14. Actually, I'm not too distant from you on the issue of hierarchicality of structure, and nor is Chomsky. The issue is the dependence of grammatical rules (or dependencies) on structure. It is perfectly possible to define non-structure-dependent rules on structured representations (say, rules dependent on linear order, pace P2 clitics). So something has to be said about why grammatical rules (or, if you like, dependency relations within structures which have concomitant semantic dependencies) are dependent on structure, not linearity. That is at least task specific. The question about species specificity is, I agree, moot, and if there's something species specific about recursion, it's probably about how structures are connected to interpretation, not about the building of structure per se (cf. birdsong).

  15. The interesting recursion/language claim is not that recursion is possible, but that human(like) language is not possible without recursion. Piraha is therefore not beside the point - either 1) recursion is not essential to Piraha, or 2) Piraha is not a human(like) language, or 3) the claim is false.

  16. Advances in applied turducken theory: http://www.youtube.com/watch?v=pjrI91J6jOw

  17. This is an excellent post, and one of the best I have read while seeking some sanity on the web after reading Everett's 'Language: The Cultural Tool'.

    Frankly I could find no evidence reported in Everett's book that language could be understood as a 'tool' (he doesn't define the term properly, but we assume it means some kind of solution arbitrarily created to solve a problem of survival/existence). The whole approach of searching for a language which appears to contain no recursion in its spoken form in order to disprove that there is any innate recursive (or other) language capability is hard to comprehend.

    If we understand 'language' as a 'serialised form of thinking structures' (which seems unavoidable to me), we see what has long been recognised: recursive structures at various levels (sentence, story, ...). It seems an unavoidable conclusion that there is a more powerful capacity in humans than in other animals to create, remember and use recursive structures in order to 'think' (i.e. plan, design, communicate...). It's also a fact that humans have an advanced capacity to serialise these structures into language, which is a form that can be communicated (to self, others); many such languages have similar abstract grammatical complexity, including un/loosely related ones. It seems further likely, based on evidence, that these two things are neurally or biologically related, since they only appear in humans at anything more than the very primitive level found in animals.

    So the first basic objection is to the premise that finding a language that appears to have no/limited recursion in its grammar proves that there is no innate recursive language capability in these humans. What else could the Piraha language be evidence for?

    Inference to the best explanation from a number of possible theories would surely lead to a more obvious conclusion like: Pirahas have the same innate structural thinking and linguistic capacity as other humans, but simply don't need or want to use a lot of it in their timeless detached environment. This may be partly due to their specific philosophy of life as well as exterior circumstances (since some other Amazon languages show more complexity).

    I would say that even if your basic thesis was that 'language is a tool', the Piraha language is evidence of the simplicity of the mental models that are satisfactory for leading the existence of the Piraha life, not that humans are generally lacking an innate language capability.

    There are numerous completely unscientific statements in the book which I won't quote here, and I can't help but suspect that Everett's emotional attachment to the Pirahas, their culture, and his life's work on their language (an attachment that I find entirely reasonable) makes him quite subjective in his theorising.

    It's certainly miles away from anything that would make me think that humans simply constructed the computational capability of language (representation, serialisation, parsing, ...) to meet the needs of survival and living in the same way we built roads and carts to solve the needs of transport.

    On the other hand, when we view thinking and expressive capabilities of humans in terms of computational complexity, the concept of Universal Grammar doesn't seem out of place; more likely it is a misnomer for a generalised computational capacity that clearly includes something that acts like recursion, parameterisation, and other features Chomsky-bashers love to hate. Whatever it's called, it's hard to see how we can argue that there is not an innate recursive computational thinking and linguistic capability present in humans.
