Oddly enough, past usage is one of the best arguments against prescriptivist proclamations of right and wrong. We find that the "aks" pronunciation of ask has an unbroken lineage from African American Vernacular English through Southern American English back to Old English. Nouns can be verbed all kinds of ways and often have been, if you check the OED. Being aware and in control of your verbal 'appearance' is useful, particularly in formal situations. However, smugly deriding the grammar or spelling of someone who has communicated with practical clarity is a despicable and misguided sport.
The phrase "verbal appearance" is genius and really summarizes my feelings on grammar. One shouldn't dismiss a person because of his appearance, but one would be wise to consider that he's more likely to be taken seriously if his appearance is up to par. The idea applies just as well to grammar.
Only if you take "appearance is up to par" to mean "fitting in with the crowd." Otherwise you're swapping objective judgments about grammar for objective judgments about clothes and grooming, and begging the question.
Inasmuch as the crowd you choose to fit in with can correlate with other behaviors, it may be perfectly reasonable to hold biases based on appearance (verbal or otherwise).
When push comes to shove and you need to actually decide what your assumption is, you're going to assume something; it may as well be the assumption with the best possible shot at being accurate.
Even Jeff Foxworthy would often joke that he wouldn't want to be operated on by someone who talks like he does.
One point that most grammar discussions seem to miss is that grammar is a model of how people communicate, not the real thing. When a construction is labeled "ungrammatical," then, it could be, as the prescriptivists would have us believe, that the construction is indeed poor at communicating the intended meaning and side-tones. But it could also be that the construction is perfectly effective and the model has simply failed to capture that aspect of reality.
A large part of rhetoric, for example, is breaking grammar for effect.
Those words are powerful, but not because they form a comma splice. That is, their power does not derive from failing the model's rules. Rather, the model's rules fail to capture that those words are powerful. (Or even allowed.)
Again, grammar is a model. A model that people don't actually use to communicate with one another. Nobody hears words and constructs a parse tree to figure out what they mean. So what's encoded in the model (or not) has little bearing on the effect that the words actually have.
Change the model all you want. Nobody will hear any differently.
EDIT: P.S. When I wrote that rhetoric was "breaking grammar for effect," I didn't mean to imply that the effect of rhetoric came from breaking grammar. Rather, I meant that the focus in rhetoric is on the effect and, because grammar fails to admit many effective phrasings, rhetoric necessarily "breaks grammar" from the outset.
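To make the "model vs. reality" point concrete, here's a minimal sketch in Python using NLTK (the toy grammar and the test sentence are mine, purely for illustration): a grammar with no rule for comma splices simply cannot describe a comma-spliced sentence, however effective that sentence is.

    import nltk

    # A toy grammar that models only simple one-clause sentences.
    toy_grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> 'I'
        VP -> 'came' | 'saw' | 'conquered'
    """)
    parser = nltk.ChartParser(toy_grammar)

    # The model admits "I came" and yields its parse tree.
    for tree in parser.parse(['I', 'came']):
        print(tree)  # (S (NP I) (VP came))

    # The comma-spliced "I came, I saw, I conquered" contains a token
    # (',') the grammar has no rule for, so the parser rejects it:
    # the model fails the sentence, not the other way around.
    try:
        list(parser.parse('I came , I saw , I conquered'.split()))
    except ValueError as err:
        print(err)  # Grammar does not cover some of the input words: ...

The model's failure here says nothing about whether the sentence communicates; it only says the model is too small.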
Not the best illustration, as there's no such thing as a comma splice in Latin (or most other languages, actually, including English a century or two ago). There is however a rhetorical term for the construction: asyndeton.
Can you help me understand why you believe that my example was a poor choice?
I offered a sentence in English and made two claims: (1) it was powerful, and (2) it was not admitted by the grammars that govern such constructions. I thought these were necessary and sufficient to make my point.
Do you believe that either of these claims is false (or somehow depends on what's true or false for Latin)?
"Not the best illustration" isn't equivalent to "poor choice." Caesar isn't a bad example, per se, just not the best one. I'm simply saying (re: "A large part of rhetoric, for example, is breaking grammar for effect") that an example from a language where no grammatical rule has been broken isn't the most apropos. You may have given the sentence in English, but everyone knows its provenance.
Here's one from English, in honor of the upcoming holiday:
Go back to Mississippi, go back to Alabama, go back to Georgia, go back to Louisiana, go back to the slums and ghettos of our northern cities, knowing that somehow this situation can and will be changed.
As for your claims, (2) is contentious. Some grammars rail against the evils of the comma splice; others acknowledge its place and provide more nuanced guidelines for proper use. It's never been a cut-and-dried issue. I personally disagree with (1) for English while agreeing with it for Latin, but that is neither here nor there.
Thanks for your feedback. I think I now understand your argument and, while I have a hard time believing the claims upon which it rests, I believe that reasonable people could believe them.
My claim wasn't that there was no rhyme or reason to human communication but rather that grammar does not capture that rhyme and reason, just a small part of it.
That is, there is no "rule set" in reality. Nevertheless, you can create a rule set that describes some of reality. Just don't start thinking it's the real thing.
How does descriptivism deal with people who are just plain wrong? To pick one example among many, a recent trend I've noticed is people who use "equivocate" when they really mean "equate". Should they be corrected, or should we simply accept that because a lot of people are doing it and we can understand from context what they really mean, it's now an appropriate usage? Where exactly is that line drawn?
In general you can find accepted and common usage in a dictionary. I doubt you will find this meaning of "equivocate" in any dictionary today.
Prescriptivists usually go beyond the dictionary to enforce ideas of language that they got using the problematic methods described in the linked article.
There is no line. To take another example: in British English, "alternate" does not mean a kind of alternative, whereas in American English this is now accepted usage. Is this wrong? Not really, not any more: the usage is accepted by a large enough group of speakers to have been normalised. Language often evolves through mistakes, particularly with words that sound similar (see "loose" versus "lose" for another example that will probably shift from error to accepted usage over time). So I imagine most descriptivists would simply refuse to accept that there is a clear line between error and evolution in language usage.
If they're understood, they're not wrong. If they're causing confusion (and you can understand them only because the rest of the sentence makes it clear - you would understand just as well if they had said "foo" rather than "equate") then that's worth questioning. The way we use many words now (condone and fulsome spring immediately to mind) would have reasonably been described as "plain wrong" twenty years ago.
Well, given context, they could use any word at all, even the opposite of the correct one, and could be understood. The problem is that it seems foolish to allow that the person is correct in using the incorrect word, or that the word in question now means the same as the correct word. Effectiveness is a continuum, too - would they have communicated more effectively by using the correct word? Then that word is preferable, though not necessary to impart the meaning in the greater context.
Not quite. Merely being understood isn't the only possible goal of human language. Usually, speakers also want to appear educated and reliable, and using grammatical rules that the largest possible portion of your audience agrees with is a good way to do that.
If neither usage (#1) nor logic (#2) is an acceptable authority, then language becomes an arbitrary construct subject to everyone's whims. As the author promptly proceeds to show with his unsupported assertion that "different than is perfectly fine".
But language _is_ an arbitrary construct subject wholly to societal agreement. That's why language evolves over time, why different dialects of a given language develop, etc.
Web standards were also initially an arbitrary social construct. That doesn't mean we want them to change every year according to the latest popular fancies.
He didn't attack arguments based on usage, just outdated usage. The point of syntax is to be understood, not to prove you are educated. I'd hope you know what I mean when I say 'different than'. Language is going to change no matter what; being a prescriptivist means leading a significantly more annoyed life.
> Last year's buzzwords are this year's forgotten fancies.
Some of them, yes. A great many are not. Taking your "last year" to be poetic rather than literal: -gate is cliche, "meme" is totally a thing now, "my bad" has diffused into informal communication all over Anglophonia, lolcat-speak still has everybody's great-aunt snorting at her Facebook wall, texting acronyms are well-known to everyone, word truncation + s is on its way to becoming totes ubiqs, etc.
Not to mention you're using a pretty useless definition of "outdated" if it includes literally everything.
"then language becomes an arbitrary construct subject to everyone's whims"
ding ding ding, you're starting to get it!
More seriously, language can be thought of as a middle ground between production of an external symbol as a realization of thought, and interpretation of that symbol in another mind.
Even using language adhering to the strictest of grammars, there can be room for misunderstanding in the mind of the interpreter as they will be decoding your symbol expressions through a lens of their own understanding. Adhering to the model you think best does not guarantee clarity of communication!
You adjust the production model you use in order to better suit the interpretation model your audience is using. If your use of language is skillful enough, you can even help the audience select from several competing language models in order to add nuance -- you can speak in a vernacular or in a formal mode.
What the author says is that "usage from years past" does not form some sort of authority. That is a different thing from today's usage, as described in dictionaries.
I defy you to stop being so tribalistic. I didn't take a position on either side of the "different than" controversy. All I said is that the author did not support his assertion.
While I agree with some of his ideas (despite being something of a prescriptivist when it comes to my own writing), they're not exactly "ill-founded" as much as "the author disagrees with them."
"language is not a logical system"
_Communication_ is not a logical system, but language is.
> "language is not a logical system" -_communication_ is not a logical system, but language is.
Care to explain this?
From the NLP classes I've taken and the work I've done, I've always understood languages to be logical only in a very loose sense of the word. As far as I know, it's pretty hard to express the grammar of most practical human languages in formal logic.
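For what it's worth, here's the kind of tiny demonstration such classes lean on (Python with NLTK; the seven-rule grammar is a standard textbook toy, not any real model of English): even a grammar this small assigns two incompatible structures to one ordinary sentence, so the formal system underdetermines the meaning.

    import nltk

    # The classic PP-attachment ambiguity: did I use the telescope to
    # see the man, or was the man holding the telescope? The grammar
    # licenses both readings.
    g = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> 'I' | Det N | Det N PP
        VP -> V NP | V NP PP
        PP -> P NP
        Det -> 'the'
        N -> 'man' | 'telescope'
        V -> 'saw'
        P -> 'with'
    """)
    sentence = 'I saw the man with the telescope'.split()
    trees = list(nltk.ChartParser(g).parse(sentence))
    print(len(trees))  # 2 -- one sentence, two parse trees, two meanings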
Language is systematic communication, and while it's imperfect, it has basic logical underpinnings. The fact that some sentences and words work and others don't is proof enough of that. It's arbitrary, yes, like any artificial system, but it's a system nevertheless. Deviations from the system as we understand it ("effective" but not "right," as suggested above) help further define it. These are all just my personal ideas about it, though; it's not as if I've made a lifelong study of it.
I'm not sure what you mean. More specifically, what logical underpinnings does language have that communication doesn't? For me, communication is the transfer of ideas from one brain to another, and I would define language as the means of doing so. So you can't have one without the other.
Spoken and written language are one means of communicating ideas. I can communicate anger with my fist or love with a kiss (or vice versa). The subset of communication we call language is a systematization by which we can communicate specific ideas using a specific, mutually agreed-upon method. The method must be logical in its formation because otherwise it can't be sure it is understood by both parties.
But I don't see a kiss as not-language: it's body language. It still requires agreement between the parties on what it means, and it doesn't require any logic to be understood - just a common psychological framework plus similar cultural experiences. And same with the raised fist.
> The method must be logical in its formation because otherwise it can't be sure it is understood by both parties.
Again I don't know what you mean by "logical". Logic is universal and absolute and doesn't change. But while languages each have an internal consistency, they're all very different from each other, and each of them evolves quite drastically given enough time.
How could language have ever developed if it needed to be logical for it to be understood? Seems to me all you need is a bit of empathy and some dexterous body parts (fingers/arms, vocal cords/tongue) to build a real language starting from pointing and gesturing and grunting, none of which is at all systematic.
Over time, I think modern languages developed that internal consistency because their users needed to express more and more complicated ideas. If the idea I'm conveying to you has lots of actors and actions and nuanced imagery in it, things have to be orderly and highly patterned, or else I know that you will not understand. So we all mutually agreed on an arbitrary system for these things.
I think the problem nollidge is having with your assertion is that language is likely better described as "probabilistic" rather than "logical".
"The method must be logical in its formation because otherwise it can't be sure it is understood by both parties."
Isn't true in any description of language. You can't be 0/1 sure that language will be understood by both parties. You can only guess that the way you're producing communication symbols will probably be understood by the receiver. If the probabilities align, then communication has succeeded... but a 1:1 production-to-understanding ratio almost never happens.
Receivers of communication understand some percentage less than 100% every time, but their own internal models of the language let them fill in the gaps. As the producer, you hope they're filling those gaps with a model close enough to your own that everything works. For the most part it does; we're pretty good at it. But miscommunication still occurs dozens of times a day, every day.
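As a back-of-the-envelope illustration (the numbers are invented, just to show the shape of the math): even a high per-symbol decoding rate makes perfect end-to-end transfer rare once an utterance gets long, which is why gap-filling has to carry so much of the load.

    # If each of n symbols is decoded correctly with independent
    # probability p, a perfect 1:1 transfer has probability p ** n.
    p = 0.98  # assumed per-symbol success rate (made up for illustration)
    for n in (10, 50, 200):
        print(n, round(p ** n, 3))
    # -> 10 0.817, 50 0.364, 200 0.018
    # Listeners cope by filling the gaps from their own internal model.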
Linguists have been working on a formal syntax for natural language for more than a hundred years and still haven't succeeded completely. I'd say formal logic is a language, not the reverse. You have to take away a lot of what we do with language to get it to work as a logical system.