A Pernicious Influence?
I have a dilemma. Every year I teach students how to write. I teach them how to come up with a thesis for their essay; how to research it; how to make their arguments persuasive; how to arrange those arguments into a coherent structure; and, ultimately, how to form the sentences that convey those arguments. In teaching the last of these skills, I am forced to confront the thorny topic of writing styles. It is then that my dilemma arises.
Style guides commonly present students with lists of prescriptive rules. Following these rules is thought to promote good writing. You are probably familiar with such rules. I’m talking about things like ‘Don’t end a sentence with a preposition’, ‘don’t split infinitives’, ‘adopt the active voice’, ‘learn how to use apostrophes, semi-colons, colons (etc) properly’, ‘don’t use words that aren’t really words, such as “irregardless” or “amazeballs”’, ‘the word ‘hopefully’ should not be used as a sentence adverb’, ‘don’t use ‘presently’ to refer to the present moment; only use it to refer to something that will happen in the near future’ and so on. Hopefully, you get the idea; presently, we’ll see the problem. The problem is that I am an extreme non-prescriptivist when it comes to language usage. I don’t believe there is a ‘rule’ against using split infinitives, ending sentences with prepositions, or any of the other commonly listed offences. I fully embrace the non-prescriptivist view of language promoted by the likes of David Crystal, Steven Pinker, Geoffrey K. Pullum, and Oliver Kamm. I think the rules just alluded to are really nothing more than (fussy) aesthetic preferences, and that the English language consists in a number of overlapping and equally valid styles.
And yet, when it comes to grading student essays, I often find my inner prescriptivist creeping to the surface. I don’t like it if students use idioms such as ‘It don’t seem no good’ or ‘it was proper good’. I rail against students who misspell words, put punctuation marks in the wrong place, adopt colloquial or slang terms, and generally fail to adhere to the conventions of Standard English. But am I entitled to do this? If there is no hard and fast set of rules to be followed, if English really consists in a number of equally valid styles, how can I complain when my students don’t conform to my preferences? This is my dilemma.
I was recently grappling with this dilemma and it occurred to me that there are some interesting philosophical issues at play. I decided it was possible to justify my quasi-prescriptivist attitude, but to do so I first needed to isolate and understand the competing metaphysical and ethical views of language that underlay my dilemma. Once I did this, I could better explain why a certain amount of prescriptivism is justified. I’m going to try to share the fruits of this analysis in this blogpost.
I am hoping that this doesn’t come across as a rant. But there is a danger that it will since I do find some of the classic rules to be absurd and my frustration with them may well show through.
1. The Metaphysics of Language
Metaphysics is my first port of call. I think the debate about language usage is best understood as a war between two competing metaphysical views of language. That is to say, two competing views of what language is (throughout this discussion I focus on the English language but I presume the same can be said for most other languages), where these in turn dictate particular ethics of language usage (i.e. sets of views on how we ought to speak and write).
The proponents of the two competing views can be given names. I’ll call them sticklers and pragmatists:
Sticklers: Have a legislative or Platonic view of language. They think language consists in rules relating to semantics, grammar and spelling that are either set down by appropriate authorities (the legislative view) or are intrinsic to the language itself (the Platonic view). This dictates a deontological approach to the ethics of usage. You simply must follow the rules in order to speak or write properly. This is sometimes accompanied by a consequentialist ethic, which is largely focused on conservative values such as preserving a dominant national identity and preventing the pollution of the language by ethnic groups or the lower classes (to be clear: I don’t wish to tar all sticklers with this conservative ethos — it is just that it is sometimes present).
Pragmatists: Have a conventional and evolutionary view of language. They think language consists in a set of constantly shifting and evolving conventions governing semantics, grammar and spelling. This dictates a consequentialist approach to the ethics of usage. This ethic takes different forms, some focusing on achieving a communicative aim and others more political in nature (such as resisting the conservative ethos and celebrating linguistic diversity). Pragmatists can be pure act-consequentialists — that is to say: they can decide which conventions to follow based solely on what is best in a particular communicative act; or they can be more like rule-consequentialists — that is to say: they can follow a set of default conventions because doing so leads to the best outcome on average.
Although I am here imagining that sticklers and pragmatists fall into two distinct ‘camps’, the reality is likely to be more complex. It is more likely that the labels ‘stickler’ and ‘pragmatist’ define a spectrum of possible views. This spectrum filters into the teaching of styles. In the diagram below, I illustrate a spectrum of possible learning outcomes for the teaching of writing styles. The spectrum ranges from ‘Stickler Heaven’ at one end, to ‘Pragmatist’s Anarchy’ at the other. I don’t want my students to end up in Stickler Heaven, but I don’t want them to end up in Pragmatist’s Anarchy either. I need to stake out some middle ground (Pragmatist’s Paradise) and explain why they should join me there.
2. Why the Sticklers are Wrong
As a first step toward staking out that middle ground, I need to explain why the sticklers’ approach to language is wrong. I do so with two arguments. The first tries to illustrate why the legislative/Platonic conception of language is false (and, contrariwise, why the conventional and evolutionary view is correct). The second tries to argue that adopting the deontic ethic has unwelcome consequences. Of course, if you have fully imbibed the sticklers’ deontic Kool-Aid, you may be unswayed by such consequentialist reasoning, but I doubt many people will have fully imbibed the Kool-Aid. In ethical debates, people often resort to consequentialist reasoning when following a deontic rule would lead to a horrendous outcome. And while I do not promise horrendous outcomes, I think the outcomes to which I appeal will be sufficient to persuade most people that the deontic ethic is inappropriate.
Let’s focus on the first argument: why the legislative/Platonic view of language is wrong. To some, this will simply be obvious: English is not governed by a legislative authority, and the rules of language are not Platonic entities like (say) the rules of mathematics. We don’t discover eternal truths about sentence structure and word meaning; the truths, such as they are, are clearly the result of contingent, messy, cultural evolution.* This can be easily demonstrated by focusing on the history of some of the sticklers’ favourite so-called rules. These histories illustrate how what the sticklers take to be ironclad rules are in fact produced by historical accidents. I’ll give a few examples. An excellent source for the historical evolution of usage rules is David Crystal’s book The Fight for English:
Orthographic Conventions: Orthography refers to how words appear on the printed page. Remember, language began as a spoken medium. Words were conveyed through phonemes (i.e. sound structures) not through written symbols. Many words share the same phonemes (i.e. they are pronounced in the same way), even if they have distinct meanings. Listeners are usually able to tell the intended meaning from the context, or by simply asking the speaker follow-up questions. Things were different once writing took hold. Conventions had to be adopted so that different meanings could be discriminated. But these conventions emerged gradually and messily. One classic illustration of this concerns the use of the apostrophe. Conventions emerged in which the apostrophe was used to signal an abbreviation (as in ‘don’t’) or possession (as in ‘greengrocer’s’). But these conventions clashed in some cases, most famously in the distinction between it’s and its. The former is an abbreviation of ‘it is’ (or ‘it has’) whereas the latter is a possessive form of ‘it’. There is no logic to this distinction. It is a purely arbitrary compromise that emerged because of the awkward evolutionary history of the apostrophe. In this sense it is akin to biological evolutionary accidents like the recurrent laryngeal nerve of the giraffe. I could list numerous other examples of orthographic evolution but I won’t. Just read any book from the 1700s and 1800s and you’ll see how orthographic conventions have changed over the course of relatively recent history.
The Split Infinitive Rule: This is a famous stickler preoccupation. The belief is that it is somehow wrong to say things like ‘to faithfully uphold’ or ‘to boldly go’ because in these cases an adverb (faithfully/boldly) is being used to break up the infinitive form of a verb (to uphold/to go). Crystal notes that this rule only seems to have entered English grammar books in the 19th century and was an example of Latin reasoning (i.e. the belief that English should copy the conventions of Latin), which has been popular at various times over the history of the English language. In other words, it originated in the 1800s as a particular manifestation of a recurrent cultural fad. For some bizarre reason, fealty to the rule lingers and, as Pinker argues, may even have been responsible for Chief Justice John Roberts’s bungled administration of the presidential oath of office to Barack Obama back in 2009. This is bizarre because, as many have pointed out, the English language doesn’t really have an infinitive form of the verb. Instead, it has a subordinator (‘to’) combined with a simple form of the verb (‘uphold/go’): the infinitive is already split. Good writers have routinely and consistently breached the ‘rule’, much to the chagrin of the sticklers. It is odd that some continue to insist upon it.
Concern about ‘Proper Words’: One of the strangest of all stickler beliefs is that there is a fixed font of words, and that some so-called words aren’t really words and so shouldn’t be used. Examples include words like ‘irregardless’ or ‘gotten’ (to name but a few). This betrays a misunderstanding of how language works. Nothing illustrates the historical and conventional nature of language more clearly than the passing in and out of existence of words (read Shakespeare to see some famous examples of this). We need new words to explain new phenomena (‘selfie’, ‘googling’ etc.), and we abandon old words when they are no longer needed. The only standard for whether something counts as a word is whether it is being widely used and has become conventionally understood. So, of course, ‘irregardless’ and ‘gotten’ are words. They are widely used and conventionally understood. You may not like them, but they are words irregardless of what you might like.
The conventionality of language is also illustrated by syntactic rules. In the case of English, it is common to adopt a subject-verb-object order (e.g. ‘John saw the dog’). But in other languages different orders are common. For example, Japanese commonly adopts a subject-object-verb order (i.e., roughly equivalent to ‘John the dog saw’). Both syntactical structures seem ‘normal’ to their relevant communities.
So much for the sticklers’ metaphysics. What about their ethics? Even if you accept that language is a messy nest of conventions, you might nevertheless think that we ought to follow certain rules lest we wander into pragmatic anarchy. I agree with this to an extent (as I’ll explain below). It’s probably not a good thing to constantly invent your own new words, or ditch the traditional orthographic rules, but I still think it is a mistake to adopt the deontic attitude of the sticklers. This is for two reasons. First, the rules that are beloved by the sticklers are often barriers to good communication. Second, the deontic attitude seems to encourage an overly moralistic approach to the teaching of style.
There are several examples of how following the sticklers’ rules creates barriers to good communication. Take the split infinitive rule. Sticklers would have you believe that Captain Kirk should say ‘[boldly to go] or [to go boldly] where no man has gone before’ instead of ‘to boldly go where no man has gone before’. But the latter seems preferable to the former. Not just because the phrase has become deeply embedded in the popular psyche, but because the adverb is supposed to modify the verb: it is a particular attitude toward going somewhere that Kirk is invoking. It makes sense to stick the adverb in front of the verb. Similarly, the oft-quoted rule about writing in the active voice can be an impediment to good communication. Use of the active voice directs the reader’s attention to the doer of an action (John kicked the dog), but oftentimes you will want to direct their attention to the done-to (The dog was kicked by John). If you rigidly stick to the rule, you will make your prose more difficult to follow.
As for the moralising attitude, it is present in passages like this (from Lynne Truss’s Eats, Shoots & Leaves):
If the word does not stand for ‘it is’ or ‘it has’ then what you require is ‘its’. This is extremely easy to grasp. Getting your itses mixed up is the greatest solecism in the world of punctuation… If you still persist in writing, ‘Good food at it’s best’, you deserve to be struck by lightning, hacked up on the spot and buried in an unmarked grave.
I know that Truss’s tongue was firmly in cheek when writing this. But similar pronouncements are found throughout the work of the sticklers (Oliver Kamm’s book Accidence Will Happen documents many examples of the tendency). And even if this doesn’t always end with hacked-up bodies in unmarked graves, it does seem to end with a sneering condescension towards the idiots who just can’t get it right. I don’t think such an attitude is becoming in an educator.
3. Pragmatic Prescriptivism?
Where does that leave us? It leaves us with the pragmatic approach to style. We cannot plausibly conceive of language as a legislative or Platonic phenomenon. We must conceive of it as a conventional and evolutionary phenomenon. What’s more, we must recognise that there isn’t one set of agreed-upon conventions. If there were, we might be warranted in favouring a form of Stickler Heaven. But there isn’t. There are, instead, shifting and sometimes competing conventional systems. In certain contexts, it is conventional to use non-Standard spellings and idioms. If you are texting your friends, you can say things like ‘gr8’ or ‘c ya later’ (although, ironically, this seems less common now that there are fewer restrictions on message length). If you are hanging out with your friends, it might be conventional to say things like ‘proper good innit!’ or ‘I’m well jel!’ or ‘I didn’t do nothing’. But if you are writing an academic essay…
…Here’s where I come back to my dilemma. When writing an academic essay, I think students probably should adopt a fairly traditional, so-called ‘Standard’ style of expression. This means they should probably avoid slang, non-Standard spellings, unusual punctuation and so forth. They should also probably master the different meanings of ‘enormity’, ‘meretricious’ and ‘disinterested’, and learn to put apostrophes in the conventional places. But why should they do this? If there is no right or wrong — if, as Pinker says, when it comes to English the lunatics are literally (or should that be figuratively?) running the asylum — then why can’t they mix and match conventional styles?
This is where the pragmatist’s consequentialist ethic kicks in. I think all pragmatists should adopt the following ‘master’ principle of style:
Pragmatist’s Master Principle of Style: One’s writing (or speaking) style should always be dictated by one’s communicative goals, i.e. what one is trying to achieve and who one is trying to achieve it with.
In the academic context, students are (in effect) trying to impress their teachers. They are trying to show that they understand the concepts and arguments which have been presented in class. They are trying to demonstrate that they have done an adequate amount of reading and research. They are, above all else, trying to defend a persuasive thesis/argument. What’s more, they are trying to do this for someone who isn’t sure that they are capable of it. As I say to my students, ‘you might know what you are talking about, and I might know what you are talking about, but I don’t know that you know what you are talking about — you need to convince me that you do’. The style they adopt should be dictated by those communicative goals.
This means that, in most cases, they should adopt a traditional and Standard style of expression. There are two main reasons for this. First, this is the style that dominates academia, and adopting it eases communication. Students have to do a lot to convince me that they know what they are talking about. They won’t help their cause if they adopt countless neologisms and non-Standard idioms. It will put me in a bad mood. I’ll have to work that much harder to understand what they are saying. Second, adopting that style allows students to earn acceptance and respect within the relevant academic community. Certain conventions may be absurd or ridiculous, but it is easier to break them once you have earned respect. Oliver Kamm gives the example of the actress Emma Thompson, who urged a teenage audience to avoid overuse of ‘like’ and ‘innit’ because ‘it makes you sound stupid and you’re not stupid’. This feels right. It is not that students are genuinely stupid for adopting non-Standard styles; it is that they will be perceived to be so and that, in most cases, is not a good thing. There is a pragmatic case for some forms of linguistic snobbery.
That said, there are no hard-and-fast rules. This is one of the discomfiting features of the pragmatic approach to language. We can’t fall back into the reassuring embrace of ironclad prescriptivism. Some academic styles are maddeningly opaque; it would probably be a good thing to break with their conventions. Sometimes a bit of slang can liven up an otherwise staid piece of prose. Sometimes you have to coin a new word or misappropriate an old one to label a new concept. You have to exercise wisdom and discernment, not blind faith in a set of rules. This takes time and practice.
I have only one rule: the more you read and write, the easier it becomes.
* As far as I am aware, there may be a Chomskyan linguistic theory that does favour a quasi-Platonic view of language structures. But this arises at a very abstract level, not at the level of particular languages, nor at the level of style. Such Chomskyans would, I am sure, accept that there are many contingent cultural variations in semantics, orthography and preferred idioms.