Language and Politics

I remember it like yesterday—the day I first encountered pronoun mania. It was in London, the late Seventies, at a student party in the philosophy department of University College London, where I used to teach. A female (girl, woman) student told me she had enjoyed our tutorial on the analysis of knowledge but had one question for me: why did I keep saying “he” when I discussed analyses of knowledge? Was I forgetting that women (girls) often know things too? This struck me as a bizarre criticism, but I addressed myself to it, making what I thought were rather obvious points. No, I wasn’t forgetting that, but merely availing myself of the conventional means of expressing generality in the English language. To my surprise I made little headway with this newly minted zealot, who had no doubt heard this “criticism” from a female professor, or possibly read it in a feminist tract. I had been a feminist since the Sixties and needed no conversion, but this business with pronouns struck me as extreme, unhelpful, and perverse. Little did I know it would become feminist orthodoxy across the globe, taken as self-evidently correct.[1] I never imagined it would become a political touchstone, an article of faith and righteousness, a model for other politically motivated linguistic measures. I could not have foreseen that this seemingly mild piece of speech reform would become the seed of the ascendancy of right-wing politics—the reason left-leaning politicians fail to be elected, the reason the Democratic party in America lost the support of non-college-educated voters. Allow me to explain.

I will be blunt: the thinking behind the student’s question stems from a misguided and wacky theory of the relation between thought and language. We are familiar with this theory under the name “the Sapir-Whorf hypothesis”, but it is actually a general theory to the effect that language shapes (determines, constitutes) thought (knowledge, emotion, intention). Your mind is held to be a product of the language you speak; it has no other form. Conceptual schemes are composed of words. And words limit what your mind can grasp, biasing it, distorting it. Thus, when you say “he” to express generality you are tacitly assuming that everyone you are talking about is male. When I formulated a Gettier case using “he” I was actually thinking or assuming that all knowers are male. I was excluding female knowers from my thinking. The pronoun “he” is a masculine pronoun, so my meaning was masculine, so I must be thinking that knowers are always male, or at least making that assumption. If I retort that I was not thinking that and did not mean that, I will be sternly reminded that I am violating the Sapir-Whorf hypothesis, or some variant of it. But this is a terrible theory of the relation between conventional linguistic meaning and speaker meaning—between what words mean in themselves and what speakers mean in using them to perform speech acts. The word “he” has a masculine meaning, but when I used that word, I was expressing my speaker meaning, which was not marked masculine—I meant “he or she”, in effect. More generally, it is not the case that spoken language shapes or determines thought, even what is meant by speakers when they speak. I won’t go into the reasons for saying this; my point is that the objection raised by the student depends on a contestable theory of the relation between thought and language. 
I would argue that this false theory reflects behaviorist assumptions that deny the reality of the inner: language is what is outer and observable, so it must constitute whatever is real in our mental talk. This theory in turn happens to suit a capitalist ideology that views the human being as essentially a machine with no inner life, a useful tool in the production process—so that no ethical implications arise for the exploitation of workers in factories and the like. But that is another question; again, my point is that the theory has complex relations to economic and political positions. It is not a self-evident truth. In fact, it is pretty wild and implausible when impartially considered. It is characteristic of a range of far-out theories developed and popularized by social scientists (sic) working in universities and passed on to the general population. It is a philosophical theory, in the broad sense—speculative, controversial, almost certainly false. Think Freud and Skinner, Armstrong and Ryle. All the contortions engaged in by the grammar police involving “he” and “she” ultimately rely on a dubious theory invented by academics. This theory floats around universities and infects the minds of the educationally impressionable, eventually transmogrifying into politics and policy. But—and this is the practical political point—it is not absorbed by people who didn’t go to college and are never exposed to the theories prevalent there (and which change with the seasons). These theories are the province of the semi-educated—those with college degrees but not advanced degrees, roughly. Freud and Skinner were eventually demolished, but not before entering the minds of people unable to critically evaluate them for themselves. Similarly for the theories behind the pronoun mania that has swept campuses, boardrooms, and public services. 
Theoretically speaking, the pronouns of natural language are devices proper to language as a formal system; they are not determinative of the very structure of human thought. There is thus no need to reform ordinary grammar in order to protect thought from malign influences; we just need to recognize the distinction between linguistic meaning and speaker meaning. Words are our tools; we are not their puppets. We don’t need to police language in order to make our minds politically perfect. The concept of an “ideal language” is misguided and unnecessary. Sociologically, people who have never been exposed to these false theories are not influenced by them; and they regard them, correctly, as strange and unconvincing. They will therefore be disinclined to vote for politicians who espouse them. They will think, in short, that they are bullshit, and they are not wrong to think that. If a political party becomes strongly associated with such theories, or their practical applications, it will lose support among the non-college-educated population. The pronoun mania will drive them away from a party that indulges in it. And this is likely to carry over to good theories too, because they will be tarred with the same brush by the skeptical electorate. The party that opposes all that shaky theory-mongering will gain ascendancy—it will become the party of “common sense”. Does any of this sound familiar?

The linguistic theory of thought is not the only academic theory that has captured the minds of impressionable people, usually young people. But it is a particularly powerful example of the general phenomenon: pronoun reform has led many zealots to believe that they have here an undeniable victory against the old guard. This set the tone for further theoretical incursions into politics. The patriarchy had seeped into our native language and needed to be rooted out if political reforms were to be achieved (often laudable enough in their own right). An academic theory thus led to policy recommendations. Are there other examples that mimic the pronoun paradigm? They are not far to seek. Consider the push for “diversity”. Suppose you believe (as many do) that truth is relative; there is no such thing as “absolute” or “objective” truth. You have had this drummed into you by your professors, and anyway it sounds vaguely egalitarian to you. Then you will be primed to accept that diversity is a good thing: if there is no such thing as a single objective truth, shouldn’t we encourage a plurality of viewpoints on “the truth”? Let’s gather the many truths that people accept: these will be best found by assembling a diverse group of people. Thus, “diversity hires”. The trouble with this line of reasoning is that the motivating theory is terrible: truth is not relative. Again, I’m not going to argue the matter here—I am making a political point. Those who accept the underlying theory will find it self-evident that diversity is a value to be promoted, while those who have never got the relativist memo will be perplexed by the urge towards diversity. The latter don’t think truth is relative, so there is no need to hire a diverse group of people to teach a bunch of relative truths. All this will seem like mumbo-jumbo to them—and I think they are right so to think. They will not be inclined to vote for a political party that advocates such mumbo of the jumbo. 
That party will become, in their minds, the party of the wacky, the nonsensical, the phony. College graduates may be more tolerant of such fantasies, given their educational background, but the rest of the electorate will not be taken in. Maybe there are counterintuitive theories that should be accepted (I am thinking of sound economic theories), but the tendency will be to suspect any political party that traffics in silly-sounding theories. The same is true of all the recent talk of “power imbalances”. Again, this comes from theoreticians in the weaker areas of the humanities (e.g., Foucault)—the idea that “power imbalances” imperil “agency”. There cannot be voluntary liaisons where there are asymmetries of power, it is held. This is utter rubbish, but it can be made to seem plausible by choosing certain kinds of example and keeping the language abstract and abstruse. Impressionable minds are easily manipulated by this kind of sophistry. But to those who have never been subjected to such “teaching” it will seem preposterous, pretentious, and plain stupid. How can people who mouth this kind of crap be trusted with running the economy? That is what people will think; and they will desert the party that promotes it as superior virtue. Nonsense does not win elections, especially the higher type of nonsense, the type found in universities. Not that everything taught in universities is nonsense, but some of it is—and it leaks into political policy and rhetoric. Plain language, plainly spoken, is what is needed to win elections.

What are the implications for democracy of these reflections? If the college-educated part of the electorate supports a particular party, that party is likely to champion intellectual ideas and theories originating in universities. These ideas will almost certainly include dubious theories, often absurd theories, which are then applied to practical issues. This will alienate the non-college-educated part of the electorate, leading them to fall into the hands of the more untheoretical and philistine populist party. Thus, depending on the proportions of educated and non-educated voters, the former kind of party will find itself unelectable. It will need to purge itself of these theories, at least as elements of the party platform. Do not defend abortion as justified by women’s “bodily autonomy” (“a woman has the right to control her own body”). Do not insist on strict pronoun rules or punish deviations from the approved norm. Do not speak of “safe spaces” or “power imbalances” or use any jargon invented by humanities professors. It’s asking for trouble. Keep ideological feminism out of it. Never talk of “defunding the police”. Don’t tell people how to pronounce unfamiliar words. Avoid obscure and unmemorable acronyms like “LGBTQ”. Don’t use phrases like “critical race theory” even if the denoted theory is perfectly sound. Above all, never import academic theories of dubious credentials into policy discussions. If in doubt, consult an expert in the field at issue, especially when there is academic controversy, which there nearly always is. The generalizing use of “he” should never have been stigmatized, looked down upon, viewed as a sign of moral illiteracy. This kind of attitude could destroy democracy.[2]

[1] Some years later I wrote an article on the analysis of knowledge for an American publication. I used “he” a good deal. When it came back to me the copy-editor had re-written the whole thing to fit the new pronoun orthodoxy, with many a cumbersome paraphrase and lumpy grammar—without my permission. That is how entrenched and taken-as-gospel it had become. I should have made a fuss then, but I let it go. Now I see that it wasn’t just harmless pedantry in a good cause; it was the root of the linguistic political correctness that threatens to undermine left-wing politics. Nowadays it’s hard for me to declare myself a lefty liberal.

[2] I write this because of all the handwringing occasioned by the recent election of Donald J. Trump. What caused the American electorate to desert the Democratic party? No doubt many things, but I am focusing on one thing that is not usually mentioned, viz. bad ideas born in the academy migrating into practical politics. This is a vice of the party of the college-educated, leaving others cold. The left has clearly succumbed to theories invented by academics, mostly bad theories: postmodernism, social constructionism, linguistic idealism, feminist ideology, relativism of all kinds, bad psychology, etc. In a word, it has become pseudointellectual.

A Political Song

Were You Ever

 

Were you ever right

When you arrived that night

With your books and your guns

With your daughters and sons

 

Were you ever right

 

You landed and looked

You built and you cooked

You cut down and burned

You rampaged and spurned

 

You landed and looked

 

Did you do right

When you used all your might

To bring them in ships

To work in your fields

 

Did you do right

 

You bought and you sold

You made them grow old

You forced them to change

You locked them in chains

 

You bought and you sold

 

Are you sure it was good

When you imported more men

To toil under ground

Until soon they were dead

 

Are you sure it was good

 

Did you ever care

Do you feel it was fair

Did you really mean well

Did you really do well

 

Did you ever care

 

Was it part of the plan

To elect such a man

Is that what you need

To see them bleed

 

Was it part of the plan

 

Were you ever

Were you ever

That beacon of light

Did you ever

Did you ever

Do right

 

Did you ever

Did you ever

Ex-Friends

I have many ex-friends. Consider the case of Mark Rowlands: this is a person who I’ve known for forty years. I supervised him at Oxford; I asked him to contribute to a series I was editing on ethics; I brought him to Miami; I saw him every day for lunch when I was in the department; my wife was good friends with his wife. I have not set eyes on him in over ten years and I live close to him in Miami. I have tried many times to arrange a meeting with him. After giving me the runaround for years, he finally made it clear that no meeting was going to happen. No explanation was given. Or Otavio Bueno: I was instrumental in bringing him to Miami as a junior member of the department; I befriended him when he arrived; we were on good terms. That friendship is over. No explanation given. Some nasty emails exchanged. Told me I was not welcome on campus. Or Aimee Thomasson: I literally saved her life when she was in danger of drowning; she came to my house and I to hers; we had a good relationship. I reached out to her to discuss things, but got no response. She publicly took against me. I haven’t seen her or heard from her in over ten years. I could mention many others. You might want to ask these people to explain themselves, because I have no explanation from them. Cancellation can get very personal.[1]

[1] I should say that I have many academic friends who have not distanced themselves from me.

True Lies

Suppose you see John steal a cookie. He did it and you saw him do it. However, you don’t believe that John stole the cookie because John disguised himself as Jack. You believe, falsely, that Jack stole the cookie. As it happens, you don’t like John so you decide to lie and say that John did it. You therefore say, “I saw John steal the cookie”—while believing that Jack did. You intended to say something false but ended up saying something true. Did you tell a lie? If you did, it was a true lie. No false belief was produced in anyone by your assertion of John’s guilt, even though that was your intention. No injustice occurred because of your statement—the right person was punished. Still, is it true that you lied? I think not: there are no true lies—though there can be true attempted lies. You tried to lie, but you failed. You are guilty of attempted lying but not actual lying. The case is rather like an attempted murder when you aim at a wax effigy you mistake for a living person. You can’t actually murder unless you really kill someone, but you can attempt to murder without killing anyone. Similarly, you can attempt to lie but fail in the attempt—you didn’t actually lie. Incompetent failed lying isn’t lying. It is a necessary condition for lying that the proposition you put forward is false: “false lie” is redundant and “true lie” is contradictory. A very incompetent and unlucky would-be liar could go through life never telling a lie yet always attempting to. He could not be called a liar (he is a liar manqué). The same goes for perjury: someone might try to commit perjury but fail, because she accidentally tells the truth, contrary to intention. She is guilty of attempted perjury and may therefore be legally sanctioned accordingly, but technically there was no perjury, i.e., lying under oath. The same goes for truth-value gaps: if your statement is neither true nor false, it cannot be a lie. 
Suppose I try to lie about the state of the king of France’s head—I say “The king of France is bald”. That can’t be a lie because my statement was not false (if we follow Strawson). It is the same if I say something meaningless under the impression that I am making a meaningful statement: I have not said anything false, so I have not lied. Whether I lie is not completely up to me; it requires the cooperation of the world. It is not enough that I have a lying intention; I also need the right beliefs and the right linguistic vehicle. What I say has to be objectively false, but I may be wrong about this. Should we blame a person less if his attempted lie turns out not to be a lie? No: he had a lying will and that is what matters to blameworthiness. We should certainly not say “No harm, no foul”—no actual lie, so no blame attaching. Attempted murder is a crime, and attempted lying is unethical. You can’t plead innocence on the basis of factual error. Such cases very seldom arise (I have never even heard the possibility discussed), but the concept of lying seems clear enough—no falsehood, no lie. Lying is stating a falsehood with the intention to deceive, not merely intending to do that. What this shows, I think, is that the badness of a lie resides mainly in the falsity of what is said not in the speaker’s intention. If attempted lies never led to actual lies, i.e., false statements that lead to false beliefs, then we would not care much about the prevalence of would-be liars—they would be powerless to propagate false beliefs. Insincerity is not the problem; the problem is actual falsehood. We abominate lying because it leads to false belief, not because liars aim to produce false belief. If they failed in their aims, we would regard them as negligible cranks. There is nothing to fear in truth-telling (would-be) liars. Liars have to be able to make actually false statements in order to be a menace to society.

Fuck

The word “fuck” has multiple uses. The OED gives us two definitions: “have sex with” and “damage or ruin”. Thus, we have “fuck up”, “fuck about”, “fuck with”, “fuck all”, “fuck off”, “fuck you”, “fucked”, “what the fuck?”, “cluster fuck”, “mind fuck”, “fuck face”, “fuckable”, “fuck!”, and so on. The two definitions are opposed to each other: we don’t normally think that having sex with someone is damaging or ruining them; nor do we think that damaging or ruining someone is having sex with them. The word slides from positive to negative with remarkable ease. Generally, it connotes something not at all good. Its literal meaning has given way to an opposite conversational meaning. All very curious and no doubt indicative of deep psychic currents. However, I am not concerned with such psycholinguistic matters here; I am interested in promoting a new use for the word. I think it expresses our present political moment. Yesterday was a fuck. Tomorrow will be a fuck. The next four years will be a giant fuck. I started using the word this way while being treated for cancer—it was a fuck. Surgery is a fuck; so is radiation treatment—but immunotherapy isn’t much of a fuck (except expense-wise). This usage is particularly useful in the future tense: “That is going to be a fuck”. The meaning is roughly “deeply unpleasant, aggressive, and unavoidable”. Not many experiences qualify for this use of the word—a bad lunch or movie is not a fuck. It has to make an impact on the suffering psyche. It has to hurt; you have to grit your teeth. A visit to the dentist can be a fuck, though it need not be. Inflation is a fuck if it’s high enough. Being sued is a fuck. Is being cancelled? Not really. A fuck has to be extreme, ruinous, spectacular, life-altering. True, there can be minor fucks, like being towed or audited; but the concept is really designed for the big things. Being thrown in jail is definitely a fuck. 
The word must be used sparingly in this sense or else its impact will be debased. So, feel free to use the word in this way if the spirit takes you. I have the feeling we are going to need it.[1]

[1] Speaking of novel linguistic uses, I said to my son the other day (he is forty-five) that I wanted to go into an English country pub with him one afternoon and whisper “Phasers on stun”.

Meme Selection

What kind of selection applies to memes? According to my scheme, there are two kinds of selection: intentional and nomological.[1] Suppose the meme is a jingle: if it takes up residence in your mind, is that an intentional act? Not generally, since jingles usually repeat themselves against your will. You don’t choose to have a jingle running through your head all day. Maybe someone intended to put it there, but its ability to stick around is not a result of the recipient’s intentions. So, the selection must be of the nomological type—the jingle must gain a foothold in virtue of laws of nature (plus initial conditions). What are these laws? They are not physical laws, presumably, so they must be psychological laws. The relevant law might be this: catchy tunes tend to be remembered and rehearsed. The jingle acquires meme status in virtue of that law. Other laws of meme propagation might be more complex (consider fashions). There are psychological laws governing meme propagation and these are the selective agents in bestowing meme status (meme “survival”). Thus, meme selection is a case of nomological selection. This is a form of “natural” selection. So, meme selection belongs with body selection, as conceived by Darwin. This seems like a nice result. Stars, organisms, and memes all exist because of nomological selection—unlike selective breeding, works of art, machines, political systems, etc. Intentions and laws cover the whole field.[2]

[1] See my “The Selective Universe”.

[2] We now have a useful structure for the subject of selection science (including philosophy). This is the general theory as opposed to the special theory represented by animal breeding and Darwinian natural selection.

American Trump

Perhaps I see the current catastrophe differently from others. To me it reveals and magnifies the worst American traits: stupidity, credulity, nastiness, thirst for violence, love of bullshit, childishness, amorality. I have seen these traits manifested so many times in my thirty-five years here that Trump’s rise hardly surprises me. Only in America could this happen; as a Finnish friend of mine (Esa Saarinen) recently said to me, “Trump is impossible in Finland”. The same traits operate even in university settings: there is a bit of Trump in (nearly) everyone, even on the left. Oh, how they love to demonize and destroy! Careful thought is alien to their raucous cramped minds. But I will not expatiate further.[1]

[1] If you think I enjoyed writing that, you are very much mistaken.

Double Death

You might think there is nothing more to say about death. You might think death has been done to death. But I am here to report, from the depths of death studies, that death has a new wrinkle—it has a surprise up its sleeve. It turns out that every death is a double death. When a person dies two things die, not one. Both these things have value; we would like both to go on. But the death of one entails the death of the other—not necessarily, but contingently. Death has always been double, but it need not be; the two deaths are in principle detachable. This is good (but not very good).

What (on earth) am I talking about? The thought occurred to me while I was having a massage, so that I was focused on my body. The two things are the self and the body. The thought I had was that it would be nice if my body went on when I died. We are familiar with the thought that the self could survive the body—say, by transferring the brain to a new body. That would certainly be good; it would take most of the sting out of death, especially if the new body was a splendid specimen. The old body would die, but the old self would live on. But what about the converse? What if the self died but the body lived on? In particular, suppose the skills of the old body were preserved, especially the cool useful skills. Suppose you are a ballet dancer and your brain is in bad shape—it’s not going to make it. But you are told that the surgeons can keep your body alive along with your dancing skills, with a new brain installed in it. It will be up on stage again exercising the skills you have instilled in it. Realistically, this will require keeping alive and well your motor cortex and associated brain structures—but not you (not your frontal cortex etc.). Suppose, indeed, that they can preserve all your motor capacities, though not (alas) the person that is you—musical, athletic, balletic. It took years to acquire those capacities, and it is possible to retain them in your healthy body (pity about the dying self-brain). Wouldn’t this be better than the usual double death? Isn’t it some kind of compensation? We might describe it as the survival of the bodily self, if not the mental self. Instead of two deaths, there is only one. True, you might prefer things the other way round, because your mental self is more precious to you, but it would at least be something if your bodily self survived. Offered the choice, you would opt for the single death over the double death. This might be called half-death.[1]

Can we extend this idea to the mind? Suppose my mental abilities are located in a part of my brain separate from the part that houses my self: could we preserve the former part while losing the latter part? It doesn’t seem impossible. Then we could preserve Beethoven’s musical genius while not preserving him. Surely, that would be better than losing both. It would be possible for Beethoven to lose his musical genius because of brain deterioration while not losing his self, because they are separately located. Equally, it should be possible for the converse to happen; a new self could be conjoined with the old genius. If I were Beethoven, I would prefer this to losing both things. Could the same be done with memories—keep the memories, lose the self? That seems pushing it, because of the close connection between the self and memory; but maybe some kinds of memory could be retained in the absence of the old self, in which case a lot that is valuable could be preserved without survival of the self. This never happens as things are, but in principle it could. Double death leaves open the possibility of partial survival. Survival of the self is not the only thing that matters. Death is more discriminating (in principle) than we thought. Death need not always be total death. Death may come in degrees and types.[2]

[1] I can see a Ridley Scott sci-fi movie: Half-Death.

[2] When do we start dying? Is it when we reach adulthood? We don’t begin to die when we stop growing, do we? We start losing capacities quite early on, whittled away by the years. We are certainly beginning the dying process when we reach sixty normally—the journey to death has already begun. We become less and less young. We gradually start to fade away, sometimes passing through dementia and physical collapse. We don’t suddenly grow old. Maybe we die many deaths before the Big Death.
