Applause

I once gave a lecture in Finland on externalism and twin earth cases. At the end there was a very brief burst of applause, lasting no more than two seconds. Afterwards I said to my Finnish friend Esa Saarinen that they must not have liked it much given the brevity of the response. On the contrary, he replied, Finnish academics never applaud at the end of a lecture and this was the first time he had seen it happen in the Helsinki philosophy department; I should be flattered. Funny, I thought. Last night I was watching Bill Maher on HBO and found myself constantly irritated by the applause (from an American audience): it interrupted the flow of the conversation, was too prolonged, and too self-congratulatory. It also reduced a serious discussion to a branch of show business—the audience was enjoying the performance. Surely the expression of a moral position on a serious matter should not be greeted with clapping. No one should be applauded for stating the moral truth as they see it. Of course, in America everything is showbiz, performance, theater: but really! Should I be applauded for expressing my opposition to capital punishment or my belief in animal rights? I am not trying to entertain (that is not the illocutionary force of my speech acts). So, when is it right to applaud, and when is it wrong? The ethics of applause has not, I believe, been explored in moral philosophy, but someone has to do it.

A strict view (call it the Finnish doctrine) is that applause is only proper at theatrical performances—operas, concerts, plays, ballets, and the like. It is not proper at scientific conferences, philosophy talks, political speeches, commencement addresses, and the like. At such events, the intention is to convey truth, impart knowledge, not to entertain or amuse; they are not performances for which the performer should be congratulated. No one thinks that when a doctor gives a diagnosis, or a lawyer a legal opinion, they should be applauded. I think there is a lot to be said for the Finnish doctrine; I also think the Finnish preference for brevity is to be recommended. Nodding is fine in other contexts, or facial expressions of appreciation, but not this wretched clapping of the hands—so loud, so raucous. However, we might make an exception for things like graduation ceremonies and athletic victories—here we might allow some of that “putting your hands together”. Graduating, like singing an aria, is a type of achievement, unlike arguing a philosophical thesis or offering a moral judgment. I like the idea of telling someone you think they gave a good paper, but not slapping your hands together as if he or she just performed a double somersault. It’s debasing, levelling. We must resist the urge to reduce everything to a form of entertainment. The Finnish doctrine embodies this resistance: not all types of appreciation must take the form of that appropriate to an opera or rock concert. My own feeling is that ballet most warrants the response of applause, because of the degree of discipline and achievement that goes into it, closely followed by opera, then drum solos. It should be kept out of intellectual contexts or acceptance speeches or psychotherapy sessions.

One thing about applause that is deeply suspicious is that it is itself a type of performance for which applause might be appropriate. I applaud your applause for its vigor, loudness, sincerity, etc.—and you might in turn applaud my applause for your applause. I suspect this is what is going on with Bill Maher’s audience: they want to be applauded for their good judgment, right-thinking, and sheer loudness. People who applaud at ballet performances for minutes on end, till their palms are red and sore, are like this—look at my excellent taste! I think ten seconds is good enough for a sterling ballet dance. Whooping is the worst: what a dismal performance that is! So, don’t draw attention to yourself when you applaud—don’t perform the act of applauding. And don’t applaud at all if you don’t feel it: when deeply moved by an artistic work (e.g., a Shakespearean tragedy) clapping your hands together may be the last thing you want to do (the noise, the percussive blows). Applause has gotten out of hand; it needs to be handled more discreetly. I actually would like to see Finnish austerity extended across the board, at least for a period of time—no applause for anything, just inner appreciation. The performers could be apprised of this new policy and not take it amiss, relishing those appreciative looks and admiring whispers; there is really no need for all that routine racket and hand spanking. Isn’t it really a paltry substitute for genuine feeling, reflection, thought? Instead of quietly taking in what we have just witnessed, we launch into a frenzied cacophony. Why is that a good idea?[1]

[1] When I used to give papers, I would often be confronted with a wall of applause, quite long-lasting. I would think, “Yes, but did you agree with it?” In earlier years I used to perform drum solos and felt no dissonance at the applause (applause is a bit like drumming).

Science Without Language

Clearly, there could not be poems or novels or essays without language: these things consist of words arranged into sentences. Equally clearly, there could be athletic activities without language: football, high jumping, sprinting, badminton. People may talk as they engage in these activities, but they don’t consist in talking. Could there be art (painting, music) without language, or economic activity, or politics, or carpentry? Some may say no, but that is certainly not obvious (I don’t think these activities are necessarily language-dependent). Could there be science without language? That question is much murkier; it is like the question of whether there can be mathematics without language, or logic, or geography, or history. If we ask whether there can be science without thought (knowledge, belief), the answer is quickly returned: no, because science is scientific thought. Science can’t exist without minds for it to exist in, and propositional attitudes are its necessary vehicle. There is no such thing as scientific knowledge if there is no scientific belief. The sciences, as we have them, are bodies of belief or knowledge, so there can’t be the former without the latter.[1] In the same way, there can’t be science (as we have it) without mathematics or logic or observation or inferential reasoning. Viewed as a human cognitive structure, science consists of all these things, but centrally of thought: there is no such thing as thoughtless science. Then the question is whether scientific thought requires scientific language. Can there be science without symbols? Someone might say that there cannot because all thought requires language. Such a person might mean an internal language or an external language—a language of thought or a language of speech. The former view would imply the scientific necessity of language trivially; the latter view would imply it by making inner thought depend upon outer spoken language. I am concerned with the second question: so, we are asking whether science depends on speech. I don’t think all thought depends on speech, for reasons I won’t go into here; I am interested in the question of whether scientific thought in particular depends on speech. Is there anything specific to science that makes it essentially linguistic? Is it like poetry or is it like painting? Is the presence of spoken language necessary or contingent to the existence of scientific thought?

It might be replied, plausibly enough, that science, as we have it, does essentially involve language because of the existence of scientific communication: conferences, journals, conversations, books, letters, peer review. But that answer is superficial: why shouldn’t there be a Robinson Crusoe figure keenly interested in science and yet cut off from all scientific communication with other scientists? He has scientific thoughts (observational and theoretical) but he never talks about them with anyone. That seems perfectly possible; and isn’t the ordinary scientist in essentially this position when alone in his lab or study? Talking about science is subsequent to thinking about it, not a pre-condition of it. Any serious connection between science and language must cut deeper than this. It can’t be just that speech is the externalization of scientific thoughts, if the connection is to be of any significance. Art requires artistic materials in order that anything be made (e.g., pigments and sounds); is there anything about science that makes it require linguistic materials? Does it require, say, the existence of grammar in order to count as science? Granted, not all thought presupposes grammar (say, animal thought), but does scientific thought need the resources provided by grammar? The question is suggestive and appropriate, but the answer to it appears plainly in the negative. There could be science without syntax. What if we had evolved to the present time with the general intelligence we now possess but without ever acquiring spoken language? Language came along recently in human evolution, but our big brains were there all along; so, in principle, it looks possible for us to have developed scientific thought but not spoken about it (solitary Newtons and Darwins, say). And yet it seems funny to say that our present scientific knowledge owes nothing to language, though it may be difficult to identify what it is exactly. Our intuitions are pulling us in two directions: on the one hand, thought as such does not entail spoken language to express it; on the other hand, our scientific world-view seems steeped in language, and inconceivable without it.

The science of linguistics needs language, obviously, because it is about language. But most science is not about language but about the extralinguistic world. We can truthfully say that science requires more than just scientific beliefs: it requires observation, memory, and theory construction—but where does speech come in? Compare history and geography: they too require observation and memory, possibly also theory construction, but do they also require language? Well, look at a typical history or geography book—what do you see? You see sentences, dates, maps, but also names—names of people, names of places, names of movements. If you were to delete these names, you would be left with very little. The knowledge you acquire from these books is typically name-involving (Paris is the capital of France, say). True, you could have some historical or geographical knowledge without the introduction and use of names, but those subjects would be crippled without the apparatus of naming. Such knowledge is name-centric. Thus, if we ask whether history or geography requires language, the answer is yes—part of language, at least. You couldn’t be a languageless being and have our geographical knowledge, because that requires mastery of the practice of naming. Geography would never have got off the ground without naming as a pre-existing psycholinguistic achievement. It could exist in embryonic form but not in its current splendor. Huge amounts of history and geography are about names, in the sense that you have to be aware of what is involved in something being called by a certain name: these subjects are implicitly metalinguistic. One knows, for example, that the city called “Paris” is in the country called “France”. So, they resemble linguistics. Not every concept is like this: some concepts, and the thoughts they feature in, are not tacitly metalinguistic—color concepts, shape concepts, moral concepts, etc. But concepts like Paris and France are, so knowledge involving them is language-dependent. To be Paris is to be called “Paris” (but to be red or square is not to be called “red” or “square”).

The extension to science is obvious. Enormous tracts of scientific discourse consist of name-like expressions—labels, tags, designators, cognomens—and hence introduce a metalinguistic element. You can’t grasp the propositions expressed without understanding the practice of naming. You have to be name-competent, and hence a speaker. Without the use of names science would be crippled: just consider zoology and astronomy, to name but two sciences. The ability to name things, often using what are called “technical terms”, is critical to advanced science; that is why neologism is so common in science. The roots of naming no doubt trace back to the vernacular, but this resource is massively exploited in the sciences; and it is very useful there, because we often don’t know the nature of the things we wish to refer to—we need a nondescriptive label. If we could replace all names in scientific discourse with general descriptions, we could in principle dispense with language as an aid to scientific thought; but in practice that is impossible, so we are stuck with them. The result is that science cannot do without language, at least for limited beings such as ourselves. It can exist at a primitive level without language, but once we start to insert names into our scientific statements, we are introducing language into scientific thought. The answer to our question, then, is that much of science is language-dependent, though not all. Science as we have it requires language mastery for its possession.

This changes our picture of scientific theories. Empiricism pictures science as a congeries of experiences not essentially bound up with language—observations (not observation statements). That is the cognitive kernel of our scientific knowledge. But, according to the position here advanced, it is also a linguistic construction—a congeries of names (name-like expressions). Nor are theories “sets of propositions” that may be grasped by the nonlinguistic, but assemblages of words accessible only to the linguistically initiated. This is “nominalism” not empiricism (or even “propositionalism”).[2] Our scientific knowledge is constrained by our cognitive capacities (trivially), but these involve the apparatus of naming with its distinctive features. Scientists thus need to be speakers in order to do science in any meaningful way. Thinking isn’t enough (though necessary). This is why you have to learn the nomenclature of a science in order to do that science. It is the basis of categories and classification. A monograph entitled “Naming and Science” would be well named. Fortunately, we are prodigious name-users in ordinary life, so the cognitive demands of science are not too daunting. In all probability, we evolved this capacity as a tool of social interaction—we needed names for other people. So, the basis of scientific language is social psycholinguistics (in part). No human science without human society. We called each other by names and we then extended that practice to the rest of the universe (notice how personalized astronomical names are). Our intuition was therefore correct when it prompted us to declare science (partly) parasitic on language—without endorsing the strong and implausible claim that thought is always dependent on language. An adequate theory of scientific knowledge must include an account of scientific naming (a neglected field).[3]

[1] It will be noticed that I bypass completely such views as that scientific knowledge is a “language game” or a series of “texts”. That is an impoverished and misleading picture of what science is. In the first instance, scientific knowledge is a type of psychological state (competence, cognitive structure). So, there is no quick route from science to language.

[2] I don’t mean “nominalism” in its usual acceptation; my neologism is intended to capture the idea that scientific knowledge partly concerns what things are called, i.e., is metalinguistic.

[3] I often think that philosophy of science covers too narrow a range of topics. It has been overly influenced by logical positivism. This paper attempts to introduce a new topic into the subject.

Administrators

University administrators are rapidly becoming the most reviled people in America, and with good reason. When was the last time you heard of one making a good decision? It has always been thus, you say. But it is getting worse: atrocious decisions abound, heavy-handedness is the norm, authoritarian attitudes prevail, academic freedom is trampled. Clearly, these people are not up to the job (where were they taught, what kind of qualification do they have?). They don’t seem to understand the basic principles of fairness, proportionality, and common decency. They come across as thugs and fools, completely out of their depth. With this in mind, I would like to offer my professional services, for free: I am available to be consulted by any university administration struggling with its decision-making. I have had a lot of experience in this area, am a competent moral philosopher, and not a total idiot. You could do a lot worse than enlist my aid. In fact, I think I could save you from your worst blunders. So, call me, let’s talk.

Age

We have the wrong idea about age. We think too much in terms of bodily change and the passage of objective time. We can certainly talk about bodily age and temporal age, but we also need to recognize mental age—the age of a person’s mind. This may not correlate closely with the other two types of age: someone might be young psychologically but old physically and temporally, and vice versa. In particular, the growth and development of the body may not track the maturity of the mind. We tend to be fixated on the transition from pre-reproductive human to reproductive human—the period we call “adolescence”. This is a period of rapid growth and sexual maturation: the organism becomes capable of reproducing itself. This biologically important transition occurs close on the heels of what we call childhood: the mind of the child is not far behind, gradually transforming, as the body makes a sudden leap to reproductive maturity (this is true for animals as well as humans). Reproductive age is not psychological age. In principle, a baby could attain reproductive maturity, with the mind of a baby. This kind of age is not indicative of the age of the human (or animal) mind: the mind lags behind the body as time moves on. Obviously, too, the mere passage of time has nothing essential to do with the mental age of the organism: you are not mentally old just by living a long time, or mentally young by living a short time. The aging of the mind (its evolution or maturation) is an autonomous process, largely independent of the body and objective time. It is less publicly accessible, less measurable, less evident to the senses: we see the body age, we experience the passage of time, but we don’t see or experience the aging of the mind. That is something hidden. Yet it happens: people do change psychologically over the years, especially in the early years of life. I would say, roughly, that the child is more adult than we tend to think, and the adult is more childlike than we tend to think: the physical facts belie the psychological facts. Being big and hairy is not the same as mental maturity. But our language and senses don’t register the dynamics of psychological aging; they give us a misleading picture of true psychological aging. What does it consist in exactly? What concepts capture it most accurately? What are its characteristic phases and triggers?

Psychologists have tried to map the processes of childhood mental development (Freud, Piaget), articulating stages and laws, but I am concerned with the whole life-cycle. Still, they were right to stress the purely mental aspects of the maturation process. A theme that appears in much thinking about child development is decentering: the child comes to see itself as one being among many, thus achieving a degree of objectivity. It is a kind of epistemic maturing—from subjective (egocentric) to objective (impartial). Clearly, this has a lot to do with moral development (see Kohlberg). I prefer to say that psychological aging has everything to do with knowledge: a person’s psychological age is a matter of his or her state of knowledge. How much knowledge does the individual possess? What kind of knowledge? How was the knowledge acquired? How well founded is it? These are the kinds of questions that determine an individual’s mental age. It is generally supposed that (temporally) older people know more than younger people, and that is surely statistically correct; I am saying that this is constitutive of mental age. I would hazard the conjecture that people reach mental maturity around the age of forty (certainly not sooner), long after reaching the age of sexual maturity; up to that point they are still mental children (of varying stages of maturation). We can call this “cognitive age”: a given individual might be sexually immature, temporally immature, and cognitively mature—or other combinations. A common type is well past the age of sexual maturity, and far on in years, and yet childlike cognitively (with “the mind of a child”). In principle, it would be possible to track the stages that precede the age of cognitive maturity, spelling out their internal features, their triggers and pathologies: that would be the task of the whole-life developmental psychologist. We know very little about this as things stand, save anecdotally, but empirical work could be undertaken to establish the natural history of the individual mind—the distinctive phases, underlying principles, and individual variations. We might even be able to measure cognitive maturity by suitable tests (analogous to IQ tests). Each of us may be assigned three ages: years since birth, bodily development, and cognitive state. Someone might be 70 calendar years old, 50 in bodily years, and 80 in cognitive years, i.e., pretty old in temporal terms, middle-aged in terms of bodily condition, and advanced in mental terms. Age would be regarded in a more fine-grained and nuanced manner than it currently is. This would be fairer in all sorts of ways (it would make ageism a lot more difficult to sustain).

Let me put it intuitively and crudely: how old you are is determined by how much you know about the ways of the world. The idea is not that age is a matter of your amount of trivia knowledge, so that Jeopardy champions have the highest mental age. It’s about how much you know of relevance to your environment and life-style, including your social world. It will certainly include the capacity for sound judgment and careful thought (both essential even in jungle-dwelling tribes). This is not an elitist proposal in the pejorative sense. Knowledge can be practical as well as theoretical or book-learned. I also mean to include emotional maturity: this too will have a knowledge component. Mature emotions are regulated by rational thought, information, and openness to correction. So, a person cannot be fully mentally mature just by knowing a lot of facts while being emotionally juvenile; emotional age also matters to mental age. In fact, psychological maturity is largely about emotional maturity; the emotional and the cognitive are intimately connected. The point is that the mind grows and matures, reaching a sort of steady state (analogous to adult height); and this needs to be recognized in our thinking about aging. We should not be focused exclusively on merely corporeal or calendar considerations; indeed, mental age is really the central fact of aging. We should be as obsessed with it as we are with our calendar age or bodily age—more so. Have I reached mental maturity yet? Am I getting more juvenile mentally? How do I keep myself young in mind? Are there areas in which I am still mentally immature, or callow, or childish? Do I know too little about things that really matter? Am I sometimes hopelessly puerile in my judgments and opinions?[1]

[1] People look in the mirror to see how they have aged (I include children growing up as well as more senior people), but there is no mirror for mental age. We cannot immediately ascertain the impact of years on our mental age. But memory affords a way to gauge the effects of time on the mind and its propensities: how did I used to think and feel about things? And how does my mind differ from the minds of older and younger people? We are not completely closed off from knowledge of our mental age.

Universities

I am reading Mary McCarthy’s 1951 novel The Groves of Academe. It is a marvelous satire on university politics and pretensions, centering on one Henry Mulcahy, unjustly fired from his post. What is astonishing is how little things have changed in the interim, except for the worse. There is the rogues’ gallery of credulous clowns (her phrase), arid, deceptive administrators, callow students, creepy careerists, the constitutionally corrupt, nasty pieces of work, bookish buffoons, and ideological idiots. Mulcahy has done nothing wrong, but that makes very little difference once doubts have been sown. Friends flee, students turn, presidents politicize—the usual parade of human viciousness. No one seems immune from hysteria and dubitation (a word the author introduced me to). The book is startlingly well written, funny, gimlet-eyed, and generally spot on. There is even some philosophy in it. I haven’t reached the end yet, but I will be interested to learn the fate of this hapless and well-meaning (if eccentric) man. Of course, the story is steeped in American psychopathologies. A book for our time—perhaps for all time.

Index

I thought it might be useful to provide a list of words that could be searched on this blog, in case people wanted to look up specific subject areas.

Analysis, a priori, truth, meaning, knowledge, skepticism, reality, names, reference, fact, necessity, biology, psychology, physics, astronomy, science, philosophy, identity, existence, freedom, self, person, consciousness, intentionality, causation, subjective, objective, good, ethics, value, mathematics, logic, literature, Plato, Wittgenstein, Hume, Descartes, Berkeley, Locke, Kant, Strawson, Chomsky, Kripke, Quine, Davidson, predication, innateness, color, shape, space, time, animals, God, the big bang, life, language, thought, belief, desire, concepts, pain, evolution, music, sport, art, games, matter, mind, body, brain, food, disgust, hand, sex, vision, world, economics, metaphysics, linguistics, politics, mystery.

Trumperica

I claim no originality in asserting Trump’s abysmal character; it stares us in the face every day. Nasty, stupid, witless, bigoted—you name it, he has it.  But, as is also ruefully noted, at least a third of the country thinks he is just fine. The point that is not observed, however, is that it can hardly be an accident that he is so widely approved. He is the average American writ large and ugly. Can it be that the non-Trump supporters are totally different from him and his followers? He is an amoral idiot, but he is not alone in that distinction. I have to admit that since his improbable (?) ascent I have been assailed by the thought that he is America’s not-so-hidden id: there is a bit of Trump in a lot of people. I won’t venture a percentage, but I swear I have glimpsed it in many people, some working in universities. Nasty, stupid, vindictive, amoral, humorless, uncompassionate, unthinking—you know the type. It’s the combination of the puerile and the violent that really sticks out. Their idea of virtue is destroying the people deemed “bad”, without worrying too much about fairness, due process, careful evaluation. Shoot first and don’t ask any questions later. Trump embodies all the pathologies of the American psyche. It really isn’t all that surprising that he is not summarily dismissed from civilized society; he is that society as it has come to exist. Where did he come from, this monster of the deep? He came from these United States, of course. There is nothing singular about him.

Retirement

I once heard Michael Dummett remark that he was looking forward to retirement so that he could get some work done. My sentiments exactly: work has never been so sweet as it has been post-employment. You work on what you want to work on and you don’t have to break off to fulfill your teaching (etc.) “duties”. I never retired from philosophy; I retired from teaching it (and allied occupations). I simply carried on doing what I’d been doing for forty years but without the bad bits. I don’t mean I hated teaching philosophy; on the contrary, I enjoyed it. But I didn’t enjoy the institutional framework of teaching: the formalization of it, the grading, the evaluating. Still less did I enjoy the hiring, promoting, letter writing, placement, recruitment, etc. I didn’t retire; I simply transitioned to the good stuff. I feel sorry for all the poor saps still chained to the industry we call education. I imagine Plato’s Academy was a pretty nice place to work, but the university as it has become is a miserable decline. Who does not hate university administrators these days? Who does not find half their colleagues a pain in the butt? Who loves every last one of their students? Who looks forward to a morning of writing letters of recommendation? No wonder most of the great philosophers didn’t work in universities—you can’t get any work done. Retirement (re-employment) is just a better state of mind to be in. But don’t leave it too late—don’t wait till you have nothing left to give. Retire while you can still work!
