Secularism plays a crucial role in a certain moral narrative of modernity. This narrative tells a story of the liberation that is supposed to have emerged as people came to realize that the agency they had imputed to false gods, or to gods altogether, in fact belonged to them. Some familiar variations on the basic story date back at least to the 18th century. Perhaps less often noted is the semiotic ideology it tends to presuppose (for details, see my book Christian Moderns). A glance back at the debates that ensued after the notorious affair of the Danish cartoons of the Prophet Muhammad may help illuminate how this semiotic ideology is associated with secularism. It may also shed light on how that ideology helps sustain the common sense of secularism and its ties to ideas of freedom in general, and of the press and its publics more specifically.
In September 2005, the right-wing Danish newspaper Jyllands-Posten published a number of political cartoons, most of which used the image of the Prophet Muhammad to lampoon Islam in one form or another. According to the newspaper’s editors, their purpose was to test the courage of Danes in standing up for their tradition of freedom of expression, in effect making press freedom a distinctive feature of Danish ethnonationalism. We are all aware of one result: the wave of sometimes violent anger, lasting several months (followed by a second wave in 2008), that extended across the Muslim world. In October, both the editor and the cultural editor of Jyllands-Posten insisted they had done nothing wrong, and stressed that freedom of speech is at the heart of Danish democracy. A spokesman for the paper said there was no intention to provoke Muslims: “Instead we wanted to show how deeply entrenched self-censorship has already become” among Danes.
I revisit this by now quite familiar incident not in order to rehearse once again the many arguments about immigration, citizenship, Islam, or European politics. What I am interested in is how the European response to Muslim anger reveals some of the aporias of a semiotic ideology closely tied to secular and liberal thought. I want to ask what it was about the Danish denials that may have made them seem so persuasive to other observers across a fairly wide political spectrum. This means asking what the resulting debates about freedom and blasphemy might reveal about certain moral claims of the press, and the underlying assumptions those claims presuppose. I want to suggest that these claims involve semiotic ideologies whose genealogies reach back much earlier, and extend far wider, than the current politics of immigration and identity, and the ongoing geopolitical strife.
By focusing on freedom of the press rather than social relations, the defenders of the newspaper could count on a family of common sense views of what pictures and words are, and how they function in the world. They tapped into a widespread and habitual way of thinking that treats representational acts as referential and communicative in function. In this view, pictures and words are vehicles (and in the case of words, arbitrary social conventions) for information, itself a distinct entity that stands apart from persons and their actions.
This view is not the only one found in the Euro-American world, nor should we imagine that the sense of offence some Muslims expressed is fundamentally alien to “the West,” as American reactions to the “Piss Christ” artwork make clear. We should also not assume it arises from some sensitivity peculiar to religious faith, as American responses to flag-burning and Spanish laws against lèse majesté show. But it does have a privileged relationship to the moral narrative of modernity, in particular to those strands associated with liberal thought and the concepts of freedom associated with them. It is implicit in John Stuart Mill’s classic defense of press freedom, according to which the reader should evaluate the message and ask how well it fares in competition with the alternatives, which determines whether we should accept it as true. Expressions of truth should be set into free circulation to be sorted out by the invisible hand of their readership, as the aggregate outcome of so many individual judgments. Certainly, the European arguments are somewhat different from the American ones (as European Holocaust denial laws show), but they share a deep background. The classic defense of freedom of expression draws, in part, on a semiotic ideology that takes words and pictures to be vehicles for the transmission of opinion or information among otherwise autonomous and unengaged parties, and the information they bear to be itself so much inert content more or less independent of the activity of representation.
This assumption about words and pictures, or semiotic ideology, tends to place them in a domain apart from that of action and actors. Moreover, there is at least an affinity between this semiotic ideology and the view of action I have described, in which the action and the actor’s intentions remain relatively independent of the social relations into which they enter. (Notice that the classic exceptions to the referential and predicational model of speech that typify legitimate restrictions on free speech, “fighting words” and crying “Fire!” in a theater, retain this character of discrete actions on the part of autonomous subjects. To an extent, this also characterizes some of the more familiar portrayals of the so-called performative character of language.) That is, how one understands words or images can both express and reinforce one’s understanding of social action and its moral import, and therefore, its political consequences.
If it is disingenuous to overlook or misconstrue the ways in which expression can constitute an aggressive form of interaction, it can seem reasonable in part because of a prevalent model of communication that, in its most familiar forms, has roots in iconoclasm. The theological, institutional, and political history of this concept is complex. But even a simplified version can, I think, tell us something about the assumptions and habits that make the Danish position seem commonsensical to so many.
Western liberalism draws on some iconoclastic themes that are ultimately shared by the three major Abrahamic scriptural religions. One underpinning of this iconoclasm is the worry that people would be distracted by sounds or images at the expense of those spiritual things that transcend experience. They might even come to worship those sounds or images. The liberal tradition shows the more specific effects of the Protestant Reformation as a purification movement. In its religious form, the iconoclastic impulse led to the stripping of imagery from the churches. Pictures should only convey visual information; they should not inspire devotion and become objects of worship. Indeed, for some reformers, they should not even stir the feelings.
A similar purifying impulse ran through the Protestant Reformers’ treatment of language. The Latin liturgy and Bible seemed to them to verge both on pagan magic, and idolatry of the word. Protestant churches brought a new focus to the pulpit. Sermons, now central to the service, emphasized the communication of ideas over supplication, blessing, confession, or non-verbal ritual actions such as making the sign of the cross. Opposed to the treatment of Latin as a sacred language, the Protestants translated the Bible into vernaculars. In effect this desacralization of the words and images encouraged hearers and readers to treat them as vehicles for communicating information, and not as aspects of interactions among, and constitutive of, moral subjects. It also tended to treat the truth conveyed by words and images as lying in a realm distinct from the words and images themselves and from the relations among those who wield them. A long history produced an underlying understanding of verbal and visual signs as conduits, empty in themselves, for the conveyance of ideas between otherwise autonomous people.
The classic arguments for freedom of the press commonly rest on this by now habitual view of words and pictures as vehicles for information that are fundamentally independent of social relations and interactions, other than serving as ready-at-hand tools. This background is one reason why it has been so difficult for Danes, and indeed for Americans, to deal with verbal or visual expressions of hatred: to the extent that they are mere words, it is hard to see clearly how they are also forms of action in any serious way, beyond, say, making misleading truth claims or hurting another’s feelings. Even accepting that they are actions, they are actions understood as taking place between otherwise independent agents. Since those agents are independent, the response of the wounded is ultimately in their own hands (one might ask, for instance, why they can’t be less emotional, or what is it about religion that makes people so sensitive). These habits of thinking and action are very deep. Cartoonists, whose daily bread depends on having keen instincts for the potency of words and drawings, may themselves have trouble explaining why their work has the effects it does, for it runs against the grain of some habitual and ordinary ways of thinking and speaking in the world that liberalism created.
This is not to say important alternative views of words, things, and persons do not exist in western Christianity. Deep background includes the role of visual imagery in the imitatio Christi, the transubstantiation of matter in the Eucharist, potent language in the form of exorcisms, curses, oaths, the uses of scriptural texts in divination, the practices of votive offerings, and so forth. But these examples lie in the religious domains that are rather too easily dismissed as relics of a vanishingly “traditional” worldview. There was also a long tradition in rhetoric that stressed the interpersonal effects of speech. One could argue, however, that within the emergent public spheres of the liberal world, rhetorical, poetic, or performative action increasingly tended to remain confined within marked domains, as models based on communication, information, and the autonomy of social agents grew in dominance and generality.
To say the aggrieved feelings of Muslims are independent of the act of publishing caricatures of the Prophet is to say they misconstrue the real nature of action; to say that cartoons are only pictures is to say Muslims misconstrue the real nature of symbolic forms. Both assertions draw on the common sense of a particular semiotic ideology to cast doubt not just on the other’s respect for freedom, but more deeply yet, on the other’s grip on reality. Viewed in the light of the moral narrative of modernity, the semiotic failure of the offended Muslims is a symptom not only of difference, but more specifically, of an anachronistic ontology. They are, in this respect, like prosperity gospel preachers, faith healers, and other apparently magical thinkers. Their false grasp of the nature of signs becomes a manifest symptom that, failing to grasp reality, they lie on the other side of a boundary between rational and irrational, modern and pre-modern.
Now, if the matter were to rest there, we would have nothing more than a familiar story about the clash of civilizations; they have their reality and their values, we have ours. But my point is somewhat different, for it rests on the observation that the defenders of the Danes are not merely asserting a different view of reality, or even of signs, from those of their critics. Rather, they are themselves as much in the grip of a selective semiotic ideology as are their antagonists, an ideology that leads them to misconstrue the nature of their own actions. The “otherness” lies not just between liberal secularists or Christians and conservative Muslims, or between Danes by genealogy and Danes by residence, but also between any given actors’ self-understandings and practices.
In Christian art, images have seldom, if ever, been worshipped per se.
Rowan Williams, for example, in treating the place of icons in worship, shows how the tradition of icons, when properly used as an aid to prayer, draws the devout into the image, which centres the heart of worshippers and opens up the space for God to act on them. In this particular case, the icon is not so much a representation as a living relationship.
For Thomas Aquinas, human knowledge is an objective participation in nature, in which the objects around us are already in a living relationship with us and yield their meaning to us through the mediation of the intellect. In this respect the natural universe is a sacrament.
This more recent recovery of Aquinas, by a minority of scholars responding to the ecological crisis, addresses a flaw of the Renaissance: namely, that it forged an alienation from the world.
I don’t disagree with Webb Keane at all about the Danish cartoons. But I worry a bit about his view of secularism.
He writes that secularism’s narrative “tells a story of the liberation that is supposed to have emerged as people came to realize that the agency they had imputed to false gods, or to gods altogether, in fact belonged to them.” Did secularists really believe, or have to believe, that they themselves possessed the agency they denied to the gods? I don’t think so. All they did was deny that the gods should be credited with such agency. In other words, secularism too could be properly modest (a virtue the post-secularists tend to attribute mainly to themselves) about what it didn’t know, couldn’t be certain of, and couldn’t control about the world. To me, the modesty seems more on the secularists’ side.
To me there is also a problem in Keane’s treatment of the view that representations can be seen as actions or performances. At the end of this post, he says modestly that defenders of the Danes “are themselves as much in the grip of a selective semiotic ideology as are their antagonists, an ideology that leads them to misconstrue the nature of their own actions.” I agree. But throughout the post, it seems to me that he suggests (though perhaps this is my misreading) not just that both groups are equally “in the grip” of an “ideology” that is “selective,” but that the opponents of the cartoons are right—that their “ideology” is the truth. Representations ARE really actions. Shouting fire in a crowded theater, the classic exception for the secularists, is really the representative example. It’s BETTER to think of speech as action than to think of it as picture or image.
Let’s suppose again that I agree. My question is: on what grounds would either of us embrace this position? By what standard do we decide it’s correct? Is this position too an action, to be judged as an action and not by any other measure? I say this, obviously enough, because I worry that giving up on any standard other than that of action leaves us open to a judgment of actions by standards that have not yet been specified, at least in this post, and that might produce or encourage actions, or social situations, that we would want to disapprove of. Strongly. For example: “This is not the speech of an ally. Whether it is true or not, you are aiding and abetting the enemy.” Or “Such a thing is inappropriate for a member of my family to say.” You can fill in the possibilities yourself.
One of the impulses behind secularism is to prevent such social pressures from inhibiting free expression. It’s not an impulse to follow to the ends of the earth, but it’s surely not one we can afford to dispense with either.
Webb Keane suggests that speech is an action and that those who defend the publication of the Danish cartoons do not fully appreciate this fact.
On one reading, this is a deeply and embarrassingly uncharitable characterization of the views of any reasonable person who might defend the publication of the cartoons in question. If we are charitable and do not read Keane as offering this account, then what is said above simply obscures complicated but revealing factors in this debate with a simplicity deceptively packaged in irrelevant musings about the secular.
No one reasonably can deny that speaking and publishing are actions. Uttering sentences and printing text (or images) are actions par excellence. They are subject to all the normal practical considerations to which other actions are subject. For, as a formal matter (i.e., abstracting from the particulars), whether I ought to say something (or print something) or do something else is no different a question than whether I ought to run to catch the bus or just walk and wait for the next bus. In all cases, what is at issue is intentional behavior.
So, Keane surely cannot mean that defenders of the printing and publication of the Danish cartoons (or utterers of offensive speech) believe that printing and speaking are not actions in this sense. If he does mean this, then the target of his objections above is a particularly dimwitted crew. Maybe this is his target, though. If so, then Keane needn’t have rattled on for so long about secularism. He could have just made the point I made above in five simple sentences and been done with it.
So, if I am to be charitable, I must assume that Keane means something else when he ascribes to others the conviction that speech is not an action.
I suspect that what Keane means is one or both of the following: those who defend the Danish cartoons believe either/both that (i) acts of speech/publishing cannot affect others in the extreme way other actions are capable of affecting others; and/or (ii) acts of speech/publishing cannot be as morally objectionable as other actions can be. Let us call (i) the Impotence Thesis (IT) and let us call (ii) the Benignity Thesis (BT).
I’ve stated the IT in vague terms so as to capture a wide range of positions. The basic idea behind it, though, is that speech (from here on I will write “speech” to refer to speech-acts and other related acts like publishing) must be taken up by the hearer (or reader or whatever) in order to do anything qua speech (it might do something qua vibrations in the air, or qua ink smeared on a page). And, even if there is uptake, the hearer ultimately has substantial control over the role the speech-act plays in her life. For example, even though I might hear someone say something cruelly negative about me, I can, through a sheer act of will, manage to ignore it. Insofar as I am unable to overcome my responses to what others say, it may be due to some failing on my part: I am deeply insecure, or I put too much stock in what that person thinks of me, or, while not insecure, I lack self-respect and so am prone to taking quite seriously negative things people say about me even if I have good reason to disbelieve them, and so on. A strong-willed, secure person who has a robust sense of self-respect may be wounded a bit by truly cruel comments, but in the normal course of discourse, most of what other people say can be ignored (except in circumstances in which she has intentionally made herself vulnerable). For example, a strong-willed, secure person with a robust sense of self-respect will be able to ignore insults or mocking hurled her way on the subway by rowdy teenagers.
The BT, on the other hand, is the view that no matter its content, speech is never as morally serious (i.e., it can never be as morally significant) as other actions can be. Here is an example that confirms a *weak* version of the BT: speech, whatever its content, can never be as morally odious as murder nor can it ever be as worthy of moral approbation as, e.g., profound acts of charity or sacrifice for others. Stronger versions of the BT involve claims about a more general moral insignificance of speech relative to other actions.
Now, many believe BT partially in virtue of IT: because the power of speech to affect others ultimately depends upon factors that presumably others largely control, then speech cannot be as morally serious as other kinds of actions that, in some sense, cannot be resisted by others. (For example, A stealing B’s car is not something over which B has any control.)
I’m actually inclined to think that this is a fairly reasonable position. The exceptions, of course, are illustrative: shouting “fire” in a crowded theater is likely to create conditions that will cause grave harm to many of the hearers, even if they wished to calmly ignore the speech. So, in this case, the IT does not apply.
This approach in turn allows us to diagnose the problems associated with the Danish cartoons. These cartoons were apt to create morally objectionable conditions over which Danish Muslims had no control, even if cooler heads had prevailed and the Muslim community had, in general, willfully ignored the cartoons. In short, the relative political weakness of the European Muslim community, along with widespread xenophobia and racism, creates conditions not unlike those Justice Holmes identified in the shouting-fire case.
The dispute with defenders of the Danish cartoons, then, is over whether conditions are such that the effects of the speech are subject to the control (to some extent) of those who are the “targets” of the speech. Of course, the defenders might balk here and endorse an excessively strict version of the BT. But, as Holmes argued, there are surely conditions under which the BT does not apply. So, let us not saddle the defenders of the Danish cartoons with such an absurd view (or at least, let us invent defenders of the cartoons who do not saddle themselves with such an absurd view and engage with them).
Thus, we are able to introduce political conditions into our analysis of the moral significance of speech. We do not need, in addition, a baroque discussion either of the secular or of semiotics. We just need a careful analysis of the circumstances under which speech manifests certain powers.
As a final aside, I’d like to note that perhaps the best response to the Danish cartoons was not to meet the potential for violence latent in the cartoons with paroxysms of actual violence, for such paroxysms do not reliably generate the changes in power that can alter (for the better) the potentialities of a speech act. The best response is probably something like a collective, organized effort to gain power so that such speech loses some of its capacities.
Matthew Noah Smith is certainly correct to say that the Muslim reaction to the Danish cartoons must be understood in the context of their political weakness in Europe. It is also hard to disagree with his views about the value of “actual” (by which I assume he means physical) violence. But these lie beside the point of difference between us. And here it’s tempting to take the intemperate tone of his response as a symptom that something beyond mere error on my part is at stake.
It’s an overstatement to say I claim to characterize any and all possible defenders of the cartoons. Rather, my interest lies in bringing out some underlying assumptions on which a certain kind of defense rests, and setting them within a larger historical context. I suggest these assumptions reinforce certain views about the irrationality of Muslims. They may help shed light on why cartoons, rather than numerous other provocations, served as the flashpoint they did for both sides, and why so much of the European and American reaction focused more on supposed religious sensitivities than on, say, the political vulnerability to which Smith rightly points. Implicit in my approach is a further point: that idealizations of society, of rationality, or of communication will not help us make much headway in understanding actual historical realities.
Now perhaps, as Smith remarks, no one denies that speaking and publishing are actions (although one may then wonder why people like Wittgenstein and Austin took the trouble to spell out so obvious a point to their colleagues). The questions still remain, (a) what kind of actions they are, and (b) how those actions are affected by actors’ understandings of what kind of actions they are. Smith seems to find the latter question uninteresting or even “baroque.” Whatever else our disagreements may be, here I fear the problem is compounded by differences of disciplines. Those of us who are familiar with social and cultural worlds unlike those which formed me, and no doubt Smith as well, or those parts even of European history that precede the recent past, tend to be skeptical that everyone agrees what are the “normal practical considerations to which other actions are subject,” or, more generally, that what counts as “reasonable” is a simple matter.
Notice, for example, how Smith’s formulation of “IT” reproduces the worldview I have tried to put into context. First, it seems to be predicated on the original autonomy of actors from one another, prior to their entering into communication, which itself must be understood in terms of individual intentionality. Such an approach would seem to have little interest in such mundane experiences as the internalizing of others’ words, the moralization of certain speaking styles, the collaborative construction of meaning, and numerous other well-attested sociolinguistic phenomena. Second, it requires the elimination of certain possible views of language, the outcome of processes that, as I have argued elsewhere, are historically involved with secularism. Thus, for instance, a curse spoken unwittingly cannot be consequential, an icon can only be the product of a human hand, words inscribed on an amulet or set within a mezuzah are expressions of ideas, gods or spirits cannot speak through me. As an empirical matter, there exists quite a wide range of views of social relations, of images, and of words, even in the midst of so-called secular societies. Smith may not agree with them—indeed, no one can agree simultaneously with all empirically known views about these things—but that will not make them go away. They form part of the context an analysis of action must consider.
Smith characterizes my sketch as “baroque,” and also hints that I am leaving out the real politics. As I have suggested, in this we may see an unfortunate clash of disciplinary cultures. (Some of my colleagues in history, linguistics, and anthropology may think my brief blog post is too simple.) I suspect our difference lies in the weight we’re willing to accord the real complexities of history, and the capacity of idealized models of rationality and social relations to override them. My worry is that when the idealized models win out over the history in our accounts of what actually happens in the world, somehow we—whoever gets to formulate those models—always turn out to be the rational ones. All others are rendered merely unreasonable, and therefore incomprehensible, if not worse. There’s a politics to that as well.
Bruce Robbins’s concerns center on two issues. One is about my portrayal of secularism, the other about evaluating actions. Regarding the first, certainly secularism is a far more complex business than one blog post can capture, and no doubt there are modest forms of secularism. In a brief post I can’t do justice to the strong version of secularism. I defend it at some length, however, in Christian Moderns. The imputation that some people displace, and therefore don’t fully grasp, the agency that belongs to humans is, I have argued, characteristic of a common understanding of what it is to be modern, and what it is to be backward. I suspect some of our best friends in the American academy today hold similar views of our evangelical neighbors. I don’t want to insist, however, that this aspect of secularism therefore holds in all contexts without exception.
As for the second concern, to say that representations are actions is surely not to say they cannot be many things at once. It’s not an either/or matter. It’s not the case that we must choose: either representations are actions, and therefore must be judged on exactly the same terms as any other action, or they are not actions but some special other kind of thing. The point, rather, is to understand the various kinds of action within which they can play a role, and the kinds of roles they can play in them. And in recognizing this, I am hardly recommending that we give up any standards of judgment. Again, this is not an all-or-nothing matter. After all, we don’t judge even such physical actions as a dash for the subway and a dash in the Olympics in the same way.
It seems that Robbins may be asking me to do something I’m not attempting to do. I am trying to understand something about an empirical problem of social and political differences, not to legislate a normative framework for all further judgments. I’m surely not in a position to achieve the latter with any success.
It is hard to know where to begin in response to Webb Keane’s comments. First, though, a clarification: Austinian speech act theory – and of course Geach’s famous paper “Assertion” and Searle’s early work – largely attempted to address questions of meaning in speech. And “meaning” here is to be understood intuitively (i.e., in a way that any kid might understand it – not in some postmodern way). Austin and the other British (and one American) gentlemen were by no means laboring to demonstrate to their colleagues that speaking is an act on a par with walking. That fact was as plainly obvious as English winters are dreary. Perhaps such simplicity is an outrage in this post-Derridean age, but in mid-20th-century Britain, it wasn’t. As far as what the late Wittgenstein was up to… I suspect the answer here depends upon to whom you pose the question and how you pose it.
To the meat of the response: As far as I can tell, Keane suggests that I have failed to appreciate how utterances are affected by actors’ understandings of what kinds of actions those utterances are. Keane offers few clear examples, but seems to suggest that on my view all speech-acts are, in some very important sense, the same. But, where do I say that? Of course different speech acts are different. Saying “Hello!” to me and saying “Fuck off!” to me are quite different and they obviously generally have different effects on me. On the other hand, if I don’t speak your language, then your greetings and your cusses have no effect – they are all just so much gobbledy-gook. If I don’t care about you or think I am of nobler stock than you, but I still speak your language, then your greetings and cusses, while sensical, will likely mean little to me (why bother attending to ramblings of some peon?). Or, if I see myself as weak, and I see you as powerful, then your cheerful greeting will mean a great deal to me, and your curses will be devastating.
There is no alchemy here and certainly nothing distinctively secular or religious. Of course, the powers of curses, icons, amulets, mezuzot, and speaking in tongues all depend upon one’s religious outlook. And, surely as a matter of basic respect for others (or is this too secular, Western, and disciplinarily ghettoized a moral attitude to be, um, respectable?), each of us ought to attend to what matters to others. Insofar as my neighbor is religious and finds horrifying the errant curses I mutter when I smack my thumb with a hammer, then I should learn to shut my trap or hammer indoors (or just learn to hammer properly). I may judge my neighbor foolish to hold these religious views, but he may also rightly judge me a first-class jerk if I disregard them. That is, regardless of how dumb I think he is to take a bunch of religious clap-trap seriously, I still owe it to him not to run roughshod over the fact that he cares about this religious clap-trap. In general, the kookiness of the views others hold makes little difference when deciding how to live with those others.
Of course, discerning why certain strings of words are curses for some and forgettable comments for others requires an enormous amount of historical and anthropological work. But, that work simply helps us to understand why, for example, Muslims care if their prophet is depicted in an obnoxious manner for all the public to see. It does not tell us that we ought to care that Muslims care about this. That is just common decency, and if one has to study Derrida or understand the history of secularism to learn this, then intemperate blog posts by grouchy philosophers are the least of our worries.
As far as how my formulation of the Impotence Thesis (IT) is “predicated on the original autonomy of actors from one another, prior to their entering into communication,” I can only make sense of this if Keane is assuming that there is no way to find a reasonably stable background of shared meaning and shared life against which what we might call “normal speech” proceeds. This normal speech is what is impotent. It may be the case that when large numbers of immigrants enter a community, there is no reasonably stable background and so the IT fails to hold. That’s an interesting question but not one that Keane queries.
I do not know what the internalization of others’ words, the moralization of certain speaking styles, and the collaborative construction of meaning are. At least, I do not know what they could be if they pose a problem for my account. I’d have to see more.
Now, a word on curses, oaths, promises, and the like: speech is obviously a very powerful act, and it is a mystery how it gets that power. It is rather mysterious that in contemporary Western, secular (?) society, the words “I promise” can change the moral order such that the person who sincerely utters them undertakes a specific obligation, and the person to whom the words are uttered gains a claim-right against the utterer. This applies to oaths, of course, and it also, in fact, applies to other remarkable speech-acts, as when one says to another whilst proffering some item, “Here, take this, it’s yours.” Such an act changes the property-rights relations for all people and not just between the two parties (since before the utterance all people owed it to the utterer not to take the item, and after the utterance all people owed it to the recipient not to take the item). Oh the miracle of speech! Given a different set of social practices or whatever, curses, prayers, and so on have similar (albeit not isomorphic) powers.
No one can deny the magic and mystery of language in this regard. How we make sense of this great potency of language is neither an historical project nor an anthropological project, since history and anthropology reveal only descriptive features of the world. If we want to know why language is fraught with ought, to borrow from Wilfrid Sellars, then we must consign ourselves to the idealizations of philosophy. Perhaps the power of speech depends upon the power of God (yet another great mystery!) or perhaps on the ineffable power of the supernatural (ineffable so more mystery!) or perhaps on the many layers of the natural (you guessed it: mystery here, also). In short, no matter one’s worldview, if one is a curious person and wishes not simply to take things as they are but instead to try to understand the power of language, then mystery abounds.
Finally, the ultimate issue here is what is behind the response to the Muslim response to the cartoons. I find it quite difficult to believe that secularism has any meaningful role here. The charge of irrationality leveled by some against Muslims could just as well be leveled against sports fans who respond violently to verbal attacks on their sports team (this is common throughout the United States and Europe, especially in college towns and Philadelphia). In both cases, the sacred has been profaned. Criticizing the angry responses as irrational obviously misses the deeper point: broader political, historical, and sociological analysis is required in order to understand the angry response, and in one case talk of religion and therefore the secular is apt. But, in either case does secularism have anything to do with the responses to the responses (i.e., does secularism have anything to do with the charge of irrationality)? It seems quite unlikely.
Between his general flailing about and his mystical dismissal of empirical research, Smith’s new message sheds no new light, and gives me nothing new to respond to. Those who are interested in evaluating the actual case I have made are welcome to follow the three links to other publications that can be found in my first response.
In his above reply to Keane, Smith writes: “How we make sense of this great potency of language is neither an historical project nor an anthropological project, since history and anthropology reveal only descriptive features of the world.” He continues with a correlative claim: “If we want to know why language is fraught with ought, to borrow from Wilfrid Sellars, then we must consign ourselves to the idealizations of philosophy.” I do not wish for now to step into the cross-fire between Keane and Smith, over the main subject of their dispute, but I do wish to challenge the simple distinction—implicit in these sentences from Smith, and in much analytic philosophy—between description and prescription, or between ‘is’ and ‘ought.’
When we say of a particular watch, one which does not tell time accurately, that it is broken, this appears to be a description, and it is. But this description is also a prescription, since it disguises a chronometric teleology, so to speak: this watch ought to tell the time accurately, at least according to our culture at this time, because all watches ought to tell the time accurately. That is what makes them watches for us rather than mere jewelry. In other words, it is because watches perform a certain function for us—arguably a social function—that they are what they are for us. Anything whose definition is functional, indeed, is susceptible of such descriptions that are also prescriptions.
History and anthropology teach us not only that language is functional, but show in particular instances how it functions socially. The great figure in this tradition is of course Lévi-Strauss, but even Austin, in How To Do Things With Words, was operating as an anthropological linguist as well as a philosopher. After all, to a foreign reader then and perhaps even a native reader nowadays there is as much in this book to learn about peculiar mid-20th century British social customs—like royalty christening ships—as there is to learn about the “idealizations of philosophy.” That is likely why it is so much more interesting than most other books by Austin’s Oxbridge colleagues and their American epigones.
Alas, we are denied an opportunity to hear more from Keane.
But, it is worth saying that I hardly dismiss empirical research. In fact, I strongly endorse it as one of the ways to solve the mysteries to which I alluded above (although analytic philosophy is another method for achieving the same in some instances). But what counts as empirical for me (the physical and biological sciences) surely counts as some form of wooly-headed colonization of the unconscious to Keane and his ilk. I hardly consider anthropology and history the kind of empirical work that with any finality tips the philosophical balance one way or another. They, at best, only teach us that the limits of our philosophical imagination may not be the limits of philosophy.
Miller, in his interesting comment, raises the question of functions. No doubt functions introduce normativity into the mix but I cannot see the significance of this. Just because much of our discourse involves “oughts” does not mean that there is no distinction to be made. It simply suggests that it is hard to tease apart the two. But, the concepts are easily distinguished and pure cases of each are easily constructed (e.g.: “My watch is round” vs. “My watch is broken”).
None of this is meant to deny that at a deeper level we are stuck with loads of normativity in language. I take Kripke’s famous work on Wittgenstein, and much of the Pittsburgh school of philosophy, to defend various versions of this claim. But, the mere fact that discourse sometimes presumes objects have particular functions hardly suggests a collapse of the is/ought distinction.
Re: anthropology and history. I hardly think we need these disciplines to teach us that language can be deployed in various ways, to various ends. When deployed solely for this purpose, these august disciplines provide only nifty examples of the functionality of language. I’d rather historians focus their great skills on other matters.
Regarding the “anthropological” strain of philosophy: I just take it to be the case that philosophers proceed in their work through introspection and interrogation of doxa. It’s unsurprising, then, that excellent philosophical works would be particularly crystalline expositions of the practices extant at their composition. To borrow from Hegel: Great philosophers grasp their age in thought – hence Plato, Hobbes, Locke, Hegel, Marx, Rawls, and so on are so capable of bringing to the fore the great contradictions of their own eras.
In the absence of a reply from Keane, Smith has taken up the subject of my functional critique of the ‘is’ / ‘ought’ distinction—a distinction enshrined by David Hume; a critique advanced by Alasdair MacIntyre—so at the risk of diverting attention from his debate with Keane I would like to return the favor by clearing up some misunderstandings of this critique. First of all, Smith agrees that “functions introduce normativity into the mix,” but protests that he “cannot see the significance of this.” The significance is that many important words in language are functional: they are interwoven with other words that together help to constitute social practices; these social practices grant functions to people and objects, so that the words for these people and objects are functional; consequently, no one is better equipped to understand these words than anthropologists, historians, and other social scientists.
Although he recognizes that the ‘is’ / ‘ought’ distinction becomes blurry in some intermediate cases, Smith rightly claims that there are pure cases where the distinction is more readily apparent. He is also correct that “the mere fact that discourse sometimes presumes objects have particular functions hardly suggests a collapse of the is / ought distinction.” My critique was not meant to collapse the distinction wholly, but only to expose it as simplistic. For even though the distinction applies elsewhere—that is, to anything without a function—it is inapplicable to anything with a function. Because many important people and things have (social) functions, however, this is not a trivial qualification. Moreover, there are specialists who study these people and things, as well as their inter-relationships and the language used to describe them—namely, social scientists—so I do not understand the desire to discredit their contributions to debates about these relationships and language.
Perhaps Smith is right that we no longer “need these disciplines to teach us that language can be deployed in various ways, to various ends,” since we have come a long way from thinking God Himself spoke King James English, but we still need these disciplines to teach us precisely how language is being deployed in specific ways, to specific ends. An example should help make the difference between the general lesson and the specific one clearer. We no longer need anthropology to teach us that ‘family’ or ‘faith’ or ‘freedom’ (etc.) are words being deployed in various ways, to various ends; that much is now obvious. But we still rather badly need the help of social scientists to show us precisely how they are being deployed, by whom, when, in what ways, and to what ends. No one is suggesting that social scientists exercise their skills “solely for this purpose,” but if they were to stop exercising them for this purpose altogether we would lack far more than “nifty examples of the functionality of language.” We would lack an understanding of the social worlds in which we live.
“Regarding the ‘anthropological’ strain of philosophy,” writes Smith, “I just take it to be the case that philosophers proceed in their work through introspection and interrogation of doxa.” Doxa is a Greek word meaning beliefs; endoxa means common beliefs—common, in Aristotle’s usage, among either the few wise or the many ignorant or both. He often begins his treatises by surveying the endoxa, assuming that the common beliefs on the subject in question are not completely misguided. One of them may turn out to be wholly true, some of them will turn out to be partly true, and many of them will turn out to be false, exposed as inconsistent either with themselves or the results of empirical research. But even in the cases of the many false beliefs, Aristotle thinks he must explain why so many or such wise people were mistaken. Indeed, he provides an explanation for this ‘inductive’ method in his Analytics. Introspection is rare in Aristotle, or any Greek philosopher before Plotinus, who is the first to give it a central philosophical role.
The relevant point in the midst of this history is that even the philosopher most famous for using endoxa as his data did not exalt them the way analytic philosophers would later do. In his Theory of Justice, for instance, John Rawls articulated a method of “reflective equilibrium,” which entailed little more than seeking consistency between common moral beliefs—and here ‘common’ had gained an altogether democratic meaning, making no distinctions between experts and the many. Whatever value this democratic reformulation had in ethics and politics, it marked a retreat from the epistemological sophistication of Aristotle to the simple interrogations of Socrates. For when two beliefs have been exposed as inconsistent, consistency can be achieved by abandoning one of them, but there is nothing in the method to decide which one to abandon. Nevertheless, for a century in Anglophone philosophy, these two types of data constituted prime evidence for speculation: “intuitions” and “ordinary language.” Intuitions were the deliverances of philosophers’ introspection on matters moral, mental, and otherwise. Not surprisingly, those who did not share the culture of these philosophers found their intuitions—and perhaps also their language at times—somewhat different, and found themselves largely ignored.
These two types of data for twentieth-century analytic philosophy resemble Smith’s “introspection and interrogation of doxa.” Fortunately, that century is over. Last year’s presidential address to the American Philosophical Association by Kwame Anthony Appiah brought to the profession what John XXIII’s aggiornamento brought to the post-War Catholic Church. By reminding us of our predecessors’ interest in empirical research, Appiah invited this century’s philosophers to emerge from the cubicle of introspection and wander the fields of real data. By no means does Smith dismiss empirical research: “In fact, I strongly endorse it as one of the ways to solve the mysteries to which I alluded above (although analytic philosophy is another method for achieving the same in some instances).” What is mysterious, however, is why he restricts his endorsement to the physical and biological sciences, adding “I hardly consider anthropology and history the kind of empirical work that with any finality tips the philosophical balance one way or another.”
But this complaint sets the bar too high, even for the physical and biological sciences. How often have the hard sciences tipped a philosophical balance, at least when it comes to the most important questions: Who are we, and what kind of life should we lead? Since we are still debating these questions first introduced by the Greeks, and have made only limited incontestable progress since their time, does not the empirical evidence of history—specifically now, the history of philosophy—argue that hardly anything tips the philosophical balance one way or the other? Ironically, if history can tip that particular balance—whether there is progress in fundamental philosophy—is there not also room for the other humanities and social sciences to throw their weights and measures around?
Patrick Miller raises some very interesting questions regarding philosophical methodology. These issues are quite deep and deserve a dedicated space. Nonetheless, I shall discuss some of the issues Miller raises here because he presses them with such vigor and clarity.
Miller appears to hypostatize the functions picked out by language: in some “real” way, the function of watches is to tell time. Of course, one wonders what happens when I start using my watch to keep the pages of my book from turning as I read it open on my desk, or if I view my watch as a status symbol and care not one whit whether it can tell time. Has my watch ceased to have the function of telling time? Or has it acquired a new function in addition to the time-telling function? There is no easy way to answer this question, as analytic philosophers Larry Wright and Robert Cummins famously came to demonstrate in the 1970s: both an artifact’s and a biological system’s functions are difficult to define. Perhaps the most systematic treatment of these questions was undertaken by Ruth Garrett Millikan in her celebrated book, Language, Thought, and Other Biological Categories. In general, we oughtn’t posit the existence of a function in the world because of a mere artifact of linguistic usage (just as, e.g., we ought not posit the existence of sakes just because, to make sense of phrases such as “for the sake of X,” we must treat sakes as regular, referring nouns).
Miller may respond that ‘function-words’ that “help to constitute social practices” are different from, e.g., words like “watch,” and in particular that hypostatization is appropriate in their cases. I doubt this, though. For one, it is far from obvious that certain words play central and stable roles in constituting social practices. It is dubious that some particular word is in some sense essential to the well-functioning of a social practice, or even that it plays a determinate role. I’d need to see more to buy this line of argument.
Social scientists obviously have something to offer us, and nowhere did I suggest that they didn’t. I simply think that they don’t have much of substance to add to normative theory. Often, though, giving a genealogy of some concept, or providing context for some discourse, is treated as sufficient for determining the content and/or referent of the concept, or sufficient for critical engagement with discourses that deploy the concept. But, this is a thin view of content and criticism. The work of semantics and critical value theory comes also in doing the linguistic, moral, and political theory necessary in order to develop a flexible theory of content, a substantive moral theory, and a constructive (and not merely critical) political theory.
Excuse me for writing “doxa” where I meant “endoxa.” Also, I thank Miller for his erudite Aristotle scholarship. As always, he comes out on top on this front. But, Aristotle was hardly my focus. I really meant this to be a general point – one that applies, if not prior to Plotinus, then at least since Plotinus.
Miller’s take on reflective equilibrium is not quite right. This is not the space to go into Rawls exegesis, but suffice it to say that the whole project of Theory of Justice was to get away from what Rawls called “intuitionism,” which is, to borrow from Miller, “little more than seeking consistency between common moral beliefs—and here ‘common’ [has] gained an altogether democratic meaning, making no distinctions between experts and the many.” Miller represents reflective equilibrium here as a methodology, but it isn’t one: reflective equilibrium is best understood as a regulative principle. Constructivism is the methodology, and for Rawls the primary constructivist tool is the original position. (The best recent book on Rawls and constructivism in general is G.A. Cohen’s Rescuing Justice and Equality.) I could go into this in greater detail if necessary, but suffice it to say that Miller’s characterization of Rawls is inaccurate and his criticisms therefore miss their target.
Miller is surely correct that I have set the bar too high if I banish history and anthropology from having any role to play in philosophy. My point was primarily that they have little or no role to play in normative theory. I am sure that historians and anthropologists can offer a great deal to other philosophical endeavors. But, in general, insofar as I put my stock in any science when it comes to normative theory, I put my stock in biology, cognitive science, psychology, and the “hard sciences.” And perhaps I should not have said “tips the balance” and instead said something like “adds weight.”
But really, questions of meta-philosophy are quite difficult and I haven’t space here to engage with them. I, alas, am not a fan of Appiah’s lecture or his companion book, but I haven’t space here to explain why. Instead, I shall only recommend what I see as the most sophisticated (and bold) treatise on these issues so far: Tim Williamson’s Philosophy of Philosophy.
We have hijacked this discussion and diverted it away from Keane’s fascinating post on secularism and press freedom and toward a subject that may be interesting only to us philosophers, or worse, only to those interested in the “philosophy of philosophy.” My hope, however, is that resolution of my friendly disputes with Smith will help return attention where it belongs, that is to say, upon the contributions of scholars in the humanities and social sciences, such as Keane, to discussions of normative questions. In this case, the normative question would be the proper response to the publication of the Danish cartoons and the ensuing Muslim violence. I pose that question conditionally because Keane did not prescribe any course of action, or render any moral judgment. He did, however, make a persuasive case that there is a widespread misunderstanding of this sad episode, a misunderstanding based on a deeper neglect of the cultural gap between the secular West and the non-secular Islamic world, and any informed discussion of the proper response should take this gap into consideration.
Specifically, we learn from Keane, this gap affects even semantics, territory to which philosophers have long held exclusive deeds, and which they will not easily relinquish to parvenu disciplines such as religious history. (Incidentally, a similar quarrel arose in the last century when linguistics began to tread on this territory. Until then, the Philosophy of Language had been too easily confused with the Philosophy of English.) A remarkable parallel between the form and content of this debate has thus appeared. Our disciplinary misunderstandings have to a lesser extent produced a model of the deeper cultural misunderstanding at issue in Keane’s post. In his first reply to Smith, executing the function of anthropologist, Keane made something like this point by alluding to “the clash of our disciplinary cultures.” In any case, whether between Keane and Smith, or between Danish cartoonists and Muslims in Pakistan, there are gaps between social functions that cause invisible misunderstandings. Yet the nature of functions is at the heart of my dispute with Smith, so this dispute may be flying the plane right back to the airport from which our hijack began.
I have argued above that the distinction between description and prescription—in other words, between ‘is’ and ‘ought’—cannot be sustained whenever the thing or person in question has a function. Smith seems sympathetic to this argument, but worries that I must hypostatize functions in order to secure its conclusion. A broader and related debate has also arisen between us about the contributions of social scientists in normative philosophy, or ethics. In order to make progress in that broader debate, it seems, I must first assuage Smith’s worries about hypostasis. I have lately forsworn that venerable practice, in my own post on this blog about immanent spirituality: a spirituality that religiously eschews hypostasis and transcendence. I do not think I have committed this sin in my example of the watch, nor would I be tempted to commit it in Smith’s imagined counter-examples. But let us see.
In my example, the watch has a social function—to tell time accurately, so that we may regulate our schedules and keep our appointments. This is to say that people in our society expect certain things from watches, behave towards them in certain ways, speak about them in certain ways, and so on. The (social) function of watches is nothing but the sum total of these expectations and dispositions to behave or speak. Where is the hypostasis? It’s all there for the naturalist to survey, whatever her disciplinary affiliation, although she would survey the relevant facts best if she were an anthropologist.
In a society where people do not expect or behave or speak in these ways, after all, the same object will not have the same function. Anthropologists can adduce a legion of such societies, whereas we laymen can rest content with the amusing 1980 film, “The Gods Must Be Crazy,” which makes comedy out of the misunderstandings created when Kalahari bushmen try to appropriate a Western artifact. The film begins with a pilot dropping a Coke bottle from his plane onto the Kalahari desert, where it is picked up by a bushman who gives it different purposes: e.g., a rolling-pin for snakeskin. To return to our watch example, a society less concerned than ours is with schedule-regulation—such as the bushmen—might show no interest in watches, except as jewelry or amulets or perhaps even paper-weights. To understand what we here in the West do with watches, the people of this society would need to send among us their approximation of an anthropologist, if there were such a role, just as we must send ours among them in order to understand their unique artifacts and social functions. Biologists, physicists, and other practitioners of the hard sciences are, despite their valuable contributions elsewhere, not much use in this endeavor.
From descriptions of these functions, I wish to claim, normative conclusions follow. If you are a physician in our society, for instance, that is both a description and a prescription: it describes the training you have received and the skills you have; but it also commits you to behave in certain ways, to speak in certain ways; indeed, others in our society are entitled to expect certain things of you. The description ‘physician’ thus prescribes any number of rights and duties, both to you and others around you in our society. In another society, however, there may be no role for a physician, or no role quite like ours, although there may be other social functions that resemble it. An anthropologist or a historian is sometimes required to describe these functions, and thereby to convey to us the obligations they entail. Truth be told, there are even social functions within our own society that are similarly opaque to us, at least without the guidance of social scientists like Keane. For thanks to his post, some of the smoke surrounding the social function of the cartoonist has been dispelled. In fact, I must admit, I did not even notice there was smoke there until reading this post.
So here, again, is where social scientists make contributions to normative philosophy, or ethics. They help us understand what the functional roles in a society are, and thus what obligations certain people—physicians, cartoonists—have in certain circumstances. My claim may appear redolent of relativism, although I do not think that this appearance is accurate. I have tried elsewhere on this blog to elaborate an immanent ethics that avoids relativism by making human flourishing the criterion of moral judgment. In brief, here, the physician who finds the rights and duties of her social role intolerable—because she deems them incompatible with her flourishing, the flourishing of others, or even the flourishing of her whole society—may abandon that role, may even become a harsh critic of that role in general. A physician conscripted to serve in a concentration camp, for example, could make such a protest. That is just an extreme example; there are many others all around us, of whistleblowers, conscientious objectors, and the like. Not all roles are created equal.
On that score, now is the best point in my argument to notice the existence of some roles devoted to understanding what flourishing is and how it might be maximized. For my part, I believe that historians and social scientists as well as philosophers occupy such roles. If so, they thereby incur the rights and duties of this devotion. These rights and duties are always negotiable, I add, because these are roles for the study of roles, making them self-reflexive in a way the hard sciences are not. Whenever self-reflexivity becomes involved, as Smith knows, investigation becomes complicated very quickly. This is the sense I would give to Smith’s accurate assessment that “questions of meta-philosophy are quite difficult.” Until Smith, or anyone else for that matter, invites me to pursue that investigation further, I propose returning to the more straightforward role of the watch. More specifically, we should return to Smith’s worries that I have hypostatized this role and function.
I hope to have largely assuaged these worries already, but a few words are in order to address the counter-example he envisions. If he uses a watch as a paper-weight, he assigns it an idiosyncratic function, but he does not thereby deny it the function assigned to it by wider society. For when we see him using his watch as a paper-weight, we can ask him whether he does so because it is broken. That is an intelligible question, even a sensible one, admitting of a range of straightforward answers. Any negative answer would presume that the watch, which is working, fulfills its social function (telling time accurately), while any positive answer would presume that it does not fulfill this function. Unintelligible, by contrast, would be a similar question put to someone who uses, say, a stone for a paper-weight. Imagine coming upon someone using a stone for this purpose and asking: “Is it broken?” Were Wittgenstein to hear us asking such a question, he would smile and say knowingly: “They must be discussing philosophy.”
After our dispute about functions and normativity, there remain a few disagreements between me and Smith that I would like to address quickly. “It is far from obvious,” he writes, “that certain words play central and stable roles in constituting social practices.” On the contrary, this seems rather obvious to me. (But aren’t the answers to all philosophical questions obvious: to half of us, obviously yes; to the other half, obviously no?) Think only of the examples I mentioned in my previous comment: the words ‘freedom,’ ‘family,’ and ‘faith.’ Think only of how the word ‘freedom’ has operated in the practice of American foreign policy of this decade. Think only of how the words ‘family’ and ‘faith’ have operated together in the practice of the conservative culture war of recent decades. To contest the meanings of these words is at once to contest the practices and social functions they partially constitute, and thus also the obligations these social functions underwrite. That is why we debate the meanings of such words, and why our debates become so heated. So much is at stake.
“It is dubious,” as Smith states, “that some particular word is in some sense essential to the well-functioning of a social practice.” To win a public debate over the meaning of freedom will not remedy the missionary habits of our foreign policy overnight, nor will a public debate over the nexus between faith and family unhorse the culture warriors, but these debates could be steps in those directions. To the chagrin of analytic philosophers, we do not find essences in the complicated world studied by social scientists and historians. To invoke Wittgenstein again, this time with his own words: “Back to the rough ground!”
Smith claims that my “take on reflective equilibrium is not quite right,” and when it comes to the precise details of Rawls and the tradition of Anglophone moral and political theory I must defer to his evident expertise. I do not myself understand the distinction between ‘regulative principle’ and ‘methodology,’ let alone the specific methodology of ‘constructivism,’ and so I cannot yet judge whether these qualifications touch my complaint about this tradition’s epistemology. I do, however, understand enough about “the primary constructivist tool…the original position” to be suspicious that these qualifications only confirm my complaint. By using this tool, after all, Rawls asks us to fantasize about human beings stripped of all knowledge of their social identities—whether they are woman or man, slave or free, Jew or Greek—and then to envision which principles of justice they would endorse in that position. The idea is that these principles will indeed be just, since they will be chosen by people without the prejudice engendered by social identity. Do I have it roughly right?
If so, then I have never met with a reply to the objection quickly elicited from many quarters where a particular social identity was considered constitutive of rationality itself. (The most thorough exposition of this objection was MacIntyre’s Catholic magnum opus, Whose Justice? Which Rationality?) Despite the original position’s pretense of neutrality, such critics further objected, one social identity would survive the strip-search that was required in order to come to the Rawlsian table. Not surprisingly, this proved to be the social identity of Rawls himself, and others of his academic and philosophical sub-culture: namely, the social identity of the Western, secular, cosmopolitan, whose identity is to strip himself of all historical and religious and cultural identity, at least when he comes to the table of moral and political discussion. This hypocrisy has always seemed to me a decisive blow against Rawlsian political theory. Although Rawls does try to surpass the ‘intuitionism’ of his philosophical tradition, then, it has always seemed to me that he re-inscribes it on a shinier tablet. If Smith can parry this blow, however, I would be grateful, since I myself believe that political liberalism is the worst form of political philosophy, except all those other forms that have been tried from time to time.
Now, both Smith and I hesitate to enter too deeply into the exegesis of Rawls in this forum, which is frequented more by social scientists than by analytic philosophers. But notice: even as we appear to digress into technicalities of this tradition we nonetheless return to the main theme of my original complaint: that by neglecting history and social science, and thereby neglecting the constitution of normativity and meaning by social functions and identities, the “idealizations of philosophy” denature both morality and semantics, condemning philosophical investigation of each to the realm of fantasy. Ironically, then, no social function more than that of the social scientist, especially that of the anthropologist (or sociologist) whom Smith would exclude from normative discussions, could have drawn Rawls’s attention to the presuppositions he brought to his normative treatise, the presuppositions he shared with his peculiar sub-culture, that of Anglophone philosophy.
I haven’t space to engage Miller’s thoughtful post here – but I must say that at least in my eyes his rhetoric is far more lucid and free of pointless adornment than Keane’s was above.
Miller’s functionalism is something at which many anthropologists and sociologists might blanch. Miller’s functionalism requires a kind of static cultural and semantic homogeneity that is as much a rational ideal as Keane and Miller angrily assert run rampant in the “peculiar sub-culture” of Anglophone philosophy. For a poor person, a watch may be a status symbol, for a rich person a time-piece, or vice-versa. For some a watch may be a link to the past, regardless of whether it can tell time (e.g., if the watch is an heirloom), and for others it may be a pointless accoutrement to a cellphone. Having praised to high heaven the hard-nosed empirical work of anthropologists, then, Miller surprises with his bald presumption that there is enough shared within a society to assign with finality a function to an artifact.
Here, it appears that Miller may benefit from another look at work in his home discipline. Most notably, the work produced by philosophers of biology, often in concert with biologists of many stripes (gasp! philosophers working with empirical scientists… in the 1980s!), has raised profound doubts about functional analysis. The worry is that it is incredibly difficult to identify even *natural* functions, much less functions that depend upon convergence of the intentions of actors within a society. My worry, again, is that positing such convergence is as much an idealization as any an analytic philosopher might make. (For more, I once again refer Miller and others to Ruth Millikan’s Language, Thought, and Other Biological Categories, as well as the work of Jerry Fodor, Fred Dretske, Francisco Ayala, Berent Enç, and Philip Kitcher, to name a few prominent examples.)
We oughtn’t get into Rawls exegesis – Miller is correct. But, at the very least, it’s worth noting that Miller has changed his complaint about Rawls: before the problem was that Rawls’ method involved non-experts blithely balancing their intuitions; now the problem is that the original position presumes a kind of rationality that is culturally specific. This latter objection is well known and doesn’t need to be rehearsed here. All that matters is that we see that with respect to his first line of argument Miller has capitulated and fallen back on appeals to MacIntyre’s work.
A few final notes: A regulative norm, as I used the term, is not a method – it regulates the application of a method. I hope that clears up some confusion.
Also, I never recommended excluding social scientists from normative theory; I just suggested that their contributions qua social scientists would be minimal – at best critical and never substantive. And, I cannot imagine most social scientists bothering with substantive moral theory, as they have other fish to fry. Keane, in his post above, may also be avoiding switching from the frying pan to the fire, although it is hard to tell. Keane makes either an obvious but dubious claim – almost entirely because of its secularity, the secular west is conceptually unequipped to understand the religious east and so sees them as irrational – or a bold but undefended claim – the secular west reacted wrongly to Muslim anger expressed in response to the cartoons in question. The former claim is the province of social scientists; the latter, of normative political theorists. It’s no wonder that there is nothing Keane actually says that supports the latter claim. And, my objection to Keane above was that the former claim simply was not substantiated in his post. Perhaps if I read all the supporting material he supplied later on, I would be convinced. But, perhaps if Keane bothered to read (in an open-minded fashion) some analytic philosophy, his views would change as well. So, off I go to read Keane’s work, and the work of other social scientists.
My suspicion here, ultimately, is that Miller is not fond of substantive political philosophy that cannot be found in MacIntyre’s oeuvre.
One of the most important contributions social scientists make to normative theory emerged in our discussion from—of all places—our hesitant foray into the normative theory of Rawls. His is among the most influential normative theories of the last half-century, and so if social scientists can make a contribution to the debate over the soundness of this theory, let alone a decisive contribution, then a fortiori they can make an estimable contribution to normative theory more generally. Since they have already made such a contribution, however, my argument on their behalf is not merely hypothetical. Before entering into that argument again, we need to remember that Smith does not wish to dismiss social scientists from such debates, only to claim that “their contributions qua social scientists would be minimal—at best critical and never substantive.” In light of this qualification, my argument must show not only that they make valuable contributions to such debates, but further that their conclusions are substantive. Part of the problem with this onus, I shall argue, is that it is by no means clear or uncontroversial what “substantive” means here.
As a witness to the contribution of social science to the debate over Rawls, I called MacIntyre to the stand. Now, of course, MacIntyre is also a philosopher, even primarily one, but when he criticizes Rawls’s normative theory, he does so as much in his capacity as sociologist as in his capacity as philosopher. (Not incidentally, sociology was a field he tilled for a while.) Thus in Whose Justice? Which Rationality? he argues that its liberal assumptions are shared by most of those who discuss it, which occludes these assumptions, so that “the contemporary debates within modern political systems are almost exclusively between conservative liberals, liberal liberals, and radical liberals.” MacIntyre’s historical capacity is still more prominent in that book. Here, for instance, is its final sentence: “The rival claims to truth of contending traditions of enquiry depend for their vindication upon the adequacy and the explanatory power of the histories which the resources of each of those traditions in conflict enable their adherents to write.” Written like a true Hegelian.
By MacIntyre’s own lights, then, it is impossible to tease apart the history from the philosophy in his book, or in any book he thinks will make a substantive contribution to the debate between his famous three rival versions of moral enquiry. Rival normative theorists may dispute the second claim, ironically confirming MacIntyre’s diagnosis of their differences, but they cannot deny the first without disputing his own accounting of his project. Nor is he alone in this project. Charles Taylor’s historical and philosophical vocations are likewise inseparable in A Secular Age. Contemplating their shared method, then, I feel more confident than ever in my belief that the social sciences and history make substantive contributions to normative theory. For if it be agreed, first, that these books make contributions to normative theory, second, that they could not do so without their sociological and historical erudition, and third, that these contributions are substantive, the onus placed upon my argument by Smith has been discharged.
The crux here, I expect, is the third condition. What exactly counts as a substantive contribution to normative theory? Must such a contribution provide a decision procedure for moral and political action—a new Categorical Imperative, Principle of Utility, or Two Principles of Justice; a new Bill of Rights, Geneva Convention, or Universal Declaration? If so, the contributions of these Catholic Hegelians are not substantive. And so much the better for them. Although lists of rights, guidelines of justice, and occasionally even decision procedures for political action have their place, I have criticized here the confusion of these lists, guidelines, and procedures with the whole of normativity. To focus my case now on political theory more specifically, recall that many of the great political philosophers made no such contribution. Take the first great one, Plato. What was his substantive contribution in Republic? Did he recommend that we institute the utopia it described, with all of its absurd norms? It is not at all clear. What about Augustine? What was the substantive contribution to normative theory in his City of God? Again, if we seek decision procedures, we come up short. A reader of the work of MacIntyre or Taylor faces the same practical challenge: what does it mean to apply their ideas to political life? In my view, this difficulty is a virtue; it frustrates the persistent temptation to ossify moral life, to relieve us of the existential burden of freedom.
Lest Smith, or anyone else, think I am hostile to political liberalism, which indulges this temptation to an extent, let me repeat my adaptation of Churchill that “political liberalism is the worst form of political philosophy, except all those other forms that have been tried from time to time.” In other words, Smith’s ultimate suspicion that I am “not fond of substantive political philosophy that cannot be found in MacIntyre’s oeuvre” is a bugbear. MacIntyre is a merciless critic of political liberalism. As for my own tastes, I am not sure how they are relevant here, but if they are in some way I cannot foresee, here they are: among ancient political philosophers, my favorite is Plato; among moderns, de Tocqueville; finally, among the living, Taylor—yes, despite the fact that everything I have so far posted to this blog has been critical of him. (Careful and polite criticism has always been a token of my respect.) If Smith can find in this eclectic mix some inflexible agenda, however, I would be forever grateful, since I would then learn definitively what I myself believe when it comes to these bewitching questions!
For all my admiration of Whose Justice? Which Rationality?, in particular, I never intended to have “fallen back on appeals to MacIntyre’s work,” as Smith writes, so I would like to make it clear what I did intend. The following excerpts from my last comment should help: “Despite the original position’s pretense of neutrality,” I wrote, allying myself with critics from the social sciences, “one social identity would survive the strip-search that was required in order to come to the Rawlsian table.” If this is true, as I believe, then Rawls’s original position fails on its own terms as a tool of normative theory: it is not neutral with regard to social identities. So, which social identity survives the strip-search? “Not surprisingly,” I added, again in concert with social scientists, “this proved to be the social identity of Rawls himself, and others of his academic and philosophical sub-culture: namely, the social identity of the Western, secular, cosmopolitan, whose identity is to strip himself of all historical and religious and cultural identity, at least when he comes to the table of moral and political discussion.”
Although I referred to MacIntyre’s work in this connection, since it provides the most erudite exposition of this criticism, I am not appealing to it as though making an argument from authority. On the contrary, I believe the argument stated as simply as I have tried to do works quite well—so long as it is true that Western, secular, cosmopolitans feel that nothing is lost when they fantasize about entering the original position, whereas inhabitants of other cultures or sub-cultures feel that they could not enter this position and still make a ‘rational’ decision. By entering it, according to them, they would be stripped of the very criteria they use to decide rationally: the Decalogue, the Magisterium of the Catholic Church, the latest Fatwa, or what have you. Smith says that this “objection is well-known and doesn’t need to be rehearsed here.” Whether or not it needs to be rehearsed, I have done so—because I believe it poses a serious problem for political liberalism—and I once again invite Smith, whom I acknowledge as an expert in these matters, to enlighten me with a reply to it.
Neither did my elaboration of my objection to Rawls—moving from a critique of ‘reflective equilibrium’ to a critique of ‘the original position’—justify Smith’s claim that “with respect to his first line of argument Miller has capitulated.” For my first line of argument was not what Smith represents it to have been. According to this misrepresentation, “before the problem was that Rawls’ method involved non-experts blithely balancing their intuitions.” I did discuss, all too briefly (thereby fostering this confusion), how Aristotle’s moral epistemology was superior to that of Rawls. When faced with competing endoxa, whether in ethics, psychology, or any subject for which there are endoxa, Aristotle sought a consistent account, as does Rawls. Unlike Rawls, however, Aristotle had a method for determining which endoxa should be abandoned, which should be revised, and which should be wholly retained. This is his method of epagōgē, or induction, and the distinction between experts and non-experts is only marginally involved in it, but that is not what matters to our debate.
What does matter here is that Aristotle had a method to make the determination of which endoxa to keep and which to abandon without having to rely—as Socrates did—upon implicit regulative norms, which is most often a fancy term for custom. Rawls, like Socrates, has to rely on these implicit norms to secure agreement, even when he offers the original position as a tool. When we all enter the original position, he believes, we shall all agree upon the two principles of justice, but he neglects how his own customs have excluded some people from entering that position without sacrificing the criteria they use to make rational decisions. That has been my objection all along, both in my first line of argument and in my second. Rawls’s normative theory relies on custom, and that is why social scientists are its best critics—they are experts in eliciting implicit customs. Smith correctly observed that Rawls was a critic of Intuitionism, where these customs and norms were just below the surface; he offered instead the tool of the original position. But since social scientists and their philosophical allies have exposed the regulative norms disguised by this more sophisticated tool—namely, the regulative norms shared by Western, secular, cosmopolitans—the basic flaw of Intuitionism remains.
At this point I would like to turn to Smith’s replies to what he is calling my ‘functionalism.’ Anthropologists and sociologists might indeed blanch at the title he gives to my position, but they have nothing to fear from its content. For it is not the case, as Smith writes, that “Miller’s functionalism requires a kind of static cultural and semantic homogeneity that is as much a rational ideal as Keane and Miller angrily assert run rampant in the ‘peculiar sub-culture’ of Anglophone philosophy.”
I shall begin to redress this mischaracterization of my position in a moment, but first allow me to protest that I have made no angry assertions in my comments on this post, nor have I knowingly felt anger thus far in the debate. If analytic philosophers are to become aware that they inhabit a “peculiar sub-culture,” and they do, they must allow others—especially experts in such cultural descriptions—to describe it without defensively assuming that these experts are reacting angrily, or under the sway of any other irrational passion. (This is another remarkable parallel with the sad episode of the Danish cartoons.) These experts are social scientists, and a good entrée into the work they have done on this sub-culture is Jonathan VanAntwerpen and David Kirp’s chapter on the NYU Philosophy department in Shakespeare, Einstein, and the Bottom Line.
If I must confess my emotions, though, I am as grateful for the graduate training I received in analytic philosophy as I am for the friends I made during that training—Smith among them. Analytic philosophy has a crucial role in both our universities and our society, emphasizing logical rigor and clear expression. Indeed, it is with no small debt to analytic philosophy itself—that is to say, with no small effort to be rigorous and clear—that I marshal my argument against its anachronistic allegiance to a simplistic version of the ‘is’ / ‘ought’ distinction and an idealized moral epistemology that would diminish the contributions of social scientists and historians. If I am an apostate, in sum, I am trying to be a faithful one.
Now to the content of my ‘functionalism,’ and an exoneration of it from Smith’s allegation that it requires a static cultural and semantic homogeneity. Traditional versions of functionalism, versions with which he is familiar, do indeed require this. I welcome his recommendation of more analytic literature on this subject, but I am already a convert to the position he urges upon me: even the literature I have already read has been enough to convince me that biological functions are very difficult—perhaps even impossible—to determine in particular cases. (I am thinking especially of John Dupré’s The Disorder of Things.) And yet I do not see corresponding problems for determining the functions of watches or physicians or even, dare I say, human selves.
As for watches, I agree with Smith that “for a poor person a watch may be a status symbol, for a rich person a time-piece, or vice-versa.” The possibilities are infinite, and Smith understandably experiences surprise at what he took to be my view, that “there is enough shared within a society to assign with finality a function to an artifact.” But I have never believed there to be any finality in these matters; they are always open to revision, contestation, sometimes even outright reversal. This is particularly true with regard to the functions of social roles. One generation’s physicians may become the next generation’s torturers, as my example of the concentration camp was meant to suggest. The fluidity of social functions is one reason why social scientists must constantly renew and revise their findings. An analogy with meaning is once again apposite: the meanings of words are situated in complex webs of fluid practices, and so anyone who seeks to assign meanings with any finality will fail (as every writer of a dictionary has learned); indeed, anyone who seeks to describe the meaning of even one word comprehensively will find this difficult (as every contributor to the OED has learned); but it follows neither that there are no meanings, nor that the believer in meaning has hypostatized anything. So likewise for functions.
What about the function of the human self itself? I do believe there is such a thing, and here is where anthropologists will likely blanch, although a few words would help them recover their color. Following Plato, Aristotle notoriously thought our function was rationality, where rationality was given a specific content that excluded the participation of women, natural slaves, and barbarians. But that is not the sort of function I ascribe to the self. The trick here is to articulate our function in such a way as to avoid two shoals: on the one hand, a function so specific that it freezes human potentials and homogenizes human achievements; on the other hand, a function so general that it lacks content and is nothing but a cipher. In my effort to navigate between these shoals, I wish to argue that our function is creativity.
The most pressing danger for this effort is the second; it risks producing a cipher. In my first two posts on this blog, however, I have begun to fill in the content of this function by discussing Freud and Nietzsche. For in my view, no recent philosophical ideals illustrate this function better than Freud’s psychoanalyst and Nietzsche’s Übermensch. Taking just the latter for now, we see how this hero is supposed to overcome himself, surpass himself, exceed himself; in other words, he is consummately creative, turning himself into a work of art. Creativity, Nietzsche seems to think, is the human function. This is why he praises Goethe above all. But if this interpretation is right, then all the many interpretations that would reduce the Übermensch to a formula—military conqueror, let alone blond beast—have utterly missed the point. They have surrendered to the temptation I described above, while discussing political liberalism, to regiment our lives according to a static and homogenous formula (curiously, the very complaint Smith raised against me). The problem with such regimentation, not only as a key to Nietzsche, but more importantly as a guide to life, is that it forfeits our function: creativity.
My first two posts—Psychoanalysis as Spirituality, and Immanent Spirituality—touched upon our creative self, but they did not say enough to satisfy anyone who brings justifiable skepticism to my conclusion. That shortcoming was partially due to the limited space afforded these posts, but largely due to a deliberate omission on my part. The best argument for this creative understanding of the self is not found in any modern author, I believe, but instead in an ancient author, a writer of obscure aphorisms that have consequently been neglected by philosophers intent on clarity. And so to remedy this omission, I have submitted a third post, “Heraclitean Spirituality,” which should—providing it sees the light of the computer screen—make my version of ‘functionalism’ clearer, if not also persuasive to such philosophers.