Philosophy: Truth or Dare?

Many years ago I was having a long chat with someone who later became a well-known philosopher. His work was already way cool, but looking at the theses he defended, I told him he must be aiming for the Annual David Lewis Award for Best-Defended Very Weird View. He told me that he did not always believe the views he defended. He was most interested in seeing how far he could go defending an original, counter-intuitive proposition as well as he could. What did I think? I said that it seemed to me that some philosophers seek the Truth but others choose Dare.

I am more of a Truth Philosopher than a Dare Philosopher, but I doubt it’s a matter of principle, given that my personality is skewed towards candor. I’m just not a natural for writing things in which I don’t have high credence at the time of writing. However, if you are human, should you ever have high credence in a view like, say, compatibilism, which has, for a long time, been on one side of not only a peer disagreement but a veritable peer staring contest? Looking at it from one angle, the mind boggles at the hubris.

Zach Barnett, a Brown graduate student, has been working on this and has a related paper in Mind. I asked him to write about it for Owl’s Roost and he obliged. Here goes:

I want to discuss a certain dilemma that we truth-philosophers seem to face. The dilemma arises when we consider disagreement-based worries about the epistemic status of our controversial philosophical beliefs. For example:

Conciliationism: Believing in the face of disagreement is not justified – given that certain conditions are met.

Applicability: Many/most disagreements in philosophy do meet the relevant conditions.

————————————————————————————————

No Rational Belief: Many/most of our philosophically controversial beliefs are not rational.

Both premises of this argument are, of course, controversial. But suppose they’re correct. How troubling should we find this conclusion? One’s answer may depend on the type of philosopher one is. 

The dare-philosopher needn’t be troubled at all. She might think of philosophy as akin to formal debate: We choose a side, somehow or other, and defend it as well as we can manage. Belief in one’s views is nowhere required.

The truth-philosopher, however, might find the debate analogy uncomfortable. If we all viewed philosophy this way, it might seem to her that something important would be missing – namely, the sincerity with which many of us advocate for our preferred positions. She might protest: “When I do philosophy, I’m not just ‘playing the game.’ I really mean it!”

At this point, it is tempting to think – provided No Rational Belief is really true – that the truth-philosopher is just stuck: If she believes her views, she is irrational; if she withholds belief, then her views will lack a form of sincerity she deems valuable.

As someone who identifies with this concern for sincerity, I find the dilemma gripping. But I’d like to explore a way out. Perhaps the requisite sort of sincerity doesn’t require belief. An analogy helps to illustrate what I have in mind.

Logic Team: You’re on a five-player logic team. The team is to be given a logic problem with possible answers p and not-p. There is one minute allotted for each player to work out the problem alone, followed by a ten-second voting phase, during which team members vote one by one. The answer favored by a majority of your team is submitted.

      Initially, you arrive at p. During the voting phase, your teammate Vi – who, in the past, has been more reliable than you on problems like this one – votes first, for not-p. You’re next. Which way should you vote?

Based on your knowledge of Vi’s stellar past performance, you might suspect that you made a mistake on this occasion. Perhaps you will cease to believe that your original answer is correct. Indeed, you might well become more confident of Vi’s answer than you are of your own.

It doesn’t follow, though, that you should vote for Vi’s answer of not-p. If all you care about is the accuracy of your team’s verdict, it may still be better to vote for your original answer of p.

Why? In short, there is value in having team members reach independent verdicts. To the extent that team members defer to the best player, independence is diminished. This relates to a phenomenon known as the “wisdom of the crowd,” and it relates more directly to Condorcet’s Jury Theorem. But all of this, while interesting, is beside the point.
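The arithmetic behind this claim is easy to check directly. Here is a minimal sketch, with made-up accuracy numbers (the post gives none): suppose Vi answers correctly 90% of the time and each of the other four players 80% of the time, with everyone voting independently. Enumerating all possible outcomes shows that the independent majority beats simply deferring to Vi:

```python
from itertools import product

# Hypothetical accuracies (assumptions, not from the post):
# accuracies[0] is Vi, the strongest solver; the rest are weaker teammates.
accuracies = [0.9, 0.8, 0.8, 0.8, 0.8]

def majority_accuracy(probs):
    """Exact probability that a majority of independent voters is correct."""
    total = 0.0
    for outcome in product([True, False], repeat=len(probs)):
        # Probability of this particular pattern of correct/incorrect votes.
        p = 1.0
        for correct, acc in zip(outcome, probs):
            p *= acc if correct else 1 - acc
        if sum(outcome) > len(probs) / 2:  # majority got it right
            total += p
    return total

independent = majority_accuracy(accuracies)
deferential = accuracies[0]  # everyone just copies Vi's vote

print(f"independent majority: {independent:.4f}")  # 0.9574
print(f"defer to Vi:          {deferential:.4f}")  # 0.9000
```

With these numbers, voting your own (less reliable) answer raises the team's expected accuracy from 0.90 to about 0.96. The effect depends on the teammates being reasonably accurate and independent; if their accuracy drops too low (e.g. 0.7 each), deferring to Vi wins.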

In light of the above observations, suppose that you do decide to vote for your original answer, despite not having much confidence in it. Still, there is an important kind of sincerity associated with your vote: in a certain sense, p seems right to you; your thinking led you there; and, if you were asked to justify your answer, you’d have something direct to say in its defense. (In defense of not-p, you could only appeal to the fact that Vi voted for it.) So you retain a kind of sincere attachment to your original answer, even though you do not believe, all things considered, that it is correct.

To put the point more generally: In at least some collaborative, truth-seeking settings, it can make sense for a person to put forward a view she does not believe, and moreover, her commitment can still be sincere, in an important sense. Do these points hold for philosophy, too? I’m inclined to think so. Consider an example.

Turning Tide: You find physicalism more compelling than its rivals (e.g. dualism). The arguments in favor seem persuasive; you are unmoved by the objections. Physicalism also happens to be the dominant view.

      Later, the philosophical tide turns in favor of dualism. Perhaps new arguments are devised; perhaps the familiar objections to physicalism simply gain traction. You remain unimpressed. The new arguments for dualism seem weak; the old objections to physicalism continue to seem as defective to you as ever. 

Given the setup, it seems clear that you’re a sincere physicalist at all points of this story. But let’s add content to the case: You’re extremely epistemically humble and have great respect for the philosophers of mind/metaphysics of your day. All things considered, you come to consider dualism more likely than physicalism, as it becomes the dominant view. Still, this doesn’t seem to me to undermine the sincerity of your commitment to physicalism. What matters isn’t your all-things-considered level of confidence, but rather, how things sit with you, when you think about the matter directly (i.e. setting aside facts about relative popularity of the different views). When you confront the issues this way, physicalism seems clearly right to you. In philosophy, too, sincerity does not seem to require belief (or high confidence).

In sum, perhaps it is true that we cannot rationally believe our controversial views in philosophy. Still, when we think through the controversial issues directly, certain views may strike us as most compelling. Our connection to these views will bear certain hallmarks of sincerity: the views will seem right to us; our thinking will have led us to them; and, we will typically have something to say in their defense. These are the views we should advocate and identify with – at least, if we value sincerity. 

I find the proposed picture of philosophy attractive. It offers us a way of doing philosophy that is immune to worries from disagreement, while allowing for a kind of sincerity that seems worth preserving. As an added bonus, it might even make us collectively more accurate, in the long run.

That was Zach Barnett. Do I agree with him? As is usual when I talk to conciliationists, I don’t know what to think!

8 thoughts on “Philosophy: Truth or Dare?”

  1. I wonder if there’s another component to rationality here, which operates in philosophy as well as inquiry generally. Typically our reasoning isn’t “finished”. So, for example, staying with physicalism despite recognizing that all the best thinkers are for dualism might be a gamble on what’s out there that hasn’t been found yet. It’s a stance: “I’m betting that long-term, this is the strategy that is going to play out more successfully”.


    1. This sounds quite a bit like a suggestion Sandy Goldberg makes, in a footnote of a highly relevant paper called “Defending Philosophy in the Face of Systematic Disagreement” (fn. 5). He says, roughly, that there’s something off about one’s defending a view, if one doesn’t think that the view will be vindicated in the long run.

      I certainly agree with your suggestion that considerations of this kind might factor into someone’s thought process. But I’d resist the stronger suggestion that there’s something problematic about not having this opinion about one’s views.

      I have a friend who is inclined toward panpsychism. But he knows it’s a fringe view, and he’s very humble. I suspect that his all-things-considered confidence in panpsychism is pretty low. He probably also thinks that, in the long run, panpsychism is not likely to be vindicated. (After all, he thinks it’s probably wrong. So why should he think that it is likely to be vindicated?) Still, panpsychism seems right to him, and I think it makes sense for him to defend that view.


  2. Hey Nomy and Zach – long time listener, first time caller.

    Here is a (relatively) modest claim: I don’t think I’m either a truth or a dare philosopher. I’m definitely not truth – I have very few philosophical beliefs, and I don’t have a high credence in almost any of the claims I have written down. This is partially due to the disagreement Zach talks about. But this is also because, for a lot of questions, I can “see” the issue from every side. That is, I can inhabit a mindset where certain arguments seem very compelling, and the objections don’t seem intractable. But I can also switch to the opposing mindset.

    When I write papers, what I tend to do is get into one of these mindsets, and it helps me come up with arguments. I’m not just picking a claim to see how far I can push it – in some sense I really do think that there is much to be said for that particular claim. But the whole time, in the back of my mind, I know that I can also switch mindsets.

    Here is a claim that isn’t modest at all (and one I’m not sure I endorse, but maybe I do?): My sister once told me that her high school teacher said that if you understand an issue well, you can argue both sides. I think this is false as a general claim. But I kind of think something close to it is true for many philosophical issues. When I’m teaching, for example, I will often have undergraduates say something like “I don’t see how anybody can believe this view.” When I hear this, I often think (though never say), “that’s because you haven’t thought about the issue enough.” And when there are issues that I can’t inhabit multiple “mindsets” about, I think this is evidence that there is some aspect of the issue that I don’t quite understand.

    Maybe this adds up to another reason to suspect that we shouldn’t believe our philosophical views?


    1. Cool point, Han. I can’t speak for Nomy, but I’m open to the idea that there are ways of doing philosophy that don’t fall neatly into either camp.

      I can identify with your mindset-entering to an extent, but I find that, often, at the end of the day, one position seems more plausible to me than the others (setting aside information about how popular the different positions are). I wonder if I’m too much like your imagined undergrad…


  3. This sounded a bit to me like the kind of sincerity preserved here is: what it would be epistemically rational to believe, if that belief were based on first-order evidence ONLY. Higher-order evidence is bracketed.

    How close — or how far — is that from the view described here?


    1. Yep, you have the right idea. In the paper, sincerity is one desideratum among others. More generally, I’m concerned with what attitude we should take toward our views, in light of disagreement.

      As you anticipate, I do appeal to bracketing. And I do think that *some* higher-order evidence will get bracketed. For reasons that are subtle, I don’t think all higher-order evidence should be bracketed. A quick analogy…

      Suppose I’m on an admissions committee. And suppose I learn that I tend to rate male candidates as more deserving than their equally qualified female counterparts. This is higher-order evidence, since it is evidence about my ability to competently evaluate first-order evidence. But it seems like it would be appropriate for me to take this evidence into account, in arriving at a view about whom to admit.

      If you’re curious, section 7 of the paper deals with these issues. It tries to describe which evidence should be bracketed, and why.

      Thanks for the perceptive comment.


  4. “When I do philosophy, I’m not just ‘playing the game.’ I really mean it!”,.. post modern truth means: I am-we are the means (ones-own instincts sensations emotions mentations) of meaning for meaning…origins of truth of evidence…

