October 02, 2004

Lies and Credibility

[NOTE: This is very untopical; I started working on it a few days ago. You will even notice me passing up a chance at a political cheap shot. It does, however, sum up some of my key philosophical ideas.]

Hilzoy at Obsidian Wings discusses the fact that it is not uncommon for politicians to lie--and that it's only relatively recently that we've got used to this. Not surprisingly, she looks at this from a moral standpoint:

While some people seem to think that it's inevitable that politicians lie, it is not. There is nothing about deciding to run for elective office that automatically strips politicians of their principles and renders them incapable of telling the truth. Moreover, there is nothing about being a citizen that forces us to accept this state of affairs. We could, if we wanted to, take the fact that a politician tells a flat-out falsehood as a serious strike against him or her, a consideration that might be outweighed by something even more important, but that was as important as, say, that politician's stand on taxes or the environment. And I think we should.

And I, unsurprisingly, want to take an epistemological standpoint, because this has a direct connection to my views on testimony (which I've written about a bit). Specifically, it might endanger one of my key theses: that the teller has a responsibility to tell the truth that is grounded in purely epistemological considerations.

That argument works, quickly, as follows:

(1) You have an interest in having people believe what you tell them.
(2) If you tell falsehoods, people shouldn't believe what you tell them.
(3) So if you tell falsehoods, you shouldn't be able to accomplish something that you have an interest in.
(4) Put another way: If you tell falsehoods, you should have a certain bad consequence happen to you.
(5) "If you don't do X, then you should have bad thing S happen to you" implies that you are responsible for doing X, on pain of sanction S (this is adapted from A.R. Anderson's analysis of "ought").
(6) In sum: You are responsible for telling the truth in that you stake your credibility on the truth of your testimony.
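
For readers who want the formal backbone of step (5), here is a minimal sketch. Anderson's reduction of deontic logic, roughly as it is usually stated, defines obligation by way of a propositional constant S, read as "the sanction obtains" (or just "something bad happens"):

    O(A) \equiv \Box(\lnot A \rightarrow S)

that is, it ought to be that A just in case, necessarily, if A fails then the sanction obtains. The adaptation in (5) reads A as "you tell the truth when you testify" and S as "your testimony loses its credibility"; (1)-(4) give (a normative version of) the right-hand side, and the reduction licenses reading that as the responsibility claimed in (6). The particular choice of A and S is my gloss on the argument, not anything in Anderson.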

But, as Hilzoy points out, politicians (among other people) can lie and get away with it. That looks like it presents an obvious problem for my account. It doesn't--the problem is serious, but not obvious.

The obvious problem may look like it arises with premise (2). What happens when a politician tells a falsehood and people believe him anyway? But that's not incompatible with (2), because (2) says "If you tell falsehoods, people shouldn't believe what you tell them." Your hearer should believe what she's justified in believing; that doesn't mean she will believe what she's justified in believing. She may believe what you tell her even though your testimony doesn't provide evidence for what you say.

In big words, her epistemic responsibility is to form beliefs that accord with the evidence. When you've lied, your testimony won't provide evidence for what you've said; if she believes it anyway, she'll be believing in an epistemically irresponsible manner. Proverbially, "Fool me once, shame on you; fool me twice, shame on me."

The "should" is important for the move from (4) to (5). If not doing X will have bad consequences, that doesn't show that you're responsible for doing X--falling sick isn't a punishment for not dressing warmly. The sanction itself has to be applied in a normative manner. (You're prudentially responsible for dressing warmly enough; but the sanction for not doing so is that you can appropriately be criticized for behaving imprudently, not the bad consequences that follow naturally.)

So it's no problem for my thesis that liars are sometimes believed. What is a problem for my thesis is if sometimes liars should be believed--if someone may lie and still be fairly confident that his audience will be justified in believing his future testimony.

How could that happen? Hilzoy, again:

It is, of course, possible to tell who is lying. I, for instance, often know. But that's because I am the sort of person who actually likes to watch CSPAN panels on natural gas pricing, and read GAO reports on the security of shipping containers. For some reason, most people don't share this taste; even my friends find it somewhat eccentric. Moreover, lots of people don't know the sorts of basic things about policy that they'd need to know in order to sort out truth from falsehood.

And Bernard Yomtov, in comments:

I think you ignore the media's responsibility here. It has often been pointed out elsewhere that much newspaper reporting has descended to reporting statements made, with little or no effort to evaluate their truth. The point bears repeating. Politicians would lie less if they were called on it more often. Isn't that the job of the press?

Take ordinary citizen Y who gathers information from the newspaper--and not by reading every single article in detail. It seems reasonable in some sense for Y not to spend all her time poring over the paper. Then suppose politician X, who has lied frequently in the past, says that there is a growing threat from Molvania. (It doesn't matter whether X is telling the truth this time--only that he wants Y to believe it.) What should Y think?

Well, on the evidence that's available to Y, X is no more or less trustworthy than any other politician. All Y knows is that X has said this; she doesn't have a lot of information about X's record. So it seems that Y is justified in believing what X says about Molvania, even though X has lied in the past. X has lied but should still be believed; so the sanction (as described in (4) and (5)) doesn't apply even normatively.
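
To put a toy number on Y's predicament (the figures are invented purely for illustration and carry no weight in the argument): suppose assertions of this sort turn out true about 80% of the time when made by politicians in general, but only about 40% of the time when made by X, and all the evidence for that second figure is locked up in CSPAN archives and GAO reports that Y reasonably never reads. Then, schematically,

    P(\text{Molvania claim} \mid \text{Y's evidence}) \approx 0.8, \qquad P(\text{Molvania claim} \mid \text{X's full record}) \approx 0.4

and a belief formed on the first figure looks justified by any ordinary standard, even though the evidence that exists--if only it reached her--tells against it. That is the sense in which the sanction fails to bite.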

What to do? One point is that your loss of credibility needn't apply to all your future utterances to count as a sanction. You're going to say a lot of things in the future, and it's a sanction if even some of them shouldn't be believed because of your past false testimony.

The problem here is that it seems as though the information on politicians' truth-telling records might be so poorly disseminated that most people won't be in a position to use that information for judging their future testimony. So, for most Y, it will not be the case that Y should not believe what X says because of X's past false testimony. My account tends to presuppose that your record is going to get out--if you tell enough falsehoods, eventually people will have heard about them, so they won't be justified in believing what you tell them. But if the institutions that give most people their information are broken, that might not be true.

Perhaps I should answer this by moving the locus of epistemic responsibility from the individual hearer to the society. So not only will individuals have epistemic responsibility for their beliefs; a society will have an epistemic responsibility for ensuring that individuals have enough information to arrive at true beliefs. A society that doesn't do that isn't well set up epistemically. Then my original steps (2)-(5) will still go through: if you tell falsehoods, you shouldn't be believed, because society should make sure that the falsehoods are well enough publicized that people have evidence that their informants are liars. (For politicians, it's important that dissemination be widespread; for most of us, our reputation need only extend through our social circle.) What the hearer does with the information is up to her.

Anyway, that's a thought. The idea of a social epistemic responsibility is a pretty big one that could use some more working out, but I remain somewhat optimistic that we can establish a sense in which someone who tells falsehoods shouldn't be believed.

Posted by Matt Weiner at October 2, 2004 01:49 PM