Way back on my first two-envelope post, commenter Mark pointed out that the St. Petersburg two-envelope problem has the following property: though it doesn't make sense to switch before the envelopes are opened, if you see what's in the first envelope, it makes sense to switch *no matter what is in the first envelope*. The EU of the second envelope is infinite, so once you know the finite sum that's in the first envelope, you will expect to gain by switching.
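(To see why the EU is infinite, here's a minimal sketch assuming the standard St. Petersburg payoffs: 2^n utils with probability 2^-n, for n = 1, 2, 3, .... Each outcome contributes exactly 1 util to the expectation, so the partial sums grow without bound.)

```python
# Truncated expected value of a standard St. Petersburg gamble:
# payoff 2**n with probability 2**-n, for n = 1, 2, 3, ...
# Each term contributes (2**-n) * (2**n) = 1 util, so the partial
# sums grow without bound -- the full expectation is infinite.

def truncated_ev(n_terms):
    """Expected value counting only the first n_terms outcomes."""
    return sum((2 ** -n) * (2 ** n) for n in range(1, n_terms + 1))

print(truncated_ev(10))   # -> 10.0
print(truncated_ev(100))  # -> 100.0
```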

Brian has since put up another two-envelope problem in which the two envelopes are indistinguishable before you open them--but if you see what's in the first envelope, switching has a *finite* positive EU no matter what's in the first envelope. (Call it the Broomean problem--it's not exactly Broome's case, but the differences don't matter.)

There's an important difference between these two cases, though. [Warning! This post rambles worse than usual.]

In the St. Petersburg two-envelope case, the switch can effectively be repeated indefinitely. The numbers in the two envelopes are the results of completely independent processes. If God feels like it, S/He can keep producing new St. P envelopes, and offering to sell them to you for 1 util more than the value of your last St. P. Sounds like you'll keep getting your substance whittled away, and you'll never wind up keeping your St. P.

In the Broomean case, one switch is all they can get you to do. The numbers in the envelopes aren't produced by independent processes. After you see the number in the first envelope, you may pay to switch--but after you see the number in the second envelope, you know whether or not it makes sense to switch back. There's not even any risk involved. There's no way to keep getting you to disadvantage yourself indefinitely. To give you a choice that's analogous to the original choice to switch, they'd have to make a new envelope with (in Brian's case) a 0.6 chance of half what you got in the last envelope, and a 0.4 chance of twice what you got in the last envelope (or 2 for sure if the last envelope had 1). And it seems perfectly sensible to take that envelope. You might wind up gaining on both switches.
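(A quick sketch of the arithmetic here, using the Broomean numbers from Brian's case as described above: if the first envelope shows v > 1, the new envelope is worth v/2 with probability 0.6 and 2v with probability 0.4, for a conditional EU of 1.1v -- a gain of 0.1v. If v = 1, the new envelope is 2 for sure.)

```python
# Conditional EU of the new envelope in the Broomean setup:
# given observed value v, the new envelope holds v/2 with
# probability 0.6 and 2*v with probability 0.4
# (or 2 for sure if v == 1).

def switch_ev(v):
    """Expected value of the offered envelope, given the last value v."""
    if v == 1:
        return 2.0
    return 0.6 * (v / 2) + 0.4 * (2 * v)

for v in [1, 2, 4, 8]:
    # the expected gain from switching is 0.1 * v (or 1 util at v == 1)
    print(v, switch_ev(v), switch_ev(v) - v)
```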

(Of course, if you're offered an indefinite number of Broomean cases, you can be made to take an indefinite number of Dutch books. But you also get to keep the results of an indefinite number of bets of infinite EU, so things aren't so bad!)

All this, I think, rests entirely on uncontroversial principles of conditional probability, not on any of my attempts to go further with the two-envelope problem. In the St. Petersburg case, the process of determining the value of each envelope is independent--so learning what's in one envelope doesn't change the conditional probability of what's in the next. In the Broomean case, the values aren't independent.

So, suppose you're offered an infinite number of St. Petersburg switches, each costing a dollar more than the value you got the last time. At each stage your EU is infinite; you simply have no reason to stop.

This isn't even as bad as the cases like Arntzenius, Elga, and Hawthorne's "Trumped"--where every day you're offered two days in Heaven if you take one day in Hell first, and so you take an infinite number of days in Hell and never get to Heaven. In the St. Petersburg switch, at every stage you have an infinite EU. It doesn't even seem to make sense to ask how many utils you'd have if you could make an infinite number of these transactions--that would be the outcome of the last St. Petersburg minus a countable infinity, but "the last St. Petersburg" isn't defined.

Note, though, that if your goal is simply to attain a certain high number of utils, you can accomplish that with probability 1 in this scenario (I think). Set your target at N--so long as each St. P costs only 1 more than the payoff of the last one, the probability should be 1 that you eventually get a payout of at least N. So this is a sense in which the infinite St. P switch really is arbitrarily good--at some stage you're almost guaranteed to hit a target that's at least as high as you like. (By definition of utils, your goal can't just be to attain a certain high number of utils--more utils are always preferred. I don't really believe in unbounded utility, so that's not going to keep me up at night.)
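(The probability-1 claim can be checked by simulation. A sketch, assuming the standard 2^n payoffs: each independent St. P draw clears any fixed target N with some positive probability, so repeated draws clear it eventually with probability 1--the only question is how many envelopes it takes.)

```python
import random

def st_petersburg_draw(rng):
    """One St. Petersburg payout: flip until tails, doubling each head.

    Payoff is 2**n with probability 2**-n.
    """
    n = 1
    while rng.random() < 0.5:
        n += 1
    return 2 ** n

def draws_until_target(target, rng):
    """Number of independent St. P draws until one payout reaches target."""
    draws = 0
    while True:
        draws += 1
        if st_petersburg_draw(rng) >= target:
            return draws

rng = random.Random(0)  # seeded for reproducibility
print(draws_until_target(1000, rng))  # envelopes needed to clear 1000 utils
```

A draw of at least 1024 has probability 2^-9 = 1/512, so clearing a target of 1000 typically takes a few hundred envelopes--but it always happens eventually.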

It should be possible to tell a disturbing tale about the infinite St. P switch, though... in the next post.

Posted by Matt Weiner at February 4, 2004 12:55 PM