Sunday, October 28, 2007

What's rational?

All arguments depend upon how that question is answered, but how is it to be done? (Maybe subjective probabilities are relevant, but the St Petersburg paradox makes me unsure; the following sketch of that paradox is based on a post of mine in May:) Imagine the Supreme Being offering you the following deal, on Her fair tossing of a fair coin (you can tell somehow that it is the Supreme Being talking to you, and that She is no deceiver, and so you rightly believe all that She tells you, e.g. that the tossing will be fair)...
......The deal is that in exchange for you playing the following game (as detailed below) She will give you the entire wealth of the Universe. In effect, you would become Her appointed (and hence the absolute) Ruler of the Universe (in exchange for you owing Her an amount determined by the game below). To simplify matters, assume that She has shown you that, whether or not you take Her up on this deal, you will live forever in some form or another (e.g. as an immortal soul), and that the wealth of the Universe includes alien medical technology that can prolong your natural life within it indefinitely; and also teleportation devices, so that you could actually spend all that wealth. Conversely She could, if necessary, make you pay Her arbitrary amounts over and above your new wealth, were you to end up owing Her money (were you that unlucky at the following game), by getting you to work for Her, at a very reasonable rate of pay, in some relatively pleasant part of Purgatory.
......The game is as follows: She will repeatedly toss a fair coin, until it lands heads up, and you will pay Her back a number of cents equal to 2 to the power of (1 + the number of tails before the first head), but only if that number of tails is less than twenty times the wealth of the Universe in cents. So, if She gets a head first time, you will only owe Her 2 cents; if She gets a tail and then a head, you will owe Her 4 cents; if She gets two tails and then a head, you will owe Her 8 cents; and so forth, unless She throws as many tails as twenty times the wealth of the Universe in cents, in which case you will owe Her nothing. Since the chance of Her getting a head on the first toss is 1/2, and the chance of Her getting Her first head on the second toss is 1/4 (there being four equally likely possibilities for two tosses, i.e. HH, HT, TT and this one, TH), and the chance of Her getting Her first head on the third toss is 1/8, and so forth, Her expectation is (2/2 + 4/4 + 8/8 + …) cents minus the wealth of the Universe; and since each of those terms is exactly one cent, and there are as many of them as twenty times the wealth of the Universe in cents, that comes to twenty times the wealth of the Universe minus the wealth of the Universe, i.e. nineteen times the wealth of the Universe. That is, She would expect to get, were She to play this game a lot (thereby reducing the effects of chance), an enormous profit.
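......For anyone inclined to check such sums, the following is a minimal sketch in Python of the truncated game; the cap and the function names are mine, with a small toy value standing in for twenty times the wealth of the Universe in cents:

```python
import random

# A sketch of the truncated game described above. CAP stands in for
# "twenty times the wealth of the Universe in cents"; the real figure
# is astronomically large, so a toy value is used here for illustration.
CAP = 1000

def exact_expectation(cap):
    """Her expected winnings in cents: a run of t tails (t = 0 ... cap-1)
    has probability 2**-(t+1) and costs 2**(t+1) cents, so every term
    contributes exactly 1 cent; runs of cap or more tails cost nothing."""
    return sum((2 ** (t + 1)) * (0.5 ** (t + 1)) for t in range(cap))

def play_once(cap):
    """Toss a fair coin until the first head; pay 2**(1 + tails) cents,
    unless the number of tails reaches the cap, in which case pay 0."""
    tails = 0
    while random.random() < 0.5:  # the coin lands tails
        tails += 1
        if tails >= cap:
            return 0
    return 2 ** (1 + tails)

print(exact_expectation(CAP))  # exactly CAP cents: one cent per term
runs = 100_000
print(sum(play_once(CAP) for _ in range(runs)) / runs)  # usually far below CAP
```

......Notably, the sample mean over many plays usually comes out far below the expectation, since that expectation is carried almost entirely by runs of tails too long ever to show up in practice; which is just the intuition pressed below.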
......Nonetheless this deal is only being offered because She suspects that you might wish to take Her up on it; so, would you? Well, what is your chance of losing much? It is clearly very small: for you to have to return as much as 20 dollars, from the vast wealth of the Universe that you would have already been given, She would have to throw at least 10 tails in a row; and if She threw fewer than 46 tails before the first head, which seems almost certain to occur, you would not even have to return a paltry trillion dollars of your vast wealth (the sketch below checks those two probabilities). So your chance of having to work for many years in the afterlife is clearly tiny; would it be rational to reject such an offer, just because of something not too bad that almost certainly won't happen anyway? Hardly; I mean, what would actually happen if the above deal were offered, and you took Her up on it? Just the actual outcome, which would surely (is there any reasonable doubt about this?) be you owning most of the Universe.
......How could turning that down be rational? (Of course, were you to take Her up on this, it being irrational not to, your future incarnations in similar Universes would surely do the same, for similar reasons, thereby ensuring that you would almost certainly spend an awful lot of your time working for nothing, because of your own well-informed and free choices :)
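......And here is a quick check of the two probabilities used above (again only a sketch; the dollar thresholds are those from the post):

```python
# Checking the two claims above: paying back at least $20 (2000 cents)
# needs 2**(1 + tails) >= 2000, i.e. at least 10 tails in a row,
# since 2**10 = 1024 < 2000 <= 2048 = 2**11.
p_twenty_dollars = 0.5 ** 10
# Paying back at least a trillion dollars (10**14 cents) needs at least
# 46 tails, since 2**46 cents is about $0.7 trillion and 2**47 cents
# is about $1.4 trillion.
p_trillion = 0.5 ** 46
print(p_twenty_dollars)  # ~0.00098: about one chance in a thousand
print(p_trillion)        # ~1.4e-14: about one chance in seventy trillion
```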

4 comments:

Matt Norwood said...

Rational self-interest is a concept entirely dependent on assumptions about the psychological notion of self and on psychological preferences. Both of these notions are entirely indeterminate in the case of an immortal specimen of Homo sapiens enslaved to some galactic overlord: we do not have enough information to imagine what such a being would come to regard as its preferences or its self, and as such, we cannot imagine what its rational self-interest would entail.

Relying on the idiotic assumptions of neoclassical economics returns exactly the tautological answer your post would seem to set up as the obvious one: do the math, and your model returns exactly what the model would predict. Its relation to actual human behavior or preferences is as remote as most predictions generated by that model.

Enigman said...

Thanks for the comment; I was hoping for something along the lines of it being the god's existence that was shown to be irrational.

My thinking was that our rationality (whether it evolved physically or not) is something that we can take with us into any situation that could (logically) crop up. If we behave rationally only when faced with convenient situations, then we're not really rational; we're just lucky.

Incidentally, we were not (in the scenario) enslaved; we were totally free to do what we thought best (according to our psychological preferences); and there were two obvious answers, which was the problem.

Martin said...

Matt, what's wrong with thinking that our rationality is the product of evolution, and that it would therefore not necessarily give us the right answers in such scenarios? You seem to be saying something like that... But then this scenario is an argument for such a view (a converse of Plantinga's Evolutionary Argument against Naturalism), since were our rationality the product of divine design, it should be expected not to yield paradoxes in such (finite) scenarios... or was that your point?

Enigman said...

It occurs to me now that it is not an argument for Naturalism, since we could be faintly irrational as a result of some Fall. The St. Petersburg paradox seems to get its force from the same pragmatic source as Borel's Law (that very unlikely events never occur).