Monday, January 10, 2011

Less Wrong?

So here is something that has surprised me in San Francisco: it is common for people (e.g., roommates) to give each other rides to and from the airport.

Now why is this surprising? Because we have safe, efficient, clean public transportation (BART) that goes straight to the airport terminal. I know of friends living close to the BART line who've given or received rides.

And my first reaction was: To ask for or accept a ride... isn't this a bit irrational? To impose on others, or to allow others to impose on you, for something that's easily done alone?


So I was reading the interesting blog Less Wrong. The masthead declares that the blog is "devoted to refining the art of human rationality". The following problem ("Newcomb's Problem") appeared in one of their posts:

A superintelligence from another galaxy, whom we shall call Omega, comes to Earth and sets about playing a strange little game. In this game, Omega selects a human being, sets down two boxes in front of them, and flies away.

Box A is transparent and contains a thousand dollars.
Box B is opaque, and contains either a million dollars, or nothing.

You can take both boxes, or take only box B.

And the twist is that Omega has put a million dollars in box B iff Omega has predicted that you will take only box B.

Omega has been correct on each of 100 observed occasions so far - everyone who took both boxes has found box B empty and received only a thousand dollars; everyone who took only box B has found B containing a million dollars. (We assume that box A vanishes in a puff of smoke if you take only box B; no one else can take box A afterward.)

Before you make your choice, Omega has flown off and moved on to its next game. Box B is already empty or already full.

Omega drops two boxes on the ground in front of you and flies off.

Do you take both boxes, or only box B?

"Obviously", you should take both boxes. Omega has already flown off.

The Less Wrong post launches into a long discussion of rational behavior, Bayesian probability, and whatnot. The blog seems to take it for granted that people who understand Bayes' theorem make better decisions than people who don't. But I suspect that most mathematicians would take both boxes, and most non-mathematicians would take only box B. And get the million dollars.
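
For what it's worth, here is a minimal Python sketch of the expected-value calculation that makes one-boxing look so attractive. The 99% accuracy figure is my own illustrative assumption, not something the problem states; it only says Omega has been right on 100 observed occasions.

# A rough expected-value sketch of the two strategies. The "evidential"
# reading treats your choice as evidence about what Omega predicted.
# The accuracy p = 0.99 is an illustrative assumption on my part; the
# problem only says Omega has been right on 100 out of 100 occasions.
def expected_payoff(one_box, p=0.99):
    if one_box:
        # Box B is full whenever Omega correctly predicted one-boxing.
        return p * 1000000
    else:
        # You keep the $1,000 in box A either way; box B is full
        # only in the rare case that Omega predicted wrongly.
        return 1000 + (1 - p) * 1000000

print(expected_payoff(one_box=True))    # roughly $990,000
print(expected_payoff(one_box=False))   # roughly $11,000

The causal counterargument, of course, is that the boxes are already on the ground: whatever is in box B, taking box A as well always adds a thousand dollars. That is exactly the tension the Less Wrong post wrestles with.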


In my opinion, the interesting thing about this example is how contrived it isn't. Suppose that your goal in life is to lead as rewarding a life as possible, and you approach this rationally. Then you start thinking about how to optimize your time, how to get the most out of every day, and so on. Surely you wouldn't give anyone a ride to the airport.

And yet, I've developed a great admiration for people who give each other rides, whose sense of efficiency is trumped by a sense of companionship and consideration for others, and who don't ever try to optimize anything. These are the people who leave Box A on the ground. After all, as they know from experience, if you do, you get a million dollars.

And indeed you do. I understand Bayes' Theorem, which may put me at a disadvantage. But I've learned that it's often right to be more wrong.

2 comments:

cl said...

I guess I don't find your solution to Newcomb's Problem so obvious. I read the alien as an absolutely omniscient God, not as a superintelligent Joe. And in my world, God is the perfect predictor, so somehow I read the problem as a moral choice, not a problem of logic. Maybe my belief in an omniscient God is what makes me irrational.

Frank said...

Well, I put "obvious" in quotes. Apparently this is the biggest question in an entire academic field.

By "obvious" I mean a tautological consequence of my implicit assumptions on what rational behavior is. Of course, those assumptions seem to be self-defeating...

That said, your point of view is interesting. I had not considered it from a moral standpoint.