Newcomb's problem


Newcomb's problem, also called Newcomb's paradox, is a problem in decision theory devised by William Newcomb († 1999; a great-great-nephew of Simon Newcomb) at the beginning of the 1960s and first published by Robert Nozick in 1969 in a festschrift for the philosopher Carl G. Hempel.

The thought experiment

There are two boxes. The first, transparent box always contains $1,000; the second, opaque box contains either a million dollars or nothing. One of two decisions may now be made:

  1. Only the second box is chosen.
  2. Both boxes are chosen.

An omniscient being has predicted which decision will be made, and its predictions are almost always correct. If the being foresees that only the second box will be chosen, it puts the million dollars into that box; if it foresees that both boxes will be taken, it leaves the second box empty.

Since the contents of the second box were fixed before the choice is made, one might as well take both boxes: the amount already placed there can no longer change, so taking both can only add $1,000. But that is exactly what the being could have foreseen, in which case it left the second box empty. By that reasoning it would be better to take only the second box, because the being would have foreseen this too and the box would then contain the million.
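
The tension can be made explicit by tabulating the four possible payoffs implied by the amounts above (a simple worked summary, not part of Nozick's original presentation):

$$
\begin{array}{l|cc}
 & \text{prediction: only box 2} & \text{prediction: both boxes} \\ \hline
\text{take only box 2} & \$1{,}000{,}000 & \$0 \\
\text{take both boxes} & \$1{,}001{,}000 & \$1{,}000
\end{array}
$$

In each column taking both boxes pays $1,000 more, which is the dominance argument for taking both; but if the prediction reliably tracks the actual choice, those who take only the second box almost always end up in the left column and those who take both in the right, which is the argument for taking only one.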

Modifications and related problems

Parfit's problem

Newcomb's problem is similar to Parfit's problem.

Prisoner's Dilemma

If one dispenses with a higher being (analogous to Laplace's demon), the problem corresponds to the prisoner's dilemma in the case where one of the two prisoners has already chosen his strategy, while the other, who must now make his own choice, does not yet know that decision.

The optimal strategy in this case can be calculated from the probability that the predictor makes a correct prediction and from the sums of money involved.
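
A minimal sketch of that calculation, assuming the amounts from the original problem ($1,000 and $1,000,000) and a predictor who is correct with probability p:

```python
# Expected payoff of each strategy when the predictor is correct with probability p,
# using the stakes from the article ($1,000 visible, $1,000,000 possibly hidden).

def expected_values(p, small=1_000, big=1_000_000):
    """Return (E[take only box 2], E[take both boxes]) for prediction accuracy p."""
    one_box = p * big                               # box 2 is full only if one-boxing was predicted
    two_box = p * small + (1 - p) * (small + big)   # box 2 is full only if the prediction was wrong
    return one_box, two_box

for p in (0.5, 0.5005, 0.9, 1.0):
    one, two = expected_values(p)
    print(f"p = {p}: only box 2 -> {one:,.0f}, both boxes -> {two:,.0f}")
```

With these amounts the two expectations coincide at p = 1,001,000 / 2,000,000 = 0.5005; above this accuracy, taking only the second box has the larger expected payoff, while below it, taking both boxes does.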

Glass boxes

The problem can also be posed with glass boxes: if the chooser can see what is in the boxes, he can decide accordingly. If he wants the maximum amount of money with minimal effort, he will take both boxes when both contain money, and only one box when only one does. In either case, however, the prediction turns out to be false, provided a normal course of time is assumed.

Higher being

If one grants the thought experiment a higher being that is able, for example, to travel through time, then this being can make its decision after the choice has been made and fill the second box accordingly. The final state is only fixed by the choice itself; before it, both options remain open. One could object, as Matt Beller does in Newcomb's Paradox is Not a Paradox, that time travel contradicts the physics of our world and is therefore impossible. However, we know nothing about the abilities of the “higher being” except that, unlike anything in our world, it can make absolute predictions, so this objection does not apply to the problem at hand: within a thought experiment, time travel is admissible.

An accomplice

In this variation, it is unknown what is in the two boxes because both are opaque. Furthermore, the higher being can look into the future but cannot travel through time. The chooser's friend was present when the higher being put the money into the boxes and therefore knows whether or not the second box contains a million dollars. The friend is absolutely loyal and will always give the tip that brings the chooser the greatest profit. The rules forbid the friend from saying whether the million is in the second box; he may only say whether the chooser should take both boxes or just one.

There are two possible cases for the friend:

  • He sees that the higher being put only $1,000 in the first box and left the second box empty. In this case he will advise the chooser to take both boxes. (If the chooser takes both boxes, he still gets $1,000; if he takes only the second box, he gets nothing.)
  • He sees that the higher being put $1,000 in the first box and $1 million in the second box. In this case too, he will advise the chooser to take both boxes. (Because $1,001,000 is more than $1,000,000.)

Regardless of what the higher being has seen in the future and how it has distributed the money accordingly, the friend will give the chooser the same recommendation in both cases: "Take both boxes!"

For the chooser himself, however, the same holds as in the original problem: it is better for him to take only the second box, since that is what brings him the $1 million.

The paradox is that although the chooser can trust his friend completely, and although the friend did not lie in this case either, the chooser does better by not listening to him. (The friend spoke the truth: he had, after all, seen that the money was in the second box, so his recommendation to take both boxes was, from his point of view, entirely legitimate and correct advice. And yet the correct decision for the chooser was not to follow this, in itself correct, advice.)
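
A small sketch of the friend's case analysis, using the amounts from above; what the enumeration cannot show is that which case the friend observes already depends, via the prediction, on the kind of choice the chooser is going to make:

```python
# The friend sees the actual contents of the opaque box and advises by simple dominance.
for second_box in (0, 1_000_000):        # the two states the friend might observe
    only_box_2 = second_box
    both_boxes = second_box + 1_000      # the transparent box always adds $1,000
    # In either state, taking both boxes pays $1,000 more, so the advice never changes.
    print(f"box 2 holds {second_box:,}: only box 2 -> {only_box_2:,}, "
          f"both boxes -> {both_boxes:,}, advice: take both boxes")
```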

Opponent variant

This time there are two suitcases. The second suitcase contains either no money or a million euros; the first suitcase always contains exactly 1,000 euros more than the second. In effect, the first suitcase holds the amount you would get in the original problem by choosing both boxes, and the second suitcase holds the amount you would get by choosing only the second box. As in the original problem, the omniscient being looks into the future and, depending on its prediction, fills the suitcases so that you will receive either 1,000 euros or one million euros. As in the original problem, the chooser wants as much money as possible.

To make matters worse, the rules are changed in one small detail: whichever suitcase you do not choose goes to your money-grubbing archenemy. You therefore have the choice of contenting yourself with a thousand euros so that your archenemy goes away empty-handed, or of handing him 1,001,000 euros so that you yourself receive one million. You think long and hard about which option you prefer, and in the end your decision is made.

The paradox is now this: however you have decided, after the decision you know how much money is in both suitcases. And although you know how much money is in each, you hand your archenemy the suitcase containing more money and content yourself with the one containing less.
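
Assuming the prediction is correct, the two possible decisions play out as follows (a small sketch with the amounts used above):

```python
# Outcomes of the opponent variant when the being's prediction is correct.
# Suitcase 1 always holds exactly 1,000 euros more than suitcase 2.
def outcome(take_first_suitcase: bool):
    suitcase_2 = 0 if take_first_suitcase else 1_000_000   # the being foresaw this very choice
    suitcase_1 = suitcase_2 + 1_000
    you, enemy = (suitcase_1, suitcase_2) if take_first_suitcase else (suitcase_2, suitcase_1)
    return you, enemy

print(outcome(True))    # (1000, 0): you keep 1,000 euros, your archenemy gets nothing
print(outcome(False))   # (1000000, 1001000): you get the million, your archenemy gets even more
```

The decision that is better for you, one million instead of one thousand euros, is exactly the one in which the archenemy ends up with the fuller suitcase.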

Braess paradox

There is a variant of Newcomb's paradox in which the being is not omniscient but can only predict the future with a certain probability. Andrew D. Irvine shows that this variant of Newcomb's problem is equivalent to a variant of the prisoner's dilemma and can be resolved with the help of a variant of the Braess paradox.

In this variant of the prisoner's dilemma, the second prisoner can predict the behavior of the first with a certain probability and adapts his own behavior accordingly. (If he expects to be betrayed, he betrays the first prisoner in turn; if he expects the first prisoner to cooperate, he cooperates as well.)

The equivalence between this variant of Newcomb's problem and the variant of the prisoner's dilemma is straightforward: "take both boxes" in Newcomb's problem is identified with "betray the other prisoner", and "take only one box" is identified with "cooperate with the other prisoner". In both versions there is a certain probability that the other party correctly anticipates the act, and in both the actual decision has no influence on the other party's action. (Neither the predicting being nor the other prisoner observes the actual decision; both must rely on their knowledge of human nature and on the "player's" behavior before the decision.)

Irvine then shows that this variant of the prisoner's dilemma (and hence also the variant of Newcomb's problem) can be resolved with the help of a variant of the Braess paradox. (In the Braess paradox, the travel time for all participants increases after a new road is built.) Irvine identifies "cooperating" in the prisoner's dilemma with "taking the old route" in the Braess paradox, and "the other prisoner cooperates" with "the other road users take the old route". Accordingly, not cooperating in the prisoner's dilemma corresponds to taking the new road in the Braess paradox.

Analyses and proposed solutions

The situation is paradoxical because it presupposes free will on the one hand and negates it on the other. If it is already determined how one will choose, there is no free will. If one chooses only after the omniscient being has filled the box or left it empty, then either the choice must be influenced by that earlier act, in which case there is again no free will, or the past would have to change depending on the choice, so that the filling of the box would be brought about only by the choice itself. The problem is often harder to assess than other dilemma situations because, as many commentators understand it, "omniscience" implies that the predictions are certain; attempted solutions based on probabilistic reasoning would then no longer apply.

Usually it is assumed that the goal of the decision is to win as much money as possible. Matt Beller pointed out in his article Newcomb's Paradox is Not a Paradox that there could be other criteria. For example, one can simply take both boxes: one then either has only the smaller amount, or one has the large amount and has at the same time proven that the being (the experimenter) is not omniscient.

If the stated conditions hold, the optimal strategy is to take only the second box and forgo the first. One then knows that the million is in the second box; a higher amount cannot be obtained under the given conditions.

Literature

  • Maya Bar-Hillel; Avishai Margalit: Newcomb's paradox revisited. In: British Journal for the Philosophy of Science 23 (1972), pp. 295-304.
  • Richmond Campbell; Lanning Sowden (ed.): Paradoxes of Rationality and Cooperation: Prisoners' Dilemma and Newcomb's Problem. Vancouver: University of British Columbia Press, 1985.
  • Martin Gardner: Free Will Revisited, With a Mind-Bending Prediction Paradox by William Newcomb. In: Scientific American 229 (1973).
  • Martin Gardner: Reflections on Newcomb's problem: a prediction and free-will dilemma. In: Scientific American 230 (1974), No. 3, pp. 102-106.
  • Robert Nozick: Newcomb's Problem and Two Principles of Choice. In: Nicholas Rescher (ed.): Essays in Honor of Carl G. Hempel. A Tribute on the Occasion of his Sixty-Fifth Birthday. Dordrecht: Reidel 1969. (Synthese Library, Vol. 24.), pp. 114-146.
  • William Poundstone: In the Labyrinth of Thought. Rowohlt, 1995. ISBN 3-499-19745-6 .
  • Richard Mark Sainsbury: Newcomb's Paradox. In: Paradoxes. Reclam, 2001. ISBN 3-150-18135-6.

References

  1. Wolfgang Lenzen: The Newcomb Paradox and its solution. University of Osnabrück, accessed September 9, 2018.
  2. A. D. Irvine: How Braess' Paradox Solves Newcomb's Problem. In: International Studies in the Philosophy of Science, Vol. 7 (1993), No. 2, pp. 145-164.
