author     bnewbold <bnewbold@ziggy.huji.ac.il>  2009-06-28 16:43:03 +0300
committer  bnewbold <bnewbold@ziggy.huji.ac.il>  2009-06-28 16:43:03 +0300
commit     da7254bb2aee2ef5e15b4b272d48688cd39105cb (patch)
tree       7736df564b0790b78af6fe1c66a953553b498397
parent     61a434c7d72a75ab02f0fe1e979c8750609b7cff (diff)
a pair of docks
-rw-r--r--  tmp/Newcomb paradox  65
1 files changed, 65 insertions, 0 deletions
diff --git a/tmp/Newcomb paradox b/tmp/Newcomb paradox
new file mode 100644
index 0000000..c7da22b
--- /dev/null
+++ b/tmp/Newcomb paradox
@@ -0,0 +1,65 @@
+==================
+Newcomb's Dilemma
+==================
+
+Newcomb's paradox was devised by William Newcomb, a physicist at the
+Lawrence Livermore Laboratory; it was first explored in print by
+Robert Nozick in his 1969 paper "Newcomb's Problem and Two Principles
+of Choice".
+
+The Situation
+-------------
+As narrated by an all-knowing "predictor"::
+
+    I am going to give you a choice. It is important to know that I pretty
+    much know what you are going to do. I have been watching your whole
+    life, and I am also an immortal being; I've been doing this a long time
+    and I always guess correctly. It is also important to know that I am
+    unbiased and don't care which decision you make; I have nothing to gain
+    either way.
+
+    Here are two boxes: a large one and a small one. The small box has a
+    10 shekel coin in it (which I show to everybody). The large one may or
+    may not have a thousand shekels in it; you don't know. Your choice is
+    either to take only the large box or to take both the large and the
+    small box. The twist is that I already knew which decision you would
+    make, and I decided whether or not to put the thousand shekels in the
+    large box based on that knowledge. If I knew you would "two-box", then
+    I left the large box empty. If I knew you would "one-box", I filled it.
+
+Dominance Mindset
+-----------------
+Regardless of what the predictor decided earlier, and whether or not there
+is anything in the large box, the chooser is better off taking both boxes:
+either they get just 10 shekels (better than nothing) or 1010 shekels
+(better than 1000). So: two-box.
+
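+A minimal sketch of that dominance check in Python (the payoff numbers are
+the ones above; the function and variable names are just my own labels)::
+
+    # Payoffs in shekels. The small box always holds 10; the large box
+    # holds 1000 or nothing, depending on the predictor's earlier guess.
+    SMALL, LARGE = 10, 1000
+
+    def payoff(large_filled, take_both):
+        # Total winnings for a given state of the large box and a choice.
+        return (LARGE if large_filled else 0) + (SMALL if take_both else 0)
+
+    for large_filled in (True, False):
+        one_box = payoff(large_filled, take_both=False)
+        two_box = payoff(large_filled, take_both=True)
+        # Two-boxing wins in both states: 1010 vs 1000, and 10 vs 0.
+        print(large_filled, one_box, two_box)
+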
+Trusting Mindset
+----------------
+The predictor is pretty much always right, so we can just ignore the
+possibility that they are wrong. In that case, choosing to one-box
+implies that the predictor knew you would, and you get 1000 shekels;
+choosing to two-box implies that the predictor knew you would, and you
+get only 10 shekels.
+
+The predictor doesn't even have to be perfectly accurate; say they are
+right 90% of the time. If you one-box, your expected value is
+0.9 * 1000 = 900 shekels; if you two-box, it is
+0.9 * 10 + 0.1 * 1010 = 110 shekels.
+
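+The same numbers, as a small sketch with the predictor's accuracy left as a
+parameter (again, the names are just my own)::
+
+    def expected_value(accuracy, take_both):
+        # The large box is filled exactly when the predictor guessed that
+        # you would one-box: probability `accuracy` if you one-box,
+        # probability (1 - accuracy) if you two-box.
+        p_filled = (1 - accuracy) if take_both else accuracy
+        return p_filled * 1000 + (10 if take_both else 0)
+
+    print(round(expected_value(0.9, take_both=False), 2))  # 900.0 (one-box)
+    print(round(expected_value(0.9, take_both=True), 2))   # 110.0 (two-box)
+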
+Discussion
+----------
+It's disputed whether this is a genuine paradox, and there are many deeper
+arguments that I don't have time to go into here. Ultimately, I am a
+one-boxer, though this is something of a minority position.
+
+Afterword
+---------
+The person who taught me this paradox, Professor Augustin Rayo, a
+two-boxer, then had this to add. He was talking with a one-boxing friend
+of his and accused her of letting irrationality undermine her logic: she
+is so optimistic that if a statement S is unprovable, but it would be
+nicer if S were true than false, then she pretends that S is proven. So
+basically, even though there is no justification, she will accept a
+statement "just because it would be nice", and that isn't how logic
+works. To which she replied, "but wouldn't it be nice if it were?".
+