Related Readings
  • Answers for Aristotle: How Science and Philosophy Can Lead Us to a More Meaningful Life, by Massimo Pigliucci
  • Nonsense on Stilts: How to Tell Science from Bunk, by Massimo Pigliucci
  • Denying Evolution: Creationism, Scientism, and the Nature of Science, by Massimo Pigliucci

RS140 - Kenny Easwaran on "Newcomb's Paradox and the tragedy of rationality"

Release date: August 9, 2015

This episode of Rationally Speaking features philosopher Kenny Easwaran, who delves into the notorious "Newcomb's Paradox" -- the puzzle about which it was once said, "To almost everyone it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem, with large numbers thinking that the opposing half is just being silly." Kenny and Julia explore how Newcomb's Paradox is related to other puzzles in decision theory, like the Prisoner's Dilemma; what its implications are for free will; and what Kenny calls the "deep tragedy" at the heart of rationality.

Kenny Easwaran is an Associate Professor in the Philosophy Department at Texas A&M University. He works on several topics relating to epistemology and decision theory, and the role of probability in helping to understand these and related concepts. 

Kenny's pick: "Thinking, Fast and Slow"



Reader Comments (17)

Being what they call a Post-Modernist, a follower of Wittgenstein, I have to respond to this in my own sense of rational thinking. My problem with such puzzles is that they always fail to fully define the situation in realistic terms. There is always an assumption that there is no more to the puzzle than what has been stated, which simply doesn't capture the full implications of one's decision-making process. There are and will always be consequences to one's action or inaction. What are the full implications, for example, of not following through on a promise, either to pay the money for the ride, or to drink the poison one intends to drink?

The problem with the box of money may seem a little subtler, but that's only because the problem itself is less likely and even less defined in terms of situational ethics. If I were to walk into such a situation and the question posed to me in those terms, the first thing I would have to ask is: "What are you not telling me?" It's most likely that I would opt to not take either on the grounds that I don't know where this money is coming from or who could be better served by it than me.

What are the consequences of being one type of person or another in rationally dealing with life? Knowing that there are other people in the universe who could possibly suffer the consequences of my failure to do what I promise, whether I will ever know about it or not, causes me to not always do what is in my own best interest. That is the nature of morality and ethics. My very rational answer to any of these problems is to always do the ethical thing, because you simply don't know.
August 10, 2015 | Unregistered CommenterScott Bloom
August 10, 2015 | Unregistered CommenterSteve Walker
If the predictor is perfect, then to me it's obvious that you should take the one box, but not everyone can even agree on that. Some conclude that a perfect predictor is impossible. But suppose that instead of you making the decision, you have to write a computer program that does it, and the predictor can read the source code and see what it'll do. Then it's obvious that it should take one box. Or suppose the predictor simply asks you, "What will you pick? Be honest or I'll kill you." Again, you'd pick the one box.
Both scenarios force you to make a decision BEFORE the predictor does anything. But that's what determinism says, that everything is already decided from the outset. Yet the illusion of free will makes you think that you're deciding AFTER the money has been placed.

On the other hand, if the predictor is just your friend who knows you, then he's guessing what you'll do based on your past behavior. If he thinks you'll pick two boxes, then you're out of luck. If you decide to surprise him by picking one box, you'll just be left with no money, so you may as well take both boxes. Your best strategy would be to bluff that you'll pick one box, but then take both boxes.
August 11, 2015 | Unregistered CommenterMax
For the sake of argument, suppose the hypothetical "smoking gene" raises the risk of cancer and guarantees that you'll start smoking at some point. In that case, whether or not you smoke won't actually change your genes, but if you never start smoking then you'll have peace of mind knowing that you don't have that nasty gene. And your life insurance company will also know that you have a lower risk of cancer, and will charge you less than smokers.
August 11, 2015 | Unregistered CommenterMax
What would Bayes do?

Assuming the oracle is limited to intentions only, and that its goal is to maximize its own accuracy, how does one maximize the take?

Strategy 1. Enter the room and intend to flip a coin. If heads take both boxes and tails take just the one. The best the oracle can do is flip a coin to decide whether to put the million in the first box. The expected payout is $500,500. The $500 is from the half the time the player picks both boxes and $500,000 is from the half the time the oracle decides to put in the million.

Strategy 2. Enter the room and intend to roll a die. If 6 comes up take both boxes otherwise take just the one. The expected payout is $1,000,167. The $167 is from the 1/6th of the time six shows up. The oracle will always put in the million because this maximizes its accuracy.

Strategy 3. Enter the room with a slightly biased coin that comes up heads 499 times out of 1000. If heads take both boxes and tails take just the one. The expected payout is $1,000,499. Any small bias toward taking only one box will cause the oracle to always put in the million.
August 14, 2015 | Unregistered CommenterAlan
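Alan's three strategies can be checked with a short expected-value computation. This is a sketch, assuming the standard Newcomb amounts of $1,000 in the clear box and $1,000,000 in the opaque box:

```python
# Expected payouts for the three mixed strategies above, against an
# accuracy-maximizing oracle. Amounts are the standard Newcomb figures.

CLEAR, MILLION = 1_000, 1_000_000

def expected_payout(p_two_box: float, p_million: float) -> float:
    """p_two_box: probability the player takes both boxes.
    p_million: probability the oracle puts $1M in the opaque box."""
    one_box = (1 - p_two_box) * p_million * MILLION
    two_box = p_two_box * (CLEAR + p_million * MILLION)
    return one_box + two_box

# Strategy 1: fair coin; the oracle can do no better than flip its own coin.
print(expected_payout(0.5, 0.5))    # 500500.0
# Strategy 2: take both only on a 6; the oracle always predicts one box.
print(expected_payout(1/6, 1.0))    # ~1000166.67
# Strategy 3: coin biased 499/1000 toward two boxes; oracle still predicts one box.
print(expected_payout(0.499, 1.0))  # 1000499.0
```

The numbers match the comment: any bias, however small, toward one-boxing makes predicting "one box" the oracle's accuracy-maximizing move, so the million is always in the box.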
I think another issue that paradoxes like these leave out is what the utility of the consequences will be going forward in your life based on what your life is already like. If you already have a million dollars, you're going to one-box because the extra $1000 is essentially meaningless but the mil doubles your net worth. But if you're penniless, you might two-box to be assured of having something. If you're the hitchhiker, and you have only $1000 to pass on to your daughter, and you've been shot and are bleeding out, you might refuse the driver's request to begin with because if he gets you back to town, you pay him, and die a day later, your daughter gets nothing. If you're a young billionaire, then you just pay the guy and think nothing of it, since $1000 doesn't really mean anything to you.

This actually comes up in gaming a lot. In tournament poker, a lot of decisions will change based on how many chips you already have. If you call and are wrong, you'll be crippled or out, but if you fold you'll maintain a workable stack. So you might fold in a situation that might have a slightly positive EV in chips just to preserve the utility of the stack. The reverse of that would be having a huge stack where you don't mind calling anything, because even if you lose you'll still have a lead.

Regardless, the person offering the bet in Newcomb's Paradox is an idiot, as she's always out at least a thousand bucks. I'd play poker with her any day.
August 15, 2015 | Unregistered CommenterSean
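Sean's stack-size point can be sketched with a toy diminishing-marginal-utility model. Everything here is an illustrative assumption (log utility, an imperfect predictor assumed to be 55% accurate), not anything stated in the episode:

```python
import math

# Toy model: how current wealth changes the one-box/two-box calculus
# under log utility (a standard stand-in for diminishing marginal
# utility) with an assumed 55%-accurate predictor.

def expected_utility(wealth: float, take_both: bool, accuracy: float = 0.55) -> float:
    # The $1M is in the opaque box iff the predictor guessed your choice
    # as "one box": probability `accuracy` if you one-box, 1 - accuracy
    # if you two-box.
    p_million = (1 - accuracy) if take_both else accuracy
    base = wealth + (1_000 if take_both else 0)
    return p_million * math.log(base + 1_000_000) + (1 - p_million) * math.log(base)

for wealth in (1, 1_000_000_000):  # nearly broke vs. billionaire
    one = expected_utility(wealth, take_both=False)
    two = expected_utility(wealth, take_both=True)
    print(wealth, "one-box" if one > two else "two-box")
```

With these toy numbers the near-broke player two-boxes to lock in something, while for the billionaire utility is nearly linear in dollars and one-boxing wins, which is the shape of the effect Sean describes.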
The Causal Solution: Take both boxes.

The Evidential Solution: Take the opaque box.

The Zen Solution: Take neither box.

The Geek Solution: You're already wearing your own brain scanner. Check the logs and do the opposite of T($enter_tent).

The Kobayashi Maru Solution: Sneak in the day before and reprogram the brain scanner.

The Infomercial Solution: Sell 100k copies of the book "Beat the Brain Scanner and get $1,001,000" for $10.01 each.

The Philosopher Solution: Prove that such a device would solve the Halting Problem, or violate Gödel's Incompleteness Theorem. Or argue about Quantum Physics, Heisenberg's Uncertainty Principle, or Special Relativity.

The Wall Street Solution: You have insider information from the company that manufactured the brain scanner, an HFT algorithm that gets the result 1ms faster, put/call options in case you make the wrong choice, and a $1m bailout from Congress after you invest the money in Greek bonds.

The Brute Force Solution: There's $1m in the tent somewhere. Beat the experimenter up until <non_gender_specific_pronoun> gives it to you. (Note - this is a Computer Science-related pun. A joke. I do not condone violence or the threat of violence. Seriously, the internet really needs to lighten up.)
August 16, 2015 | Unregistered CommenterJeff Scott
The situation in Newcomb's paradox is a good test for magical thinking.

In as much as the situation is defined such that the person's decision to take one box or two has absolutely no effect on what money is in the boxes, anyone who acts/believes differently, as if their decision controls whether there is more or less money in the box, is engaging in magical thinking. They are deluding themselves into feeling like they have some power which is clearly not there. Why would a person fantasize that they have powers that they clearly don't? This happens all the time in religion. People imagine that they can ask a deity for favors and have them granted. (They are raised to believe this.) It is not surprising that so many people fall into this fantasy power wish. The resulting "controversy" isn't because of any uncertainty about the rational choice in the situation. The controversy is just that too many people don't distinguish between magical thinking, wishful thinking, and realistic thinking.

I would be interested in seeing psychological studies to see if those who are willing to think that their choice in the situation can possibly affect the presence of money in the box are also more likely to engage in other forms of magical or wishful thinking. What kinds of people would engage in magical thinking? Education level, age, sex, nationality, social and economic levels, etc. I myself am strongly physics oriented and would expect that those in the sciences would be less likely to fall into magical thinking. The situation in the paradox is a good type of situation to use as a test question to look for magical thinking patterns. I imagine that something like this has been done and would like to find out more about it.

If anyone knows of such studies/tests I'd love to hear about them.
August 18, 2015 | Unregistered CommenterVector
I'm strongly physics oriented, which is why I see determinism as plausible and contra-causal free will as magical. I would be interested in seeing psychological studies to see if those who are willing to think that their choice in the situation is independent of the presence of money in the box are also more likely to engage in other forms of magical or wishful thinking.
August 18, 2015 | Unregistered CommenterMax
This decision or game theory requires that one enter values for variables into an equation, yet these values don't exist (or at least no one could determine them), so how can it be considered a valuable exercise? And the dilemmas discussed are abstracted scenarios, the mastery of which doesn't seem to offer any lessons on practical morality. They're interesting as mental games, but are there any realistic applications to such exercises?
August 21, 2015 | Unregistered CommenterCurt Nelson
I've listened to most of the episodes and it's one of my favorite podcasts. A thank you post is overdue. The transcript is a great addition. If there's a particular point I want to listen to again, it's hard to find that particular bit with audio, much easier with text. I also found that when browsing through the transcript, I noticed a few other things that were interesting, which went above my head when listening.

On Newcomb's paradox, I agree with the view that backwards causation isn't happening. It's based on what kind of person you are at the moment the paradox is put to you. That may be a problem for free will, but I'm skeptical about free will, so it's not so much of a problem for me. I also like the notion that it's a contrived test conceived by a devious intelligence and as such doesn't necessarily speak against rationality in the natural wild. Lots to muse on, so an excellent episode.

Thanks to all the people involved for all their efforts over the years. Great comments too.
August 23, 2015 | Unregistered CommenterAppreciative
Some of the problem setup is just a smoke screen.

In particular, the notion of "the predictor" artificially introduces concerns about free will, retro-causality, determinism and such. People start calculating expectation values, conditional probabilities and so on. The situation of the subject is conflated with these considerations, which is confusing. Assuming a perfect predictor for specificity, the situation of the subject is exactly the same as given below.

The situation that the person finds themselves in is this. They are told that there is money already in both boxes or not, and that their decision to take one or both boxes will not affect the money in either box. They are also told that if they take both boxes there will only be money in the clear box, and that if they take only the opaque box there will be a million dollars in it.

Clearly saying that their decision won't affect the monies already in the boxes is flat out inconsistent with telling them that the amount of money in the boxes depends on their choice. The introduction of the predictor is just to distract the listener from the inconsistency involved in the situation being set up. It is said that the perfect predictor scenario is the easiest to decide, that the only reasonable choice is to take only the opaque box, since then surely one gets the million dollars. But that is exactly the situation given above. If one accepts the description as given one has accepted a contradiction so either side can argue forever, anything follows from a contradiction.
August 23, 2015 | Unregistered CommenterVector
"They are told that there is money already in both boxes or not, and that their decision to take one or both boxes will not affect the money in either box."

That's not how I read it. The way I learned it, the predictor will put the money in the box based on what will happen in the future, and afterwards you can take one or both boxes. To me, it was always about determinism and free will.
The way it was described in the episode, a brain scanner reads your intention, so it was more about whether you can intend to do one thing, but then do another, sort of like trying to fool a very good lie detector.
But the analogy to the studies that find correlation without causation would be if the predictor noticed that, say, 60% of women take both boxes while 60% of men take one box, and predicts solely based on your gender. In that case, you'd be better off taking two boxes.
August 23, 2015 | Unregistered CommenterMax
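Max's trait-based predictor can be made concrete with a two-line expected-value check. This is a sketch using the hypothetical 60/40 split from the comment: when the prediction depends only on your group and not on your actual choice, two-boxing always nets exactly $1,000 more.

```python
# If the predictor uses only a fixed trait (e.g. gender), the expected
# contents of the opaque box are the same whichever choice you make,
# so taking both boxes dominates. The 60%/40% one-box rates per group
# are the hypothetical figures from the comment above.

MILLION, CLEAR = 1_000_000, 1_000

def ev(p_predicted_one_box: float, take_both: bool) -> float:
    # Expected opaque-box contents are fixed by the prediction,
    # which your actual choice cannot influence.
    opaque = p_predicted_one_box * MILLION
    return opaque + (CLEAR if take_both else 0)

for p in (0.4, 0.6):  # predicted one-box rate for each group
    print(p, ev(p, take_both=True) - ev(p, take_both=False))  # always 1000.0
```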
I hated this podcast. It summed up all the pretentious bullshit that swirls around so-called "logical paradoxes". In the case of Newcomb's Problem, there is no problem at all. You've already accepted that the world you're in is based on magic, and therefore every claim made, however ridiculous, is true. Therefore, if "everyone" who only leaves with the opaque box finds that it contains a million dollars, only an idiot would take both boxes.

There's no logic or reason involved, because you've already been presented with two absolute facts: if you take just the opaque box, there's a million dollars in it. If you take both, it's empty. The rest of the story is irrelevant. It's horseshit. There is only one fact that has any meaning at all: if you take just the opaque box, there's a million dollars in it. The only reason you wouldn't do that is if you thought the rules were lies, but that's not presented as a possibility.
July 20, 2017 | Unregistered CommenterKyle
Newcomb's paradox depicts a world with causality loops. I choose one, therefore the predictor puts in $1M, therefore I choose one, etc. But equally true: I choose two, therefore the predictor puts in $0, therefore I choose two. Now it is not the loop that is the problem. The fact is both loops potentially exist. To force an outcome in this world, the only rational act is to choose one box.
Going here with real-world causal-theory thinking (that is, going by the current state of affairs plus my action) is irrelevant, because it is a world of causality loops, in which this logic does not apply. In this world one cannot ignore what preceded the current state of affairs...

There is no tension between the theories.
Just one rational act, given a world with a different kind of causality.
July 23, 2017 | Unregistered CommenterBoaz
Great episode. I feel like my mind just got a really intense workout.

For the Newcomb Problem, consider two events, t1 brain scan, and t2 box choice. To make the already extremely difficult problem simpler, assume a 100% accuracy of the brain scan. At t2, you must make a choice knowing that t1 previously occurred with 100% accuracy. The event at t1 could make you involuntarily commit to an action at t2. If at t2 you feel you simply cannot forgo the $1K, even though it's 1% of the $1M, then you should take the opaque box and the $1K. If, however, you feel you can risk the $1K in order to have a chance at the $1M, then you should take only the opaque box. The brain scanner need not have any uncertainty since your own mind already provides uncertainty. You have to decide whether the mad scientist can make you involuntarily commit to abstaining from taking the $1K. Thus you have a rational character and engage in a rational action whatever choice you make, so long as you use rationality to predict your previous mental state. A rational person cannot, however, decide that they would have involuntarily committed to the opaque box earlier and then somehow uncommitted themselves and then take both the opaque box and the $1K. So you might have the opportunity to gain $1K or $1M, but you cannot expect to gain both the $1M and the $1K ($1,001,000). Another interesting question arises if we consider a 50% accuracy of the brain scan.
December 16, 2017 | Unregistered CommenterJameson
Actually $1K is .1% of $1M.
December 16, 2017 | Unregistered CommenterJameson
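Jameson's closing question about a less-than-perfect scan can be answered with a quick expected-value sketch. It assumes a symmetric scanner that is correct with probability p whichever choice you make, and the standard Newcomb amounts; both are assumptions for illustration.

```python
# Expected dollar value of each choice against a brain scan that is
# correct with probability p, regardless of which choice you make.

MILLION, CLEAR = 1_000_000, 1_000

def ev_one_box(p: float) -> float:
    # The $1M is present iff the scan correctly predicted one-boxing.
    return p * MILLION

def ev_two_box(p: float) -> float:
    # The $1M is present only if the scan wrongly predicted one-boxing.
    return CLEAR + (1 - p) * MILLION

print(ev_one_box(1.0), ev_two_box(1.0))  # 1000000.0 1000.0
print(ev_one_box(0.5), ev_two_box(0.5))  # 500000.0 501000.0
```

At 100% accuracy one-boxing wins by a wide margin; at 50% the scan carries no information, so two-boxing is better by exactly the $1K. The break-even accuracy is p = 1,001,000 / 2,000,000 = 0.5005.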
