Thursday 9 January 2014

Torture vs Dust Specks

In a post called Torture vs Dust Specks, Eliezer Yudkowsky says the worst thing that can realistically happen during one's lifetime is ~50 years of torture, and the least bad of all bad things is a dust speck momentarily getting in your eye.

Assuming that the negative utility of both 50 years of torture and of a dust speck in one's eye is theoretically quantifiable, there must be some finite number of dust specks whose combined suffering equals that of 50 years of torture. Let's say that number is 500 trillion.

Would you prefer for one person to suffer 50 years of torture or for 500 trillion + 1 people to get dust specks in their eye? (If you take issue with the numbers selected, imagine the number of people getting dust specks in their eye is a googolplex.)

I've said in Possible Positions on Insect Suffering that I think many little pains can stack up to equal a single instance of great pain. Given that, torture seems like the only rationally justifiable choice to me.
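The stacking-up claim can be made concrete with a toy calculation. The specific numbers here (one "pain unit" per speck, 500 trillion units for the torture) are purely illustrative assumptions for the sake of the example, not figures from Yudkowsky's post:

```python
# Toy model of linear utilitarian aggregation. All numbers are
# illustrative assumptions, not real measurements of suffering.
TORTURE_DISUTILITY = 500_000_000_000_000  # assume 50 years of torture = 500 trillion pain units
SPECK_DISUTILITY = 1                       # assume one dust speck = 1 pain unit

def total_speck_disutility(num_people):
    # Under simple aggregation, disutility adds linearly across people.
    return num_people * SPECK_DISUTILITY

# With 500 trillion + 1 people, the specks just edge out the torture,
# so the aggregating utilitarian picks torture as the lesser total harm.
assert total_speck_disutility(500_000_000_000_001) > TORTURE_DISUTILITY
```

If you reject linear aggregation (say, if you think speck-sized pains never sum across people), the comparison never favors torture no matter how large the number gets; that is the crux of the disagreement.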

I've thought of a way to tweak the intuition pump to serve a separate purpose. Suppose the 500 trillion + 1 people were informed of the situation and each of them selflessly agreed to accept a dust speck in the eye so that the one unfortunate person could avoid being tortured. Would you still prefer the torture over the dust specks? In this case, I change my answer and vote for dust specks. This is because I'm an (idealized) preference utilitarian: I want people to get what they want, even if that leads to less happiness.

For those who choose dust specks right off the bat, what if it were the other way around? What if one brave martyr volunteered to be tortured for 50 years in order to spare 500 trillion + 1 people from getting a dust speck in their eye? Would you still prefer that the 500 trillion + 1 people get the dust specks? If not, you might be a preference utilitarian.

1 comment:

  1. In some ways this sidesteps the original question, because the payoffs are now different from the original ones (at least if you're a preference utilitarian of a certain flavor, as you say). However, you make a good point that if this situation happened in reality, preferences like these could indeed help resolve it.