In The Problem of Pain, C. S. Lewis points out that it's easy for the brain to get overwhelmed when thinking about lots of people suffering a lot, as we imagine the pain multiplying out in a linear way. But pain doesn't stack up like this. Really, when contemplating the question of "how much can human beings suffer", the only thing you need to take into consideration is the maximum amount that *one* person is capable of suffering. Of course, this is still a huge amount - but the point is that consciousness is what it is. It's not lots of little things in a pile.
Only marginally relevant to what you're discussing here, but it has always stuck with me.
Thanks for this!!!
Reading about EA is strange for me, since I also find the logic erroneous, yet I appreciate the good being done, which is weirdly similar to how I feel about proselytizing religious charities.
Is there some probability at which you would be willing to accept a risk of torture to avoid a googolplex dust specks? Would you accept a 0.0000001 probability of 50 years of torture to avoid a googolplex dust specks? If so, it sure seems like you have to admit that torture and dust specks can be compared and traded off against each other.
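To put the comparison in rough numbers (just a sketch, assuming linear aggregation and writing $T$ for the disutility of 50 years of torture and $s > 0$ for the disutility of one dust speck, none of which is pinned down in the question itself), the trade on offer is

$$ \underbrace{10^{-7}\,T}_{\text{accept the risk}} \quad \text{versus} \quad \underbrace{10^{10^{100}}\,s}_{\text{let the specks happen}} $$

and since a googolplex swamps any finite ratio $T/s$, simple aggregation says to accept the tiny risk of torture; refusing it means treating the two harms as incomparable rather than merely unequal.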
Michael Huemer has interesting things to say on this subject here: https://fakenous.substack.com/p/lexical-priority-and-the-problem
I'd take the dust specks; it will be fine. The 50 years of torture really makes me feel queasy, even if the probability is infinitesimal. But there's a whole other factor here: when the odds are that low, you are really choosing between nothing and some mild discomfort. So it's not really a trade between torture and dust specks; odds that low are effectively the same as zero.
Hmm, so anything could go into that trade, then? A 0.0000001 probability of the extinction of the universe versus dust specks? No, it's better to stick with the safe choice (the dust specks) in both instances than to risk a truly terrible outcome.
Look at it this way: in the original scenario, the dust specks create no trauma, while the torture sure creates a lot of it!
But thanks for the link!
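For what it's worth, the "odds that low are effectively zero" intuition can be set against a quick expected-value figure (a rough sketch, assuming the stated 0.0000001 probability and simple averaging over time):

$$ 10^{-7} \times 50\ \text{years} \approx 10^{-7} \times 1.58 \times 10^{9}\ \text{s} \approx 158\ \text{s}, $$

i.e. about two and a half minutes of torture in expectation: small, but not literally nothing, which is where the two readings of the gamble part ways.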