
In *The Problem of Pain*, C. S. Lewis points out that it's easy for the brain to get overwhelmed when thinking about lots of people suffering a lot, because we imagine the pain adding up linearly. But pain doesn't stack up like that. When contemplating the question of how much human beings can suffer, the only thing you need to take into account is the maximum amount that *one* person is capable of suffering. Of course, that is still a huge amount - but the point is that consciousness is what it is. It's not lots of little things in a pile.

Only marginally relevant to what you're discussing here, but it has always stuck with me.


Thanks for this!!!

It’s strange for me to read about EA, since I, too, read the logic as erroneous yet appreciate the good being done, which is weirdly similar to how I feel about proselytizing religious charities.


Is there some probability at which you would be willing to accept a risk of torture to avoid a googolplex of dust specks? Would you accept a 0.0000001 probability of 50 years of torture to avoid a googolplex of dust specks? If so, it sure seems like you have to admit that torture and dust specks can be compared and traded off against each other.
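To spell out the implicit expected-value step (a sketch; the disutility variables $D_T$ and $D_S$, for 50 years of torture and for one dust speck, are assumed notation rather than anything from the original thought experiment): accepting the gamble amounts to judging that

$$p \cdot D_T \;\le\; 10^{10^{100}} \cdot D_S,$$

and with $p = 10^{-7}$ this gives $D_T \le 10^{10^{100}+7} \cdot D_S$, a finite exchange rate between torture and dust specks, which is exactly the commensurability at issue.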

Michael Huemer has interesting things to say on this subject here: https://fakenous.substack.com/p/lexical-priority-and-the-problem
