Utopia:
If there's only a 5% chance we go extinct, that means there's a 95% chance that human experience lives on. Should we not spend more time ensuring that future goes well? Should we not dedicate more resources to ensuring we get to post-instrumental utopia?
The future:
Let's say someone else is playing a game. The outcome of the game is that there's a 10% chance your parents die and a 90% chance your parents get to utopia. If you killed the person playing this game to stop it, would you be wrong?
Nightmares:
If you are an EA, you believe that conscious suffering matters. In-the-moment suffering counts: if you are tortured and your memory of the event is then wiped, the torture was still bad. If you are a shrimp and you suffocate once brought onto land, that is bad (possibly). Well, what about nightmares? There are some nightmares I've had that I certainly remember, and I'm fairly confident I was facing actual psychic distress in the moment. Should limiting the number of nightmares people have, or their intensity, be a new cause area?
Breakups:
Breakups are among the most negatively impactful events in most people's lives. I'd much rather break a bone than get a divorce, and it's not even close; pain in the moment is hard to compare to the toll of failed relationships. In a country where most middle-class families can put food on the table, yet nearly half of marriages end in divorce, we might be missing some low-hanging fruit.
Magnitude:
EAs are rarely directionally wrong about things. Sure, they mess up the magnitude, but the direction is usually correct. Animal welfare is a good example of this.
The Repotato Conclusion:
Are plants morally valuable? Is a potato? How many potatoes equals one human life?
Life:
It is very hard to live life outside the Overton Window. It's easy to claim to be an independent free thinker who stands up for their ideas. But when actually faced with public mockery and shame, one realizes how hard life can become.
Digital:
Consciousness is also subject to the anthropic principle: ours may be the only sort of universe in which consciousness can exist at all. If so, this universe's physics is hospitable to consciousness, which may make things like digital consciousness more likely.
AI:
We basically want the future ASIs to think humans are utility monsters. That is the control problem.
Simulation:
If you take simulation theory seriously, you think that we are probably digital minds. In which case, you should probably care a lot about how digital minds are treated.