Hacker News

> Because this could actually be the optimal policy if we take your view of "second persons" to its optimum.

Maybe if you're evil enough to not care about any individual human's quality of life. Is there a word for the logical fallacy where you argue against the most absurd possible interpretation of a person's beliefs in order to feel no guilt for disregarding them?



The idea is that, due to exponential growth over a long enough horizon, the utility of any one person's happiness right now rounds to zero compared to the utility of filling the planet with, let's say, hundreds of billions of people living at barely-alive standards. Even if each individual person's life is 100x worse than present day, it doesn't matter in aggregate, since there are billions more such lives.
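The arithmetic being described can be sketched with made-up numbers (the population sizes and per-person utilities below are illustrative assumptions, not figures from the thread):

```python
# Total utilitarianism: a world's value is the sum of everyone's utility.
# All numbers here are illustrative assumptions.

happy_population = 1_000_000_000          # 1 billion people
happy_utility_each = 100                  # high quality of life

crowded_population = 200_000_000_000      # 200 billion people
crowded_utility_each = 1                  # lives 100x worse, but still "barely worth living"

total_happy = happy_population * happy_utility_each        # 100 billion utility
total_crowded = crowded_population * crowded_utility_each  # 200 billion utility

# Summing utilities ranks the crowded, barely-subsistence world as "better",
# which is exactly the objection the comment raises.
print(total_crowded > total_happy)  # True
```

So long as the population can grow faster than per-person utility shrinks, the sum always favors more, worse-off people.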

This is the standard objection to any aggregate-utilitarian theory of morality.

https://utilitarianism.net/population-ethics/

It's a stupid idea that's been soundly rejected (Parfit called it the "Repugnant Conclusion"), because it licenses arbitrarily bad living conditions on the grounds that "anything" is better than nothing.



