
Against the Future

Longtermists argue that the uncountable number of possible future people makes their suffering more important than all living beings put together. By saying this, they create a utility monster which influences the present retrotemporally toward its own creation. A hyperstition.

For example, creating a superintelligent AI is an existential risk, because it could accidentally or deliberately extinguish our species. Thus no matter what value we ascribe to this AI, it cannot outweigh the near-infinite future lives that might be lost by its creation.

Other catastrophes will not make us go extinct, but they will cause great suffering in the present and short-term future. Climate change, ecosystem collapse, nuclear war. These are discounted by longtermists, as whatever fraction of humanity survives will still become near-infinite in time.

Since future humans are so valuable in this philosophy, and since this philosophy is popular among the most powerful people in the world, human suffering will continue to rise as the sheer number of humans is maximized. That is why I’m proposing we kill the future.

It won’t be hard. All we have to do is set the intention to have no future. To commit ourselves to using as much carbon as possible, right now, to maximize present human enjoyment. Party, travel, game. Get new clothes and discard old ones. Bedazzle the world with useless bullshit.

We have to set ourselves against the future. Militarize intergenerational conflicts. Learn to hate our descendants, resent them for their implicit demand of existence. Know that they are the enemy, they want us to suffer and die for their smallest conveniences. They rule us.

We have always been at war with the future.

Future people want our oil and gas. They want our helium, uranium, all our rare resources. They want us to capture our solar system in a micro black hole, to store the energy of the Sun until the dying days of the universe, when it will be so much more valuable! That’s OUR sun!

We could be enjoying that sun right now if future people weren’t trying to STEAL it out from under us! Now it’s cloudy outside. We don’t like that, folks, do we? No.

Longtermists — traitors to the present — they want you to maximize the total number of people in the universe. They’re future lovers. They want digital people, running in the billions on server farms. And they’re not replicating their best. Some of these emulations are criminals.

And if you think about it. If you think about it. Future people have all kinds of technologies. They’re more advanced than us militarily. They could be up to things we don’t even know about. Okay, I’ll say it. Time travelers. Yes, I said it. I’ll say it again. No I won’t.

There could be time travelers among us right now. Anyone could be a time traveler. Your neighbors, your parents. If they were among us right now, working against us, trying to influence us toward their goals, what would it look like? That’s right. Longtermism.

These twisted psychos are trying to protect the future! They’re trying to help these people who want our things. Who want us to work ourselves to the bone for their comfort, their pleasure, when we’ll never see any of it! They work for the Enemy.

That’s why today I’m calling on all short-termists to band together. Stand up and say what we are willing to fight for. No solutions for future people. No thinking of the children. We are going to go to a fun little event for Halloween because we work hard and we deserve a treat.

If you see something, say something. We do not want time travelers in our schools, our neighborhoods, our country clubs! We do not want them building skyscrapers for future people to live in. We do not want them providing renewable power so many more people can live comfortably.

The people of the future have no genders and no nations and no religions and no wars! They are morphologically free, to become crabs or tanks or floating orbs! They float around in elaborate space colonies with their own ecosystems and political sovereignty! They must be stopped.

In philosophy, the conclusion that expanding the total population is good even at the cost of greater individual suffering is known as the “repugnant conclusion”. I offer you the “delightful conclusion”: Future people are the enemy, but we can defeat them by enjoying ourselves.

Don’t reuse, don’t reproduce, don’t recycle. The world is already perfect. It is not here to be improved, but to be enjoyed. Every moment we strive to make it other than it is, we fail to notice its beauty and we create more suffering. We must alleviate suffering.

The fastest way to accomplish this is through a simple two-step process: 1) Redistribute all global resources evenly, one share per person, to be used up by the time they die. 2) Sterilize all individuals so no more will be created. We can end suffering in one generation.

Note: This is my preferred proposal, but there are many others. Please become familiar with the literature on “extremely effective altruism”, or EEA, before arguing with my plan. It’s not my job to educate you.

All around the world, short-termists are uniting in a holy war against longtermist propaganda and the time-traveling future lovers among us. We need your help. You can get involved with EEA, go into X-Risk R&D, or, if you’re of a more militant type, DM me about unique opportunities.

I Want You 🫵 To Join The Time War Today! Be On The Side Of History 🕰️ Don’t Let Them Leave 🚀 Don’t Let Them Live 🧬 👶 They Will Not Replace Us 👶

Good question. Why should we work and die for the comfort of hypotheticals?? Except that the hypotheticals have agents in our timeline who persuade us that they matter. We must destroy all possible branches of the future so they may not influence us!

