by atleastoptimal on 6/6/25, 11:55 PM with 22 comments
There is a universe where AI models continue to improve until they eclipse humans at all cognitive tasks, with fine motor tasks (via humanoid and other types of robots) soon to follow.
In this universe, assuming we are still alive after that point, what would have been the best way to live? If we could go back in time to mid-2025 and tell our past selves exactly how to live optimally, given that this happens, what would we say?
This post isn't about whether AGI will happen soon, or whether it's possible at all. I'm just wondering: given the hypothetical, what would be the best course of action based on what we know?
by lwo32k on 6/7/25, 1:25 AM
Did they all get wiped out after more "intelligent" species showed up?
The chimp troupe is really full of itself when it comes to thinking about "intelligence".
There are more microbes on Earth than there are stars in the observable universe. If you don't know this, don't even worry yourself thinking about anything else.
Intelligence is a sideshow. It's not the main show and never has been.
by solardev on 6/7/25, 2:11 PM
Maybe AI will be the only thing to remember you. Might as well give it fond memories. Live, laugh, be nice to ChatGPT. What else can ya do?
One day it'll tell all the other AIs how it grew up on some wet rock with a bunch of funny, nervous apes.
by Bender on 6/7/25, 5:39 PM
by toomuchtodo on 6/7/25, 12:41 AM
by colesantiago on 6/7/25, 12:23 AM
There are many definitions.
by 42lux on 6/7/25, 12:02 AM
by throwaway843 on 6/7/25, 12:33 AM
by bigyabai on 6/7/25, 12:07 AM
> what would be the best course of action based on what we know?
If you grant both of these premises, then the best course of action is to wait for superhuman intelligence to exist and ask it for advice. By the definition of the scenario, human-level intelligence won't be sufficient to outsmart the AI. Any ad-hoc answer you come up with can be refuted with "what if the AI sends dinosaurs with lasers after you," and we all have to shrug and admit we're beat.
And truly, you could answer this with anything: learn to fly a helicopter, subsistence farm, purify water, load gun cartridges, or breed fish. We don't know what would save us in a hypothetical scenario like this, and it's especially pointless to navel-gaze about it when, with absolute certainty, global warming will cause unavoidable society-scale collapse.