by levlaz on 6/21/25, 4:49 AM with 13 comments
by xg15 on 6/21/25, 9:21 AM
I had to chuckle.
by happyPersonR on 6/21/25, 12:21 PM
by xg15 on 6/21/25, 9:20 AM
Wow. And that's the best-case scenario in the article. Explain to me again how that is the glorious future in which everyone is better off.
I think an important lens to view those developments with, apart from income and income security, would be alienation [1].
The industrial revolution brought alienation with respect to physical labor: Workers were no longer able to identify with (and learn the entire process of creating) a manufactured good as in preindustrial times. Instead, they were made to specialize in one specific production step inside the factory, which reduced their role to the proverbial "cog in the machine". Suddenly, they weren't working to produce a good anymore, they were working "for the factory".
If the article's vision plays out (the "Shared Upswing" one, i.e. the good scenario), then the same alienation process will play out for cognitive labor: You won't think (and gain the knowledge/experience) to come up with solutions to problems that other people have; you will think to make some nebulous and hard-to-quantify improvements for an AI, so that it can think about how to solve the actual problems - as directed by its owners. I.e. you will work "for the AI".
Even if (if!) those jobs stay economically viable enough to make a living, they sound extremely unfulfilling and much more psychologically draining than today's jobs.
[1] https://en.m.wikipedia.org/wiki/Marx%27s_theory_of_alienatio...
by mooiedingen on 6/21/25, 10:04 AM
> Altman: Is this intelligence?
> Altman: We gonna achieve AGI!!
> Altman: Gib money pleaz
by physix on 6/21/25, 6:16 AM
From what I understand, we are far from achieving AGI, and the article would have benefited from defining the term and putting it into perspective.
Because the disruption is significant even without AGI.
by tim333 on 6/21/25, 12:32 PM
That's an awful lot of money chucked into the AI boom. It'll either be a big impact or a big bust.
by datavirtue on 6/21/25, 4:23 PM
This discussion is barely interesting at this point.
by ath3nd on 6/21/25, 10:01 AM
When climate-disaster-induced fires or tornadoes or tsunamis hit the data centers, I like to think we'd rather spend a bigger chunk of our economy on food and housing. But who cares about us plebs if Sam Altman finally gets an (AI) girlfriend.
> Employers are testing replacements for workers, not just systems to augment them
That's a tale as old as time. One has to hope that capitalists would cling to oppressing humans out of habit and nostalgia and still employ us. Because, as we know from all modern economics books, if one is not in gainful employment, they are useless to society. We always have the "sell your blood" option until AGI invents cheap synthetic 3D-printed blood, which, if you trust people like Elon, should happen aaany moment now.
After all, making an AI work overtime to create shareholder value has to be less satisfying to the current Amazon RTO-policy makers than forcing real humans to sit in traffic for no gain in productivity.