by raptorraver on 4/30/23, 7:23 PM with 30 comments
by calderwoodra on 4/30/23, 10:17 PM
by yowzadave on 5/1/23, 1:45 AM
Well this skewers the worst tendencies of the HN comment section…
by gregfjohnson on 5/1/23, 12:01 AM
by labster on 5/1/23, 12:20 AM
I’m talking about our first AIs, the corporations. They maximize their goals at any cost, live forever, assimilate other corporations, and frequently trick humans to engage in unsafe behavior for their own benefit. They expand to control more and more territory, concentrate power to themselves, and seek to make humans dependent on them — while at the same time polluting everywhere to make the human race weaker. They don’t intend this, they just have an alien code of ethics that worships shareholder value as the height of virtue.
How is computer AI going to be worse than what we’re already doing?
by cholmon on 5/1/23, 12:11 AM
by twic on 5/1/23, 12:32 AM
https://wn.rudolfsteinerelib.org/RelArtic/BlackDavid/DB1981/...
Seems the author does technical due diligence at a VC firm now:
https://www.blackliszt.com/2021/10/what-is-technical-due-dil...
by sbaiddn on 4/30/23, 11:31 PM
His Part Two is available (to subscribers, at least).
by zamnos on 5/1/23, 12:27 AM
The digital revolution, which began some 30 years ago, has little to do with the more recent advances in AI/ML. The reason digital technology feels so revolutionary is because it is: it's our first glimpse into a post-scarcity future. The cost of distributing a digital work to all of humanity is low, and approaches zero if we get rid of copyright.
Copyright is an antiquated system, suitable only for a pre-digital age. We need a better system, one that incentivizes the creation of works while also admitting that using DRM to make bits uncopyable is like trying to make water not wet. If instead every copy distributed over the Internet, including via BitTorrent, paid the author back, suddenly it's not a problem anymore. There are other problems and details to be worked out, but the first step is admitting you have a problem.
by NumberWangMan on 4/30/23, 11:02 PM
Both yes and no, in a sense.
https://slatestarcodex.com/2014/07/30/meditations-on-moloch/
We are doing this because our individual incentives are not aligned with those of the group. Moloch is the god of misaligned incentives, of coordination problems, of arms races and races to the bottom. It's a property of any system with selfish actors, which means any system that humans are part of, and probably any system that intelligent beings arising out of natural selection would be part of.
From "Meditations on Moloch":
> There’s a passage in the Principia Discordia where Malaclypse complains to the Goddess about the evils of human society. “Everyone is hurting each other, the planet is rampant with injustices, whole societies plunder groups of their own people, mothers imprison sons, children perish while brothers war.”
> The Goddess answers: “What is the matter with that, if it’s what you want to do?”
> Malaclypse: “But nobody wants it! Everybody hates it!”
> Goddess: “Oh. Well, then stop.”
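The misaligned-incentives dynamic described above (racing is individually rational but collectively ruinous) can be sketched as a toy payoff game. This is an illustrative sketch only; the choice names and payoff numbers are hypothetical, not from the thread:

```python
# Toy two-player game illustrating a race to the bottom: each actor chooses
# "restrain" or "race". Payoff numbers are hypothetical, chosen so that
# racing dominates individually even though mutual racing is worse for both.

# (my_choice, their_choice) -> my payoff
PAYOFF = {
    ("restrain", "restrain"): 3,  # everyone coordinates: good shared outcome
    ("restrain", "race"):     0,  # I restrain while they race: I lose out
    ("race",     "restrain"): 5,  # I race while they restrain: I win big
    ("race",     "race"):     1,  # race to the bottom: both worse off than (3, 3)
}

def best_response(their_choice):
    """Pick the choice that maximizes my payoff, holding theirs fixed."""
    return max(("restrain", "race"), key=lambda mine: PAYOFF[(mine, their_choice)])

# Racing is the best response no matter what the other actor does...
assert best_response("restrain") == "race"
assert best_response("race") == "race"

# ...so selfish actors land on (race, race), which pays each of them less
# than mutual restraint would have.
print(PAYOFF[("race", "race")], PAYOFF[("restrain", "restrain")])  # 1 3
```

Each actor defecting is individually optimal, yet both end up with 1 instead of 3: no single actor can fix it by unilaterally restraining, which is what makes it a coordination problem rather than a failure of will.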
by throwaway202303 on 5/1/23, 12:39 AM
by smitty1e on 5/1/23, 1:24 AM
Animus against that catalyst is so much hormonal groaning.
Human nature is constant.
by mftb on 4/30/23, 11:17 PM
> I cheated a bit there, I admit. I changed one of the words. The name that the saint used in that passage was not ‘Ahriman’. It was ‘Antichrist.’
That's manipulation. That's gross. And that's a shame, because there are some interesting ideas in the post.