by shayonj on 6/22/25, 5:11 PM with 5 comments
by proc0 on 6/22/25, 6:28 PM
This is highlighted in statements like this one:
> For AI to truly transcend human intelligence, it would need to learn from something more intelligent than humans.
Just imagine a human with a brain the size of a large watermelon. If the brain is like a computer (let's assume functional computationalism), then a larger brain means more computation. This giant-brained human would have an IQ of 300+ and could singlehandedly usher in a new age in human history... THIS is the analog of what AGI is supposed to be (except a lot more, because we can have multiple copies of the same genius).
Circling back to the article, this means that an AGI by definition would have the capacity to surpass human intelligence just as a genius human would, given that the AGI processes information the way human minds do. It wouldn't just synthesize data like current LLMs; it would actually be a creative genius and discover new things. This isn't to say LLMs won't be creative or discover new things, but the way they get there is completely different, more akin to a narrow AI doing pattern matching than to a biological brain, which we know for sure has the right kind of creativity to discover and create.
by PaulHoule on 6/22/25, 5:48 PM
If you have a framework stacked up to do it and you are just connecting to it, maybe, but I'd expect it to take more than 50 lines in most cases. And if somebody tried to vibe code it, I'd expect the result to be somewhere between "it just doesn't work" and "here's a ticket where you can log in without a username and password".
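To make the failure mode concrete, here is a minimal sketch of the bug class being described: a login route that "works" in a demo but never verifies credentials, next to a correct version. Flask is assumed purely for illustration; the route names and the USERS store are hypothetical, not from any real codebase.

    # Sketch only: contrasts a credential-skipping login with a correct one.
    from flask import Flask, request, session, abort
    from werkzeug.security import check_password_hash, generate_password_hash

    app = Flask(__name__)
    app.secret_key = "change-me"  # placeholder; a real app needs a strong secret

    # Hypothetical user store with one pre-hashed password.
    USERS = {"alice": generate_password_hash("correct horse")}

    @app.post("/broken-login")
    def broken_login():
        # The bug class: any username (or none at all) gets a session.
        session["user"] = request.form.get("username", "anonymous")
        return {"status": "logged in"}

    @app.post("/login")
    def login():
        # Correct version: reject unless the submitted password matches
        # the stored hash for that user.
        user = request.form.get("username", "")
        pw = request.form.get("password", "")
        stored = USERS.get(user)
        if stored is None or not check_password_hash(stored, pw):
            abort(401)
        session["user"] = user
        return {"status": "logged in"}

Both routes demo fine with valid input, which is why the broken one can survive a casual vibe-coding check; only a negative test (wrong or missing password) exposes the difference.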