by szemy2 on 1/9/23, 11:46 AM with 169 comments
by klelatti on 1/9/23, 1:42 PM
The difference for me is that Web3 was never shown to be at all useful.
by iambateman on 1/9/23, 3:04 PM
ChatGPT is trivially useful in a lot of cases... The other day I asked it to write marketing copy for a project and it wrote _better_ marketing copy than I could have written in an hour. On another project, I spent 10 minutes integrating the OpenAI library and was programmatically receiving incredible results with almost no effort.
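For context on how little glue code that takes, here is a rough sketch of the kind of call I mean, using OpenAI's Python client. The prompt, model name, and parameters are illustrative placeholders, not exactly what I used:

    import openai

    openai.api_key = "sk-..."  # your API key

    # Ask the model for marketing copy; prompt and settings are illustrative only.
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt="Write three short, upbeat taglines for a note-taking app for students.",
        max_tokens=200,
        temperature=0.7,
    )

    print(response["choices"][0]["text"].strip())

That is essentially the whole integration; everything else is deciding what to do with the text that comes back.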
The nature of predicting the future means that there will be periods of overconfidence. In the case of Web3 (taken to mean blockchain/crypto/digital coins), the overconfidence was fueled by a core Ponzi scheme combined with truly extraordinary returns for early speculators. But AI has no Ponzi scheme attached to it, so the comparison breaks down. My uncle cannot gamble his retirement on a mysterious promise of überwealth from a man with wild hair and no financial experience.
AI companies will have lots of false starts but 2022 was a transformational year and we are only getting started.
by a4isms on 1/9/23, 2:18 PM
This led to idle speculation: if it were possible to short early-stage startups that VCs were backing, there would be as much incentive for the media to discuss a startup’s shortcomings and vapourware promises as there is to repeat their breathless braggadocio.
by Kaotique on 1/9/23, 12:37 PM
by jackmott42 on 1/9/23, 2:12 PM
The negative takes are mostly correct about the limitations they point out, but what they miss is how amazing these things are despite those limitations. These things are remarkably simple and limited, yet they can generate realistic photos of myself in places I've never been, wearing clothes I've never worn, doing things I've never done, from a quick text sentence. Or nearly pass the bar exam.
On top of that, a lot of the limitations have straightforward ways to address them, many of which are already in progress. It is going to get really interesting. Stable Diffusion knows nothing about the images it produces; it's just repeated denoising with image targets (a rough sketch of that loop is below). It doesn't really understand anything about your text either; it's just matching up tags. But both of those things can easily change. Put a big language model in front of it to better understand text. Variants of these image models already have depth information. Next up: 3D object information, then maybe models of physics so it can understand how things would actually work in the scene, and so on.
going to get wild.
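To make the "repeated denoising" point concrete, the sampling loop is roughly the following. This is schematic pseudocode only; text_encoder, unet, and scheduler are stand-ins for the real components, not an actual library API:

    import torch

    def generate(prompt, text_encoder, unet, scheduler, num_steps=50):
        # Turn the prompt into a conditioning vector (CLIP-style text embedding).
        cond = text_encoder(prompt)

        # Start from pure Gaussian noise in the model's latent space.
        latent = torch.randn(1, 4, 64, 64)

        # Repeatedly predict the noise and remove a bit of it, nudged by the text condition.
        for t in scheduler.timesteps(num_steps):
            predicted_noise = unet(latent, t, cond)
            latent = scheduler.step(predicted_noise, t, latent)

        # A VAE decoder would turn the final latent into pixels; omitted here.
        return latent

Nothing in that loop "knows" what a scene is, which is exactly why bolting better language understanding, depth, and eventually physics onto it looks like low-hanging fruit.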
by snowwrestler on 1/9/23, 1:26 PM
The biggest challenge for generative AI is its willingness to make things up. It’s fine when you’re playing around. Not so fine when you’re expecting it to actually help you in a real way.
I suspect this is why Google has not debuted such an interface despite literally decades of work on AI. You have to be able to bolt a “truth filter” onto the AI, which seems difficult.
by qabqabaca on 1/9/23, 1:08 PM
Content generation AI is so obviously useful to the majority of people, and you don't need to understand how it works to be impressed by it.
by api on 1/9/23, 1:20 PM
The comparison with web3 is way overblown, though. This AI stuff is at least somewhat actually useful. Web3 was a gigantic billion-dollar bubble that produced very little in the way of things that are useful for any purpose, even playing around. It's one of the most vapid bubbles in history outside of pure financial-instrument bubbles.
by WinstonSmith84 on 1/9/23, 1:52 PM
- 15 years ago we got smartphones and people expected laptops and desktop computers to go away (within the next few years)
- we got web3 and people suddenly expected banks, tradfi, etc. to go away
- we got ChatGPT 2 months ago and now what? Google dead? School / university irrelevant? Threat to humanity?
There will be incremental improvements in AI; it will slowly become a bigger part of our lives, as has happened with these other technologies.
by mihaic on 1/9/23, 1:55 PM
In the next decade, I see AI tackling a special category of problems: those that shouldn't be solved, or else the system as a whole gets worse -- chatbots for customer support, absurd amounts of content creation. There's much more content today than 20 years ago, and yet my enjoyment has gone down. If I were a member of Congress, I'd surely be thinking about ways to slow this down.
by marban on 1/9/23, 1:13 PM
by netman21 on 1/9/23, 3:48 PM
I completely ignored the hype around blockchain, NFTs, etc. I even block anyone who says they are a blockchain expert on LinkedIn.
But I am all in on leveraging LLMs for new business use cases. My mind is blown at what we can do with DaVinci 3.5, and I'm looking forward to evaluating 4.0. ChatGPT is a demonstrator. DaVinci is ushering in the next wave of innovation.
by version_five on 1/9/23, 1:53 PM
OTOH, my impression is that people who have been involved in ML for a while didn't have any sea change in their opinion of the technology based on the recent advances. These were predictable, but cool, extensions of things that were already known, and represent a fundamental advance in polish and marketing rather than in technology.
My point is that the public discourse is now mostly dominated by people looking to profit from hype, not people who actually have experience with the technology, which is of course going to lead to a web3-type feel.
Incidentally, below is my prediction for 2023 from new year's eve. I didn't think it would start becoming apparent so quick: https://news.ycombinator.com/item?id=34197033
by superkuh on 1/9/23, 1:34 PM
The utility of AI can be real, just as the utility of Bitcoin is, but in the public perception it'll be drowned out by things like "Quantum AI trading platforms" or NFTs.
by michaelcampbell on 1/9/23, 1:44 PM
by sharemywin on 1/9/23, 4:01 PM
As for GPT, I think it is revolutionary. And it is the first chatbot that I prefer to work with over Google. Way easier to use than Google.
If it becomes a paid service (assuming the pricing is right), I feel its incentives align better with mine than Google's do.
Where things get messed up is recommending products and services. If it can stick to a no-pay-to-rank type of service, then it would revolutionize the economy.
No assistant is perfect, but it seems to do better than a lot of human assistants.
Will it be perfect in the future? No, but I think if it can offer some kind of confidence factor with its answers, that would go a long way.
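Something in that direction is already half-possible with the raw models: the completion API can return per-token log-probabilities, and averaging them gives a very crude confidence signal. A rough sketch, not a proper calibration method; the model name and prompt are just placeholders:

    import math
    import openai

    openai.api_key = "sk-..."

    response = openai.Completion.create(
        model="text-davinci-003",
        prompt="Q: In what year did Apollo 11 land on the Moon?\nA:",
        max_tokens=20,
        temperature=0,
        logprobs=1,  # also return log-probabilities for the sampled tokens
    )

    choice = response["choices"][0]
    token_logprobs = choice["logprobs"]["token_logprobs"]

    # Average per-token probability as a rough "confidence factor" between 0 and 1.
    confidence = math.exp(sum(token_logprobs) / len(token_logprobs))
    print(choice["text"].strip(), f"(confidence ~ {confidence:.2f})")

Whether that number actually tracks truthfulness is another question, but it's the kind of signal a product could surface.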
by more_corn on 1/9/23, 4:46 PM
AI as it exists TODAY has the potential (with a bit of prompt engineering and a free account) to assist everyone in their jobs. It will disrupt the software industry, art, writing, education, law, and science.
Specialized AI assistants already exist and work startlingly well. It can write at a college level. It can code at a college level. It can learn specialized knowledge-worker skills in a trivial amount of time (law, for example).
It’s ok to not be optimistic. But if you discount it entirely you’re in for a bad time in the coming year, three years, five years.
Actually no, do whatever you want. AI will come as a surprise to something like 6 billion people. There's no harm to me in you being part of that group. And frankly, the existence of different opinions about the future is a great hedge every society makes.
by nsm on 1/9/23, 8:01 PM
I find the likes of ChatGPT wonderful, but in the end image and copy generation seems like a very first-world/"content creator" use case that won't help us solve critical problems.
by birdymcbird on 1/9/23, 12:54 PM
I think this is cynical. The author is right to point out ads and marketing using hype for clicks; people do things to make money on the latest trend, that's just nature.
But I don't know of anything productive from web3; it was hype all the way through. English is not my first language, and it almost feels like magic to have ChatGPT clean up and rewrite my posts. I've also used it for coding examples. I don't think this is equal to web3. Maybe it needs time to mature to billion-dollar scale, but I wouldn't bet money against it.
by peter-m80 on 1/9/23, 3:57 PM
On the contrary, web3 is nothing more than empty promises.
by mikerg87 on 1/9/23, 12:47 PM
by Havoc on 1/9/23, 2:49 PM
by rayiner on 1/9/23, 2:57 PM
In particular, the collapse of faith in the future of Tesla's full self-driving, and in self-driving cars generally, has been palpable.
by juujian on 1/9/23, 2:27 PM
by NDizzle on 1/9/23, 1:45 PM
by Isinlor on 1/9/23, 2:32 PM
Current AI models have three main limitations:
- rapid skill acquisition. Example: someone invents a new programming language, something like Elm or Rust. Humans can start using it right after reading a "quick start" page and a few tutorials from the authors of the new language. How much training data will GPT-style models need to start using that language? A lot, something like the output of hundreds or thousands of people. This needs to go down by a factor of 10x to 100x to match humans.
- agency, or taking actions directed towards a goal. Example: Can you ask a GPT-like model to book a flight ticket for you? To help you learn Photoshop? A new IDE? To test your latest app or indie game and find bugs? No. The ability of current models is still not good enough to be useful to an average person when it comes to interacting with the outside world.
- acting in the physical world. Example: An average human can learn to drive well enough to pass a driving test in on the order of 100 hours of a driving course. How far away are we from a system that can control a humanoid-style robot and learn to drive in on the order of 100 hours? Currently, we can't do it even with billions of dollars and specially designed hardware. Using a humanoid robot to control a car is not even useful as a benchmark for state-of-the-art machine learning systems.
IMO the currently existing systems like ChatGPT or Stable Diffusion are, all put together, worth in the range of $10B to $100B over the next 3-5 years.
Future systems that address all 3 limitations mentioned above may literally change the whole reachable universe if we decide to build self-replicating space probes (von Neumann probes). We know that there are physical systems that are not very intelligent but are capable of exponential growth, like viruses or bacteria (humans too).
The main limitation of biological systems is their limited adaptability, especially to a lack of water. If robots can build other robots, avoiding the bottlenecks of human intellectual and manual labor, then the robots are limited only by the resources and energy available. We have plenty of both on Earth, and the Solar System is full of them.
Also, I'm just talking about human-like abilities. All of that is possible without involving concepts like superintelligence. Bacteria are not very smart, but they can multiply exponentially.
The spread of possibilities is enormous.
All of that hinges on your timelines for the 3 limitations mentioned above. It's like predicting when the atomic bomb would become possible. It may not happen for decades, or there may be, right this very moment, someone with an idea that will make it all possible.
by rafaelero on 1/9/23, 4:08 PM
by karmasimida on 1/9/23, 2:42 PM
What is web3, actually? I don't think there is a definition people can agree on. So even bashing web3, as if it were anything other than vaporware at best and a scam at worst, is fruitless and not grounded in reality.
Now let's talk about the current climate in AI. From his tweet, he seems to be pointing to Sam Altman's statement that GPT-3.5+ is going to be a 'civilization transformation'. Well, if you follow Altman's past statements, he likes to make grandiose claims like this, and at the end of the day, it is more his personal style of speech than anything else.
So let's get back to the comparison of web3 vs. GPT-style AI agents. I would argue the latter is, indeed, already partially a reality. Imagine you have multiple modular in-context learning agents; then here is the opportunity: a machine that is programmable through natural language and examples/demonstrations. The level of automation potential is going to be insane, even scary. What if we hook it up with some robotic arms? If someone makes this work, we will have a factory that can make many, many things, at the same time, in the same place, with little human involvement. This, of course, is going to fundamentally change a wide range of industries, or capitalism itself, and will have geopolitical implications.
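The "programmable through natural language and examples" part is already easy to demo with today's completion APIs: you "program" the model by putting a few demonstrations into the prompt and letting it continue the pattern. A toy sketch; the task, command format, and model name are all just illustrative:

    import openai

    openai.api_key = "sk-..."

    # "Program" the model with a few demonstrations instead of writing code:
    # here, turning free-form requests into structured warehouse commands.
    prompt = (
        "Request: move the red box to shelf 3\n"
        "Command: MOVE(item=red_box, dest=shelf_3)\n\n"
        "Request: count how many blue crates are on the floor\n"
        "Command: COUNT(item=blue_crate, location=floor)\n\n"
        "Request: put the small pallet next to the loading dock\n"
        "Command:"
    )

    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=40,
        temperature=0,
        stop=["\n"],
    )

    print(response["choices"][0]["text"].strip())
    # Expected continuation along the lines of: MOVE(item=small_pallet, dest=loading_dock)

Swap the demonstrations and you've "reprogrammed" the machine without touching any code, which is the unsettling part.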
by ackbar03 on 1/9/23, 2:19 PM
by anotheryou on 1/9/23, 1:02 PM
by labrador on 1/9/23, 11:52 AM
by marginalia_nu on 1/9/23, 12:35 PM
by spaceman_2020 on 1/9/23, 12:46 PM