by picozeta on 10/9/22, 4:15 PM with 1 comment
Though important stuff like:
- getting images (via iterations of Stable Diffusion; see the sketch below)
- deriving facts (via GPT iterations and Wikipedia)
- mapping (via just compressing OpenStreetMap into ~100 GB)
- math (via open-source Jupyter iterations)
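As a rough sketch of the first point: with the model weights on disk, images could be regenerated on demand rather than stored. This assumes the Hugging Face diffusers library and a CUDA GPU; the checkpoint name and prompt are placeholders, not anything prescribed above:

  # Regenerate an image locally instead of storing it (sketch).
  # Assumes the Hugging Face diffusers library and a CUDA GPU;
  # checkpoint name and prompt are just placeholders.
  import torch
  from diffusers import StableDiffusionPipeline

  pipe = StableDiffusionPipeline.from_pretrained(
      "runwayml/stable-diffusion-v1-5",  # example checkpoint name
      torch_dtype=torch.float16,
  )
  pipe = pipe.to("cuda")

  # After the one-time download of the weights, generation runs locally.
  image = pipe("a watercolor map of a small coastal town").images[0]
  image.save("generated.png")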
What do you think?
by smcn on 10/9/22, 6:27 PM
20 years is a long enough time that anything could happen, but I don't personally see it. The main thing isn't size; it's keeping things updated.
You give Wikipedia as an example: the issue is not how big it is currently, but that it's constantly updating and evolving, so you either eat the update cost every time you open the application, or you eat it when you load specific pages (and if you do that, you may as well just have a web browser).
And that same scenario will happen with most of the examples: Amazon with products/pricing, StackOverflow with questions/answers, OSM with routes/businesses. I just don't see why it would be better to store these things locally when, in Amazon's case, I may only ever want to see a certain number of products.
My personal expectation is that hardware will get so tremendously good that software developers will be able to be even lazier, meaning software will still feel as slow as it always has.