by dinobones on 8/23/24, 7:27 PM with 90 comments
I'm curious what things would look like if the internet was suddenly 1000x faster.
Would there be new apps we could build? Would it enable novel use cases?
by oneplane on 8/23/24, 7:40 PM
For non-users (datacenters, companies, cluster systems, etc.) that might mostly help with distributed storage, but again, you'd also need all the other things, because distributing anything at that scale means every piece of the puzzle needs to be able to handle it. Within a datacenter or a public cloud, even top speeds of 400Gb/s (less than half of the thought experiment) are at such a high tier that they aren't useful for individual applications or nodes.
Something that would actually make an impact once you get past around 2Gbps would be lower latency, lower jitter, net neutrality (including getting a full connection with no administrative limits), and more peered routes. More bandwidth doesn't really do much, especially if you can't use it anyway when the number of connections, the latency, and the computing equipment don't scale with it.
When you have low enough latency combined with high enough bandwidth you can start to actually get new apps and also develop novel use cases (imagine a CPU-to-CPU interconnect that can span continents). But the speed of light (or some more exact latency-per-distance quantity) prevents that.
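To put a number on that latency floor, here is a rough sketch, assuming (illustratively) that light in fiber travels at about two-thirds of its vacuum speed and that the route is a 5,600 km great-circle path, roughly New York to London:

```python
# Best-case round-trip time between New York and London over fiber.
# Assumptions (illustrative): light in glass moves at ~2/3 the vacuum
# speed of light, and the path is a 5,600 km great circle with no
# routing detours or switching delay.
C_KM_S = 299_792.458          # speed of light in vacuum, km/s
fiber_km_s = C_KM_S * 2 / 3   # ~200,000 km/s in fiber
distance_km = 5_600

rtt_ms = 2 * distance_km / fiber_km_s * 1_000
print(f"best-case RTT: {rtt_ms:.0f} ms")  # ~56 ms
```

Main memory answers in roughly 100 ns, so even an ideal transatlantic link is on the order of 500,000 times slower per round trip than a local memory access, and no amount of bandwidth changes that.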
Beyond the likes of BitTorrent and Gnutella we're not likely to see network-based ideas that are currently impossible due to limits on the average speed. Perhaps the real problem right now is the lack of universal availability of reasonable connectivity.
by neveroddoreven on 8/23/24, 7:59 PM
That approach would allow for thinner, cheaper devices with longer battery lives since their hardware is only responsible for processing the video stream from the cloud and rendering it on a screen.
Perhaps it's not a direct result of 1TB/s, but such internet speeds would likely have the second-order effect of providing extremely robust streaming infrastructure that enables such a use case.
by jaysonelliot on 8/23/24, 7:53 PM
Now, we share complete video files and music files, whereas before we shared vector-like files such as Flash and MIDI.
What are we doing locally today that could not be sent over our current bandwidth? Is it something that will affect telepresence, like all the 3D data needed to recreate a realistic environment in real time? Is it about more accurate control of remote objects, like drones and robotic vehicles? Maybe it will enable remotely connected computers to be more efficient clusters, taking advantage of unused cycles during off-peak hours.
I think the biggest impact isn't going to be what happens at the faster speeds that happen in best-case scenarios. It's what will happen when mobile devices in areas with poor reception can achieve 1Gbps reliably and consistently.
by palata on 8/23/24, 7:45 PM
Probably software wouldn't get better, probably we wouldn't solve the real big problems of our time either.
by grvbck on 8/23/24, 8:10 PM
Fewer services would run locally. The typical user would probably not even care (or know) if their photos were stored on the phone or in the cloud.
I don't think that much would change at first, since everything else would become a bottleneck. 16K HDR streaming? Sure, but how many people would have a 16K HDR screen? Lossless music streaming? Already here, more or less, and it does not require 1Tb/s.
Over time, everything would change of course, probably for the worse (for users). Mega corporations would be in possession of all user data and use it for AI training, ad targeting, and all kinds of data extraction we can only dream of.
by gamepsys on 8/23/24, 7:42 PM
I would also disagree with the thesis that internet speeds in the US have stagnated. In 2014 I had about 80Mbps. Today I have about 1500Mbps. In West Coast cities I see high-end condos with access to speeds up to 7000Mbps. Even my friends in pretty rural locations in 'fly over' states have access to hundreds of Mbps thanks to the latest federal grants to build fiber in rural areas. In one case I know someone who jumped from 52k straight to 200Mbps fiber, with cable internet never having been offered at his house.
by knallfrosch on 8/23/24, 7:41 PM
by miyuru on 8/23/24, 7:53 PM
It mirrors the current situation with RAM and Electron apps: websites would be bloated and unoptimized, with GBs of JS/CSS for a single URL.
Realistically, I mostly want better connectivity across the globe (current ISP speeds mostly hold only within a given metro or country) and non-throttling connections that I can fully utilize.
by curiousthought on 8/23/24, 7:43 PM
I wish I were joking, but take the current usage of the internet and scale up each part. 1TB/s might enable new things, but it's more likely to enable more of the old things.
by n_ary on 8/23/24, 7:56 PM
by vaylian on 8/23/24, 8:08 PM
by refulgentis on 8/23/24, 7:47 PM
I also believe we hit a practical wall on this that's observable by the success, or lack thereof, of game streaming.
In a world where there were big savings to be unlocked, things like Stadia, Nvidia's GeForce Now, and Xbox's cloud service would have been notable big wins.
They weren't, which has me firmly believing that incremental speed past "everyone in my household can stream video at 4K when desired" is an expected end state. That's tantamount to saying "once people can see whatever they want, at a resolution indistinguishable from reality, without delay, there won't be mass desire for increased Internet speeds", which seems intuitive.
Anything requiring greater streaming bandwidth (ex. VR) is highly sensitive to latency, which may have also affected the game streaming use case.
If latency approaches ~0 ms (which requires colocation with peering providers), I could see this sort of bandwidth opening up AR a bit more by effectively reducing compute requirements in such a small form factor, but that's kinda it.
by jiggawatts on 8/23/24, 10:05 PM
For one, ubiquitous terabit Internet would completely eliminate the need for local compute and storage in most form factors.
About 60 Gbps is enough for 8K uncompressed video! You wouldn’t need a GPU or a PC to put it in. Just run everything in the cloud and access it like a virtual desktop. This is already commonplace in large enterprise.
One issue with such virtual desktops is latency. Their disks are expensive to move around to follow users so the virtual PC doesn’t move and mobile users have to access them from far away. With terabit Internet a 1TB operating system disk could be moved to a nearby point of presence in about ten seconds. Alternatively the OS could start booting instantly from the remote disk and stream the rest in the background later.
In other words, computing would be more like in Star Trek: there would just be this “ambient” compute you can interface with anywhere without thinking about data or device locality.
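Both figures above check out, assuming "8K" means 7680×4320 at 60 fps with 10-bit color (30 bits per pixel):

```python
# Uncompressed 8K video bandwidth (assumed: 7680x4320, 60 fps,
# 10 bits per channel = 30 bits per pixel, no chroma subsampling).
width, height, fps, bits_per_pixel = 7_680, 4_320, 60, 30
video_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"8K uncompressed: {video_gbps:.1f} Gbps")  # 59.7 Gbps

# Time to move a 1 TB disk image over a 1 Tb/s link:
# 1 TB = 8 terabits, so at 1 Tb/s the transfer takes 8 seconds.
transfer_s = 1e12 * 8 / 1e12
print(f"1 TB transfer: {transfer_s:.0f} s")  # 8 s, i.e. "about ten seconds"
```

The ten-second figure is the ideal line rate; protocol overhead and disk write speed at the point of presence would stretch it somewhat.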
by mikewarot on 8/24/24, 4:50 AM
Because we, and the public, don't have secure computers, we're always under threat from any code we run, and any sites we use. This leads us inevitably to walled gardens, where we effectively outsource our security concerns. This has horrible consequences for Democracy, but we put up with them.
Until we fix the root issue, faster connectivity will just mean more content "consumed" by the public, and power concentrated into fewer and fewer hands.
---
On the plus side, if we all had our own secure servers, we could all have a Memex, as envisioned almost 80 years ago. We'd have our own copies of everything, and be able to spool them off for anyone as an act of sharing. We'd all have our own Library of Alexandria.
Of course, that doesn't sit well with the rentier class, so it's unlikely as well. 8(
by KaiserPro on 8/23/24, 8:18 PM
1) latency
2) congestion
Now, let's ignore power for the moment, that's a tricky thing. For example, if I could connect a laptop at 1tb/s without draining the battery in 5 minutes, I could just not bother with a CPU, RAM, or anything else locally. Just have a dumb terminal.
But
Latency is the killer here, as is congestion.
If your internet actually ran at 500 megabits a second, and with <10ms latency, you could offload a whole bunch of things. You could have network storage (as in NFS) and would be able to load things up to 5MB in size instantly. (5MB is about as much as you can download in the blink of an eye at 50 megabytes a second.)
If you look at some of the concepts for Windows 95, Microsoft wanted "networked" computers based on files and applications, rather than web pages. If you apply that to modern life, that's what 1tb/s could get you.
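The blink-of-an-eye figure a couple of paragraphs up holds, assuming a blink lasts roughly 100 ms:

```python
# How much data fits in one blink at 50 megabytes per second.
# Assumption: a blink lasts roughly 100-150 ms; use 100 ms here.
blink_s = 0.1
link_mb_s = 50            # 50 MB/s, i.e. a ~400 Mb/s link
mb_per_blink = link_mb_s * blink_s
print(f"{mb_per_blink:.0f} MB per blink")  # 5 MB
```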
by chadash on 8/23/24, 7:53 PM
Now, maybe there'd be some novel use case that would come up if everyone (or let's say 80%+) had 2gbps internet, but it's hard to imagine that that's the big constraint for much. Maybe something like virtual/augmented reality could do more heavy processing in the cloud in that case (assuming low enough latency)?
by lunchmeat317 on 8/23/24, 7:57 PM
It might be easier to do distributed computing in some fields, and there could be interesting opportunities for mesh networks and the internet of things in addition to data collection, but it'd all be the same corporate data-sales stuff we see now. There won't be a paradigm shift because the current culture is built around business.
by gmuslera on 8/23/24, 9:11 PM
Another game changer would be internet (with good enough bandwidth/latency) everywhere. Scratch that, Starlink is doing pretty much that, or at least has the potential to do it soon enough.
The key disruptor is access that is nearly free to get and to use, and what ubiquitous devices with that universal internet access would imply would change everything. GPS is a good example of that kind of impact, without the internet part.
by pbowyer on 8/23/24, 7:37 PM
At this point the main thing that would change is that I'd be able to do online backups of TBs of data. With 1Tb/s internet I would store a lot less on my hard drive and download ML models and more whenever I needed them. But I just know the remote server I'm downloading from would still throttle me to Mbps to guard their bandwidth.
by kkfx on 8/23/24, 9:02 PM
- in 10 years, audio/video resolutions will probably be much higher, probably a bit beyond human-eye resolution; 3D movies might become a real thing rather than a curious experiment demanding very expensive, clunky hardware, so it could be common for an ordinary movie to weigh, say, 50Gb, and for a conference call to demand, say, 1Gb/sec, etc.;
- datasets will probably be much bigger, finer-grained, and much longer in timeline terms: a future home assistant with InfluxDB will not record daily maxima and minima for 10 years, but 1-minute-resolution temperatures plus a gazillion more sensors; it could be common to have a 3D thermo-cam at home to regulate ventilation better, and so on.
So, on one side, anything computer-related should be expected to be much bigger than it is today, just as almost everything today is much bigger than 10 years ago, and not in a linear progression.
As a middle ground, we have to consider some known physical limitations and some climate and geopolitical changes. It's even possible that in 10 years the internet will be in much worse shape than today, because mass migrations and world wars will have cracked the current infrastructure, and poverty caused by wars and a hasty, poorly executed reorganization of the world's supply chains might leave us with limited wireless comms over too few low-altitude satellites.
Finally... if we achieve steady bandwidth growth in the current dire, sorry state of IT, archaically keeping up a crappy modern mainframe model for the service economy, where almost no one owns anything except big tech... well, it might get worse: "hey, don't buy an expensive NVMe drive! Just mount one in a proper datacenter via the internet", and I do not much like such a nightmare...
BEWARE: so far, thin clients are not much less limited than the old dumb terminals, but they are still common, and it's pretty common to have gazillions of people working over remote desktop, because 99% of company infrastructures are not designed for distributed desktop computing; they keep up an absurd centralization, totally ignoring the enormous waste of resources, the limits on usability and comfort, and the attack surface of modern "endpoints". Such a disgraced model can be made even worse under the flag of reducing hardware costs to the consumer.
by surgical_fire on 8/23/24, 7:45 PM
by poikroequ on 8/23/24, 7:46 PM
by markn951 on 8/23/24, 7:40 PM
by JasonSage on 8/23/24, 8:02 PM
by omegaworks on 8/23/24, 8:24 PM
A 1 terabit/sec network connection would basically allow your machine to train and tune arbitrarily sized models completely remotely.
by xnx on 8/23/24, 8:22 PM
by mysteria on 8/23/24, 7:51 PM
In addition we could send more raw data to the cloud for processing, rather than having it done locally. For example, a cheap VR headset with no GPU could simply send raw position and control data to a cloud server, which would stream stereo video back to the headset with little compression or latency. Or a large surveillance system could send the footage from thousands of cameras and sensors directly to the cloud without requiring any initial processing at the edge. You could make devices smaller, lighter, and less power-hungry at the cost of having all the compute done off-site.
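For a sense of scale on the VR case, a rough sketch assuming a hypothetical headset with two 2160×2160 eye panels at 90 fps and 24-bit color (all of these specs are illustrative, not from any particular device):

```python
# Raw (uncompressed) stereo video bandwidth for a hypothetical
# VR headset: two 2160x2160 panels, 90 fps, 24 bits per pixel.
eyes, w, h, fps, bpp = 2, 2_160, 2_160, 90, 24
raw_gbps = eyes * w * h * fps * bpp / 1e9
print(f"raw stereo stream: {raw_gbps:.1f} Gbps")  # ~20.2 Gbps
```

Around 20 Gbps per headset is hopeless on today's links but a small fraction of 1 Tb/s, which is the headroom that makes the "no GPU on the headset" idea plausible, latency permitting.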
by alfonsodev on 8/23/24, 7:45 PM
by Stevvo on 8/24/24, 12:26 AM
by janice1999 on 8/23/24, 7:42 PM
However there is also the downside of making high fidelity omnipresent surveillance easier.
by p1esk on 8/23/24, 9:05 PM
by oldtopman on 8/23/24, 7:39 PM
by t0astbread on 8/23/24, 7:55 PM
> 6. Ask HN: Imagine a world with 1Tb/s internet. What would change?
> 7. OpenSSH Backdoors (isosceles.com)
by magnat on 8/23/24, 8:04 PM
by notaharvardmba on 8/26/24, 1:03 AM
by insane_dreamer on 8/23/24, 8:08 PM
by slicktux on 8/23/24, 9:35 PM
by scop on 8/23/24, 8:02 PM
by steelframe on 8/24/24, 1:28 AM
I might start paying for streaming movies if services would fully buffer an artifact-free copy of the movie on my media player before starting to play the content. I don't think they'd do that though even if I had infinite bandwidth to my house. To save bandwidth and storage costs on their end they're going to continue to enshittify the streams to whatever generally-tolerable trickle people won't cancel a subscription over.
My Zoom calls would be the same. 75% of the people in Zoom calls keep their camera off anyway and hardly ever say anything, so the extra bandwidth wouldn't matter for work.
I generally will continue to avoid any Internet of Shit devices or privacy-ravaging Cloud services, so having extra bandwidth wouldn't impact my propensity to use any of that stuff.
I don't know. I guess I'm not creative enough or prone to use Cloud anything to imagine how my life would get better with 1000x more bandwidth. Maybe some startup would come up with something that would be the next "iPhone moment."
by d--b on 8/25/24, 7:08 PM
by Retr0id on 8/23/24, 7:51 PM
by gedy on 8/23/24, 7:53 PM
by Arch485 on 8/23/24, 7:50 PM
by carabiner on 8/23/24, 7:54 PM
by Dwedit on 8/23/24, 7:53 PM
by 1minusp on 8/23/24, 7:52 PM