from Hacker News

Ask HN: Imagine a world with 1Tb/s internet. What would change?

by dinobones on 8/23/24, 7:27 PM with 90 comments

For the past ~10 years, it seems like internet speeds here in the US have stagnated. Probably 80% of places I've been to or lived have been 100Mbps-500Mbps.

I'm curious what things would look like if the internet was suddenly 1000x faster.

Would there be new apps we could build? Would it enable novel use cases?

  • by oneplane on 8/23/24, 7:40 PM

    Scoping this to users, it wouldn't change much at all. For data transfers you would also need storage, memory and CPUs that can handle it. For streaming it also doesn't change much since even a 4K HDR stream would work on a legacy VDSL2 system. Same goes for remote compute.

    For non-users (datacenters, companies, cluster systems, etc.) it might mostly help with distributed storage, but again, you'd also need all the other things, because distributing anything like that means every piece of the puzzle needs to be able to handle it. Within a datacenter or a public cloud, even top speeds of 400Gb/s (which is less than half of the thought experiment) are at such a high tier that they aren't useful for individual applications or nodes.

    Something that would actually make an impact after you get to around 2Gbps would be lower latency, lower jitter, net neutrality (including getting a full connection with no administrative limits), and more peered routes. More bandwidth doesn't really do much on its own, especially since you can't use it if the number of connections, the latency, and the computing equipment don't scale with it.

    When you have low enough latency combined with high enough bandwidth you can start to actually get new apps and also develop novel use cases (imagine a CPU-to-CPU interconnect that can span continents). But the speed of light (or some more exact latency-per-distance quantity) prevents that.

    Beyond the likes of BitTorrent and Gnutella we're not likely to see network-based ideas that are currently impossible due to limits on the average speed. Perhaps the real problem right now is the lack of universal availability of reasonable connectivity.

  • by neveroddoreven on 8/23/24, 7:59 PM

    We could have smartphones that are essentially streaming their UIs from a cloud instance that's running the actual OS/rendering for the device.

    That approach would allow for thinner, cheaper devices with longer battery lives since their hardware is only responsible for processing the video stream from the cloud and rendering it on a screen.

    Perhaps it's not a direct result of 1Tb/s, but such internet speeds would likely have the second-order effect of providing extremely robust streaming infrastructure that enables such a use case.

  • by jaysonelliot on 8/23/24, 7:53 PM

    The first thing I'd consider is what changed when we went from 100kbps to 100Mbps.

    Now, we share complete video files and music files, whereas before we shared vector-like files such as Flash and MIDI.

    What are we doing locally today that could not be sent over our current bandwidth? Is it something that will affect telepresence, like all the 3D data needed to recreate a realistic environment in real time? Is it about more accurate control of remote objects, like drones and robotic vehicles? Maybe it will enable remotely connected computers to be more efficient clusters, taking advantage of unused cycles during off-peak hours.

    I think the biggest impact isn't going to be what happens at the faster speeds that happen in best-case scenarios. It's what will happen when mobile devices in areas with poor reception can achieve 1Gbps reliably and consistently.

  • by palata on 8/23/24, 7:45 PM

    Websites would be heavier, everything would expect a faster internet connection and using software without a fast internet connection would be worse.

    Probably software wouldn't get better, probably we wouldn't solve the real big problems of our time either.

  • by grvbck on 8/23/24, 8:10 PM

    Apple would make 16 GB the standard SSD size in all laptops.

    Fewer services would run locally. The typical user would probably not even care (or know) if their photos were stored on the phone or in the cloud.

    I don't think that much would change at first, since everything else would become a bottleneck. 16K HDR streaming? Sure, but how many people would have a 16K HDR screen? Lossless music streaming? Already here, more or less, and does not require 1Tb/s.

    Over time, everything would change of course, probably for the worse (for users). Mega corporations would be in possession of all user data, using it for AI training, ad targeting and all kinds of data extraction we can only dream of.

  • by gamepsys on 8/23/24, 7:42 PM

    Right now we have a common architecture where users upload files to a central service, and that central service then forwards the content to other users. This is true of services like YouTube, Zoom, etc. With 1Tb/s, content creators could serve the content from their own network. This would allow for platforms that have much lower operating costs and could offer much more generous revenue share. Perhaps a peer-to-peer arrangement could emerge, where different nodes in the network cache and serve each other's files to respond to highly viral content.
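
    A rough back-of-the-envelope check (my numbers, not the commenter's): assuming a typical ~25 Mb/s 4K streaming bitrate, a single creator with a symmetric 1 Tb/s uplink could, ignoring protocol overhead, serve on the order of tens of thousands of concurrent 4K streams directly:

```python
# Back-of-the-envelope: concurrent 4K streams one creator could
# serve from a 1 Tb/s uplink (illustrative numbers, no overhead).
uplink_bps = 1e12        # 1 Tb/s
stream_4k_bps = 25e6     # ~25 Mb/s, a typical 4K streaming bitrate
concurrent_streams = uplink_bps / stream_4k_bps
print(f"{concurrent_streams:,.0f} concurrent 4K streams")  # 40,000
```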

    I would also disagree with the thesis that internet speeds in the US have stagnated. In 2014 I had about 80Mbps. Today I have about 1500Mbps. In West Coast cities I see high-end condos with access to speeds up to 7000Mbps. Even my friends in pretty rural locations in 'fly over' states have access to hundreds of Mbps thanks to the latest federal grants to build fiber in rural areas. In one case I know someone who skipped from 52k dial-up to 200Mbps fiber, with cable internet never offered to his house.

  • by knallfrosch on 8/23/24, 7:41 PM

    Web frameworks would be 1000x the current size.
  • by miyuru on 8/23/24, 7:53 PM

    Innovations like better video codecs/compression would stop.

    It mirrors the current situation with RAM and Electron apps. Websites would be bloated and unoptimized; there would be GBs of JS/CSS for a single URL.

    Realistically, I mostly want better connectivity across the globe (current ISP speeds are mostly for a given metro or country) and non-throttling connections that I can fully utilize.

  • by curiousthought on 8/23/24, 7:43 PM

    16K porn becomes the new standard definition

    I wish I was joking, but take the current usage of the internet and scale up each part. 1Tb/s might enable new things, but it's more likely to enable more of the old things.

  • by n_ary on 8/23/24, 7:56 PM

    Call me a cynic, but with every improvement the adtech and surveillance get stronger: 4K video ads everywhere, more analytics to analyse your environment (somehow with lots of AI on everything), your door lock might need internet, the metaverse might become a common thing, faster computers to allow stronger AI (like how we needed a chunky rig to play Crysis but my dinky notebook can run Crysis 3 fine), etc. That being said, I have access to 1Gbps but I still use 150Mbps, because beyond 100Mbps I don't see any improvements in my daily life.
  • by vaylian on 8/23/24, 8:08 PM

    Companies will lean even harder into software as service so that people can't pirate stuff. Microsoft will require you to boot Windows over the internet.
  • by refulgentis on 8/23/24, 7:47 PM

    IMHO this parameter just tends to shuttle things towards agglomeration, i.e. entities with servers that can benefit from economies of scale.

    I also believe we hit a practical wall on this that's observable by the success, or lack thereof, of game streaming.

    In a world where there were a lot of savings to be unlocked, things like Stadia, Nvidia's GeForce Now, and Xbox's service would have been notable big wins.

    They weren't, which has me firmly believing that incremental speed past "everyone in my household can stream video at 4K when desired" is an expected end state. That's tantamount to saying "once people can see whatever they want, at a resolution indistinguishable from reality, without delay, there won't be mass desire for increased internet speeds", which seems intuitive.

    Anything requiring greater streaming bandwidth (ex. VR) is highly sensitive to latency, which may have also affected the game streaming use case.

    If latency approaches ~0 ms (which requires colocation with peering providers), I could see this sort of bandwidth opening up AR a bit more by effectively reducing compute requirements in such a small form factor, but that's kinda it.

  • by jiggawatts on 8/23/24, 10:05 PM

    A lot of people here are extrapolating from current tech, not realising that “quantity has a quality all of its own.”

    For one, ubiquitous terabit Internet would completely eliminate the need for local compute and storage in most form factors.

    About 60 Gbps is enough for 8K uncompressed video! You wouldn’t need a GPU or a PC to put it in. Just run everything in the cloud and access it like a virtual desktop. This is already commonplace in large enterprise.

    One issue with such virtual desktops is latency. Their disks are expensive to move around to follow users, so the virtual PC doesn't move, and mobile users have to access it from far away. With terabit internet, a 1TB operating system disk could be moved to a nearby point of presence in about ten seconds. Alternatively, the OS could start booting instantly from the remote disk and stream the rest in the background.
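
    Both figures above roughly check out, assuming 10-bit-per-channel (HDR) 8K at 60 fps; the arithmetic, spelled out:

```python
# Sanity-check: 8K uncompressed bitrate and 1 TB transfer time.
width, height = 7680, 4320       # 8K UHD
bits_per_pixel = 3 * 10          # RGB, 10 bits per channel (HDR)
fps = 60
uncompressed_bps = width * height * bits_per_pixel * fps
print(f"8K uncompressed: {uncompressed_bps / 1e9:.1f} Gb/s")  # 59.7

disk_bits = 1e12 * 8             # a 1 TB OS disk, in bits
seconds = disk_bits / 1e12       # over a 1 Tb/s link
print(f"1 TB disk transfer: {seconds:.0f} s")                 # 8
```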

    In other words, computing would be more like in Star Trek: there would just be this “ambient” compute you can interface with anywhere without thinking about data or device locality.

  • by mikewarot on 8/24/24, 4:50 AM

    We could have secure computers, if we adopted capability based security en masse, but that's highly unlikely. I don't see us all running Genode or Hurd any time soon.

    Because we, and the public, don't have secure computers, we're always under threat from any code we run, and any sites we use. This leads us inevitably to walled gardens, where we effectively outsource our security concerns. This has horrible consequences for Democracy, but we put up with them.

    Until we fix the root issue, faster connectivity will just mean more content "consumed" by the public, and power concentrated into fewer and fewer hands.

    ---

    On the plus side, if we all had our own secure servers, we could all have a Memex, as envisioned almost 80 years ago. We'd have our own copies of everything, and be able to spool them off for anyone as an act of sharing. We'd all have our own Library of Alexandria.

    Of course, that doesn't sit well with the rentier class, so it's unlikely as well. 8(

  • by KaiserPro on 8/23/24, 8:18 PM

    So the interesting thing to put in here is that 1Tb/s might on paper look like it'd make things more innovative, but it might not, because of two things:

    1) latency

    2) congestion

    Now, let's ignore power for the moment; that's a tricky thing. For example, if I could connect a laptop at 1Tb/s without draining the battery in 5 minutes, I could just not bother with a CPU, RAM or anything else locally. Just have a dumb terminal.

    But

    Latency is the killer here, as is congestion.

    If your internet actually ran at 500 megabits a second with <10ms latency, you could offload a whole bunch of things. You could have network storage (as in NFS) and you would be able to load things instantly, up to 5MB in size. (5MB is about as much as you can download in the blink of an eye at 50 megabytes a second.)
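
    The blink-of-an-eye arithmetic, spelled out (assuming a blink of roughly 100-150 ms; at a full 500 Mb/s this lands slightly above the commenter's 5MB figure):

```python
# How much can you download "in the blink of an eye" at 500 Mb/s?
link_bps = 500e6                # 500 megabits per second
blink_s = 0.1                   # a blink is roughly 100-150 ms
megabytes = link_bps * blink_s / 8e6
print(f"{megabytes:.2f} MB per blink")  # 6.25
```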

    If you look at some of the concepts for Windows 95, Microsoft wanted "networked" computers, based on files and applications rather than web pages. If you apply that to modern life, that's what 1Tb/s could get you.

  • by chadash on 8/23/24, 7:53 PM

    I went from 500Mbps symmetric in my old condo to 50Mbps/20Mbps download/upload in my house. There are two noticeable effects: it takes slightly longer to download movies the night before I go on flights or long car trips, and it takes significantly longer to push updates to large Docker containers to the cloud. Everything else is more or less identical for me.

    Now, maybe there'd be some novel use case that would come up if everyone (or let's say 80%+) had 2Gbps internet, but it's hard to imagine that that's the big constraint for much. Maybe something like virtual/augmented reality could do more heavy processing in the cloud in that case (assuming low enough latency)?

  • by lunchmeat317 on 8/23/24, 7:57 PM

    I don't think anything meaningful will change. We'll have higher-definition video, and that'd be it. Business practices and the client-server model wouldn't change, and the internet as we know it wouldn't change. Files will just get bigger (and arguably more bloated). That's the trend we've always seen.

    It might be easier to do distributed computing in some fields, and there could be interesting opportunities for mesh networks and internet of things in addition to data collection, but it'd all be the same corporate data-sales stuff we see now. There won't be a paradigm shift because the current culture is built around business.

  • by gmuslera on 8/23/24, 9:11 PM

    What could cause a big change is a drop in latency, more than bandwidth. But for some world regions the speed of light would be a limiter anyway.

    Another game changer would be internet (with good enough bandwidth/latency) everywhere. Scratch that, Starlink is doing pretty much that already, or at least has the potential to do it soon enough.

    The key disruptor would be access that is nearly free to get and to use, and the kind of ubiquitous devices that such universal internet access would imply could change everything. GPS is a good example of that kind of impact, without the internet part.

  • by pbowyer on 8/23/24, 7:37 PM

    I'm on 45Mbps/11Mbps in a suburban area in the UK. We might get 1Gbps next year. A cable provider (Trooli) previously installed down our road in 2023 and left out two houses. The one I own is one of them.

    At this point the main thing that would change is I'd be able to do online backups for TBs of data. With 1Tb/s internet I would store a lot less on my hard drive and download ML models and more whenever I needed them. But I just know the remote server I'm downloading from would still throttle me to Mbps to guard their bandwidth.

  • by kkfx on 8/23/24, 9:02 PM

    Well, there are some things to be put on the table:

    - in 10 years, audio/video resolutions will probably be much higher, probably a bit beyond human eye resolution; 3D movies might become a thing rather than just a curious experiment demanding very expensive and crappy hardware; so it could be common to have, say, 50GB for an ordinary movie, and to demand, say, 1Gb/s for a conference call, etc.;

    - datasets will probably be much bigger, finer grained, and much longer in timeline terms; say, a future home assistant with InfluxDB will not record daily maxima and minima for 10 years but 1-minute-resolution temperatures and a gazillion more sensors; say, it could be common to have a 3D thermo-cam at home to regulate ventilation better, and so on.

    So, on one side, anything computer-related should be expected to be much bigger than today, just as today almost anything is much bigger than 10 years ago, and not in a linear progression.

    As a middle ground, we have to consider some known physical limitations and some climate and geopolitical changes, meaning it's even possible that in 10 years the internet will be in a much worse shape than today, because mass migrations and world wars have cracked the current infrastructure, and poverty caused by wars and a rushed, poorly done reorganization of the world supply chains might leave us with limited wireless comms and too few low-altitude satellites.

    Finally... if we achieve steady bandwidth growth in the current dire, sorry state of IT, archaically keeping up a crappy modern mainframe model for the service economy, where almost no one owns anything except big tech... well, it might get worse: "hey, do not buy an expensive NVMe drive! Just mount one in a proper datacenter via the internet", and I do not much like such a nightmare...

    BEWARE: so far, thin clients are not much less limited than old dumb terminals, but they are still common, and it's pretty common to have gazillions of people working over remote desktop because 99% of companies' infra is not designed for distributed desktop computing. So they keep absurd centralization, totally ignoring the enormous waste of resources, the limits on usability and comfort, and the attack surface of modern "endpoints". Such a disgraced model can be made worse under the flag of reducing hardware costs to the consumer.

  • by surgical_fire on 8/23/24, 7:45 PM

    Ads would get bigger and more annoying.
  • by poikroequ on 8/23/24, 7:46 PM

    If corporations got their way, nothing would run locally anymore. Everything would run in the cloud and our devices would be nothing more than thin clients.
  • by markn951 on 8/23/24, 7:40 PM

    I feel like most use cases are more latency-bound than bandwidth-bound today. I could be wrong, maybe I'm not thinking big enough :)
  • by JasonSage on 8/23/24, 8:02 PM

    With that crazy-high bandwidth, wouldn't most users end up using Ethernet instead of HDMI/DP for screens? I'm imagining houses with a media center computer that powers every desktop and television. No need for local processing. The latency wouldn't matter for most use cases either.
  • by omegaworks on 8/23/24, 8:24 PM

    That would rival the speed of on-die processor caches. Right now the Apple M3 offers 300 GB/s, or 2.4 terabits/sec.
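
    The unit conversion, for anyone double-checking (note that 300 GB/s is the unified memory bandwidth quoted for high-end M3 variants; base models are lower, so treat the figure as illustrative):

```python
# Memory bandwidth (bytes) expressed in network units (bits).
mem_bw_gbytes = 300                      # GB/s, high-end M3 figure
mem_bw_tbits = mem_bw_gbytes * 8 / 1000  # Tb/s
print(f"{mem_bw_tbits} Tb/s")            # 2.4
```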

    A 1 terabit/sec network connection would basically allow your machine to train and tune arbitrarily sized models completely remotely.

  • by xnx on 8/23/24, 8:22 PM

    Streaming 8K per eye at 90 fps gets you pretty close to the maximum detail our eyes can perceive.
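
    For scale, a sketch of the raw (uncompressed) bitrate such a stream would imply, assuming 10-bit color; real streams would of course be compressed well below this:

```python
# Raw bitrate for dual 8K eye buffers at 90 fps (uncompressed).
width, height = 7680, 4320
bits_per_pixel = 3 * 10        # RGB, 10 bits per channel
fps, eyes = 90, 2
raw_bps = width * height * bits_per_pixel * fps * eyes
print(f"{raw_bps / 1e9:.0f} Gb/s uncompressed")  # 179
```
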
  • by mysteria on 8/23/24, 7:51 PM

    Assuming those are low latency links the line between the edge and the datacenter would be blurred. For instance it might be possible to train an LLM or have a supercomputer for weather models by combining thousands of nodes across the Internet, rather than having them all racked together. Multiple university campuses could easily combine their clusters and so forth.

    In addition we could send more raw data to the cloud for processing, rather than having it done locally. For example, a cheap VR headset with no GPU could simply send raw position and control data to a cloud server, which would stream stereo video back to the headset with little compression or latency. Or say a large surveillance system could send the footage from thousands of cameras and sensors directly to the cloud without requiring any initial processing on the edge. You could make devices smaller, lighter, and consume less power at the cost of having all the compute done off site.

  • by alfonsodev on 8/23/24, 7:45 PM

    I wish that one day normal video conferencing would just work. It's not just speed that we need, but stability of the connection and an easier way for audio to auto-configure. Maybe we can stop saying "do you hear me?" "yes, and do you hear me?" ...
  • by Stevvo on 8/24/24, 12:26 AM

    You don't have to get to even a tiny fraction of 1Tb/s before your hardware can't take advantage of the speed. Many devices are already bottlenecked by WiFi, leaving much bandwidth unused, and people are OK with that.
  • by janice1999 on 8/23/24, 7:42 PM

    AR/VR with low latency could allow for some interesting multiplayer experiences.

    However there is also the downside of making high fidelity omnipresent surveillance easier.

  • by p1esk on 8/23/24, 9:05 PM

    It would enable geographically distributed training of large neural networks. 1Tbps internode bandwidth is roughly what’s required in modern GPU clusters.
  • by oldtopman on 8/23/24, 7:39 PM

    With bandwidth caps (soft and hard) also stagnating, you'd see increased profits for ISPs as those caps would be reached in minutes instead of hours.
  • by t0astbread on 8/23/24, 7:55 PM

    Welp, I guess HN has answered. On my front page it says:

    > 6. Ask HN: Imagine a world with 1Tb/s internet. What would change?

    > 7. OpenSSH Backdoors (isosceles.com)

  • by magnat on 8/23/24, 8:04 PM

    Network-attached storage and Network-attached RAM would be ubiquitous. You would literally be able to download more RAM.
  • by notaharvardmba on 8/26/24, 1:03 AM

    Wearable MRI with some type of big model on it that can enable brain to brain communications.
  • by insane_dreamer on 8/23/24, 8:08 PM

    Not sure about new apps but you can be sure the first thing there would be is more and higher resolution ads :/
  • by slicktux on 8/23/24, 9:35 PM

    Centralized systems become the norm and AI is commonplace. No need for local software or operating systems…
  • by scop on 8/23/24, 8:02 PM

    We would still be marveling at how the latest web framework lets you increment a counter.
  • by steelframe on 8/24/24, 1:28 AM

    For me nothing. I can already torrent a full 4k movie in under 10 minutes with my current connection, which is plenty fast. If everyone else had 1Tb/s and I could get a movie in 20 seconds or something, okay, fine. That would be kind of neat, I guess, but it wouldn't really be that big of an improvement to my overall life. My vision isn't good enough to appreciate the difference between 4k and 8k, so I wouldn't download 8k versions even if they were available. I'd consider that a waste of disk space.

    I might start paying for streaming movies if services would fully buffer an artifact-free copy of the movie on my media player before starting to play the content. I don't think they'd do that though even if I had infinite bandwidth to my house. To save bandwidth and storage costs on their end they're going to continue to enshittify the streams to whatever generally-tolerable trickle people won't cancel a subscription over.

    My Zoom calls would be the same. 75% of the people in Zoom calls keep their camera off anyway and hardly ever say anything, so the extra bandwidth wouldn't matter for work.

    I generally will continue to avoid any Internet of Shit devices or privacy-ravaging Cloud services, so having extra bandwidth wouldn't impact my propensity to use any of that stuff.

    I don't know. I guess I'm not creative enough or prone to use Cloud anything to imagine how my life would get better with 1000x more bandwidth. Maybe some startup would come up with something that would be the next "iPhone moment."

  • by d--b on 8/25/24, 7:08 PM

    Lower latency would change things a lot more than higher speeds.
  • by Retr0id on 8/23/24, 7:51 PM

    Modulo storage costs, we'd probably do a whole lot less compression.
  • by gedy on 8/23/24, 7:53 PM

    Websites would probably just be huge f-ing videos shouting in your face.
  • by Arch485 on 8/23/24, 7:50 PM

    Instead of a webpage loading 16MB of JavaScript for some animations and styling, webpages would come with a 16TB JavaScript super-framework.
  • by carabiner on 8/23/24, 7:54 PM

    Video games would be more realistic.
  • by Dwedit on 8/23/24, 7:53 PM

    Severe DDOS attacks everywhere.
  • by 1minusp on 8/23/24, 7:52 PM

    High-fidelity holograms of people for remote presence.