from Hacker News

Will serving real HTML content make a website faster?

by tkadlec on 9/21/22, 5:21 PM with 106 comments

  • by 1vuio0pswjnm7 on 9/21/22, 10:28 PM

    There could be a companion article: "Will Consuming Only Real HTML Content Make A Website Faster? Let's Experiment!"

    I have been running this "experiment" myself for many years now by (a) controlling DNS so that only the domain in the "address bar" URL is resolved^1 and (b) making HTTP requests with a TCP client and/or an unpopular nongraphical web browser that only processes HTML and does not auto-load resources. No images, JS, CSS, etc.
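    A minimal sketch of the kind of single-request fetch described in (b), assuming Node.js and its built-in tls module; the host, path, and helper name are illustrative:

      // Fetch only the HTML document over one TLS connection; nothing in the
      // response is parsed, so no images, JS, or CSS are ever auto-loaded.
      import * as tls from "node:tls";

      function fetchHtmlOnly(host: string, path = "/"): Promise<string> {
        return new Promise((resolve, reject) => {
          const socket = tls.connect({ host, port: 443, servername: host }, () => {
            // One request, then "Connection: close" so the server ends the stream.
            socket.write(`GET ${path} HTTP/1.1\r\nHost: ${host}\r\nAccept: text/html\r\nConnection: close\r\n\r\n`);
          });
          let raw = "";
          socket.setEncoding("utf8");
          socket.on("data", (chunk) => (raw += chunk));
          // Drop the response headers; chunked transfer encoding is ignored for brevity.
          socket.on("end", () => resolve(raw.split("\r\n\r\n").slice(1).join("\r\n\r\n")));
          socket.on("error", reject);
        });
      }

      // Usage: fetchHtmlOnly("example.com").then((html) => console.log(html.length));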

    The answer to the question is yes. This "makes a website faster", or, more specifically, as someone else in the thread has stated, it does not make the website slow. It does not accommodate the practices of "web developers" that slow a website down.

    But most importantly, IMO, it makes "website speed", not to mention appearance, more consistent across websites. Good luck achieving any semblance of that with a popular graphical web browser.

    Most web pages submitted to HN can be read this way. I find it easier to consume information when it follows a consistent style of presentation and without the distractions enabled by "modern" web browsers.

    1. This is the only URL the www user is informed about. In the short history of the www so far, auto-loading from other domains, whether through HTML, JavaScript or otherwise, has unfortunately been abused to the point where allowing it creates more risk for the www "user" than convenience for the www "developer". Sadly, instead of deprecating the "web development" practices that have been abused and make websites slow, the new HTTP standards proposed by an advertising company and supported by CDNs cater to this practice of "composite" websites composed of resources from various third parties. It stands to reason that advertisers, and therefore "tech" companies and their providers, e.g., CDNs, stand to benefit more from "composite" websites than www users do. IMHO the easiest way to "make websites faster" is to stop enabling "web developers" to do the things that make them slow.

  • by kmeisthax on 9/21/22, 9:05 PM

    I remember when single-page applications were all the rage. I was highly skeptical that they could beat just loading HTML, given that the performance benefits were all predicated upon amortizing the initial load cost over many page requests. It's a very risky bet given that a lot of sites don't have a lot of repeat traffic to begin with, unless you just so happen to be an application in the guise of a website.

    Apparently my skepticism has been validated.

  • by thwarted on 9/21/22, 9:10 PM

    A point of comparison should be git.kernel.org, which loads and renders instantly (at least compared to all these other sites), contains a massive amount of actual content per page, is highly cacheable on the server, and uses exactly zero JavaScript while remaining usable, at least for its use case, which is mostly links with the only form interaction being the search box.

  • by no_time on 9/22/22, 8:54 AM

    >a team may deem the initial performance impact of JS-dependence a worthy compromise for benefits they get in other areas, such as personalized content, server costs and simplicity, and even performance in long-lived sessions

    Notice how all these "benefits" only benefit the developer at the expense of the user, or have nothing to do with the problem at hand. "Personalized content"? Really?

    Pre-client-side YouTube, Twitter, FB, and Reddit were all superior feats of engineering compared to their modern JS-heavy counterparts.

  • by superkuh on 9/22/22, 1:21 AM

    It will certainly make the website more accessible to more people and reduce the load on their computers. This is required for government/public services; look at how nice the UK NHS sites are. But for-profit corporations are free to, and seemingly always do, go the JavaScript application route, because it is cheaper and easier to find teams to build them.

  • by kyle-rb on 9/21/22, 11:19 PM

    These types of tests tend to be unfair to actual web apps, since they only really account for first-time use.

    Twitter is slow in this experiment because it has to load a bunch of JavaScript up front. But that's not the case in practical use! Twitter uses service workers and HTTP cache headers (e.g. `expires`) to make sure that most non-first-time users aren't actually loading most of those things on every visit. Client-side rendering isn't the thing that's slow here; it's mostly the re-downloading of the rendering code on every load, which isn't realistic.
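    A minimal sketch of that caching pattern, assuming a standard service worker using the Cache API; the cache name and the `.js` filter are illustrative, not Twitter's actual setup:

      // Serve previously fetched script bundles from a local cache so repeat
      // visits never re-download the rendering code. Long-lived assets would
      // also carry headers like "Cache-Control: max-age=31536000, immutable".
      declare const self: ServiceWorkerGlobalScope;
      const CACHE = "static-v1";

      self.addEventListener("fetch", (event) => {
        const url = new URL(event.request.url);
        if (!url.pathname.endsWith(".js")) return; // only intercept script requests here
        event.respondWith(
          caches.open(CACHE).then(async (cache) => {
            const hit = await cache.match(event.request);
            if (hit) return hit;                   // cache-first: no network on repeat visits
            const res = await fetch(event.request);
            cache.put(event.request, res.clone()); // store a copy for next time
            return res;
          })
        );
      });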

  • by smm11 on 9/21/22, 9:50 PM

    I'd been thinking that 5G is a thing only because the IoT is a thing. It has nothing to do with phones, but the build-out is funded by phones. So when it all settles down, phones will be as slow as they were in the 3G era, at best, what with so much stuff clamoring for data.

    Plain Jane HTML is going to save us.

  • by can16358p on 9/21/22, 7:44 PM

    Shouldn't we have more devices and more connection types to have a more controlled experiment?

    It's always 4G, mobile Chrome, and (I assume) the same device.

    Very likely the same carrier at the same place, so roughly the same connection conditions in terms of latency, DL/UL bandwidth, and jitter. Also always the same device with the same CPU/GPU. Perhaps a shiny new flagship phone with a superfast SoC which gives a head start to faster JS execution? Or perhaps a very spotty, barely one-bar 4G connection. (Just assumptions, maybe both are false, but you get the idea.)

    I'm a big fan of client-side generation using JS too, but I don't think this experiment covers many practical scenarios.

    If we saw more connection types and a wider variety of devices with different CPUs, it'd be more convincing.
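    One hypothetical way to broaden the test matrix: replay the same page load under several emulated network and CPU profiles, for example with Puppeteer. The profile numbers and URL below are illustrative assumptions, not the article's methodology:

      // Run the same page load across a few throttling profiles and report timings.
      import puppeteer from "puppeteer";

      const profiles = [
        { name: "fast 4G", download: 9_000_000 / 8, upload: 1_500_000 / 8, latency: 60, cpu: 1 },
        { name: "slow 3G", download: 400_000 / 8, upload: 400_000 / 8, latency: 400, cpu: 4 },
      ];

      (async () => {
        const browser = await puppeteer.launch();
        for (const p of profiles) {
          const page = await browser.newPage();
          // Throughput is in bytes/second, latency in ms; CPU slowdown is a multiplier.
          await page.emulateNetworkConditions({ download: p.download, upload: p.upload, latency: p.latency });
          await page.emulateCPUThrottling(p.cpu);
          const start = Date.now();
          await page.goto("https://example.com/", { waitUntil: "load" });
          console.log(`${p.name}: ${Date.now() - start} ms`);
          await page.close();
        }
        await browser.close();
      })();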

  • by kbenson on 9/22/22, 1:25 AM

    I wonder if one of the reasons we've seen a big push towards systems like this is that it does make the website faster, but in a way that's one step removed from how we usually consider it, or by a slightly different metric.

    What if, instead of client-side load time, what's also being looked at is load time per unit of finite server-side compute? By slimming the server side down to GraphQL JSON delivery + static JS, maybe they can serve that specific need faster per 10k servers or something, whereas having to do the full page composition server-side under heavy load just doesn't scale as well?
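    A minimal sketch of the trade-off being described, assuming Node's built-in http module; the routes and data are illustrative:

      // The same records served either as a fully composed HTML page (more server
      // work per request) or as a bare JSON payload for a static JS client to render.
      import { createServer } from "node:http";

      const feed = [{ user: "alice", text: "hello" }, { user: "bob", text: "world" }];

      createServer((req, res) => {
        if (req.url === "/feed.json") {
          // Cheap on the server: serialize and ship; the client does the rendering.
          res.setHeader("Content-Type", "application/json");
          res.end(JSON.stringify(feed));
        } else {
          // Heavier on the server: compose the full document for every request.
          const items = feed.map((f) => `<li><b>${f.user}</b>: ${f.text}</li>`).join("");
          res.setHeader("Content-Type", "text/html");
          res.end(`<!doctype html><ul>${items}</ul>`);
        }
      }).listen(8080);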

  • by chmod775 on 9/22/22, 12:45 AM

    There needs to be a website hall of shame that serves particularly slow websites rendered into PNGs/WEBP/WEBMs (+JS code to make them interactive/clickable) if those would load faster and use less data volume than the real thing.

  • by vxNsr on 9/21/22, 8:05 PM

    This basically just tells us what we already know: SSR is usually faster for the client.

  • by IYasha on 9/22/22, 9:26 PM

    JS queries are tools of Satan! Today web 2.0 uses 1000% faster machines to make web experience 100% slower! What an age!

    PS: thank you for calling HTML HTML!

  • by megaman821 on 9/21/22, 8:47 PM

    The big flaw in this test is that it assumes the time to get the relevant page data from the database and render it to HTML is zero. If Twitter had your feed ready to go from its cache this might be accurate, but realistically I would give the server a few seconds to do its work since the site is so personalized.

  • by the__alchemist on 9/21/22, 9:39 PM

    This page is serving me a recursive stream of Captchas, as pudgetsystems reviews the security of my connection; not a great look for the topic alluded to in the headline.

  • by pier25 on 9/21/22, 11:25 PM

    It Depends ™

  • by NohatCoder on 9/21/22, 8:48 PM

    Time for a hot take:

    You don't need to make your website fast, all you have to do is not make it slow in the first place.

    Partially or fully generating a website client-side can be plenty fast; the slowness tends to come from using some bloated framework to do it.
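    A minimal sketch of framework-free client-side generation, assuming a hypothetical /feed.json endpoint and a mount element; the names and fields are illustrative:

      // Fetch a small JSON payload and build DOM nodes directly, no framework.
      async function renderFeed(mount: HTMLElement): Promise<void> {
        const items: { user: string; text: string }[] = await (await fetch("/feed.json")).json();
        const list = document.createElement("ul");
        for (const item of items) {
          const li = document.createElement("li");
          li.textContent = `${item.user}: ${item.text}`; // textContent avoids HTML injection
          list.appendChild(li);
        }
        mount.replaceChildren(list); // one DOM swap, no framework overhead
      }

      // Usage: renderFeed(document.getElementById("feed")!);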

  • by jokoon on 9/21/22, 10:04 PM

    It's really funny, because I asked on Stack Exchange why websites are slower than apps, my question was removed for being opinion-based, and I got an answer about hydration.

    In my view, the DOM should be made obsolete, and there should be tighter restrictions, whether by making things immutable or by completely redesigning how the DOM works.

    I'm not an expert, but the DOM smells very weird.

  • by jedberg on 9/22/22, 12:40 AM

    What this doesn't account for is HTML rendering time on the server.

    The reason websites use local JavaScript to render HTML is so they don't have to do it on their server while you wait for the result. This way you get the perception of a page loading while the HTML renders. It's actually a better experience for the user.

    This entire analysis assumes that the server renders the HTML instantly. Unless it is static content that is highly cacheable, chances are the render time on your machine isn't much slower than the server's, but the website can use a lot less compute to make the webpage for you, since your computer is doing part of the work.

    Also, chances are they have to transmit less data to you, which cuts down on network latency as well.
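    A minimal sketch of the "less data" point, comparing the same records shipped as a JSON payload versus as server-rendered markup; the data is illustrative, and real savings depend on the markup and on compression:

      // Measure the raw byte size of identical records serialized as JSON vs as HTML.
      const records = Array.from({ length: 100 }, (_, i) => ({ user: `user${i}`, text: `post number ${i}` }));

      const asJson = JSON.stringify(records);
      const asHtml = records
        .map((r) => `<li class="post"><span class="author">${r.user}</span><p>${r.text}</p></li>`)
        .join("");

      console.log("JSON bytes:", Buffer.byteLength(asJson)); // assumes Node.js for Buffer
      console.log("HTML bytes:", Buffer.byteLength(asHtml));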