by tkadlec on 9/21/22, 5:21 PM with 106 comments
by 1vuio0pswjnm7 on 9/21/22, 10:28 PM
I have run this "experiment" myself for many years now by (a) controlling DNS so that only the domain in the "address bar" URL is resolved^1 and (b) making HTTP requests using a TCP client and/or an unpopular non-graphical web browser that processes only HTML and does not auto-load resources. No images, JS, CSS, etc.
The answer to the question is yes. This "makes a website faster", or, more specifically, as someone else in the thread has stated, it does not make the website slow. It does not accommodate the practices of "web developers" that slow a website down.
But most importantly, IMO, it makes "website speed", not to mention appearance, more consistent across websites. Good luck achieving any semblance of that with a popular graphical web browser.
Most web pages submitted to HN can be read this way. I find it easier to consume information when it follows a consistent style of presentation and without the distractions enabled by "modern" web browsers.
1. This is the only URL the www user is informed about. In the short history of the www so far, auto-loading from other domains, whether through HTML, Javascript or otherwise, has unfortunately been abused to the point where allowing it produces more risk for the www "user" than convenience for the www "developer". Sadly, instead of deprecating the "web development" practices that have been abused and make websites slow, the new HTTP standards proposed by an advertising company and supported by CDNs cater to this practice of "composite" websites composed of resources from various third parties. It stands to reason that advertisers, and therefore "tech" companies and their providers, e.g., CDNs, stand to benefit more from "composite" websites than www users do. IMHO the easiest way to "make websites faster" is to stop enabling "web developers" to do the things that make them slow.
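For concreteness, here is a minimal sketch of the kind of bare TCP fetch described above, assuming Node.js; the host name is a placeholder and not taken from the comment:

```typescript
// Sketch only: fetch a single HTML document over a raw TCP connection.
// Nothing here parses the response or follows resource references, so no
// images, JS, or CSS are ever requested. Assumes Node.js; the host is a
// placeholder.
import { connect } from "node:net";

const host = "example.com"; // placeholder target

const socket = connect(80, host, () => {
  socket.write(`GET / HTTP/1.1\r\nHost: ${host}\r\nConnection: close\r\n\r\n`);
});

let response = "";
socket.on("data", (chunk) => {
  response += chunk.toString();
});
socket.on("end", () => {
  // Drop the response headers; what remains is the single document requested.
  const body = response.split("\r\n\r\n").slice(1).join("\r\n\r\n");
  console.log(body);
});
```

Because nothing in this path interprets the markup, only the one document the user actually asked for ever crosses the wire.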
by kmeisthax on 9/21/22, 9:05 PM
Apparently my skepticism has been validated.
by thwarted on 9/21/22, 9:10 PM
by no_time on 9/22/22, 8:54 AM
Notice how all these "benefits" either benefit only the developer at the expense of the user, or have nothing to do with the problem at hand. "Personalized content"? Really?
Pre-client-side YouTube, Twitter, FB, and Reddit were all superior feats of engineering to their modern JS-heavy counterparts.
by superkuh on 9/22/22, 1:21 AM
by kyle-rb on 9/21/22, 11:19 PM
Twitter is slow in this experiment because it has to load a bunch of JavaScript up front. But that's not the case in practical use! Twitter uses service workers and HTTP cache headers (e.g. `expires`) to make sure that most returning users aren't actually re-downloading most of those assets on every visit. Client-side rendering isn't the thing that's slow here; it's mostly the re-downloading of the rendering code on every load, which isn't realistic.
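A rough sketch of that caching pattern with a standard service worker; the cache name and bundle paths are invented examples:

```typescript
// sw.ts: sketch of caching the client-side rendering code so repeat
// visitors don't re-download it. The cache name and bundle paths are
// invented examples.
declare const self: ServiceWorkerGlobalScope;

const CACHE = "app-shell-v1";
const RENDER_CODE = ["/app.js", "/vendor.js", "/app.css"];

self.addEventListener("install", (event) => {
  // Fetch and store the rendering code once, at install time.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(RENDER_CODE)));
});

self.addEventListener("fetch", (event) => {
  // Answer from the local cache when possible; fall back to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

Long-lived `Cache-Control` or `Expires` headers on the same assets get much of the same effect even without a service worker.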
by smm11 on 9/21/22, 9:50 PM
Plain Jane HTML is going to save us.
by can16358p on 9/21/22, 7:44 PM
It's always 4G, mobile Chrome, and (I assume) the same device.
Very likely the same carrier in the same place, so roughly the same connection conditions in terms of latency, DL/UL bandwidth, and jitter. Also always the same device with the same CPU/GPU. Perhaps a shiny new flagship phone with a superfast SoC that gives a head start to faster JS execution? Or perhaps a very spotty, barely one-bar 4G connection. (Just assumptions, maybe both are false, but you get the idea.)
I'm a big fan of client-side generation using JS too, but I don't think this experiment covers many practical scenarios.
If it included more connection types and a wider variety of devices with different CPUs, it would be more convincing.
by kbenson on 9/22/22, 1:25 AM
What if, instead of client-side load time alone, what's also being optimized is load time per unit of finite server-side compute? By dumbing down the server side to GraphQL JSON delivery plus static JS, maybe that allows them to serve that specific need faster per 10k servers or something, whereas having to do the full page composition server-side under heavy load just doesn't scale as well?
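A sketch of that split under those assumptions, using Node's built-in HTTP server; the endpoint, data shape, and markup are invented, and the GraphQL query body is ignored for brevity:

```typescript
// Sketch of the two serving paths, using Node's built-in HTTP server.
// The endpoint, data shape, and markup are invented, and the GraphQL
// query body is ignored for brevity.
import { createServer } from "node:http";

const posts = [{ title: "Hello", author: "alice" }]; // stand-in data store

createServer((req, res) => {
  if (req.url === "/graphql") {
    // "Dumbed-down" path: the server only serializes data; the static JS
    // bundle on the client does the page composition, so each request is
    // cheap in server CPU.
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ data: { feed: posts } }));
  } else {
    // Full server-side composition: the server spends CPU building markup
    // on every request, which is the part that may scale worse under load.
    const markup = posts
      .map((p) => `<article><h2>${p.title}</h2><p>${p.author}</p></article>`)
      .join("");
    res.setHeader("Content-Type", "text/html");
    res.end(`<!doctype html><body>${markup}</body>`);
  }
}).listen(3000);
```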
by chmod775 on 9/22/22, 12:45 AM
by vxNsr on 9/21/22, 8:05 PM
by IYasha on 9/22/22, 9:26 PM
PS: thank you for calling HTML HTML!
by megaman821 on 9/21/22, 8:47 PM
by the__alchemist on 9/21/22, 9:39 PM
by pier25 on 9/21/22, 11:25 PM
by NohatCoder on 9/21/22, 8:48 PM
You don't need to make your website fast; all you have to do is not make it slow in the first place.
Partially or fully generating a website client-side can be plenty fast; the slowness tends to come from using some bloated framework to do so.
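As a rough illustration, client-side generation with plain DOM calls and no framework can stay tiny; the data and element structure here are invented:

```typescript
// Sketch of framework-free client-side generation: plain DOM calls, no
// framework to download or boot. The data and element structure are
// invented for the example.
type Item = { title: string; url: string };

function renderList(items: Item[], mount: HTMLElement): void {
  const ul = document.createElement("ul");
  for (const item of items) {
    const li = document.createElement("li");
    const a = document.createElement("a");
    a.href = item.url;
    a.textContent = item.title;
    li.appendChild(a);
    ul.appendChild(li);
  }
  mount.replaceChildren(ul); // single DOM update, no framework machinery
}

renderList([{ title: "Example", url: "https://example.com/" }], document.body);
```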
by jokoon on 9/21/22, 10:04 PM
In my view, the DOM should be made obsolete, and there should be tighter restrictions, either by making things immutable or by completely redesigning how the DOM works.
I'm not an expert, but the DOM smells very weird.
by jedberg on 9/22/22, 12:40 AM
The reason websites use local JavaScript to render HTML is so they don't have to do it on their server while you wait for the result. This way you get the perception of a page load while the HTML renders. It's actually a better experience for the user.
This entire analysis assumes that the server renders the HTML instantly. Unless it is static content that is highly cacheable, chances are the render time on your machine isn't much slower than the server's, but the website can use a lot less compute to build the webpage for you, since your computer is doing part of the work.
Also, chances are they have to transmit less data to you, which cuts down on network transfer time as well.
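As a back-of-the-envelope illustration of the data-size point, here are the same invented records serialized as bare JSON versus as server-rendered HTML:

```typescript
// Back-of-the-envelope: the same invented records serialized as bare JSON
// versus as server-rendered HTML, to show why shipping data instead of
// markup can mean fewer bytes on the wire.
const items = Array.from({ length: 50 }, (_, i) => ({ id: i, title: `Post ${i}` }));

const asJson = JSON.stringify(items);
const asHtml = items
  .map(
    (it) =>
      `<article class="post" data-id="${it.id}"><h2 class="post-title">${it.title}</h2></article>`
  )
  .join("");

console.log(asJson.length, asHtml.length); // the JSON form is considerably smaller here
```

Compression narrows the gap in practice, but the raw markup is still the larger representation.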