from Hacker News

Conditional JavaScript: Only download when it is appropriate

by umaar on 12/16/20, 11:22 AM with 48 comments

  • by jy14898 on 12/18/20, 12:07 AM

    I feel like this is something you'll never need to worry about on 99% of websites. In fact, native applications don't even do this (deactivate features based on battery life, free space, or processing power).

    Wouldn't it be a bit jarring to have certain features suddenly disappear on the next page load, just because your resources dropped below thresholds the developer decided on?

  • by FriedrichN on 12/18/20, 10:57 AM

    As a general rule of thumb, you should use as little JavaScript as possible and serve it in one big file. Just yesterday a client was wondering why his website was slow; his web designers said it needed 'more Varnish cache' and a bigger server.

    After inspecting, I saw over 100 .js files being requested by require.js, because each module required yet another set of .js files, those files required another set, and so on. The server was serving them up plenty fast (<50 ms), but because they were being loaded in waves, the load time ended up between 5 and 7 seconds.

    Many of these issues arise from being fancy, and people tend to want to fix them with more fancy stuff. But sometimes you need to stop being fancy and just fix the damn problem in an old-fashioned way (concatenate, minify, allow local caching), as in the sketch below.
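
    For instance, the bundling step can be a one-off build script; a minimal sketch using esbuild (one bundler among many — the file paths here are made up):

        // build.js - bundle and minify everything into one cacheable file.
        // Run with: node build.js (after `npm install esbuild`)
        require('esbuild').build({
          entryPoints: ['src/app.js'], // hypothetical entry module
          bundle: true,                // inline the entire require graph
          minify: true,
          outfile: 'dist/bundle.js',
        }).catch(() => process.exit(1));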

  • by bamboleo on 12/18/20, 1:14 AM

    This is rather low-level and needs to be hardcoded by developers. It doesn’t scale.

    What we need are ways to mark content as optional, or to offer it at multiple levels of fidelity that the browser can pick from, similar to how videos are served at multiple resolutions.

    I also want to point out the Save-Data header, which is a step in the right direction.
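
    Save-Data is also exposed to scripts; a minimal sketch (navigator.connection.saveData is currently Chromium-only, and the image paths are made up):

        // Serve lighter assets when the user has opted into data savings.
        const saveData = navigator.connection && navigator.connection.saveData;
        document.querySelector('#hero').src = saveData
          ? '/img/hero-small.jpg'  // hypothetical low-res variant
          : '/img/hero-large.jpg';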

  • by Etheryte on 12/18/20, 1:13 AM

    Rather than looking at this as an enhancement, this raises both of my eyebrows very hard: why would my browser possibly need to tell a webpage all of this information about my system? Of course, it's great for fingerprinting and other shenanigans, but I can't help but feel this is getting absurd. I want my browser to be a webpage reader, not a tool that leaks as much data about me as technically possible.

  • by FrontAid on 12/18/20, 8:16 AM

    There are some nice examples in that post. Another would be to respect the Do Not Track (DNT) setting of the browser: you don't need to load any tracking scripts if DNT is enabled. That saves bandwidth and has a small positive effect on your page's loading times.

    We recently blogged about this, using Matomo (a Google Analytics alternative) as the example. The main part is the following code:

        if (navigator.doNotTrack !== '1') { /* load script */ }
    
    Here is the link if you are interested in the details: https://frontaid.io/blog/matomo-dnt-do-not-track/
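
    Fleshed out, that guard typically wraps the usual dynamic script injection; a minimal sketch (the tracker URL is a placeholder):

        // Only inject the analytics script when Do Not Track is off.
        if (navigator.doNotTrack !== '1') {
          var script = document.createElement('script');
          script.async = true;
          script.src = 'https://analytics.example.com/matomo.js'; // placeholder
          document.head.appendChild(script);
        }
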
  • by tored on 12/18/20, 1:37 PM

      navigator.deviceMemory
      navigator.hardwareConcurrency
    
    Oh great, more ways to track users.

  • by cboatie on 12/18/20, 1:00 AM

    The browser support for these APIs isn't very good.

  • by guerrilla on 12/18/20, 5:58 AM

    More complexity, where the solution is probably simplicity.

  • by sloshnmosh on 12/18/20, 2:54 AM

    I block ALL JavaScript by default and only allow it to run when I absolutely have to, and even then I use uMatrix to limit it to only what is needed.

    I had to stop viewing my favorite website recently because it’s no longer viewable with JavaScript disabled.

  • by 1f60c on 12/18/20, 1:05 PM

    This is an interesting technique, but it seems like premature optimization.

  • by kazinator on 12/18/20, 1:51 AM

    What if the browser lies about those system properties in order to defeat fingerprinting?
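
    For example, a privacy extension's content script can pin these values so every visitor reports the same hardware; a minimal sketch (the numbers are arbitrary):

        // Override the getters that fingerprinting scripts read.
        Object.defineProperty(Navigator.prototype, 'deviceMemory',
          { get: () => 8 });
        Object.defineProperty(Navigator.prototype, 'hardwareConcurrency',
          { get: () => 4 });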

    How about: if you don't need the JS, don't load it.

    I know you don't need most of it because I use NoScript, and most sites work fine with a good chunk of their JS disabled. As a general pattern, the more JS a site loads, and from the more domains, the more of it you can disable and still use the site.