by chamoda on 3/6/24, 12:29 PM with 74 comments
by jgrahamc on 3/6/24, 4:42 PM
https://blog.cloudflare.com/cacheing-the-uncacheable-cloudfl...
I am glad to see that things have moved on from SDCH. It'll be interesting to see how this measures up in the real world.
by saagarjha on 3/6/24, 5:17 PM
by jauntywundrkind on 3/6/24, 6:18 PM
The original proposal for Zstd was to use a predefined, statically generated dictionary. Mozilla rejected the proposal for that.
But there's a lot of great discussion on what Zstd can do, which is astoundingly flexible & powerful. There's discussion on dynamic adjustment of compression ratios, and discussion around shared dictionaries and their privacy implications. That Mozilla turned around, started supporting Zstd, and marked shared dictionaries as worth prototyping is a good initial stamp of approval to see! https://github.com/mozilla/standards-positions/issues/771
One of my main questions after reading this promising update is: how do you pick what to include when generating custom dictionaries? Another comment mentions that brotli has a standard dictionary it uses, which is one possible starting place. But it feels like tools to build one's own custom dictionary would be ideal.
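For example, here's a rough sketch of training one with the python-zstandard bindings (the sample paths and dictionary size are just placeholders; in practice you'd feed in representative responses from your own site):

    # Sketch: building a custom shared dictionary with python-zstandard
    # (pip install zstandard). Paths and sizes below are hypothetical.
    import glob
    import zstandard as zstd

    # Load representative samples of the content you want to compress.
    samples = [open(path, "rb").read() for path in glob.glob("samples/*.js")]

    # Train a dictionary (here capped at 100 KiB) from those samples.
    dictionary = zstd.train_dictionary(100 * 1024, samples)

    # Persist it so the server can serve it and reference it later.
    with open("custom.dict", "wb") as f:
        f.write(dictionary.as_bytes())

    # Compress a response body against the trained dictionary.
    compressor = zstd.ZstdCompressor(dict_data=dictionary)
    body = open("samples/app.js", "rb").read()
    print(len(body), len(compressor.compress(body)))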
by eyelidlessness on 3/6/24, 4:57 PM
True, for documents (as is another comment’s focus) this is perhaps overkill. Although even there, a benefit could be imagined for a large body of documents—it’s unclear whether this case is addressed, but it certainly could be with appropriate support across say preload links[0]. But if “the web is for documents, not apps” isn’t the proverbial hill you’re prepared to die on, this is a very compelling story for web apps.
I don’t know if it’s so compelling that it outweighs privacy implications, but I expect the other browser engines will have some good insights on that.
0: https://developer.mozilla.org/en-US/docs/Web/HTML/Attributes...
by lukevp on 3/6/24, 11:21 PM
by matsemann on 3/6/24, 5:14 PM
What's the savings on that approach vs a gzipped file without any dictionary?
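One way to measure it on your own assets (a rough sketch; the file names are just placeholders for two versions of the same library):

    # Compare plain gzip against zstd using the previous release as a raw
    # content dictionary. File names below are hypothetical.
    import gzip
    import zstandard as zstd

    old = open("lib-1.3.4.min.js", "rb").read()
    new = open("lib-1.3.6.min.js", "rb").read()

    gzip_size = len(gzip.compress(new, compresslevel=9))

    # Use the old version verbatim as the dictionary for the new version.
    dict_data = zstd.ZstdCompressionDict(old)
    zstd_dict_size = len(zstd.ZstdCompressor(dict_data=dict_data).compress(new))

    print(f"gzip, no dictionary:       {gzip_size} bytes")
    print(f"zstd, old version as dict: {zstd_dict_size} bytes")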
by ComputerGuru on 3/6/24, 4:40 PM
It’s also insanely complicated. All this effort, with so many possible tuples of (shared dictionary, requested resource), none of which make sense to compress on-the-fly per-request, means it’s specifically for the benefit of a select few sites.
When I saw the headline I thought that Chrome would ship with specific dictionaries (say one for js, one for css, etc) and advertise them and you could use the same server-side. But this is really convoluted.
by falsandtru on 3/6/24, 5:25 PM
https://developer.mozilla.org/en-US/docs/Web/Security/Subres...
by TacticalCoder on 3/6/24, 4:55 PM
The savings are nice in the best case (like in TFA: switching from version 1.3.4 to 1.3.6 of a lib or whatever), but that Base64-encoded hash is not compressible, so this line basically adds 60+ bytes to the request.
Kinda ouch for when it's going to be a miss?
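A quick back-of-the-envelope check (assuming the request header carries a base64-encoded SHA-256 of the dictionary, as in the Compression Dictionary Transport draft):

    # SHA-256 is 32 raw bytes, which is 44 characters in base64, plus the
    # header name and structured-field framing. "custom.dict" is hypothetical.
    import base64
    import hashlib

    dictionary = open("custom.dict", "rb").read()
    digest = hashlib.sha256(dictionary).digest()   # 32 raw bytes
    b64 = base64.b64encode(digest).decode()        # 44 base64 characters

    header = f"Available-Dictionary: :{b64}:"
    print(len(header))  # roughly 68 bytes added to every matching request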
by ramses0 on 3/6/24, 5:03 PM
Fingerprinting concerns aside (compression == timing attacks in the general case), the fact that it's nearly network-transparent and framework/webserver compatible is incredible!
by raggi on 3/6/24, 8:43 PM
by IshKebab on 3/6/24, 7:57 PM
by netol on 3/6/24, 7:48 PM
by Sigliotio on 3/6/24, 5:10 PM
Image compression, for example, or voice and video compression like what Nvidia does.
But I do like this implementation focusing on libs, why not?
by jwally on 3/6/24, 10:38 PM
by skybrian on 3/6/24, 5:53 PM
by tsss on 3/6/24, 11:07 PM
by kazinator on 3/6/24, 7:58 PM
Just put the to-be-compressed item into the shared dictionary, somehow distribute that to everyone, and then the compressed artifact consists of a reference to that item.
If the shared dictionary contains nothing else, it can just be a one-bit message whose meaning is "extract the one and only item out of the dictionary".
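Roughly what that degenerate case looks like with the python-zstandard bindings (just an illustration; the payload is arbitrary random bytes):

    # If the dictionary already contains the exact payload, the "compressed"
    # output is little more than a back-reference plus framing.
    import os
    import zstandard as zstd

    payload = os.urandom(20_000)  # incompressible on its own

    no_dict = zstd.ZstdCompressor().compress(payload)

    # Use the payload itself as the shared dictionary.
    dict_data = zstd.ZstdCompressionDict(payload)
    with_dict = zstd.ZstdCompressor(dict_data=dict_data).compress(payload)

    print(len(payload), len(no_dict), len(with_dict))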
by cuckatoo on 3/6/24, 6:03 PM
I do not want my browser sending anything that looks like it could be used to uniquely identify me. Ever.
I want every request my browser makes to look like any other request made by another user's browser. I understand that this is what Google doesn't want but why can't they just be honest about it? Why come up with these elaborate lies?
Now to limit tracking exposure, in addition to running the AutoCookieDelete extension I'll have to go find some AutoDictionaryDelete extension to go with it. Boy am I glad the internet is getting better every day.