
I'd love to see Zstandard accepted in other places where the current option is only the venerable zlib. E.g., git packing, ssh -C. It covers a wider range of speed/ratio trade-offs and beats zlib on both ratio and CPU at every point on the curve where zlib even participates.
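
If anyone wants to eyeball that curve themselves, here's a minimal sketch. It uses the stdlib zlib module plus the third-party zstandard binding (pip install zstandard), and the corpus path is just a stand-in for whatever file you have handy:

  import time
  import zlib

  import zstandard  # third-party binding for libzstd

  def bench(name, compress, decompress, data):
      # One-shot round trip, timing both directions.
      t0 = time.perf_counter()
      blob = compress(data)
      t1 = time.perf_counter()
      assert decompress(blob) == data
      t2 = time.perf_counter()
      print(f"{name}: {len(blob) / len(data):.1%} of original, "
            f"compress {t1 - t0:.3f}s, decompress {t2 - t1:.3f}s")

  data = open("/usr/share/dict/words", "rb").read()  # any test corpus

  bench("zlib -9", lambda d: zlib.compress(d, 9), zlib.decompress, data)

  cctx = zstandard.ZstdCompressor(level=19)
  dctx = zstandard.ZstdDecompressor()
  bench("zstd -19", cctx.compress, dctx.decompress, data)

Exact numbers depend on the corpus, but on text-heavy input zstd should come out ahead on ratio and well ahead on speed in both directions.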



Also, zlib is horrible code. I walked away in disgust after finding critical errors (which at the time were only patched in master).


It would be great to see better compression supported by browsers.


Chrome apparently tested zstd out, and it's an improvement over Chrome's forked and optimized zlib on x64, but slower on ARM/Android. https://bugs.chromium.org/p/chromium/issues/detail?id=912902...

Few (or none?) of Chrome's fairly dramatic improvements to zlib have been upstreamed. https://github.com/madler/zlib/issues/346

Edit: Also, if browsers do adopt zstd, it's likely you'll end up with the same situation where they fork their own implementation of zstd. Upstreaming requires signing Facebook's CLA, which has patent clauses that don't work for most contributors.


Brotli has wide browser support (https://caniuse.com/#feat=brotli) and comes closer to zstd in compression ratio and compression speed, but its decompression speed is significantly lower and closer to zlib.

https://github.com/facebook/zstd#benchmarks

AFAIK (I haven't looked much into it since 2018) it's not widely supported by CDNs, but at least Cloudflare seems to serve it by default (EDIT: must be enabled per-site https://support.cloudflare.com/hc/en-us/articles/200168396-W...)
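
If you want to sanity-check the decompression-speed claim locally, here's a rough sketch with the brotli and zstandard PyPI bindings. Absolute numbers include Python overhead and won't match the C benchmarks, but the relative gap should show up:

  import time

  import brotli      # official brotli binding
  import zstandard   # binding for libzstd

  data = open("page.html", "rb").read() * 100  # any web-ish payload

  b_blob = brotli.compress(data, quality=11)
  z_blob = zstandard.ZstdCompressor(level=19).compress(data)
  dctx = zstandard.ZstdDecompressor()

  for name, blob, decomp in (("brotli", b_blob, brotli.decompress),
                             ("zstd", z_blob, dctx.decompress)):
      t0 = time.perf_counter()
      for _ in range(50):  # repeat to get past timer noise
          decomp(blob)
      print(f"{name}: {(time.perf_counter() - t0) / 50:.4f}s per decompress")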


Brotli compresses about 5-10 % more densely than zstd. Benchmarks showing equal compression performance either use different window sizes (smaller ones for brotli) or do not run at maximum compression density.

https://github.com/google/brotli/issues/642 is the best 3rd party documentation of this behavior.

zstd does decompress fast, but this is not free. The cost is compression density -- and weaker streaming properties than brotli's.

For typical Linux package use, one could save about 5 % in density by moving from zstd to large-window brotli. Decompressing a typical package would be roughly 1 ms slower, but the decompression could be overlapped with the transfer or file I/O if that were an issue.
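
To make the window-size point concrete: the standard brotli Python binding exposes the window exponent as lgwin (capped at 24 there; the really large windows meant above need large-window brotli, e.g. via the CLI). A rough sketch, with the file name standing in for any big payload:

  import brotli  # official brotli binding

  data = open("some_package.tar", "rb").read()  # stand-in for a package payload

  # Same quality, different window sizes. A bigger window lets the encoder
  # match against data further back, which usually buys density on large inputs.
  for lgwin in (18, 22, 24):
      blob = brotli.compress(data, quality=11, lgwin=lgwin)
      print(f"brotli -q 11, lgwin={lgwin}: {len(blob)} bytes")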


The linked series of comments (which, to be clear, I've only skimmed -- there's a ton there) shows zstd 22 sometimes behind Brotli 11d29 on compression ratio and sometimes ahead; usually ahead of Brotli 11; and ~5x faster on compression throughput and ~2-2.5x faster on decompression throughput. To cherry-pick some numbers (the table after "259,707,904 bytes long open_watcom_1.9.0-src.tar", dated "TurboBench: - Mon Apr 30 07:51:32 2018"):

  Name         | Comp. size (bytes) | Size (% of orig.) | C. MB/s | D. MB/s
  brotli 11d29 | 21117099           | 8.1               | 0.52    | 515
  zstd 22      | 22249772           | 8.6               | 2.32    | 1270
  brotli 11    | 22898691           | 8.8               | 0.57    | 662

So in that particular instance, zstd 22 comes out about 5% worse on compressed size (+1.1 MiB over Brotli 11d29's 20.1 MiB) but about 3% better (-640 KiB) than Brotli 11's 21.8 MiB. In other words: maximum compression is within a small margin, while compression and decompression speeds are much quicker.

I think it's fair to say that zstd struggles the most at the extremes. On the fast extreme it loses (marginally) to lz4; on the slow extreme it (maybe) loses (marginally) to brotli. But it's relatively quick across the spectrum and provides a lot of flexibility.
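
That flexibility is easy to demonstrate: one codec, one API, levels spanning from near-lz4 territory to near-brotli-11 territory. A quick sketch with the zstandard binding (it accepts level 22 directly, whereas the zstd CLI gates levels above 19 behind --ultra):

  import time

  import zstandard  # binding for libzstd

  data = open("corpus.bin", "rb").read()  # any reasonably large test file

  for level in (1, 3, 9, 19, 22):
      cctx = zstandard.ZstdCompressor(level=level)
      t0 = time.perf_counter()
      blob = cctx.compress(data)
      dt = time.perf_counter() - t0
      print(f"zstd -{level}: {len(blob) / len(data):.1%} of original, {dt:.2f}s")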

It may make sense to continue to use Brotli or xz for static assets that are compressed infrequently and read often. But for something like HTTP Content-Encoding, where dynamic pages are compressed on the fly? Zstd would shine here, over both Brotli and (vanilla) zlib. (I know Chrome has some hacked up zlib on the client side, but I do not know too much about it.)
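
To make the dynamic-page case concrete, the negotiation itself is tiny. This is a hypothetical helper, not any real framework's API; "zstd" is the content-coding token registered for HTTP in RFC 8878:

  import zstandard  # binding for libzstd

  # Low level on purpose: for on-the-fly responses, compression is the hot path.
  # (In real code you'd want one compressor per worker; they aren't thread-safe.)
  _cctx = zstandard.ZstdCompressor(level=3)

  def encode_response(accept_encoding, body):
      """Return (body, Content-Encoding value or None) for a dynamic page."""
      offered = {t.split(";")[0].strip() for t in accept_encoding.split(",")}
      if "zstd" in offered:
          return _cctx.compress(body), "zstd"
      return body, None  # fall back to identity (or gzip, not shown)

  # Usage sketch:
  body, enc = encode_response("gzip, deflate, zstd", b"<html>...</html>" * 1000)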


That's interesting. Brotli has wide browser support even though it's less than 5 years old, yet WebP is approaching a decade old and Safari still doesn't support it...


WebP has an excellent lossless image compressor (like PNG, just 25-40 % denser), but the lossy format has weaknesses that people focused on, and that slowed adoption. The initial lossy encoder had quality problems -- it had bugs and was a port of a video coder. Nowadays the quality is much better, but the format forces YUV420 coding (it does not allow YUV444), which limits the quality of colors and fine textures.


> but webp is reaching a decade and Safari still doesn't support it...

That’s a philosophical objection. For a long while Mozilla also was of the opinion that WebP is not “better enough” than JPEG/PNG to warrant the addition of another image format which the entire web must support forever using only one available implementation.

Plus I think there are still some unresolved patent claims on the VP8/VP9 video codecs (which are the basis for WebP).


Wireshark! Wireshark!

Also lz4, of course.



