Yeah, I think that’s the reason it didn’t happen ;) Not saying it’s impossible to make in less than 2kb, but it would have to be around 1kb to make it really worthwhile, so they could squeeze in more content.
They already have a compressor/packer - so this 2kb represents space 'left on the table' by that packer (either due to the algorithm being bad, or the packer's decompressor being big).
ZIP normally uses DEFLATE, which is LZ77 + Huffman. LZ77 is super simple and can be implemented in ~30 bytes of code. Huffman tables are fairly simple too, though I can't easily picture the assembly instructions to decode them; arithmetic coding, on the other hand, at least matches Huffman's compression ratio and can be implemented in about 60 bytes of code. Total: ~90 bytes to save 2 kilobytes. Seems worth it.
Note that both these decompressors will be awfully slow, but for just 64 kilobytes of input data I don't think that'll be an issue.
1. Didn't read the article (they dropped 13% of the entire page weight). The websites are already well below today's average.
2. Don't understand the impact of having a page that loads instantly when it's a government site. These websites are not for fun; they must work well for everyone.
> the change for users on a low bandwidth connection or lower specification device will be much more noticeable, resulting in significantly improved page download speed and performance.