Introduction and index of this series is here. The previous post investigated some lossless filtering of data, before passing it to a regular compression library. Our result so far is: 94.5MB of data can get filtered+compressed down to 23.0MB in one second (split/shuffle bytes, delta encode, zstd or kraken compression). It decompresses back in about 0.15 seconds, which is quite a bit slower than without data filtering, but that's something for a later day.
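To make the "split/shuffle bytes, delta encode" filter concrete, here is a minimal Python sketch of the idea: regroup the floats so that all the 0th bytes come first, then all the 1st bytes, and so on, then delta-encode each byte plane before handing the result to a general-purpose compressor. This is an illustration under stated assumptions, not the post's actual implementation (which is in C++ and uses zstd or Oodle Kraken); the stdlib `zlib` stands in for those here, and all function names are made up for the example.

```python
import struct
import zlib

def filter_floats(floats):
    """Split bytes into planes and delta-encode each plane (a sketch)."""
    raw = struct.pack(f"<{len(floats)}f", *floats)
    # byte shuffle: plane i holds the i-th byte of every 4-byte float
    planes = [raw[i::4] for i in range(4)]
    out = bytearray()
    for plane in planes:
        prev = 0
        for b in plane:
            out.append((b - prev) & 0xFF)  # delta vs previous byte in plane
            prev = b
    return bytes(out)

def unfilter_floats(data, count):
    """Reverse the delta encoding, then re-interleave the byte planes."""
    planes = []
    for i in range(4):
        plane = bytearray()
        prev = 0
        for b in data[i * count:(i + 1) * count]:
            prev = (prev + b) & 0xFF  # undo delta (prefix sum)
            plane.append(prev)
        planes.append(plane)
    # zip the planes back together: one byte from each plane per float
    raw = bytes(b for group in zip(*planes) for b in group)
    return list(struct.unpack(f"<{count}f", raw))

values = [i * 0.25 for i in range(1024)]
filtered = filter_floats(values)
compressed = zlib.compress(filtered, 6)   # zlib stands in for zstd/kraken
restored = unfilter_floats(filtered, len(values))
assert restored == values
```

On smoothly varying data the shuffled-and-deltaed stream tends to contain long runs of identical bytes (the exponent bytes of neighboring floats barely change), which is exactly what a general-purpose compressor exploits.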