I have moved on to xz. I know it is slower and uses more memory, but I like it because it is multithreaded and compresses a lot better than gzip or bzip2. The main XZ Utils web page also distributes Windows binaries, and tar now supports it natively.
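For what it's worth, the same .xz container is exposed in Python's standard library via `lzma`. A minimal round-trip sketch (note the stdlib module compresses on a single thread; the multithreading mentioned above is the xz CLI's `-T` flag):

```python
import lzma

# Repetitive placeholder payload, just to give the compressor something to work with.
data = b"an example payload, repeated to give xz something to chew on " * 1000

# lzma.compress produces the same .xz format the xz command-line tool writes.
compressed = lzma.compress(data, preset=9)
assert lzma.decompress(compressed) == data
print(len(data), "->", len(compressed), "bytes")
```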
Facebook's ZStandard compression library appears to be both faster and better at compressing than zlib. Is there any reason to continue using zlib for new projects that don't need backward compatibility?
Hm, not really, unless you care about being able to read the data from anywhere else. Zlib, like any of the other "traditional" libraries out there, is supported pretty much everywhere and has bindings for every language you can think of. But that shouldn't drive your project.
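That ubiquity is easy to see in practice: zlib ships in the standard library of most language runtimes, while zstd usually requires a third-party binding. A quick sketch in Python:

```python
import zlib

data = b"zlib ships in practically every language runtime " * 500

# One call in, one call out -- no external dependency needed.
compressed = zlib.compress(data, 9)
assert zlib.decompress(compressed) == data

# By contrast, zstd in Python typically means installing a third-party
# package (e.g. `zstandard` from PyPI) -- which is exactly the
# interoperability argument for sticking with zlib.
print(len(data), "->", len(compressed), "bytes")
```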
I work with a few projects that were badly impacted by correctness bugs in the recent zlib 1.2.10 update and could no longer interoperate with older versions of the software.
Mark's a very busy guy at JPL (I'm a former JPLer myself). Making science happen on Mars and whatnot.
zlib is super stable, and nowadays that's largely its biggest benefit, as opposed to when it was originally released in the 90s. Back then, the biggest benefit was that it was not encumbered by patents or licensing, and it was faster (in general-purpose use) than any other compressor while providing the highest level of compression (in general-purpose use). Nowadays bzip2 and lzma (and others) can provide better compression.
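You can compare all three from Python's standard library. A rough sketch (illustrative only: ratios depend heavily on the input, and repetitive text flatters every compressor):

```python
import bz2
import lzma
import zlib

# Highly repetitive sample data; real corpora show smaller gaps between codecs.
data = b"the quick brown fox jumps over the lazy dog; " * 2000

codecs = [
    ("zlib",  lambda d: zlib.compress(d, 9)),
    ("bzip2", lambda d: bz2.compress(d, 9)),
    ("lzma",  lambda d: lzma.compress(d, preset=9)),
]

for name, compress in codecs:
    out = compress(data)
    print(f"{name}: {len(data)} -> {len(out)} bytes")
```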
However, there's lots of choice in compressors (especially open-source compressors).
My favorite for symmetric compression/decompression speed, and for decompression speed over compression ratio in general, is LZ4 [0]. Decompression speed approaches memory-copy speed! You can use it for compressing your Linux kernel [1], your ZFS file system [2], and your Linux zram driver [3]. lzo, miniz [4], QuickLZ, LZF, and Snappy are also quite speedy in different ways.
For compression ratio, lzham [5] is the most impressive. But pigz may be more your style. Others may choose brotli, zstd, or others still.
zlib is one of those packages that's often the root of all packages.
(Were you expecting "evil"? That's only in packages distributed without GPG signatures on sources verified by a strong web of trust, without well-secured build farms, and without end-to-end chain of custody for the distribution artifacts.)
No source repository? Looks like a great opportunity for someone to grab all the release tarballs and build one to chuck on GitHub. Bonus points for putting the changelog contents in the commit logs!