
The crème de la crème of text compression for games back then was Huffman coding. I'm a bit surprised Zork didn't use it, as surely it would have saved space.

Compression algorithms like LZ weren't viable in games until much later, as most computers lacked the memory and clock speed to run them quickly.
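
For a sense of why Huffman was the go-to: the decoder is just a bit-by-bit walk over a small node table, which fits in a few hundred bytes of code and data. A minimal sketch in C (the tree, the symbol set, and the packed input are all made up for illustration):

    #include <stdio.h>
    #include <stdint.h>

    /* Huffman tree as a flat node array: negative entries are leaf
       symbols (stored as ~symbol), non-negative entries are child
       node indices. The tree and symbols are invented. */
    typedef struct { int16_t left, right; } Node;

    static const Node tree[] = {
        { ~'e', 1 },    /* root:   bit 0 -> 'e', bit 1 -> node 1 */
        { ~'t', 2 },    /* node 1: bit 0 -> 't', bit 1 -> node 2 */
        { ~'a', ~'o' }, /* node 2: bit 0 -> 'a', bit 1 -> 'o'    */
    };

    static void huff_decode(const uint8_t *bits, int nbits)
    {
        int node = 0;
        for (int i = 0; i < nbits; i++) {
            int bit = (bits[i >> 3] >> (7 - (i & 7))) & 1; /* MSB first */
            int16_t next = bit ? tree[node].right : tree[node].left;
            if (next < 0) {        /* leaf: emit symbol, restart at root */
                putchar((char)~next);
                node = 0;
            } else {
                node = next;
            }
        }
    }

    int main(void)
    {
        /* "eat" = codes 0, 110, 10 -> bitstream 011010, packed 0x68 */
        const uint8_t packed[] = { 0x68 };
        huff_decode(packed, 6);
        putchar('\n');
        return 0;
    }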



'80s C64 demos already used compression (both Lempel-Ziv style and Huffman). Suppose the demo had 3 parts. When part 1 was running, parts 2 and 3 were present in memory in compressed form (both code and data). When it was time to run part 2, it would be decompressed over the code and data of part 1, and so on.
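
The mechanics are roughly like this sketch in C (a toy RLE depacker stands in for the real Lempel-Ziv/Huffman packers, and the memory layout is invented):

    #include <stdint.h>
    #include <string.h>

    static uint8_t ram[65536];          /* pretend 64 KB address space */

    /* Toy RLE depacker standing in for a real demo packer:
       a count byte then a value byte; a count of 0 terminates. */
    static void depack(uint16_t src, uint16_t dst)
    {
        while (ram[src] != 0) {
            uint8_t count = ram[src++];
            uint8_t value = ram[src++];
            memset(&ram[dst], value, count);
            dst += count;
        }
    }

    int main(void)
    {
        /* Invented layout: part 1 code/data at $0800; parts 2 and 3
           sit packed at $C000 and $E000 while part 1 runs (a real
           loader would have placed the packed data there). */
        enum { PART1 = 0x0800, PART2_PACKED = 0xC000,
               PART3_PACKED = 0xE000 };

        /* run_part1(); ... */
        depack(PART2_PACKED, PART1);    /* unpack part 2 over part 1 */
        /* run_part2(); ... */
        depack(PART3_PACKED, PART1);    /* unpack part 3 over part 2 */
        /* run_part3(); ... */
        return 0;
    }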


Compressing cracked games was a big thing too. There was a lot of competition to get the smallest version of a game, for faster downloading from a BBS, getting more games on a floppy disk, etc. Having the best compression software was a sign that you were an ‘elite’ cracker. (Disclaimer: I was part of that scene as a young teen.)


Wow, that is innovative. Something like overlays in certain versions of Turbo Pascal, but better.


Never used Turbo Pascal overlays, but we had a product on OS/2, built with a Realia compiler, that didn't fit when ported to DOS. The overlays destroyed performance. I ended up writing a segmenter that split the call graph into mostly-disjoint subtrees by selecting the subtree roots. A small set of the shared parts (routines called from more than one subtree) got to live in non-overlay memory. There was still a little thrashing, because the splits weren't exclusive or of similar sizes, but it worked remarkably well.
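
If I understand the approach, it's something like this toy reconstruction in C (not the original tool; the call graph, the root selection, and the pinning rule are all illustrative): each selected subtree root gets its own overlay segment, and anything reachable from more than one root is pinned resident.

    #include <stdio.h>

    #define NFUNCS 6

    /* Toy call graph as an adjacency matrix: calls[i][j] != 0 means
       function i calls function j. Functions: main=0, a=1, b=2,
       x=3, y=4, shared=5. */
    static const int calls[NFUNCS][NFUNCS] = {
        {0,1,1,0,0,0},  /* main -> a, b      */
        {0,0,0,1,0,1},  /* a    -> x, shared */
        {0,0,0,0,1,1},  /* b    -> y, shared */
        {0,0,0,0,0,0}, {0,0,0,0,0,0}, {0,0,0,0,0,0},
    };

    static void mark_reachable(int f, int seen[NFUNCS])
    {
        if (seen[f]) return;
        seen[f] = 1;
        for (int g = 0; g < NFUNCS; g++)
            if (calls[f][g]) mark_reachable(g, seen);
    }

    int main(void)
    {
        /* Hand-picked subtree roots; a real segmenter would choose
           roots that balance the segment sizes. */
        const int roots[] = { 1, 2 };

        int owner[NFUNCS];              /* -1 unassigned, -2 shared */
        for (int f = 0; f < NFUNCS; f++) owner[f] = -1;

        for (int r = 0; r < 2; r++) {
            int seen[NFUNCS] = {0};
            mark_reachable(roots[r], seen);
            for (int f = 0; f < NFUNCS; f++) {
                if (!seen[f]) continue;
                if (owner[f] == -1) owner[f] = r;      /* first claim */
                else if (owner[f] != r) owner[f] = -2; /* shared: pin */
            }
        }

        for (int f = 0; f < NFUNCS; f++) {
            if (owner[f] == -2)     printf("func %d: resident (shared)\n", f);
            else if (owner[f] >= 0) printf("func %d: overlay segment %d\n", f, owner[f]);
            else                    printf("func %d: resident\n", f);
        }
        return 0;
    }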


>The overlays destroyed performance.

True. A space-time tradeoff.

>I ended up writing a segmenter that split the call graph into mostly-disjoint subtrees by selecting the subtree roots. A small set of the shared parts got to live in non-overlay memory.

I didn't quite get what you mean by the segmenter and the shared parts. Could you elaborate?


Some of the Infocom games certainly used compression, I believe by doing things like using 5 bits per character instead of 8. I couldn't tell you where I read that, though.
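
That's the Z-machine's text encoding: three 5-bit "Z-characters" packed into each big-endian 16-bit word, with the top bit of the final word marking the end of the string. A stripped-down decoder sketch in C (real Z-machine text also has shift characters, extra alphabets, and abbreviations, which this skips):

    #include <stdio.h>
    #include <stdint.h>

    /* Alphabet A0 (lowercase); Z-characters 6..31 index into it. */
    static const char A0[] = "abcdefghijklmnopqrstuvwxyz";

    /* Decode a packed Z-string: three 5-bit Z-characters per
       big-endian 16-bit word; the top bit of the last word is set. */
    static void decode_zstring(const uint8_t *bytes)
    {
        for (;;) {
            uint16_t w = (uint16_t)((bytes[0] << 8) | bytes[1]);
            uint8_t z[3] = { (w >> 10) & 0x1F, (w >> 5) & 0x1F, w & 0x1F };
            for (int i = 0; i < 3; i++) {
                if (z[i] == 0)
                    putchar(' ');          /* Z-character 0 is a space */
                else if (z[i] >= 6)
                    putchar(A0[z[i] - 6]); /* 6..31 -> alphabet A0 */
                /* 1..5 are shifts/abbreviations, skipped here */
            }
            if (w & 0x8000)                /* top bit: end of string */
                break;
            bytes += 2;
        }
    }

    int main(void)
    {
        /* "hello" = Z-chars 13,10,17 | 17,20,5(pad) = 0x3551, 0xC685 */
        const uint8_t packed[] = { 0x35, 0x51, 0xC6, 0x85 };
        decode_zstring(packed);
        putchar('\n');
        return 0;
    }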



It'd be tricky to use a compression scheme with a history window, since these interpreters also used a virtual memory scheme for static data with page sizes of around 1 KB: a back-reference could point into a page that's no longer resident, so each page would have to decompress independently.
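
One way to state the constraint: no back-reference may reach before the start of its own page, or the pager would have to keep earlier pages around. A sketch in C of a checker for that invariant, using an invented (tag, length, distance) token format:

    #include <stdint.h>
    #include <stddef.h>
    #include <assert.h>

    #define PAGE_SIZE 1024

    /* For compression to coexist with 1 KB demand paging, every page
       must decompress on its own. Token format (invented): tag 0 is
       a literal run (tag, len, bytes...), tag 1 is a back-reference
       (tag, len, dist). */
    static int page_is_self_contained(const uint8_t *in, size_t in_len)
    {
        size_t ip = 0, op = 0;      /* op = output offset within page */
        while (ip < in_len) {
            uint8_t tag = in[ip++];
            uint8_t len = in[ip++];
            if (tag == 0) {
                ip += len;          /* skip the literal bytes */
                op += len;
            } else {
                uint8_t dist = in[ip++];
                if (dist > op)      /* reaches into a previous page */
                    return 0;
                op += len;
            }
            if (op > PAGE_SIZE)     /* page overrun */
                return 0;
        }
        return 1;
    }

    int main(void)
    {
        /* A back-reference at offset 0 must point outside the page. */
        const uint8_t bad[] = { 1, 4, 2 };
        assert(!page_is_self_contained(bad, sizeof bad));
        return 0;
    }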


LZ decompression is far faster and simpler than Huffman decoding --- and, since the decoder reads less data than it writes, it has at times been measured as even faster than memcpy().
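
The core of an LZ77-style decoder really is just a copy loop. A sketch in C with an invented token format (a flag byte selecting a literal run or a (length, distance) back-reference; no real codec looks exactly like this):

    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Toy LZ77-style decoder. Token format (invented):
       tag 0: (0, len, bytes...)  copy len literal bytes from the input
       tag 1: (1, len, dist)      copy len bytes from dist bytes back */
    static size_t lz_decode(const uint8_t *in, size_t in_len, uint8_t *out)
    {
        size_t ip = 0, op = 0;
        while (ip < in_len) {
            uint8_t tag = in[ip++];
            uint8_t len = in[ip++];
            if (tag == 0) {
                for (uint8_t i = 0; i < len; i++)  /* literal run */
                    out[op++] = in[ip++];
            } else {
                uint8_t dist = in[ip++];
                for (uint8_t i = 0; i < len; i++) {
                    /* byte-by-byte so overlapping copies (dist < len)
                       repeat a short pattern, as LZ77 requires */
                    out[op] = out[op - dist];
                    op++;
                }
            }
        }
        return op;
    }

    int main(void)
    {
        const uint8_t in[] = { 0, 3, 'a', 'b', 'c', /* literals "abc"   */
                               1, 6, 3 };           /* 6 bytes, 3 back  */
        uint8_t out[16];
        size_t n = lz_decode(in, sizeof in, out);
        fwrite(out, 1, n, stdout);                  /* "abcabcabc" */
        putchar('\n');
        return 0;
    }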


Unisys's US patent on the LZW algorithm was vigorously enforced until it expired on June 20, 2003. Many programs used Huffman coding instead to avoid paying royalties, and PNG was created specifically to avoid the LZW patent problems with GIF. A lot of compression technology choices were driven by avoiding proprietary algorithms.


I think you've confused LZ with LZW.


Hasn't the relative speed of CPUs and memory flipped since then?


Didn't they have ARC compression back then? Not as good as LZ, but good enough to save space.



