Huffman coding is probably one of the more interesting ones, not only because it is so ridiculously useful, but because of the wide taxonomy of implementations. It is quite malleable, able to be morphed and optimized to the particular application: JPEG, PNG, HPACK, Gzip to name a few popular usages and implementations.
What is really enlightening though is implementing a basic one, because it is so simple. The core of it involves popping two graph nodes from a heap and pushing a new merged one. I did this in school, and was impressed by it, but became far more appreciative when I tried to do it the JPEG way. The format doesn't even provide a code table, just a histogram!
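The "pop two nodes, push one" loop really is the whole construction. A minimal sketch in Python (the tuple-tree representation and tiebreaker counter are my own choices, not anything canonical):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # Build a heap of (frequency, tiebreaker, node) entries from the histogram.
    freq = Counter(text)
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    # Repeatedly pop the two least-frequent nodes and push their merged parent.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        count += 1
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
    # Walk the tree to assign bit strings: 0 for left, 1 for right.
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"  # lone-symbol input still gets a bit
    walk(heap[0][2], "")
    return codes
```

Running it on something like `"aaaabbc"` shows the expected behavior: the most frequent symbol gets the shortest code.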
It also served as the basis for its successor, arithmetic coding, which is in pretty much every modern video codec. Can you imagine a world that is still analog because we couldn't figure out how to transmit digital video, images, or audio? Huffman is a key link in the chain between past and present.
My gut reaction to the question was "the one I can understand and apply", but I think the Huffman algorithm added a third criterion for me: it expanded the way I thought about computing in general. It is relatively simple, but it is clever, and it makes you feel clever to understand it when your experience with algorithms is limited and your relationship with them is tenuous.
I like the fast Fourier transform method for fast integer multiplication. It's only worthwhile on really big inputs, but the idea is that it computes the FFT of both integers (viewed as vectors of digits), does point-wise multiplication of the resulting vectors, then does the inverse FFT to recover the product. More about this: https://en.wikipedia.org/wiki/Multiplication_algorithm#Fouri...
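The idea fits in a short sketch: FFT the digit vectors, multiply point-wise, inverse-FFT, then propagate carries. This toy version (my own simplification, using decimal digits and a naive recursive radix-2 FFT rather than anything production-grade) is far slower than real implementations but shows the structure:

```python
import cmath

def fft(a, invert=False):
    # Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return list(a)
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def fft_multiply(x, y):
    # Treat each integer as a little-endian vector of decimal digits.
    a = [int(d) for d in str(x)[::-1]]
    b = [int(d) for d in str(y)[::-1]]
    n = 1
    while n < len(a) + len(b):
        n *= 2
    a += [0] * (n - len(a))
    b += [0] * (n - len(b))
    # FFT both, multiply point-wise, inverse-FFT to get the digit convolution.
    fc = [u * v for u, v in zip(fft(a), fft(b))]
    c = [round(v.real / n) for v in fft(fc, invert=True)]
    # Carry propagation turns the convolution back into an integer.
    result, carry = 0, 0
    for i, d in enumerate(c):
        carry += d
        result += (carry % 10) * 10 ** i
        carry //= 10
    return result
```

The floating-point rounding is fine at toy sizes; serious implementations use a number-theoretic transform to avoid it.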
Another interesting but asymptotically slower integer/polynomial multiplication algorithm is Karatsuba's.
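Karatsuba's trick is that splitting each number in half needs only three recursive multiplications instead of four. A compact sketch (decimal splitting chosen for readability; real versions split on binary words):

```python
def karatsuba(x, y):
    # Base case: single digits fall back to ordinary multiplication.
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around the midpoint m.
    m = max(len(str(x)), len(str(y))) // 2
    hi_x, lo_x = divmod(x, 10 ** m)
    hi_y, lo_y = divmod(y, 10 ** m)
    # Three recursive multiplications instead of four:
    z0 = karatsuba(lo_x, lo_y)
    z2 = karatsuba(hi_x, hi_y)
    z1 = karatsuba(lo_x + hi_x, lo_y + hi_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0
```

That third product `(lo_x + hi_x)(lo_y + hi_y)` minus the other two recovers the cross terms, which is the whole insight.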
I think the RSA algorithm is fascinating, just because of the impact it has had on the industry. Imagine a world without public-key encryption. Crazy right? (Obviously there would be other ways to achieve public-key encryption if RSA had never come along... but you get my point).
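The whole keygen/encrypt/decrypt cycle fits in a few lines. A textbook-toy sketch (tiny primes, no padding, purely illustrative; the prime and exponent choices here are the classic classroom example, not anything secure):

```python
def toy_rsa_keys():
    # Tiny primes for illustration only; real RSA uses primes of ~1024+ bits.
    p, q = 61, 53
    n = p * q                 # public modulus
    phi = (p - 1) * (q - 1)   # Euler's totient of n
    e = 17                    # public exponent, coprime with phi
    d = pow(e, -1, phi)       # private exponent: modular inverse (Python 3.8+)
    return e, d, n

e, d, n = toy_rsa_keys()
message = 42
cipher = pow(message, e, n)        # encrypt with the public key
plain = pow(cipher, d, n)          # decrypt with the private key
```

The asymmetry is the magic: anyone can compute `pow(m, e, n)`, but recovering `d` requires factoring `n`.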
Last year I learnt the Ford-Fulkerson [0] and Dijkstra [1] algorithms. Algorithms for graphs are really interesting and have a lot of applications. Fun fact: I learnt these in a transport engineering course, not a computer science one, and I still use them sometimes in computer science courses.
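Dijkstra in particular is tiny once you have a heap. A minimal sketch (the adjacency-dict representation is my own choice for brevity):

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, weight), ...]} with non-negative weights.
    # Returns a dict of shortest distances from source.
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already found a shorter path
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```

It's easy to see the transport-engineering appeal: nodes as intersections, weights as travel times.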
The union-find data structure (the idea is so simple and a sophisticated analysis yields a really low upper bound).
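The simplicity really is striking: with path compression and union by rank, the whole structure is a dozen lines, yet the amortized bound is inverse-Ackermann. A sketch:

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path halving: point every other visited node at its grandparent.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False  # already in the same set
        # Union by rank keeps the trees shallow.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```

Either optimization alone already gives a good bound; together they give the famous near-constant amortized time.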
RMQ with linear preprocessing and constant time queries (which is also true for multi-dimensional cases, if the dimension is bounded by a constant).
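The truly linear-preprocessing scheme (block decomposition over Cartesian-tree types) is involved, but its workhorse is the sparse table, which alone gets O(n log n) preprocessing with O(1) queries. A sketch of that simpler variant:

```python
class SparseTable:
    # O(n log n) preprocessing, O(1) min queries on a static array.
    def __init__(self, a):
        self.log = [0] * (len(a) + 1)
        for i in range(2, len(a) + 1):
            self.log[i] = self.log[i // 2] + 1
        # table[j][i] = min of a[i : i + 2**j]
        self.table = [a[:]]
        for j in range(1, self.log[len(a)] + 1):
            prev, step = self.table[-1], 1 << (j - 1)
            self.table.append([min(prev[i], prev[i + step])
                               for i in range(len(a) - (1 << j) + 1)])

    def query(self, l, r):
        # Minimum of a[l:r]: cover the range with two overlapping power-of-two
        # windows; min is idempotent, so the overlap is harmless.
        j = self.log[r - l]
        return min(self.table[j][l], self.table[j][r - (1 << j)])
```

The linear-time construction layers fixed-size blocks on top of exactly this table, querying it over block minima.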
Knuth-Morris-Pratt pattern matching is the first algorithm where I saw the usage of an additional variable in the pseudocode just to simplify the complexity analysis.
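That variable is the matched-prefix length `k`: it is reused across iterations, and arguing that it increases at most once per character is what makes the amortized linear bound fall out. A sketch:

```python
def kmp_search(text, pattern):
    # fail[i] = length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0  # the extra variable: current matched prefix length
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, falling back via the failure function on mismatch.
    hits, k = [], 0
    for i, c in enumerate(text):
        while k > 0 and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)  # full match ending at i
            k = fail[k - 1]
    return hits
```

Since `k` only grows by one per character and every inner-loop step shrinks it, the total work is O(n + m).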