This is a cute example, but the problem could probably be solved much more efficiently with standard optimization methods, like Nelder-Mead. See Numerical Recipes...
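Roughly, something like this with SciPy would do it (the sample points and the cubic used to generate them below are placeholders, not the ones from your article):

```python
# Hypothetical sketch: recovering polynomial coefficients with the
# Nelder-Mead simplex method via SciPy, instead of a genetic algorithm.
# The sample points and the "true" cubic are made up for illustration.
import numpy as np
from scipy.optimize import minimize

xs = np.linspace(-5, 5, 30)
ys = 1.0 + 2.0 * xs - 3.0 * xs**2 + 0.5 * xs**3  # placeholder target polynomial

def sse(coeffs):
    # Sum of squared errors between a candidate polynomial and the samples.
    return np.sum((np.polyval(coeffs, xs) - ys) ** 2)

res = minimize(sse, x0=np.zeros(4), method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8, 'maxiter': 5000})
print(res.x)  # approximately [0.5, -3.0, 2.0, 1.0], highest degree first
```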
It is a cute example. It's meant to teach Pyevolve and basic genetic algorithms with an example that doesn't seem too contrived. But thank you for noting that -- I'll add it to the article as a disclaimer.
Thanks for writing this up. Seeing "cute" examples like this is actually helpful, because I don't need to learn a particular application area while reading about the solution method. It's a great way to get me started thinking about new ways of approaching old problems.
While you play around with how the algorithm parameters affect its ability to find the polynomial coefficients, it would also be nice to see how adding some random noise to the sample points affects it. This would make your example more realistic, because with noisy data you'll never be able to generate a perfect fit.
I do plan to try to add some Gaussian noise to the sample points -- thank you for reminding me. I also hope to find ways to optimize the fitness function and selection function. I think there is much that can be done to fix it up.
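Something along these lines is what I have in mind (the generating polynomial and the noise level here are just placeholders, not necessarily what will end up in the article):

```python
# Hypothetical sketch: perturbing the sample points with Gaussian noise
# before handing them to the GA. The polynomial and sigma are arbitrary.
import numpy as np

xs = np.linspace(-5, 5, 30)
clean_ys = 1.0 + 2.0 * xs - 3.0 * xs**2 + 0.5 * xs**3
noisy_ys = clean_ys + np.random.normal(0.0, 2.0, size=xs.shape)  # sigma = 2.0

# With noisy samples no coefficient set reproduces the data exactly, so the
# fitness function can only drive the population toward a best fit rather
# than a perfect one.
```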