Every installation of this program needs to come with a statistician to stand over your shoulder and hit you with a ruler and shout "That's data mining, not a controlled experiment! No causality for you!"
I don't mean to imply that these aren't powerful tools in the right hands; the program definitely looks like a well-designed tool implementing a good range of modern techniques. Just saying: every time scientists get a new toy like this to play with, we get a whole lot of bad publications with pretty graphs. And don't get me started on the let's-predict-the-stock-market crowd.
Not sure why you're being voted down, as this was the first thing I thought. Admittedly, I'm not a statistician, so perhaps someone else can explain why this kind of thing is valuable?
Two big advantages, as far as I can see (there are undoubtedly more):
1- As experimentalists do experiments, they often arrive at these same conjectures, but only after years of trial and error. Throughout history there were probably mountains of data that were recorded but discarded after being judged to have no scientific value. A program like this can essentially narrow the search for you: it lets you get back to doing science and spend less time on data processing, which for the majority of scientists is a godsend.
2- You can now take historical data sets and see if you can recover new aspects of structure. Perhaps there were important features in the data that were overlooked and can now be statistically analyzed. This can lead to new research avenues.
I have only a limited understanding of the details of BACON, but from what I remember it was what is called a "production system", somewhat akin to an evolving expert system. Something like BACON starts out with a set of rules and then tweaks them, looking for patterns that can be generalized into new rules. Novel for the early '80s, but in practice these systems seemed limited to very specific domains.
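To make that concrete, here is a hand-unrolled sketch of the kind of heuristic usually attributed to BACON.1 (this is my paraphrase, not the original Lisp system): if two terms rise together, consider their ratio; if one rises while the other falls, consider their product; stop when a term comes out constant. Applied to (approximate) planetary data, three applications of the rule recover Kepler's third law:

```python
# Approximate planetary data: semi-major axis a (AU), period T (years)
planets = {"Mercury": (0.387, 0.241), "Venus": (0.723, 0.615),
           "Earth": (1.0, 1.0), "Mars": (1.524, 1.881),
           "Jupiter": (5.203, 11.862)}

a = [x for x, _ in planets.values()]
T = [t for _, t in planets.values()]

def increasing(xs):
    return all(x2 > x1 for x1, x2 in zip(xs, xs[1:]))

def constant(xs, tol=0.02):
    # "Constant" up to measurement noise: all values within tol of the mean
    m = sum(xs) / len(xs)
    return all(abs(x - m) <= tol * m for x in xs)

# Heuristic: terms rising together -> take their ratio;
# one rising while the other falls -> take their product.
r1 = [t / x for x, t in zip(a, T)]    # T rises with a       -> try T/a
assert increasing(r1) and not constant(r1)
r2 = [r / x for x, r in zip(a, r1)]   # T/a still rises with a -> try T/a^2
assert not increasing(r2) and not constant(r2)
r3 = [p * q for p, q in zip(r1, r2)]  # T/a rises as T/a^2 falls -> try T^2/a^3
assert constant(r3)                   # Kepler's third law: T^2/a^3 ~ const
print([round(v, 4) for v in r3])
```

The real system searched over such term combinations automatically rather than following a fixed script; this just shows why the rule set is so domain-sensitive.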
Eureqa is a genetic programming system (running over a graph encoding, bonus points for that one :) that evolves a set of rules from basic components to fit a set of data and, hopefully, make future predictions. The general concept is not really new -- some of Koza's early examples of the power of GP were evolving Kepler's third law from astronomical data points and evolving Ohm's law in a similar fashion[1] -- but developing a general toolkit and packaging it as a useful tool is somewhat novel. One thing that probably distinguishes Eureqa is that I would expect its output to be more parsimonious than that of a production system like BACON. A system like BACON is also more complex, involving various layers of detectors and rule emitters, while a GP system can be very simple once you have figured out the right set of operators for a problem domain.
[1] It should be noted that BACON did these tasks as well, so they are more of a general rule-building yardstick and put GP at a similar level to the well-developed (and well-funded) production systems research of the era.
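For flavor, here is a toy GP symbolic-regression loop in the spirit of Koza's Kepler example (not Eureqa's actual algorithm; the operator set, selection scheme, and tolerances are all my choices): evolve arithmetic expression trees over the variable x = a until one fits the pairs (a, T^2), where the true relation is T^2 = a^3.

```python
import random
import math

# Approximate planetary data as (a in AU, T^2 in years^2); target: T^2 = a^3
DATA = [(0.387, 0.241 ** 2), (0.723, 0.615 ** 2), (1.0, 1.0),
        (1.524, 1.881 ** 2), (5.203, 11.862 ** 2)]

OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
       '*': lambda a, b: a * b,
       '/': lambda a, b: a / b if abs(b) > 1e-9 else 1.0}  # protected divide

def random_tree(depth=3):
    # Leaves are the variable 'x' or a small constant; internal nodes are ops
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.7 else round(random.uniform(0.5, 2), 2)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Sum of squared errors over the data set (lower is better)
    err = sum((evaluate(tree, a) - t) ** 2 for a, t in DATA)
    return err if math.isfinite(err) else float('inf')

def mutate(tree):
    # Replace the whole tree, or recurse into one child
    if random.random() < 0.3 or not isinstance(tree, tuple):
        return random_tree(2)
    op, left, right = tree
    return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))

def evolve(pop_size=200, generations=60):
    random.seed(1)
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 4]  # truncation selection keeps the best
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The exact expression that comes out varies with the random seed; the known answer `('*', 'x', ('*', 'x', 'x'))` (i.e. x*x*x) fits the data almost perfectly, and parsimony pressure -- which Eureqa reportedly applies and this sketch omits -- is what keeps real runs from drowning in bloated but equivalent trees.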
There is a report generator that suggests experiments based on the data. Don't take that as a recommendation for the product, however; I'm still test-driving it.