Hacker News
Programmer feel-good quote (axisofeval.blogspot.com)
93 points by samstokes on Sept 6, 2010 | 30 comments


A quote by somebody who is not an astronomer, clearly (she said, while in the other workspace watching data that originated gazillions of miles away, taken by an instrument that works only because of obscure quantum effects).

:-P

Seriously, I think deciding your discipline is somehow superior to others is an open door to complacency. I think what I do is pretty cool, but so is what a lot of other people do. Not sure why I would try to outdo a neurosurgeon ("gee, you're only working across a 10^6 dynamic range").


I agree with you, but Dijkstra would too. He never claimed that computer science or software engineering (remember that there was no clear distinction back then) were somehow superior to other disciplines. On the contrary, in many of his writings he is lamenting the state of our profession.

The quote on the blog is somewhat out of context, and it reminds me of this:

The major cause [of the software crisis] is... that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem. In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products. To put it in another way: as the power of available machines grew by a factor of more than a thousand, society's ambition to apply these machines grew in proportion, and it was the poor programmer who found his job in this exploded field of tension between ends and means. The increased power of the hardware, together with the perhaps even more dramatic increase in its reliability, made solutions feasible that the programmer had not dared to dream about a few years before. And now, a few years later, he had to dream about them and, even worse, he had to transform such dreams into reality! Is it a wonder that we found ourselves in a software crisis? No, certainly not, and as you may guess, it was even predicted well in advance; but the trouble with minor prophets, of course, is that it is only five years later that you really know that they had been right. -- E. W. Dijkstra

http://userweb.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD...


Great comment. I think what people are missing here is that Dijkstra regards this difference as a bad thing, one that we are not capable of handling! It's not something to be proud of at all.


Dijkstra (eventually at least) distinguished between computer science and software engineering, about which he wrote disparaging things.



"There are 10^11 stars in the galaxy. That used to be a huge number. But it's only a hundred billion. It's less than the national deficit! We used to call them astronomical numbers. Now we should call them economical numbers."

-- Feynman (in the 80s when the deficit was that tiny)


Maybe true for Dijkstra, but not for most working programmers today. Generally you work on a layer of the software stack and manage maybe two or three orders of magnitude. What's below is effectively instantaneous, and what's above is just an arbitrary length of time.

Of course, the hardest bugs are the ones that cross those boundaries.


I'm the first to admit that I have a bit of computer nerd hubris: I constantly and secretly think to myself that I do the most complicated work in the company and this quote supports this self-aggrandising, vain and stupid thought.

I need to stop taking myself so seriously.


That kind of thinking helps ward off mediocrity. Whenever I deal with somebody in my company, I really hope they think their work is important, and I take that as a positive, even if their work isn't actually important and I'll only notice it if they fail to deliver (in which case it will just set me back a week).

Having a realistic perception of your own importance leads to depression and to performing at the bare minimum necessary to stay employed. I don't bother much with reality anymore; I prefer sanity and success to reality ;-)


I feel like the stuff I studied in school leads 90% of the grads in my field to making electronic doodads that no one needs or websites that contribute nothing to society. Working on something that might actually help another person in a serious way seems pretty rare these days.


You are not curing cancer.


Unless he is; there are computational biologists who try to use math/programming to cure cancer.


That's my point.


Although unlikely to be supported by the context, I think the quote would have been stronger if the ! in 10^9! were an operator and not punctuation.

While the Large Hadron Collider is experimenting with particles less than 10^-10 m in size over distances on the scale of 10^4 m, computer scientists can easily run into combinatorial explosions that would make 10^9! seem puny. (Not just computer scientists, either -- you can probably find code that tries to operate like that on The Daily WTF.)
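To get a sense of how large 10^9! actually is, here is a minimal Python sketch (it uses the standard library's math.lgamma so the factorial never has to be built; the 10^9 is just the figure from the quote):

    # How many decimal digits does 10^9! have?  math.lgamma(n + 1) returns
    # ln(n!), so we never have to materialise the factorial itself.
    import math

    n = 10**9
    digits = math.lgamma(n + 1) / math.log(10)   # log10 of n!
    print(f"10^9! has roughly {digits:.2e} decimal digits")
    # -> roughly 8.57e+09, i.e. a number with about 8.6 billion digits.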


Any time I meet one of these stuck-up physicists, that's what I'm going to lob in their face.

And they'll get even more stuck up, because physics as such deals with (but doesn't control) processes that range over even more orders of magnitude.

And the techniques of computer science and physics are similar: deliberate simplification. Planets are turned into mass dots, or at best spheres with uniform density; and customers are required to have an all-ASCII name. The difference is that physics relegates anything that can go wrong - building bridges, machines, cars - to people called engineers, whereas the likes of Dijkstra would rather have hyperintelligent theoreticians sort out these problems. (You can tell that he's never had an angry lynch mob of customers banging at his door because he implemented an incorrect specification instead of using common sense to find the specification error and getting back to the customer).


Spheres with uniform density behave exactly the same as mass dots.

Mathematicians can cook up giant numbers without blinking, like Graham's number. If that makes anyone feel good, then good for them. Somehow, all the quotes I read from Dijkstra leave an arrogant after-taste in my mouth.


Nice try, Alan Kay! <http://www.youtube.com/watch?v=s7ROTJKkhuI>

(For those who don't feel like watching: "Arrogance in computer science is measured in nano-Dijkstras.")

As a Dijkstra fan myself, though, I feel duty-bound to emphasize that condemning a claim because it was stated by an arrogant man is just a somewhat more subtle way of committing the ad hominem fallacy.


I don't think he's using the nano-Dijkstras comment as an argument, but as a side joke. He refutes Dijkstra on empirical grounds: the US way of programming got the job done, as illustrated by the fact that most of the software in the world was being produced there.


I think Kay's US-vs-Europe comparison is largely inappropriate regardless, because Dijkstra had long been writing his missives from Texas.


Treating the planets as dots is a simplification, but amazingly you don't lose any accuracy with respect to their motion under gravity. If the planets were turned into dots, their trajectories would be the same.
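That's the shell theorem: outside a spherically symmetric body, its gravitational field is exactly that of an equal point mass at its centre. Here is a minimal numerical sketch in Python -- the grid resolution, test distance and units are arbitrary choices of mine, not anything from the thread:

    # Numerical check of the shell theorem: the pull of a uniform sphere on an
    # outside point matches that of an equal point mass at its centre.
    # Arbitrary units: G = 1, total mass = 1, sphere radius = 1.
    import numpy as np

    R, N = 1.0, 80                               # sphere radius, grid cells per axis
    xs = np.linspace(-R, R, N)
    X, Y, Z = np.meshgrid(xs, xs, xs, indexing="ij")
    inside = X**2 + Y**2 + Z**2 <= R**2          # cells that belong to the sphere
    cell_mass = 1.0 / inside.sum()               # uniform density, total mass 1

    p = np.array([3.0, 0.0, 0.0])                # test point at three sphere radii
    d = np.stack([X[inside] - p[0], Y[inside] - p[1], Z[inside] - p[2]], axis=1)
    r = np.linalg.norm(d, axis=1)
    accel = (cell_mass * d / r[:, None]**3).sum(axis=0)   # sum of m*d/r^3 over cells

    print("summed over the sphere:", np.linalg.norm(accel))   # ~0.1111
    print("point-mass formula    :", 1.0 / 3.0**2)            # GM/r^2 = 1/9

Even with this crude voxel grid the two numbers agree to within a fraction of a percent, and the agreement improves as the grid is refined.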


In the case of a 3-body solar system, they wouldn't. But as soon as one of these mass dots tries to land (or stand) on one of the other mass dots, you'd see a difference.

If you say that this violates the assumptions of the basic model, you're right. But if you look at all the craters on the moon, the model assumptions are violated pretty often - the trick is recognizing when you need a different model, which is always easy in retrospect, but can be difficult to anticipate.


Yep, that's true. I just thought that it was interesting that the shape of a planet doesn't affect its trajectory under the laws of gravity alone (i.e. no collisions etc). It's also true that physicists often use explicit approximations, for example in statistical mechanics, or by using a linear approximation instead of an exact quantity. That's similar to calculating the asymptotic running time of an algorithm instead of the exact number of steps it takes (except that in the physics case the limit takes some quantity to zero to get a linear approximation, whereas for the algorithm we take the limit of the input size to infinity).
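To make that analogy concrete, here is a small Python sketch of my own: the exact number of comparisons insertion sort makes on random input, next to the asymptotic estimate n^2/4.

    # Exact comparison counts for insertion sort, versus the asymptotic estimate
    # n^2/4 (the expected number of inversions in a random permutation).
    import random

    def insertion_sort_comparisons(values):
        a = list(values)
        comparisons = 0
        for i in range(1, len(a)):
            j = i
            while j > 0:
                comparisons += 1                 # compare a[j-1] with a[j]
                if a[j - 1] <= a[j]:
                    break
                a[j - 1], a[j] = a[j], a[j - 1]  # out of order: swap and keep going
                j -= 1
        return comparisons

    random.seed(0)
    for n in (100, 1000, 10000):
        exact = insertion_sort_comparisons(random.random() for _ in range(n))
        print(n, exact, round(exact / (n * n / 4), 3))   # the ratio creeps toward 1

The exact count wiggles with the particular input and carries lower-order terms, but its ratio to n^2/4 tends toward 1 as n grows -- that lower-order detail is exactly what the asymptotic view deliberately drops.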


...hm, how many orders of magnitude down are the tidal forces this ignores?


Do tidal forces have any effect on the center-of-mass trajectories of planets?


They should: they transfer angular momentum from the Earth's rotation to the Moon's orbit. For example, the Earth is gradually spinning more slowly (as can be seen from daily/annual growth rings in corals), while the Moon is being pushed into a higher orbit and getting farther away.


This reminds me of this very very deep Mandelbrot fractal zoom:

"The final magnification ... is 6.066e+228 (2^760)"

http://www.youtube.com/watch?v=foxD6ZQlnlU


Also applicable to structural engineers. When I was still practicing I remember calculating what a single thread of a bolt could withstand (something irregular was being done, can't remember what) for a structural piece of a replacement section for the truss of the sky dome.


What an awesome blog name...


"From a bit to a few hundred megabytes, from a microsecond to a half an hour of computing confronts us with completely baffling ratio of 109! The programmer is in the unique position that his is the only discipline and profession in which such a gigantic ratio, which totally baffles our imagination, has to be bridged by a single technology. He has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before." — E.W. Dijkstra


That is a nice quote, but as a biochemist I'm confronted with such ratios all the time for things like concentrations and binding constants. Like 10^-12 (picomolar) versus 10^-3 (millimolar).

I'm pretty sure that's why the metric system and SI units were invented...
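For what it's worth, that span works out to the same order of magnitude Dijkstra is talking about (a trivial check):

    # Millimolar versus picomolar: exponents -3 and -12, i.e. a span of
    # -3 - (-12) = 9 orders of magnitude -- the same 10^9 ratio as Dijkstra's.
    print(10**(-3 - (-12)))   # 1000000000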



