
> Sure, for my project, I'm typing in code in Visual Basic .NET. For writing code today, all things considered for my context, that is about the best option.

Maybe you should try something other than VB.NET, then. There are ways of constructing correct code, if you're patient enough, that is. If you're using the CLR, why not use F# instead of VB? That'd be a big step towards being able to prove some things about your code. Or if you really want to take out the big guns, go for Coq, Isabelle, Agda, Epigram, etc.
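To make the type-system point concrete, here's a minimal sketch of the idea — in TypeScript rather than F#, since tagged unions with exhaustiveness checking work the same way in both: the compiler itself checks a property of the code (that every case is handled), with no tests or runtime checks needed.

```typescript
// A tagged union: a value is exactly one of these shapes, and the
// compiler knows the complete set of alternatives.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; width: number; height: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius * s.radius;
    case "rect":
      return s.width * s.height;
    default: {
      // Exhaustiveness check: if a new Shape variant is ever added and
      // not handled above, this assignment stops compiling. The compiler
      // has effectively proved that area() covers every case.
      const unreachable: never = s;
      return unreachable;
    }
  }
}

console.log(area({ kind: "circle", radius: 1 }));         // ~3.14159
console.log(area({ kind: "rect", width: 3, height: 4 })); // 12
```

It's a small property, nothing like full verification in Coq or Isabelle, but it's the same direction: encode invariants so that incorrect programs fail to compile.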

> Can some software go over it, report properties, do some transformations with known, useful properties?

Yes, in many cases. There's a whole lot of work on static code analysis and refactoring tools. It's not perfect (the halting problem being undecidable and all that), but there have been _significant_ improvements since the Algol 60 days, and that's not counting functional languages and type-theoretic approaches.
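As an illustration of what such tools do, here's a toy static analysis (a sketch I made up for this comment, not any real analyzer's API): a sign analysis that reports a property of an expression without ever evaluating it — the basic move behind abstract interpretation.

```typescript
// Infer the sign of an arithmetic expression without computing its value.
type Sign = "pos" | "neg" | "zero" | "unknown";

type Expr =
  | { op: "lit"; value: number }
  | { op: "add"; left: Expr; right: Expr }
  | { op: "mul"; left: Expr; right: Expr };

function signOf(e: Expr): Sign {
  switch (e.op) {
    case "lit":
      return e.value > 0 ? "pos" : e.value < 0 ? "neg" : "zero";
    case "add": {
      const l = signOf(e.left), r = signOf(e.right);
      if (l === r) return l;        // pos+pos=pos, neg+neg=neg, 0+0=0
      if (l === "zero") return r;
      if (r === "zero") return l;
      return "unknown";             // pos+neg could be anything
    }
    case "mul": {
      const l = signOf(e.left), r = signOf(e.right);
      if (l === "zero" || r === "zero") return "zero";
      if (l === "unknown" || r === "unknown") return "unknown";
      return l === r ? "pos" : "neg"; // sign rule for products
    }
  }
}

// (-2) * (-3) + 5 : provably positive, no evaluation needed.
const example: Expr = {
  op: "add",
  left: { op: "mul", left: { op: "lit", value: -2 }, right: { op: "lit", value: -3 } },
  right: { op: "lit", value: 5 },
};
console.log(signOf(example)); // "pos"
```

Real analyzers work over much richer abstract domains (intervals, pointer aliasing, nullability), and the halting problem is exactly why they must approximate — hence "unknown" — but they're sound: when they do report a property, it holds.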

> Or, return to most of the rest of engineering

That goes a bit off topic, but at the stage we are in, for most practical purposes, "software engineering" is not really engineering. Except maybe when done by NASA, but then again, that's not a practical approach either.

> Also for another of your points, I'm not talking about anything like branch prediction or deep pipelining. That's basically what to do with the hardware to execute an existing instruction set.

Oh, but you were talking about compilers. Modern optimizing compilers have to take those things into account, among many other things.

> Clearly compiling and executing a program are necessarily mathematically something, understood or not, powerful or not. For progress, we need to understand the subject mathematically.

(from your previous post) My point was that very significant progress has been made in many fields, even if not necessarily formalized.

> Also the point is not math or not.

Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true.

> The point is progress.

And I've agreed with you on this. More and better maths can help advance CS. But we knew that already.

> Your point that there's a lot of good CS to do without 'mathematizing' the field is not promising for research or significant progress.

I contend that there has been significant progress in many CS areas without 'mathematizing' them. That is a fact. I also stated, in my previous post, that I agree that maths could help improve this progress. I think my problem with your position is that you're talking in absolutes in topics where those absolutes clearly don't hold.



"Sorry, but when you state that CS should be a footnote in a math book, you are kinda making the point that everything in CS (even in those sub-domains that are eminently practical) should be math-based to do anything meaningful. This is demonstrably not true."

I'm exaggerating, but there is still a reasonable point here: CS claims to be a 'science', so call the mathematical part, where we have some solid material actually worth the name 'science', the science, and call the rest 'computer practice' or some such.

The upshot of my view is that for serious progress we're going to have to use some serious methodology. So I'm proposing, if not mathematical-physics envy, then applied-math envy. Applied math didn't take on how to design the display lights in a scientific pocket calculator, although without the lights the thing wouldn't work.

My view is not the most extreme: Last year I communicated with a CS prof whose position was that CS is looking for the 'fundamentals of computation'. Hmm .... It sounds like he believes that the P versus NP question should be right at the top of the list, and I don't. I'll settle for anything that is solid and a contribution, even if small, to the 'science' and not just to current practice.

Likely many people here know the current state of programming language research much better than I do. If that field has gotten nicely mathematical with some solid material, great. The progress from Fortran, Cobol, Algol, Basic, PL/I, Pascal, C, C++, etc. was pragmatic, important, and of enormous value to the economy, but it was not much progress in a 'science' of programming languages, and that progress, with poor methodology, has slowed as we might have expected.

Maybe I'm saying that, in 1920, if we wanted a really good airplane, then we should have set aside the wood, linen, and glue, done some aerodynamics calculations, discovered the Reynolds number, and discovered that those really thin wings were a mistake. Or, observational astronomy was just a lot of curiosity until Newton came along and made progress in understanding the universe. The practical chemists had discovered a LOT, but by applying quantum mechanics they made HUGE progress.

If we are going to make the huge progress we want in computing, then history suggests that we can't be just pragmatic and that "theory is underrated". We can't expect that chemistry will do much to help CS, but the obvious tool is math: turn the 'science' part of CS into some applied math.



