Don't waste your expensive and valuable college time on software engineering tools, source control, and other mundane crap that any monkey can learn quickly. Every programming tool that I learned about in college is now either unused (CVS, RCS, Motif), or dismissed by the l33t rock-stars as dinosaur technology (C++, Java, Perl). Today's l33t tools will be just as dead in ten years. Learn enough to do your assignments well, but view the time invested as a sunk cost.
If you want to learn how to be a coder, go to DeVry, or read some O'Reilly books and hack away. Your career will be mercifully short and uneventful.
If you want to be a computer scientist, spend your time learning math and theory, and learn it inside out. Then take business classes, chemistry classes, language classes, art classes -- anything to make you marketable in a non-technical way. The only way you're going to survive to old age in software (if that's even possible) is by acquiring talents that grow more valuable with age and experience -- skills that can't be cheaply exported to the next younger guy who is willing to work 80 hours a week for low wages and free soda.
If you want to learn how to be a coder, go to DeVry
It's worth noting that neither a DeVry "degree" nor a Computer Science degree will make you a programmer. If you just get the DeVry education, you will know what functions you can call in the Java core library. If you just get a CS degree, you will know what B+ trees are. But neither of those alone gets you a working database management system; that requires excellent coding skill and an excellent understanding of the underlying mathematics. "The real world" requires that you have both sets of skills; having only half the education will make you less than half the programmer.
I think this is a wider problem in our field; the "academics" don't want to admit that the "industrial" aspect of programming is important, and the practitioners don't want to admit that the academics are quite often onto some very good ideas. Turns out that both sides are important. The practitioner doesn't want to waste time reinventing simple concepts (which is why we have code organization techniques that were popular in academia long before they were popular in industry), and the academic doesn't want to reinvent industry (turns out that knowing about automated testing and source control makes writing your new researchy programming language a whole lot easier).
So it's clear that to be an effective programmer, you need to be well-versed in both the practical and academic aspects of the field. Dismissing the practical aspects by saying they are a waste of time and that some dude from a trade school can handle them for you is ... short-sighted.
(Oh, and if Perl is a "dinosaur technology", Java, C++, and C are "pre-life self-replicating molecule" technologies. Just sayin'.)
The people I've worked with who are heavy on the academic side (MS, PhD, etc.) with little real-world experience have, without fail, been very mediocre and have had a hard time getting stuff done. They are either architecture astronauts or they try to turn everything into a thesis paper. Bonus: I once saw someone with an MS in some computer-related field remove RAM from a running computer.
My experiences with people who have had no academic experience but lots of real-world experience have been just as bad. I recently worked at a place where, out of 14 programmers, only myself and three others had a degree. They all, to some extent, shared the same qualities: they can get stuff done (on small projects), but usually in really dumb and expensive ways; they display a poor aptitude for learning new skills and techniques; and they can't think abstractly.
I don't mean to say that a degree is everything or will automatically make you a better programmer, but in getting one, you gain a foundation that you wouldn't otherwise get from work experience alone.
"So it's clear that to be an effective programmer, you need to be well-versed in both the practical and academic aspects of the field. Dismissing the practical aspects by saying they are a waste of time and that some dude from a trade school can handle them for you is ... short-sighted."
That's not what I said, though. My opinion is that if you want to have longevity in your career, you need to focus on the theory. That doesn't mean that you don't learn the other stuff -- you just learn what's necessary, and move on. More to the point: when you're spending big bucks on college classes, you'd better be devoting your time to learning stuff that you can't learn from a few hours of quality time in a coffee shop with an O'Reilly book and a laptop.
For what it's worth, though, I've known many CS professors and many industry programmers, and I wouldn't put the average of one group ahead of the other in terms of programming skill. The idea that academics don't know "industry" is a myth (but there are plenty of coders who don't know anything about algorithms).
Also for the record: I like Perl. I'm not the one calling it a dinosaur technology. I also like C++, though, so maybe I'm just old.
Why can't you learn data structures from a book in a coffee shop? I did.
I would argue that you can't learn the theory until you have enough practical experience to actually implement and play with the theoretical concepts you are learning about. By not learning "the trivial stuff you can learn from an O'Reilly book in a coffee shop," you are wasting your own time. The sooner you stop wasting your own time, the less of it you will waste.
If you divide the sticker price of my college education by the number of class hours, the implication is that one hour of instruction costs about $80. In the harsh light of that fact, I would still have paid a few hundred bucks to learn CVS or SVN in college rather than learning bad habits. My first two jobs programming (academic and quasi-academic) didn't use source control, and I kept my bad habits until I got into industry and was dragged kicking and screaming into professionalism. I think source control should be taught starting the first day of CS101. If the exact tool changes in 5 years, oh well, you can learn the new tool. But it should be an automatic, instantaneous, ingrained part of your process from day one. (Ditto IDEs and basic Unix system administration.)
Four courses that have been worth substantially more than $80 an hour to me: Japanese, Technical Writing, AI (mostly because it really should have been called Introduction To Scripting Languages), and (weirdly enough) my single course on assembly. That last one was entirely due to a twenty-minute discussion with my professor that stuck with me, the general gist of which was "Any performance problem can be solved by caching, if you do it right." I haven't programmed a single line of assembly in my professional career, but every time a performance problem comes up I cache and the problem goes away. (And is replaced by a cache expiration problem.)
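To make that concrete, here's a minimal sketch of the pattern (Python; `fetch_report` and the 60-second TTL are made up for illustration, not from any real system): wrap the slow call in a cache with a time-to-live, and the performance problem goes away -- until you have to decide when the cached answer is too stale.

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds):
    """Cache a function's results for ttl_seconds. Minimal sketch; not thread-safe."""
    def decorator(fn):
        store = {}  # args -> (timestamp, result)

        @wraps(fn)
        def wrapper(*args):
            now = time.time()
            hit = store.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]           # fresh enough: performance problem solved
            result = fn(*args)          # miss or stale: pay for the slow call again
            store[args] = (now, result)
            return result
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=60)
def fetch_report(customer_id):
    # stand-in for whatever is expensive (database query, RPC, big computation)
    time.sleep(2)
    return {"customer": customer_id, "total": 42}

fetch_report(7)   # slow: takes the full 2 seconds
fetch_report(7)   # instant, served from the cache
# ...and now the real question is whether 60 seconds is the right TTL:
# that's the cache expiration problem.
```

In practice you'd probably reach for something off the shelf (`functools.lru_cache`, memcached, whatever), but the TTL knob is where the expiration problem lives no matter which tool you pick.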
I think source control should be taught starting the first day of CS101.
I agree with the general sentiment, but I think this should instead be taught the first time students need to work in larger groups (more than two people). Until they have suffered from not having version control, no one will care. Now, I'm not saying that we should only focus on what students think they need, but if there is a way to force them to understand the value of something, all the better.
In my opinion, the computer scientist is focused more on the academic side and completes his work in the form of a paper. The programmer, on the other hand, actually builds a product that is intended to be used. They ignore this distinction at my school (UMass Boston). The general consensus seems to be that programming is a lowly task.
Perhaps your experiences are different, but at the end of the day there is a sharp conflict of interest between the computer scientist and the programmer. Both want to solve problems, but the computer scientist seems more interested in some math problem and the programmer is more interested in building cool shit.
Computer science is as much about analyzing the properties of computation as it is about writing programs. Graph theory, computational complexity, data structures and algorithm design and analysis, etc. are all computer science topics.
Knowing the "theory" of computer systems (operating systems, networks, compilers, etc.) isn't worth a damn thing if you can't MAKE and MODIFY one. Software engineering tools matter because you'll never be able to make or modify a system of any complexity without first engaging with them (to some extent).
The broken assumption here is that you can become a competent computer scientist by doing assignments
Seriously. Learn to think creatively and abstractly about problems, which doesn't necessarily come from being in over your eyeballs in code. Your ability to think outside the box will help you more than any computer science class ever could.