
A really nice idea ... but why, oh why did they have to choose a language that has a lot of extra scaffolding like Java instead of something simpler like Python, or Ruby?


I'm guessing it's because "CSE 8A. Introduction to Computer Science" at UCSD is taught in Java.

A lot of universities start their CS students off with Java these days.


I've seen this in a couple of places, but I'm not sure I understand why. Does anybody know of a good reason for teaching Java first?


There are a few big reasons schools still teach Java primarily: adoption, scalability, and maturity. According to almost every TIOBE language survey, Java has been and is still either #1 or #2. It is pervasive in just about every enterprise IT environment throughout the entire world (whether you realize it or not), running on a ~billion devices and, now, Android phones as well.

Furthermore, it has very mature support for everything from GUIs (Swing) to web development (Servlets) to high-powered concurrency to advanced data structures. There are literally thousands of libraries and frameworks for every need and a wealth of talent and knowledge out there. Its performance is also orders of magnitude better than that of interpreted languages like PHP, Python, or Ruby. It simply scales better in just about every way and is universally supported almost everywhere.

So, to summarize, for all the reasons listed above Java is the bread and butter of enterprise software. Personally, I prefer Python and C#, but Java is unavoidable if you want a good paying job or are serious about building a high-performance, platform-independent, scalable application.


Yes, but teaching Java before graduation is different from teaching it first. You can get all the benefits you mentioned of learning Java even if it's the second or third language while using something nicer for the first one (Python? Scheme? SML?).


Not so. Learning on an interpreted language is akin to learning how to fly a plane by simply riding in one. There is a lot to learn in a CS program, and it really requires a single language from start to finish. Python doesn't have some of the advanced capabilities and adoption of an enterprise-class language like Java. Believe me, I love Python - I've created a ton of apps with it, but I also have a CS degree and have seen the practicality of using a single language from start to finish.

How are you going to reinforce the notion of how a compiler works with an interpreted language? How would you teach and force the use of static types on a dynamically typed language? How would you teach true concurrency and thread safety with a language that is limited by a global interpreter lock (GIL)? These are introductory things a CS student would take in their first year.
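
To make that last point concrete, here is a minimal sketch (my own toy example, not from any particular curriculum; the class name is just for illustration) of the kind of thread-safety exercise that is straightforward to run in Java:

    // Two threads hammer a shared counter. Remove "synchronized" and the
    // final count usually comes up short, which makes the data race
    // visible to students; with it, the result is always 2,000,000.
    public class RaceDemo {
        private static int counter = 0;

        private static synchronized void increment() {
            counter++;
        }

        public static void main(String[] args) throws InterruptedException {
            Runnable task = () -> {
                for (int i = 0; i < 1_000_000; i++) {
                    increment();
                }
            };
            Thread a = new Thread(task);
            Thread b = new Thread(task);
            a.start();
            b.start();
            a.join();
            b.join();
            System.out.println(counter);
        }
    }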


I wrote my compiler in Python, and I know perfectly well what a compiler does.

It slows down development. :)

Pointer math and garbage collection are lessons better taught in C++, lessons about the stack make more sense in assembly, and it probably helps to learn about high level design patterns using a simple syntax (like Ruby or Python). Lisp, so I hear, teaches you pure enlightenment. We can play this game with any language we like.

Java, for its part, taught me a lot about installing a massive IDE and scouring pre-existing libraries full of absurd design decisions with nominal documentation. (To be fair, this is probably an enormously important lesson for working on large projects.)

Despite the fact none of these languages hits all of these notes, if we had to choose one, Java would be last on my list. Because in Java, there's a risk that you will never learn the most important thing about coding: that it can actually be fun.

http://xkcd.com/353/


I did mention SML, which is a compiled language commonly regarded as an excellent language for writing compilers. I don't believe your analogy of interpreted languages being like riding a plane is accurate. If you want to be "closer to the metal" or whatever, start with C or assembly. But actually, computational theory is done with Turing machines, which no one uses directly in real life, and lambda calculus, which is far closer to Scheme than Java. Which is more appropriate for learning the deepest basics of programming?

As for using a single language, I really don't know. I'm self-taught so far. I can't imagine a good CS degree not using multiple languages to some degree. Can you do everything from process scheduling to web dev with one language, and do it well?

Further, I'm not sure an "enterprise-class" language is appropriate for learning. Maybe it's just all the OO ceremony that bugs me. I was watching a friend try to explain some C++ code to some ostensibly interested but inexperienced people. He kept having to say "you'll learn what this does later, it just has to be there..." about stuff like includes, int main, and type declarations. That's no way to learn. Java is better, but not much. "What's a class?" they ask on their first day. A distraction, mostly.


> Learning on an interpreted language [...]

Interpretation or not is a property of the implementation, not of the language. And it's a sliding scale as well, as you can see with Java.


How are you going to teach pointers or recursion in Java, though?

More on that from Joel Spolsky here: http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...

> There is a lot to learn in a CS program, and it really requires a single language from start to finish.

There is really no single language that you can use to teach everything that a CS student needs to learn. A computer science education should teach concepts that transcend languages, and use whichever language is best suited to demonstrate the concept at hand.


> How are you going to teach pointers or recursion in Java, though?

Pointers, okay sure, but what's the problem with teaching recursion in Java? See, e.g., http://danzig.jct.ac.il/java_class/recursion.html


Sure, recursion is possible in Java, but it's rarely used because of the inevitability of hitting a StackOverflowError given a sufficiently large value of n.
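
For the curious, a rough sketch of what that looks like (my own example; the exact depth at which it blows up depends on the JVM's stack size):

    // Recursing n levels deep: the JVM performs no tail-call elimination,
    // so every call adds a stack frame, and a large enough n always ends
    // in StackOverflowError under default stack sizes.
    public class DepthDemo {
        static void descend(int n) {
            if (n == 0) return;
            descend(n - 1);
        }

        public static void main(String[] args) {
            descend(1_000);       // fine
            descend(10_000_000);  // throws StackOverflowError
        }
    }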

Teaching recursion doesn't make a whole lot of sense unless you're using a functional language where loop iteration is not the paradigm.


> Sure, recursion is possible in Java, but it's rarely used because of the inevitability of hitting a StackOverflowError given a sufficiently large value of n.

That's a good reason for not using recursive algorithms in production Java code, but it doesn't make Java unsuitable for teaching CS including recursion (Python, which IMO is a much better general purpose teaching language than Java, faces the same problem; plenty of intro CS classes that cover recursion well use Python.)

> Teaching recursion doesn't make a whole lot of sense unless you're using a functional language where loop iteration is not the paradigm.

It makes perfect sense independently of what is idiomatic in the language: teaching CS using a language isn't teaching the idioms of the language. Teaching iteration and recursion in the same language is useful, particularly because it underlines that recursion and iteration are different approaches to the same problems, even though in most languages one of the two will tend to be more idiomatic.
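
To sketch what I mean (my own example, in Java since that's the language under discussion): the same problem solved both ways, side by side.

    // Same problem, two approaches: the loop is the idiomatic Java
    // solution, while the recursive version maps directly onto the
    // definition n! = n * (n-1)!.
    public class Factorial {
        static long iterative(int n) {
            long result = 1;
            for (int i = 2; i <= n; i++) {
                result *= i;
            }
            return result;
        }

        static long recursive(int n) {
            return (n <= 1) ? 1 : n * recursive(n - 1);
        }

        public static void main(String[] args) {
            System.out.println(iterative(10)); // 3628800
            System.out.println(recursive(10)); // 3628800
        }
    }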


> There is a lot to learn in a CS program, and it really requires a single language from start to finish.

Why would a CS program require one language from start to finish?! Wouldn't exposing students and researchers to as many languages and technologies as possible be the actual point? That is how you get people to come up with new ideas: you expose them to as many existing ideas as possible, in as many forms and languages as possible. If I were to design a CS curriculum, language-wise, I'd first expose people to Scheme via SICP, then push them from theory to practice with a language like Common Lisp. On a parallel track, more related to EE, I'd move up from an idealized, teaching-oriented assembly to real assembly to C and then to very basic C++, then expose them to "engineery" concepts like OOP in two ways at the same time (the classic C++ way and the multiple-dispatch way in CL). On a third, more "pure CS" track, I'd have students choose any of the languages or technologies from the other two tracks to solve the programming parts of the theoretical problems. Yes, grading and teaching a class of such "polyglots" with their minds all over the map would be hell for any teacher, same as managing a research project with the same maddening diversity, but IMHO the people who are mentally equipped to handle such environments should be the ones teaching the new generations, not the ivory-tower, one-problem-focused types.

Don't get me wrong, I'm against this new "hyperpolyglot" wave of using half a dozen languages in one project, but I'm against it in industry, from a business perspective, because it wastes brainpower that could be put to other use and bring actual business value. I'm not against it in academia, because there the whole point is to explore everything, and you don't really have ugly time constraints or the need to provide "quality of service" assurances to real customers. Something like a PhD program lasts a few years, and at the end of it you don't even have to deliver a product that works in the real world (supports a certain load, isn't littered with security exploits, can be extended to easily match or surpass the competition on new features, etc.).

Maybe I have a weird way of looking at things, but the whole software universe seems upside down to me: academia and CS programs limit themselves to a very small number of technologies and problems, and so do very little exploration of the problem space, while industry has jumped into a madness of polyglot-everything, with 100 ways of doing everything, none proven to work better for a particular case, and 90% of the time spent learning the tools instead of solving the problems. Shouldn't industry standardize and academia explore? Why are things the other way around in the IT world?


Very few CS students do any concurrency in their first year. Your first year, you're just learning Java.


Universities in the US don't teach languages in general. At my school, you get 1 semester of Java and that's it. The next class people take that uses something else is compilers, and that uses C. You're just expected to know it.

The usual justification is "we're not a vocational college" which I think is kind of bullshit.


Wait, is your take on this that their attitude is not good?

We got one quarter of C++, then everything else was "you are all grownups, use whatever the hell you want so long as it works on [cs cluster]". Most people continued to use C++, of course. It was not until an intro to languages course (preceding a few compilers courses, if you elected to take them) that course material again included learning a new language (Scheme).

This, aside from the obnoxious first quarter, seems like the absolute proper way to run a CS department. A bunch of courses teaching different languages, as I would expect from some sort of "vocational college", is absolutely not what I was paying for.


It's not so much courses teaching languages, it's the lack of software engineering in general. There's one course here which is a joke and not taught until your second to last semester. And then people turn in absolute barf code in their homework.

Then people graduate and start a job, and they can't code their way out of a paper bag, but they sure can write the tuple relational calculus for some given query.

Another example is my OS class. We just spent half a semester going over concurrency problems. Now we're rushing through everything else that goes on in an OS. We spent one day on filesystems and I/O. Not once have we said anything like "well this is how Linux or FreeBSD does it".


So I am a PhD student at the same university as xxpor and did my undergrad there. No idea who he is but I am going to say there is fault on both sides.

1) Every University has some courses which need a refresh.

2) The OS and Compilers courses certainly need refreshing at the moment at CWRU.

3) We have created a new class called "Software Craftsmanship" which addresses many of the concerns placed at the foot of the "Software Engineering" class.

4) It isn't the job of the university to teach any particular technology. It is the job to teach the theory and fundamentals.

5) Java probably isn't the best language, but we have a new intro professor who is doing a good job with the intro course.

6) You never know who you will meet on the internet.

7) TRC and relational algebra are actually pretty useful formalisms that helped me become a better user of databases.

8) The early curriculum refresh has introduced many more software engineering principles earlier in the curriculum.

9) The best place to learn software engineering principles is on the job at a good company. There is no replacement for doing things for real.

10) There are good and bad things about our department but that is the same everywhere.


Hmm, my experience may have been a bit different, since mine was a 5-year program with a solid year and a half of courses before graduation without a break. Most of what you are describing got at least two or three quarters of treatment at my school, depending on what tracks you took.

We also had a Software Engineering major which a lot of people in CS took courses from. The only required (non-track) CS courses in our suggested 4th/5th year were software engineering classes though. I think the coverage there was more than adequate. Honestly it got a bit too "vocational" for my tastes.

On the other hand, most of my peers (70-80%?) took the AI/Games track; they probably got closer to the sort of coverage on the other topics that you're mentioning. IIRC only two arch classes (MIPS), no OS to speak of, and a single 3-month quarter of concurrent programming. I don't think that is satisfactory, but the last I heard, most of them have been doing fine in the real world. That stuff is neat and good fun, but I don't think it will really affect your ability to perform in the industry.


Class is short, you get one shot. Scheme or Java, not both.


In college I was expected to know Scheme and Java, and C and some Assembly besides. I also picked up two other languages on my own time. I can't imagine even the least capable CS student graduating knowing only Java or only Scheme.


Do any college students come out of BS CS programs mono-lingual? That is inconceivable to me.


Some come out without having properly learned any language at all.


Plenty of schools still use C++ instead of Java.


It seems quite practical to teach one of the most used languages in the world. You can then go and pick up almost any other language.


> You can then go and pick almost any other language.

Any CS student should be able to do that at least by their second year, regardless of what language they started off with. Even if colleges were not starting with Java, there would be zero risk of students not knowing it when they needed it by the time they were ready to enter the workforce.


The AP (Advanced Placement)^ exam is in Java. Before it was in Java, it was in C++. A lot of schools had C++ intro courses because of this. When the AP switched, most switched to Java. It isn't the right language for an intro course, in my opinion, but there you go.

^ an exam which offers college credit at most universities to high school students


I feel it gives a good tradeoff between:

- typed languages that are too low level to start (C/C++)

- scripting languages that are high level, not typed, and therefore do a lot of things under the hood (Python, Ruby, Javascript)


This is a reasonable view, but I want to point out that Python and Ruby are strongly typed, just not statically typed. You must still understand primitive types and casting, in contrast to JavaScript or PHP, which are both loosely and dynamically typed (a frequently dangerous combination).


It's in the interest of the big companies. From what I know, some smaller schools like Harvey Mudd start off with other languages, like Scheme and Prolog.


That used to be The Way To Do It(tm).

MIT switched away, though to Python IIRC, which isn't terrible. Northeastern still uses Scheme first, though.


IIRC, Harvey Mudd starts off with Python, in the intro class everyone takes. I just looked through their course listing, but I'm not 100% sure I'm thinking of the right class.


Why not?


There it is. I wondered how far I'd have to read before we had a Java hate comment. Never fails.


It's not a Java hate comment: I'm writing this as an educator with 20 years of teaching experience who has learned that some languages are better for teaching than others. Java has its place ... but another language would be better suited there.


Perhaps their next project will be a game that teaches kids how to smoke.


Java is a great choice for CS. Its similarity to C/C++ is handy for those who need to transition to high-performance computing or move on to work on low-level systems (kernels, drivers, embedded systems, etc.). The JVM is also a great target for learning about virtual machines, since the JVM itself is quite an amazing virtual machine in its own right. The JVM can also be used to study a 'real-life' implementation of a stack machine, and Java bytecode can be used as a [gentler] introduction to assembly.
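
As a small illustration of the stack-machine point (my own example; the bytecode in the comment is roughly what javap -c prints, so treat it as approximate):

    // A trivial method and, in the comment, roughly the bytecode the
    // compiler emits for it: operands are pushed onto the operand stack,
    // iadd pops two values and pushes their sum -- a classic stack machine.
    public class Add {
        static int add(int a, int b) {
            return a + b;
        }
        // javap -c Add prints approximately:
        //   static int add(int, int);
        //     iload_0   // push a
        //     iload_1   // push b
        //     iadd      // pop both, push a + b
        //     ireturn   // return the top of the stack
    }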

So yes, while your startup may choose Python or Ruby or the latest language of the day, Java is a great system to teach a lot of fundamental computer science topics with.

Besides, Google uses it.


> So yes, while your startup may choose Python or Ruby or the latest language of the day, Java is a great system to teach a lot of fundamental computer science topics with.

Are you implying that Python (1991) and Ruby (1995) are some sort of flavor of the day compared to Java (1995)? :-)

    import java.util.LinkedList;
    import java.util.List;

    public class Pets {
        private List<String> pets;
        
        public Pets() {
            this.pets = new LinkedList<String>();
            this.pets.add("cat");
            this.pets.add("dog");
            this.pets.add("bird");
        }
    }
or whatever may be great for transitioning people to C++ later on (not exactly where all the jobs are...), but

    pets = ["cat", "dog", "bird"]
is probably a gentler introduction for elementary school students.


C++ comes in second behind Java in terms of jobs, according to stats from Indeed: http://www.indeed.com/jobtrends?q=java%2C+C%2B%2B%2C+C%23%2C... . It might not be the new hotness, but there's still plenty of work for C and C++ programmers.


The point is not jobs, the point is going from zero to coding without scaring too many people off.


Gentler for a weekend hacker, but not for a student. Java is verbose for a reason - safety and reliability. Things enterprises value.

Students need to learn what is most pervasive in the field they plan to enter. That being Java. Starting them out on a "lite" version of a programming language simply because it is easier is just setting them up for failure.


I don't agree with the Java hate (why not teach kids to use Java? it's rather condescending if you think they can't learn it), but I will say: having learned to program on my own, I found it both easy and rewarding (lots of immediate, rapid growth and feedback) learning with Python for a few weeks before getting into Java. For some people, and maybe kids are NOT this group (but the group exists), the initial syntax barrier for the complete novice is much more intimidating in Java than it is in Python (or Ruby). I mean, arguably even C has an easier-to-grok-for-the-total-noob 'Hello, World!' than Java...

[insert examples of runnable 'Hello, World!' and 'Hello, <name>!' programs in Python v Ruby v Java here]
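
For concreteness, a rough sketch of the Java side of that comparison (my example; the Python and Ruby equivalents are noted in the comments):

    // The complete program a beginner must type in Java to print one line.
    // For contrast, the Python version is the single line
    //     print("Hello, World!")
    // and the Ruby version is
    //     puts "Hello, World!"
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Hello, World!");
        }
    }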


Thus, anyone who learns a first language other than Java is setting himself up for failure?

My first language was Basic on an Atari 800. My second was C++. I've made a respectable career from programming since then, in both serious and "lite" languages alike, including Java. I seem to have avoided these dire consequences. :-)


List<String> pets = Arrays.asList("cat", "dog", "bird");

You were saying?


> List<String> pets = Arrays.asList("cat", "dog", "bird");

Yeah...

This is a language that has its (very large) niche... education is not in that niche. Not on merit, anyway.


It's telling that you feel that is actually a counter example.


List<String> pets = Arrays.asList("cat", "dog", "bird");

Bad usage does not equal bad language.


This is aimed at elementary and high school students, not at CS students. Java may be an acceptable choice for introduction to programming in CS programs, but that is not the purpose of that software which aims to introduce programming, not computer science.


CS _is_ programming. Just more in-depth.


> CS _is_ programming. Just more in-depth.

CS is to Software Engineering is to Programming as Physics is to Aerospace Engineering is to Aircraft Maintenance.

Related, but not the same thing.


CS teaches the fundamental theories that form the mathematical basis of programming. It is true that to be a mediocre programmer you only need a few intro lessons or a couple of weekend hackathons, but to master it you need that theoretical CS background, which covers it in much more depth.


Substitute technical drawing for aircraft maintenance and I would probably agree with you.


UCSD + Java = a love affair we may never be able to disrupt


This whole thread about their use of Java completely misses the point of this project. Maybe people shouldn't be so hung-up about what language is being learnt? I'm sure the students will be learning a whole range of languages.


Right before opening this page I knew this type of comment was going to be one of the top comments and was not disappointed. Why oh why can't the whining stop?




