
While some of this makes sense... the claim that "undergraduate courses lose their technical currency in something like five years on average" sticks out like a sore thumb.

How well would we be served if C were taught, in depth, in undergraduate programs? Add to that a Lisp and Java, and who knows what graduates would be capable of in a shorter timeframe? C and Java have not gone obsolete, and along with Lisp they cover several different paradigms of programming. Not only that, but technical skill with important and oft-used languages would be transferred.

I add Lisp simply to include another paradigm and because it's an old concept that's never really gone away and seems to be gaining traction (see this site, see Clojure, see the continuing development and expansion of Scheme, CL, Gambit, etc.).



It's easy to point to Java and C as stalwart success stories in hindsight, but I can recall that when Java arrived there were tremendous histrionics from the C++ and VB crowd claiming it was merely another passing fad. C faced down Pascal and COBOL, two languages that were considered far friendlier to programmers at the time.

----

I feel sorry for CS/CEng departments these days, as they're facing pressure from both sides: the industry wants more vocational-style training, and students want "real world" experience but also want a world-class western liberal arts education. It's as if their idea of a perfect school is one with the prestige of Harvard married to the curriculum of DeVry. I'm afraid you cannot have both.


I can't imagine a better course than writing a Lisp interpreter in C, and then a Java implementation in Lisp.

They're clearly not fads, they're clearly useful, and the kind of knowledge and education that such an exercise would require is exactly the sort of thing that proponents of a "world-class western liberal arts education" would laud.

It can work, it seems to me.
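
For a sense of scale, the heart of such an interpreter is surprisingly small. Here's a minimal, illustrative sketch in C - just enough to read and evaluate prefix arithmetic like (+ 1 (* 2 3)), with none of the symbols, environments, or closures a real course project would add:

    /* Illustrative sketch: the core of a tiny Lisp-style evaluator.
       It reads and evaluates prefix arithmetic such as (+ 1 (* 2 3)). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <ctype.h>

    static const char *p;  /* cursor into the source text */

    static void skip(void) { while (isspace((unsigned char)*p)) p++; }

    static long eval(void) {
        skip();
        if (*p == '(') {                /* compound form: (op arg...) */
            p++;
            skip();
            char op = *p++;             /* single-char operator: + - * */
            long acc = eval();          /* first argument */
            skip();
            while (*p != ')') {         /* fold in remaining arguments */
                long v = eval();
                if (op == '+') acc += v;
                else if (op == '-') acc -= v;
                else if (op == '*') acc *= v;
                skip();
            }
            p++;                        /* consume ')' */
            return acc;
        }
        return strtol(p, (char **)&p, 10);  /* atom: integer literal */
    }

    int main(void) {
        p = "(+ 1 (* 2 3) (- 10 4))";
        printf("%ld\n", eval());        /* prints 13 */
        return 0;
    }

Everything beyond this - cons cells, environments, garbage collection - is where the education happens, which is rather the point of the exercise.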


It does stick out... apparently so far that it blocked out the qualification that follows it: "(obviously different sectors age at different rates – web programming has a very fast cycle, embedded systems a very slow one)". C is a systems language; it's been around forever and is not about to be replaced anytime soon, possibly anytime in our lifetimes. It's too good at what it does. And that's why it's taught in a lot of universities.

As is Lisp, for the reasons you mentioned. As is SQL. As are several others (though I'd disagree with Java myself because it's really more set up to do things than to teach things and there are too many shortcuts in there - but many others disagree with me on that point :) ).

I'm not for getting rid of established languages from courses. C is 37 years old; C++, 30 years; Lisp, 51 years; Objective-C, 23 years - these are well-established languages that won't vanish in the four-year span of an undergrad course. Java at only 14 years (and so many releases that stability is a valid question) tends (to my mind at least) to mark the start of the gray area there. Some, though, like Perl and C++, are really hard to teach with compared to others, and that can outweigh their stability as a factor. Still, I look at languages like Python and would love to work them into a course. But it is still a risk for the students. Maybe for the final year work though.

But teaching a four-year course using Ruby as a primary language, intellectually interesting as that sounds, is an unethical act at the moment. It's a beautiful language, a real joy even to read - but what the students start learning on day one may not be around, or be a useful thing to know for the jobs they'll pay their mortgages with four years down the line. That's the responsibility the university is taking on when it creates a course. The student invests four years of their life - the university must produce a return on that, and conservative thinking is needed for that.


On one hand, we have talk of learning fundamentals, but on the other, if we teach with more exotic languages, "they'll never be able to find a job because they didn't learn anything useful."

So, which is it? Do schools teach the fundamentals, transferable to any language, or do they teach what'll be useful in industry?


The two are orthogonal, at least in some aspects. The fundamentals can - technically - be taught in any language (practically, no one's going to try teaching a course in Brainfuck). But we try to choose a language that gives them, if not industry skills, a platform from which to reach those skills rapidly. For example, teach them C and an OO language of your choice properly, and they can learn C++, Objective-C, Java, Python or any of a dozen others very rapidly indeed compared to someone who dove in, learnt Inferno and then graduated into an industry that had never heard of it.

The thing to remember is this - we teach them the fundamentals for a reason, namely to get jobs in industry. At the back of all this academic teaching is a commercial reality that can't be forgotten. But you have to balance that with a long-term view of the student's entire career. We're trying to give them a degree course, not a Sams book!


It's true.

Just color me bitter about how my college neither taught fundamentals nor prepared anyone for industry.


That's not bitter, that's justifiable anger, steve. There's a duty of care involved on the university's part - if they screwed the pooch, that's a pretty major thing.


This actually lets me respond in a threaded way, unlike your blog, apparently? So my bad, I'll just respond here.

Do you feel that your experience or mine is more mainstream? When I read Joel's "The Perils of Java Schools," I felt that it described my school to a tee. I had just assumed that that's how most programs are, given my school's size. Obviously, Ivy League schools should be better, but I just kind of assumed that my own experience was average.


WordPress.com's threading leaves much to be desired, I'm afraid.

I can't judge what would be mainstream in the US accurately enough to rate your school, I'm afraid. I see some of the details of the main schools as they write up their courses in journal articles, but my experience is mostly with Irish universities. However, from what several people here, back on the blog, and over on reddit are saying, it sounds like the average Irish CS degree is a step or two ahead of at least some US schools. Which I have to say is a major surprise to me, since US schools were (I thought, from speaking to grad students and academics from them) better funded than Irish ones.


I'm not sure if it's a funding issue. It's more of a philosophical one. We've been telling kids for the last few decades that college is the path to success, and we've lowered standards so much that everyone goes to college, so that it's actually true.

College is quickly becoming a big commodity business, rather than a place to learn.


Upvoted for introducing me to Brainfuck. Hi. Larious.


Look up INTERCAL. Brainfuck isn't too hard, just tedious. INTERCAL, on the other hand, was designed to be as big of a pain in the ass as possible. People like to use the phrase "fighting with the compiler," but this is a language where the compiler is actually actively out to get you. : )
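
To see just how little there is to Brainfuck, consider that its entire instruction set fits in one switch statement. A minimal, illustrative interpreter in C (fixed-size tape, no error handling; the embedded program prints "hi"):

    /* A minimal Brainfuck interpreter - fixed tape, no input file,
       no error handling. The embedded program prints "hi". */
    #include <stdio.h>

    int main(void) {
        const char *src = "++++++++++[>++++++++++<-]>++++.+.";
        unsigned char tape[30000] = {0};
        unsigned char *cell = tape;
        for (const char *ip = src; *ip; ip++) {
            switch (*ip) {
            case '>': cell++; break;
            case '<': cell--; break;
            case '+': (*cell)++; break;
            case '-': (*cell)--; break;
            case '.': putchar(*cell); break;
            case ',': *cell = (unsigned char)getchar(); break;
            case '[':                    /* if zero, skip to matching ] */
                if (!*cell)
                    for (int d = 1; d; ) {
                        ip++;
                        if (*ip == '[') d++;
                        else if (*ip == ']') d--;
                    }
                break;
            case ']':                    /* if nonzero, loop back to [ */
                if (*cell)
                    for (int d = 1; d; ) {
                        ip--;
                        if (*ip == ']') d++;
                        else if (*ip == '[') d--;
                    }
                break;
            }
        }
        return 0;
    }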


Always a fan of INTERCAL. Malbolge is a good one, too.


For more fun, check out the Esolang wiki: http://esoteric.voxelperfect.net/wiki/Main_Page

FukYorBrane is a great Brainfuck variant.


You're contrasting Ruby as an upstart and Java as an established language, but they're basically the same age. They both had their first public releases in 1995.

(Of course, this doesn't invalidate your point about relative popularity.)


And that's exactly the problem. In liberal arts educations (such as the one I'm receiving) or even at universities such as MIT, Python is the language of choice. It's great fun to teach, very easy to learn, but it's not an industry standard, and many would argue that it's too easy to use.


I don't think you're giving Python enough credit. It's used pretty extensively everywhere from NASA to Google to hard science research.


I don't think it's reasonable to claim Python isn't an industry standard. It's used heavily in industry by large and fairly conservative companies and is even more popular with startups.


I think it's reasonable because there is very rarely any such thing as a standard in software development. Municipal building codes are standards. Drug testing regimens are standards. Programming languages are not standards. At best they are conventions, and only within certain niche industries.

As for de facto standards, I don't think we should be paying much attention to them. I think it's very important to draw the distinction that software development currently has no standards (you might be able to argue for TCP/IP, where alternatives are only ever used because they fulfill some unique use case not covered by TCP/IP), because for as much as we want to call this industry "software engineering", we sure as hell don't treat it like any other field of engineering, for many of the reasons already mentioned.

I think the first step is to sit down and agree on some terminology. You can't even get two programmers to agree on what "Object Oriented Programming" means. No wonder we aren't treated like professional engineers. We don't act like them.


Most "industry standards" are de facto standards. We're not talking about laws here, or even open standards like C or Common Lisp. An industry standard is simply a widely-accepted practice, which is perhaps defined by the fact that it would not usually be questioned by a casual observer from that industry. Painting interior walls some shade of off-white is an industry standard. Using Python for scripting and application logic is an industry standard in the same way.


I wouldn't say that using Python for scripting is even a de facto standard in that sense. Don't make the mistake of assuming your experiences are normative.


Are you really saying that Python is "not useful"?


That was a mistake: Python is QUITE useful. But I wouldn't call it an industry standard. Corrected.


Really? Python seems to be showing up everywhere I look. I don't think I've seen any largish company in quite a while that wasn't using Python for something somewhere.


It's spectacularly useful, and I personally really love using it - but to say it's an industry standard the way C or C++ are is to stretch the point a wee bit too far unfortunately.

Still though, we use some languages for teaching (like Pascal or Modula-2) which don't have the kind of industrial usage levels of C or C++, so we might see Python being taken up sooner rather than later. I think there are one or two courses already using it over here on a trial basis.


Being the introductory language of choice doesn't make it the language of choice. MIT kids learn the real stuff too.


Python is "the real stuff", and I'm always annoyed by the snobbishness that leads some people to dismiss it. Python is an excellent language, with a large user base, and some compilers in the works.

Just because C is hardcore and Lisp is amazing and Haskell is mind-exploding doesn't mean that Python isn't a great language.


C/assembly to learn about the machine and because they're everywhere, Lisp to learn about all the paradigms, and a popular scripting language if you need big libraries. Java is so ugly, it'll just confuse them. It confused me, anyway.


Perhaps it's not too popular, but VHDL or Verilog are a ton more useful for learning about how a machine works. Seriously, writing some x86 assembly teaches you a thing or two, but implementing a full CPU from the design of the ISA to putting it on an FPGA is the way to go. Once you've done that, all the magic is gone, and computers are completely transparent and understandable.
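
Even without an FPGA, a software stand-in makes the same point: a CPU is nothing more than a fetch-decode-execute loop. Here's an illustrative sketch in C, with a made-up four-register ISA (the opcodes and the little program are invented for the example):

    /* Toy fetch-decode-execute loop. The four-register ISA, opcodes,
       and program below are all invented for illustration. */
    #include <stdio.h>
    #include <stdint.h>

    enum { LOADI, ADD, SUB, JNZ, HALT };

    typedef struct { uint8_t op, a, b, c; } Insn;  /* op ra, rb, rc/imm */

    int main(void) {
        Insn prog[] = {              /* computes 5+4+3+2+1 in r0 */
            { LOADI, 0, 0, 0 },      /* r0 = 0  (accumulator)  */
            { LOADI, 1, 5, 0 },      /* r1 = 5  (loop counter) */
            { LOADI, 2, 1, 0 },      /* r2 = 1  (constant one) */
            { ADD,   0, 0, 1 },      /* r0 = r0 + r1           */
            { SUB,   1, 1, 2 },      /* r1 = r1 - r2           */
            { JNZ,   1, 3, 0 },      /* if r1 != 0 goto 3      */
            { HALT,  0, 0, 0 },
        };
        int32_t reg[4] = {0};
        size_t pc = 0;
        for (;;) {
            Insn i = prog[pc++];                               /* fetch */
            switch (i.op) {                       /* decode and execute */
            case LOADI: reg[i.a] = i.b; break;
            case ADD:   reg[i.a] = reg[i.b] + reg[i.c]; break;
            case SUB:   reg[i.a] = reg[i.b] - reg[i.c]; break;
            case JNZ:   if (reg[i.a]) pc = i.b; break;
            case HALT:  printf("r0 = %d\n", reg[0]);    /* prints 15 */
                        return 0;
            }
        }
    }

Once you've written (or synthesized) something like this, "the compiler turns my code into these" stops being magic.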


Yeah, you're probably right. This is my 'limited life experience' showing...


Yeah, it's kinda like how the point of learning the different sorting algorithms is not really to learn how to sort things - it's a vehicle for teaching big-O notation.
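
You can even make the big-O point empirically rather than just on the whiteboard: count the comparisons as n doubles. An illustrative sketch in C using insertion sort (sizes and data are arbitrary); the count roughly quadruples each time n doubles, which is O(n^2) behavior:

    /* Count comparisons made by insertion sort as n doubles; the
       count grows roughly 4x per doubling, i.e. O(n^2). */
    #include <stdio.h>
    #include <stdlib.h>

    static long insertion_sort(int *a, int n) {
        long cmps = 0;
        for (int i = 1; i < n; i++) {
            int key = a[i], j = i - 1;
            while (j >= 0) {
                cmps++;                      /* one key comparison */
                if (a[j] <= key) break;
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
        return cmps;
    }

    int main(void) {
        srand(42);
        for (int n = 1000; n <= 8000; n *= 2) {
            int *a = malloc(n * sizeof *a);
            for (int i = 0; i < n; i++) a[i] = rand();
            printf("n=%5d  comparisons=%ld\n", n, insertion_sort(a, n));
            free(a);
        }
        return 0;
    }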


If I had learned C, Lisp, and Java "in depth" when I was an undergraduate, much of the effort to learn them would have been wasted.

My first programming-ish job involved documenting C++ libraries (in an era when C++ templates had just started to be supported by mainstream compilers). Then I had a job involving Perl and Java, and even though I had no experience with Java I had no trouble getting up to speed with it. Then I had a job involving J2EE; I hear that the "enterprise" Java world is doing a lot with Hibernate and other lightweight frameworks these days, so what little I learned about EJBs on that job is now stale. And now almost all of my coding involves some combination of shell scripts, Python, and SQL.

There's a place in every software shop for the language lawyer who really knows a programming language's subtle tricks and traps, but not everyone needs that coming out of a CS program.



