It's interesting that Joel purports to treat his programmer staff as professionals, yet has the same unreasonable tradesman-focused expectations that the cube farm managers have about education in Computer Science.
A graduating structural engineer has no idea how to marshal a design through the building inspector's approval process. A graduating lawyer has no idea how to actually engage in litigation. Yet those fields aren't dominated by managers whining that students should be learning "real world skills" instead of "theoretical stuff" like advanced physics or constitutional law.
They understand that a professional isn't an automaton that gets precision-machined by a training program to slide frictionlessly into their workflow. They know that when they hire a graduate, he isn't trained in the mechanics of his job (nor is he even licensed yet), and that the hire represents a commitment on the firm's part to take the theoretical knowledge he got in school and show him how to leverage it in the real-world practice of his profession. They are fine with that because they have a culture -- as professions and as firms -- of respecting and investing in their practitioners.
The computing industry doesn't operate that way. All the lip service it gives about 'professionalism' is, as far as I can tell, entirely driven by a desire to ensure that programmers remain exempt. Why is it that the manager class at development firms is dominated by non-technical MBAs? Why are development firms not set up so that programmers are partner-tracked associates? I can't think of any real profession where that's not the default configuration for a firm. And most apropos here: why do they expect their supposedly professional workforce to receive trade skills from their university education programs?
I think people like Joel need to quit attacking universities until they can get some consistency in their own views of their employees. Either programmers are the skilled tradesmen we're currently treated as, in which case the exempt status should be removed and the industry should come up with a tradesman's curriculum, or they should accept us as professionals and start treating us that way.
It would be nice to shunt all the blame off on MBAs, but programmers have much of the responsibility to shoulder themselves.
How many times a month does news.yc have a thread where people debate whether you need any education whatsoever to do our job? If we can't even decide ourselves, it's no wonder others think anyone can do what we do.
Another systemic problem in our industry is training. "Training" at most companies, if you're lucky, is a small budget for buying books on amazon.com that you can bring home and read on your own time. Lots of companies will tell you (especially if you're a new grad) that they train their new employees, and then, once hired, you're shunted off into a cube and the "training" turns out to be an overworked co-worker who comes over to tell you how to set up your workstation and where the source code is -- oh, and if you have any questions, just send me an email.
Combine the above with a culture of acting like ADD-afflicted children (oooh, shiny Erlang, let's rewrite everything in Erlang! Rails is sexy. I'm a rockstar!), and the MBA types distrust us and view us as people incapable of managing ourselves, people who need proper structure to deliver anything at all (Joel's last statements).
At my last job, shortly before I left and after I'd handed in my resignation, a manager (not my manager, but another) came around and congratulated me on how much work I'd gotten done in my last couple of weeks, with the obvious subtext that he was surprised I did anything at all since I'd resigned. It was insulting. If he had seen me as a real professional, there's no way he would've said that, but he obviously didn't.
He even links to this in the current article! I suppose he is trying to make a more nuanced argument, but it does not come across well. He seems to be saying that universities need to be a little more "Java School"ish than they are now, but not too much...?
I think he should better clarify just how much Java School there should be in a good computer science curriculum.
Absolutely. I loathe the emphasis on being an 'X programmer' in the software industry. I've never met anyone who described themselves as an 'Autocad engineer'.
There is a management culture of referring to technical people as "resources", e.g. this person is a "Java resource" and we'll need 3 "Linux resources" on that project.
They sure don't like it when I call 'em "Powerpoint resources".
Tell me about it. The firm I'm at used to be quite small, and MBA talk was non-existent. Now project managers use it a lot. We engineers used it ironically at first, but now it's starting to taint us ...
Powerpoint resources is a great way to put it. I'll be sure to use it.
It's very dehumanizing. Even in places that claim otherwise, they tend to treat technical staff like interchangeable parts rather than people. And they wonder why their best folks tend also to be the least loyal.
> I've never met anyone who described themselves as an 'Autocad engineer'.
Every engineer will describe themselves as an 'X engineer' where X is automotive, or electrical, or structural, or something similar. No engineer is going to think that just because they're really good at designing underwater oil pipelines, they'll be equally awesome at designing cars.
Many programmers on the other hand seem to be of the opinion that once they've learned how to 'program' they'll be equally adept at writing ray tracers, TCP/IP stacks and video editors (as long as they get to use their favourite language).
My post was ambiguous, but I didn't mean to say that engineers don't specialize. The difference is that an engineer's value lies in their understanding of a technical domain and skill in analysis and design, not the particular tools they use.
It's perfectly valid to describe oneself as a 'web developer' or 'systems programmer'. What bugs me is calling yourself a 'C++ programmer' or a 'Ruby hacker'. Emphasizing skill with a tool is the mark of a tradesman - it abets the industry dysfunction of viewing programmers as interchangeable resources who should be slotted into a project with minimal investment.
While I agree with everything you say, I think one has to accept that not only are many programmers tradesmen, but many of them are happy being tradesmen. I know people who won't consider any programming job where they won't be working on PHP-based web sites, because that's all they know and they have no real interest in learning anything else. Those people are "php programmers" and can be slotted into a great number of projects with minimal investment.
What is needed is not to get annoyed at these people's existence or their approach to programming, but to find a way to differentiate between them and other types of programmers.
The sad truth is, average and below-average developers actually need considerable training to learn a new programming language. They can't just look at the specs and read some tutorials to get an understanding of the conceptual and syntactical differences and then start hacking. They need to be trained.
Also, to counter your 'Autocad engineer' example, commercial airline pilots can be described as a 'Boeing 747 pilot' or an 'Airbus A318' pilot. You need certification to fly a specific model and can't just hop in and start flying a different model.
> commercial airline pilots can be described as a 'Boeing 747 pilot' or an 'Airbus A318' pilot.
I don't think that's a good analogy.
Pilots are operators, not creators. What they do is pretty much defined by what devices they use to do it.
Engineers, in contrast, use tools to create new things, and are defined by what kinds of things they create (e.g. civil infrastructure, electrical systems, computer architecture), not what tools they use to create them.
Actually, none of the private or commercial pilots I've known have ever referred to themselves that way.
Granted, my sample size is limited, but I think that every one of them would be as annoyed at being referred to as an <aircraft x> Pilot as I am at being referred to as a <language x> Programmer.
Well, in their defense, while someone may not be called an Autocad engineer, they can certainly be called a civil engineer focusing on concrete stresses. Calling them an Autocad engineer would be like me calling you an Eclipse or vi programmer. Ultimately I think we're confusing tool and end product here.
I think that's sort of the point being made -- you might not find eclipse programmer, but you'll find plenty of Java or .NET programmers, and those are tools also.
I think you're missing the original point I was trying to make. Java and .Net aren't really the tools, they're the medium in which you work. Just like that civil engineer understands the trade-offs involved with different kinds of concrete and using it in different ways, the Java programmer understands the trade-offs involved with the API and VM, in addition to generic CS trade-offs. Look at the following two fake resume excerpts:
Bob T. Builder :
Civil Engineer with Emphasis in Concrete Stress
---Skills---
Autocad :
Industry Used Physics Modeling Software
Suzie M. Hacker :
Java Programmer
---Skills---
Eclipse :
Ant
Learning to really use Eclipse or Visual Studio well takes a surprising amount of time. So, IMO it's reasonable padding for a resume, but people don't call themselves Visual Studio developers.
What people most often identify with seems to be the frameworks they are using. So you will see someone call themselves a .Net developer, not a C# .Net coder, even if that's what they do. They don't know all of .Net, but they feel confident they know how to approach most .Net issues, and they happen to use .Net and Visual Studio.
I could have looked at the Spring stack or Cocoa, but I think the point still stands. Even if someone is calling themselves an OS X developer, they are talking about knowing the same type of thing as a .Net developer.
> A graduating structural engineer has no idea how to marshal a design through the building inspector's approval process. A graduating lawyer has no idea how to actually engage in litigation.
The point of his post is that the market opportunity doesn't exist because the firms that professionals are expected to enter take care of that domain-specific training themselves.
For a very long time there has been a "tradition" of sending newly graduated engineers to work on the manufacturing floor, or in product testing for a while so they get background in the business before actually having to design anything. Such a tradition seems to be missing in software development.
While I don't always agree with Joel, it is a bit headline-grabbing to call him a snake-oil salesman. He is building a product that is shipping, and the company is profitable. Not easy to do.
But I think if universities taught CS the way that they should, and not many do, in my opinion, the result might not be what he expects.
I think that many companies misuse programmer talent--one example is to employ thousands (not exaggerating) in enterprise departments that are reduced to moving one field to the other. If there were lots of well-trained CS grads with a better orientation, they might be questioning how things are done. Both in large enterprise situations and perhaps in Joel's company.
His strongest point is that students are not exposed to longer projects. I was once in a position to hire compiler people. Few graduates of programs that taught compilers were well-equipped. One large state university in Michigan taught a compiler course in a semester. The graduates that I talked to came away with the notion that compilers were very scary. Graduates from another midwestern school across the lake did very well. Their compiler course was a full year as part of their masters program.
I think that industry does not really know what to ask for. CICS as part of a CS program? (True). Really?
"The fact that operating system design posed some at the time tough conceptual and logical problems was hardly allowed to surface. Consequently, those problems did not get the attention they deserved, and this had far-reaching consequences: I recently read a series of articles on the Shuttle on-board software and, though President Reagan has generously offered a teacher a free ride, I tell you I am not available."
This talk was given at a conference in mid-November 1984. The flight with a teacher on board was, of course, the Challenger in January of 1986.
Of course, the disaster wasn't due to software... but still rather uncanny.
You may be "right" in theory, but it doesn't matter.
The vast majority of computer science students pursue a CS degree because it has become a prerequisite for being a professional software engineer. NOT because they have a desire to become a computer scientist.
The simple fact is that if you do not have a 4-year college degree you will find it immensely harder to work as a software engineer in the industry. It's silly to pretend otherwise.
The vast majority of today's CS programs do a disservice to all of their students. They are neither good engineering programs training students for industry (for which they currently serve as a proxy) nor good computer science programs. A recently minted graduate of a Chemistry or Physics program (from an accredited US 4-year college program) is, on average, much more versed in the relevant scientific findings and research techniques of their field than a CS graduate. They are so poor precisely because the vast majority of CS graduates do not go on to do CS research or utilize their CS knowledge but instead become software engineers.
> The vast majority of computer science students pursue a CS degree because it has become a prerequisite for being a professional software engineer.
I don't know if things are different outside of Canada, but isn't this precisely what a computer engineering or software engineering diploma is for? That's what I'm doing and it's definitely different from CS - very applied, a lot of low-level programming (assembly, VHDL, embedded C), making a game with Java calling C++, source control is always used, etc.
In my experience, Computer Engineering is a combination of EE and CS, producing someone equally capable (assuming equal interest) of writing low level systems and designing logic circuits.
Software Engineering is more of what Joel is describing, and more schools are starting to offer it as a major, with as much focus going to various project management methodologies as algorithms and data structures.
Things are different outside Canada, specifically in the U.S. I've found in the U.S. that there are vastly inconsistent definitions for the different programs. In Canada you tend to have Computer Engineers who study in a real engineering program. Some schools may split this into Computer Engineering (emphasis on hardware) and Software Engineering (emphasis on software). Then you have Computer Scientists that range from "Software Engineers" -- in quotes because it's not a real engineering degree but a science one. They study lots of programming, more math than other CS students, as well as processes. On the other end of the spectrum are Management and Information Technologists who do less programming and less rigorous math, but get more business and management education. Having had to interview candidates with CS degrees from schools all over the world (Canada, Cuba, Russia, the U.S.), I've found that for some reason the U.S. has a weird sort of wild-west situation with respect to computer science. Students in community colleges who take one course in Java consider themselves to be studying computer science the same way a student at a top-notch program in a place like Stanford would. Here in Canada, if a student is studying programming at a community college, they're more likely to declare themselves as studying just that, "programming" or "IT".
Answers my question - in Quebec, the title "engineer" has quite a few requirements, mainly having an engineering degree and passing an exam. Hence my confusion with the text: here, a CS graduate could not call himself a "professional software engineer".
While I mostly agree with your point, the phrase "Infinitely harder" is a bit of a stretch. I've worked with many professional software engineers without CS degrees, and even many professional software engineers without 4-year degrees at all.
Personally, I was already a programmer when I took my first University CS courses and realized quickly that while I love programming, CS isn't where my interests lie. I took the CS courses that were most relevant and wound up getting a business degree.
I do wish there were more programs available serving the middle ground between the vocational-school ghetto and the ivory tower.
Good point, "infinitely" is a bit hyperbolic, "immensely" fits better, I think.
I agree, it is possible to be a professional software engineer without a CS degree (some of the best fit that category), but that's a non-traditional road. More to the point, it's exceedingly rare for someone to intentionally avoid earning a CS degree if they plan, prior to attending college, to become a software developer.
I don't have a degree and I've never had a problem getting multiple job offers when I've been on the market. I've held some very high posts as well. I think people use the degree as a road sign or indicator of possible quality, among other indicators, to come to a hiring decision. When the candidate is demonstrably talented, the degree becomes worthless (not negative, but just not contributing to the decision anymore).
I've personally found that people who got into programming before the dotCom boom don't need to worry about a lack of degree, people who got in after absolutely need a degree.
I suspect it's pretty easy to get job offers when you are demonstratively better than 99% of your peers, degree or no. That doesn't mean your average programmer can find a job without a CS degree...
If you have a solid track record in the industry then you don't need a degree. But if you're a wet behind the ears programmer-to-be with no prior job experience then you'll be in a bad position. Exceptional, naturally gifted software engineers in such a position can manage to get by through demonstrating their talent in some way (e.g. contribution to a major open source project, perhaps), but the average developer is at a severe disadvantage compared to his degree holding cohorts.
This may well be true, and that is pretty unfortunate.
But, frankly, if all you want is a job developing software, I'd suggest interning instead of getting a degree. You'll certainly be more qualified than your peers for most jobs four years on.
And, for what it's worth, I'm currently hiring for a Jr Developer. I want someone smart who we can train. A CS degree wouldn't hurt, but it's certainly not a prerequisite.
I don't have a CS degree (or any degree for that matter) and have been programming professionally for 3.5-4 years, 13 total. I never interned, but I spent a lot of time as a child/HS student writing code independently and working for a few open source projects. That got my foot in the door but until recently, it was rough finding work.
Any (career?) suggestions in general? I have a strong background in CS (I made prodigious use of ocw.mit.edu), and I feel competent at a practical level.
At this point, I'm just trying to pick a particular language to master while learning something esoteric on the side.
Doesn't programming deserve to be the subject of a rigorous, high-level education? It is a rich craft with a growing tradition. We would do well to consider, as Joel does, how to provide students with appropriate experience for the future practice of their career. That doesn't mean teaching the particular, accidental details of contemporary professional programming, but it does mean taking programming seriously as a subject of education. You can study architecture, design, and engineering in universities - not just sciences and humanities. Similarly, the existence of computer science shouldn't exclude programming from our institutions of higher learning.
It's a trade in Germany. I left school after 10th grade and started my apprenticeship in programming. The net effect is that at 29 I've now been a professional programmer for 12 years.
More importantly, I also developed all my programming habits during my formative years and therefore developed pretty good habits that are now hard wired and second nature. It gives me a huge edge over other programmers of my age who have been joining the work force over the last few years.
The huge difference being that in Germany you have functional apprenticeship programs. In the US, which this article is centric to, you're really just going to go to college, have little option besides CS, and then end up somewhere in the chasm of this debate.
Trade schools exist, but since they're generally looked down upon, you're not going to see a lot of popularity in programming trade degrees. Though perhaps the only difference between programming and metal craftsmanship is that CS is more obviously popular and valuable these days. P only might equal NP, so computers and robots haven't yet taken over the job of crafting programs.
This is meant to be a lot less a critique of programming and a lot more a nod toward the intense craft that goes into many of the things you might learn as a "trade". The general skill to build something physical and high quality is so frequently looked down upon these days.
That said, in a world where there are degrees in video game design and journalism, theoretical purity is probably not the defining characteristic of subjects that are taught at university.
And there's no reason that those who are drawn to the purity of math and science behind computers can't attend traditional CS programs. It would be good if real universities could also (as a different department, whatever) provide relevant instruction and research in the realm of programming.
Those who really care about both can always double major.
Nope, I wouldn't. I'd expect them to be able to write the book and then get a lawyer to review the contract that the agent negotiates for them.
Mind you, I've not seen degrees in programming as distinct from Computer Science or Computer Engineering over here. We have one, two and three-year diplomas, yes, but degrees are all on the fundamentals over here.
I just don't get where all the Joel-bashing comes from. You'd never know it from reading this rant or any of the comments but Joel is giving his software to these students for free:
"FogBugz would work great for tracking this: if you’re doing a capstone project and need access to FogBugz, please let us know and we’ll be happy to set you up for free."
More importantly Joel is paying for students to participate in the program, and providing one of his programmers as a mentor. He's also had well-known, paid internship programs at Fogcreek. To me that says he's taken the initiative to fill this void in academia himself, with his own money no less.
"Joel is giving his software to these students for free"
The last thing Joel needs is students getting used to software like Trac or Bugzilla and/or turning those into direct competitors. Having noticed this, it's pretty obvious why Joel would give away FogBugz.
I agree, it is very obvious that Joel would promote his own software. That's not really what I intended my post to be about though.
Joel has long said he only hires great programmers that know CS theory in and out. I don't think that's changed, he's just also identified that recent graduates are lacking in time-management and collaboration skills needed to be good industrial programmers. However, he isn't just having a debate about academic vs vocational education but offering paid internships, and now sponsoring one of these Capstone groups, in an effort to give students the skills he thinks they are missing.
As someone else put it elsewhere, 'the first taste is always free'...
Companies have been giving away software to college courses for as long as there's been an industry. It's a long-term marketing strategy based on the idea that if you give free software to 200 students (with limits on how it can be used so they can't make money using it), then inside ten years many of them will end up in positions where they're asked to recommend software and yours is the first they ever used. Altruistic it is not.
Just because it's business doesn't mean it's not altruistic. pg does YC because he wants young hackers to do interesting work and solve the money problem, but he wants a return. Joel wants people to manage their time and get their work done in an organized way, and he believes in that so much that he wrote a product that does it. I'm sure he wouldn't care if you used any great product to track your time, but there's only one he can offer for free.
Joel is training people who are about to enter the workplace -- people who, through hearing about this, are self-selected as likely to be both vocal and interested -- to use FogBugz.
It's not charity - it's very smart marketing and most likely good business.
From Joel:
> Where do they learn to write a program longer than 20 lines?
Oh man, if I never had to write a program longer than 20 lines for my CS degree, life would have been a lot less hectic (and I also wouldn't have learned shit, which is not the case at all).
Joel's statement was hyperbolic. The average assignment might amount to 1000 - 3000 lines of code for the more advanced courses, but his point still stands.
Real products (products that actually make money) consist of a few thousand lines of "fun" algorithm code and hundreds of thousands of lines of "boring" code.
But code length wasn't even the main point of his essay - time management was.
Most classes I took beyond the sophomore year required rather involved projects to be delivered, be it creating a compiler, a mini OS, or a peer-to-peer chat client. All of those had requirements discovery, deadlines, and checkpoints. We also worked with a partner, which meant dealing with integration issues and debugging the other person's code, which made me appreciate the value of a source code control system.
The classes I regret taking had to do with "hot" industry technologies such as Data Warehousing, Applets, JSP. Spending those credits on something like Graph Theory or deeper dives into Operating Systems/Computer Language design would have been more rewarding.
How many of those hundreds of thousands of lines of boilerplate code have concepts in them that the average CS student isn't taught in their "fun" algorithm code?
None.
The simple fact is that with the amount of things to teach in a CS course, boilerplate code is just not a good use of time to teach. Students come across it in any major project and know it exists in the rest of the course - but spending equal time on boilerplate and on algorithm code would be like teaching an artist a degree course in chemistry before letting him buy his first tube of paint. He'd be a great chemist, but a lousy artist.
As to time management, Joel's point was nonexistent there. He spent the entire article berating students' time management skills, suggested Scrum supported by his product could fix the problem, then quickly blurted out that industrial programmers have equally poor time management skills and in fact are solely differentiated from college students by having managers to enforce time management on them -- and that neither Joel's product nor Scrum can help with that.
Leaving aside the point that he's not actually correct about time management skills (long-term time management isn't just down to the programmer, but to how stable project requirements are and other such factors), and leaving aside the question of just how bad the programmers he works with in industry are (in six years of industry work, I never came across a programmer who both kept their job and had poor time management skills), there's the question of what the hell his post was about in the first place, if all it did was create FUD, hawk his product as a solution, and then indemnify himself from any failure by saying it probably wouldn't work (in a sufficiently roundabout way that it didn't discourage the sale).
> How many of those hundreds of thousands of lines of boilerplate code have concepts in them that the average CS student isn't taught in their "fun" algorithm code?
Remember, his article was about time management. The whole basis of his argument is that college kids assume "boring" code can be done quickly and thus can be delayed until the last possible second.
The truth is "boring" code usually means API code, user interface code, and input processing code. They may involve concepts that CS students have already learned, but it certainly isn't a breeze. It's not whether or not a student can do it - it's how fast they can ship a working product.
As for the rest of what you wrote, you seem to have some grudge against Joel. Maybe it's just manufactured to produce traffic. But that's just my cynicism leaking through.
I'm happy that my article is causing some conversation, but I was NOT advocating that there is anything wrong with the nature of CS education, other than the fact that it happens to be poor preparation for a career as a programmer. Anyone who got to paragraph 4 of my article would realize that I think CS degrees should stay CS degrees. That said, a lot of CS students do go straight into programming careers, so a lot of good CS programs include capstone/senior projects on a team, and they're never time-managed very well, which is what I was writing about.
But are there really all that many undergrad programs that prepare people to go straight into the associated careers? Why hold CS to a different standard? Maybe we just should have a programming school a la medical school :)
Schools should have separate programs for students who want to pursue academic research and those who want to work in industry. Having recently graduated with a CS degree, I found that most school courses didn't help me much in landing a job. It's really the internships (which were part of my program) I did during my school years that helped me most in landing a job.
Through internships, open source projects, research, and real courses (Operating System Design and Implementation comes to mind), students learn everything they need to about version control, bug tracking, etc.
Students who don't are lazy.
Professors who don't gently push students in that direction are lazy too. Good courses have projects so hard that using version control is implicitly mandatory. Further, Fundamentals of Software Engineering at my school requires students to participate in an active open source project (for example: chrome, firefox, scala, eclipse, etc.) and actually have their patches accepted. [Edit] Also, group projects are only collected through subversion (yeah yeah, I know), so students don't have any choice but to learn to use version control.
I can say the same for the University of Washington. A lot of my peers don't know or care much about real-world development, but those of us who realize its importance seek out jobs, research, and internships to fill in the gaps.
While some of this makes sense... this "Undergraduate courses lose their technical currency in something like five years on average" sticks out like a sore thumb.
How well would we be served if C were taught in undergraduate programs? In depth. Add to that a Lisp and Java, and who knows what we would be capable of in a shorter timeframe? C and Java have not gone obsolete, and along with Lisp, many different paradigms of programming are taught. Not only that, but technical skill with important and oft-used languages would be transferred.
I add Lisp simply to include another paradigm and because it's an old concept that's never really gone away, and seems to be gaining traction (see this site, see Clojure, see the continuing development and expansion of Scheme and CL and Gambit etc etc.)
It's easy to point to Java and C as stalwart success stories in hindsight, but I can recall with Java there was a tremendous amount of histrionics from the C++ and VB crowd claiming that Java was merely another passing fad. C faced down Pascal and COBOL, two languages which were considered far more favorable to programmers at the time.
----
I feel sorry for CS/CEng departments these days, as they're facing pressure from both sides: The industry wants more vocational-style training, and students want "real world" experience but also want a world-class western liberal arts education. It's as if their idea of a perfect school is one with the prestige of Harvard married with the curriculum of DeVry. I'm afraid you cannot have both.
I can't imagine a better course than writing a Lisp interpreter in C, and then a Java implementation in Lisp.
They're clearly not fads, they're clearly useful, and the kind of knowledge and education that doing such an exercise would require is exactly something that proponents "world-class western liberal arts education" would laud.
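The interpreter half of that exercise is smaller than it sounds. As a rough illustration (in Python rather than C, purely for brevity), a toy s-expression evaluator with one special form fits in a few dozen lines; the operator set here is just an arbitrary minimal sample:

```python
def tokenize(src):
    """Split source into parens and atoms."""
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def parse(tokens):
    """Build a nested list (the AST) from the token stream."""
    tok = tokens.pop(0)
    if tok == '(':
        expr = []
        while tokens[0] != ')':
            expr.append(parse(tokens))
        tokens.pop(0)  # drop the closing ')'
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok     # a symbol

def evaluate(expr, env):
    if isinstance(expr, str):   # symbol lookup
        return env[expr]
    if isinstance(expr, int):   # literal
        return expr
    op, *args = expr
    if op == 'if':              # special form: only one branch is evaluated
        cond, then, alt = args
        return evaluate(then if evaluate(cond, env) else alt, env)
    fn = evaluate(op, env)      # ordinary application
    return fn(*[evaluate(a, env) for a in args])

ENV = {'+': lambda *a: sum(a),
       '*': lambda x, y: x * y,
       '<': lambda x, y: x < y}

def run(src):
    return evaluate(parse(tokenize(src)), ENV)
```

Everything a course would actually spend time on (closures, macros, garbage collection) is what's left out, which is exactly why it makes a good semester-long project.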
It does stick out... but apparently too far as it blocked out the following qualification "(obviously different sectors age at different rates – web programming has a very fast cycle, embedded systems a very slow one)". C is a systems language, it's been around forever and is not about to be replaced anytime soon, possibly anytime in our lifetimes. It's too good at what it does. And that's why it's taught in a lot of universities.
As is Lisp, for the reasons you mentioned. As is SQL. As are several others (though I'd disagree with Java myself because it's really more set up to do things than to teach things and there are too many shortcuts in there - but many others disagree with me on that point :) ).
I'm not for getting rid of established languages from courses. C is 37 years old; C++, 30 years; Lisp, 51 years; Objective C, 23 years - these are well-established languages that won't vanish in the four-year span of an undergrad course. Java at only 14 years (and so many releases that stability is a valid question) tends (to my mind at least) to mark the start of the gray area there. Some though, like Perl and C++, are really hard to teach in compared to others and that can outweigh their stability as a factor. Still though, I look at languages like Python and would love to work them into a course. But it is still a risk for the students. Maybe for the final year work though.
But teaching a four-year course using Ruby as a primary language, intellectually interesting as that sounds, is an unethical act at the moment. It's a beautiful language, a real joy even to read - but what the students start learning on day one may not be around, or be a useful thing to know for the jobs they'll pay their mortgages with four years down the line. That's the responsibility the university is taking on when it creates a course. The student invests four years of their life - the university must produce a return on that, and conservative thinking is needed for that.
On one hand, we have talk of learning fundamentals, but on the other, if we teach with more exotic languages, "they'll never be able to find a job because they didn't learn anything useful."
So, which is it? Do schools teach the fundamentals, transferable to any language, or do they teach what'll be useful in industry?
The two are orthogonal, at least in some aspects. The fundamentals can - technically - be taught in any language (practically, no one's going to try teaching a course in Brainfuck). But we try to choose a language that gives them, if not industry skills, a platform from which to reach those skills rapidly. For example, teach them C and an OO language of your choice properly - and they can learn C++, ObjectiveC, Java, Python or any of a dozen others very rapidly indeed compared to someone who dove in, learnt Inferno and then graduated into an industry that has never heard of it.
The thing to remember is this - we teach them the fundamentals for a reason, namely to get jobs in industry. At the back of all this academic teaching is a commercial reality that can't be forgotten. But you have to balance that with a long-term view of the student's entire career. We're trying to give them a degree course, not a Sam's book!
That's not bitter, that's justifiable anger steve. There's a duty of care involved on the university's part - if they screwed the pooch, that's a pretty major thing.
This actually lets me respond in a threaded way, unlike your blog, apparently. So my bad, I'll just respond here.
Do you feel that your experience or mine is more mainstream? When I read Joel's "The Perils of Java Schools," I felt that it described my school to a tee. I had just assumed that that's how most programs are, given my school's size. Obviously, Ivy-league schools should be better, but I just kind of assumed that my own experience was average.
Wordpress.com's threading leaves much to be desired I'm afraid.
I can't judge what would be mainstream in US accurately enough to rate your school I'm afraid. I'd see some of the details of the main schools as they write up their courses in journal articles; but my experience is mostly with Irish universities. However, from what several people here and back on the blog and over on reddit are saying, it sounds like the average Irish CS degree is a step or two ahead of at least some US schools. Which I have to say is a major surprise to me, since US schools were (I thought, from speaking to grad students and academics from them) better funded than Irish ones.
I'm not sure if it's a funding issue. It's more of a philosophical one. We've been telling kids for the last few decades that college is the path to success, and we've lowered standards so much that everyone goes to college, so that it's actually true.
College is quickly becoming a big commodity business, rather than a place to learn.
Look up INTERCAL. Brainfuck isn't too hard, just tedious. INTERCAL on the other hand was designed to be as big of a pain in the ass as possible. People like to use the phrase "fighting with the compiler," but this is a language where the compiler is actually actively out to get you. : )
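For anyone who hasn't seen why Brainfuck is "not too hard, just tedious": the whole language is eight commands over a byte tape, and a complete interpreter is a small sketch (here in Python, with bracket pre-matching so loops jump in O(1)):

```python
def bf(program, inp=""):
    """Minimal Brainfuck interpreter: a 30,000-cell byte tape and a data pointer."""
    tape, ptr, out, inp = [0] * 30000, 0, [], list(inp)
    # Pre-match brackets so '[' and ']' can jump directly to their partner.
    jump, stack = {}, []
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i
    pc = 0
    while pc < len(program):
        c = program[pc]
        if c == '>':   ptr += 1
        elif c == '<': ptr -= 1
        elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.': out.append(chr(tape[ptr]))
        elif c == ',': tape[ptr] = ord(inp.pop(0)) if inp else 0
        elif c == '[' and tape[ptr] == 0: pc = jump[pc]  # skip loop body
        elif c == ']' and tape[ptr] != 0: pc = jump[pc]  # repeat loop body
        pc += 1
    return ''.join(out)
```

The tedium is entirely on the side of *writing* Brainfuck programs; the interpreter itself is a pleasant afternoon exercise, which is part of the joke.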
You're contrasting Ruby as an upstart and Java as an established language, but they're basically the same age. They both had their first public releases in 1995.
(Of course, this doesn't invalidate your point about relative popularity.)
And that's exactly the problem. In liberal arts educations (such as the one I'm receiving) or even at universities such as MIT, Python is the language of choice. It's great fun to teach, very easy to learn, but it's not an industry standard, and many would argue that it's too easy to use.
I don't think it's reasonable to claim Python isn't an industry standard. It's used heavily in industry by large and fairly conservative companies and is even more popular with startups.
I think it's reasonable because there are very rarely any such things as standards in software development. Municipal building codes are standards. Drug testing regimens are standards. Programming languages are not standards. At best they are conventions, and only insofar as certain niche subset industries are concerned.
As for de facto standards, I don't think we should be paying much attention to them. I think it's very important to draw the distinction that software development currently has no standards (you might be able to argue for TCP/IP, where alternatives are only ever used because they fulfill some unique use case not covered by TCP/IP), because for as much as we want to call this industry "software engineering", we sure as hell don't treat it like any other field of engineering, for many of the reasons already mentioned.
I think the first step is to sit down and agree on some terminology. You can't even get two programmers to agree on what "Object Oriented Programming" means. No wonder we aren't treated like professional engineers. We don't act like them.
Most "industry standards" are de facto standards. We're not talking about laws here, or even open standards like C or Common Lisp. An industry standard is simply a widely-accepted practice, which is perhaps defined by the fact that it would not usually be questioned by a casual observer from that industry. Painting interior walls some shade of off-white is an industry standard. Using Python for scripting and application logic is an industry standard in the same way.
I wouldn't say that using Python for scripting is even a de facto standard in that sense. Don't make the mistake of assuming your experiences are normative.
Really? Python seems to be showing up everywhere I look. I don't think I've seen any largish company in quite a while that wasn't using python for something somewhere.
It's spectacularly useful, and I personally really love using it - but to say it's an industry standard the way C or C++ are is to stretch the point a wee bit too far unfortunately.
Still though, we use some languages for teaching (like Pascal or Modula-2) which don't have the kind of industrial usage levels of C or C++, so we might see Python being taken up sooner rather than later. I think there are one or two courses already using it over here on a trial basis.
Python is "the real stuff", and I'm always annoyed by the snobbishness that leads some people to dismiss it. Python is an excellent language, with a large user base, and some compilers in the works.
Just because C is hardcore and Lisp is amazing and Haskell is mind-exploding doesn't mean that Python isn't a great language.
C/assembly to learn about the machine and because they're everywhere, Lisp to learn about all the paradigms, and a popular scripting language if you need big libraries. Java is so ugly, it'll just confuse them. It confused me, anyway.
Perhaps it's not too popular, but VHDL or Verilog are a ton more useful for learning about how a machine works. Seriously, writing some x86 assembly teaches you a thing or two, but implementing a full CPU from the design of the ISA to putting it on an FPGA is the way to go. Once you've done that, all the magic is gone, and computers are completely transparent and understandable.
Yeah, it's kinda like the point of learning the different sorts is not really to learn how to sort things - it's a vehicle for teaching big-O notation.
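To illustrate that point: instrumenting two sorts to count comparisons (a toy exercise, not any particular course's material) makes the quadratic-vs-linearithmic gap visible before any notation is introduced:

```python
def insertion_sort(a):
    """O(n^2): each element may be compared against everything before it."""
    a, comps = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a, comps

def merge_sort(a):
    """O(n log n): halve, sort each half recursively, merge."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]  # one side is exhausted; append the rest
    return merged, comps
```

On a reversed 256-element list (insertion sort's worst case) the comparison counts differ by more than an order of magnitude, and the ratio keeps widening as n grows; that growth, not the sorted output, is the lesson.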
If I had learned C, Lisp, and Java "in depth" when I was an undergraduate, much of the effort to learn them would have been wasted.
My first programming-ish job involved documenting C++ libraries (in an era where C++ templates had just started to be supported by mainstream compilers). Then I had a job involving Perl and Java, and even though I had no experience with Java I had no trouble getting up to speed with it. Then I had a job involving J2EE; I hear that the "enterprise" Java world is doing a lot with Hibernate and other lightweight frameworks these days, so what little I learned about EJBs on that job is now stale. And now almost all of my coding involves some combination of shell scripts, Python, and SQL.
There's a place in every software shop for the language lawyer who really knows a programming language's subtle tricks and traps, but not everyone needs that coming out of a CS program.
The most important thing to learn is data structures, algorithms, complexity.
The languages and techniques are not that important and change all the time.
Just go into a course where you understand what a hash table is, what an NP problem is, what a Turing machine is, etc.
Understand what an operating system is, what virtual memory is, how a program is scheduled.
See how a compiler works.
Learn with a functional programming language, learn a language where you need to manage memory and pointers yourself, learn an "object oriented language".
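To make one item on that list concrete: a separate-chaining hash table (a toy sketch, nothing like how production tables such as CPython's dict are actually tuned) is only a handful of lines, and writing one is the fastest way to understand why lookups are O(1) on average:

```python
class HashTable:
    """Toy hash table with separate chaining: hash the key, pick a bucket,
    linearly scan that bucket's (key, value) pairs."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # Average chain length stays short as long as n_buckets tracks the
        # item count; real tables resize, this sketch deliberately doesn't.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # existing key: overwrite in place
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # new key: append to the chain

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default
```

A student who has built this once never again confuses "dictionary" the language feature with "hash table" the data structure, which is exactly the kind of understanding that outlives any particular language.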
I think most people agree that universities should try to teach people how to learn, and not just the latest technologies and good engineering practices. But I've sat through too many university lectures on programming language syntax or introducing threads to believe the university is teaching me "how to learn".
Why not tell students, "here's an introduction to C++, read it before next week's lecture" or "for this project you must use a VCS, pick one and learn it" instead of making me sit through lectures where the professor reads documentation to me? Have other people had the same experience?
'"here's an introduction to C++, read it before next week's lecture" or "for this project you must use a VCS, pick one and learn it"'
My one modification would be "here is the problem, implement a solution however you choose" (with maybe a caveat that it must be in a language that the TAs can read well enough to grade your work). Maybe some suggestions as to which languages are appropriate, especially if libraries written in a specific language are required or recommended.
I think that learning the theory is more important than the tools we use in industry. Picking up a programming language is pretty easy if you already know all the hard topics like different paradigms, computability, logic, mathematics and so forth. Sure, you can learn these on your own, but I think it's much more valuable to have people who know how to learn, know how to program and have a good, solid theoretical background.
I don't think you can be a great programmer unless you can develop a non-trivial program in a programming language in all major paradigms. The best way to do this, is to know the more abstract details beforehand.
Having said that, I do believe it's important to get a lot of hands-on programming experience, and I feel that universities need to be somewhat more hands-on too. I also think that they should put some focus on teamwork and collaboration, on tools like debuggers, source control and profilers, and also things like unit testing, though it is much, much easier to learn these later than it is to learn the theory later.
This piece is a bit much, but I do think it's odd for Joel to be prejudiced in only hiring people from top schools and then complain about how top schools are too theoretical. You can't have it both ways.