Do Developers Need College Degrees? (stackoverflow.blog)
160 points by technologyvault on Feb 21, 2017 | hide | past | favorite | 194 comments


To succeed personally: no. Some dev jobs obviously require a lot of foundational knowledge. I wouldn't want a colleague who didn't know linear algebra, simply because that's important in my field. It's not important for making WordPress sites.

I think there are things "every good developer should know", such as how CPUs work, how OSes work, how some important algorithms work, some basic set theory and complexity, etc. All of this is doable without a degree, but I'd be more willing to bet that someone with a degree knows it than someone without. Also, of course, there are jobs that are doable without knowing the foundations, but I'd call those "code monkey jobs", and whether such a job constitutes success I guess depends on who you ask. And what it pays.
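To make the "basic set theory and complexity" point concrete, here's a small illustrative sketch (my own example, not the commenter's): the same membership test is O(n) on a list but O(1) on average for a set, which is exactly the kind of foundational fact the comment is describing.

```python
import timeit

# Hypothetical example: membership testing on a list vs. a set.
n = 1_000_000
data_list = list(range(n))
data_set = set(data_list)

# Searching for the last element forces the list to scan all n items,
# while the set answers with a single hash lookup on average.
t_list = timeit.timeit(lambda: (n - 1) in data_list, number=100)
t_set = timeit.timeit(lambda: (n - 1) in data_set, number=100)
print(f"list membership: {t_list:.4f}s, set membership: {t_set:.6f}s")
```

A developer who knows this reaches for the right data structure instinctively; one who doesn't may never understand why their code slows to a crawl at scale.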

Most importantly, when I recruit I don't see the diploma as evidence of what they know but as evidence that they could learn, quickly. If the diploma is from a prestigious school but says they failed math 101 four times over, that would make me doubt their capacity. Being able to pick something up is the capability I'm looking for. A curious mind, with a taste for the abstract. A diploma is just one clue.


One of the much-downplayed things that degrees are more likely to indicate than personal projects is foundational knowledge. When you work exclusively on WordPress sites, this is irrelevant. When you work on distributed systems or cryptography, foundational knowledge gets pretty important.

It is completely possible for any person to learn foundational knowledge outside of a formal instructional setting. I personally know excellent developers who have done so. It's unquestionably possible. I firmly believe that given sufficient time and determination and help, it's possible for any person to learn any subject to any level.

It is, however, possibly true that not all people learn discrete mathematics, relational algebra, and computer architecture in an ideal manner when entirely self-directed. Or perhaps not even all developers.


You don't know what you don't know.

This is what struck me about my formal education. While I also believe you can learn anything with enough material and dedication on your own, having the structure to ensure that you cover a set of subjects that constitute "foundational knowledge" can be invaluable.


What is it about formalized schooling specifically that you believe makes it uniquely capable of teaching people what they don't know?

When I was 18 I spent dozens of hours poring over a university course catalogue, looking at the required courses for each degree, the prerequisite trees, and the requirements for graduate degrees. It felt like a bird's-eye view of the entire fractal structure of human inquiry.

Later, I realized how much sharper the view is for people who do that today. Regardless of whether or not they're a student, they can see the course catalogues online and, in the case of MIT (and many top institutions), even watch lectures of those courses recorded in previous years. If you're reading this now and you don't have a good idea of what kinds of knowledge are out there, it's probably your own choice.


I think one factor is that Universities are usually involved in research, so they're in a good position to continually identify what's important out of the existing body of knowledge, and keep their courses up to date with new ideas.

Another factor is that you can't skip over topics that don't immediately interest you. Some self-taught people can have that discipline as well, but it's not universal.


> What is it about formalized schooling specifically, that you believe makes it uniquely capable in teaching people what they don't know?

The teacher. Teaching/learning was always about the teacher, who knows what and in what order to teach.

You can have the same effects with the same effort as with formalized schooling if you have a private teacher, but how many self-taught developers had them? And then there's this small thing that in formalized schooling you're taught by several people, each for their own discipline. It's hard to come by a tutor that knows seven different fields and it's hard to come by seven tutors for each of the fields.


> who knows what and in what order to teach.

Plenty of courses have their general outline publicly available. Some colleges even share older lectures [0]. There are more-than-enough resources available for self-directed learning. Have questions? StackOverflow exists.

The #1 criticism against self-directed learning is how self-motivated the autodidact is. Most people suck at staying motivated to teach themselves. If they aren't 100% passionate about learning the subject and can't keep their interest going, they will fail.

[0] https://www.youtube.com/user/MIT/playlists


I totally agree with you -- staying motivated, especially if you are learning alone from home, is incredibly difficult. I have a thesis that I have recently started to validate: that you can considerably improve motivation in self-directed learning environments by introducing online collaborative learning. I just wrote about it and launched a little course/workshop to test the idea: https://medium.com/@arielcamus/learn-to-build-a-backend-with...


> Plenty of courses have their general outline publicly available.

Yeah, something like the educational videos we've had since the VHS era, and even before that, handbooks. It's still nothing like having a real teacher.


This is an excellent point. A lot of what I encountered in university was things I didn't know before, and didn't end up learning either, but I learned that they existed.

When faced with a problem now, I can often identify at least what area of my non-knowledge it belongs to. In that sense I knew a lot less after university than before (in a relative sense).

The ninja skill you get from a very broad education isn't solving problems by applying knowledge, it's solving problems by identifying them (then Googling).


Don't forget the part where you had to do things that you didn't like (possibly hated) and only much later, truly appreciated.


> You don't know what you don't know.

Self-taught developer here, with the mentioned "foundational knowledge." I'm very strong as far as any university-level CS subject goes. One area where I have recently learned that "I don't know" is anything beyond high-school calculus, having started to play around with TensorFlow. You are most certainly correct, and now, instead of immediately having that expertise at my fingertips, I have to self-learn it over a month.


> in an ideal manner when entirely self-directed

With all the resources available these days, including MOOCs, textbooks, syllabi, and recorded lectures from the best schools, there's a middle ground between "degree" and "entirely self-directed".


You're completely right! Those are all amazing resources.

I would posit that using textbooks, syllabi, recorded lectures from the best schools and greatest professors, and documentation is still a form of self-direction. The learner is left entirely to their own direction, discipline, and environment. Perhaps the only thing that's changed in that list of wonderful resources since the 1960s is MOOCs. I thought of and considered these resources when writing my previous comment.

Perhaps I would have been best-advised to use a turn of phrase such as "self-directed with use of readily available non-instructor instructional resources". I thought such a phrase excessively clunky and unnecessary to make the point, but perhaps I was wrong.


> The learner is left entirely to their own direction, discipline, and environment.

I think this is one of the main problems of self-directed learning, and why, on average, it's not nearly as effective for most people unless they are incredibly motivated.

I just started playing with the idea that online collaborative learning can considerably reduce that pain and I just wrote and launched a little experiment to test it:

https://medium.com/@arielcamus/learn-to-build-a-backend-with...


It's a great idea! I would consider being careful with peer instruction, however. In software it runs the considerable risk of encouraging and inculcating bad practices, which is something an instructor is often used to mitigate in classroom-style environments.


Well now we're opening up a can of worms. Are not those with instructors also partially self-taught?


By that logic, everyone is self-taught. So perhaps that line of reasoning proves too much, rendering it potentially suspect.


What do you think of this curriculum? https://github.com/open-source-society/computer-science

I'd say that these days it's easier to gain all that "foundational knowledge" without attending a university.


A person with a strong grasp of the subjects described therein would have excellent foundational knowledge.

I would say that you're right! These days it's easier to access all the required instructional material for foundational knowledge outside a university. I would also say that this has probably been true for multiple decades. I would say that a list of MOOC courses is possibly not the same as a strong grasp of core materials, however.

Such a curriculum, while strong at first blush, likely suffers from many of the flaws of self-directed learning. It is perhaps not access to information or instructional material that is the problematic part of the educational process.


You sure can gain all that knowledge, but my experience is that without a formal setting with defined expectations and specific goals, few people will actually have enough motivation to spend a few hours every day for a few years doing this.


Yes, but if you are a dropout because of e.g. financial reasons, you don't have the degree but can have the knowledge.


The best approach I've found is to pick any university, look at their recommended course calendar for a CS degree, and go to those course homepages to get the lecture notes/slides or videos if they are public. This will be a course structure made by professional educators to give you the intuitive ideas for a subject. Then look up the same foundational material in The Art of Computer Programming or another academic text with enough challenging problem sets that you remember the material. Unfortunately, many universities require campus accounts to get at homework or tests, so the only other way is to get it from TAOCP.

While you do this apply for entry level support engineer somewhere and work your way into development from within the company or find open source projects and write requested features, debug them when problems arise, repeat. That plus being helpful on the mailing list and stackexchange for whatever project it is almost always leads to email job offers coming in. When you have enough experience you can directly apply to developer ads and hopefully survive the initial screen for not having a degree as you have real experience, plus all those problem sets you did will help pass a whiteboard interview.


Not having a degree, this was one of the biggest issues I didn't know I didn't know. I encountered it after about 6 years on the job.

I landed a role in a blue chip and got a great boss who taught me the foundational theory I was missing.

I didn't ask him to, per se, but I think he just sensed I needed it.

It made a huge difference to me as a developer. So I do agree, foundational knowledge is kind of a big deal.


With everything that's available in documentation, on YouTube, in forums, and in boot camps and other trainings, doesn't it seem like there are much more efficient ways to get the foundational knowledge that don't require four years of expensive college courses?


Step one: Find that material organized into a complete lesson plan. Without curation, a library is just a pile of books.

Step two: Package that with a bunch of people (both experienced and inexperienced) who can help with coursework and answer questions when they come up, teams of people to work with, somewhere to live with dozens of others doing the same work, hundreds more learning unrelated things (but who might be useful to you in the future), and you've got something close to a replacement for college.


What about Udemy and Lynda?

Every time I think of all the extra hundreds of hours I spent learning things that didn't help my developer career and seemed like distractions, I compare them to courses published online with precise, genuinely helpful learning modules.

College just seems like it's so expensive compared to the value you get out of other sources of knowledge.


College is a lot more than learning about software development and computer science. I guess that's what I was getting at. It's expensive, but I don't know any other way to meet the same mix of people and get the same set of experiences. And that's part of the point: If you just want to learn development, college shouldn't be a prerequisite. I'd imagine that some kind of trade school, self-study, or job training would make more sense.


All the material exists in one form or another, but I've yet to see anyone put together a credible "DIY CS degree" online resource.

Some day most (certainly not all) aspects of university education will be supplanted by high-quality universally accessible materials. But it's not here yet.


I think even if that existed, you would still miss one of the most important aspects of college and any other traditional learning environment: the motivational side of having a community of learners and teachers that hold you accountable and help you stay motivated.

I've just started playing with this thesis and published an article about it. I'd love to hear what you think about it: https://medium.com/@arielcamus/learn-to-build-a-backend-with...


Is this a market opportunity? Is there a way to make money producing a high-quality curated list of online resources, plus a series of online tests to prove that you learned it?


Udacity is trying with their nanodegrees.


It does seem utterly inconceivable that getting copiously documented things into the heads of people would be something only accomplishable through an outmoded and expensive educational process. Surely we've come up with something better by now with all the effort being poured into it!

Yet, perhaps there's something about that process. After all, there are aspects to a formal collegiate experience that are not neatly captured by boot camps, fora, YouTube, and documentation. Perhaps some of those aspects, like an instructional environment that values theory, are of nontrivial significance.

Obviously, it's still possible that something better can be done. This may not be the same as a better option having been developed and being on offer, though.


This may or may not scale, but I've been lucky to be part of a startup program that roughly matches your description of a "better option". It's a 2-year program; I was, and still am, part of its experimental first class of students. So far I've gone through 9 months of pure CS study, and I was even placed in an internship in Mountain View, all without a college degree.

I have never held a dev job before, but I've gotten to do some pretty serious and nuanced coding (all the way down to memory management and writing in C). They're a really great team of people, their unofficial mottos are "Google it" and RTFM, and their program is called Holberton School.

I really hope it succeeds.


That does sound very promising. I, too, hope it succeeds.

How much time have you spent on set theory or discrete mathematics?


None. At least not on the first 9 months. Mostly it has been about performance, data structures, algorithms, and time complexities.

So far, in the first part (9 months), the focus is mostly on practical project-based learning and skills for developers. But we are encouraged to learn on our own. I certainly know about uncountable sets and the Axiom of Choice, but how much do I really need those for the problems I am solving day to day?


Thank you for answering my question.

Unfortunately, I must inform you that that is the answer I feared. There's a strong tendency for would-be "better options" and bootcamps to discard CS fundamentals and theory in favor of practical education. I am of the opinion that this sacrifices long-term practical utility for short-term utility. While seemingly of obvious benefit to those seeking jobs in the not-so-distant future, this is a penalty that mounts later in careers.

Odds are very good that your entire career will not use whatever tools this program has taught you. Odds are similarly good that your time as a junior engineer won't hinge much on abstract mathematics. But odds are very good that computers will run on the same mathematics in twenty years.

Even today, more interesting work (cryptography, geospatial, distributed systems, graphics) hinges on the sort of mathematical underpinnings that are generally found in a full collegiate computer science education. Of course, all of this can be learned independently, but most individuals struggle to learn cryptographic mathematics in such a way.

So really, it depends a great deal on what you want to do with your career. How much you know about how computers work will do a great deal to determine how much flexibility you have down the line. I have had jobs where reasonably complex synchronization problems involving work-stealing and partial orderings over a network were pretty common, and other jobs where `rails g ...` was the most complex thing I needed to know.


Well, like you said, it really depends on what you'll be doing. Granted, there are different paths one's trajectory will take, and you won't know them in advance.

On the other hand though, I wouldn't presume to write my own cryptographic protocol without having the fundamentals myself. Wouldn't you say that it's actually reasonable to learn this on your own? There are so many options, paid, or free, that can help you with this. You could take a Coursera class, or follow an open source curriculum like someone said in the comments down below.

You did say "most individuals" struggle to learn in such a way, but the argument can be made that individuals studying cryptographic principles are not "most people", which, if not intelligence, shows a special kind of perseverance and dedication that will also differentiate them in self-study. I wouldn't say what you're saying is immediately obvious.


You're right! The world contains a wondrous bounty of instructional materials the curious might use to attempt to edify themselves.

You're also right that the argument can be made that any individual who might seek self-study of advanced mathematics is not "most people" as I described previously. To that subject, let me offer a different formulation: most individuals attempting to study advanced mathematics independently struggle to learn effectively in such a way.

A non-zero number of people have set about doing what you describe with a special kind of perseverance and dedication... and wound up making rather severe mistakes. CryptoCat comes to mind. Perseverance and dedication failed to differentiate them. You may be different! It's very possible! But perseverance and dedication should not be confused for a rigorous and rigorously evaluated course of study. This becomes a significant difference when questions of scaling arise.

The world is full of options to help you learn, and I would not dissuade you from doing so. I just want you to be aware of the limitations likely to be imposed by a given educational approach.


As a percentage, how many US colleges would you say are excluded from the opportunities provided by "rigorous course of study"? Would you say you could get this kind of rigorous study at any university? If not, how many (as a percentage of all undergrad colleges)? Is this scalable/sustainable for the entire population?


I don't know, and I don't think the answer is as relevant in this context as might be hoped. There are many known, admitted, and well-documented failings in and problems with the current tertiary educational system. Yet, it's perhaps possible that these are not repaired by stripping away some of the aspects that are of value.

The general need for a better approach might not be the same as a given different approach being better.


> With everything that's available in documentation, on YouTube, in forums, and in boot camps and other trainings, doesn't it seem like there are much more efficient ways to get the foundational knowledge that don't require four years of expensive college courses?

Sure, but the risk of getting led onto an inefficient, time-wasting path that doesn't give you a good grounding in the foundational knowledge is also high.

The skill to evaluate approaches to learning the information and the set of information you need to learn is not something you are likely to have without having studied the information, and there are lots of people with pet theories or financial interest in your actions trying to promote different approaches.



> If the diploma is from a prestigeous school but says they failed math 101 four times over that would make me doubt their capacity.

I don't think diplomas nor resumes normally list failed classes ;)


It's not uncommon for employers to request GPA and/or transcripts. Source: I graduated 2 years ago.


I've never been asked for GPA or transcripts (been graduated since 2012). YMMV, depends on where you apply I guess.


I graduated in 2008 and was asked for my GPA when I applied somewhere in 2015...but that was an unusual case. One place out of dozens.


If I see a resume with any GPA but a 4.0, I generally laugh.

If you got a 4.0, sure put it on resume. But a 3.2 or a 3.6? Silly, leave it off.

Source: Graduated longer than 2 years ago. Never been asked for a transcript.


I always get them without even asking (Europe). I never received a "diploma" however. Only transcripts of classes.


It's usually obvious from the ones I see (I don't see diplomas; I get transcripts with classes taken, dates they were passed, and grades).

The usual sign is someone, e.g., getting a master's degree 2000-2005 and passing the introductory math classes in 2005.


Your metric would lock me out: I did Real and Complex Analysis sophomore year but was forced to take intro to statistics senior year in order to graduate.


That kind of thing is in every transcript. Doesn't look strange at all with the odd one like that. It's rather systematic failure I'm trying to look out for.


Maybe early in your career you are correct; however, after 5+ years of professional experience, nobody even cares which school you went to or whether you have a degree at all.


This is my general experience as well.

The exceptions are companies like Facebook and Google, whose hiring practices seem to really be focused on hiring people straight out of college as opposed to established programmers.

Perhaps it's too broad a generalization - I haven't worked at Google - but all the companies who ape the GAFA hiring policies are indeed only interested in developers straight out of college. And I have a hard time blaming them; college students are fairly inexpensive to hire, and there's a lot of them out there looking.


So they can mold them into the types of programmers they want.

i.e., train them to be corporate zombies.


I knew linear algebra in high school. I taught myself matrix math because I wanted to make games, and because the most challenging math my lackluster rural public high school offered was basic calculus.
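For readers curious what that self-taught matrix math for games looks like, here's a minimal sketch (my own example, not the commenter's): rotating a 2D point with a rotation matrix, the bread and butter of simple game transforms.

```python
import math

def rotate(point, angle_rad):
    """Rotate a 2D point about the origin by angle_rad.

    Uses the standard 2x2 rotation matrix:
        [x']   [cos -sin] [x]
        [y'] = [sin  cos] [y]
    """
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    x, y = point
    return (c * x - s * y, s * x + c * y)

# A quarter turn takes (1, 0) to (0, 1), up to floating-point error.
print(rotate((1.0, 0.0), math.pi / 2))
```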

My top university's CS program had a linear algebra class that loaded me up with theory, though: Hilbert spaces, wavelet transforms, spectral theory, abstract algebra, and of course principal component analysis (the latter, along with SVD, is becoming more valuable these days with data-science-y jobs).

It was more concerned with proofs and advanced concepts than applications, which was fine for me, since I'd already learned the applications in high school out of boredom and self-motivation. The most valuable benefit of learning all that extra theory was learning how to solve problems, because doing proofs isn't some algorithm like long division or evaluating a derivative that you can just crunch as a cog in the wheel.

The question is whether all of that extra rigmarole is worth it in my day-to-day development work. The answer to that is "no." Occasionally, I'll borrow some code that does thing X that is tangentially related (like convolution for computer vision and deep learning), but the theory behind it rarely comes in handy because I'm not doing academia or R&D work.

Of course, your usual full stack developer job won't even come close to touching any of this, but if you're doing moonshot type work like self-driving cars, you're going to need all of this theory to really attack the types of unsolved problems you're facing. Those types of jobs are rare, sometimes highly-compensated, and inaccessible to your Hack Reactor grad.


Linear algebra is used in even the simplest of games, and in almost anything related to CAD, computer vision, engineering, ...

If the problem is related to web and databases and crap then there is rarely any linear algebra (heck there are rarely even real numbers) but we didn't go to school to work with databases and web forms did we ;)

Obviously Hilbert spaces might not be useful everywhere, but anecdotally, I did stumble across one interesting problem in my first year out of university: a drawing program needed to scale a drawn ellipse in one direction, which wasn't necessarily along one of its axes. Q: is the result an ellipse? What are its axes?

A: (I had to ask a friend with better linear algebra skills, which still bugs me): the axes will be given by the eigenvectors of the matrix involved. The math was fairly hairy compared to most other things in a simple drawing program, but it was fun and really rewarding when the math worked.
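That answer can be checked numerically. A hedged sketch (the specific ellipse and scaling below are my own, not the commenter's): writing the ellipse as x^T A x = 1 with A symmetric positive-definite, a scaling S maps it to a conic with matrix S^-T A S^-1, which is still symmetric, so the result is still an ellipse, and its axes are indeed the eigenvectors of that matrix.

```python
import numpy as np

# Axis-aligned ellipse with semi-axes 2 and 1: x^T A x = 1.
A = np.diag([1 / 2**2, 1 / 1**2])

# Scale by a factor of 3 along the direction (1, 1)/sqrt(2),
# which is not along either ellipse axis.
u = np.array([1.0, 1.0]) / np.sqrt(2)
S = np.eye(2) + (3 - 1) * np.outer(u, u)

# Points y = S x on the scaled curve satisfy y^T (S^-T A S^-1) y = 1.
S_inv = np.linalg.inv(S)
A_new = S_inv.T @ A @ S_inv

# A_new is symmetric positive-definite, so the curve is an ellipse;
# its axis directions are the eigenvectors, and the semi-axis
# lengths are 1/sqrt(eigenvalue).
eigvals, eigvecs = np.linalg.eigh(A_new)
semi_axes = 1 / np.sqrt(eigvals)
print("semi-axes:", semi_axes)
print("axis directions (columns):", eigvecs)
```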


There is a tiny fraction of linear algebra that's very useful for most games, but you can pick most of it up over a long weekend. Further, having taken the class five years ago means you still need that long weekend anyway...

Compared to, say, learning DirectX, it's a tiny time investment and hardly worth a degree.


Yes, and most of your linear algebra in the field as an engineer consists of plugging numbers into MATLAB and letting it crunch.

It's much better for the bottom line if you just use everything out-of-the-box and bolt it together with duct tape and bubble gum.

You'll keep your boss happy with your fast progress, and the bean counters will be happy as well.


> ... I don't see the diploma as evidence of what they know but as evidence that they could learn, quickly.

It's better evidence that they can follow steps to complete something, but it's not a good indicator of the one thing I value in SEs: being self-directed in tackling a problem.

Also, this varies a ton by sub-industry; over in games we don't care about degrees at all, just demonstrated work. Once you move along the sliding scale toward the larger corps, it starts becoming a checkbox for HR that won't get you through the filter (most of the time).


> I think there are things "every good developer should know" such as knowledge about how cpus work, how OS'es work

Considering the complexity of modern CPUs and OSes, does college go into enough depth to teach them well, or does it just let students think they know how they work?


Well, what you need to know as a developer of either is pretty limited. Understanding the basic existence of the following will get you a long way: cache hierarchy, branch prediction, instruction reordering, syscalls, interrupts, virtual memory and page mapping, the layers of the networking stack, etc. You might need to go in depth on a particular CPU/OS later during your working life, but most likely not; just know enough to have a semi-realistic picture of the layers below your own code. Those are all things that can be self-taught to some level, especially if you have the discipline to write some experimental code to test them on your system. For most developers, however, those are things that are easier to learn in a classroom than on your own time.

(Some of the theoretical parts are even more suitable for college and equally useful: complexity/computability theory, lambda calculus, graph theory, automata theory, ML, etc.)
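As a sketch of the kind of "experimental code" mentioned above (my example, assuming NumPy is available): summing a large array along contiguous memory versus strided memory will usually expose the cache hierarchy, even from Python.

```python
import timeit
import numpy as np

a = np.random.rand(2000, 2000)  # C (row-major) order by default

def sum_rows():
    # Each row is contiguous in memory: cache-friendly traversal.
    return sum(row.sum() for row in a)

def sum_cols():
    # Each column strides across rows: far more cache misses.
    return sum(a[:, j].sum() for j in range(a.shape[1]))

t_rows = timeit.timeit(sum_rows, number=5)
t_cols = timeit.timeit(sum_cols, number=5)
print(f"contiguous: {t_rows:.3f}s, strided: {t_cols:.3f}s")
```

On most machines the strided version is noticeably slower even though both compute the same total; exact ratios depend on the cache sizes of the system you run it on.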


My classes taught me the existence of a lot of low-level constructs and showed us example implementations. To me, it's more like some handholds to use when I need to go spelunking into a particular implementation. Do I always know how something works? No. But I've got starting places to investigate when reality doesn't match my mental model.


"A curious mind, with a taste for the abstract. A diploma is just one clue." This is me and my BA in math. I would like employers, coworkers, and people in general to understand this point. They often don't get it. Also, I've come to realize that I am actually a slow learner. I need practice and patience to deeply understand something and confidently talk about it. Where do I belong, then? Personalities who say they are quick, and who are able to confidently sell things, irk me and make me question whether they're truly giving the best they can. You could read a Wikipedia page about something and, with prior knowledge and a quick wit, convince most people of your expertise.


I'm someone who graduated with a non-CS engineering degree (and worked professionally as an engineer) who's now in an MSCS program after trying my hand in the job market. Not for lack of trying, but my experience has definitely been an uphill struggle just to get noticed. And before anyone says "you should do x|y|z", I've done my part pretty well: a few decent-sized personal projects, constant studying, meetups, networking, GitHub, etc.

I hear a lot of anecdotal evidence (mostly from professional devs who have degrees) that they know a friend or coworker who doesn't have a degree.

The truth is that the system for recruiting and hiring is really based around colleges, students, and degree holders. Starting from the application requirements, you see that most applications require a CS degree (or equivalent). Then the technical interview is aimed at fundamental CS questions, most of which students are accustomed to.

So for someone self taught, face to face interactions are a lot better. But even going to events/meetups, you notice a distinct change in demeanor of the recruiter when you mention you don't have a CS degree. It really is a tiring battle just to prove that you're competent.

Do I believe that you 100% need a degree to get a CS job? Definitely not. But without one, your options are very limited and on top of that you'll need a lot of luck and effort just to get past the first hurdle.


IIRC, the article says that a whopping 56% of developers don't have a degree. Given that the majority of developers don't have a degree, I'm not sure I'd agree with saying " without one, your options are very limited". Maybe "somewhat limited"?

My anecdotal evidence, looking at both my own career and a lot of developers I've worked with over the past 17 years, is that having a degree, or not, isn't a big deal.


What could be true with the last generation may not necessarily hold true for the next generation.

Many are working towards abstracting schooling out of being a developer, great.

That also means we're moving towards making being a developer a blue collar job rather than a white collar job. We're also making private businesses now the trainers of labor.

The time to be a dropout rockstar developer was in the 2000s but we're moving towards 2020 now.

The real question is for the generation of 13-year-olds now, over the next 5 years. They can study software development on their own, but in 10 years' time, will they be better off spending half of that time on an expensive college education, or not?

Everyone in this thread is giving out answers based on their own experience surviving the past 20 years, as if what happened in the tech industry over the last 20 years is where we're at right now or where we're going over the next 20.

I wrote this on hardware and software made by and improved by organizations of highly educated and highly specialized people. It's absurd to think the future is only going to be built by Palmer Luckeys and 16-year-old hacker geniuses.


I agree with this. Furthermore, the bright youths of the nineties spent a lot of time putting hardware (and software) together. Now a lot of this is on a commodity level, open source packages abound, and the theory is more accessible than ever.

Software as the future is yesterday's news.


> That also means we're moving towards making being a developer a blue collar job rather than a white collar job

This will never happen; the majority of the population doesn't have the dedication and/or intelligence to be employable as software developers. I've given numerous people resources to learn how to code, and none have stuck with it.


> What could be true with the last generation may not necessarily hold true for the next generation.

Fair point, but I still haven't seen much - if any - evidence that it's any more necessary for developers to have degrees now than in the past. But, as always, it's going to vary by (company|geography|industry vertical|etc.), so my sample data is almost certainly not broadly representative.

> I wrote this on hardware and software made by and improved by organizations of highly educated

True, but FWIW, I would argue that "highly educated" and "highly formally credentialed" are not the same thing.


> That also means we're moving towards making being a developer a blue collar job rather than a white collar job.

We're going to make software development into a job involving manual labor?


Treadmill desk?


> I wrote this on hardware and software made by and improved by organizations of highly educated and highly specialized people.

Maybe a lot of that useful hardware and software was not written or designed by people who own a sheepskin?

> It's absurd to think the future is only going to be built by Palmer Luckeys and 16-year-old hacker geniuses.

Does a sheepskin enable that type of high quality condescension? I guess it's good for something, eh?


I think for an out-of-the-gate job this is true. But after you've had your first job, it stops being as much of an issue. I have an Aerospace Engineering degree. I was able to get a WebForms job without much experience and with a minor in CS. I will never knock that job, because they took a chance on me, paid me a semi-competitive salary, and let me learn a lot.

I haven't had a hard time finding something since then. I do hate the basic algorithms questions every interviewer asks, but a month or so of Coursera and I'm ready to answer all of the "Implement a QuickSort algorithm in 20 minutes" questions again.

You either need luck or you need to be willing to take a job that isn't as reputable to learn programming as a profession, once you have that many doors will open up to you.
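For what it's worth, the "implement a QuickSort in 20 minutes" staple mentioned above really is just a few lines once you've drilled it; here's one illustrative whiteboard-style version (Lomuto partition), not the only acceptable answer:

```python
# Illustrative in-place quicksort with the Lomuto partition scheme,
# the kind of answer those interview questions are fishing for.
def quicksort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        # Partition around the last element as the pivot.
        pivot = arr[hi]
        i = lo
        for j in range(lo, hi):
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]  # place pivot in final position
        quicksort(arr, lo, i - 1)  # sort elements left of the pivot
        quicksort(arr, i + 1, hi)  # sort elements right of the pivot
    return arr

print(quicksort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

A month of Coursera is mostly about being able to reproduce (and explain) something like this under time pressure, plus its O(n log n) average / O(n^2) worst-case behavior.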


Just the first hurdle though. It took me a year of serious effort to land my first dev job. The second one came to me: I was contacted by a recruiter and asked to interview and ended up accepting that offer about a year after starting in my first position.

I dropped out of high school ten years ago and did nothing of note between then and teaching myself to code.

Just stick with it until you land the first one and you'll be fine.


I graduated with a degree in operations management. I've probably done over 100 personal projects to gain experience, with maybe 15 being worth talking about. I've read quite a few books - actually a lot, if you count the Packt ones. When I lived in Seattle I went to the meetups and found a couple of groups there I enjoyed hanging out with. I've had a really hard time getting interviews, except from a couple of small teams, and I mostly feel ignored. At this point I'm taking a computer systems class and discrete structures at a CC. The systems class is helping my debugging skills in ways I wouldn't have put myself through on my own, but the price tag for this knowledge is high, and it's beyond just money. I still learn an order of magnitude more on my own than directly from the coursework.

Do I think a degree is necessary? No. I do think you'll have a hard time getting a job without one so you might as well start your own company.

These last few years have made me salty.


Just to share my experience:

Like you, I also didn't have a CS degree (I did take foundational CS coursework in undergrad, and also had an MS in something akin to HCI). I was hired for my first software engineering job in SF by a couple not-very-technical founders--both of whom were rather taken with a game I had designed/developed--after completing a week-long contract job for them. No CS fundamentals interview process.

After that first job, it was very easy to get interviews, although I had to prep a bit for those CS questions. I'm now interviewing for my third job, and feel like I've taken on too many interviews, many at big name startups and big corps. Among 18ish cold applications, I've only been turned down without so much as a recruiter call twice, likely because I was missing some keyword on my resume or something.


Where are you located? What kind of positions are you looking for?

I'm sure things are (potentially) quite different now compared to 17 years ago, when I first landed a job building web applications. (My very first job was an internship - sort of - while pursuing a CS degree. My next job came after I had dropped out, left that job, and was delivering hoagies for a living.) So that was "luck" and being "in the CS education world" when I got started.

Still, if I have a point, it's that I started out in silly, minor jobs for little local companies that needed someone who could code more than they needed a highly polished computer scientist or engineer. From there I mostly "learned on the job" the rest of what I needed to advance (and I continually self-taught by working on my own projects).


We are such a weird profession. What other industry is like "Hey, you can do my job. It's easy, here are lots of extremely detailed courses, documentation and guides on how"


Actually, I'm always amazed at how many Real Estate Agents suggest that others should get into it because it is so lucrative. They seem to ignore the fact that they are welcoming more competition in a finite market.


In the US, there's a bit of a pyramid to real-estate commissions, and a lot of the recruitment is done by brokers. And even at the salesperson level, the commission structure at most companies pays better when an agent sells the listing of another agent in the same company, so the more agents, the better the odds of a better commission.

But coming from the architecture profession, I don't discount encouraging others as a form of self-validation of one's own career choices.


I was strangely missing that piece. I was going to write in my comment that they are similar to pyramid scheme members, but thought that was a stretch.

My step-father and brother are both brokers, so I know how they operate, but I somehow missed how agents get a piece of the action.


The agent to agent incentives are more akin to profit sharing derived from the increased broker revenue and reduced cost when one broker handles both sides of the transaction.

There are also potential informational advantages to having other agents more tightly coupled to an agent's social graph, e.g. it is easier to obtain information about seller or buyer motivation and finances through a back channel.


The idea of "open source" is also unique; I can't think of any other professions that have something analogous.


Closed source is what is unique.

I can tear apart any engine to understand how it works.

I can rip out the bathroom in my house and install a totally new one.

With closed source, I can't just rip out component X and replace it with component Y. I can't read the binary to understand how it works (Technically I could, but I mean that with modern languages, the assembly looks massively different from the original source).


You can tear it apart and understand how the mechanism works, but you don't have the million details about the manufacturing process: precise drawings with tolerances, material composition, etc.


I'd say that analyzing a finished mechanical device is similar to examining disassemblies and hex dumps of program files. You're missing information on the manufacturing process, allowable part tolerances, and so on, that the original manufacturers have access to.


Once you have the asm of a program, you have 100% knowledge of it if you put in the time to learn and follow the assembly. Sure, it's not as easy as looking at the source, but you have the capability of fully understanding the program.

With a mechanical device, you don't know the dimensions on press fits, you don't know what the aluminum was heat treated to, you don't know the silica to aluminum ratio and how that affects thermal expansion, you don't know what lubricant was used, you don't know what blend the fiberglass insulation is, the list goes on and on. It's not possible to obtain 100% knowledge without the drawings. At best, you make assumptions about material processes, what a reasonable press fit would be, occasionally you can spark test to make a guess at material composition, etc.

There isn't a "holy grail" (1) mechanical equivalent to a hexdump where if you grind at it long enough you achieve 100% knowledge.

(1) Obtaining a complete set of original drawings is, but unlike a hexdump, you can't get the drawings from the finished product.


> Once you have the asm of a program, you have 100% knowledge of it if you put in the time to learn and follow the assembly.

You have all of the "what" (in an inconvenient format that will need tremendous effort to be useful for most things), but you're mostly guessing on the "why". Tons of information is discarded, and can't be retrieved. Just guessed at. This puts a good bit of distance between what you have access to in the theoretical sense and how useful it is in the practical sense.
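The "what survives, what's lost" gap described above shows up even inside a high-level language; a small sketch using Python's own `dis` module (bytecode rather than machine assembly, so only an analogy):

```python
import dis

def add(a, b):
    return a + b  # one readable line of source...

# ...compiles to several lower-level instructions. Variable names
# survive in Python's case, but comments and intent do not, and the
# exact opcode names vary between interpreter versions.
instructions = [ins.opname for ins in dis.get_instructions(add)]
print(instructions)
```

With real machine code the loss is far greater: names, types, and structure are mostly gone, which is why "100% of the what" still leaves you guessing at the why.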


Disassembling an engine to learn how it works is like the literal disassembling of a closed source binary to learn how it works.


No, closed source is I think the more unique thing.

I can go to planning offices and see the plans of any building. Almost no-one has "secret painting methods", or "secret plumbing methods".


> Almost no-one has "secret painting methods", or "secret plumbing methods".

For plumbing, building codes dictate the range of solutions that are available for any given problem, and they're not secret (but buying a guide can be expensive.) However, painters, plumbers, and other trades definitely have tips and techniques that they've learned through experience and from mentors, and you won't find much of that being given away freely. You've got to learn it the way they did, through hard work and apprenticeship. And that's ok, because those skills require mastery to use properly, and that's true in software development too.


> However, painters, plumbers, and other trades definitely have tips and techniques that they've learned through experience and from mentors, and you won't find much of that being given away freely.

Exactly this. My father is a civil engineer who's now in his early 60s, and he recently got to work with colleagues much younger than him.

He told me that said colleagues didn't know some trade-related stuff that hadn't been covered in university courses (like how to reinforce earthen ditches when digging in mountainous terrain with probable heavy rains) but which was paramount in keeping the work going (there's not much you can do once a ditch collapses because of heavy rain). In this instance he instructed his younger colleagues on the correct procedure, but he also told me that he's got other professional secrets that he's not ready to share just yet.


> but he also told me that he's got other professional secrets that he's not ready to share just yet.

I'm curious enough to ask: why?


For job security, I think. We're lucky enough as programmers not to have to worry about that too much, but when you're a civil engineer in a quite poor part of the country and the continent (I'm talking Eastern Europe), you don't have many better options.


Speaking as an artist, this was untrue for us as well, pretty much until the internet democratized things. Prior to that, the assumption was that if you're good you should just be able to do it, or that it should be locked up in guru-like ateliers that parcel it out at high cost, like Scientology courses. Of course access to basic art instruction has always been available, but a lot of advanced techniques or knowledge in specific subdomains (e.g. animation) was pretty closely guarded.


Those professions can't be compared with software; you would have to compare against an industry where the product is intangible.

Do you see musicians making samples and giving them away to other musicians for free? Sure, you can find some online but it is definitely not the norm.

Does Hollywood create stock footage and release it for other production companies to use for free?

Do photographers allow others to use their photos for free?

OSS is very unique to the software world.


I reckon the increasing popularity of FOSS is paving the way for the examples you've listed to start becoming mainstream. Social media is already starting to kick that third one into gear, and I wouldn't be surprised if the growing remix culture soon evolves into licensing terms that expand fair use to "you can sample my song as part of your song as long as your song is substantially different from my song".


> Almost no-one has "secret painting methods", or "secret plumbing methods".

Magicians!

(an exception that proves your rule of course)


I've been in AEC since 1989. I can count on the fingers of one hand the number of times I've seen individual architects form a community based on unpaid contributions to some big firm's profit-driven project. That's bread and butter at Google.

The difference is that construction has long been a commodity industry and software has not become one yet.


It's free labor, and companies love free labor. They claim to value candidates who produce open source, but somehow those who toil night after night on their passion projects never seem to see any job offers materializing as a result. At best they get recruiter spam (for which many here argue they should be grateful) inviting them to submit themselves to a whiteboarding session carried out by an arrogant 20-something, often fresh out of college and eager to assert their dominance.


Cooking and music are two professions that come to mind.


I used to do card tricks for fun, and I made an interesting observation about them. I could show someone how to perform the sleight of hand. I could even show them the trick and explain how misdirection works. Despite that, they couldn't do the same thing (even if it was a fairly simple trick). Free info simply wasn't enough to copy the trick. They had to have the aptitude and put in the hours if they wanted to do it themselves.

As coders, we do something similar. We show the code and explain how it is put together. We even talk about the theory behind the application. But for them to learn, they have to have the aptitude and the practice.

Coders have the aptitude, and they seem to insist anyone can do it too. Companies go along because it's in their best interest if this is true. The reality is that people have varying degrees of ability for handling abstract logic. Only a small portion of the population is adept enough to handle abstract logic at scale and we need that skill in more fields than just programming.

Reality seems to bear this out as we have increased the amount of recruiting every year at every level of business, school, and government, but with diminishing returns.

I believe that we offer to teach because we like finding people who think like we do (or maybe wish to believe we think the same way as the rest of the world), because we like to believe we are making other people's lives better, and on some level, because it's safe in that the returns are small enough to not threaten our own wellbeing, but maybe I'm just too cynical...


It's due to our poor social skills. Flight attendants successfully lobbied the media and public to refer to them that way and not as stewardesses/stewards to improve the esteem of the profession, but we're so clueless that we don't object at all when journalists refer to us as coders, a term also used for medical clerical workers. Many here will even argue with you that it's not derogatory.


Reminds me of a funny Steve Yegge Blog Rant: http://steve-yegge.blogspot.com/2006/03/truth-about-intervie...


Well, that's because companies can't afford to be as picky right now. Once everyone "learns to code" and there is a glut, you'd better believe degrees will become more valuable.


It still won't necessarily be degrees that separate the high performers from the low performers.


In reality, no, but most hiring managers won't know how to tell the difference, so they'll just default to whether or not the person has a degree. Hell, I can't tell the difference between a good and an exceptional developer until I've worked with them for a month or two.


Yea, I think people who think they can tell how effective a developer is before working with them for a month or two are mostly deluding themselves.


Great point. I've always seen the computer industry as completely operating in its own dimension, unlike any other industry in history.


I think that makes our profession pretty amazing rather than weird. Although, I don't think it's easy.


No.

Some people are self-taught and self-motivated, and they are often more capable than someone whose primary exposure to development (or to technology beyond consumer use) was through a CS program. That is not to suggest that CS degrees are unnecessary, but once you understand the entirety of someone's experience and knowledge, you will often find their degree, or lack thereof, is entirely irrelevant.

Of course there are some employers who won't even acknowledge that someone's resume exists without a degree, which I have always found silly.


>Of course there are some employers who won't even acknowledge that someone's resume exists without a degree, which I have always found silly.

It really is stupid. I would hire a software engineer who dropped out of high school over a PhD if he's better. I've never understood why people care about degrees; the whole point of college is to learn, not to get a piece of paper. The only excuse I can think of is that it's a sorting mechanism for lazy businesses to filter applications.


Define succeed. Nice job, nice wife, etc... or FU money level of success?

I can only speak from my experience hiring and getting hired but all things being equal that piece of paper will likely get you hired over the other applicant(s). For inexperienced folks it also makes getting those critical first positions easier, which will allow you to quickly move up, or out, in the field.

I question the validity of their claim that over 60% of the job postings omitted a degree requirement of some sort (see my previous point). Hiring staff is a real pain when you're not one of the big boys, so casting a wider net is somewhat necessary.


>I can only speak from my experience hiring and getting hired but all things being equal that piece of paper will likely get you hired over the other applicant(s).

It may take you 10% more applications to get a single job, but that's not much if you live in an area with a lot of jobs. I have read about struggles from both sides of the fence (degreed/not), to the point where I can't take the common advice about degree advantages seriously any more.

We really need to find a way to assess skills and make a great interview process that many companies can adopt, and that will solve the problem because then the CompSci curricula will have to switch over if they want to keep incoming students and graduate hiring rates high. "Who cares about your degree? Take this test to see if you're actually worth employing at a given career level!"

10% better hiring speed just does not seem attractive considering you spend tens of thousands, plus the opportunity cost of working somewhere and contributing to your retirement (which for millennials is apparently very far off compared to previous generations). Student debt is only increasing. The number of graduates is only increasing. Degrees make you stand out significantly less unless you are able to attend a prestigious school - and at that point, the decision may not even have to do with the technical rigor involved. Hiring managers are playing the same risk-aversion game as everyone else; they just can sometimes afford to be picky.

The only time a degree will become super-important is if the middle and low level jobs suddenly start disappearing. I haven't heard any doom and gloom about that, but if it was actually happening, it'd be the biggest sleeping giant of a news story for our industry.

Even then, most of the hiring that gets done will shift to people with experience and/or a degree. It's just that there will be less hiring overall.

Plus, you can always move laterally to another position that still uses Excel as its backing store, probably automate your job away, and do 30 minutes of work per day - I'm seriously thinking about doing this, because it seems easier to reduce the amount you work than to increase your salary for the same 40h/week.


I responded to a recent 'Will quitting school hurt my career?' type thread. I banged my head against the wall of youthful perspectives about what experiencing a career will be like across several variations of 'you play the odds and take your lumps when you lose' until I ran out of steam. Later, having thought about it (by writing) I recognized a very distinct benefit that pretty much only a college degree will provide.

It comes up on 'Ask HN' from time to time that a person without a degree is looking for a way around a degree-based work visa requirement in order to immigrate for a software job. Now for a lot of developers in the US, this is probably less of a big deal, since they don't have to emigrate for a high-paying job at a sexy company. Probably not as big a deal within the EU either. But there are a lot of other places in the great big world where it can matter, and not having a degree does curtail access to jobs that a developer might want and opportunities that a developer might want to pursue for professional or other reasons.

Except in pretty rare circumstances where a person qualifies for a 'snowflake visa', a portfolio does not fill that gap. Likewise graduation from a coding bootcamp probably won't be an alternative visa qualification any time soon.

Despite their drawbacks, and for better or worse, credentials provide objective criteria for decision making, and in contexts with legal implications that is very useful - not just visas, but for complying with employment laws and any other CYA context.


Rhetorical question: Am I the only one who did get a degree to actually learn stuff?

I was self-taught, had programmed as a job for 4 years but wanted to make a leap in my skills. Dedicating 5 years to just focus on learning stuff, practicing, all in a structured way (courses) was really, in hindsight, pretty amazing in terms of accelerating my learning. I did get an MSc in CS, but the main motivation was to learn stuff, not to get a degree on paper.

Disclaimer: EU-citizen. Studied in Sweden. No tuition fee.


I'm in the same boat. My self-taught experience inspired me to pursue a CS degree. The degree did put me in a serendipitous position that led to a number of great industry opportunities but personal growth was my main reason for starting it.

FWIW, I'm studying in Canada and paying my way through school with internships.


No, but it helps. Autodidacts are rare (possibly not on this site, but in general), and few people who haven't been to college even know what they don't know. Unfortunately a large proportion of college grads are equally clueless. College is a place where you can get an education, but nobody's going to insist, and neither degree nor GPA is any indication one way or another. What's worse, people who learned next-to-nothing in college don't even know that they don't know anything, and they can interview quite well. As the numerous HN threads about hiring and interviews attest, it's murderously difficult to separate the wheat from the chaff.


[College drop-out here]

TLDR: A degree is not strictly necessary, but it gives you an edge.

In my experience, a degree is not necessary, but demonstrating skills is harder than it sounds; a lot of interviewers at non-tech companies are not technical and rely on typical CS material for interviews (algorithms and the like). As someone self-taught, I rarely paid attention to this sort of stuff (I started as a web developer) until I had a few interviews. Salary takes a hit as well; even though I've performed better than colleagues, my salary was ALWAYS lower.


Your salary is lower because it started lower, and there's an unwritten rule in the industry to never hand out disproportionate raises. I was lucky enough to have a couple of jobs that broke the rule, and as a result I believe my salary is in line with my degreed contemporaries.

The degree gives you an edge in two important ways. First it qualifies you at those places who absolutely demand a degree, whether you believe in their reasoning or not; having a smaller pool of opportunities will always be a handicap. Second, if a hiring decision ever comes down between you and a person with a degree, the degree will become the tie-breaker.

On the flip side, the places that won't consider you without a degree might be the places you wouldn't want to work anyway.


That's one more issue with the industry. Everyone says they don't want to train because the devs will just leave, but they refuse to give them the raise to keep them there. If a company won't give a disproportionate raise to a person who deserves it, then some other company will hire that person.

My first gig paid only around 30K/yr, but I became aware that moving jobs was a good way to get a raise. When you already have a job (and genuinely have the ability), it's pretty easy to negotiate aggressively. My next job was 70K and within two years, I was in the top 10% of dev salaries.

If I weren't willing to jump companies, it would have taken years more because (as you point out) companies don't like giving raises.


No. I've found the college degree can be important in getting your first job, and its importance decreases over time. When you get to about 10+ years of experience, it has minimal value.

So don't let the lack of degree stop you if you are pushing ahead with making a career of it.


I jumped into programming as soon as I could, around 13; I'm in my 30s now and have been reading every night since. I'm a self-taught sysadmin/DevOps/code monkey and, on the side, a security enthusiast. I did a year of college in my late 20s and couldn't stand the pace/starting over.

Lately I've been going through a lot of resumes and interviews. Out of 50 resumes and interviews, not one was female, so I'll just throw that statistic out there. Most are out of college but get grabbed by government contractor work eventually if they're good; the rest either need to be open to learning or think they know everything and get fired a lot. We finally snagged a junior from college we like and are training him, but you'd be surprised at the mistakes he can make or get stuck on. He just wasn't the type to learn on his own outside of his college classes, and I think that's the problem.

There's also a lot of big data stuff these college grads aren't getting exposed to, and then they somehow jump into a job working with a database for 3 years and still don't know how to create an index, let alone scale something, despite having it on their resume.
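For the curious, creating an index really is a one-liner in most databases; a minimal SQLite sketch (table and index names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

# Without an index, lookups by email scan the whole table;
# with one, they become a B-tree search.
conn.execute("CREATE INDEX idx_users_email ON users(email)")

# EXPLAIN QUERY PLAN confirms the index is actually used for the lookup.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("a@b.c",)
).fetchall()
print(plan)  # the plan row mentions idx_users_email
```

Knowing when an index helps (and when it just slows down writes) is the part that takes experience; the syntax is the easy bit.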


It's been my experience that a non-degree developer can crank out CRUD apps quickly, whereas a developer with a CS or engineering degree will struggle with the idea of a CRUD app, since it's never taught in school from a modern perspective. That being said, the theory learned in school can help a developer see the bigger picture and provide them with tools for creating solid code with few performance issues, while those who are self-taught or went through a boot camp will probably have to YouTube and Google their way out of trivial or ambiguous problems, such as null-pointer exceptions and data type mismatches.


In its latter part, the post stresses 'keep learning'. That's the only thing people who have this doubt in their head need to know.

I have met great developers who never went to college, and some developers with Master's degrees who can't even ship a small segment of a product.


There are doors that close when you do not have a college degree. That's the plain and simple way of things.

That being said, a diploma from whatever institution with whatever major, honors, or other kind of augmentation is simply a very expensive piece of paper. Getting a good grade in a course, or doing well in a series of courses brings about precious little practical knowledge in and of itself. And that piece of paper says very little about things that matter a good deal - like character, work ethic, and the ability to work well with others.

Relevant experience trumps everything. Barring that, near experience and a desire to learn come next. I'm big on people having a solid foundation, but my experience tells me that even the best schools turn out people with shaky foundational knowledge. People without degrees are subject to the same analysis: some are good, some are bad.


> There are doors that close when you do not have a college degree

The question then is: are those doors worth opening? Most job ads I see with "requires degree in CS" are for large boring companies and government agencies.


Large, boring company job used to coincide with stability. That seems to be far less true now than 25 years ago. The gig economy of today is far different than what last generation grew up with.


I'm at the stage in life where that stability is appealing. But large boring companies are synonymous with awful code bases, decades of accrued technical debt, and teams of onshored developers (partly because of requirements like CS degrees). When I'm job hunting, I rank "not wanting to shoot myself in the head every day" higher than financial stability.


> teams of onshored developers

You prefer working with off-shore developers? That's honestly the first time I've ever heard that.


I suppose it also depends on what type of development you want to do. A lot of fields won't require it, such as front-end, apps, etc. In other fields, such as AI/ML/advanced graphics, a lot of the people working there usually have a college degree and more.


My degree is as unrelated to math, science, or computers as is possible, but I've been working remotely for many years as a developer, despite not having looked into it until I'd been out of college for many years. So college really has nothing to do with it at all, in my experience.

College might get you contacts and job prospects, but I've interacted with a lot of developers who do have college degrees and still missed a lot of important knowledge about things like memory management, because a huge number of schools only teach GC languages now. So I still have an edge over those with degrees, since I made it a point to learn C++ really, really well.


> But when it comes to getting a job as a developer, there are simply more important things than a degree. Who is more hirable: someone with 3 years of work experience and no degree, or someone with a degree, but with an internship experience?

Depends on the job and the type of work applicants have done. If I want a senior engineer working on a whole new networking stack at Facebook, I don't expect a fresh graduate will fit the role. There are exceptions, but I bet that number is so small I can neglect it.

If I were looking for a junior-level software engineer, I'd still expect some projects he/she can proudly show off. It can be a class project or an internship project. Having an internship on the resume is a major plus.

Some people have 5-10 years of experience but aren't a great fit, perhaps because they don't know the technology we use. I want to find people who can pick up the stack quickly, but I prefer to find someone who already has the experience so we can get the most out of the new hire right away.

I only have a few years of practical experience in the area I focus on, but I am already considered senior. In the end, your ability to grow as an individual is very important. Having 20 years under the belt is a great advantage, but that doesn't mean a young engineer with 3-5 years of experience isn't as strong.

When I consider hiring someone, I look at

* the resume (a long resume is a negative, because I really don't care what you did 10 years ago; you can tell me that during the interview, not during the resume screening. Please also use consistent formatting).

* internship / project

* experience and familiarity with our technology

* ability to interact with me like we are building a tent together

I encourage people to get a college degree. As a student, get involved in a research project, join or start an ACM club, or do anything to make yourself marketable, like giving talks or volunteering at a tech meetup where you can use your technical skills. A college degree is not a necessity, but it does provide a unique experience.


I've talked to developers who have said that it can be a strike against you if you have a college degree, because it implies that you wasted time that could have been better spent learning real life coding.

Have you ever seen a situation where having a degree counted against you when applying for a job?


No degree here. Left school at 15. Started coding in my 20's. Been at it for 20+ years and loved every minute of it. Currently CTO of a well funded and successful ed-tech company.


I'll share my experience here for anyone who's considering the tradeoffs.

It's harder without a degree, especially getting the first job. And your jobs might pay less in the beginning (mine did not, but that might not apply to everyone.) After that it gets easier, but even now, in my mid-thirties, I get turned down because of it - but it doesn't bother me much. There are a lot of developer jobs out there and I still end up getting to pick among several. I was making six figures by my second job, but I have no idea how common that is.

A degree costs a lot of money and 4 years of your time. You should give that a good hard look, because if you're dedicated and you love the work, you can progress much faster on your own and it won't cost you anything - in fact you will likely earn a lot of money during that time. If you prefer to veg out after work every day with your choice of drug or TV, forget self-teaching: you're going to need all the help you can get, so go for the degree. If you're going to put in the time and effort, you'll quickly surpass your peers and get the best jobs - with or without a degree.

But the biggest thing I did for my career was move to Panama where there's no tax on foreign income. I worked five years in a remote job and paid off my mortgage.


Need? no.

Do you need a good working knowledge of what a degree covers? Ideally yes, in practice only parts.

Do people with degrees come out "knowing" what they covered in their degree? Often only partially; bits are almost completely forgotten because the significance of concepts wasn't really understood until much later... even if you got A's.

But, ideally, if you go into a degree and highly engage with the content, it gives you a structured approach to many ideas and is a great basis for many kinds of programming applications.

Can you get the same by being self-taught? Sure. Though I do see that people won't teach themselves things that don't seem relevant, perhaps some of the mathematical foundations, etc. They often don't directly need those, but the gap limits the brain's options for attacking various problems (to be fair, this is the stuff some people with degrees tend to forget). Then, of course, some people teach themselves many things, beyond what a degree would give them.


Anecdotal, but given my case, a resounding NO.

32 years and counting as a developer, architect, and dev manager. Not one hour of college.

I'm mostly self-taught, but did go to a magnet computer science high school in Milwaukee.


"Need" for what?

To get any job? No

To get some specific jobs? Yes

To learn how to program? No

To learn computer science? Technically, no. But may be an easier path for most than self-study.


I sometimes find it hard to discuss some subjects with co-workers who don't have a degree directly related to computer science.

For instance, it's very common for electrical engineers to work on software development, and they tend to have a good grasp of how things work at the lower levels, but when you start moving up the abstraction ladder, the discussion becomes harder. Compilers, types, and generally anything more related to math or "formalization" is not within their standard knowledge.

It could be argued the same is true of some with CS degrees as well - category theory, what? - but that is usually a sign of a lacking education. A good electrical engineer does not need to know about context-free grammars; the same cannot be said of a good computer scientist.
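To make that concrete: the textbook context-free language is balanced parentheses, generated by the grammar S -> "(" S ")" S | epsilon, which no regular expression (finite automaton) can recognize. A minimal recursive-descent recognizer, sketched in Python purely for illustration:

```python
# Recognizer for the context-free grammar S -> "(" S ")" S | epsilon,
# i.e. the language of balanced parentheses. Regular expressions
# (finite automata) provably cannot recognize this language.

def balanced(s):
    def parse_S(i):
        # Production S -> "(" S ")" S
        if i < len(s) and s[i] == "(":
            j = parse_S(i + 1)
            if j is not None and j < len(s) and s[j] == ")":
                return parse_S(j + 1)
            return None
        # Production S -> epsilon (matches nothing)
        return i

    return parse_S(0) == len(s)

print(balanced("(())()"))  # True
print(balanced("(()"))     # False
```

This grammar happens to be LL(1), so the parse is deterministic with no backtracking; richer grammars are where the real compiler theory starts.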

This is usually a problem when discussing solutions: "How are we going to solve problem X?" tends to be approached by a computer scientist from the POV of algorithms and the concepts involved (for instance, CAP for distributed systems, and how to deal with conflict resolution, etc) and then search for tools that match the criteria. OTOH, most of the time people that don't have the understanding of this area would want to choose a tool that best fits the problem by its description, without thinking about the underlying algorithm and the implications or trade-offs.

I also had some issues when discussing languages and paradigms - it's hard to evaluate the available options when you can't see the concepts and understand what they provide you with. Most engineers don't have the slightest idea of what logic programming is, though it is essential to a good number of fields in science and industry.

Either way, I don't think anyone should be forced to have a degree to be able to work with software. The only reservation I have is when people say it doesn't make any difference, as "programming is easy, anyone can do it and understand anything without formal education". Sure, surgery is also easy... the hard part is keeping the patient alive.


Can you elaborate (more) on what would be required to "fill the gap" to an 80% sufficient extent? Perhaps a list of links / topics / keywords that should be checked out and read about? Perhaps it's possible to get up to speed with ~6 months of self-study...


Usually anything that deals with theory of computation, formal languages, formal methods, compilers, graph theory, functional programming, logic programming.

I assume numerical analysis, probability and even assembly are subjects most people from engineering would already have some familiarity with.

More specific subjects include relational algebra (for relational databases) and AI concepts like neural nets.
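Relational algebra in particular is small enough to sketch directly. A toy version of the core operators in Python (the relation names and sample data here are made up for illustration, not from any real schema):

```python
# Toy relational-algebra operators over "relations" represented
# as lists of dicts (one dict per row).

def select(rel, pred):
    # sigma: keep only the rows matching a predicate
    return [row for row in rel if pred(row)]

def project(rel, attrs):
    # pi: keep only the named columns
    return [{a: row[a] for a in attrs} for row in rel]

def natural_join(r, s):
    # join: pair up rows that agree on all shared column names
    out = []
    for x in r:
        for y in s:
            shared = set(x) & set(y)
            if all(x[a] == y[a] for a in shared):
                out.append({**x, **y})
    return out

emp  = [{"name": "ada", "dept": 1}, {"name": "bob", "dept": 2}]
dept = [{"dept": 1, "dname": "eng"}, {"dept": 2, "dname": "ops"}]

# "names of employees in engineering", composed from the operators:
eng_names = project(
    select(natural_join(emp, dept), lambda r: r["dname"] == "eng"),
    ["name"],
)
print(eng_names)  # [{'name': 'ada'}]
```

A SQL engine is doing essentially this composition, just with a query planner choosing efficient orderings instead of the naive nested loops above.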


If a hiring decision comes down to two equally qualified candidates, and one has a degree, that one gets the job. CYA, plain and simple.

However, if both candidates have several years of relevant experience, it is more of a toss-up.


How often is it the case that you have "two equally qualified candidates"?


Will you learn more in college than you otherwise would on your own? Probably, especially if you dedicate the same amount of effort to each.

Will having a college degree hurt your chances for getting a job? Not likely.

I'm a developer, and I have a post-grad EE degree in addition to undergrad CS. It's surprising to me how often the EE material and math is useful in my job.

I wouldn't tell anyone to go get a Masters in EE in order to be a good dev, but if you have a realistic option to get a degree without saddling yourself with crippling debt, get the degree.

It's likely to help and unlikely to hurt.


I didn't. Although 1) I do have 3-4 semesters' worth of college heavily focused on CS. 2) Back when I started, 30 years ago, it was much less of a thing (to have a CS degree). 3) I've been programming since I can remember. I grew up reading Byte (before it sucked) and Dr. Dobb's, and I taught myself C in 6th or 7th grade from my dad's copy of K&R. I've spent inordinate amounts of time, way more than the time spent getting a degree (or several), learning, studying, and practicing the art of software development.

A college degree is the easy way.


A degree is a certification, not knowledge.

For me it is useful to understand how computer architecture, operating systems, complexity theory, discrete math, and calculus work. It doesn't give tangible knowledge for the everyday work, but it transformed my view, so I make different decisions than I would make without those skills.

But there is a vast amount of quality material accessible for free on these topics, so no, you don't need a college degree (whatever that means; a bachelor's?), but you do need the skills to be a good software engineer / developer.


well, i would say for the average person (and nothing bad about being average), a college degree will make it a lot easier to get a job, and getting a job is a very good start in my opinion ..

even if you define success as being the owner of a business, starting in a job will help you learn a lot about doing business


Maybe I'll be downvoted for speaking against the grain, but one of the benefits of someone who has the degree is that you know they have some minimal grasp of the basics (yes, there are exceptions), which makes adapting to changes in the tech stack or industry practices less painful.

One of the obstacles with those hired out of the coding schools is that employers don't know, even if you're "current", that you aren't just equipped for their ephemeral needs.


I'm a student at a top 5 CS school. I feel that the majority of the value of my degree does not come from lectures, homework, or exams; the access to opportunities on campus is where much of the value comes from. We have many bright, talented, and diverse people on this campus - not only my fellow undergrads, but also the professors, grad students, researchers, employees of companies with a campus presence, etc.

There are huge opportunities for collaboration on programming projects, both in class and out of class. There's a great startup culture, with a lot of campus resources. There's a slew of companies that employ students for part-time co-ops on campus. Almost every professor talks about research they would be interested in getting students involved in. We have very active IEEE and ACM student orgs. We put on a great hackathon every year.

I could go on, but to put it short, you would almost have to try in order to avoid opportunities to develop yourself as a computer scientist/engineer. If you're teaching yourself, you have to be proactive in working with projects online, motivating yourself to work on personal projects, and seeking out opportunities, whereas on campus the opportunities basically come to you.


I've seen degrees actually set people up for failure, or not prepare them enough. My brother went to school for 'interactive design/dev' and 2 years later hasn't found work. They taught PHP/JavaScript (vanilla, no frameworks); he learned CSS/HTML but no Bootstrap. For basic web dev, I think if you're a go-getter and self-taught, you're more likely to succeed than a college grad with the basic college curriculum.


Continuing on to college back in 2000 was a big deal. A college education was the underpinning of the American dream; the promise was that it leads to a stable and steady job, which leads to stable payments on a house mortgage, which leads to a stable marriage, which leads to a family.

Then you die.

The promise was bolstered by the returning GIs of WW2 who took advantage of the GI Bill to go back to school. Every generation thereafter followed that pattern.

Earn a degree, get a job, buy a house. Repeat.

But in 1996 a very curious thing happened.

The folks that commercialized the works of academics and scientists started to take off and get attention.

Thereafter the work in the industry started to outpace the output from academia.

You want to build a dot-com business? School wasn't going to teach you that. You have to go learn it yourself. You would be stupid to stay in school.

This held so true that by the early 2000s, dropping out to pursue your tech dreams was an actual proposition, despite the risk of NOT having a college degree.

But we're flipped now.

In 2017, schooling has caught up. What's current in industry is discussed in the classrooms - and not only caught up, but condensed into mere months. You want to build a website? It makes more sense to go to a technical school or college, participate in the college-level entrepreneur safe space, or leverage networking.

You would be stupid to not have a college degree.

But now the high cost of college is more of a privilege than anything else.

Good luck.


I've been in the software industry for 20 years. To my knowledge, there has never been a time when there wasn't a significant number of professional programmers without degrees. It's a field that attracts people who love to do it - aka "amateurs". Many people love it enough to delve into CS of their own accord.

I didn't go to college. I've never been asked about my education during any hiring process. I've worked on several phenomenally successful products, and I'm quite sure I've been considered a high performer at every job I've had.

I've worked with a dozen people like this, often at big companies with highly technical developer cultures. I know a few no-college developers whose knowledge of our mutual specialization is so deep that I have a hard time following them.

As long as demand for talented devs is like it has been so far, the industry as a whole can't afford to exclude any.


I got into web development about 5 years ago as a self-taught developer with no degree. Since I was laid off at the beginning of 2016, I decided to take the entire year (all of 2016) off work for personal reasons. Now that I have started interviewing again in the last few weeks, I've found my lack of degree to be a serious sticking point for interviewers. I think the industry is being flooded with "bootcamp" grads and other self-taught developers now, and it's becoming much harder to get your foot in the door without a BS.

If you're wondering whether you need a computer science degree to build your own iPhone apps or launch a SaaS, the answer is no. But if you want to work for large corporations or SV startups as a Software Engineer, then it's almost a hard requirement.


I would not hire a software developer if he/she does not have a CS degree from an accredited university.

I couldn't care less about what people learn at colleges or how someone without a degree could know more than them. It's not the knowledge I'm after; it's the person.

Someone with a college degree has shown they can make a long term commitment, take on a large and complicated task and see it through. I know that for at least four years they have been exposed to individuals who are at the top of the industry and have learned about CS with peers who share the same interest and are now their close connections.

All of these might be true of an individual without a CS degree, but the odds of that are so minuscule that I believe it's not worth taking the chance.

Just my opinion


I appreciate your optimism about the value of your average CS degree, though I think it's completely unfounded. There are hundreds of thousands of CS degrees handed out from over a hundred accredited universities every year.

> I know that for at least four years they have been exposed to individuals who are at the top of the industry

The chances of all of those students being exposed to "individuals who are at the top of the industry" is pretty much nil (I'd even argue that folks who fit that description are not to be found in most, if not all, universities).

> peers who share the same interest and are now their close connections.

Those close connections with their peer group are going to provide, IMO, pretty much zero value in the short term. That's a recipe for an echo chamber where programming memes are passed around and mistaken for wisdom.

In the long term, as they gain experience and learn about the real world, that's when those peer groups are going to be valuable. But by that point, if you're still looking for that CS degree, you're looking for the least valuable part of their experience.


> I appreciate your optimism about the value of your average CS degree,

It's not perfect, but when you're hiring you're taking a chance and have to hedge your odds for success.

> The chances of all of those students being exposed to "individuals who are at the top of the industry" is pretty much nil

It's higher than for someone who is plowing through online courses and YouTube videos.

> Those close connections with their peer groups is going to provide, IMO, pretty much zero value in the short term.

I would have to disagree with your opinion. Being exposed to other individuals who share the same values and aspirations has plenty of short-term and long-term value, both for the student and for the individual that hires her.

In the short term, when you hire a new grad who is a great fit, odds are his friends from college would also be a great fit.

In the long term, these individuals and their network "gain experience and learn about the real world" concurrently, so their network expands much faster than that of someone picking up the trade on their own.


Wouldn't 4 years of work be pretty equivalent, especially with good references? I think college is incredibly valuable, but it's pretty ridiculous to say that the odds of a good engineer without a degree existing are "minuscule". Have you really never worked with someone without a degree?


> that the odds of a good engineer without a degree existing are "minuscule"

I didn't say that. I said the odds of finding an engineer without a college degree who can prove to me that they can commit to and achieve a long, complicated personal goal, has been exposed to individuals who are experts in the field, and has built a network of colleagues who share the same personal traits and passion are minuscule in comparison to someone who has a college degree.

This doesn't mean someone with a college degree gets an easy pass. It just means I can make certain assumptions that I cannot make for someone without a degree, and given the importance of hiring, it's a chance I don't think I would take.


I understand, I just don't get why a good job doesn't qualify. Work projects are much longer and more complicated than school projects, you're working with accomplished professionals in the field, and at a good office you're surrounded by people with the same traits and passion.

I know anecdotes don't mean much, but I'm sure you went to school with people who didn't give a shit, and I'm sure you've worked with great engineers who didn't go to college or have a non-CS degree. Obviously it's up to you who you hire, but I'd encourage you to give a couple non-degree candidates a shot next time. Just ask them questions about the stuff you've mentioned here. Tell me about some projects you've worked on, what are you passionate about, how do you keep up with current technology, etc. It's not that hard to make sure someone meets the same baseline of "you're committed enough to not drop out of college".


What would you think about a degree in "Software development"* from an accredited university? I'm sure it won't hold the same weight as CS but would you at least put it in the same league?

*It's basically CS minus the math.


It's quite ironic: I have a degree in (Theoretical) Physics, but it's waaaay easier to find work on pure programming/web projects than on something with numbers involved. And the latter is basically what I've been drilled in during my studies.


Yes. I also have a major in a math-y subject, with an extensive minor in CS, but all the jobs seem to be in building shiny webapps with the latest JS framework to sell people stuff or services. The much-talked-about Data Science seems to be mostly glorified statistics for advertising. Chances of getting into the R&D department of a BigCo's moonshot project (think self-driving cars or something) are one in a million.

As far as I know, academia offers no pay and no stable career prospects for a student like me, because I'm not the top of my class, just slightly above average. But at least there is the promise of a chance of doing something interesting that might widen humanity's perspective and maybe even help us - with real technological progress.


Exactly... Well, at least there are startups (at least in Berlin) which offer good working conditions, 40h or less, and it's even possible to work on your own things apart from the job...


There's also a huge difference between "how much does a college degree help an individual programmer" and "how much does a policy of requiring CS degrees improve programmers as a class". I'd argue that there's more of the first than the second - a college degree is largely a positional good, in that having a more impressive educational background gets you hired only insofar as other folks have less.

I don't think the program is worth the massive investment being put into it. There are obviously superior alternative structures - say, proctored algorithms tests to filter for IQ/ability, which would qualify you for a year-long apprenticeship with a working programmer.


'a Boolean search for “degree OR bachelor OR BS OR BA OR B.S. OR B.A.” yielded 1,760 matches.'

Thus missing jobs that say "Masters" or "PhD". In certain fields (such as compilers), jobs that require an advanced degree are quite common.


Having a college degree can be a requirement for work visas. I moved to SF from Europe, and this would have been a real problem without a master's degree. I'm not sure about the requirements for European visas, but that's probably true there too.


Instead of going to college, I did an apprenticeship and have now been working for three and a half years in the tech industry.

I have also been doing a college degree part-time online; it will take six years in total. Since I am working full time I don't need it - I just want it to maximise my potential.

I am three years into the degree already, so only have three left. I am 21, and my friends are all still at college, so feel like I already have the upper hand.

Just need to get some good content on my GitHub profile!


Success is not just getting a job. I have not finished a bachelor's, and I had no problem getting jobs requiring a degree. But the future is unknown. Am I just a good cheap worker? How about career progress? How about when I'm 45, with less energy/interest, and stuck in some low, uninteresting position? Although many companies relax on hiring, they might not on promotion. I am definitely considering finishing my bachelor's in the long term.


Probably. As people flood into the field, my guess is those with formal education will be picked purely because they check somewhat arbitrary boxes.


No. Definitively. That's not the same as not needing an education; some US college is optimal, imho. US high school is specifically about socialization now (thanks to the teachers' unions). Nowhere was this more evident than in pursuit of a teaching credential, where the filters for the specific mindset allowed to be a teacher are stark and narrow.


The key here is "developer". A web developer can learn to write code for a pretty website. This only requires very basic knowledge and essentially no math is involved.

In contrast, an engineer or computer scientist capable of working as an applied mathematician definitely requires an advanced degree. I laugh when webdevs call themselves engineers.


>In contrast, an engineer or computer scientist capable of working as an applied mathematician definitely requires an advanced degree. I laugh when webdevs call themselves engineers.

Why would you think this? Do you think everyone who has a degree in CS understands algorithms, compilers, type theory, PL design, or operating systems at a deep level?

Why would it be impossible for a self-taught dev to learn these things? I've been in the industry for over twelve years and have found no correlation between degrees and a deep understanding of CS concepts. Some with degrees have it, some without... shrug


It's just simple gatekeeping. People are finally having to admit you can be a developer without a CS degree so they're moving the goalposts.

"Fine, you're a webdev, but you'll never be AN ENGINEER."


> I laugh when webdevs call themselves engineers.

The really funny thing is asking a computer scientist to engineer a web page.


Maybe. I know excellent (software) developers without formal degrees, and I know many incompetent developers with them.


I did it my way, but I wouldn't wish that path on anyone. Stay in school kids!


Do health professionals need college degrees?

It depends: being a first aid officer is not the same as being a dentist, a physiotherapist, or a surgeon. In some cases you need a higher degree than in others.

The same with software development, IT, etc.


Yes, they do. Having a degree opens more doors; there is no denying that. A degree doesn't offer enough proof of your skill, but that doesn't mean it is pointless.


I do not agree. It's nice to have options, but one does not need, and practically speaking cannot take advantage of, infinite options.

If maximizing the number of doors opened were the best choice, then a logical extension of "Do Developers Need College Degrees?" is "Do Developers Need A CS Degree From <'best' CS school(s)>?" Clearly they don't. If we were given infinite time, energy, and other currently finite resources, I might feel differently.


But the truth is that a bachelor's degree is not a very difficult option; for 40% of the US population, it is very PRACTICAL.

Having a degree definitely helps me get a lot more interview opportunities than my friends who come out of a coding bootcamp.


If I am hearing you correctly, you are saying that a CS college grad (or maybe a STEM grad with programming) is more likely to get the opportunity to interview than someone with only bootcamp coding experience. I agree with that. The former has been coding for at least a few years, the latter a few months. There's no substitute for the right kind of foundation built from experience.

That being said, when hiring for a junior role, it's hard to spot the "right stuff" on a resume. If you figure out how to spot that, let me know :). Or keep it a secret and sell me that as a service! There's a reason that currently most recruiters can't charge a commission for placing junior candidates. It's hard enough for you to spot the junior dev you want to hire in a sea of applicants you've never spoken to. A recruiter who isn't you will have an even harder time figuring out what you want.


No, they don't. They do, however, need to invest a lot of time into learning, and must be willing to for the remainder of their career. I don't have a degree, and I have been at this for 15 years, making above-middle-class wages for the past 10. I know many other developers in similar situations. The common factor is self-motivation and the willingness to always be learning.


for developers, i think in some ways it hurts you if you had college experience. and dropping out is considered an accolade in a bizarre way. sad but true.


Elon Musk started a space and a car company. I'd say nah you don't need the degree.


Elon has a degree, and was a PhD candidate.


The discussion here shows the three main aspects of this question: as a prerequisite for getting hired, as a mechanism for learning skills, and as a limit on what one can do.

Is it a pre-requisite? Generally no. But the word 'developer' is very broad. Putting up wordpress sites? Definitely not. Developing real time systems in rockets? Definitely helps. Mostly though there are a lot of developers out there looking for the next big thing and having a degree differentiates you from people without a degree. It says you can take a multi-year project and see it through to the end. Shipping a fairly complex product has the same credentials, there is a lot of scut work you have to do when you ship a product just like there is a lot of crap you have to do to finish a degree, the big signal here is that you can do all the work to get the project done and won't stop when you get push back.

The second aspect is skills learning, here again a shipped project will help you get through a lot of the challenges for that particular project. A CS curriculum puts you through a bunch of material that is known to be fundamental. So people who have shipped a wordpress site or a Heroku project have an understanding a some of the things that have to be done when shipping such a project but no idea about the things that have to be done to build a new interpreter (for example). It is faster to develop expertise on a single topic by shipping product related to that topic, it is faster to develop tools to let you handle a broad range of topics by learning from a curated curriculum that is designed to be as wide as possible.

And then there is the limit question. In my experience people are mostly self-limiting, which is to say they tell themselves "I couldn't figure that out, especially not on my own." A college program typically has some history about how people went from not knowing something to it being part of the curriculum, and if you're lucky that rubs off on you and you develop a sense of how to get from not knowing to knowing. (That is the essence of science: getting to the right answer when there is no textbook to tell you what the right answer is.) As a result I've found that people who are curious and stubborn and refuse to believe they can't learn something do well either way, but a college degree can sometimes indicate training in how to figure out the unknown. Whereas people who are self-taught at one thing may become stuck when that thing is replaced by a new thing.

I often rail against "milestone" thinking as in once I have this milestone then I can make the next milestone. Especially when it comes to learning. Humans are a bunch of experiences and learning wrapped up in a personality. Their ability to learn new things is unparalleled but only if they have the right tools in the toolbox.

If you work to develop those tools, you will always be able to learn new things and stay relevant in a changing world. If you learn only a few things and don't change, you will become irrelevant as the world changes. Does anyone 'need' a degree to learn something? No.


No.


This is not the time to ask that question.

The time to ask that question is when you're 38 years old, married with two kids and your wife home raising the youngest one. Then you're laid off. And it's 2000 or 2001. Or 2008 or 2009.

Companies need people now, so it's in their interest to tell you to leave school and work. You're on the hook for your future, not them.


Do any engineers need degrees? Does anybody? Before you go too deep down that philosophical rabbit hole just consider one thing.

Is it a job that requires a lot of skill?

Any job I've seen that is highly skilled, and not in the "get your hands dirty" kind of way, usually has a degree behind it.

I've seen many kinds of engineers without degrees. It's not as though you couldn't figure out any kind of engineering on your own. The two main problems are committing enough time to learn it on your own and getting a company to take a chance on you.

You can do it without a degree, but it will be a lot harder to learn enough and to get your foot in the door. If someone does take a risk on you, you'll probably get paid like crap for a few years too.

Is it worth it to do that, or to just spend 4 years on a degree? Most of the time the answer is a degree.

Computer science isn't any different from any other branch of engineering, even though developers like to think it's special.


If you are an engineer, you need to take the PE exam, and in most states you cannot take it without a BS degree. That isn't really comparable to computer science, where such requirements don't exist.


No.

A person can go from zero knowledge to landing a full-time, non-entry-level software engineering position in 3 months. Many of my friends have done it, and I am in the process of doing it right now. I just started learning in December, and I am in the final stages of an interview for a Software Engineer job at Big Name Tech Company that I fully expect to receive an offer for.

Coding is the easiest and highest paying profession in the world today. Everyone should do it.


Define "non-entry software engineering position"? I really don't think good code is easy to produce, so it is definitely not the easiest profession in my mind. There is a distinction to be made between becoming a good software engineer and getting a job as a software engineer. The first takes more than 3 months.



