What is life like for PhDs in computer science who go into industry? (vivekhaldar.com)
122 points by gandalfgeek on Aug 12, 2012 | 67 comments


I can sum up this article:

A PhD is roughly the same as a Masters for getting a job, except that you get paid a bit more if it's a big company with an HR department. You haven't really developed any skills that are useful, and now you are going to be paired up with some 25 year old who has 3 years more experience than you do at 30. Well, you might do well, or maybe not; some PhDs are great and some suck, and the same is true for people without degrees. You have to prove you are valuable. This is all going to make you very depressed, since you are facing 50 hour weeks doing work you don't really care about instead of 9 hour weeks doing only what interests you in Academia, but you have to eat, so you might as well get on with it.

Well, that's my summary :) (I'm a bit biased being a dropout)


> You haven't really developed any skills that are useful

That's practically all I did. Doing a PhD gave me the time and the excuse to pick up both technical and soft skills that prove useful to me and my colleagues on a regular basis. I'm a dramatically better coder for it.

> This is all going to make you very depressed since you are facing 50 hour weeks doing work you don't really care about

I work 37 hour weeks doing stuff that I find really interesting. I made a conscious decision near the start of my PhD that I had no desire to work in academia but that doing a PhD was fantastically interesting, so I'd keep doing it.

> instead of 9 hour weeks doing only what interests you in Academia

That might have been the cause of your dropout.


Very interesting perspective

(Disclaimer: I dropped out, with a little help from my advisor)

I guess it all depends on a lot of things. If you enjoy what you are doing, and have a nice environment, go for it.

But dropping out was something I'm very glad I did. And I learned a lot outside of Academia. (and the money is good as well)


I think it probably helps (for people like me) that the UK PhD system is rather lighter weight than that in NA - I spent about 3 years and 4 months on mine, which I considered a very acceptable tradeoff for the learning (and ability to work on a project I was really interested in). My lifetime earnings will probably be a little lower, but the flipside is I'm now equipped to take on much more interesting jobs than I was beforehand. If the timescale had been closer to 6 years I would be more inclined to think twice about it.

edit: I don't mean to imply that a PhD is for everyone - I have friends who dropped out who made absolutely the right decision: they didn't want to become academics and had lost interest in their subject. If anyone is in that position they absolutely should drop out. The uninformed supposition that skills gained during a PhD are useless is frustrating to me, though - I really honed the ability (and motivation) to investigate a topic area and to present my work clearly, and I learned a whole load of new technologies along the way. These have all been invaluable to me - far more so than a little extra time in industry would have been. I was certainly at a disadvantage at 0 years in industry compared to someone else with 3 and a bit. A few years later, though, and I feel at an advantage.

I guess I'm also quite defensive about PhDs because mine holds a special place for me - it was during that time that I discovered how much I love to learn. As a result I'm far more motivated to improve my knowledge than I ever was before.


Your "summary" is bullshit and has completely distorted the article. You're not just biased, you're also completely misinformed.

> You haven't really developed any skills that are useful, and now you are going to be paired up with some 25 year old that has 3 years of experience more than you do at 30.

More likely, you've spent 3-6 years coding, learning to communicate, and learning to work well in a team.

> This is all going to make you very depressed since you are facing 50 hour weeks doing work you don't really care about, instead of 9 hour weeks doing only what interests you in Academia.

Most recently hired academics on a tenure track work 80 hour weeks. You've got to teach courses, mentor research students, write grant proposals, and produce research. If you don't, you're out the door.

Imagine I were to suggest that because you're a dropout, all you're good for is saying "do you want fries with that?" That's about the level of what you've done with your comment.


Tenure track profs work 80 hours per week? Really? They work from 8AM to 8PM, 7 days per week (taking only a 15 minute break each day to eat)???

This doesn't seem likely, since they would be completely burned out after only a few months, and you would have diminishing returns after the first week. If you are really working 80 hour weeks then you are being incredibly foolish and wasting a lot of time being completely unproductive due to exhaustion.

But, of course, you don't really mean 80 hours, you were just exaggerating a bit to make your point. Fair enough, let's say profs are putting in (I'm assuming a proper research university where tenure track professors teach 3-4 classes at most) 12 hours lecturing each week, for 30 weeks per year. That's (at most) 360 hours of teaching, per year.

Let's say you spend 2 hours per week preparing for lectures (you should know this stuff already, but let's say you just need time to get organized). That's 60 hours, yearly.

Now, let's say you spend 4 hours every day working on research. Everybody probably would like to spend more, but some days you have to see friends or your wife/husband or kids or get your car fixed or whatever. That's 4 * 365, which is probably a huge overestimate for most people, but that's 1460 hours per year.

Now, as for writing grant proposals, or mentoring students, or whatever else, nobody believes you are spending any serious time (more than 5 hours a week) on that. I don't know how long you really spend on it, but I'm going to assume that you aren't writing more than 1 grant proposal per month, so I'm going to just say that I gave very optimistic and too-high estimates for everything else, so this will get rolled in as overhead.

So you're doing (drumroll) 360 + 60 + 1460 = 1880 hours of work per year, all but 420 of which is just thinking about the problems that interest you most.

If you work in industry, you are spending 220 * 9 = 1980 hours per year working on stuff that is not what most interests you, and almost none of it is blue-sky thinking of any kind.

Want fries with that?

Edit: With my really crazy overestimates on how tenure track profs spend time, averaged over a regular work schedule it is 42.72 hours per week, of which 9.5 is anything at all besides pure 'research'. Odd that, it's within .5 hours per week of my original guess, and I swear I didn't cook the numbers to come to that.
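
Spelled out as a quick back-of-the-envelope script (same numbers as above; the per-week figures only come out around 42.7 and 9.5 if you assume roughly a 44-week working year, which I'm taking to be the implied "regular work schedule"):

    # Rough annual hours, per the estimates above
    lecturing = 12 * 30        # 12 h/week of lectures for 30 weeks -> 360
    prep      = 2 * 30         # 2 h/week of lecture prep           -> 60
    research  = 4 * 365        # 4 h/day of research, every day     -> 1460

    total        = lecturing + prep + research   # 1880 h/year
    non_research = lecturing + prep              # 420 h/year

    weeks = 44                                   # assumed working weeks/year
    print(round(total / weeks, 1))               # ~42.7 h/week
    print(round(non_research / weeks, 1))        # ~9.5 h/week
    print(220 * 9)                               # industry comparison: 1980 h/year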


Wow, your hours on preparing lectures, grant proposal writing, dealing with students, etc. are all way underestimated. A solid grant proposal that is likely to get funded (often in a super competitive field with maybe an 8% acceptance rate) can take well over 2 months of solid preparation. Of course we don't actually get 2 months of solid preparation time, what with all the other stuff we have to do. My workload as an early-career academic is roughly 10% on a funded project, 20% "research", i.e. grant/paper writing, 5% on a couple of committees I am on, and the rest teaching, with roughly half of that being supervision of PhD/Masters students.

For my masters students that equates to about 6 hours of meetings and reviewing their stuff over 3 months of their project. Most of my colleagues and I burn through that in the first month. So what do we do for the other 2 months? Well it's not like we can shut up shop and say "welp, sorry, but you've used all your allotted time" because that'd be terrible. So we continue to give whatever amount of time we feel is necessary (within reason).

Anyway it's not as simple as you've broken it down to be. Things ebb and flow over the year too, students need more mentoring at certain times of the year than others. Grant proposals, conferences, special issues journals all have deadlines and calls. My week is extremely varied and I'm still working on juggling all the different hats and priorities I have. I have a full load right now over summer even without teaching; when teaching starts up again I'll have to do all the stuff I'm doing now (bar masters student supervision) while doing roughly 25% of my allotted time teaching undergraduates.

Anyway, just another perspective.


Your numbers are in line with what I see from my advisor.

The original also failed to break down paper writing. For a top-tier conference, you're going to spend several weeks' worth of full days working on a paper, and you're probably going to do it twice per paper because, as they say, even Simon Peyton Jones has papers rejected occasionally. This time is on top of the work you already did to get your system working! (well, unless the paper is the result, as in some areas of theory)

You really, really have to want to live it. I enjoy it, and as a PhD student I don't mind spending all my waking hours that aren't devoted to my wife or a little bit of exercise doing research. But just to be competitive, that's sort of the baseline. I'm not sure, though, whether it's truly a demand of the job or just the default because academic research attracts people who work like that (I also worked like that when I was a developer at MSFT).


The thing is you can kinda coast if you want to. But in the UK at least you have to churn out at least 4 high quality papers every 6 years to be included in the "Research Excellence Framework", the assessment of the research quality of the university. Although you don't have to be included in the REF, it was used as a weeding point for redundancies at one of my previous universities (i.e. those who had been in the previous REF were exempt from being included in the redundancy pool; everyone else had to reapply for their jobs). So it does tend to hang over you, even if it's not a direct reason you might get fired.

My university is an ex-polytechnic, so it focuses a lot on teaching as well. You can quite easily get by just teaching - we have a few "just teachers", though they tend to be the older employees. New full-time employees are expected to do a bit of everything, which is actually a lot better, IMO. If teachers are doing research and contributing to their field, they're reading current material and can incorporate that into their teaching, and so on. Lots of good reasons for it to happen.

Interdisciplinarians are also rewarded - one of the reasons I got my job in a tough environment (I do computer ethics) was that I had a strong background in computer science, versus most comp ethicists, who come from philosophy. So it helps to be able to cover a few different things rather than just one very niche specialty subject.

Ultimately though I love my job. It's highly flexible, stimulating, varied work, I have great benefits, especially holidays, and the university is very supportive of family life.


I don't know you, and I am not an expert in how you spend your time. I know it's not 80 hours of work per week, because that is completely absurd. If you have a better hourly breakdown it doesn't seem like you are sharing it. Your percentages are very different than what I would have guessed, and honestly I think your priorities are off.

If you nail down a single big result you could ride that for years. Maybe that isn't realistic for everyone, but if I didn't think I was going to be a major player and drive the state of the art, I think I would find something else to do. I really don't understand the concept of the 'hard working' professor, teaching is easy (show up, explain stuff), advising students should not take much time (pick smarter students, point them in the right direction, expect them to take it from there), conferences are a complete waste of time, trying to publish a bunch of little crap results is just generation of worthless volume and nobody is fooled by it. All this might sneak a person in at a 3rd rung school, but for what? What are all the grants for in CS, presumably you already have more than enough computers? Is that what people compete for when doing something important in the field is not realistic?

I'm sorry, but if you aren't concentrating on research, meaning real research not writing grants or typing up results that nobody cares about, then you are either at a really crappy school or you have the wrong priorities. Expanding the state of the art, or moving the needle on a difficult problem is going to get you attention, respect among your peers, and publication in real journals. The situation you describe seems more like a backup plan.


Wow, sorry, you really have no idea. It's not 80 hrs per week, sure, but it's certainly more than you're giving us credit for.

My percentages are the baseline for what an early-career academic should expect in their first couple of years with a permanent position (you'd call it "tenure track" in the US, I believe). If I get a big grant, my teaching time would get bought out, so more of that time would go to research. It is possible to buy out of all teaching, though you would still be expected to supervise PhD students, so you'd still have some percentage of teaching on your load.

Teaching isn't "easy", there's a lot more to it than "show up, explain stuff". Lecturing is easy. Preparing lectures, exams, dealing with students, marking, administration isn't "easy". If I could just lecture I'd be more than happy. It's the rest of it that's a complete drag of time.

Also, I had to laugh at your "real research" paragraph. You obviously have no idea how it works.


I'm friends with several lecturers, and while they don't work 80 hours a week, it's fair to say they work substantially longer hours than I do. That's fine, they generally enjoy their jobs and get to do interesting things (as well as deal with a lot more bullshit than I do). You're utterly crazy if you think they spend only 10 hours a week on non-research activities.

As an outsider, your estimates are finger-in-the-air at best. How can you possibly know you've covered all their activities?


I think you're seriously underestimating the amount of work we need to do...


you can't really account for academic time on a hours-per-week basis. there's too much variety. as others have noted, it's a modal way of working. one week it's this, next month it's this, then maybe there's a paper deadline so you have to crunch down and get on a particular project. maybe it didn't fly so you have to try something else. it also depends where your university is.

being a professor is like running a company but with fewer support staff.

running your lab takes time most weeks, it's almost a constant: whole-lab meetings are easily 2-3 hours per week, meeting each individual member of your lab is another hour or so each. then there's the admin for your lab (some of which you delegate). then there's organizing new grants: this involves collaborating with other people which means... yup, more meetings, often over skype or similar. then working out ideas, then writing, re-writing, and re-writing the draft and doing a lot of preliminary work. many grants need a lot of preliminary work to be shown before they get funding.

teaching is a lot more than just lecturing, you forgot: time spent talking to students one-on-one, running labs, designing interesting homeworks and exercises, preparing exams, marking homework and exams (marking easily consumes a week or two per course here). also, "knowing stuff already" doesn't mean you can teach it. it's enough to teach yourself, not others. developing a new course or just keeping an old course up to date and interesting takes an awful lot of time. what content will you include? what exercises? what text books? what's examinable? what's useful? what's the structure? what can students handle? all of these apply to courses old and new, but are faster for older courses, especially ones that do not change much (i.e., not active research areas).

then there's your research. what you do here depends upon the stage of your career. at the very least, you need to read. a lot. in CS-disciplines, this is easily a weekly activity. then you can spend some time thinking. then trying some things out. maybe. then going to conferences, talking to more people. presenting your own work: writing papers, arguing about their acceptance, visiting labs, workshops and conferences to give talks, preparing talks, etc. believe it or not, you have to advertise and advocate your research, even once it's published and done. how else will people notice? so advertising is another thing you must spend your time on. now multiply all this work by the number of projects you have: potentially one per lab member.

interestingly you miss out one of the most important parts of being an academic--reviewing other people's work for conferences and journals. this takes a lot of time. firstly because reviewing a paper thoroughly takes considerable time, secondly because the discussion afterwards takes considerable time, and thirdly because often you will do this many, many times a year. as you become more senior, you take on a more senior role here: managing reviewers, journal and proceedings, conferences and workshops, etc.

i'm sure this isn't the same everywhere, but this is a small glimpse at academia in the places i'm aware of. it's busy busy busy.


As a fellow academic I totally agree - people outside really don't know what it takes to run a successful course - it's not just wandering in and rambling for an hour about stuff you like, it's highly structured, requires a lot of preparation and thought, not to mention having to deal with exams, marking, organisation, and students' pastoral care. Sometimes I think our own universities don't know how long marking takes - my current workload says I should be able to mark a Masters dissertation in an hour! Hahahahaha yeah right.


I think justin_vanw wanted to show that the article was not really informative. I agree with that, I haven't learnt much by reading it.


Oh, if that's the case I overreacted. Sorry justin_vanw!


It turns out I did not overreact. Apology rescinded, you do not know what you're talking about.


My understanding is that academia is very competitive, so I don't understand how you are only going to work 9 hours a week. Further, you'll probably have classes to teach which surely take up more than 9 hours a week. Also, since tenure is largely tied to the quality of your output, you can't always work on what interests you--you have to work on what you can publish.


Miaow, bitchy. You are right, however, about it being a bit depressing competing with 25 year olds that have 3 years experience. My first post-doc job started as borderline data-entry (but then in three months I became chief developer on the entire system :)).


The only exception I take with this is the part about coding in an interview. My complaints there are not specific to having a PhD though.

I understand the need for practical tests like these, but whiteboard coding is not a good way to do it. People don't typically program on whiteboards; they do it alone with a compiler, reference manual, and the ability to test. Put them in a room with a terminal and ask them to write a straightforward program in 30 minutes. If you have to practice for the interview, what exactly are you determining in the interview? Their ability to program? Or their ability to practice interview tricks?

Immediately discounting a candidate because they are annoyed at your request to whiteboard code also seems rash. Asking someone experienced to write the "print a string in reverse" algorithm will naturally insult them. Especially so if the candidate has a CS PhD and spent time grading students on more complicated programs, or if their CV includes links to public repositories containing thousands of lines of code they wrote. On the other hand, asking a candidate to sketch a genuinely complicated algorithm on the board is a reasonable question.
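
To be concrete about just how trivial the question under discussion is, here is a minimal sketch in Python (purely illustrative - this is roughly the entire expected answer):

    def reverse_string(s: str) -> str:
        """Return the characters of s in reverse order."""
        return s[::-1]

    # or, spelled out the way it might go on a whiteboard:
    def reverse_string_manual(s: str) -> str:
        chars = list(s)
        i, j = 0, len(chars) - 1
        while i < j:
            chars[i], chars[j] = chars[j], chars[i]   # swap the ends, move inward
            i += 1
            j -= 1
        return "".join(chars)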


When I started my career, I also used to find trivial interview coding questions irritating.

As I developed and started applying for more senior positions, I started to find them downright patronising.

As I developed further and started to sit on the other side of the table, I realised that they aren't there to catch out the people who forgot a closing brace, they are there to catch out the people who don't know what a closing brace is even for.

The first time you do the interviewing from the other side, it is truly amazing how many people with extremely impressive-looking CVs fall into that latter category. Having a CS PhD and experience assessing undergraduates on more complicated programs is no guarantee of any sort of programming competence by professional standards. Neither is having been a Senior Software Engineer leading a team of developers at a past employer, for that matter.


This is a side effect of the broken hiring process. Typically candidates are found by acronym matching. The folks who are below average (don't know what a closing brace is) will keep tweaking their CV/resume until they get called since it is pretty easy to guess what acronyms are being looked for.

When they don't succeed in getting hired they will keep repeating the process which means they will go through many interviews. A person on the other end of the bell curve is likely to interview a handful of times in their entire career so you are unlikely to ever see them.


It's definitely true that poor candidates are on the market much longer than excellent candidates so they are over represented in interviews. I don't think it follows that the hiring process is broken though. I think it's just evidence that hiring is hard. The very best candidates are essentially never on the "open market."


I meant that it is broken at a specific organization if you end up doing an in-person interview and they walk up to the whiteboard and do not know what a closing brace is. They should never have made it that far through the organisation's interview process, unless that specific process was broken.


> Asking someone experienced to write the "print a string in reverse" algorithm will naturally insult them.

Why? It's a trivial question that any programmer should be able to answer easily. An interview is a test. An interviewee should be happy if the test starts so simply. It's a chance to warm up and get in the groove.

When you took tests in school were you insulted if the first question on the page was too simple? No, you just answered it and were happy that you earned your first points so easily.


The reason it is annoying for people who are experienced is that generally they are looking for jobs where they get to work on difficult and interesting problems. Being asked such a simple question might indicate that you are interviewing for a position you won't want.

I'd personally be very tempted to answer the whiteboard question and immediately explain that I'm not interested in a job that consists only of writing trivial implementations and ask if they can give examples of real problems they face which are non-trivial.

I have actually had one interviewee ask me such a question, although not in response to a simple test or question of my own. This impressed me, although I get the impression that most interviewers would not appreciate it, much as they do not appreciate candidates who find their fizz-buzz test annoying.

An interview is not a one way test. It is a two way test where the employer can also fail.


Your answer implies you have a high ego, as you consider the questions beneath you and are only interested in satisfying your own intellect/ego instead of working in a team that gets shit done... read: shipping products.


Precisely. It's a bozo-filter. _You_ know you're not a bozo but the interviewer doesn't (yet).

I ask questions like this in interviews because experience shows that a significant percentage of supposedly experienced developers have trouble with them.

If you ace the simple questions, you'll get something harder next.


Some people will take 30 seconds on a question a tiny bit harder than "print a string in reverse"... and some people take 30 minutes. Those people have similar-looking resumes. It's a useful signal.

So if you are offended by that, then it's really your problem. Just answer it in the 30 seconds it takes and the interviewer will immediately go on to other (harder) things.

Your complaints kind of conflict, because it doesn't matter if you don't have a compiler -- you should be able to write code to reverse a string on the board. People realize that it's not an ideal coding situation.


> Asking someone experienced to write the "print a string in reverse" algorithm will naturally insult them.

Being insulted is the wrong reaction. Either you are a good coder, you will write it down on the board in one minute and we'll move on, or you are not a good coder, you have a crappy attitude and you are not getting an offer.


How about a third possibility: they will think that your company has a crappy attitude and hiring policy and move on?


>Their ability to program? Or their ability to practice interview tricks?

Often, neither. It can be the ability to think clearly and communicate that thinking. I personally avoid whiteboard coding questions when I interview people, but the good ones I've been given involve a good back-and-forth, basically working on the problem together.

I usually ask more open-ended questions, like "how would you go about solving this large, potentially ambiguous problem". Both types of questions give the interviewer the ability to gauge personal skills and mental clarity. Whiteboard questions provide some ability to test technical ability in exchange for some ability to test higher-level knowledge. They both work in the same kind of way, though.


I agree in general, but easy fizzbuzz-type questions are far from the main problem with the interviewing process. There are just too many people who can barely write code, including CS PhDs.


This is true. For CS PhDs you should always look to see if they've done any large implementation work or if they've focused more on math. The former will generally have large amounts of source code available for inspection. The latter may or may not be good at programming.


I felt similarly about whiteboard questions the first time I went through interviews, but after having to interview other people myself (and a lot of them), I quickly realized that I simply hadn't properly understood the purpose of these exercises. They aren't designed to see if you can code (you are usually given the benefit of the doubt on that one), rather they are designed to see how you think.

Usually a whiteboard question (with me) will go something like this: I will ask you to write a pretty simple algorithm, then tell you there are no tricks, because people are surprised that it is so simple. I then change the problem scope, which makes you change the algorithm into something slightly harder. We then spend time debugging it because no one ever gets it "right" the first time. That's fine, a big part of the exercise is to debug it together. And from this I can glean something I never would by just looking at your GitHub commits: how you think about a problem. Do you make a table for yourself and step through the function mentally? Do you immediately think about edge cases? Etc. The way I can figure this out is by the interviewee talking (and if he isn't talking I ask him some questions to get him thinking out loud).

If you just give them a terminal and leave them to program it, it doesn't really tell you anything about these aspects. Unfortunately I can't jump into your head to see what's going on, and if I leave you to start blasting away at a keyboard I probably won't be able to analyze your wild moving around of code or fast typing, not to mention it being hard to see on a screen over your shoulder. The whiteboard slows down their writing (since they're not typing) and forces them to think vs. blindly changing values and hitting run -- again, it accommodates having a conversation about the code. I already know that you can look this algorithm up online, or that it's included in the standard libraries for this language anyway, and I'm fairly certain that given a debugger and sufficient test cases you could whip it up in ten minutes.

The other reason something simple like "print a string in reverse" is chosen is precisely because it is understood that it is harder to program on a whiteboard without tools and while being watched and nervous. Not to mention that if you asked an actually difficult question, it would be really hard for even the interviewer to know what's going on (whether looking at the code you wrote afterward or as you wrote it on the whiteboard). I cannot understand being "offended" by an easy question. If it's easy, then better for you, right? You were already planning on spending an hour in the room with me, how could this possibly be "wasting your time"? Just do it. Also, it is well known that this is part of many CS interviews, so why would anyone be surprised by it? If someone gets offended by a whiteboard question, it's usually a fantastic indicator that they are a bad candidate, in my opinion.

Additionally, these questions can sometimes also serve as fantastic early indicators that a person is way out of their league. As I said earlier, I don't think I've ever met someone who got everything right immediately (and I agree, if they did, the question would be useless). Lots of people think they did poorly because they choked and missed some edge case that had to be pointed out to them -- not at all; I've been incredibly impressed by people who took multiple revisions to hone their algorithm. However, when someone really doesn't know what's going on it's painfully obvious: they usually won't even understand what the algorithm is supposed to do, or have never even heard of it. I've had people with Masters degrees take the entire interview, not finish the simple question, and not even be really sure what was asked.
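
To give a purely hypothetical flavor of the kind of scope change I mean (an illustration, not my actual question): start with reversing a string, then ask for the word order to be reversed while each word stays intact, and talk through the edge cases together:

    def reverse_words(sentence: str) -> str:
        """Reverse the order of words, keeping each word's characters intact."""
        # Edge cases worth discussing at the board: empty input, runs of spaces,
        # leading/trailing whitespace (this version quietly collapses them all).
        return " ".join(reversed(sentence.split()))

    # reverse_words("print a string in reverse") -> "reverse in string a print"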


"I cannot understand being "offended" by an easy question."

My thought process: Reverse a string? Really? Is this how they hired their existing programmers?


I think it's a fine question.

My thought process is roughly "Okay. Which language? Is it null-terminated, random addressable, or mutable? Do I get compiler errors or unit tests? Should I buffer manually? Do I have a signature? Can I assume single-width characters? Should I ask?"

caveat: The question needs to be asked right. Instead of "here's a task I need you to perform", it should be "can you walk me through how you would solve a task like this?".


No -- this is how they weeded out prima donnas.


I did an interview with a C++ exam. They left me in a room for an hour, without the internet. Damn, that was embarrassing. I need to log into Facebook to tell you my name and birthday. Who the hell codes without Google?


Actually, the article doesn't mention anything about coding on a whiteboard.


In my experience, the degree has not been a factor at all when it comes to value in industry. I have seen people that didn't have a degree produce more maintainable code in a more productive manner. I also have seen coworkers with bachelors outshine others with masters. I have climbed the corporate ladder at my company and now with my power to influence hires, I rarely take the level of degree in consideration.

Strangely, I have seen the attitude problem with new grads who possess a masters or doctorate. I even had one tell my manager that he had a masters and that meant he shouldn't ever have to do support work, and that he was smarter than my manager. He was a fresh 23 year old grad and eventually left for a larger company because he believed they would appreciate his masters. To rub salt in the wound, we found that almost every bit of his code either broke our current projects or flat out wasn't done correctly, because he knew better than everybody, so there was no reason for him to ask a senior dev or even a BA any questions about what he was doing.

All that to say: no matter the shop, no matter the degree, business is about money, and we, as coders, produce value, which is why our salaries are high. A coder who is seen as valuable will always do better in his career, and most of the time his coworkers will see his value and respect him for that.


I have a Bachelors in Engineering (BEng) in Computing from Imperial College, not a PhD, but I work in a group at Google where a significant percentage of my peers have one. I only know this because from time to time people mention the kind of research they did, either because it is relevant to what we are doing or because it comes up naturally in discussion.

I don't see any difference between what I do and what others in my group do; we are all in it together, but took different paths to get there. Our group is more research-focused (though we are Software Engineers) than many at Google, and this has given me a great opportunity to learn from those with an (even) more academic background.

TL;DR - I don't see any differences between PhDs and otherwise at Google, even in 'researchy' teams.


At my previous employer, we hired a guy who was very close to earning his PhD in CS from a midwestern university known for its prestigious engineering program. I was excited because I thought that, at the very least, he would be able to learn quickly and start tackling problems early on.

I was not disappointed. He was very assertive about not having his hand held wrt figuring out problems, and was proactive in learning new things. When I left to take another position, I felt comfortable that I was leaving the codebase in good hands.


> there is no job in the modern tech industry that involves working alone.

Positions with aspects of management or sales typically involve actually working with people, but this can be hard to achieve in many other roles. The high tech field can be great for people who want to work in isolation, and fairly lonely for people who do not. I have never worked in a place that effectively implemented things like mentoring programs, for PhDs or anyone else. I am on the East Coast; I suspect that this trend is less prevalent out West. You may form bonds with people through a work environment, but the majority of your daily (and weekly, and monthly) effort is a solitary endeavor. I have heard this complaint from people working on everything from PhD-level industrial research to coding to website design. Getting in with a company that recognizes and values real teamwork, assuming you are able to communicate effectively with others, will make nearly any job a much more pleasant experience.


I think this is really interesting.

What "working alone" means is complex. For example, at Hashable, I was effectively working alone, despite there being a wider notion of collaboration. Moreover, when other people came into contact with parts of the code I was actively working on, instead of trying to work through those components together (not necessarily in a pairing way, but just a more general collaboration), the standard was to make your change and not care about the consequences.

In a code base without tests this was a nightmare.

If / when I am next looking for a job, really understanding how a company collaborates is high up on my agenda of questions.


One very good point this article brings to light is the emphasis academia places on the tenure-track job as the end-all be-all of getting a Ph.D. I can't even begin to count how many times I've heard how getting a professorship is the only worthy job, and how there's no point to a Ph.D. if you don't get such a job.

It's nearly to the point of brainwashing (a sweeping generalization, of course, which likely varies greatly from department to department). As a soon-to-be Ph.D. graduate, it's refreshing to know that industry is not only a viable option, but doesn't have the same [superhuman][1] expectations.

[1]: http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Pap...


I'm 3 years into a PhD at a "second-tier" department (i.e. not MIT or Stanford, but still a pretty good place), and I've never felt any pressure at all to go into academia over industry. So there's one more data point for you.

Professors can make a huge "real-world contribution" via their students that go into industry. So they should be proud of those students and feel that the effort of educating them was highly worthwhile.


It's good to know there are still realistic perspectives in academia. To be fair, the "anti-industry" attitude is probably adviser rather than department dependent, as I have heard a few first-hand data points regarding advisers that align much more with your experience than mine.

Given the 1/10 statistic, sooner or later professors will have to satisfy themselves with students' "real-world" contributions, whether they like it or not. I'm just worried by how often I see real skill sets (e.g. proper software engineering) being devalued among graduate students because they're not directly relevant to a hypothetical future job as a professor. I feel lucky that I've been able to make extra time to write "real" code from my dissertation work, but it's rather unfortunate that this isn't always the norm.


A professor from a local university was giving a computer science related talk; he was quite proud of how many of his PhD students (graduated and ungraduated) had been recruited by Google.


Good points; as a PhD in industry, this all aligns pretty well with my experience as well.

The only comment I have is regarding

>> Do people usually work alone, or with others?

> Again, the answer to this question does not depend on what degree you have. The simple truth is there is no job in the modern tech industry that involves working alone.

Agreed that this probably does not depend on what degree you have. But there is a lot of variability in terms of how many people you work with. I worked almost alone for a while, and in large teams at other times.

I suppose there is no job where you work 100% alone, with no contact at all with anyone, ever, but I doubt that was the original question.


I have a PhD and I have worked for several industry research labs. So when the OP mentions "he went 'oh well, if you must make me code…' That pretty much made me go 'no hire'" - he clearly is either being dishonest or taking shortcuts. PhDs want to have a job that is interesting; if it were to piss out code all day, I would have stopped at my Master's degree. Pissing out code, as much as many would like to think otherwise, is not exactly challenging if you have a PhD (I'm not saying that writing highly optimized, readable code isn't, but getting something that works is not that hard), so if the whole interview process is centered around how well you can write something in language A/B/C/D, yes, I understand the interviewee.

One funny thing: "Most places will actually pair you up with a mentor who is not your boss" - this has to be a joke. I worked for HP Labs and other big labs and never ever had a mentor. Usually, if you have an issue, you are pretty much welcome to discuss it with your colleagues and have them review your proposed solution, but boy, you are on your own when it comes to implementing it.

Again, the OP has no clue what he is talking about, and it's sad because his research interests are security related: "You are much more likely to land an interesting job at the coasts. West more than East." I can easily show that if you have a PhD in security or something related (formal methods), you have a better chance of landing a job on the east coast: DARPA, IARPA, MITRE, DoD - these agencies fund 80% of the security research in the country! It is easier if you are a citizen, indeed, but I'm a foreigner on an H1B and I still work on a DARPA-funded project... One more joke: "Most industrial positions will not care about your publications" - seriously, this guy is ridiculous, or just unable to think outside his Google circle. Most labs are in fact more than interested in your publications, because to them publications mean potential patent applications, more contacts at conferences, and, in the end, a better reputation and an easier time building a consortium to respond to a call for proposals or a BAA (http://www.arl.army.mil/www/default.cfm?page=8). But coming from someone who does not have many publications, I understand. For reference, here is his publication list: http://www.informatik.uni-trier.de/~ley/db/indices/a-tree/h/...

Anyway, I'll stop commenting on the rest of his post. It's not all bad, but it's based on his personal experience, and he generalizes from that limited experience.


I've seen approximately zero correlation between degree status and ability to write code. The best programmer I've ever worked with has a PhD... but so do at least two of the terrible ones.

I also submit that if programming is "not exactly challenging" then you are working on the wrong products and/or at the wrong level of abstraction.

There are legitimate reasons to prefer research, or management, or marketing, or sales over development, but lack of challenge is not one of them.


The incentive system in academia is to produce people who cite their mentors in formal research. That's it. Because universities are non-profits, they go to the indirect incentives. This trumps any altruistic ideals. In their defense, most Phds in computer science are fully funded, so it's not transactional like an undergrad degree.

I've worked with several PhDs with backgrounds in Stats and Computer Science. I found all of them to have "Good Background" - they knew a lot about their fields. Some were great at working with others, but some not so much. It wasn't a magical degree. Many regretted not getting the 5 years of work experience instead. Relative to the All But Dissertation crowd, they were a little better at Getting Things Done, versus just talking about things. (Very small sample size, so don't read too much into it.)


I don't think your analysis of the incentives is accurate at all. If I were to oversimplify similarly, I'd say instead that modern incentives in CS academia, for non-tenured faculty at top research universities, heavily favor money, wherever it comes from (in large part because with budget cuts, professors have no choice). Grants are good, industrial partnerships are good, basically anything that brings in money is good. Citations are secondary, and used mainly as a stepping stone to bring in money. From that perspective, students going to a big company are good, especially a big company like Microsoft, IBM, or Intel that funds research. Students going into academia to write papers that cite the advisor are also good, but not as good as students who go somewhere that has money.


The incentive system in academia is to produce people who cite their mentors in formal research.

Really? My advisor has tenure, and I think he couldn't care less how much his students later go on to cite him. Why should he?

He will get lots of citations anyhow (some from former students) because he does a lot of good work, but that's kind of an orthogonal issue.


Academia has the same incentive system as other fields, i.e., a combination of money/fame/power, and all three come from doing great research. Great research will get you the citations regardless of the number of former students. Also, the number of citations is a necessary but not sufficient indicator of great research.


Do you work 9 to 5, or do you take your work home with you? This, again, is completely independent of what degree you have. How you structure your priorities and your work to get it done within sane hours, and while maintaining some sort of “work/life balance” (I hate that term, but that is a whole other story) is entirely up to you.

This is indeed independent of degree but highly dependent on the company you are working for. There are developers working 10-12 hours a day who cannot structure their work because they are just expected to stay in the office for said time period.


The major distinction I've found from having a PhD is that you come with built-in "social proof". If you're going for a job, you're very likely to get an interview. If you're talking to investors, you're very likely to get past the first triage. After that it's up to you of course, but that's pretty useful.


The submitted article contains a key fact, "The problem with this picture is that there are 10 PhD graduates for every tenure-track position. And, while I don’t have figures, the industrial labs don’t hire at a much faster rate either. And that left regular industry jobs as the only viable option for the vast majority of PhDs," which actually applies to a lot of academic major subjects. For the most part, sooner or later a Ph.D. holder may have to consider working for an organization in the for-profit private sector rather than for academia, simply because the private sector is where the majority of the jobs are.

I see that the top few comments in this thread as I read the submitted article are mostly about coding tests as a hiring procedure. If the coding test is a reasonably accurate simulation of actual work on the job, it is a very good idea for a company to use a coding test in hiring. From participants in earlier discussions here on Hacker News I have learned about many useful references on the subject of company hiring procedures, which I have gathered here in a FAQ file. The review article by Frank L. Schmidt and John E. Hunter, "The Validity and Utility of Selection Models in Personnel Psychology: Practical and Theoretical Implications of 85 Years of Research Findings," Psychological Bulletin, Vol. 124, No. 2, 262-274

http://mavweb.mnsu.edu/howard/Schmidt%20and%20Hunter%201998%...

sums up, current to 1998, a meta-analysis of much of the HUGE peer-reviewed professional literature on the industrial and organizational psychology devoted to business hiring procedures. There are many kinds of hiring criteria, such as in-person interviews, telephone interviews, resume reviews for job experience, checks for academic credentials, personality tests, and so on. There is much published study research on how job applicants perform after they are hired in a wide variety of occupations.

http://www.siop.org/workplace/employment%20testing/testtypes...

EXECUTIVE SUMMARY: If you are hiring for any kind of job in the United States, prefer a work-sample test as your hiring procedure. If you are hiring in most other parts of the world, use a work-sample test in combination with a general mental ability test.

The overall summary of the industrial psychology research in reliable secondary sources is that two kinds of job screening procedures work reasonably well. One is a general mental ability (GMA) test (an IQ-like test, such as the Wonderlic personnel screening test). Another is a work-sample test, where the applicant does an actual task or group of tasks like what the applicant will do on the job if hired. (But the calculated validity of each of the two best kinds of procedures, standing alone is only 0.54 for work sample tests and 0.51 for general mental ability tests.) Each of these kinds of tests has about the same validity in screening applicants for jobs, with the general mental ability test better predicting success for applicants who will be trained into a new job. Neither is perfect (both miss some good performers on the job, and select some bad performers on the job), but both are better than any other single-factor hiring procedure that has been tested in rigorous research, across a wide variety of occupations. So if you are hiring for your company, it's a good idea to think about how to build a work-sample test into all of your hiring processes.

Because of a Supreme Court decision in the United States (the decision does not apply in other countries, which have different statutes about employment), it is legally risky to give job applicants general mental ability tests such as a straight-up IQ test (as was commonplace in my parents' generation) as a routine part of hiring procedures. The Griggs v. Duke Power, 401 U.S. 424 (1971) case

http://scholar.google.com/scholar_case?case=8655598674229196...

interpreted a federal statute about employment discrimination and held that a general intelligence test used in hiring that could have a "disparate impact" on applicants of some protected classes must "bear a demonstrable relationship to successful performance of the jobs for which it was used." In other words, a company that wants to use a test like the Wonderlic, or like the SAT, or like the current WAIS or Stanford-Binet IQ tests, in a hiring procedure had best conduct a specific validation study of the test related to performance on the job in question. Some companies do the validation study, and use IQ-like tests in hiring. Other companies use IQ-like tests in hiring and hope that no one sues (which is not what I would advise any company). Note that a brain-teaser-type test used in a hiring procedure could be challenged as illegal if it can be shown to have disparate impact on some job applicants. A company defending a brain-teaser test for hiring would have to defend it by showing it is supported by a validation study demonstrating that the test is related to successful performance on the job. Such validation studies can be quite expensive. (Companies outside the United States are regulated by different laws. One other big difference between the United States and other countries is the relative ease with which workers may be fired in the United States, allowing companies to correct hiring mistakes by terminating the employment of the workers they hired mistakenly. The more legal protections a worker has from being fired, the more reluctant companies will be about hiring in the first place.)

The social background to the legal environment in the United States is explained in many books about hiring procedures

http://books.google.com/books?hl=en&lr=&id=SRv-GZkw6...

Some of the social background appears to be changing in the most recent few decades, with the prospect for further changes.

http://intl-pss.sagepub.com/content/17/10/913.full

http://www.economics.harvard.edu/faculty/fryer/files/Fryer_R...

http://books.google.com/books?hl=en&lr=&id=frfUB3GWl...

Previous discussion on HN pointed out that the Schmidt & Hunter (1998) article showed that multi-factor procedures work better than single-factor procedures, a summary of that article we can find in the current professional literature, for example "Reasons for being selective when choosing personnel selection procedures" (2010) by Cornelius J. König, Ute-Christine Klehe, Matthias Berchtold, and Martin Kleinmann:

"Choosing personnel selection procedures could be so simple: Grab your copy of Schmidt and Hunter (1998) and read their Table 1 (again). This should remind you to use a general mental ability (GMA) test in combination with an integrity test, a structured interview, a work sample test, and/or a conscientiousness measure."

http://geb.uni-giessen.de/geb/volltexte/2012/8532/pdf/prepri...

But the 2010 article notes, looking at actual practice of companies around the world, "However, this idea does not seem to capture what is actually happening in organizations, as practitioners worldwide often use procedures with low predictive validity and regularly ignore procedures that are more valid (e.g., Di Milia, 2004; Lievens & De Paepe, 2004; Ryan, McFarland, Baron, & Page, 1999; Scholarios & Lockyer, 1999; Schuler, Hell, Trapmann, Schaar, & Boramir, 2007; Taylor, Keelty, & McDonnell, 2002). For example, the highly valid work sample tests are hardly used in the US, and the potentially rather useless procedure of graphology (Dean, 1992; Neter & Ben-Shakhar, 1989) is applied somewhere between occasionally and often in France (Ryan et al., 1999). In Germany, the use of GMA tests is reported to be low and to be decreasing (i.e., only 30% of the companies surveyed by Schuler et al., 2007, now use them)."

Integrity tests have limited validity standing alone, but appear to have significant incremental validity when added to a general mental ability test or work-sample test.

http://en.wikipedia.org/wiki/Employment_integrity_testing

http://apps.opm.gov/ADT/Content.aspx?page=3-06&JScript=1

http://www.princeton.edu/~ota/disk2/1990/9042/9042.PDF

http://www.hotelschool.cornell.edu/research/chr/pubs/reports...

Bottom line: if you are someone with a Ph.D. degree in an academic subject, congratulations. If you seek a job outside academia with good management that understands research, be prepared to do a work sample test to get the job. Companies that hire on the basis of resume biographical qualifications (what degree you have) do demonstrably worse in hiring than companies that make sure that all job applicants can do the actual work of the job. Wouldn't you rather work somewhere where the company focus is on hiring the capable, rather than on hiring the possessors of school credentials?


The submitted article contains a key fact, "The problem with this picture is that there are 10 PhD graduates for every tenure-track position. And, while I don’t have figures, the industrial labs don’t hire at a much faster rate either. And that left regular industry jobs as the only viable option for the vast majority of PhDs," which actually applies to a lot of academic major subjects.

The humanities are worse: http://jseliger.wordpress.com/2012/05/22/what-you-should-kno... .


Here in Ukraine, people write their PhD thesis while working in the IT industry, because a scientist's salary is extremely low ($300) and the average salary in industry is about $1500+.

I found the same situation with PhD status as this article highlights. A degree means nothing here. Really nothing. The only thing that matters is the results of your work. A PhD should be better at analysis, problem solving, etc.


One thing to add: when we interview PhDs, we make sure they want to be in industry. There is nothing more off-putting than talking to someone and realizing immediately that they would much rather have been in academia, but they were not the right person at the right time.


I think in Europe (at least in Austria) a PhD is a pretty huge factor for a job in industry. In my opinion this is a bad thing, but academic degrees are very important here. This is not specific to the computer science industry, but it happens there as well. There are some positions that are always filled by academics, for the sake of publicity and reputation. No matter what they can do, the only important thing is the PR effect of an academic degree.


Nowhere on Earth has a higher opinion of academic degrees and higher education than the German-speaking world. And if any place equals it, I don't know where it is.


Even within the German speaking world, Austria has a reputation of being obsessed with titles.


XKCD's perspective: http://xkcd.com/664/



