I've been a software engineer for 10 years and I can't do interview questions (reddit.com)
115 points by tempw on March 6, 2017 | 175 comments



If you hate the idea of coding questions as part of interviews, please consider the other side for a moment.

There are people out there (not you, obviously) who can talk their way into an interview for programming jobs that they simply can't perform. We're talking about no skills whatsoever, apart from the ability to sling buzzwords around enough to impress a recruiter. God only knows how they get their degrees, but they have them.

Simple, straightforward programming questions that aren't already on the Web are like an immune system response to these non-programmers who interview for programming jobs. I'm truly sorry when a real programmer feels insulted when I ask them something like "given the starting and ending times of two calendar appointments, write an expression that's true if they conflict", because it's not their fault that there are people out there who will struggle for 20 minutes thinking about how to solve it before trying something like a doubly-nested loop over all the milliseconds in both intervals, testing for equality.
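The conflict expression described above really is a one-liner; a sketch in Python, assuming half-open intervals (an appointment ending exactly when another starts is not a conflict):

```python
def conflicts(start1, end1, start2, end2):
    # Two half-open intervals [start, end) overlap iff each one
    # starts before the other one ends.
    return start1 < end2 and start2 < end1
```

No loops over milliseconds required.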

I am not making this up. It's a real problem. If you can program a digital computer, you have nothing to fear from a good coding interview. And if you have a better way to reliably identify the non-programmers, I'd love to hear it.


I myself have sat in an interview with my mouth agape as I watched a "senior" engineer try to figure out how to find the largest element in a one-dimensional array.

So these days I don't have much of a problem with simple coding questions to filter out total non-programmers (FizzBuzz, your calendar problem, etc.).

But the whole stand on one foot and implement a red-black tree in bash script on the whiteboard while I peck at my smartphone and sigh every time you pause to think has got to freaking stop.


> while I peck at my smartphone and sigh every time you pause

This is a much bigger deal to me than difficult algorithm questions. Shitty interviewers give shitty interviews, regardless of what kinds of questions they ask.

The worst interview I ever had was like this. The guy showed up late, told me he forgot he even had an interview, gave me a strange question with little detail, and told me to start coding. Then ignored me for an hour while he typed on his laptop. Even when I directly asked for feedback to make sure I was solving the problem he asked and not a misunderstanding, he gave me basically no feedback. I've had interviews that I bombed before, but this was the first and only time that I felt like I bombed an interview and it was utterly not my fault. It was also the only time I've had a bad interview that actually ruined the rest of the day for me because it threw me off so much.


I think I would just walk out at that point. At the end of the day, this is someone you are going to have to work with or for.


I wish I had. If something like that ever happens again, I will.


> implement a red-black tree in bash

The only good answer to that is "I can't imagine why I'd ever need to do that, and if I had to anyway, I'd use a 2-3 tree instead because it's almost as good and nobody ever gets red-black trees right the first time."


I had two people this week, 10+ years "experience", fail at writing a function removing duplicate values from an array.
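For reference, one common shape of the answer; a minimal sketch in Python, using a set to track values already seen while preserving order:

```python
def remove_duplicates(values):
    # Keep the first occurrence of each value, preserving order.
    seen = set()
    result = []
    for v in values:
        if v not in seen:
            seen.add(v)
            result.append(v)
    return result
```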


I have interviewed two "senior web developers" with "10+ years" of experience that were unable to tell me the difference between a 404 and 500 http response code. Of course, they had perfectly memorized answers about JavaScript closures.


Who memorizes HTTP response codes?


Presumably, programmers who are working with the web should have a passing familiarity with the most common categories; this isn't memorizing all of them, but being able to recognize the significance of 2xx vs 3xx vs 4xx vs 5xx codes isn't an unreasonable expectation.
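Recognizing the significance of those categories amounts to looking at the leading digit; a trivial illustrative sketch:

```python
def status_class(code):
    # Bucket an HTTP status code by its leading digit.
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")
```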


Who can't tell the difference between a 404 and a 500?


Somebody who rarely instructs a user agent to make requests to a server.


This isn't entirely accurate. I told my RN fiancée this story and she said "oh, 404, that's like... page not found or something, isn't it?" She has zero programming experience.


So, basically, somebody who never uses a web browser? 404 and 500 should be familiar just from browsing the web for more than a few weeks.


Modern browsers seem to do quite a bit to insulate users from error status codes in their default configurations.


The people in question do in fact use internet browsers on a daily basis.


That's good to hear.


Programmers who want their REST API to return correct http response codes. I have a good many of them memorized by now. :(

...someday I'll find an excuse to return a 418.


Bah! Just always return 200 and if there's an error, make your users look for an error property in the json return value that contains a thoroughly unhelpful error message, like "I'm a teapot." ;)


Web developers spend most of their time in JavaScript, coding the frontend, where closures and functional programming matter a lot. Maintainable modern frontend code is written that way.

They spend very little time staring at http error codes. Like, none.


Web development includes server-side, and that API the fancy JS code is interfacing with.


Node.js is JavaScript too :). Even there, error codes are something you maybe handle once when the project is set up, and the rest of the time you deal with everything else.


Are you sure it was literally a question about the largest element in a one-dimensional array, and not some highly obfuscated question about bees, trains, and traveller bags?


"There are people out there (not you, obviously) who can talk their way into an interview for programming jobs that they simply can't perform. We're talking about no skills whatsoever, apart from the ability to sling buzzwords around enough to impress a recruiter. God only knows how they get their degrees, but they have them."

"I am not making this up. It's a real problem. "

No, it's survivorship bias. You have no way of knowing how they would perform, because you never hire them: they can't pass your interview process.

There are many reasons why someone capable and experienced would "fail" an interview like that. But ultimately it just doesn't test for skills related to software development.


If a candidate can't construct a simple Boolean predicate to characterize a well-defined condition in an interview situation, do you really think that they're going to be able to do much more complicated work later if we hire them?

Be serious.


This is HN, we can't be serious unless everyone gets a medal


It's not just survivorship bias, it's also confirmation bias. These folks fail to acknowledge the adverse selections made by their process. What about the people on your team that didn't work out? What about the net-negative producers? I don't believe you that your whiteboard heavy process eliminates all of those people. I've seen the counter-examples.


YES. This is why interviewers ask experienced devs to do FizzBuzz. It's not an insult, it's a safeguard.
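For anyone unfamiliar, FizzBuzz is about as small as a screening question gets; a sketch in Python:

```python
def fizzbuzz(n):
    # Classic screen: multiples of 3 -> "Fizz", of 5 -> "Buzz",
    # of both -> "FizzBuzz", everything else -> the number itself.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```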


I consider myself an experienced developer and have experienced colleagues who likely could not do FizzBuzz or solve simple algorithmic problems like finding duplicates. The worst were the ones with high charisma who could talk well in generalities and bluff; it took management a long time to figure them out.

Experience taught me not to be insulted by those questions. They at least guarantee that every colleague who passes them can write basic code.


FizzBuzz is not what we're talking about here. It's "balance a tree" with me looking over your shoulder (and $100k in a suitcase) type of problems.


So that if the candidate has to actually work through FizzBuzz (instead of showing off a well-rehearsed set piece), the interviewer has an indication that what the candidate claimed earlier stems from actual ability and not from ruthless optimization for the local maximum of being good at interviews?


OK, but I got this question:

http://www.geeksforgeeks.org/given-n-appointments-find-confl...

It requires the simple comparison you mentioned, but coming up with an efficient algorithm is far more complex. Needless to say, I totally failed that one. I had never heard of an interval tree until I looked it up after the interview. It took me just a moment to find it with my awesome Google skills, and if I actually had to use one, I could copy-paste that code in an instant :-) Even better, I could find a library and not even have to copy-paste.

One time I was working on a language translator at my actual job. I was storing formulas for Excel columns in a database, and they had to be translated from their database representation into the Excel formula language. It turned out that the columns had dependencies on one another and had to be processed in dependency order. This required something called a topological sort. Again, something I had never heard of before I encountered this problem at work (I do have a CS degree). But I found the solution with a quick Google search and employed a library, already in use in my client's software, that contained the algorithm.
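A topological sort like the one described can be sketched with Python's standard-library graphlib; the column names and dependency map here are made up for illustration:

```python
from graphlib import TopologicalSorter

# Hypothetical column-dependency map: each column maps to the set of
# columns its formula references, which must be translated first.
deps = {
    "price": set(),
    "qty": set(),
    "subtotal": {"price", "qty"},
    "tax": {"subtotal"},
    "total": {"subtotal", "tax"},
}

# static_order() yields every column after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
```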

That's the actual skill that's valuable: translating your requirements into a bunch of search terms that will quickly give you the name of the algorithm you need (and, obviously, building the inputs and working with the outputs of that algorithm). Yeah, I guess if you already have all of those in your brain you might save some time, but what about when you encounter a problem you don't know how to solve?


Not sure if I'm overlooking something obvious, but given the possibility of reducing dates/times to simple integers (doable according to the example), you should be much faster sorting the data and doing an intelligent comparison than building a tree in O(n log n) time out of n heap-allocated nodes. Those heap allocations should slow you down drastically, at least for any problem size where the sort-based approach is still efficient enough.
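The sort-based approach suggested above might look like this sketch in Python, which just detects whether any conflict exists among (start, end) pairs:

```python
def has_conflict(appointments):
    # Sort by start time; a conflict exists iff some appointment
    # starts before the largest end time seen so far.
    max_end = float("-inf")
    for start, end in sorted(appointments):
        if start < max_end:
            return True
        max_end = max(max_end, end)
    return False
```

Listing all conflicting pairs (as the original problem asks) needs a bit more bookkeeping, but the same sort-then-sweep idea still applies.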


I do not understand the problem you linked to of finding "conflicting appointments". It does not usefully define what a "conflict" is, only saying that "An appointment is conflicting, if it conflicts with any of the previous appointments in array". It does not say what "conflict" means for appointments.

A natural assumption is that two appointments conflict if their times overlap.

The example instance of the problem gives this as input:

> Input: appointments[] = { {1, 5} {3, 7}, {2, 6}, {10, 15}, {5, 6}, {4, 100}}

The problem spoke of "appointments in array" and this is an array, so apparently {1, 5} {3, 7}, {2, 6}, {10, 15}, {5, 6}, {4, 100} are the appointments themselves. So we have 6 appointments.

So I'd guess that {1, 5} means an appointment that starts at time 1 and ends at time 5. That {4, 100} concerns me, though...if the numbers are times there is no obvious unit of time for which all of these appointments make sense.

They give this as the expected output for the above input:

  Output: Following are conflicting intervals
  [3,7] Conflicts with [1,5]
  [2,6] Conflicts with [1,5]
  [5,6] Conflicts with [3,7]
  [4,100] Conflicts with [1,5]
OK...all of those ARE conflicts under my guessed interpretation of the notation. But under that interpretation, {2, 6} should conflict with {3, 7}, and {4, 100} should conflict with everything else.

So that means my interpretation of the notation {x, y} on input meaning an appointment from time x through time y is not right.

Another idea is that {x, y} means person x and person y have an appointment together, and two appointments conflict if the same person is involved in both. Nope...that doesn't give the sample output either.

OK...maybe the sample output is wrong. So now I grabbed their code and compiled it on my Mac and ran it. It produced this output:

  Following are conflicting intervals
  [3,7] Conflicts with [-1960454340,32767]
         [2,6] Conflicts with [-1960454340,32767]
         [10,15] Conflicts with [-1960454340,32767]
         [5,6] Conflicts with [-1960454340,32767]
         [4,100] Conflicts with [-1960454340,32767]
WTF? OK...so I take it to a Linux box, and try it there, and it gives the output given at the site. Hmmm. I do see that I got a compiler warning on the Mac and not on Linux, about control reaching the end of a non-void function. The function newNode is missing a return! Putting "return temp;" at the end gets rid of the warning, and then the Mac produces the same output as Linux. (Raising the question of why gcc on my Linux box did not warn about the missing return...).

Anyway, I see that there is a function in there to determine if a pair of appointments overlap:

  // A utility function to check if given two intervals overlap
  bool doOVerlap(Interval i1, Interval i2)
  {
      if (i1.low < i2.high && i2.low < i1.high)
          return true;
      return false;
  }
OK...looking at that, it sure looks to me like it should say that {2, 6} and {3, 7} overlap. A quick test confirms that {2, 6} and {3, 7} in fact do overlap according to doOVerlap. So why isn't it figuring that out?

Putting a printout in doOVerlap to see how it is getting used:

  // A utility function to check if given two intervals overlap
  bool doOVerlap(Interval i1, Interval i2)
  {
    cout << "check " << i1.low << "," << i1.high
        << " against " << i2.low << "," << i2.high << "\n";
    if (i1.low < i2.high && i2.low < i1.high)
        return true;
    return false;
  }
changes the output to this:

  Following are conflicting intervals
  check 1,5 against 3,7
  [3,7] Conflicts with [1,5]
  check 1,5 against 2,6
  [2,6] Conflicts with [1,5]
  check 1,5 against 10,15
  check 3,7 against 10,15
  check 1,5 against 5,6
  check 3,7 against 5,6
  [5,6] Conflicts with [3,7]
  check 1,5 against 4,100
  [4,100] Conflicts with [1,5]
OK...so it is not finding that {2, 6} and {3, 7} overlap because it is never even checking them against each other.

I'll leave it to someone more curious than I to figure out what is wrong with their "solution".


PS: here is a quick and dirty Perl program to do it. I did not bother matching the output format exactly, as I just wanted to get the algorithm right. This should be O(n log n) where n is the number of appointments, assuming that the built-in sort is O(n log n) and that the built-in hash implementation is O(n log n) or better.

    my @appt = ([1, 5], [3, 7], [2, 6], [10, 15], [5, 6], [4, 100]);

    my %active;
    my @endpoints;

    for (my $i = 0; $i < @appt; ++$i)
    {
        my($start, $end) = @{$appt[$i]};
        push @endpoints, [$start, 1, $i];
        push @endpoints, [$end, 0, $i];
    }
    @endpoints = sort {$a->[0]*2+$a->[1] <=> $b->[0]*2+$b->[1]} @endpoints;

    foreach (@endpoints)
    {
        my($time, $side, $which) = @$_;
        if ($side == 0) {
            delete $active{$which};
        } else {
            print "$which overlaps $_\n" foreach sort {$a <=> $b} keys %active;
            $active{$which} = 1;
        }
    }
Idea behind this: make a list of all the appointment endpoints, tagged by whether they are a start or end time and which appointment they came from. Sort this list by time, with end times coming ahead of start times if a given time occurs more than once. You can then scan the list, keeping track in a hash of which appointments you have seen a start time for without yet seeing the corresponding end time.

When you see an end time, you can delete the appointment from the hash. When you see a start time, the appointment that supplied it overlaps with all the appointments currently in the hash.


What happens when you're asked to solve a problem that hasn't been solved before? By definition, that's something you cannot Google for.

That's what I was looking for as a hiring manager.


Generally you end up doing a review of relevant literature and a few days/weeks/months of research and development. I'm not sure answering a question under pressure that has just been asked is that relevant to making ground in new areas of research.


The point is that solving problems, either by inventing new computer science, or more likely formulating a hybrid of known algorithms requires some semblance of problem solving.

A genuine interviewer (and I realize several/many may not be genuine) is solely trying to figure out if the candidate has these types of reasoning skills.

Let me phrase it this way: By asking a common CS question, you'll get people who simply memorize answers/algorithms. By asking something obscure that is rarely known, you can try to get a glimpse into someone's thought process, which is infinitely more valuable than rote memorization.


Right but there are lots of ways to test problem solving skills that don't involve asking people to both solve a surprise (and obscure) problem and come up with code at the same time whilst under stress. The latter three things are probably clouding your results in a way it's not really possible to account for. It seems like a lot of people setting interviews lack a bit of empathy.

The grandparent seems like the problem solving equivalent of yelling "think fast" whilst tossing a basketball at someone's face to test reflexes.


Okay, give us a generalized example? Everyone who (appears to) care about being empathetic while at the same time testing for these skills is continually asking for better options.

If you're hiring a point guard, isn't testing their reflexes by tossing them a basketball a reasonable approach?


Ask a simple, standard question that the candidate could reasonably answer in 10 or so minutes, confirm it works with them, and then change the parameters to invalidate their solution?

This is the closest I can get to actual project work in 45 minutes.


The second the interviewer changed the parameters people would start complaining that they were "Setup to fail." These threads commonly illuminate that interviewers simply cannot win, regardless of how genuine they are.


That's why I recommend confirming that the candidate's first solution was right. Use lots of positive feedback: ok, looks good, that's right, that's how I would do it; I see where you're going with this; etc. Then introduce the change, with a justification e.g. the team providing the input data changed their format/guarantees/technology, and we have to blah blah blah.


Why bother? You still fail to detect candidates who memorized the answer.

My standard coding question is a funny-looking prime sieve. If someone isn't familiar with one, we can work through it together, and I'll get a much better idea of their learning potential and what they'll be like to work with than I would from the intern straight out of their Algorithms final.
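The commenter's exact "funny-looking" variant isn't shown; for reference, a plain Sieve of Eratosthenes in Python:

```python
def primes_up_to(n):
    # Sieve of Eratosthenes: cross off multiples of each prime,
    # starting from p*p (smaller multiples were already crossed off).
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]
```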


Personally, I don't bother. I went the route of asking candidates to go through a work-sample project that I spent a considerable amount of time trying to standardize across all the languages relevant to the role. In the end, though, management wasn't for me, and so I ended up changing roles to something non-managerial.


Work-sample tests are better in several ways than whiteboard/puzzle tests. The latter is what the original article is objecting to, not coding demonstrations in general.


No. Non-programmers are great at getting other people to do their work for them. I think that's how they get their degrees. It's only when you force them to demonstrate a small bit of problem-solving and coding skill that you can expose them, and that has to be done in real time and offline.


> to demonstrate a small bit of problem-solving

That's what the work sample test does. It doesn't have to be given off site.


I am not the author, just to clarify that.

I think the point here is more that experienced programmers' track records are sometimes not enough to prove they can do the work.

Using whiteboard questions to prove one's competency can backfire as well, in cases where candidates are not really prepared but have learned to game the system.


The real problem with whiteboard tests (and their close cousin, the live coding test) is that there is no correlation between being able to pass such a test and being able to do the job. Companies that rely on them to determine programming ability are like the drunk looking for his keys under the street lamp because that's where the light is.


"no correlation"? Nonsense. If you can't program a computer, I'll find that out. And if you can't program a computer, you wouldn't be able to do the job.

The point of my coding interview isn't to give a positive result if the candidate can program a computer. It's a test that's looking for non-programmers.

There might be "false positives", but if I avoid well-known memorizable questions, there won't be "false negatives".


There will be false negatives too, but most likely not related to the skills but something else, like mental health or work ethic.


>struggle for 20 minutes thinking about how to solve it before trying something like a doubly-nested loop over all the milliseconds in both intervals, testing for equality.

Tell me you're kidding... that this is just a contrived example. please.


Nope, not kidding. A new grad with a masters in CS.


Actually this isn't a totally trivial question.

The time zones in calendar appointments can change if there's a switch between standard time and summer time.

You need to know when the clock is set back by an hour, and in that case sometimes the answer is "maybe".

I was working at a small phone company, and handling the switch between time zones was not just non-trivial; it caused hard-to-find bugs, as these things happen only twice a year.


Which is great, because you can pivot and expand on the question.

Once they've got the comparison out of the way, you can bring up those issues and see where they grow their implementation.


I was in the same situation a couple of years ago. I had over 15 years of experience, was invited for a technical interview at Amazon, and failed. At first I was irritated, just like the author. But then I decided to turn that negative energy into something positive and followed a couple of courses on algorithms on Coursera; I also discovered HackerRank. And I found out that I actually loved solving algorithmic problems!

Now it is a couple of years later and I actually was able to land a job thanks to my new skills (not at Amazon though). But what's much more important: I found a new hobby that will keep me entertained for many more years.


My story is the same. I was a promising junior developer when I went to an interview at Amazon. Got my butt handed to me. I was upset, but decided to work on it. I read a few books on the topic, and I no longer find these interviews difficult.


Before I interview, I mention that I am uncomfortable with whiteboards. I prefer pen and paper. It's faster to write and feels a bit more natural.

Only one company denied me the request, and I didn't go ahead with them. Their loss.

I've also found that having a GitHub profile where you've filed issues on projects and made a couple of commits really helps your chances. It gives more data than a resume.

At one interview, the interviewer read some of my commits and he said "I don't have any doubts you can code, but I wanna ask you about design". He went on to ask why I did things in a certain way.

But seriously. "Say no to coding on whiteboards".


I'm the opposite way. I prefer the whiteboard to pen and paper.


I was in the same situation, but I simply stopped doing these kinds of interviews and started doing contract work. The pay is better, with fewer strange questions asked upfront.


It's weird how companies will hire another company to build a tech product and never ask to see examples of code or put them through whiteboard interviews. You just show them previous work and talk about what the client needs and how you can meet those needs.

But to hire a person, even in at-will employment states, requires extensive hoop-jumping from the employee. You can't just show them previous work and tell them how you can help; you have to PROVE it with outlandish questions that may have little to do with the actual job.


It feels like employers think employees want to steal something from them...


I think algo/ds interviews are created to test only one thing, and that is hard work. The idea that they test problem solving is mostly bullshit.

Most people are good at algorithm interviews because they've practiced them a ton. They've seen basically all the different kinds of questions that get asked on an interview, and are able to use that mental database to answer any questions.

Most of the time in interviews, interviewers will commend an applicant's "problem solving" skills if they did well on a difficult algo problem, when in all likelihood they've seen that exact problem, or an extremely similar one, before. If an applicant hasn't seen Dijkstra's algorithm before and is able to derive it in a 45-minute interview, you would need to hire them on the spot, given that Dijkstra himself didn't develop the algorithm in 45 minutes. That is a contrived example, but you get my point.

For better or for worse, algo interviews are for testing how much work you've put into studying (mostly worse).


I've seen that many good interviewers pose a problem with a straightforward sub-optimal solution, which they expect you to figure out ("problem solving") and code. If it can be solved better using Dijkstra, that comes as a follow-up.


After working with a couple of people who did not like to think, I see value in a willingness to study and understand slightly more complicated things.


Most people are not good at doing these interviews. Which is one reason why they work better than a lot of people here think they do.


I've had a lot of trouble finding a job since shuttering my startup, which I built on my own. I open-sourced my code, but no one seems to care. They still look at me as if I have no experience and pass on me before a technical assessment.

It's troubling to know that engineers with many years of experience still have difficulties. Every job listing I see asks for someone with 6+ years, and the only feedback I ever get is "we're looking for someone with more experience, try again in 4 years."

Tech companies should really just hire for soft skills. Screw tech interviews completely. A lot of this stuff can be taught. Maybe just give people a contracted trial period. Companies are spending incredible sums of money just searching for candidates that meet their exact "needs." In reality, they're not going to know how well someone performs until they actually work with them. Stop wasting your employees' time by having them grade pointless coding quizzes. Use that time instead to train someone who's a good communicator, is passionate about the product, is creative and eager to learn, and wants to build something your customers will love. Even if that doesn't work out, at least you didn't waste everyone's time.


You are arguing that we should just hire anyone, regardless of tech skills, if they can carry on a conversation? It takes years to learn how to do actual complicated programming. I think we need to screen for both tech skills and soft/people skills. I always strive to do both in my interviews.


I'm not saying that, and I doubt there is a very high number of fraudulent applicants to companies. If that becomes an issue, then there are less terrible ways of dealing with it. What I am saying is that we should let people's experience speak for itself. If someone's been a developer for three years (and didn't get fired for incompetence), then they probably know how to code.


> I doubt there is a very high number of fraudulent applicants to companies

The payoff is a $100,000 salary that you can collect for a month or two; of course there are fraudulent applications.

> If someone's been a developer for three years (and didn't get fired for incompetence), then they probably know how to code

I've worked with very capable Ph.D.s with wonderful problem-solving and analytical skills who would write the worst code, and were the worst at accepting feedback, because their extremely visible and objective achievements allowed them to believe they didn't need to improve their coding skills (which had already gotten them so far).


Having been in your position, I understand that having built a product from the ground up makes you feel like you are qualified to do anything. I was lucky enough to get hired at a late-stage startup after my endeavor failed, and am constantly reminded that time in the industry matters. Two years of intense coding is worth a lot, but you've only been exposed to a limited set of problems, both technical and business. 6+ years sounds reasonable to me if they are looking for someone with well-rounded experience.

Mind you, as someone else mentioned, some folks spend 6 years in the industry and don't gain the knowledge and experience you already have, but that doesn't mean you've attained 6 years' worth of knowledge and experience.


I don't claim to be as good as someone with six years of experience (I'm not delusional). Nor am I claiming that there are no companies that need to hire senior engineers. I'm well aware of many types of problems that I am not qualified to solve. What I am asking is: does EVERY company need to exclusively hire senior engineers? I've interviewed at companies whose product was not much more than a CRUD app, and, despite performing fairly well on the technical assessment, was rejected for lack of experience (even though they could see how much experience I had right on my resume).

Most companies don't need the best of the best of the best. They can save a lot of time and money by hiring someone who, with just a little bit of investigation, they know is going to come in with some decent technical skills, and can quickly learn to become a valuable asset.


Companies with that mindset invest heavily in interns: you get a three-month work sample in your working environment & can (initially) offer them less than an experienced engineer.

Most companies have problems that need solving right now or in the near future, not when an intern gets experienced enough to tackle them.

> come in with some decent technical skills, and can quickly learn to become a valuable asset

This sounds like you described a consultant.


Yeah, I was in a similar position: a lot of code from a startup that crashed (I wasn't the founder, but the first technical team member, and the founder ended up being useless as a reference), which people refused to look at.

One thing I unfortunately take from that period is a bitterness toward people who complain about whiteboard interviews, because I was never even given a chance to do coding problems. I understand that many of the complaints are rational (especially when the interview skills are totally misaligned from the on-the-job skills), but since I was stuck before that phase, I'd emotionally rather have less early filtering and more code-based interviews.

If I actually came into a position of hiring power, I'd take time to reevaluate and separate myself from the question, but that's the instinctual response my history has brought me to.


But could all practicing lawyers pass the bar right now in their respective states? I'd guess no. The problem is not necessarily the questions; it's that you have to study a bit and do the same goddamn interview questions at multiple companies, sometimes with very different interviewers. Some are helpful, some make me nervous, some ask the hardest possible questions without easy warm-up questions, etc.

Whiteboard or live coding is not the problem. Bad or untrained interviewers are, along with the fact that we can't do one interview that proves to multiple companies what level we are at.


However, should a practicing lawyer have to re-pass the bar in order to land a new job?


Are we going to make the PE exam a requirement for the title of "Software Engineer"?


It really should be considered. I'm a registered (civil) EIT. It would help with the ambiguity in hiring and the ever-increasing absurdity in interviewing.

Especially since the conflicting statements of "software jobs need protection to ensure good salaries" and "everybody needs to code" are all too prevalent here.

Being a PE means you have a certain level of experience and skill. It's a distinguishing feature and sets you apart as a "real" engineer. Additionally, it holds you to an ethical code.

What continues to astound me is the lack of rigor and ethical backbone coming from engineers in important places. Yeah, you're not building a bridge, but incidents like Volkswagen and Yahoo could have been prevented if people's jobs had bound them to behave ethically.


I suspect your advocacy of a PE designation for software and the discussion in this thread are slightly orthogonal. Not that that's a bad thing, but I want to point out that the question under discussion would be similar to asking whether civil engineers should prepare as if they are sitting for their PE exam every single time they start the interviewing process. Should practicing doctors, no matter how accomplished, prepare as if sitting for their boards every single time they start the interviewing process to, say, move to a new hospital?

I also suspect that everyone is asking the wrong questions. From the anti-algorithm-questions crowd, there is scant acknowledgement that it can't be a blanket policy, as undoubtedly some positions are strongly algorithm-heavy roles. From the pro- crowd, there is scant acknowledgement that by the time you are trying to find new high-tech hires through a process that bases an evaluation upon actual interactions measured in hours or at best days, you may have already failed.

The industry is fitting an interviewing process last majorly overhauled during the industrial age, one that screens for basic three-R's and attendance skills, onto a post-industrial landscape where those skills are barely above cosmic background radiation noise, and demonstrated achievers are as often groomed into hirings through networking over years and decades.

Perhaps part of the path to better outcomes from "this high tech interviewing process that takes place in hours or days" isn't "find better questions" or "find a better procedure that still fits in hours or days", but "re-think our premises"? Some kind of PE might fit into that re-think, but the brokenness is so bad that Google has quantified the inability of our conventional processes to yield significantly-better-than-random outcomes, so it might be time to start questioning the entire premise that we can even hire based upon tech interviews in the first place.


> but I want to point out that the question under discussion would be similar to asking if civil engineers should prepare as if they are sitting for their PE exam every single time they start the interviewing process?

I suspect the typical PE exam is quite a bit more rigorous than the typical technical interview. Technical software engineering interviewees are asked to do things like reverse linked lists and provide the most basic runtime and space complexity analysis. The structural engineering PE covers everything from load analysis and building codes to runoff analysis and slope stability[1]. There are 9 different "breadth" exam areas and 3 different "depth" areas, and they're all covered.

I think it's slightly ridiculous to compare understanding basic datastructures and algorithms to sitting for a PE.

[1] http://ncees.org/wp-content/uploads/Civ-Str-April-2015_with-...


I think what they are trying to point out is PE holders don't revisit taking even part of the PE when they interview. Is that true, though?


I have no idea what a typical (civil/structural/etc) engineer goes through when interviewing. I suspect that the PE isn't actually very relevant, though, since most practicing engineers are not PEs. Something like 20% of engineers get their PE. I'm guessing the interview process for PEs and non-PEs is pretty similar.


Doctors, lawyers, and engineers have other forces regulating their competence. In addition to licensing, they also bear personal liability for malpractice. As a result, a malpractice claim that sticks in those fields is significant news because it's relatively uncommon.

If it were possible to sue a software developer to recover their salary and for damages done, you'd better believe that nearly all (though not all) of the people who are rather less than qualified would be out of the profession.


> Should practicing doctors, no matter how accomplished, prepare as if sitting for their boards every single time they start the interviewing process to say, move to a new hospital?

But software engineers don't have any such designation or examination process, so I don't understand this analogy. If anyone could call themselves a doctor, then the interview process would have to be far more rigorous.

Also, I think doctors do need to rewrite certain exams every few years to retain their medical license.


Most everyone with an undergraduate CS degree will have taken an algorithms course. I think the comparisons with designations and examinations are drawing a rough equivalency to those credentials, for example. I can't offhand think of another high-skill profession where the interviewing process tends to draw upon one phase of a career (undergraduate studies), and one specific aspect of that phase (algorithms and not, say, compiler design, or a programming languages survey, or software engineering, etc.), so our field might be a little odd in that respect if indeed that kind of emphasis is rare across high-skill fields. Doctors, PEs, and attorneys have continuing education requirements, but I haven't heard of doctors boning up on, say, anatomy or OChem for their interviews.

For those who hold a PE, what have interviews in your field been like?


Many professions have designations that require passing standardized exams, minimum qualified experience requirements, and on-going professional development requirements. Doctors and lawyers are the most prominent such examples, but it is also true of actuaries and licensed professional engineers. Tradespeople, like plumbers and electricians, also have licensing requirements. In finance, there is the CFA designation, along with a number of securities exams and registrations imposed by financial regulators for certain roles. Managers, of course, have the MBA professional degree, which somehow counts as a "professional degree."

These systems are beneficial to both employers and employees. As an employee, you gain a certain amount of job security, mobility and collective bargaining power by having a recognized professional designation and an association of peers that can prescribe ethics and safety rules to defend against abuses by management. And as an employer, you can safely hire professionals without an onerous amount of vetting - you know that, for example, your electrician won't burn down your building with faulty wiring because he can't solve an electrician's equivalent "Fizz Buzz."

For whatever reason, the software engineering profession has no such professional organization. Perhaps it is because software engineering is a relatively new field. Many of the examples of functioning professional associations are related to very old professions (e.g. doctors, lawyers, and even actuaries), and some have their roots in the medieval system of guilds and apprenticeships. The MBA and CFA professional designations were established in the 1940s.

Somehow it is difficult to imagine that, at this point, a software engineering professional designation could ever be established.


If "software engineers" want to be treated like "professional engineers", licensing is the obvious route.

Licensing has been proven to work for other engineering fields.


You know the process is fucked up when experienced professionals (with a track record of delivering projects) start complaining about how companies don't look for the ability of delivering projects.


Every engineer says they delivered projects on time with great maintainable code. Unless the project is open source, they often have no way of giving any evidence.

I'm not saying whiteboard interviews are the right way to interview, but in my experience you can't trust people's judgment about themselves and their skills. You need an interview that is equally fair to the person that will exaggerate their past successes to the people that are more reserved or the people that don't have much experience yet.


>>Every engineer says they delivered projects on time with great maintainable code. Unless the project is open source, they often have no way of giving any evidence.

Okay, but having them solve coding problems on the whiteboard doesn't provide any evidence of that either.

If they say they delivered projects on time with "great maintainable code," ask them what makes code maintainable, have them provide examples (using pseudocode) and talk through each one. Ask them about a time when they had to sacrifice maintainability for time, application performance or user experience. Ask them what they learned from the ordeal.

In my opinion you're going to be much more likely to accurately judge their expertise level based on their answers to such questions. The reason is simple: one can memorize whiteboard questions prior to the interview. They can't, however, bullshit their way through questions that are experience-based - and if they do, well, you should hire them immediately because you just found yourself a great salesperson!


I think it's just as easy to memorize 'design' questions as 'whiteboard' questions. Not every whiteboard question is "implement a linked list"; in fact, I'd argue that's very rarely the case (or maybe I'm wrong and I've been insulated from these interviews). Most of my whiteboard interviews have involved a relatively small amount of actual coding (I think I've had one question that was more than ~15 LOC of Python, and I've interviewed a few times).

The rest was taking a relatively abstract problem presented in English and understanding it, removing ambiguities, and discussing tradeoffs with the interviewer. After that, a recursive function or a hash map or whatever is really easy.


As a person who successfully solved a problem but was rejected because of coding style (on a whiteboard, not on IDE), I am respectfully wary of the term "removal of ambiguity".


How do you mean?

By "removal of ambiguity", I'm talking about when someone poses a problem like "Write a function to recognize an IP Address". My first thought is then "ipv4 or ipv6", because that problem is underspecified, and I don't want to inadvertently solve the incorrect problem (which is kind of a case of the xy problem)
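To make the point concrete, here is a minimal sketch of what you might write once you and the interviewer have agreed the scope is IPv4 only (the function name and the strictness rules, e.g. rejecting leading zeros, are my own assumptions, not part of the original question):

```python
def is_ipv4(s: str) -> bool:
    """Return True if s is a dotted-quad IPv4 address, e.g. '192.168.0.1'."""
    parts = s.split(".")
    if len(parts) != 4:
        return False
    for part in parts:
        # Reject empty fields, non-digits, and leading zeros like '01'.
        if not part.isdigit() or (len(part) > 1 and part[0] == "0"):
            return False
        if int(part) > 255:
            return False
    return True

print(is_ipv4("192.168.0.1"))   # True
print(is_ipv4("256.1.1.1"))     # False
print(is_ipv4("::1"))           # False; IPv6 is outside the agreed scope
```

Had the answer been "IPv6 too", the whole shape of the solution changes, which is exactly why the clarifying question matters.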


Having solved a problem on a whiteboard, the solution looked ugly. The reason it looked ugly is that I used variable names like i, j, k, did not adhere to OO principles, and messed up indentation. All this because I did not have an IDE and wanted to focus on the solution more than its translation into code. Rejected. The recruiter told me, "Maybe this is the way you code but the interviewers didn't seem to like it".

All I could think of was, "This is not the way I code at all. I just gave a simulated solution to a simulated problem solving environment". If you want to look at my coding style, give me a manageable problem and an IDE and I'll solve the problem with a focus on style.

I've not had a problem with phone interviews simply because that is a more natural coding environment.

Could I practice whiteboarding and become better? Sure.

Is whiteboard coding style a relevant metric of measuring my productivity as a software engineer? Maybe not!


> Unless the project is open source, they often have no way of giving any evidence.

Isn't this is the case for every job applicant in every industry? You look at the evidence available of someone's work history and make a judgment call.


> you can't trust people's judgment about themselves and their skills.

Unless the candidate is fresh out of uni, any qualified candidate with experience should be able to provide at least several solid contacts at previous employers from which to obtain this information.


References from co-workers aren't really that valuable. Many people have no qualms about giving glowing reviews for friends, even if those friends don't really deserve glowing reviews.


Like a pair coding project on a computer with access to their internet resources?

Can find out a lot about a person in a one/two hour pair coding interview.


In other engineering disciplines there's accreditation and references for this sort of thing. No CEng for software engineers sadly.


The interview process is great for employers...

1. Make qualified job interview candidate feel stupid by asking difficult questions unrelated to the job

2. Offer candidate low salary and argue that they barely passed the job interview

3. Profit

Best solution I can offer is to not play their game. You'll lose no matter how hard you try. You're better off studying marketing and psychology for job interviews.


Great points about marketing and psychology. Once you've won the lottery and gotten a whiteboard question you've prepared for, you additionally have to overcome the unconscious biases of the interviewers and exhibit behavior that makes them judge you as a trusted "cultural fit" tribe member.


Sigh. I completely crashed and burned at a phone screen once with a question that was easy but required knowing a bit-shifting trick. I knew that the trick existed because I had glanced at it while studying and thought, "no way they'll ask that."
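The commenter doesn't say which trick it was, but a classic example of the genre is the power-of-two test, which is trivial if you've seen it and nearly impossible to derive cold on a phone screen:

```python
def is_power_of_two(n: int) -> bool:
    # A power of two has exactly one bit set, so clearing the lowest
    # set bit via n & (n - 1) leaves zero.
    return n > 0 and (n & (n - 1)) == 0

print([x for x in range(1, 20) if is_power_of_two(x)])  # [1, 2, 4, 8, 16]
```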


> You're better off studying marketing and psychology for job interviews

Well, yeah. You should know how to sell yourself (to get past the resume screening process) and analyze the company's goals/culture/intent/desires (to avoid working at the next Uber).


Studying an algorithms book (like Cracking the Coding Interview or Programming Interviews Exposed) is also an effective way to "hack" the interview. :-P


Form over substance, as usual with large bureaucracies. Unfortunately, tech seems to view the biggest and most "successful" company as the model way to move forward, instead of throwing up the middle finger like they used to. It's sad to see the transformation of Silicon Valley to a pseudo-government style group of mega-conglomerates. The next batch of real innovators will not be from the Valley or be from Valley culture.


Yep. Used to be, the dream was to beat Bill Gates. Now the dream is to get bought by Facebook.


The dream was to make your own rules (which includes a ferrari, a private office and all sorts of perks). Getting bought by Google, Facebook, etc. is the new way of getting "fuck you" money.


I worked as a programmer for a promotional marketing company and their modus operandi was to copy everything they could from their biggest competitor.

Eventually, it even got so bad that we were instructed to copy, word for word, certain disclosures from the competitor's web site.

The big competitor is still the big competitor and my former employer is still doing OK but they'll never catch or pass anyone by copying what they're doing.


I have spoken about this to many SEs who are senior enough to conduct interviews at their companies. I ask them: what do you intend to get out of such an interview process? Their response: 1) evaluate how the candidate thinks about a problem, 2) evaluate their approach to the problem, 3) evaluate how they solve it by asking questions.

So, most of these interviews, an experienced Software Engineer dons the hat of a psychologist (without having any qualification/experience of a psychologist) and evaluates an interviewee.

How sad is that?

Whats worse? I know a few who prepare to interview a candidate by going through "how to crack the coding interview"!

I'm only glad that there are quite a few in my circle who believe in giving a candidate a problem they have faced at work and asking the candidate to solve it, using the same tools they use, with the laptop hooked to a projector. Interviews where I have done this as a candidate, I feel, are more respectful of the candidate.

Many of those who love the "let me be an unqualified psychologist today" interviewing style disagree with my opinion. So do I, with theirs.


Application performance matters, that's why these data structures and algorithm questions still matter. The issue is dogma surrounding the technicalities of data structures and algorithms.

We make sure in our group to focus on application of those concepts, not memorization or detailed knowledge of the concepts themselves. But yeah, fast algorithms matter. Memory usage matters. Proper data structures matter. I don't really care if you can spit out a syntactically correct implementation of a red-black tree on the spot, but you should know what a tree is and why you might pick a tree over a hash table.
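The tree-vs-hash-table question has a crisp answer worth knowing: a hash table gives O(1) membership tests but keeps keys in no useful order, while a tree keeps them sorted, making range queries cheap. A rough sketch (using a sorted list plus binary search as a stand-in for a tree, since Python's standard library has no balanced tree):

```python
import bisect

# A dict answers "is 42 present?" in O(1), but it can't efficiently answer
# "which keys fall between 40 and 60?" An ordered structure can, in
# O(log n + k) rather than a full O(n) scan.
keys = sorted([7, 42, 55, 13, 99, 61, 48])

def range_query(sorted_keys, lo, hi):
    left = bisect.bisect_left(sorted_keys, lo)
    right = bisect.bisect_right(sorted_keys, hi)
    return sorted_keys[left:right]

print(range_query(keys, 40, 60))  # [42, 48, 55]
```

Being able to articulate that trade-off is exactly the "application of the concept" the comment is asking for, with no red-black rebalancing code in sight.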

I'd rather hire a writer who can really get a concept across but relies on autocorrect for spelling than a writer who can spell the hell out of stuff but can't actually write. Dogmatic interviews weed out the software engineers with the big picture skills and performance thinking who might be fuzzy on the details, like experienced folks who are decades out from school.


Application performance matters, that's why these data structures and algorithm questions still matter.

THIS! It's one of my go-to stories, but back when I was working as a co-op during my undergrad days, there was a .NET programmer in my department who was skilled but had no understanding of the underlying data structures.

He would be mystified at how I could write Perl programs that would outperform his .NET programs, even on less robust hardware.


> Application performance matters, that's why these data structures and algorithm questions still matter

I've yet to have a problem where the particular data structure makes a huge difference in performance. I'm sure they exist, but I don't come across them much personally. Most of the time an inefficient list search would not be noticeably different from a dictionary lookup, and is sometimes faster. What I have come across quite often is people who quibble about data structures but have N+1 queries everywhere and don't know what a join is.

Another thing that's critically missing in the field is experience with a profiler, someone that knows how to use a profiler will know more about performance than anyone thinking about algorithms. In fact, it's an approach I'd like to try in future for vetting candidates.
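For readers unfamiliar with the N+1 pattern mentioned above, here is an illustrative sketch using sqlite3 with made-up tables: one query to fetch the parents, then one additional query per parent, versus a single join that fetches everything at once:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'ada'), (2, 'bob');
    INSERT INTO posts VALUES (1, 1, 'p1'), (2, 1, 'p2'), (3, 2, 'p3');
""")

# N+1 pattern: one query for the authors, then one more query per author.
n_plus_1 = []
for author_id, name in conn.execute("SELECT id, name FROM authors").fetchall():
    for (title,) in conn.execute(
            "SELECT title FROM posts WHERE author_id = ?", (author_id,)):
        n_plus_1.append((name, title))

# Single JOIN: the same data in one round trip.
joined = conn.execute("""
    SELECT a.name, p.title
    FROM authors a JOIN posts p ON p.author_id = a.id
    ORDER BY a.name, p.title
""").fetchall()

print(joined)  # [('ada', 'p1'), ('ada', 'p2'), ('bob', 'p3')]
```

With an in-memory database the difference is invisible; against a real database over a network, the per-row round trips are exactly the kind of hotspot a profiler makes obvious.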


And premature optimization (ignoring architectural concerns) is usually a mistake.

Profile, find hotspot, then use your clever cache-friendly hand-rolled tree algorithm when it makes sense.

I'd rather a programmer understand that than have a bunch of programming puzzles memorized.

And for seniors, architectural questions are far more critical. Fixing a hotspot may get you a multiple or maybe 10x. But architecture is what gets you orders of magnitude.


It's all become a game now. If you spend enough time on sites like codefights and hackerrank and know all the algorithms you can land a job. However, if you aren't prepared for these whiteboard interviews you are screwed.


This is by design, they're looking for students.


Students, super high achievers / superstars, and people who are willing to "play the game" and re-study everything for 2 weeks or a month.


LMAO. 2 weeks?

HA!

I'm currently a graduating student at one of those top schools. Do you really want to know how long many of my peers study for these interviews? Months!!!!! I have a few friends who spent all of last summer, after their daily internships, doing interview prep for fall recruiting season. I also have a cousin who spent 5-6 months unemployed (granted, he did a masters/BS in EE rather than CS, but wanted to move into the field) doing interview prep and just landed offers from Google and Dropbox. The competition is absolutely ruthless.


This is a good point. Every successful interview I've had was the result of about a month of study time.

I think the only way I could make it through an interview is by quitting my job and studying for a month!


Backdoor ageism


I worked at an unnamed big company where the interview process involved being interviewed by a PHP expert, a java expert, and a javascript expert. They were surprised when NO ONE ever passed the interviews. Most actual hiring at the company involved getting contractors from a recruiter and then hiring them if we liked them--no technical interviews, just how much we liked them personally and if they could perform at the most basic level. I would imagine this is pretty common.


That's called 'contract to hire' and it was fairly common about 5-10 years ago. I don't see it as much these days. I'm not sure why, but it seems to have fallen out of fashion.


You don't see it because it gives candidates a chance to see what a company's code base is like and look elsewhere when they discover how bad it is.


And that's exactly why I like contract to hire. The biggest factors that contribute to whether you like your job or hate your life are your boss, the people you work with, and the state of the code base. You don't get a good impression of any of them until after you've been hired.


I think that employers learned that they had difficulty keeping good people. If I'm working for you on a contract-to-hire basis but three to six months in I get a better offer, I'm gone. I have no retirement vesting to keep me locked in for 3-5 years.


For some folks, 10 yrs experience is truly 10 yrs experience; for a lot of folks, 10 yrs experience is 1 yr experience repeated 10x. Or worse, a few months' experience repeated even more. Unless you are truly learning each week and working with new tech often, the number of years doesn't tell me much. I've seen 2 yrs experience crush 20 yrs experience all around: in design, programming, OS knowledge, databases, etc.


I don't quite understand this point of view and the whining. Why is it that Software Engineers feel the need to whine about these technical interview questions? I don't think many of these are THAT technical either, just requires some practice. I dislike brain teasers, but I don't think Algorithms questions are out of line, even if you might not use them.

I graduated with an Aerospace Engineering degree. While I don't exactly remember how to derive the Normal Mode of a wing, I wouldn't have any issue relearning it for an interview. For my first job, I was asked questions that had very little to do with my day to day but I still answered them and didn't complain.


> Why is it that Software Engineers feel the need to whine about these technical interview questions?

Because they're pointless for the vast majority of positions: why make me learn something for the interview that I'll never use again? This offends many engineers' sense of laziness.

Because it's a pithy indication about what's wrong with tech interviews: we can't ask 'are you good at delivering with weird requirements' (because the candidate says 'sure I'm the best'), and we can't check their ability to do that in 45-60 minutes. Instead, we use these shibboleth questions to see if they are like other engineers (who can deliver etc.)

Because it's gatekeeping: if you didn't study algorithms or have the time/energy to keep it fresh, you can't get past these interviews and get these jobs. It doesn't matter if you'd excel if you don't have the same opportunities as the normal privileged young white male engineer.

Because it's inefficient: plenty of engineers could perform well in the position but get screened out by these imprecise questions. Sure, for a startup of 4 employees, you can't settle for great and have to keep rejecting until you get the best, but for a mature stable company, you want people with solid skills and abilities who can learn (how to do algorithms) on the job.


None of that is specific to software engineers.

My non software-engineering Interviews contained tons of questions that aren't completely relevant. For example, describe in detail how pressure and velocity are correlated in each turbofan stage and how they affect efficiency. Describe what isentropic flow is and how it relates to lift.

I also hate getting random whiteboarding questions, but I'm not going to complain about it. If I really want a job I'm going to study for it.


> If I really want a job I'm going to study for it.

Congratulations on having the resources and stability that these interviews select for! Unfortunately, these are not requirements for doing well at the job.


In general I think it's because forcing experienced, practicing people to learn things that are of little use day to day (and hence forgotten) is onerous to the interviewee and a false signal for the interviewer. It's akin to learning a secret handshake.


I call bullshit on this train of thought.

I've been reading about this on HN and reddit for years now. This last December, I was promoted high enough, that I could design the entire interview process (with executive help) for my teams. Which was important, I very much need to hire some people.

So, keeping posts like this in mind, I designed as follows:

Phone Interview - We talk about you, the company, etc. Then a few technical questions. Do you know certain things about js that make me think you are genuinely familiar with the language (as opposed to just jQuery)? Do you know certain things about sql that make me think you are genuinely familiar with the technology (as opposed to just using a table designer)?

Assessment Test - Do a quick CRUD app that will require some thought behind how to query the data. Make a point when talking to the prospective applicant that the code will be read by the team members they'll be working with, so while you can do the test quickly, please spend enough time on it that we can talk about the code, your choices, etc. going into the technical interview. I.e., don't just link together 3rd-party plugins.

Actual Interview - Interview with leads, interview with actual team members. Offer both use of Visual Studio / SSMS / IDE of choice or whiteboard. Ask questions about technology in general. Ask the applicant to explain how he'd design out a new feature. Change the feature spec, so that there's a little bit of a hard problem. Have to populate a tree from flat sql data, or aggregate data from a service, or some other not-quite-an-algorithm question.

The RESULTS?

Recruiters figure out what questions you are most likely to ask during the phone interview, and school the candidates.

The assessment test? No one bothers, everyone just hands in the simplest 3rd party libraries plumbed together with shitty code. Was the question select top per group? How about top 1 *? The really good devs shine through here, but a huge percentage don't even bother.

Actual Interview? No one uses IDE / SSMS. All of them choose whiteboard. The ones who genuinely know how to code, have no problem answering, and would be fine answering algorithm questions straight out. The ones who can't, get confused somewhere or other. They try to implement hacks around the problem (I'd have the dba do it), then fail to actually solve the question.

So In Short? I completely understand where this guy is coming from. But I also think, that at this point, I'm so tired of interviewing people who clearly don't care about their craft, that I want to start asking nothing other than algorithm questions... because the guy or gal that actually answers them will actually care about their craft. And at least with algo questions, I can change the question easily, and frequently. Coming up with real code problems, and good probing questions is much harder, particularly if they're going to have a short shelf-life before recruiters start training their candidates with answers.
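The "populate a tree from flat SQL data" exercise mentioned above is representative of these not-quite-algorithm questions. A minimal sketch of one common approach (the row shape and field names here are hypothetical, chosen for illustration):

```python
# Hypothetical rows, as they might come back from SQL: (id, parent_id, name),
# with parent_id NULL/None marking a root.
rows = [
    (1, None, "root"),
    (2, 1, "a"),
    (3, 1, "b"),
    (4, 2, "a1"),
]

def build_tree(rows):
    # First pass: create one node per row.
    nodes = {rid: {"name": name, "children": []} for rid, _, name in rows}
    # Second pass: attach each node to its parent (or collect it as a root).
    roots = []
    for rid, parent_id, _ in rows:
        if parent_id is None:
            roots.append(nodes[rid])
        else:
            nodes[parent_id]["children"].append(nodes[rid])
    return roots

tree = build_tree(rows)
print(tree[0]["name"], [c["name"] for c in tree[0]["children"]])
```

The two-pass structure means it works even when child rows arrive before their parents, which is usually the first follow-up question.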


Algorithm questions come in two flavors:

sensible: given the problem statement, discuss your initial choice of algorithms and data structures. This probably covers 80% of tech jobs out there. And you can easily dig deeper with big-O and related topics here. You should have a very good idea whether the candidate is clueless after this.

other: given a datastructure and algorithm, implement it. This makes sense for a minority subset of jobs/devs.


> given a datastructure and algorithm, implement it

This can be entirely appropriate for most jobs/devs, depending on the datastructure/algorithm. One of the guys I work with sometimes asks people to implement mergesort. I was suspicious of this question at first but it's been quite good at weeding out mediocre candidates. Mediocre candidates will write broken code all over the board, and more telling, they won't understand why it's broken even when you point it out. Good candidates will generally make mistakes (because no one has mergesort code memorized and coding on a whiteboard is frankly tough), but they'll recognize them, typically on their own and always with pointers, and they'll be able to explain the issue and how to fix it once they see it.

There's no trick to this question. There's no riddle, no "gotcha", and it's not a hard algorithm to understand or code. Just show that you can write minimally complex code and walk through it. It's like fizzbuzz but you can't memorize it.
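For reference, the whole exercise fits comfortably on a whiteboard. A straightforward top-down version (one of several reasonable ways to write it, not necessarily what this interviewer expects):

```python
def mergesort(xs):
    """Return a sorted copy of xs using recursive merge sort."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    # Merge the two sorted halves, always taking the smaller head element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the halves is exhausted; append whatever remains of the other.
    return merged + left[i:] + right[j:]

print(mergesort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The typical whiteboard mistakes (off-by-one in the split, forgetting the leftover tail, losing stability by using < instead of <=) are exactly the kind that good candidates catch when walking through an example.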


How do you conclude that the guy or gal who knows interview riddles and "hide the ball" questions will somehow care more about their craft than someone who does a great job on an assessment test?

It sounds to me like you just had a bad crop of people, most of whom wouldn't have passed an "algorithms" style interview either.


People who do well on the assessment tests almost always do well overall. It's very rare that someone does well on an assessment test and then fails an interview. I don't think I've had it happen yet.

Yes, that's most certainly an issue.


Unless your assessment was more specific, I would not know what you are looking for; I'd pick up some libs, plumb them together, and get ready to defend my choices. I would not be willing to spend more than a couple of hours on it, since I won't be able to use it anywhere else (unlike general knowledge like algorithms or SQL).

The rest is understandable. Whiteboard, cause you want design and it will be faster to point, talk, erase and change. Unless you do some deep tech, people who can do general algorithms will be able to adapt fairly quickly.


I fully agree with you on this. If you ask someone to build a trivial CRUD app as part of their interview process, you should probably expect them to hack something together to get the job done. In fact, you should expect that on the job. If you don't have complex needs or high scalability requirements, gluing a bunch of plugins together seems like the appropriate approach.

Who wants to hire an architecture astronaut who hears "design a new ORM system" when someone says "build an app to update customer addresses"?


Perhaps it's my personal bias showing. When I've done assessments in the past, I've written more than the project required, then left notes in the code saying: "This likely isn't necessary, but this is a chance for me to show you my ability to code", or conversely: "This would be a good spot for X, but that seems outside the scope of this exam".


My personal bias is against take-home exams because my experience is that they claim they take "an hour or two" but they're actually looking for a day and a half of solid work to build a system from scratch, test, and document it.


The main issue is time commitment. I once spent 5-6 hours learning and coding an algorithmic maze thing in ES6 for a lead Ruby dev job because they wanted to know how my JS was. I ended up doing a day of interviewing, mainly about design, JS, and React. I did well because even though I know backend best, I know frontend well too. In the end I didn't get the job because they wanted someone frontend-only, not backend... for a Ruby job.


Yeah, it's a mixed bag. I could make the test harder, so there's more you have to code, but I'm afraid I'd then filter out the more desirable candidates, who have more than one test to do.


I think the main issue, as a candidate, is that it's hard to tell the difference between companies that are serious and those that aren't.

Like this one: only one person reviewed my code, and he didn't know JS. I had to re-prove that I know JS at each interview because my resume shows more backend experience. I didn't care personally, because I need very little to have fun and I have money plus some passive revenue. However, I can see why people are exhausted by this process.


Sounds to me like you should just focus on the assessment test and skip the rest. After all the point is to filter out the bad ones as early as possible and focus all your time and attention on deciding whether a good developer will fit with your team, your platform, your project. So after a quick phone screen, give them the assessment test, and only continue if they shine on that.


I am just beginning my studies in computer science. I find discrete mathematics, data structures, theory of computation and algorithms to be the most interesting aspects of computer science. If I took an undergraduate algorithms course (Chapters 1-3, 7-9, 11-13, 15-16, 22-24 from CLRS) would I be prepared for most interviews?

I noticed the graduate course overlaps with many of those chapters and also covers chapters 18, 21, 25, 26, 29-32, and 35.

Ideally, I would like to take both courses out of pure self-interest.


As others have suggested, get a copy of "Cracking the Coding Interview" and review the questions therein. If you can answer those questions easily, you'll have an easy time at the average interview.

If you want to prepare to interview at one of the top companies, then afterwards procure a copy of my secret weapon: Aziz, Lee, & Prakash's "Elements of Programming Interviews". The questions there are an order of magnitude more difficult. If you are able to apply the CS concepts you are learning to questions of that difficulty, all doors will be open to you.


Thanks for taking the time to answer my question. This is my first time commenting on HN and I am being downvoted for my legitimate question.


Just a helpful tip: mentioning being downvoted is against the HN guidelines and usually attracts more downvotes.

pkahler's sibling post also brings up an important side point: beyond the basic core, what you study really needs to be focused on what aspect of the software industry you want to be in or are likely to find yourself in. While I, too, encourage you to take C/asm courses or whatever your preferred focus is, the bulk of the jobs in the industry are in creating web sites and business systems (much of which are so-called CRUD apps; see https://en.wikipedia.org/wiki/Create,_read,_update_and_delet...). Even if you plan to do something else, a course in databases and concurrent programming will stand you in good stead as a fallback if your career doesn't quite go as planned.


>> If I took an undergraduate algorithms course would I be prepared for most interviews?

I can't comment on "most interviews" but that would not qualify you to work for me - or in my industry. I want someone who does embedded C, understands hard real time control, can write code doing fixed-point math in a way that can be understood later, writes safe code, state machines, signal processing, fault management, LIN or CAN networking, and some other stuff. Knowing the function "malloc" or any algorithms that may use it is not necessary as it is explicitly forbidden.

Just sayin'


In addition to taking an Algorithms course I would take a course that focuses on C and ASM. Parent comment does not imply I would only take an Algorithms course.


Hiring at most companies is fundamentally broken.

We have no way of knowing if someone will work out, so finding ways to rule people out makes it seem like we did our job in the hiring process. With some Wonderlic-type questions, the person asking them gets to say, "Well, look, we have some quantifiable way of measuring applicants." Total BS.

No cure for it, other than to not turn around and be that asshole once you are hired. We should be more open about what the criteria are; hiring comes down to a few gut checks: do you like the person and think they will fit in, do you feel confident that they can do the work you will ask them to do, and are they in your budget?

Unless you work with a person over time, there's not going to be any way of knowing how they are as an employee. If they are smart, if they keep up on current trends, if they are a good communicator (and any good at communicating inside of your organization), if they care, if they show up on time, if they are conscientious or just pass the buck, if they are a hard worker, if they are opportunistic / selfish with praise...

Anyway, HR needs to justify itself, so we can't just throw darts at resumes... but essentially that's all hiring is. Trying to get a sense of someone quickly, cheaply, and correctly? Nah, just pick 2... at most.


Well, you have to understand, the whole point of an interview is to construct questions that you, or at least a subset of the candidates, can NOT answer. When there are too many qualified candidates, you've got to weed them out somehow.


I must admit I can't really relate. What about somebody who cannot FizzBuzz, but claims they can code? Most whiteboard interviews I've had were not much harder than FizzBuzz. I was asked to implement Quicksort once, but they gave me the specification, some time and a sheet of paper.

Why would somebody for whom it is easy consider hiring somebody who cannot do a seemingly simple thing?

I can understand if somebody has anxiety issues. Then maybe you can negotiate something else, a homework project or coding on site for 30 minutes. But odds are you'll discuss stuff with your colleagues on a whiteboard, too.


I reject all that too. The recruitment process is broken, and it's not even the riddles; it's the stupidity of some of the tests at the "less bright" companies.

But every problem is an opportunity :). Now most of my freelance work comes from companies who "had a problem and couldn't find anyone willing to solve it" and me asking "can I give it a shot?"

Showing that you want to solve problems has so far "worked for me" much better than playing the guinea pig with riddles.


Same here, I've stopped doing tech interviews, they are a waste of time. Hopefully my freelancing network will continue to grow and I can avoid them forever.


Freelancers don't do tech interviews?


When I did freelance work, it was all about connections and reputation. "Hey we got some problems with our overnight processing, you've done that before" type stuff.


They do some, but I find it more of a formality once you build a track record. The risk is lower than a full-time employee.


Not often; I think I only had one in the time I was freelancing. Networking is everything.


I think tech interviews are broken. I've been on both ends of the equation and the process usually seems both unfair and random.

I work hard, get good reviews, have nice looking work that doesn't crash and scales - and I can't interview either.

I've also interviewed great candidates who got shot down for no reason at all.

That said, it is a numbers game; if you need a job, keep at it - you have no other choice.

But in the long term I think our community needs to chill out a little bit.


The prevalent thinking is that algorithmic skills transfer very well to other problem-solving skills. This is especially the case in Google, Microsoft, Facebook and Amazon who ask a lot of these questions. I do not necessarily agree with it, but it is the way things are.

It's just another skill that can be acquired with proper effort and concentration. I can recommend "Cracking the Coding Interview"; it is chock full of these problems.


"I oppose hiring practices that would exclude me." Really? How profound.


Modern interviews are metrics, and every smart system (human or AI) will exploit them. Metrics are always exploited. The job interview is just another profession that requires a lot of training and preparation.


I have read these threads for a while now and have concluded that people are going to complain no matter what companies do.

I think that while neither whiteboarding nor algorithms nor random questions about technology are perfect, they are still better than vague talk about how passionate one is, or variants of the beer test. The issue with talking about good code is similar: it matters whether you know the principles, but it is too easy to fake by people who have more talking and bluffing skills than anything else.

Personally, I like it best when they tell me what they will ask for, so I can prepare myself - whatever that is. That seems the most fair to me - a good programmer does not have everything fresh in mind, but should be able to refresh or learn something new reasonably fast.


I have never, and would never subject myself to the types of sadistic games masquerading as "hiring" that you see at a lot of software companies.


How do you find work?


Most jobs in this world do not require a laborious interview process with timed puzzles and riddles.

Bear in mind that the companies that do put people through that kind of thing do it because they can. When you have people lined up down the street to work for you, you can be very selective in who you hire and put them through all kinds of skill tests before hiring, to triple check that they have the requisite skills. Many (most?) companies do not have that luxury.


> Most jobs in this world

You're not US based then? Because I can say with some certainty that most advertised jobs in the US absolutely do. Granted, most open jobs in the US aren't advertised, but I haven't found a way to reliably find jobs that are both open and unadvertised.


Sorry, I was referring to literally most jobs, not just the ones in our industry. Electricians, plumbers, writers, accountants, etc. are not typically solving timed riddles and jumping through silly hoops to get their jobs. There will be technical questions, yes, but for the most part their experience is presumed and the employer assumes some risk by hiring them and having them go through a probationary period.


I'm truly sick of this rubbish that has been brewing lately. If you can't do interview questions, your ability is questionable. The bar has been lowered enough with the push of the "everyone can code and should code" mentality. If we want quality, we must raise the bar. Everyone can code, and everyone should be able to code if they want to, but not everyone can qualify as a professional software programmer. If the company has a standardized interview process and every other candidate gets asked the same set of questions, then it's fair. If you can't answer or whiteboard it and other candidates could, don't cry! Those candidates obviously know more than you about what the company is looking for. Take the case of Google or Amazon: most of us can agree that they have very smart folks working for them. If the filter these smart folks had to go through was puzzles, math problems, algorithms and whiteboarding, why should anyone get an exception to join them?

Interviews are broken in most environments, and that doesn't mean we should do away with them or make them easier. Rather, we should fix them, make them better, and have them test what the candidate will be doing on the job. I see people say, "you don't need to understand algorithm complexity to build a website, or to know C or assembly." Sure! But you don't know what you don't know! If you know and understand algorithm complexity, you will use it even when building websites; you will think about scale. If you understand lower-level languages and you are using PHP, you might understand how your code turns into C and how choices at a higher level translate to the lower level. If you never studied computer architecture, you wouldn't understand the orders of magnitude involved in fetching something from a register, processor cache, RAM, drive, or over the network. Yet such things do rear their ugly heads when developing these so-called simple CRUD apps.

So if you are interviewing for a PHP programming position and you get asked about algorithm complexity or C or TCP/IP or other such questions, don't frown; those are important questions, and if you don't know it, you don't know it. It doesn't mean you are a "bad programmer", but it does reveal that you are a programmer without a wide breadth of knowledge. A one-trick pony, for the most part. Get over it and widen your knowledge.

The other day I was using an external API that was crapping out. After hitting a dead end, I informed the vendor and they said nothing was wrong. I did a system call trace and noticed a lot of errors being returned by poll(), then fired up tcpdump and noticed an abrupt and random FIN ending the connection. With this I was able to convince the vendor to test their API from outside their network, and once they did, they agreed that the issue was on their end and fixed it. This was a basic PHP app. Without a wide breadth of knowledge, I would have been stuck; no Google or Stack Overflow would have solved this.

Great developers have knowledge that is wide and deep. If you fail an interview, don't get mad; be glad that you found a weakness, and brush up your skills.


I'm kind-of on the fence about this topic.

On the one hand, algorithms are a broad subject, so there's a large hole for false positives and false negatives to slip through. Sometimes it's like they'd pass on Gerry Spence as a trial attorney because he flubbed their question on tax law.

On the other hand, you can't totally drop the brain-teasers, because you'll need all the IQ points you can get before the corporate retardation sets in.


Kind of off topic, but I love relating programming to law. Both are enormous fields where no one has mastered every subfield, but the general populace (non-tech/legal) fails to grasp that. The interview process for engineers seems to have failed, to some extent, to understand this as well.


Definitely. I've had some algorithm-y interviews where I solved it in an original way, and they were ready to marry me off to their prettiest daughter. Then I've had others where I didn't even know what the hell they were asking, all requests for clarification were met with repetition of the original question, and both of us were probably wondering: "Is this guy stupid? an asshole? both?"


Does ability to solve brain-teasers correlate with a drop in "corporate retardation"?


Everyone gets hit by the CR, so it probably doesn't hurt to start with a higher baseline.

