“But didn't you write an embedded OS?” (tnhh.net)
255 points by jimmies on May 13, 2018 | 166 comments



I hate the Cracking the Coding Interview/Leetcode style... studying for these types of interviews is annoying. Trying to find a good video on YouTube where they aren't just naively coding up the bruteForce->optimal solutions is especially irritating. It is literally a landscape of college kids with thousands of viewers who treat these interviews like a standardized test (SAT, GMAT). Even the author of the book produces videos with very little insight or meaningful content.

"Find all the subsets in a set that add up to a sum" -- "Okay, for this we will use the sliding window technique and here is how it is done" -- WTF is this. I get that they want to see problem-solving skills, but this is on a different level, requiring the interviewee to have studied and memorized the technique; otherwise we are basically trying to develop efficient algorithms from scratch in very little time. This makes sense for college interviewees who have spent the past four years studying, but for a professional with experience, why is this adequate??

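For context, the kind of memorized pattern being demanded here -- a sliding window only handles the contiguous-subarray variant; for actual subsets the textbook answer is backtracking -- looks something like this rough Python sketch (assuming integer inputs):

  def subsets_with_sum(nums, target):
      """Return every subset of nums whose elements sum to target."""
      results = []

      def backtrack(start, chosen, remaining):
          if remaining == 0:
              results.append(list(chosen))   # record a copy of the current subset
          for i in range(start, len(nums)):
              chosen.append(nums[i])         # choose nums[i]
              backtrack(i + 1, chosen, remaining - nums[i])
              chosen.pop()                   # un-choose it and try the next element

      backtrack(0, [], target)
      return results

  # subsets_with_sum([2, 3, 5, 7], 10) -> [[2, 3, 5], [3, 7]]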
Does algorithmic programming matter? Still yes. But the way it is interviewed is absurd and inadequate. I had a production service centered around the stable-roommates problem. It took me a week or two (mostly research) to develop something and fit it into our codebase. It then took 1-3 more weeks to actually make it work for us and cover edge cases (e.g. Irving's algorithm just quits when there's no stable matching -- that isn't an option in the real world). I read a lot of material on the subject and others' code, had many deep-thinking sessions where I was mostly in my head, wrote unreadable scratch on paper, collab-whiteboarded (sometimes arguing), tested and failed PoCs, and had many breaks in between it all. How successful was that project? Very. Did I need to know and study techniques with little meaningful basis to do it? No.


Recall when you first learned how to program. Something as trivial as iterating through a list requires some thought. What is the syntax for a "for" loop? Do you start with i = 0 or i = 1? Should you end at i < n or i <= n? At some point you stop thinking about it and write "for (int i = 0; i < n; i++)" instinctively. You can now solve harder problems that require iterating through a list without thinking about it.

All of algorithms and programming is like this. You unlock harder problems as you learn more problem solving building blocks (whether it's an algorithm, a software design pattern, or some API call). A programmer is someone who can learn these on the fly for any problem of any ___domain. An effective programmer is someone who has a large cache preloaded with these building blocks already.

In that sense I find leetcode style problems to be very fair. They are meant to be solvable in under an hour without thinking once cached into muscle memory. All it is testing is whether you're capable of becoming an effective programmer in some agnostic ___domain. All you need to do is warm your cache with a small number of standard patterns (which might even be useful for real work). It does suck that even the good programmers need a few weeks to warm their cache. But it weeds out the fakers who can't do it given any amount of time.


> In that sense I find leetcode style problems to be very fair. [...] It does suck that even the good programmers need a few weeks to warm their cache. But it weeds out the fakers who can't do it given any amount of time.

It also weeds out people who have better things to do than cram two weeks for your pretend-meritocratic little exam.

How about requiring that candidates comment their code using quotes from Classical Chinese poetry? They are proven timeless classics that an intelligent person can apply to any situation. This test would weed out the fakers who can't refresh their caches while also honoring an ancient tradition of stupid job interviews, the Chinese imperial examination.


Algorithms and coding are a lot more relevant than Chinese poetry.

Yes, you might need to prep two weeks in order to get a job that pays very well for 2-15 years. And since this test is so common, you don't have to dedicate 2 weeks to just one potential employer. If you can't freshen up on standard basic algorithms and coding in 2 weeks for an interview, how can you do it for whatever complex problems you'll face at work?

You can't hire someone solely based on what they claim they built.

I see a lot of people complaining that they can't take their senior engineer rank from company A and jump into company B at the same level without even justifying themselves, let alone working their way back up. To me that's utterly disrespectful to the ___domain-specific knowledge and experience that is needed to be a good senior engineer. Someone thinks they are obviously good enough to be a senior employee somewhere, but for some reason isn't good enough to build something valuable on their own, and isn't able to demonstrate skills in a face-to-face meeting.


> It also weeds out people who have better things to do than cram two weeks for your pretend-meritocratic little exam.

All interviewing techniques have to make precision-recall style trade-offs. The mere fact that an interview method has false negatives surely doesn't disqualify it. It has to be compared against the available alternatives. What are the alternatives?

- White boarding? Algorithmic knowledge is often tangential to the actual job.

- Take home assignments/mini projects? High relevance to job, but in my experience takes the most time for the candidate.

- Trial period? Most people can't just drop everything they are doing to come hang out at your company.

- Conversational interview. Like white-boarding, tangential to actual job. My experience on the interviewing side is that it is often hard to learn much about the candidate.

- Read their code on github / blog. Lots of candidates don't have the time or inclination to code outside of work.

- Something else?

So what's your preference? I've done them all and find them all to be lacking in different ways.

> How about requiring that candidates comment their code using quotes from Classical Chinese poetry? They are proven timeless classics that an intelligent person can apply to any situation. This test would weed out the fakers who can't refresh their caches while also honoring an ancient tradition of stupid job interviews, the Chinese imperial examination.

This seems like the fallacy of grey to me [1]. When hiring, for example, a web developer, yes, algorithmic knowledge is a somewhat arbitrary indicator to use, but it is not completely arbitrary. Not all things are equally unlike.

If I were hiring for a basketball team, and had to choose between two candidates neither of whom had experience playing basketball and were alike in all ways except that one was an avid soccer player and one was equally fervent about pottery, I would choose the soccer player. The logic of course being basketball and soccer have more in common (athleticism at the least) than basketball and pottery.

Likewise, algorithmic thinking shares some common points with almost any kind of engineering task.

Interviewing is just a hard problem where you are trying to predict future performance based on a few hours' worth of data. I don't think most of the popular techniques we have are obviously stupid. Companies have strong incentives to make hiring efficient, but there just isn't a lot of low-hanging fruit. Of course there are the occasional ego-maniac interviewers, but an ego-maniac is going to be able to ruin any type of interview. Let's not throw out the baby with the bathwater.

1. https://www.lesswrong.com/posts/dLJv2CoRCgeC2mPgj/the-fallac...


> If I were hiring for a basketball team, and had to choose between two candidates neither of whom had experience playing basketball and were alike in all ways except that one was an avid soccer player and one was equally fervent about pottery, I would choose the soccer player.

That’s not at all what happens in programming interviews that use algorithmic puzzles.

You have candidates who already have a professional track record in basketball, and instead of focusing on that profile and whether it’s a good fit for your team, you give them a timed soccer workout because it’s somehow a more objective measure of athletic ability.

Any basketball team that hires like that wouldn’t survive for long. The quiz interview format in the tech industry is a form of “anti-Moneyball”. It works for the SV giants because they have an enormous supply of candidates and they need generic competence that can be shuffled around. Smaller companies would do much better to hire for the actual role, not for “Cracking the Coding Interview” memorization performance.


The basketball was just a metaphor to explain relative similarity. Obviously hiring for an actual basketball team is a very different set of challenges, as they literally have hundreds of hours of videotaped performance history to evaluate before the candidate even walks in the door, and at lower levels, asking candidates to spend several days “trying out” is not considered onerous. But I would still argue that performance on programming puzzles much more closely correlates to programming job performance than knowledge of Chinese poetry.

As for hiring people based on their experience profile, it’s great of course in the case of candidates with lots of open source contributions and such, but this has the issue of ignoring the majority of candidates who don’t contribute to open source. Should being an open source contributor be a hard requirement?

But if you are suggesting that a resume with the words “5 years experience web development at company x” means anything, I’m a little incredulous. I worked with people that claimed to have far more experience than that and struggled horribly with even the most basic tasks.

Finally, a little tangential, but memorization gets a lot of flak for being a “stupid” skill. My experience is that it is nearly impossible for adults to memorize something like Chinese poetry “by rote.” Indeed, if you try memorizing some poetry I think you’ll find that it really is a very fulfilling and creative process.


Here's what I do:

1. Filter candidates based on fairly simplistic early models for personality profile, motivational bias, and metacognitive disposition cues.

2. On a subset, refine the above filtering further w/ another 1 or 2 interactions, looking for inconsistencies and/or stressing facets of the model that seem contraindicating or hard to suss out.

3. Prep the team on what & how to assess and bring a small number of candidates on-site for a few hours, to work directly with the members of the team they'd be joining, and have them work together on exactly the work they'd be doing.

So I put in a lot of work ahead of time as a hiring manager to understand what kind of role I need to hire for, what kind of person would likely be successful in that role, and what kind of person would likely be successful working with the team that exists (or will be built). Then I completely avoid some contrived pile of quizzes and weak competence signals by instead directly using an actual work environment w/ the same people, the same meetings (stand-up, design review, etc.), and the same tasks that both we & they would be cooperating on together.


Seems like a nice approach. I've been through similar setups and I think there are still tradeoffs though. Step #3 is tricky as you need a task complex enough for the candidate to show their skills, but it has to fit within a few hours. Unrelated to interviewing, I routinely under or overestimate task size, so carving out just the right task can be difficult. New features and bugs often have unknown unknowns.

The "contrived" puzzles approach has the advantage that each candidate can be given (and thus evaluated on) the same task. The size and prerequisite knowledge for the task can be well controlled, and since the problem is not new to the interviewers, we know how to present it in an easily understandable way and help them if they get stuck.

I think another reason why the "general cognitive ability" approaches are popular is because employees (especially at small companies) need to be good at such a wide range of tasks that it is not realistic to evaluate even a fraction of them in the span of a few hours.


I don't have any expectation they complete the task. That's not really the point, so time-bounding it isn't something I worry about. The point is that they're able to engage and contribute in some way: insightful feedback, mentoring, reintegrative learning, productive collaboration, etc. depending on the shape of role that's being hired for.

The tasks can be writing docs, writing requirements, writing property-based tests, writing CLI tools for developer ergonomics, formally modelling and verifying a scheduler, designing a DSL for safety-constraints or characterizing the electrical interfaces of a car's steering system. It's not entirely material what the tasks are. There will be opportunities in any of those to get good indicators of the important factors.

FWIW, the purported consistency of evaluating people using the same contrived task is fairly unlikely to be actually consistent, and even if it were, the value of what you can actually meaningfully derive from it is still deeply questionable all the same. As a result the reality is more likely that those situations are producing negative net benefit.


Ideally tests should not require preparation; that's where the issue is.


Why? Building software requires preparation.


> which might even be useful for real work

In my experience, this is where the house of cards collapses. For most programming jobs, none of that will ever be useful. And on the rare occasion that it is, you'll have gotten by without needing it for so long that you'll have to look it up and more or less learn it again from scratch anyway.

I find these sorts of things to give far too little useful information during a developer interview.


It's an aggressive filter. We sometimes give a post-interview take-home programming problem that involves no trickery or algorithm discovery/invention. It's just to verify they can code and to see their style. You should be able to determine whether a person can code from the resume and interview alone, but sometimes I can't tell until the exercise. People suck at interviews, so they fall back on hard problems. I suppose that works for big companies with lots of applicants.


> A programmer is someone who can learn these on the fly for any problem of any ___domain

Do you really believe in this? I would maybe call it an Ex Machina droid...


>treat these interviews like the SAT

If you have an interview with complicated, tangled questions with too many things to handle, or questions that require knowledge of a special algorithm, then either you fail because you haven't come across it, or you pass because you know the answer and pretend that you don't. Then those interviews literally become the SAT. I think it is a good strategy when it comes to giant companies (applicants' demand for jobs >> supply of positions). Just as a high SAT score is relevant when it comes to the top schools. It signals to employers that either you're a genius who has done a lot, or you want the position badly enough to go through the pain of studying and memorizing them. Either is a good thing to have.

I think for smaller companies, posing those crazy hard questions and expecting the best answer is a bad idea.

I have seen companies that have executed the tech interview very well, though. The questions don't require specific knowledge of an algorithm. The interviewer gave hints when the interviewee needed help. I got asked questions that I had never come across but managed to come up with a great answer by the end of the session. That's a sign of companies that know what they are doing.

After interviewing with several companies of different sizes, with a mixture of good/bad interviewing processes, I realized the YouTube tutorials and the books weren't wrong. But not everyone has a bad interviewing process. And you probably shouldn't expect a non-tricky one from a giant tech company that everyone wants to get into.

It's like dating the hottest girl in high school -- you know the name of the game. But for a real happy relationship, maybe you don't need to chase the hottest girl. Maybe you should look for smaller companies that like you as much as you like them.


We can do without the casual sexualization of professional career hunting.


I didn't mean to be sexualizing the example. Notes are taken.


>I think it is a good strategy when it comes to giant companies (applicant's job demand>>supply).

Is it? I can't help but feel that if, for instance, Google had used a sensible interview process and hired Max Howell, they might have gotten their act together on golang's package management a little earlier.


I started interviewing people this spring, and since I hate algorithmic questions, I didn't look up any on the internet; I just wrote my own questions based on some data manipulation I've done in real work. They are basically data-normalization questions (given a 2d array of data from the db, populate this class). But I found these to be a very good filter. You only need to be comfortable with the Java Map and List interfaces, which are the bread and butter of a Java dev in my experience. Yet still 75% of people couldn't solve these questions. However, the other 25% nailed it and we made a couple of good hires.
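To give a rough sense of the shape of those questions -- the real ones were Java with Map/List; this is just a hypothetical Python analogue, not the actual question:

  # Hypothetical rows as they might come back from a DB query,
  # one row per (user, permission) pair.
  rows = [
      ("alice", "alice@example.com", "read"),
      ("alice", "alice@example.com", "write"),
      ("bob",   "bob@example.com",   "read"),
  ]

  def normalize(rows):
      """Collapse the flat result set into one record per user."""
      users = {}  # name -> record
      for name, email, permission in rows:
          record = users.setdefault(name, {"name": name, "email": email, "permissions": []})
          record["permissions"].append(permission)
      return list(users.values())

  # normalize(rows) -> [{'name': 'alice', ..., 'permissions': ['read', 'write']},
  #                     {'name': 'bob',   ..., 'permissions': ['read']}]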

All this to say, if interviewers actually spend half a day and write practical questions that test the skills used in real work, it IS possible for both the interviewer and interviewee to be happy.

Also for the in-person, I just delete an existing integration-test we have and give the candidate our laptop and we pair program on rewriting it. This has worked well.

No memorization or trivia or tricks, just standard development exercises.


Algorithm interviews underestimate how long it takes to flesh out good ideas to hard problems.



Felt like I could get better light & debate on my opinions in this thread. Wasn't sure about HN etiquette.


> And I see the same trend in Machine learning, Big data, AI, Cryptocurrency, Internet of Things, Social networks, Decentralized networks, etc. Everything in life, however trivial or hard, is a nail that can be solved with those fancy hammers. So we can have a tablet juxtaposed on a fridge, a bluetooth speaker integrated to a salt shaker, a subscription model for juicers, and machine learning for all kinds of social problems.

I'm frustrated by this as well. I see these days, especially, junior engineers excited about machine learning analyzing data sets that could be tackled easily by learning the problem space and using Excel or a Jupyter notebook. 90% of the work in an analysis problem is usually identifying data sources and possibly curating the records into a workable data set. So it really doesn't help much to apply ML off the bat. If normal regression and human inspection don't help, maybe then you can iterate on the problem and try some ML approaches.

So why are junior engineers trained to think this way? Are their professors and guest lecturers excited about these approaches and teaching the students about their favorite hammers?


It's anecdata, but when I was coming up through my Computer Engineering program a few years ago there were roughly three categories of students:

1. The Traditionalists: Students who are fascinated by the technology and what it can do at its limits. These people are disproportionately drawn towards the more arcane/unsexy applications (RF, hardware, high-power, radars, antenna design, IC design, embedded systems, robots, rockets, quantum computing, etc) and research.

2. The New Wave: Students who were inspired by a Steve Jobs keynote; these people were almost exclusively interested in potentially mass-market, user-facing applications and have a greater emphasis on image (both personal and professional).

3. The Unmotivated: People who are there "because it pays well" and for no other reason. Not many of them made it to graduation, and the ones that did were "meh" engineers at best.

Most of the naive junior engineers you're seeing probably belong to the modern incarnation of category 2. Steve Jobs has been dead long enough for new grads to have been inspired by the next generation of buzzwords. They're fine engineers in general, but they have a sometimes unhealthy fixation on said buzzwords and how "future" they are, because that's what got them into engineering. Most will concede when a simpler solution is presented, but it's uncomfortable for them. Unsexy but effective techniques undermine their aesthetic impulses and, to a degree, their self-image as engineers.

A good popular example is Elon Musk refusing to use LIDAR on Teslas, and introducing way too much automation, too fast, at Tesla's factory. People, even junior engineers who think they're part of a "ground-breaking" movement, often have trouble separating "obsolete" from "tried-and-true".


> 3. The Unmotivated: People who are there "because it pays well" and for no other reason.

I don't see anything wrong with that. I started programming in the 80s in the 6th grade because I was the stereotypical "short fat kid with a computer" (I got better), but after a few years in the industry, the only reason I code is for the money. The only reason I keep learning is to stay employable. It's not for a higher calling, it's not to change the world, and I don't enjoy doing it enough to learn any technology that isn't in demand in my local market.


I love to program just for fun, but as I've gotten older other interests have to be balanced against that, and I have to stay current with the technology, so while I still program for fun I direct it in areas I know will be more useful to me at work.

When I was in my 20s it was very different from my late 30s; I had far fewer responsibilities. So I think part of increased responsibility is knowing which problems to tackle.


I don't really enjoy programming as much as I used to. I enjoy "building systems". I've enjoyed building software departments, building network infrastructure in AWS, building autoscale systems, and finding ways to avoid the drudgery of building stuff by leveraging pre-existing building blocks that allow me to avoid the "undifferentiated heavy lifting". I take a perverse joy in being able to rip out code that I wrote after finding a package on the internet that does the same thing.

I'm not saying I don't enjoy development - but as a part of building the system.

If I can find a hosted solution for a piece of infrastructure, I jump on it. I'm all in on "serverless".

I would fail any algorithm type interview these days. I'm just not interested in algorithms. I'm much more interested in leveraging libraries/frameworks/infrastructure providers to get things done.


Ah, never mind my initial response then. You're just more a systems engineer than a pure software engineer. That's a completely different animal than just being in it for the money.


You're here on HN reading (and commenting!) on Mother's Day (or just Sunday depending on your locality?). You don't seem completely unmotivated ;) .


Touché.

Reading about technology : coding for fun :: watching basketball on TV : going outside and playing a pickup game.




People in your category usually lack the motivation necessary to keep up to date on the most efficient techniques, and they'll put fewer hours in general into their craft. They'll typically produce less effective solutions as a result, because they don't know all the options and refuse to learn all relevant tools.

For a concrete example, I'm currently (hopefully for not too much longer) working at a large defense contractor with a lot of 20+ year software developers who have never worked outside the company in an engineering capacity. Some have kept up to date, but many hang up their engineer hat the moment they walk out the door, and it shows in their work. They don't learn anything new that they aren't forced to, and they often don't care about code readability, maintainability, or anything that wasn't taught in the mid '90s. They don't know what continuous integration is, and they don't particularly care. They argue, sometimes successfully, against changes and infrastructure that would be no-brainers at any marginally competent software dev house. They think agile is a horrible idea because all they know is my company's half-assed, broken implementation of it.

These are the people who are still writing bloated 500-line functions in new code, giving me blank looks when I mention refactoring at the code review, and approving said functions anyway. It's very hard to be fired unless you commit a security violation, so they'll keep happily churning out horrible code until they hit retirement. Then they dream of kicking their feet up in Florida (literally in one case) and doing nothing. Or so they claim.

Hell, they've been in the aerospace industry for over two decades, and in many cases I, the junior man, had to tell them who SpaceX and Blue Origin are and why they matter to our industry (and possibly their jobs in the medium/long term). They simply do not care. They approach their job like they're making corn flakes on an assembly line, and our product is many times shittier than it should be as a result.

If you can't tell, their wilful mediocrity pisses me off. :P They're part of the reason I'm looking to leave, although my larger problem is with leadership, or lack thereof (imagine these guys getting promoted, where they can care even less about staying up to date). Not to paint the whole company too badly, there are good engineers/managers that I work with, and they're a welcome breath of fresh air when I do. Just far too few of them for my liking.

If your goal is to simply remain employable then fine, you're welcome to find all your fulfillment outside of work. But if your only motivation is the stick and there's never any carrot, the moment that stick goes away you're going to be passed by. From the sound of it, fear of unemployment is your only incentive. For the sake of your co-workers, I hope you never find an overly stable job.


> These are the people who are still writing bloated 500-line functions in new code, giving me blank looks when I mention refactoring at the code review, and approving said functions anyway.

The problem with these people is not that they haven't kept up to date with modern practices (half of which are probably cyclical anyway). The problem is they never learned any best practices.

People in 1998 knew not to write 500 line functions.


Heck, I read articles in inCider in the 80s about how to simulate structured programming in Applesoft BASIC, when all we had was GOSUB/RETURN, and how to create small "subroutines".


> People in your category usually lack the motivation necessary to keep up to date on the most efficient techniques, and they'll put fewer hours in general into their craft. They'll typically produce less effective solutions as a result, because they don't know all the options and refuse to learn all relevant tools.

Yes, I do put relatively few hours into my craft outside of work. But I also keep my eye on the market, and I aggressively change jobs if my job is not allowing me to keep up with technology. Why work at a job where your skills are stagnating and then spend extra time learning the new and shiny when you can learn the new and shiny and get paid for it? Especially since wage stagnation is real and the easiest way to make more money is to change jobs.

I’m not above doing resume driven development or spending extra time at work doing a proof of concept on a new to me technology.

As far as the fear of unemployment being my only motivation: why wouldn’t that be enough? At least that and optionality. There is no such thing as a stable job. A company will lay you off in a heartbeat. Fear of not being employable is a major motivation. The other is fear of not being able to jump ship at a moment’s notice. It usually takes me about a month to start a new job after I start looking - including a two week notice - always paying substantially more. (20 years and counting)

I’m at the point now though where I have to keep my skills marketable just to stay at my current salary.


Fair enough. If you're in an area where market forces are immediate enough to force you to stay current/effective, or you just prioritize being competitive, then I suppose simply being in it for the money can work. You're a rare breed though in my experience. And arguably your desire to remain competitive is an extra motivator outside of the money.

Most people I'd describe as being in it for the money are just looking for a bland, suburban middle class existence and are happy to plateau once they start making enough to be comfortable.

As for job stability, oh believe me it exists. One thing I've discovered is defense contractor work (at least at the large contractors) is only one step removed from government work in terms of non-firability. Security clearances take a minimum of 6 months, often a year or more to process, it costs hundreds of thousands of dollars per investigation, and there's no guarantee a prospective hire will pass. As a result the large contractors hoard cleared engineers, almost regardless of ability. Even if your program ends it's an unwritten rule that some other program will pick you up. We haven't had layoffs since sequestration/the financial crisis, and relative to the size of the company those were minimal.


> Most people I'd describe as being in it for the money are just looking for a bland, suburban middle class existence and are happy to plateau once they start making enough to be comfortable.

Again, you're describing me. I'm happy to "plateau", and unless something drastically happens with my local market, I'm close to it, since I have no desire to go into management. If I want to keep my "bland, suburban, middle class" lifestyle, I've got to keep my skills current.

But if I look at what's on my agenda of possible things I can learn next, the most I can hope for is 10K more - excluding cost of living raises - than I make now (after making $45K more in the last four years by changing jobs three times).

I'm definitely trying to get even stronger at infrastructure (AWS mostly), but everything else I'm planning on learning - Node, at least one modern front end framework, Android and/or iOS, just increases the number of jobs I'm qualified for, not my salary by much.


Just as an aside, I never understood why the Frank Burns character on MASH was so reviled. Yes, he was at best an "only OK" doctor, but anyone who could hack that kind of combat surgery deserved a medal. I always liked the episodes where he got to be a foil for Hawkeye's arrogance. Like, gee, I'm sorry I'm not in your cult of hyper-competence. Too bad you have patients that go south just like the rest of us do.

There was also a political subtext. Burns' motivation was patriotism.


Question: If you had to pick a Doctor for your operation, would you want Hawkeye or Frank? :D

I actually found Frank to be somewhat sympathetic, and to have some valid points in a few episodes, but many times he was just grossly incompetent to the point of threatening his patients' lives and his co-workers' careers. He wasn't just honestly mediocre; he rationalized it in all sorts of ways, from extreme patriotism to racism to outright denial, and was constantly looking for empty, sometimes underhanded ways to prove his supposed superiority.


The show was careful to never show someone getting killed due to sheer incompetence. But to be fair, even "realistic" shows like ER would tread lightly when they went there; it's nearly always depicted as a systemic failure due to poor communication or some other unforeseen circumstance.

But since the overall topic is the nature of elitism, I would volunteer that when I've had a friend or relative at a hospital with an "elite" reputation, that was no guarantee against episodes of incompetence. But that's a completely different topic.


4. The Savant: Also known as the absent minded professor. You could throw questions at this person in an interview and they will seem completely inadequate and unprepared. But, then apply them to a problem on the job and like a juggernaut they will defeat any obstacle. They will even solve problems your best engineers have been struggling with for years without breaking a sweat. A lot of companies make the mistake of weeding this type of engineer out in the interview process.


5. The HN Commenter: This species carries an inflated sense of self-worth and uses many opportunities to tell others. Tends to blame shortcomings on others rather than accepting responsibility for its own faults.


Hit the nail on the head - and as usual when that happens around here, people reach for the down-arrow.

Upvote/downvote mechanisms always drown out unpopular truths.


That’s my wife (and she actually works at one of the big tech companies).

Try and talk to her about tech and you won’t get much out of it. At home she watches Netflix and does her nails. No HN, Reddit or anything. No conferences.

Tell her she needs to learn language X and tools Y and Z and she’ll master it in a week. She’ll deliver 10x what everyone in the team does. Top performer in no time.

I can’t really wrap my head around it. I’m your typical tech enthusiast always here on HN, coding on my own time, etc. And yet if we were on the same team I’m sure she’d be orders of magnitude more productive than me. Which can be seen by the fact she makes almost twice what I make (am also an SDE at a large company), and is at a higher level despite starting 2 years later than me.


Presumably you're spending your free time following your interests and passions, which have some overlap with your work.

Sounds like she follows the needs of her job. The tangential, part-overlap enthusiast is always going to get toasted by the person going full on what's needed.


It might well be that; I've run into a few people with the ability to ruthlessly focus on the problem at hand, often by reframing the problem in a way that people didn't expect.

A couple of them were so far ahead of me it was like teaming up with a different species.

I don't buy the 10x developer stuff, but I'd certainly put them in the 1.5x-2x category. We ended up working well together, though; I'd naturally take the "flesh this out, make it bullet proof, document it" side and they'd do the more creative part (it's not that I couldn't do it as well, it was just that they'd do it in half the time).

I learnt a lot working with them.


10x is real, and the way it is achieved is by making a key insight which results in requiring only 10% of the work to be done to get the required results. The actual work then gets done at the normal 1x rate.

This of course is impossible at companies which don't allow the flexibility for these people to stretch their wings. A big part of increasing a company's productivity is giving everyone the possibility to do this so you can uncover those hidden 10x-ers and put them in architectural roles.


Sometimes it's just doing the mental work 10 times faster, e.g. evaluating the possibility space 10 times faster. There are people with brains running at 1 GHz and also people with brains running at 100 MHz. Although having a faster brain is not good "per se".


I'm not convinced that churning out the same solution 10x faster than average is possible. 3x yes. But the 10x comes from doing less work to achieve the same results.


if by "work" you mean writing code, that's a small part of what I'm referring to. 10x faster is the speed at which the possibility space is scanned.


> I don't buy the 10x developer stuff... 1.5x-2x category

Closer to 2.5x, if you're comparing best to median. 10x is comparing best to worst, and that's very believable.

The 2nd edition of Peopleware summarises it: the 10x programmer is not a myth, but it's comparing the best to the worst, NOT the best to the median. It's also not about programming specifically; it's simply a common distribution in many metrics of performance.

The rule of thumb Peopleware states is that you can rely on the best outperforming the worst by a factor of 10, and you can rely on the best outperforming the median by a factor of 2.5. This of course indicates that a median developer, middle of the pack, is a 4x developer. Obviously, this is a statistical rule, and if you've got a tiny sample size or some kind of singular outlier or other such; well, we're all adults and we understand how statistics and distributions work.

Peopleware uses Boehm (1981), Sackman (1968), Augustine (1979) and Lawrence (1981) as its sources. [ "Peopleware", DeMarco and Lister, 1987, p45 ]


I mean, conferences are primarily about socialization, and HN+Reddit are mostly pop. Reading HN is not learning for the job, nor does it make me better at work stuff.

Also, an idea: maybe she is fast because she has rest and downtime. We are more productive when we are not working all the time and get rest.


I was just illustrating a point. She’s not an enthusiast at all. Software is strictly her job and she doesn’t care about it outside of work.

It baffles me because my initial reaction to someone like that is to think they’re not good. But she’s really good. She’s able to pick things up on the job fast and is extremely focused on her work. I OTOH always need to use my personal time to learn things and have trouble focusing, despite being a tech enthusiast.


I did not say she is an enthusiast. I think that many of those "enthusiast" activities are not making you better at the job, if you think about it. Focused learning with a goal is very often more effective than just playing with something, and rest + downtime + a change of activity really helps with learning and effectiveness on the job.

But I think that since she is "extremely focused on her work", she probably likes (or at least does not hate) the job itself - not the pop version of what a programmer should like or random messing with gadgets, but the job itself. Many people are like that - they are able to like many different things, so they like what they are doing now.


I find a lot of interesting, neat articles on HN. Occasionally but rarely something worth talking to coworkers about.

Not once can I recall reading something on HN I had no experience with and being able to use that at work without doing a lot of additional research. In terms of time spent and career impact, HN is on the same level as Reddit, which is a net negative.

Conferences aren't necessarily a net negative, but who wants to spend 2-3 days going to talks and all the after-party stuff (which is where real career progress is made) to get one or two solid technical bits? Especially when those will be available a week later in countless blog recaps?


On HN, etc.: I was just illustrating a point. She’s not an enthusiast at all. Software is strictly her job and she doesn’t care about it outside of work.


I may qualify in this category. Companies reject me because I gave them a brute-force solution for finding repeated values in an array. But there hasn't been a single instance at my current job where I couldn't check off all of the requirements that were assigned to me, even those which I thought I wasn't prepared for. The hiring/evaluating process needs to get refined.


To me this screams "justifying why I failed an interview". Maybe you're not a savant, maybe you're just terrible at working under pressure. Most people are - but it's a skill you should learn like any other.


oooh this sounds familiar. Might be me :)

Yeah, been weeded out a few times, and left some really interesting solutions at past jobs. When I get exhausted, though, I need breaks. Flip side: it's way too easy to get obsessed over the wrong kind of solution too... I LOVE working with QA teams and competent management because they'll usually spot this before I will!

I'm very happy in my current company.


I'm going to ignore #3. They really don't matter and are crap at best.

The problem with #1 is that they refuse to acknowledge the newer tools of analysis and understanding. They cling to older "mature" tools and methods, with nary a care for if/how the newer ones work. They won't be the fastest or the most novel, but one thing can be said: it will work, and it will work well.

The problem with #2 is "Oh shiny OH SHIT". They go down the path of the next new thing, with little to no understanding that the old fogeys already did that and it didn't work then. But their big advantage is, it might just work now and enable a whole new area.

And it's really hard being in both #1 and #2.


About #3s - read this comment I just posted about my wife: https://news.ycombinator.com/item?id=17060366

I didn’t mention it there, but the only reason she got into Software Engineering is because it pays well.


Ah. I was also thinking of the "lazy, claim a lot, do little, and copy from Stack Exchange" type. I certainly wasn't thinking of a 10x dev who wants to get away from it when out of work.


The biggest issue is that this is only part of the problem; the other side is that it also infects HR requirements, and now everyone needs to be a highly skilled, motivated engineer with 5+ years of experience and several deployed applications to get through the HR filter and the interviews driven by those junior engineers.


they want to get well-paid, high prestige jobs using machine learning.

whereas doing the exact same task with a regression in Excel turns it into low-prestige clerical work.

i think they’re acting pretty rationally once you consider their own incentives. (plus of course, the general cultural perception that machine learning is cool, smart, and cutting edge must rub off on them too and how they perceive themselves.)


Why solve the problem in Excel when you can solve it with a 1000-line C program and earn 4X what the Excel analyst makes? Why solve the problem with a deterministic and efficient C program when you can solve it with 99% confidence using TensorFlow on a 100-node Spark cluster and earn 4X what the C programmer makes?

Incentives are currently aligned toward everyone choosing to solve problems using the most complicated tools available.


My counterclaim would be that doing ML in code is actually easier these days than doing stuff in Excel (and a 1080 is sufficient). So why not solve the problem with a sufficiently easy-to-use tool? Your other point is valid. If it can be solved deterministically then you should probably do that (but I'd probably not reach for C, since I don't need the C-efficiency hammer right away).


Because it's likely that when solving things in Excel, you actually know what you're doing. The popular ML approach these days is throwing data at whatever you downloaded from Kaggle and hoping it sticks.


> i think they’re acting pretty rationally once you consider their own incentives

Is that even true? Or are candidates and companies just caught up in a cycle of signaling?

In my experience, there is amazing demand for pretty straightforward software engineering, and the talented leaders there are coveted and paid very well.

Analytics, ML, etc. are desired as well, but are a minority portion of the demand.


If you're in a '2nd tier' first world country like Canada, Australia, the UK, Germany, etc., Machine Learning is one of the only ways you can get a salary comparable to a 'normal' Software Dev in the US. Keep that in mind.


Which countries are first tier?


Not many, Switzerland and Norway are all that come to mind.

The real point I am trying to get across is the huge gulf between the career of a US software engineer vs almost anywhere else in the world.


I am not looking to move right now, but I still have a couple of job searches saved from before, and everything, it seems, is data science, machine learning, big data, etc. now. Even jobs that are very obviously Excel or similar and paying accordingly.


It is also difficult to get any funding or interest in what I would call "medium data".


I suspect a smart engineer can make the vast majority of big data problems fit in memory by appropriate filtering and pre-processing but you’d never get your purchase order for that 200-node Hadoop cluster approved if you admitted it...
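As a sketch of what that filtering/pre-processing can look like in practice (pandas here; the file and column names are made up for illustration):

  import pandas as pd

  # Stream the "big" file in chunks, keep only the slice you need, and aggregate as you go.
  # "events.csv" and the "year"/"country" columns are hypothetical.
  totals = {}
  for chunk in pd.read_csv("events.csv", chunksize=1_000_000):
      recent = chunk[chunk["year"] >= 2017]
      for country, count in recent.groupby("country").size().items():
          totals[country] = totals.get(country, 0) + int(count)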


“It takes a big man to admit his data is small.”


Even though nearly every enterprise runs on medium data, and can't admit it because that would make them seem inferior to their competitors, who they think have big data.


I remember when we used to just call that “data”


Sometimes those terms are added to advertisements to attract people who want to be doing those things, not because they are actual job requirements.


Also known as “bait and switch”.

Most job ads mentioning Haskell and OCaml are really for Java too.


I thought the well-paid, high prestige jobs required a Phd in the field? Or has machine learning become widespread enough for the unwashed programming masses?


We can now teach applied ML to undergrads and have them complete decent practical projects using the various ML frameworks, which are now quite high-level and abstract away most of the math. Properly understanding your data and what's possible is still a bit of a challenge, but that's more about general intelligence and less about any particular skill.


There are point-and-click GUI tools and frameworks and libraries that simplify a great deal of the complexity involved, but ___domain expertise is still required to understand what all the parameters actually mean, e.g. how not to overfit the model to the training data.
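A minimal scikit-learn sketch of the kind of check meant here -- hold some data out and compare scores (the dataset and model are chosen arbitrarily for illustration):

  from sklearn.datasets import load_digits
  from sklearn.model_selection import train_test_split
  from sklearn.tree import DecisionTreeClassifier

  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

  # An unconstrained tree will happily memorize the training set.
  model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

  # A big gap between these two numbers is the classic symptom of overfitting.
  print("train accuracy:", model.score(X_train, y_train))
  print("test accuracy: ", model.score(X_test, y_test))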

There are well paying ML jobs that don't involve getting a PhD first.


It's an inverted approach to problem-solving that is fine for side projects and self study but inappropriate for getting things done.

I suspect part of the problem is that people don't really have enough time for self study, so they, consciously or subconsciously, start finding excuses to inject their interests into their work. I know I am guilty of having done this in the past.


So they can put on their resume that they have experience with machine learning? I don't have any shame about overdoing things when I think it will improve my resume.


I think these are just the normal hype cycles for new technologies. Some technology (maybe new, maybe not) captures the imagination of the industry, and there's a flurry of mostly silly activity where people are trying to figure out what else that technology can be used for. It feels like there's low hanging fruit waiting to be plucked, but no one knows what's actually going to stick. Eventually the hype dies down as a consensus forms on what it is and isn't good for.

From a money perspective, there's also a gold rush mentality. VCs are going to be drawn to these sorts of things, even knowing that 99% of it will turn out to be a square peg in a round hole, because somewhere in there, there might be a unicorn. People seeking investment from VCs might also promote the buzzword of the day for their own marketing purposes (to investors, the public, and the press).


>I think these are just the normal hype cycles for new technologies.

You're not wrong. Remember in the late 90's when everything was all i-this, and cyber-that? Everything was going to be on "the interweb!"

Those big mainframes of yesteryear? We don't need them anymore. We have AOL and thin clients!

Back in the 80's: We don't need rockets anymore, we have space shuttles!

In the 70's: Who needs stereo? We have quadraphonic sound now!

Look at old mainstream magazines from the 1950's. EVERYTHING was going to be solved by being "atomic." Too busy to cook dinner? Let your atomic kitchen do the work for you! Spilled coffee on your tie? Let an atomic washer make it clean for you! Little Johnny put in the dunce corner again? Slap an atomic Think-o-Tron on his head and let radiation smarten him up!

I'm sure there are a brazillion other/better examples. Feel free to add yours.


Definitely applies to all of that and blockchain if you want to separate that from cryptocurrency.

“We know what it is - just not what it’s for... yet”


The other day I read about a company wanting to prevent ticket touting by using... blockchains.

Erm, what... the... fuck? The hype around cryptocurrencies is that you can use it without trusting a single entity (a central bank). Only 1 entity determines whether an event ticket is worth anything or not, and that's the ticket checker at the gate. Why the eff would you need blockchain for this?

But anyway, the startup probably got some millions of dollars, they're probably smart because they can see how stupid investors are...


I’ve just been replacing “blockchain” with “mathy database” when reading, and yeah, a lot of contexts don’t hold up.


As a student in CS I see myself falling for this simply because other hammers have learning curves.


I'll play devil's advocate and disagree. I'll claim that deep learning is the greatest hammer I have seen in my lifetime, and it can tackle all the nails you throw at it (universal approximation theorem). Additionally, it is easy to apply and get very reasonable results. At least for me, it's easier to whip up a Jupyter Notebook and use some basic heuristics and a library like fastai than to struggle through Excel.

Assuming that the data is available and curated (I'd argue this is a hard and very value-creating task).


>I see these days, especially, junior engineers excited about machine learning analyzing data sets that could be tackled easily by learning the problem space and using Excel or a Jupyter notebook.

Ditto. I watched the entire Google I/O Stevenote, and was surprised how many of the "problems" it's tackling with "AI" have already been solved by other, simpler solutions.

To be sure, there were some impressive moments. But much of it was reinventing the wheel with an AI spin.


> So why are junior engineers trained to think this way?

For us, it can be a competitive advantage, as thousands of engineers dismiss more traditional techniques in favor of sub-optimal, but fancier ones.

And, a few years from now, it will be another powerful tool, as The Next Big Thing takes over.


As long as we find other ways to get through HR filters.


Because technology is more than practical, it's also cool.


I'm not so sure when it comes to machine learning. Those techniques represent such a fundamentally different approach to pattern recognition that they enable tackling whole classes of problems that couldn't previously be sufficiently automated with hand-coded heuristics and straightforward statistics.


This is undeniably true, but consider that 99% of classification problems encountered in the industry can be solved with a logistic (if not linear) regression.
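Concretely, that boring baseline is only a handful of lines with scikit-learn (dataset picked arbitrarily for illustration):

  from sklearn.datasets import load_breast_cancer
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import cross_val_score
  from sklearn.pipeline import make_pipeline
  from sklearn.preprocessing import StandardScaler

  X, y = load_breast_cancer(return_X_y=True)

  # Scale the features, then fit a plain logistic regression -- the boring baseline.
  baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
  print("mean cross-validated accuracy:", cross_val_score(baseline, X, y, cv=5).mean())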


> Those techniques represent such a fundamentally different approach to pattern recognition

You are kidding, right? With the exception of a few modern breakthroughs like GANs, the vast majority of ML is 70s and 80s ideas that Moore’s Law has just taken mainstream.


The fact that these approaches are now tractable represents a paradigm-shift in what we can do with computers.

That's what the parent-poster meant.


> With the exception of a few modern breakthroughs like GAN

And ReLU (2011). And Glorot initialization (2010). And He initialization (2015). And Dropout (2014). And Batch Normalization (2015). And the Adam optimizer (2014). And distillation (2015). And... None of these were known or used in the 70s and 80s, and the performance difference is enormous.


70s and 80s ideas that couldn't be implemented at scale due to the lack of massively parallelized GPU architectures with billions of transistors to simulate the neural net layers.

It was only in the mid-late 90s that simple handwriting / character recognition techniques were able to be used in somewhat real-time.


True. But many many many problems can be solved without that hammer as well.


Reminds me of this article, which recounts a book by Knuth where the reviewer comes up with a 6-command shell pipeline equivalent to one of the multi-page examples:

http://www.leancrew.com/all-this/2011/12/more-shell-less-egg...


That exchange never sat right with me. I feel, and I hope Dr. Knuth would agree, that literate programming is a good choice for writing `uniq`, that the problem ___domain was chosen for didactic reasons.

In effect, he chose to show off parts of several programs, rather than finishing one. The intention was to teach.

That shell script was a tour de force at the time, showing off the Unix Way of small, sharp, composable tools. What Dr. Knuth was (and is) saying: write those small, sharp tools in a literate manner. A real missed opportunity.


Indeed, yes exactly. And in fact, later Knuth wrote `wc` (one of the shell programs in the pipeline) as a literate program, and showed that it was better as a result than the one that came with Unix.


That was a good read, thanks


How’s the old interview question go?

Using the programming language of your choice, sort the strings in this file and remove duplicates. You have 30 minutes.

    sort foo | uniq


I always give props to people who give a solution like this, because sometimes that is the right thing to do, and it acknowledges that we are dealing with a toy problem with a ready solution. And I think that’s what the interviewer in this post failed to do. But in reality we may have a not-dissimilar problem that doesn’t deal with strings and is just a bit more complicated, so you can’t just take a bash command or generic function off the shelf and have it work. You’re going to have to write some code, it’s going to take a little thought, and your solution can’t be O(n^2) when we throw ten million elements at it.

I’ve worked with engineers who are competent programmers but just can’t handle these problems when they come up in practice. If the job actually requires it, I don’t know of a better way in the context of an interview to determine if a person is that type of programmer than with a few toy problems and a solution in code. You may miss some good people because the format is suboptimal, but you will weed out the people who just don’t have the theoretical and practical background to solve this particular type of problem.
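To make the "can't be O(n^2)" point concrete, a rough Python sketch of the two versions (assuming the records are hashable):

  def dedupe_quadratic(items):
      """O(n^2): rescans the output list for every element."""
      out = []
      for item in items:
          if item not in out:   # linear scan every time
              out.append(item)
      return out

  def dedupe_linear(items):
      """O(n): one hash lookup per element, preserves first-seen order."""
      seen = set()
      out = []
      for item in items:
          if item not in seen:
              seen.add(item)
              out.append(item)
      return out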


I'm not certain that it's true that you "will weed out the people" who can't solve algorithmic problems. There are a relatively small number of algorithmic problems that are simple enough to be asked and solved in an interview context. People now study these types of problems for months when preparing for interviews...

I think there are better ways of doing things than the standard X interviews of 1 hour. One method that works well is hiring people for short term contracts (few days) and seeing how they work with the team on real problems.


I agree, the interview format isn't sufficient if it's algorithmic trivia.

The parent comment of yours identified the right need for them (a person who can come up with a solution when the time comes in the real world), but the method of toy algorithms means you could end up with the following scenario:

Person A: Spends months drilling solutions on whiteboards to memorize the answers. They can regurgitate every toy algorithm possible without thinking, and they pass the interview.

Person B: doesn't deal well with interview situations and doesn't get the chance to talk in general about how they'd approach it in real life. They freeze up as they're stood up in front of a panel of people holding a whiteboard marker and being stared down. They don't progress in the interview.

Perhaps person B was the person who would sit at their desk for 20 mins in silence and come up with:

"ok if we're not deduping these on the inserts for our system, I guess the least we can be doing is maintaining one of those in-memory bloom filter things, I read about those one time as a good probabilistic data structure, might end up being a part of solution. I'll ask the tech lead if we have any memory constraints as I see that our AWS instances are compute optimised instead of memory optimised, anyway, we might get an off the shelf solution"

Person A: "which question in my 'interviewing for algorithms' text book does this belong to?"

You want Person B, but you've optimised your interview process for Person A.
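
For what it's worth, the in-memory Bloom filter Person B is reaching for isn't much code either. A minimal Python sketch, with illustrative (not tuned) sizes and a generic hashing scheme:

  import hashlib

  class BloomFilter:
      """Probabilistic set: no false negatives, tunable false-positive rate."""
      def __init__(self, num_bits=1_000_000, num_hashes=5):
          self.num_bits = num_bits
          self.num_hashes = num_hashes
          self.bits = bytearray(num_bits // 8 + 1)

      def _positions(self, item):
          # Derive several bit positions from independent-ish hashes.
          for i in range(self.num_hashes):
              digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
              yield int.from_bytes(digest[:8], "big") % self.num_bits

      def add(self, item):
          for pos in self._positions(item):
              self.bits[pos // 8] |= 1 << (pos % 8)

      def __contains__(self, item):
          return all(self.bits[pos // 8] & (1 << (pos % 8))
                     for pos in self._positions(item))

Whether that actually beats just deduping on insert is exactly the trade-off Person B wanted to run past the tech lead first.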


> One method that works well is hiring people for short term contracts (few days) and seeing how they work with the team on real problems.

The problem is that most really great people, who are in high demand, won't put up with this because they know they can get someone else to hire them with less hassle. Your approach ends up weeding out your best candidates.


Google and Amazon (and I assume Facebook) interviews are already a large and uncompensated investment of time for the interviewee (up to several days). So I don't see it as much different.

I can see that there are scenarios where it might not work, for example if non-competes are in place and the industry is similar. But in most cases I think there's enough flexibility to make it practical.


I definitely agree.

I've seen the typical examples of ugly code over the years: over-complicated conditions, piles of IF statements, things that should be in a database or associative array, all because the person just couldn't solve the problem in a simple, straightforward way.

These people get stuck on what should be a simple problem and productivity plummets. If you've spent the whole morning on something, taking it from 5 lines to 105, you're probably attacking the problem from the wrong angle.


Definitely can relate to that... One of my colleagues is the type of person who somehow always refuses to do the most straightforward thing. Every problem is 'solved' by pulling in yet another library that somehow touches the problem. Of course it needs its own layer of abstraction so it can be swapped out easily for yet another library in case the first is deprecated.

All that for stuff which could be solved with 1-10 simple lines of code (not those everlasting functional-programming chains either; real slim and dumb code).


So... we use the SortedSet provided by the library?


Clearly a suboptimal solution. That pipe isn't free. /s

  sort -u foo


Seems not UNIXy enough. Modern tools sometimes do more than one job. (The z flag might have been kept out of tar too, for example.) :P


With piping you can leverage multiprocessing for free.

Using sort -u alone might actually not be as fast as sort | uniq.


Doesn't have to be Bash either, Python is pretty good for this too:

  sorted(set(foo.strip().split()))  # assuming foo already holds the file contents; we'd still need to open the file first, just to be on the safe side

Perl's not too shabby either for this kind of stuff :)


If you're using gnu coreutils: sort -u foo.


And what if the strings are separated by spaces instead of newlines?


  sed -e 's/ /\n/g' foo | sort | uniq


If you want to preserve order, this is how you uniq with awk:

  awk '!x[$0]++'
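
The array lookup is falsy the first time a line is seen, so only first occurrences get printed, in input order. Roughly the same idea in Python, reading from stdin:

  import sys

  # order-preserving dedup, roughly what awk '!x[$0]++' does
  seen = set()
  for line in sys.stdin:
      if line not in seen:
          seen.add(line)
          sys.stdout.write(line)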


Yeah, but what if the strings I wanted to sort contain newlines?


You (sir|madam), are a feature creep. If it weren’t for people like you and their “real requirements” software would be easy.

By the time you say “unicode” and “collation” the programmer is going to be in serious trouble.

Fortunately the programmer can lawyer the problem. The original requirements don’t specify what the strings are sorted by. I choose “position in the original source file” and complete the task with a NOP.


echo "Input file already sorted."


You haven't specified a delimiter.


protip:

  tr ' ' '\n' < foo | sort | uniq


Broken. uniq alone removes only adjacent duplicates. You need uniq -u to remove all duplicates.


This is not a concern for already sorted lists.


> > sort foo | uniq

> Broken. uniq alone removes only adjacent duplicates. You need uniq -u to remove all duplicates.

As benchaney (https://news.ycombinator.com/item?id=17060154) points out, the duplicates will be adjacent after `sort`.


For those of us who have written an embedded OS in C, the situation is no different. The white board questions are crap like "How is React different from other JS frameworks?" Whaaaaat?


To me those sorts of questions are terrible interview questions. I'm looking for a few things when I hire:

- What are you like as a human being? Can we work together?

- Can you code? [1]

- Are you willing, in fact keen, and able to learn?

- Can you see beyond the technical issues?

[1] This tends to be frowned on in some circles, but given the bi-modal distribution of developer skill, leading to plenty of smart people who aren't good programmers, in my view it's of critical importance. Generally asking somebody to solve a relatively simple problem in a language they're comfortable with and that is also similar to something we use, or pairing together on a problem, works well here. What I don't care about is understanding of specific frameworks or technologies: if you're a good dev you can learn those, and we're happy to invest and give you the time to do so.


I just ask them to talk about the projects they've worked on recently and any issues that arose, and how they dealt with them.

If they can communicate clearly what happened, that has never failed me as a gauge of whether they're "technical enough" for the position. No one knows everything; whether you can figure it out is the bigger issue.

Then I typically just have them tell me about what they liked and disliked at prior jobs and what environment they would like to work in.

After that, what are their goals for their career and how can we help them achieve them.

If the interviewee can demonstrate they have anything in those areas, I've never had an issue with someone I've hired. I don't care if you can program everything walking in the door. If you've got a can-do attitude, we can work together to get you up to speed on our crap code and the whys behind it.


Right, it seems the real issue here is that HR/recruiting misrouted the OP's resume, or perhaps the interviewer did not read it carefully, resulting in a negative experience for both. The fix is to get more senior engineers and managers involved in the earlier parts of the hiring process.


In my experience, HR/Recruiting teams hurt more than they help the hiring process.


Given how much of software development itself is often little more than cargo culting with strong doses of guess-and-check stirred in, I suppose it makes sense that hiring/interview practices would mirror that, but it's still depressing every time I come across stories like this one.

It doesn't have to be like this.

I don't hire people this way because I've never seen it produce good results, and because I have to hire esoteric skillsets for a challenging problem, without the time or budget to forgive bad hires.


I understand that the article is about how many developers are inclined to use the wrong "hammers" for certain tasks.

However, it also makes me think about how we actually code. At my university, we learn to sort files, parse data-structures, model various data formats, etc.

But in most UNIX systems, we already have the basic building blocks in the form of simple programs that do one thing, very well. Will programming ever evolve from programming languages to more complex building blocks in the form of programs combined with pipes in the bash terminal?


> Will programming ever evolve from programming languages to more complex building blocks in the form of programs combined with pipes in the bash terminal?

Fundamentally, UNIX "programs combined with pipes" are equivalent to function calls.

  sort foo | uniq

is equivalent to

  uniq(sort(foo));

Shell is nothing but a programming language/environment optimized for fast input and immediate, stateful execution. If you seek evolution of programming to higher levels of abstraction, this isn't the place. Command line interfaces work at the same abstraction level as regular programming in a high-level language.
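
To make the equivalence concrete, here's a rough Python rendering of that same pipeline (assuming the strings live in a file named foo, as in the interview question; uniq here mimics uniq(1) and drops adjacent duplicates only):

  def sort_lines(lines):
      return sorted(lines)

  def uniq(lines):
      # like uniq(1): keep a line only if it differs from the previous one
      out = []
      for line in lines:
          if not out or line != out[-1]:
              out.append(line)
      return out

  # the pipeline `sort foo | uniq`, written as nested function calls
  with open("foo") as f:
      result = uniq(sort_lines(f.readlines()))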


Well, some programming languages encourage that kind of design by emphasizing message passing. You even get piping between functions in languages like Elixir.


> Will programming ever evolve from programming languages to more complex building blocks in the form of programs combined with pipes in the bash terminal?

1) A bash script

or

2) Those are called libraries.


Microservices?

No bash though


The only sound recruiting advice is Joel's, which remains true today:

Hire people who are smart and get things done.

The interview questions to determine that go something like:

"List out your top achievements in programming"

"Explain each of them to us in as much detail as you possibly can"

I'd look for enthusiasm for this particular job too, assuming they have had time to fully understand it and give it some thought.


Sitting on the other side of an interview, I was so annoyed with the guy I was doing the interview with because he insisted on these algorithm challenges. I told him these tasks are bullshit, because the job we do doesn't involve coding algorithms EVER.

I wanted to focus on the ability to read messed up code and figure out what it roughly does and how to improve it.

Turned out we were both kind of wrong, although I would claim my co-worker was way more wrong than me. Our candidates pretty much failed all these algorithm questions. And when my co-worker tried to help it turned out that he didn't have a proper grasp of the problems we presented either.

My code-reading test didn't work very well either, because these candidates were fresh out of college and had little experience reading larger code bases, refactoring, and dealing with software engineering problems. Like most people in college, they had mostly dealt with toy-sized programs.

So in the end what proved the best guide was plain old boring interview questions. Our interview was largely a failure but we hired the programmer on a hunch anyway based on her explanation of some theoretical subjects.

It turned out that she worked out just fine despite failing pretty much all the tests we had made. That is because a lot of development these days is more about persistence, motivation and ability to google. She was practical, eager to find solutions and eager to google and that made her get stuff done.


> That is because a lot of development these days is more about persistence

This is true of most technical endeavors ...


I'm honestly tired of programmers complaining about interviews. I don't think the whiteboard blitz is a good gauge of an employee, either, but I don't know of any good "test" of a potential coworker other than working with them.

One thing I think people wildly underestimate is how aware the interviewer is of the inadequacies of parts of the process. I will ask basic programming questions for the same reason I write unit tests: weed out obviously broken candidates with really easy questions. But beyond that, I need people with common sense and a strong sense of professional ownership. I need engineers with spines. I need people who tell me what they think needs to be done, not people who are going to sit in their cubicle complaining about how crappy the interview process is. I need people who learn quickly, not people who will be paralyzed by code smell or new tech. I need people who can jump into management roles to fill in gaps because they get how things work and want the team to get shit done.

If you don't like the process, fix it yourself on your team. Do better. Teach others. Believe it or not, lots of people have gone down the same rabbit hole, and your "crappy interviewer" is likely one of them.


> I'm honestly tired of programmers complaining about interviews. I don't think the whiteboard blitz is a good gauge of an employee, either, but I don't know of any good "test" of a potential coworker other than working with them

That's the answer - "you work with them".

I create a simplified version of a real-world class that we've worked on. Each method has comments on what the code is supposed to do, there's a bunch of failing unit tests, and we pair program.

The first set of unit tests is relatively easy. I then add a second set of requirements with a second set of unit tests. The candidate has to modify the code to pass the second set of unit tests without breaking the first.
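
As a sketch of the format only (the class and tests here are invented stand-ins, much smaller than anything used in a real session):

  import unittest

  class OrderTracker:
      """Stub handed to the candidate; comments describe intended behaviour."""

      def add(self, order_id, price):
          # Record an order. Re-adding an existing order_id should raise ValueError.
          raise NotImplementedError  # candidate fills this in while pairing

      def best_price(self):
          # Return the lowest recorded price, or None if nothing was added.
          raise NotImplementedError

  class FirstRequirements(unittest.TestCase):
      # These start out failing; a second batch of tests arrives later.
      def test_best_price(self):
          tracker = OrderTracker()
          tracker.add("a", 10)
          tracker.add("b", 7)
          self.assertEqual(tracker.best_price(), 7)

  if __name__ == "__main__":
      unittest.main()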

But as far as "going down the rabbit hole", I've never gone down the rabbit hole of algorithm-style tests. I was drilled on C in an interview in 1999 because they were doing a lot of hardcore C work (I stayed at that job until 2008). But since then, most of my interviews have consisted of either sitting down at a computer and solving problems or drawing on the board and explaining architecture.

Even though I'm very much hands-on, at this point most companies hire me more to help them with architectural issues than as just another coder. My salary requirements filter out the just-another-developer type of roles.

No, I'm not bragging about how much I make. I'm still only making the median for a senior software engineer/architect in my market - if the salary surveys and my anecdotal experience are accurate.


In defense of the interviewer, if somebody told me they wrote an OS for the Raspberry Pi I would assume they wrote it mostly in C, not mostly in Bash.


> I made an distro for the Raspberry Pi called Crankshaft

Someone who picks up that info and turns it into interview material should be well aware of the difference between making a distribution and writing an OS!


Not to mention that since the interviewer now has a name for said distro, they should have looked it up and found what language it was written in.


No, that's the problem. I said I did not write this to bash the interviewer, and I really meant it.

The idea here is that to achieve something quite useful like a distro ("made an OS") for the Raspberry Pi, the important part is to figure out a problem that needs solving and a tool to solve it. A person who made something cool might actually be quite dumb, and that's alright. A good answer is, first and foremost, something that solves the problem. A bad answer is a fancy technology that doesn't do it better.

I sucked at my Go solution and I deserved to fail on that. But this is just a lame fail, because my initial solution worked and the rules changed only to accommodate the fact that it was not the technology he expected.


I think what's missing in this assessment of HR these days is that interviews are just as much an opportunity for the prospective employee to evaluate their potential new employer as they are the other way around - only, we're seriously NOT used to thinking in these kinds of terms.

I've been programming since the 70's, and have built systems and subsystems that are still up and running around the world. If you logged onto an ISP in the 90's and early 00's, chances are you were going through a code-path that I was responsible for .. if you took a train in any one of 38 different countries in the world, in the last 15 years, chances are your life is being protected by code for which I was once responsible. I've done a lot of crazy shit; I've done a lot of cool stuff... I say this only to indicate that I'm no newbie when it comes to software systems engineering, and all it entails: I've shipped to tens of millions of people, and I've shipped to a small, localised group. Point is, I've shipped, yo.

A few years ago I went to a job interview at a local game development company, since I have a high interest in game engines and have even built a few of them myself, as well as published a few games over the years. I thought the interview was going well, and at first I hit it off just great with the guys doing the interview .. but then, as they started pulling in more and more other engineers to meet and greet and ask questions, I started to get the feeling that I really didn't want to work at this place.

Mid-interview, I felt like I should just stand up, say "sorry guys, you're not a good fit for me", and leave... but instead I decided to sit through the interview. This was a big mistake - they came with the inevitable whiteboard questions, and so on. So I flunked them all - not because I couldn't work out the A* algorithm on the board, but because I had really lost interest mid-interview.

It was only a day or so afterwards, having a bit of retrospective rumination time, that I realised that I just really didn't want to work with those guys. They rubbed me up the wrong way, and there was just something about their culture that I didn't feel for .. well, I should've just said something. I should've said "well, you guys have a great company (it seems), but I'm seeing all sorts of warning signs in this interview, so I think I'll just pass on this opportunity - and thanks!".

But instead, I stuck around and ended up just degrading myself with stupid responses to their requests. I didn't really feel like I was in power, or had any control - but honestly, I did. I really could have just stood up and said "well, that last guy you brought in was a bit of a wanker, and I don't want to work in an environment of wankers, so thanks for the opportunity but no thanks"..

I think this is a real cultural dilemma. We, builders of technology, should really acknowledge our value, and place it a lot higher than the current HR-infected job market implies. It's a double-edged sword of course - nobody wants to be known as the person who shits on the companies interviewing them - but on the other hand, why do we put up with such imbalance?

EDIT: Just wanted to add that this particular line of the article resonated with me: 'He asked, "But didn't you write an embedded OS?"' That, right there, is a sign that the interviewer has a very, very poor understanding of the subject, and the talent should just get up and leave. I wonder how many of us overlook this fact in our interviews, out of desperation?


I interviewed once for a job with a defense contractor. Mid-way through the interview I was told that if they decided to give me the job I could expect to work 60 hours a week for the next 18 months without getting paid any overtime. I didn't come out and say so, but I'm sure the interviewer could tell that I wouldn't accept an offer if it was forthcoming.

In retrospect, I think I should have said "Thank you for spending the time to talk to me, I don't want to waste any more of our time", and left.


> I was told that if they decided to give me the job I could expect to work 60 hours a week for the next 18 months without getting paid any overtime

I’ll wager whoever said that was a decent guy who liked you and decided to do you a favour.


In the USA, I'm pretty sure that violates federal contracting rules. Maybe you ought to report them. As I understand it, they have to pay overtime if you work 48 hours in the week. The typical expectation is thus 47 hours. Perhaps you would not have been directly billed to the customer, possibly making the rules inapplicable.

Unpaid overtime is something I expect much more from employers that are not doing government work, particularly the game industry.

I've found a very different situation at a federal government contractor. Normal weeks are 40 hours, it is rare to work beyond that, and any extra work means extra pay. BTW, see my "Who is Hiring?" post if you wrote an embedded OS.


I'm jobhunting now. Two interviews come to mind.

1. During a technical phone screen for an SRE role, the questions delved into, well, the nitty gritty parts of containerization, as if I were building my own Docker. Afterwards, I regretted not calling out how bored I was by this. I didn't want this to be my day-to-day work, and I disengaged from the questions. I should have bailed.

2. I'd submitted my resume for an SRE role, and a few minutes into a technical phone screen we realized it was for more of an IT role. The interviewer asked if I wanted to keep going, but I opted to pass, as it wasn't a great fit. Talking to the initial recruiter again got the SRE role rolling again.

Calling the interview off early saved time for me and the interviewer. If it had been more interesting or relevant, I could consider it practice.


To take this even further: how about requesting the opportunity to ask the interviewers questions before, instead of after, the technical portion? Then, if there are any clear red flags about the company or work environment, one can avoid a gruelling yet ultimately pointless whiteboard session. It should make no difference to them, so if they refused, that would be a giant red flag in and of itself.


It really demonstrates a lack of competence when a firm uses a standard cookie-cutter interview approach on a candidate whose record already clearly demonstrates the abilities the interview is trying to assess.

A bad sign for sure, from the candidate's perspective.


I need to do this too. It's not polite to waste everyone's time, including mine.

Thanks.


"Invert a binary tree on a whiteboard" I wonder if you get any points for turning the board upside down.


> If his tech model turns out to be relatively successful, a medium-sized company in tech will be able to easily copy his model and out-compete him

Does this happen all the time and we don't realise it due to survivorship bias? And if so, how do we account for the successful stories that started from the bottom up and became medium-sized companies themselves?


> "I can do it in two commands chained by a pipe in bash."

> any programming language I like

I choose Bash.


Probably depends on what the "commands" are.

You could also do it with two Node.js functions (from npm packages).

The presumption is that the available library is limited, though exactly how limited depends on the interviewer.

Plenty of times I've chosen a language because of a particular function in its standard library, only to be told that function can't be used.


> I don't intend to write this post to bash the interviewer

pun, intended?


So I said, "I'll call the SNMP API to read the data." Then he said, "No, you need to read the bin file to read the data." I thought, what an idiot he is. I lost respect, lost faith. A little knowledge is a dangerous thing.



