This reminds me of a Steve Jobs quote: "it doesn't make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do." Any company that doesn't have this as part of its culture will feel very depressing to creative people.
Who's the "us" in this quote? We have project managers who are ostensibly experts at determining the needs of the customers, but whereas they want to solve the problem of findability in our app with a glorified tagging system, I want to solve it by making our search good. Who's the smart person and who's listening?
I've been working in ops for quite some time and only within the past two years got to work with PMs. Some of them are absolute pleasures to work with, but... get more than two PMs on a call and just sit back and watch the chaos of Getting Nothing Done in full force. It's pretty amazing, until the dread kicks in that those same bickering folk are making a huge portion of the decisions around the world. Bless the souls who can deal with that and do a good job at the same time...
My most notable experience with PMs was when a PM handed me a six-month project from another PM and only introduced himself about a month before it was over. Normally, this wouldn't be that bad at a big company, but the dude was maybe 100 feet away and could've stopped by at any time. Ugh, I say!
Even Apple doesn't do it that way. Like many things Steve Jobs said, it contradicts what he actually did. The things we say about ourselves espouse ideals and far-mode thinking to enrich our status and legacy. But when you get down to brass tacks, what we actually do is still boring, controlling, manipulative, micromanagerial, and petty.
I've never encountered any company, nor heard of one, that enacts the advice of that Jobs quote.
In fact, I've recently been taking a deep dive into some organizational management literature to try to better understand where these absolutely ass backwards ideas of "cross functional" and "full stack" teams came from, and why they are so bundled together with open-plan offices.
I found an interesting paper [0] that seems to serve as somewhat of a basis paper in this area, and which makes bald and unsupported claims about the superiority of what are called "process complete teams" (this, as far as my research has yielded so far, seems to be the prototerm that led to "full stack" and "cross functional").
The entire idea of this management philosophy is precisely that you should hire smart people and then tell them what to do. The advice is basically to take generically smart people and re-train them to have a generalist skill set as it relates to your business, so that every person can do every task, from talking to the customer to filling out the right paperwork to actually writing the code (whether it is front end, back end, whatever) and walking through all of the steps from start to finish.
The paper advocates having a small number of job titles in an organization, each of which has many employees with that title, and to create teams and positions with enormous overlap in terms of their skill set and responsibility set.
The paper makes all sorts of weakly supported (or entirely unsupported) claims that organizing by specialization (what they call "functional" units) is bad, and spends time talking about how to "break" specialist employees from their mindset that their specialization matters and instill in them a cross-functional "sense of responsibility" for the entire possible pipeline of work.
This is where the ties to open-plan seating arrangements come from. It is argued that open-plan, community seating creates a communal sense of responsibility for the entire pipeline of work, whereas merely changing to an organizational structure that is "cross functional" or "full stack" is not enough as long as employees are allowed privacy while doing their work -- their privacy and private work habits, so it is argued, prevent you from successfully "breaking" their mindset that they are specialists focusing on their special subset of tasks. Eerily similar to both military indoctrination and prison inmate indoctrination...
It's very easy to see how this mid-1990s organization literature has created the bizarre dysfunction of modern "cross-functional" teams and "full stack" developers, leading to Agile/Scrum bullshit, pan-everything job ads with a million bullet points, and the treatment of specialized workers as undifferentiated cogs. It also explains the flagrant hostility on the part of the modern HR apparatus towards employees whose sense of pride in their personal specialty (something they had to protect in order to have a resume that would have gotten them hired in the first place) leads them to protect themselves by demanding that an employer provide relevant work -- resulting in HR labeling them as "not a team player" or some other buzzword meant to "break" them. (Isn't it insane that a professional management article uses language like "breaking" an employee, like this is boot camp or something??)
Anyway, while I love indulging in some good far-mode thinking about the peachy world where companies actually hired specialists and did the real work of managing them (instead of the lazy "throw it all in a cross-functional bucket" non-management they do now), it's just not the world we live in.
About the idea of the "bizarre dysfunction of modern 'cross-functional' teams and 'full stack' developers, leading to Agile/Scrum bullshit": I have never heard people advocate that this should apply to sales, marketing, and HR, too. User-centered designers and graphic artists also often seem to be able to carve out a niche for themselves. So, it seems the only discussion should be about how specialized the specialists can be, not about banning all specialization.
I also don't really know any organization that goes all in on "all team members are interchangeable". I do know one organization that seems to go all in on "each team member has his own unique tasks, rights, and responsibilities", and that is disastrous (X writes an SQL script, but is not allowed to know the name of the database or the server it is installed on, and isn't allowed to run the script; Y is allowed to run SQL scripts, but not modify them, not even to fill in the database name. Their manager compliments them because they follow the rules to the letter and thus refuse to get anything done.)
I think it is examples like these that led to a swing to the other end of the spectrum that Scrum advocates. I also think the pendulum will swing back.
> I also don't really know any organization that goes all in on "all team members are interchangeable".
This has been a ubiquitous and openly stated goal from management in every job I've had. In one job that followed a consultant's implementation of Agile to the letter, it was actually part of new employee on-boarding to be turned into a full-stack developer no matter what your background was.
Our team worked on a large predictive analytics project, and you can imagine how hard it was to recruit machine learning engineers when the proposition was, "Gee, your amazing degree from Georgia Tech and your experience building a novel recommender system from scratch for a start-up are great and all, but don't you think it would be good for you to spend 6 months fixing Javascript bugs related to poor performance of some of our app's drop-down menus? BTW if you don't exude boisterous enthusiasm for that, it means you're toxic, poor attitude, not a team player, and HR will never let you through the interview process."
I've read the Netflix document many times. To be fair, I don't have any special knowledge about working at Netflix to suggest that Netflix doesn't actually empower specialists. I am curious if Netflix avoids Agile/Scrum or offers real private working conditions though, and if not, why not? Answers focused on "collaboration" would count as evidence that, contrary to the slide deck, you aren't actually "hiring smart people and getting out of their way."
Generally we have cubes (as far as working space goes). It's up to the teams if they want Agile/Scrum (most don't, a few find it useful, folks are encouraged to experiment).
Collaboration, in contrast to many other valley companies, isn't a big priority. Since we only hire senior (Finder) devs, each is capable of taking on one or more projects almost solo. So where you'd have an ~8 person team at Google, Netflix has a single developer. There are pros & cons to this approach, I might talk about them in a future article. The practical upshot is that everyone does what they think is best for their particular project (others are encouraged to provide feedback but the ultimate decision is left up to the project owner [who is not a manager]).
> Collaboration, in contrast to many other valley companies, isn't a big priority. Since we only hire senior (Finder) devs, each is capable of taking on one or more projects almost solo. So where you'd have an ~8 person team at Google, Netflix has a single developer. There are pros & cons to this approach, I might talk about them in a future article. The practical upshot is that everyone does what they think is best for their particular project (others are encouraged to provide feedback but the ultimate decision is left up to the project owner [who is not a manager]).
The emphasis on individuals is refreshing. Saying "the team decides" isn't a huge help when "teams" get thrown together with little regard to preferred working styles.
If you can guarantee that non-Agile teams never have deliverables required by Agile teams (and it sounds like such a guarantee is something you strive for), then this seems reasonable. But it seems very fragile if you can't guarantee that condition.
"Do what you feel is best" is the right attitude, so at least it suggests you treat employees as grown-ups. But I would worry how that "do what you feel is best" property could possibly trickle down to subordinates within Agile teams.
A subordinate is just anybody who has to take direct instruction from someone else. It doesn't have to be codified via job title. If you have a "team" that does Agile, you thereby have subordinates (the members of that team, subordinate to whoever is reviewing e.g. the burndown) even if no one calls them subordinates or explicitly thinks of them that way.
FWIW, Agile/Scrum is not, by itself, something that smart people would avoid. It's good for certain situations and bad for others, like every other management technique. I use (a lightweight form of) it to manage my own projects when they reach the "mopping up, fix all the little details that the customer will be annoyed by" phase, and I am CEO, manager, and sole developer all in one, so it's clearly not a power thing. :-)
The question you should be asking is when would you want to use Agile or Scrum. An answer of "All the time! It's the greatest thing ever" indicates either stupidity or limited experience. But an answer of "We never use that bullshit" also indicates either stupidity or limited experience. Good managers and good cultures deal in shades of grey, and are able to drill into the details of the situation to identify when a technique is appropriate.
There's a distinction between 'agile' (the manifesto) and Capital-A Agile. I believe your parent comment is referring to the latter, where Agile has just become yet another way for managers to control their underlings.
...I mean seriously, "Agile process" is an oxymoron and terribad managers go around saying it without a hint of irony. Seriously?!?
> There's a distinction between 'agile' (the manifesto) and Capital-A Agile.
Yeah, that's kinda my point. People throw the word "agile" around as though it has one precise meaning, when it clearly doesn't. And IMO, the kind of "Agile" that people are railing against is so far removed that it doesn't deserve any association whatsoever with the expression "agile". And then there's the thing where people act like "Scrum" == "Agile" when Scrum is just one of many processes that claim some degree of affiliation with the "agile" world-view.
All of that said, I get the vibe that some people either don't agree that there is a distinction, or don't care, and have a blanket "agile is bad" mindset. It's frustrating to me, as I've experienced a shop with a well run Scrum process (when I was at Lulu.com) and I know from experience that, done right, it's a pleasurable and effective way to work.
> Yeah, that's kinda my point. People throw the word "agile" around as though it has one precise meaning, when it clearly doesn't. And IMO, the kind of "Agile" that people are railing against is so far removed that it doesn't deserve any association whatsoever with the expression "agile".
I agree with you, but I'm also realistic about the fact that the term "agile" has been thoroughly and completely tainted by its association with bad implementations. One may as well complain about people referring to "Linux" rather than "GNU/Linux", or using "hacker" to mean people who break security rather than programmers who come up with novel solutions to interesting problems.
Words are defined by usage, and every manager I've encountered has used "agile" to mean a system of project management in which work is broken up into things called "user stories", and is then bucketed into fixed-length "sprints", with planning sessions to actually put the work into the buckets. Now you may say that this is one mere implementation of agile, and you'd be right. But it is by far the most dominant implementation. It's like saying that GNU/Linux is one mere implementation of a GNU system. You're right. But realistically, no one you know will be familiar with any others. It's the same way with agile software development. Scrum is one mere implementation of agile. But it's the only one most developers and managers encounter, so it's understandable that they'll conflate the two.
So few implementations of Agile have the slightest connection to the quality-focused spirit of the Agile principles that there is no "Agile" to talk about apart from the ubiquitous failure mode.
You say,
> the kind of "Agile" that people are railing against is so far removed that it doesn't deserve any association whatsoever with the expression "agile"
but I say that the idealized notion of "agile" you're talking about simply does not exist in reality, and no company using Agile even comes remotely close to it. The fact that Agile is so easily subverted is an indication of how poor a tool it is, and in the end when everyone is misusing a certain tool (the way that everyone misuses Agile) you have to stop doing mental gymnastics to defend the tool and acknowledge the widespread failure is the tool's fault.
> but I say that the idealized notion of "agile" you're talking about simply does not exist in reality, and no company using Agile even comes remotely close to it.
Given how ubiquitous it is to hear about agile failure modes, I suspect your experience isn't widely applicable.
I'm glad you found some of those one-in-a-million workplaces that "do agile right" -- but they are so rare that we can't go around basing our overall opinion about agile, or our expectations about the next marginal adoption of agile, upon these kinds of freak occurrences.
> Given how ubiquitous it is to hear about agile failure modes, I suspect your experience isn't widely applicable.
Counterpoint: Maybe you're just hearing from a "Vocal Minority". And maybe the people quietly doing agile and enjoying it, don't feel the need to go around trumpeting it to the world?
No, the grievances against Agile are just too voluminous and widespread; consider even Dave Thomas's presentation about how Agile is dead due to the ease with which it is subverted for political manipulation [0].
If someone as key to software productivity as Dave Thomas is saying this, it's clearly not just because of a vocal minority.
Your suggestion that we should check whether it's just a vocal minority is a good one. We should check that.
Unfortunately, it's extremely obvious that it's not the case, and the dysfunction / failure mode of Agile is extremely common, by far the majority of Agile implementations.
Your comment represents a No True Scotsman fallacy -- the idea that whatever "agile" is, it must be equivalent to the ideals espoused by people who are affiliated with it.
I've written much more about the in principle failure of Agile [0] so I'll leave it to that.
As others have said, the agile manifesto itself is far less objectionable than most implementations of capital-A Agile Processes. However, I think we should be careful about giving it a free pass -- in particular, it's sometimes used as a counter-argument against remote work.
Having worked in places where each project gets handed off from specialist to specialist, waiting in a queue at each handoff point, I much prefer the cross-functional teams. You have to have accountability for the end-to-end flow of work through the process.
It was very painful to read, because I made exactly the same decision. I was VP/chief developer at a software company I had co-founded, which withered on the vine after a 10 year roller coaster ride. I had two paths: I could have looked for a VP/Director job or I could have continued on as a Developer/Consultant. The money was pretty much the same either way, and programming was more fun, so I took the consultant route.
All went reasonably well until September 2008. I had just finished a very satisfying consulting gig, but I walked out into a very changed world. The day after was the banking crash. Even earlier, a divorce had taken me to the cleaners. But far worse, small interesting companies had no use for older, still up-to-date, still competent developers. I found myself going from Finder to Implementer. Agile development, which had seemed like a boon, turned into a tool to lock everyone into two-week death marches controlled by upper management. Technical decisions were made by people who didn't have a clue.
Just as in the linked article, a woman working for me in the 80s took the management route and was a VP in a major SV software company a few years later. Now she is quite well off. Good for her; we are good friends.
Moral? If you are a software developer, don't get old. (I'm still doing it because I like it.)
Wow, I'm sorry to hear about your experience. That must have been an exceptionally crummy time in your life. Here's hoping things have been better for you. If you're ever in the bay area, shoot me an email (in my profile) and I'll buy you a burrito just to hear more of the stories you've got (and hopefully, happier ones).
On Job Titles:
I think it is unfortunate that Senior Engineer is a title that is often now in the middle of the career ladder [0], but I feel that has more to do with the fact that historically most companies didn't have a technical career path like some do now.
Those companies that only have the three levels (Junior, Intermediate, and Senior) are generally speaking not looking for Finders to fill their senior-level positions; their business model just doesn't call for it. At most, they're looking for Solvers who can also give direction to the more junior engineers. In many places that's all you need.
However, I think at companies where technological innovation is a strategic business value, you need to be able to distinguish between different levels of Solvers (Senior vs. Staff engineer) as well as identify and reward Finders (Principal and beyond).
I think it's important to have the larger list of job titles because it communicates valuable information to non-engineers, but separating out this notion of autonomy is also really useful and can help a lot of technical people think about where they want to go with their career. Thank you for sharing.
Maybe my experience isn't typical, but working as a SW developer gives me a lot of autonomy, even if I am "just implementing features".
First off, the requirements are "what should happen", not "how it should be done" (the latter is up to me, or the team).
Secondly, I often end up discussing the requirement with the product owner. What should happen if this happens? Why don't we do it like this instead? It's often a back-and-forth before we arrive at the actual requirements.
Then comes the implementation phase, where it's up to me how to solve it in code.
So in my opinion, I do get a lot of say in what to do, and get to make a lot of decisions - which is why I still love coding, even after 25 years: https://henrikwarne.com/2012/06/02/why-i-love-coding/
I don't know how typical it is, because my experience is more similar to yours, but I've definitely worked at jobs where I was told not only what to do, but also how to implement it (usually by architects).
There are two really good rules for building teams that have worked well for a bunch of companies I've observed and been a part of.
- Only hire people better than you
- Ensure you create a workplace with high alignment and high autonomy
Management's job is not to dictate solutions or to be "the people with the answers". It's their responsibility to make sure the constraints are set properly, and that the environment allows smart people to do their best work.
I've got a few issues with that. First, and I know it's ego-centric, but I don't get a chance to hire many coders who are better than me. I've only met a few of them, IMO. That's partially because it's so hard to judge others, but partially because I honestly don't think they code as well as I do.
Second, just because they aren't better than me right now doesn't mean they can't surpass me in the future. If I rejected people just because they aren't better than me, I'd be denying them the chance to grow and denying the company the chance to help with, and prosper from, that growth.
I'm sure I'd have other objections if I sat and thought about it for a while, but those 2 came to mind rather quickly.
I've switched to using "Only hire people who bring new things to the team" rather than "better than" because the latter assumes that talent is linear. So if your team already had expertise in one area, don't hire in that area - add people who will be coming to the problems you have from a different direction. It's hard to innovate when everyone's just a carbon copy.
"The insight is that alignment needs to be achieved around intent, and autonomy should be granted around actions. Intent is expressed in terms of what to achieve and why. Autonomy concerns the actions taken in order to realize the intent; in other words, about what to do and how."
This only works if you're hiring strictly senior individuals. If you yourself are senior, and you need to staff a team with some more junior individuals so you can build your bench strength, odds are you won't be finding many junior/entry-level people better than you.
But I fully agree with the rest of it. Too many times managers view themselves as "the deciders" vs. supporting their teams and empowering smart people to make their own decisions.
At a certain point once someone is ramped-up, you need to let them sink or swim on their own. If they screw up, you address it, but you need to give them a chance to do that on their own.
That doesn't really make sense, because people "better than you" won't want to work for you. I certainly wouldn't work for someone who I thought had strictly inferior skills.
It makes more sense to frame it as "hire people who can do something that you can't". You should be able to do things that they can't as well -- otherwise you're not bringing anything to the table.
In other words, this is someone who is equally talented but brings a different perspective or skill set. It takes a lot of different people and skills to make anything.
Not necessarily true. People better than you might want to be a part of the company and the problems you are solving more than they care whether their "boss" is more skilled than them.
Also, depending on the role, it might be different skillsets. Perhaps you are a better manager, but they are a better coder. That's often a good fit.
However it does run the risk of hiring ambitious people who expect to grow, and if they feel you prevent them from doing so, and they bring more to the table, it might turn the relationship sour.
I think that, at the end of the day, most companies have no clue how to properly utilize really smart and self-directed people (Finders). And that's at least due, in part, to the way granting people serious autonomy breaks the "command and control" structure that most people are so familiar / comfortable with.
Many (most) of the Finder types are probably better off leaving and launching startups, as opposed to trying to carve out a niche in someone else's company.
I like this framework - it captures different levels of autonomy in different development jobs. As my career has advanced, it's definitely moved from implementer to problem solver, with occasional elements of problem finder. I have a lot of autonomy, but still get to spend most of my time writing code (or rather, thinking about code).
I don't think I want to be a problem finder, at least not yet. Being given business problems and solving them feels like the point where my brain can really engage without feeling like my work is either too vague or too controlled to be important.
Glad this article helped you out - and no worries about needing to shoot for the next level. There are some great advantages to the Solver level - as you've mentioned, there's plenty of space to play & learn without worrying about incredibly vague instructions.
Enjoy it all you want. I can't begrudge someone who's found someplace that makes them happy.
> and no worries about needing to shoot for the next level
I think it's a bit off to characterize them as "levels" in the sense that a "higher" level is "better" than a lower level. My sense is that some people want near complete autonomy (I do, for example), but other people would be very uncomfortable in that kind of role (or at least, ineffective). I suspect this overlaps to some extent with some basic personality characteristics as well as goals / intrinsic motivation.
Still, it's a very interesting and worthwhile article, and I believe this definitely contributes something new and useful to the overall discussion. I'll definitely use some of these ideas in the future when it comes to hiring and what-not.
Three years is more than enough for a good and motivated programmer to become disillusioned and want to be a manager to "escape" or to "do it better". Only to fall into the same traps previous engineers-turned-managers fell into.
One of the things that I've started to say to friends for the past few years is that "The worst thing that I ever did was to turn a hobby into a full-time job."
I mean that only half-seriously. I imagine how much worse it would be to do something I did not enjoy doing previously even as a hobby.
Still, the motivation dies from a thousand cuts, until all that remains is looking forward to having more free time, so that one can finally do really interesting things.
Unless one is afforded John Carmack-levels of autonomy...
This is an awesome insight. In the past I've matched them up this way: coder = implementer, tech lead = solver, architect = finder. It has always been interesting to me to look at the skills that made people good in each of those roles, whether it is attention to detail in implementers, breadth of understanding in solvers, or skill diversity in finders.
Perhaps "problem finder" should be "solution seeker"? Not everything requiring this level of insight/autonomy is a problem (in the sense of a negative); often what needs a solution is a new opportunity for an entity or an individual.
For those developers who are complaining about management anti-patterns, I don't think they necessarily want autonomy. I think they really do want just better managers, but the truth is that, whether or not they want it, they need autonomy to counteract/avoid bad management.
Everyone is flawed, including managers. You can try and "fix" managers, but that generally works out as well as trying to "fix" anyone else that isn't yourself. The best thing to do is to restrict the scope of those who manage you and increase the scope of your own self-management.
However, most people (if they are honest) are their own worst boss. They work for other people because the alternative would be to work for themselves. Having come to that realization a few years ago, I have definitely been working to improve my self-management. A consequence of this is that I come to crave that autonomy.
I also get to be more picky about who I work for: if I do a pretty decent job at managing myself, you're going to have to really impress me if you want me to hand that job off to you instead.
It's not about skill. My brother, who recently started programming, has a lot of autonomy at work because he's trying to help out the business in anything he does there. Same thing has happened to me. Your boss is more likely to let you do your own thing when you actually want to help make the company and your boss succeed. Again, it's not always about skill.
Yeah, I've had a similar experience. Programmers who show that they care about the business tend to get a lot of trust, and therefore a lot of autonomy.
This depends entirely on the person above you. I've found in the past that doing this just results in the person above me either (a) feeling threatened, or (b) relying on me to solve all their problems. Neither is good for career progression.
Just remember, becoming a manager will not necessarily lead to more autonomy; in fact, in most cases I have seen in my career, it leads to even less autonomy (and in many cases more unhappiness).
Middle management sucks because you are stuck between a rock and a hard place. You are expected to produce results by top management, but you can't produce those results yourself; instead you have to cajole those results out of your team.
That is the ultimate in powerlessness: the unreasonable demands from the top, and the stupid developers constantly not working fast / well / hard enough to produce those results. And constant firefighting...
Great article, but I want to rant a bit about this:
>Turns out science agrees on this: People want power because they want autonomy. Most of the time, folks desire to move up the career ladder not for pay, better title, or keys to the executive washroom (are those still a thing?) but because they wish to be able to exercise greater autonomy over their lives. Psychologist Daniel Pink agrees - he’s found that the three qualities that contribute most to workplace satisfaction and overall productivity are autonomy, mastery & purpose.
It's very common to cite psychology to explain human behavior. But I hate that it can often come off as dismissive of a person and their decisions. There's implied irrationality.
Oftentimes the psychological need aligns fully with rational objectives.
If I were to program a robot to work a human job, and I wanted to maximize its generated income, I would try to program it to maintain its own autonomy in that workspace. Because without autonomy, resources are endangered.
An interesting write-up and perspective, yet in my experience it offers no practical or pragmatic design for replacing seniority-based, hierarchical, and heavily ingrained systems. Renaming titles is a start, but it's still wishy-washy. I mean, sure, start from scratch and implement, but can it work in software, or running a bakery, or for a pool-cleaning business? Context is important, and while I enjoy reading 'open mind' type educational riffing (of which this is clearly an example), it's hard to capture any genuine take-away approaches for, well, generalization.