I think the key phrase here is "low probability, high impact event". For a startup, ceasing to exist because nobody cares about your company is a high probability, high impact event. That's why many new products have gaping privacy problems. Also security flaws, technical debt, and other symptoms of insufficient planning and investment.
The challenge is that for a company with a good product, the highest risk threats (no product, crap product, nobody cares about product, etc) are mitigated. Then these lower-probability high-impact risks come to the front. This tends to happen faster than most small companies realize - like the first day there's any meaningful customer traction.
Suddenly, a team that's lived by 'move fast and break things' needs to catch their breath and pay down some debt - privacy, code quality, security, etc. Nobody enjoys that, so it's easy to postpone it too long.
Nicely said. One point to add: bolting on security/privacy after the fact generally results in failure. The "highest risk threats" you mention push companies toward deferring their technical debt until later, if they survive, but this just makes the after-thought bolt-on failures more probable.
I would argue that devs only think about what their cultures encourage them to think about. These days developer cultures value "do more faster", so developers think about speed. Until that changes, privacy will be a second class citizen.
s/privacy/qa or your discipline of choice that is seen as "slowing down" development. While most devs should be thinking about this, you're correct in that their culture generally doesn't encourage them to.
While there are some similarities between QA and privacy in terms of technical debt and slowdowns, I don't think they are the same. Privacy is much more elusive, harder to see immediate value in, and subject to interpretation. QA can prove its value by finding real bugs, which is much harder to argue against.
1.03: Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment. The ultimate effect of the work should be to the public good.
3.12:
Work to develop software and related documents that respect the privacy of those who will be affected by that software.
In Australia we also have the Privacy Act, which creates very strict regulations on the collection, storage, and use of information about individuals: http://www.privacy.gov.au/