Does regulation actually improve software security in practice?
I'm skeptical that it does. I've seen a few times how regulated software (certain billing systems) got certified, and it was a bad joke. Maybe you have different experience, though.
If you want government bodies to spend taxpayers' money on something, I'd rather suggest funding security researchers who actively attack systems and cooperate with manufacturers on fixing the discovered issues (and you can legally mandate such cooperation). That might actually work and improve end-user security. Although you'd have to somehow audit that those researchers are actually doing something...
> Does regulation actually improve software security in practice?
Yes, it can. DO-178B is a widely used security standard in military equipment. Certification under it is difficult and expensive to obtain, and it caters to fighter jets, not cell phones, but there is precedent for real technical security improvements coming through government programs.
Hang on, is this DO-178B _different_ from the DO-178B (now replaced by DO-178C) that is about _safety_ in aircraft systems? It isn't, is it? So you're talking about _safety_ when the parent and the article are specifically about _security_.
Because confusing safety and security is exactly the kind of awful goof we're talking about here. I'm sure these car alarms are _safe_; the problem is they don't keep your car _secure_.
The security record of military procurements is... not good. Same for the financial industry. Do a bad job, hope nobody finds out, and if they do, insist you didn't do a bad job and hope nobody who understands the difference is empowered to do anything about it.
Let's take something easy: communications. Your generic Android phone is capable of secure voice communications over a distance, subject only to traffic analysis and other inevitabilities.
The British infantry have Bowman. At squad level it's unencrypted spread spectrum voice radio. So, much worse than that Android phone. A sophisticated bad guy (so, not some random bloke who decided to join ISIS last week, but say, a Russian armoured division you've been deployed to counter) can literally listen to everything you say, seamlessly, without giving away their position. Brilliant.
Now regulation _can_ improve things by mandating something that people who know what they're doing already recommend. But you're not going to get there with things like DO-178B.
> Hang on, is this DO-178B _different_ from the DO-178B (now replaced by DO-178C) that is about _safety_ in aircraft systems? It isn't, is it? So you're talking about _safety_ when the parent and the article are specifically about _security_.
It's about both. The DO-178B/C standards require software to work as formally specified, and require robust branch testing to show the code conforms to the spec. This means different things for different applications, but for an operating system, for example, it means that no process can affect any other process if the two are deemed independent. That requires the OS to prevent all covert channels and strictly limit memory, CPU time, and so on. For example, a fork bomb wouldn't be able to affect other processes, and caches are wiped on every context switch.
So yes, it is definitely relevant to security as well as safety (which in military aerospace go hand-in-hand anyway).
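To make the partitioning point concrete, here's a minimal sketch (my own illustration, not anything mandated by DO-178B/C) of the fork-bomb containment idea on Linux: cap the process budget with setrlimit so a runaway fork loop fails instead of starving everything else. A certified partitioned OS enforces budgets like this in the kernel or hypervisor rather than from inside the process, but the spirit is the same.

```c
/*
 * Minimal sketch of the "fork bomb can't hurt its neighbours" idea,
 * using a per-user process cap on Linux. Illustration only: real
 * partitioned operating systems enforce such budgets in the kernel
 * or hypervisor, not via rlimits set by the process itself.
 */
#include <stdio.h>
#include <stdlib.h>
#include <sys/resource.h>
#include <sys/types.h>
#include <unistd.h>

int main(void) {
    /* Hypothetical budget: at most 64 processes for this real user ID. */
    struct rlimit lim = { .rlim_cur = 64, .rlim_max = 64 };
    if (setrlimit(RLIMIT_NPROC, &lim) != 0) {
        perror("setrlimit(RLIMIT_NPROC)");
        return EXIT_FAILURE;
    }

    /* Try to fork far past the budget. Once the cap is hit, fork()
     * fails with EAGAIN instead of dragging the whole machine down. */
    for (int i = 0; i < 1000; i++) {
        pid_t pid = fork();
        if (pid < 0) {
            fprintf(stderr, "fork() refused at child %d: budget enforced\n", i);
            break;
        }
        if (pid == 0) {
            sleep(10);   /* child just occupies a slot, then exits on its own */
            _exit(0);
        }
    }
    return 0;
}
```

Run it as an ordinary user and the loop stops as soon as fork() starts returning EAGAIN; the rest of the system keeps running. The covert-channel and cache-wiping requirements obviously need kernel support and can't be shown in a user-space demo like this.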
It might be interesting to require a bug bounty program with high enough reward values. If companies want to indemnify themselves against the potential payouts, insurance companies could step in with their own standards and testing methods.