
What's "critical software"? Software controlling flight systems in planes is already held to very high standards, but is enormously expensive to write and modify.

In this case it seems most of the software which is failing is dull back office stuff running on Windows - billing systems, train signage, baggage handling - which no one thought was critical, and there's no way on earth we could afford to rewrite it in the same way as we do aircraft systems.




Something that has managed to ground a lot of planes and disable emergency calls today is in fact critical. The outcome of it failing proves it is critical. Whatever it is.

Now, it may be that it was not previously known to be critical, and whether we should have realised its criticality is debatable. But going forward we should learn something from this: think harder about cascading failures and classify more things as critical.

I have to wonder how the failure of billing and baggage handling has resulted in 911 being inoperative. I think maybe there's more to it than you mention here.


Agreed, there is no such thing as perfect software.

In the physical world you can specify a tolerance of 0.0005 in, but the part is going to cost $25k apiece. It is trivially easy to specify a tolerance, very hard to engineer a whole system that doesn't blow the budget, and often impossible to fund it at all.

Great software architectures are the ones that operate cheaply but stay robust when individual components fail. https://en.wikipedia.org/wiki/Chaos_engineering
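
A minimal sketch of that idea (the service, the SKU and the fallback value are all hypothetical, just to show the shape of a chaos experiment): inject a dependency failure on purpose and check that the caller degrades instead of crashing.

    # Minimal chaos-style check, all names hypothetical: deliberately fail a
    # dependency and assert the caller degrades gracefully instead of crashing.

    CHAOS_FAIL_UPSTREAM = True  # the injected fault


    def fetch_price_from_upstream(sku: str) -> float:
        # Stand-in for a network call to a pricing service.
        if CHAOS_FAIL_UPSTREAM:
            raise ConnectionError("upstream pricing service unavailable")
        return 9.99


    def price_for_display(sku: str) -> str:
        # Resilient path: fall back to a degraded answer rather than let the
        # failure propagate all the way to the user.
        try:
            return f"${fetch_price_from_upstream(sku):.2f}"
        except ConnectionError:
            return "price unavailable"


    if __name__ == "__main__":
        assert price_for_display("SKU-123") == "price unavailable"
        print("degraded gracefully under injected failure")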


Given how widespread the issue is, it seems that proper testing on Crowdstrike's part could have revealed this issue before rolling out the change globally.

It's also common to roll out changes regionally to prevent global impact.

To me it seems Crowdstrike does not have a very good release process.
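
For what it's worth, a staged rollout gate is not complicated in outline. A rough sketch (the ring names, fleet fractions and health check are invented for illustration; this is not CrowdStrike's actual process):

    # Hypothetical ring-based rollout: push an update ring by ring and stop
    # promoting as soon as a health check fails. Names and thresholds are
    # made up for illustration.
    from dataclasses import dataclass
    from typing import Callable, Sequence


    @dataclass
    class Ring:
        name: str
        fraction_of_fleet: float  # e.g. 0.01 = 1% of hosts


    def staged_rollout(rings: Sequence[Ring],
                       deploy: Callable[[Ring], None],
                       healthy: Callable[[Ring], bool]) -> bool:
        """Deploy to each ring in order; halt on the first unhealthy one."""
        for ring in rings:
            deploy(ring)
            if not healthy(ring):
                print(f"halting rollout: ring '{ring.name}' failed health check")
                return False
            print(f"ring '{ring.name}' healthy, promoting further")
        return True


    if __name__ == "__main__":
        rings = [Ring("internal-canary", 0.001),
                 Ring("early-adopters", 0.01),
                 Ring("region-emea", 0.30),
                 Ring("global", 1.00)]
        # Fake deploy/health callbacks; the second ring fails on purpose.
        staged_rollout(rings,
                       deploy=lambda r: print(f"deploying to {r.name}"),
                       healthy=lambda r: r.name != "early-adopters")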


> but is enormously expensive to write and modify.

We're talking about critical software. If we can't afford to reach the level of safety needed because it's too expensive, well so be it.

Besides, the enormously expensive flight systems don't seem to make my plane ticket expensive at all...


There's only one piece of software which (with adaptations) runs every Airbus plane. The cost of developing and modifying that -- which is enormous -- is amortized over all the Airbus planes sold. (I can't speak about Boeing)

What failed today is a bunch of Windows stuff, of which there is a vast amount of software produced by huge numbers of companies, all of very variable quality and age.


I meant "critical software" as shorthand for something like: the quality of software should be proportional to the amount of disruption caused by downtime.

Point of sale in a records store, less important. Point of sale in a pharmacy, could be problematic. Web shop customer call center, less important. Emergency services call center, could be problematic.


I, as a producer of software, have effectively no control over where it gets used. That's the point.

Outside of regulated industries it's the context in which software is used which determines how critical it is. (As you say.)

So what you seem to be suggesting (effectively) is that use of software be regulated to a greater/lesser extent for all industries... and that just seems completely unworkable.


What you're describing is a system where the degree of acceptable failure is determined after the software becomes a product, based on how important the buyer turns out to be. That is backwards and unworkable.


It isn't, though. "You may not sell into a situation that creates an unacceptable hazard" is essentially how hazardous chemical sale is regulated, and that's just the first example that I could find. It's not uncommon for a seller to have to qualify a buyer.


I think the system is rather one where, if you offer critical services, you're not allowed to use software that hasn't been developed to a particular high standard.

So if you develop your compression library it can't be used by anyone running critical infra unless you stamp it "critical certified", which in turn will make you liable for some quality issues with your software.


I assume you mean "if the buyer will use the software in critical systems."

That's very realistic and already happens by requiring certain standards from the resulting product. For example, there are security standards and auditing requirements for medical systems, payment systems, cars, planes, etc.


> Software controlling flight systems in planes is already held to very high standards, but is enormously expensive to write and modify.

Here's something I don't understand: those jobs pay chump change compared to places like FB, and (afaik) social networks don't have the same life-or-death context.


Hence, Windows should blue/green kernel modules and revert to a past known-good version if things break.
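
A toy sketch of the idea, in Python for brevity (this is not how Windows actually manages drivers, and the version strings are invented): keep two slots, count failed boots, and fall back to the last version that booted cleanly.

    # Illustrative only: two driver "slots" plus a failed-boot counter,
    # reverting to the last known-good version after repeated crashes.
    # Windows does not expose driver management like this.

    MAX_FAILED_BOOTS = 2


    class DriverSlots:
        def __init__(self, known_good: str, candidate: str):
            self.known_good = known_good  # last version that booted cleanly
            self.active = candidate       # newly installed version, tried first
            self.failed_boots = 0

        def record_boot(self, booted_ok: bool) -> None:
            if booted_ok:
                # Candidate survived a boot: promote it to known-good.
                self.known_good = self.active
                self.failed_boots = 0
                return
            self.failed_boots += 1
            if self.active != self.known_good and self.failed_boots >= MAX_FAILED_BOOTS:
                # Crash loop on the new version: revert automatically.
                self.active = self.known_good
                self.failed_boots = 0


    if __name__ == "__main__":
        slots = DriverSlots(known_good="driver-1.0", candidate="driver-1.1")
        slots.record_boot(False)  # first crash
        slots.record_boot(False)  # second crash -> automatic rollback
        assert slots.active == "driver-1.0"
        print("reverted to", slots.active)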


Would not shock me for AV companies to immediately work around that if it were to be implemented. “You want our protection all of the time, even if the attacker is corrupting your drivers!”


It seems like the kernel module was faulty for some time; the update only changed the input data for the module.
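
If so, the defensive fix is for the module to validate the content file before acting on it. A toy illustration (the magic value, header layout and field names are invented; the real file format isn't public):

    # Toy illustration of "don't trust the data file": validate a content
    # update before the component that consumes it ever touches it.
    # The format shown here is invented, not the real one.
    import struct

    EXPECTED_MAGIC = b"RULE"
    HEADER = struct.Struct("<4sI")  # magic + declared payload length


    def parse_content_update(blob: bytes) -> bytes:
        """Return the payload, or raise ValueError instead of crashing later."""
        if len(blob) < HEADER.size:
            raise ValueError("truncated header")
        magic, length = HEADER.unpack_from(blob)
        if magic != EXPECTED_MAGIC:
            raise ValueError("bad magic, refusing to load")
        payload = blob[HEADER.size:]
        if len(payload) != length:
            raise ValueError("declared length does not match payload")
        return payload


    if __name__ == "__main__":
        # A malformed blob (here, all zero bytes) is rejected cleanly instead
        # of being handed to the consumer.
        try:
            parse_content_update(b"\x00" * 4096)
        except ValueError as e:
            print(f"rejected update: {e}")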


Crowdstrike should have higher testing standards, not every random back-office process.


> Software controlling flight systems in planes is already held to very high standards, but is enormously expensive to write and modify.

Boeing disagrees.



