
I am really curious about how you test software...



It's a little different in software. If I'm writing a varint decoder and find that it works for the smallest and largest 65k inputs, it's exceedingly unlikely that I'll have written a bug that affects only some middling number of loop iterations yet somehow handles the already-tested transitions between iteration counts just fine.
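A rough sketch of the kind of spot-check I mean (LEB128-style unsigned varints; the encoding and the exact bounds are just illustrative):

    def encode_varint(n: int) -> bytes:
        """Encode a non-negative integer as a little-endian base-128 varint."""
        out = bytearray()
        while True:
            byte = n & 0x7F
            n >>= 7
            if n:
                out.append(byte | 0x80)  # set the continuation bit: more bytes follow
            else:
                out.append(byte)
                return bytes(out)

    def decode_varint(buf: bytes) -> int:
        """Decode a little-endian base-128 varint back into an integer."""
        result = 0
        for i, byte in enumerate(buf):
            result |= (byte & 0x7F) << (7 * i)
            if not byte & 0x80:
                return result
        raise ValueError("truncated varint")

    # Round-trip the smallest and largest 65,536 values of the 64-bit input space.
    # The low range crosses the 1->2 and 2->3 byte encoding boundaries; the high
    # range pins down behavior at the very top of the domain.
    U64_MAX = 2**64 - 1
    for n in list(range(65536)) + list(range(U64_MAX - 65535, U64_MAX + 1)):
        assert decode_varint(encode_varint(n)) == n
    print("spot-check passed")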

For a system you don't understand at all, especially when prior work on such systems suggests a propensity for extremely hairy bugs, spot-checking the edge cases doesn't suffice.

And, IMO, bugs are usually much worse the lower down in the stack they appear. A bug in the UI layer of some webapp has an impact, and a time to fix, proportional to that bug and that bug alone. Issues in your database driver are insidious, resulting in an unstable system that's hard to understand and potentially costing countless hours fixing or working around that bug (if you ever find it). Bugs in the raw silicon that, e.g., affect only one pair of 32-bit inputs (in, say, addition) are even worse. They'll be hit in the real world eventually, they won't be easy to handle, and yet it's usually not practical to sweep a 64-bit input space (certainly not for every chip, if the bug comes from analog mistakes in the chip's EM properties).
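To put rough numbers on that last point (the throughput figure is an assumption on my part): even at a billion checked additions per second, an exhaustive sweep of a 64-bit input space takes centuries per chip.

    # Back-of-the-envelope: exhaustively testing every pair of 32-bit adder inputs.
    pairs = 2 ** 64                     # every (a, b) with 32-bit a and b
    tests_per_second = 10 ** 9          # assumed throughput: 1e9 checked additions/sec
    seconds = pairs / tests_per_second
    years = seconds / (60 * 60 * 24 * 365)
    print(f"~{years:,.0f} years")       # roughly 585 years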


Literally no piece of software is bug-free. Not one. What are you talking about? Of course it’s impossible to test all inputs, because there are going to be inputs you can’t even conceive of at design time. What if your application suddenly runs at 1,000,000x the intended speed because hardware improves that much? How do you test for that?


Hardware doesn’t change over time…


Yes it does. It ages. But even if it doesn't, my point still stands. Or are you insinuating that the engineers over at Intel, AMD and Apple don't know what they're doing? Clearly their CPUs aren't flawless and still have bugs, like Spectre/Meltdown.


It deteriorates; it doesn't change. The functionality is still there, and no modern hardware deteriorates to a failing state before it becomes obsolete. Yes, I am insinuating that the engineers at Intel, AMD, Apple and Nvidia are incentivized to prioritize expedient solutions over developing more robust architectures, as evidenced by vulnerabilities like Spectre and Meltdown.


print("No Bugs!")


Depending on the language, this simple code actually has a bug:

https://blog.sunfishcode.online/bugs-in-hello-world/
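If I remember the post right, one of the bug classes it covers is that the write to stdout can fail (full disk, closed pipe) without the program ever noticing. A guarded version in Python might look something like this (the structure and the error handling are just for illustration):

    import sys

    def main() -> int:
        try:
            print("No Bugs!")
            sys.stdout.flush()  # surface the write error here rather than at interpreter exit
        except OSError:         # e.g. ENOSPC when stdout is /dev/full, or a broken pipe
            print("write to stdout failed", file=sys.stderr)
            return 1
        return 0

    if __name__ == "__main__":
        sys.exit(main())

On Linux, redirecting stdout to /dev/full is an easy way to make the write fail and compare the two versions.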



