
If I were an evil VP tasked with creating this loophole, I'd tell my project manager we needed a configurable system that let us control emissions for our internal testing environments, or some other plausible and legal requirement. I'd keep the knowledge of the system obscure and communication between dev, testing, and deployment sparse. I'd then have only one person change the configuration before deployment. I'm sure it could be arranged so that no single engineer would realize that the evil configuration was released, and if they did, it could be excused as a mistake.
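A purely hypothetical sketch of how such a knob could hide in plain sight (every name and number below is invented): each piece looks like a reasonable per-environment setting, and only whoever edits the deployment profile ever sees the whole picture.

    # Hypothetical illustration only; all names and values are invented.
    from dataclasses import dataclass

    @dataclass
    class ExhaustConfig:
        profile: str                 # ostensibly for "internal testing environments"
        urea_dosing_factor: float    # 1.0 = full exhaust treatment

    PROFILES = {
        "certification": ExhaustConfig("certification", 1.0),
        "durability":    ExhaustConfig("durability", 0.4),  # "protects the catalyst on long runs"
    }

    def dosing_factor(profile_name: str) -> float:
        return PROFILES[profile_name].urea_dosing_factor

    # One line in a deployment config ("profile: durability") is the only
    # change that ever needs to be made before release.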



If I were an evil VP, I would give my engineers an incentive to do something without giving them a direct order. So you might tell them: if you can improve gas mileage by X%, you get a huge bonus, and I don't care how you do it.


This is how modern companies break the rules. Management doesn't break the rules, either overtly or covertly. Management scrupulously follows the rules, while placing requirements on their workers that can only be met by breaking the rules.

Want your workers to work more, but you don't want to pay overtime or run into trouble with regulations about consecutive hours on the job? Just bump up how much work they have to get done, threaten to fire low performers, and make it clear that under no circumstances is anyone allowed to work overtime. Your workers will start working off the clock, and better yet they'll hide it from you, so you can legitimately plead ignorance if the law comes after you.

Want to cut corners on safety to save money? Tell your people that safety is the top priority but you need to see an X% reduction in costs, and it works itself out. They may fudge or falsify metrics, but if you're really lucky they'll find loopholes in the metrics instead.

(I have a friend who worked at a warehouse and fell victim to this. They were officially big on safety, which included bonuses for everyone if they went a certain period of time without any safety incidents. Unofficially, this meant that incidents wouldn't be reported unless it was unavoidable. The one way to ensure that an incident had to be reported was to see a doctor for your injury, so people were heavily encouraged to wait to see if they got better on their own before they got medical attention, which often made things much worse. I'm sure upper management's metrics looked great, though.)

As an added bonus, this sort of thing gives you a lot more control over workers. If you want to get rid of a troublemaker, do a little digging and you'll surely discover that they're violating safety rules or working off the clock or whatever. Everybody is, but selective enforcement is a wonderful thing.


This is also how we end up with "crazy" regulations. Case in point: nuclear waste handling. Engineers create complex six-sigma safety plans, which are then slowly eroded because, after all, the regulations and safety protocols are wacky overkill, right?

The WIPP nuclear isolation site had a 15-year run, so management began to cut corners. Then three unlikely events all lined up and nearly got people killed. It started with a truck catching fire, which prompted operators to bypass the HVAC's filtration system. They stopped the bypass for a few days to perform maintenance on the only underground radiation detection unit. That unit gave a false alarm during testing, but was fixed and placed back into service, so they started ventilating again. Then, around midnight, a cask was breached because someone upstream had used organic kitty litter instead of clay kitty litter. The operator assumed it was a false alarm due to the previous false positive and kept things running. It wasn't until the next morning that they realized they were blowing radioactive particles above ground.

Had management kept up maintenance, enforced protocol, or done more than the absolute bare minimum (e.g. installing multiple underground radiation detection units), US taxpayers could have avoided paying $500 million. And this isn't a one-off thing; there are dozens of instances just like this where the US dodged a bullet. [0]

0: https://lessonslearned.lbl.gov/Docs/2091/OES_2015-02%20-%20R...


What a fascinating DB you linked to. I'd never known these types of things were public or common.


Seems like the "organic" fad was the problem.


The problem was substituting an unsuitable material because of mistakes when revising procedures and insufficient review of the revisions. The fact that the mistake involved an "organic" product is coincidental.


You have made some very strong points - thanks for putting it all together.

I would like to ask: what do you think an ordinary employee should do when they see behavior like you describe from their management? How can they effectively protect themselves (and their colleagues) from this kind of treatment?


I've just observed this, not experienced it, so I'm not sure. It's far easier to see the problem than figure out a solution!

Much will depend on your job prospects and financial position. If you're a fancy programmer type who's constantly bugged by recruiters, move on until you find an ethical company. If you need this job to eat, you'll have to be a lot more careful.

In general, I'd say:

1. Point out the impossibility of the requirements to management. Gently if need be. It's possible they don't realize what they're doing.

2. Contact the local department of labor or whatever regulatory agency would be interested in what's going on. They may be able to take action if management is pushing violations in a quiet way like this. If not, they may be able to at least take action against the workplace if people have started breaking the rules.

3. If you can afford to risk the consequences, follow the rules as much as you can. Don't work off the clock, don't break safety rules, etc. If being fired will make you homeless then maybe this isn't an option.

4. Document everything. If regulators weren't interested originally, they may be interested once you can show a pattern. Upper management may be blissfully ignorant, and you may be able to get them involved once you can show them what's going on. Whatever happens, if things come to a head then it will probably be useful to be able to demonstrate that this wasn't your own doing.


The list of 4 items is excellent. Don't act in an insubordinate manner until you've exhausted other channels of communication. Acting before things get too far is the easiest remedy.

I'd recommend the following order of operations:

Convey your concern over the associated risk to your immediate manager: verbally first, during a meeting; switch to written form (email, a paper trail of opposition) if no action is taken.

When documenting the paper trail, simply reference the meetings in which you voiced opposition. Your own notes should be able to back up the talking points you're referencing.

Being asked to briefly switch hours or work late is often covered in your job description, so stopping suddenly at 5pm can be considered insubordinate. That time should be compensated promptly, whether as time off or as paid overtime. If your verbal requests go without action, again, switch to email.

Simply documenting events as they occur makes it easy when you need to go above your immediate supervisor (a more senior manager, corporate HQ, the department of labor) for help.

The key is to be polite in all interactions. Innocent mistakes happen. Managers are under deadlines too. The paper trail should be maintained regardless of action or inaction.


This is why the financial regulators in the UK have changed to consider the corporate culture when dealing with breaches. A company whose senior staff "live a compliance culture" will be penalised less for the same breach than one operating as described above.

Now you can argue efficacy, being able to pull the wool over the eyes of regulators etc but I think it is a good direction to head in.

All of that is way beyond the industry described here, with its institutionalised cheating on tests; it sounds more like the graphics card industry than one with regulators.


Just imagine how Facebook must have (initially) pushed the datr cookie on its engineers: "We only need to track everyone like this to protect against DDoS attacks."

Still, even such an excuse should have raised alarm bells, but I assume most developers would just shrug their shoulders and develop the feature anyway, as they would've liked to keep their well-paying job and juicy stock options.

In reality, Facebook only recently used the DDoS protection excuse for its datr cookie, well after it announced that the cookie would be used for advertising purposes, which also happened a few years after the cookie was introduced.

I imagine that whatever Facebook told developers back then was even less subtle than "we're using it for security purposes", and that most of the developers figured out right then that the datr cookie would one day be used to track users across the web for advertising purposes.
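For readers unfamiliar with the mechanics, here is a minimal sketch (not Facebook's code, just an illustration using Flask) of how a cookie like that follows logged-out users around the web: the cookie is set on the social network's own domain, and every third-party page that embeds a widget triggers a request back to that domain carrying both the cookie and the embedding page's Referer.

    # Minimal illustrative sketch, not Facebook's implementation.
    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/widget/like")
    def like_widget():
        visitor = request.cookies.get("datr")           # same cookie whether or not you're logged in
        referer = request.headers.get("Referer", "?")   # the page that embedded the widget
        app.logger.info("visitor %s seen on %s", visitor, referer)
        resp = make_response("<button>Like</button>")
        if visitor is None:
            # First sighting: mint an identifier with a multi-year lifetime.
            resp.set_cookie("datr", uuid.uuid4().hex, max_age=2 * 365 * 24 * 3600)
        return resp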


I'm pretty sure people who work for Facebook are completely aware of the company's methodologies and are completely OK with it.

The conversation would be more along the lines of "We need to track people who aren't logged in, suggestions?"


Yeah; in weapons, it might go like this:

* Govt research agency awards contract to study smallpox, including stockpiling smallpox. Researchers are happy, they're protecting the world.

* Govt weapons agency gets notice from govt research agency that "it's ready now."

* Govt weapons agency confiscates smallpox stockpile and data, makes more, spoons it into the tips of missiles.

The original researchers are more or less legitimate victims, whose goal of helping humanity was used to obscure the govt weapons agency's goal of killing humanity.


Great point, it all depends on how the problem is framed.



