I find myself quite torn by this statement, because while I don't think hacking tools should be illegal, this is the exact same argument pro-gun people make, and I'm quite anti-gun (I'm from the UK). Then again, in America drug paraphernalia is illegal, but in the UK it's not, and I personally don't think you should be locked up for having a bong because you might theoretically only use it for tobacco.
Anyway, personally I'm quite adamant in the blanket statement that no software should be illegal.
I am anti-gun, so while you may be right in general, you shouldn't presume that about the person you're replying to.
I'm not a moron; I don't know if we could ever learn to live without armies or armed police, and I'm not calling for a drastic change such as "the UK should give up all armed forces". Obviously it doesn't work that way.
But private gun ownership poses the same problems. For example, how would you set about making it illegal in the US, given how many people already own guns? It would be a crazily difficult task. That doesn't stop people from being in favour of finding a way to do it, though.
Pretty easy, and superficially appealing, but I find that there must surely be exceptions to a rule such as this. Nuclear weapons (and similarly, some chemical weapons) are the obvious special case.
Edit: a comment below reminded me that child pr0n is another case the general moral consensus has problems with (I refer to the 'consensus' not because I think it is always correct but simply because it provides cases worth thinking about), and also handling stolen goods (although that one can be justified through property rights).
Seems kind of like the "Godwin's law" of rule of law. The fact of the matter is that we simply don't encounter these sorts of situations often enough for me to justify using them to set policy.
Seems to me that in that case, banning owning this hypothetical something would be redundant. Just use the laws that already ban doing whatever it is that they are doing.
Of course, your mileage may vary. I also support repealing intoxicated driving laws, since I reason that reckless driving laws already cover that sort of behavior. For reasons that I don't really understand myself, people tend to think I'm off my rocker there too ;)
Let's not compare gun control with software. They're not really the same thing at all.
More importantly, if a security professional has never created any kind of malware, he or she is probably pretty bad at infosec. The two fields are just two sides of the same coin.
Internet activity to a significant extent does not have national borders.
Mexico might be a good if rather simplistic "gun" analogy. Tons of illegal weapons flowing over from the U.S., arming drug runners and other criminals who are terrorizing (hmm, I suddenly realize the additional nuances that that word carries these days) the general population.
Back to computers: You can't make secure systems without having appropriate tools and research at your disposal. And we've yet to see any security effectively "legislated", especially world-wide.
So, make the jobs of those who are effective difficult or impossible -- or highly restricted and privileged through special sanctioning and/or the requirement of having very significant capital, investment, and influence -- while gaining no real security advantage. Yeah, that sounds like a good plan.
Lockpicking tools are illegal, and yet locksmiths and companies that make locks are allowed to have them. That seems pretty directly analogous; I see no particular reason why we couldn't have licensed and bonded digital security experts/companies.
Bit of a nitpick (and I'm not sure where you live), but lock picks are generally legal in the US. In some areas, however, they become illegal if malicious intent can be proven.
That's all fine and good if you only take companies and security consultants into account, but I'm not sure it's 100% analogous. What about the random hypothetical geeky teenager who wants to contribute security patches to an open-source project? I don't think there's a lockpick equivalent of that.
There is an analogy: Locksporting. Groups like Toool and individuals across the world pick and design locks as a hobby, and have shown flaws in high-security designs like Medeco that were later corrected.
Funny that that quote should come from Alexander Hamilton, who was a leader in the federalist movement to consolidate and centralize power in the US government.
"Power tends to corrupt, and absolute power corrupts absolutely." - Lord Acton
The fun thing about it: the German BND (their CIA equivalent) lets German developers develop hacking tools via SSH or RDP on boxes that sit in other countries, to circumvent that law.
I'll provide a link as soon as I find a source other than one of the hackers I know.
I do believe my point was that no law is free of unintended consequences, particularly not your law that outlaws laws with unintended consequences, thus your law outlaws itself.
The outlawing of the law against laws with unintended consequences by the law against laws with unintended consequences would undoubtedly be considered yet another unintended consequence of the law against laws with unintended consequences.
Just make the law "any law with unintended consequences is illegal", then embrace any possible "consequence" of this law (such as perhaps laws you otherwise like being declared illegal), then the law should fail to outlaw itself. Any consequences would be by definition intended.
Anyway, I don't buy the suggestion that no law can be free of unintended consequences. If you construct a sufficiently formal definition of the system of law, and each law itself, then making a law without unintended consequences is simply (lol :P) a matter of proving the law. Quite similar to how programs can in fact be proven correct (contrary to the popularly held opinion that no program can be free of flaws.)
Is this practical? With our current setup, no. Hypothetically? Maybe... certainly at least worth pursuing the idea I would say.
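To make the program-proof analogy above concrete, here is a toy sketch (my own illustration, not part of the original argument) in Lean 4: a trivial program together with a machine-checked proof that it meets its specification, which is the sense in which "programs can in fact be proven correct".

```lean
-- A trivially "specified" program: double a natural number.
def double (n : Nat) : Nat := n + n

-- A machine-checked proof that the program meets its spec:
-- double n always equals 2 * n, for every natural number n.
theorem double_correct (n : Nat) : double n = 2 * n := by
  unfold double
  omega  -- linear arithmetic over Nat closes the goal
```

The analogy to law would be writing statutes in a similarly formal system, where "unintended consequence" becomes a provable (or disprovable) property rather than a surprise.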
Wouldn't it be simpler and more efficacious to simply ban sales of Windows in the EU, or mandate that they fix the security issues?
Not that I favor ludicrous bans of this sort, or that I think they will work. Because I manifestly don't. But geez, if you're going to be over-the-top Orwellian, at least do something that has a chance of achieving your stated goals.
It seems naive to assume that banning Windows would decrease the rate of successful malicious attacks on machines. Every piece of software has holes, and the largest of them is the user. If everyone in the EU switched off of Windows, you'd just have a large percentage of the population using Linux or OS X without understanding how security works on those systems (many of whom would gladly enter their root password to install a spyware program, so long as it lets them keep playing FarmVille or whatever it promises to do).
What about vulnerability testing software? In principle those can be used as attack tools.
Maybe a line can be drawn... Design kits for viruses come to mind. But even then, it's a fine line, and history has shown that once a mechanism is in place to outlaw something, it will be extended and abused to apply to things that were not originally targeted.
Yes, it is illegal to financially damage a company, and many crackers do exactly that. This article and most of the comments here argue about the tools. As hackers we find it hard to understand why a hammer could be outlawed because it is good at breaking through the windows of houses.
Why does no one talk about the network that was broken into? Why does the general public believe that crackers are so good at their job it is impossible to secure a computer system? There are two possibilities that I can see here.
1. Most cracks happen because of a less-than-perfect system administrator. Either some subtle problem with a configuration file opened up a hole for the cracker or nobody bothered securing the network to begin with.
2. Most cracks happen because crackers have found a reliable method of discovering 0day exploits or our current computing model is fundamentally insecure.
In either case, I find it unjustifiable to declare cracking an act of terrorism without spending ANY effort reflecting back on our own security. If millions of us routinely use the same password (or an easy-to-guess pattern) for all of our accounts, who is the terrorist? The people who take advantage of an easy opportunity, or the people who created that opportunity in the first place?
It is well known that users are stupid, and that two-factor authentication is much harder to break than static passwords. Bruce Schneier has been saying so for at least a decade. Why have we not moved on? If you're a system administrator, it should be an act of terrorism NOT to make two-factor authentication the DEFAULT way of using your service.
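For what it's worth, the common form of two-factor code (TOTP, RFC 6238) is simple enough to sketch with just the Python standard library. This is a minimal illustration of why it beats a static password, not anyone's production implementation: each code is derived from a shared secret AND the current 30-second time window, so a stolen code goes stale almost immediately. The secret below is the demo value from the RFC's test vectors.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """One-time code per RFC 6238: HMAC-SHA1 over the current time window."""
    counter = int(timestamp) // step                # which 30-second window
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"   # demo secret from RFC 6238's test vectors
print(totp(secret, 59))            # -> 287082 (same inputs, same code)
print(totp(secret, 1111111109))    # -> 081804 (later window, different code)
```

An attacker who keylogs one code gets roughly 30 seconds of use out of it, versus indefinite use of a reused static password.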
While I can't really see legitimate uses of some of the "hacking tools" - viruses, botnets, rootkits (yes, you, Sony!), etc. - I can't get rid of the feeling that there is another hand trying to get a grip on the free land of Internet, and I really don't like that.
On a completely tangential matter, I have a feeling this is going to be another one of those laws that cost a lot of money and have little to no effect... at least no positive effect.
"""
Member States shall take the necessary measures to ensure that the production, sale, procurement for use, import, possession, distribution or otherwise making available of the following is punishable as a criminal offence when committed intentionally and without right for the purpose of committing any of the offences referred to in Articles 3 to 6:
(a) a device, including a computer program, designed or adapted primarily for the purpose of committing any of the offences referred to in Articles 3 to 6;
(b) a computer password, access code, or similar data by which the whole or any part of an information system is capable of being accessed.
"""
Which seems to be saying that, say, nmap isn't illegal, unless you download it with the intent to run it against a machine you're not supposed to, in which case you've broken the law (even if you never actually use it). Kind of like laws against 'burglary tools' in some parts of the US, the crime seems to be based on context/intent.
(Obvious disclaimer about how I'm not a lawyer, European, or a unicorn.)
If you leave your wallet on the street in a bad neighborhood and come back, you'll probably never see it again.
The problem with such protection laws is that they don't take into account the ignorance or incompetence of service providers. They also hold back innovation, and we end up with less security. Even if these vulnerable companies don't have the expertise in-house, they can hire a reputable security company to audit their systems and plug the gaping holes.
Do we need to pass laws requiring companies to do security audits? Maybe for listed companies, or companies whose services reach a certain size, since they'll try to skimp on costs or their executives won't understand IT needs.
Trying to criminalize the intent of developers even if they create tools solely for cracking is a slippery slope. While we're at it we should make defense contractors liable for war damages and execute the engineers responsible for creating weapons.
In Japan, a closed-source P2P program called Winny caused a lot of disorder: viruses spread over it, and lots of government information and embarrassing private pictures leaked onto the net due to security issues. Unfortunately, the developer was busy fighting a trial over whether he had intended his software for copyright violation (he was finally acquitted on appeal to a higher court). If he had at any point publicly endorsed copyright violations, he'd probably have been locked up for a long time even if he hadn't violated a single bit of copyright himself. Needless to say, the project is abandoned and full of holes. Good for the anti-virus industry, though.
There's a sensible reason for implementing a law of this kind - if they catch the guy that wrote Zeus, I'd like them to be able to prosecute him (not that they could, as he's probably not in the EU, but you get the idea). Of course, it does need to be carefully written to avoid collateral damage.
In the UK, it's illegal in many places to sell knives to under-18s. Yet you'll have to live on your own if you go to school away from home before that age... Another silly rule that doesn't actually stop people from stabbing each other.
This is already the case in the Netherlands. Hacking tools are only allowed for private use or research, e.g. for checking the security of your own network. Possession of hacking tools with the intention to harm other people's systems is not allowed.
These sorts of laws need to include exceptions for tools that have a non-criminal purpose. Otherwise, a broad reading could include things like NetCat, Curl, and Apache Bench.
Which would render the whole shebang useless. All tools used for breaking into computers have legitimate uses for security professionals, not the least of which is penetration testing.
Oh, whew. False alarm. He has the standard "EU-IDE: Certified Developer Edition". Gonna have to write him up for not updating to the latest version, though.
This seems more like one of those ideas that ends up as a law used to slap people a second time when they're nabbed for something else, rather than something that would be enforced on its own.
Meaning if you're a software developer or system admin in the EU, you better be on standby 24/7 to combat 0-day exploits.