I guess this is what it would come down to (ignoring other liabilities like the CFAA): if Signal actually implemented their feature, would they be able to argue that their intent was to stop criminals abusing Cellebrite software, or would it be possible to argue that their intent was from the start to disrupt police investigations?
With Apple, that's probably going to be hard to argue: they'll come out with some basic stats about theft reduction, and they'll point at their advertising and messaging around the feature.
I can see a distinction there, even though I don't see a super-clear reason to believe from Signal's one blog post that they're specifically trying to disrupt the police.
> That's why we hire attorneys! Legal counsel is here to help.
This is obviously good advice, and people shouldn't be looking at HN musings to figure out what is and isn't legal. But at the same time, minor sidenote:
I'm not angry at you, and this isn't anyone's fault in particular, but I low-key hate this answer because a metric ton of innovative software gets built by people who do not have the resources to hire attorneys for every decision that they make, and it's really unreasonable on a societal level to expect every Open Source dev, teenager, single-founder entrepreneur, etc. to have the resources to keep legal counsel on hand. The majority of people in the US can't just go out and talk to a lawyer whenever they want.
Not really relevant to what we're talking about, and again, nothing to do with you, but I'm still unable to keep myself from ranting about that whenever this topic comes up.
> if Signal actually implemented their feature, would they be able to argue that their intent was to stop criminals abusing Cellebrite software, or would it be possible to argue that their intent was from the start to disrupt police investigations?
You should read the article again if you haven't already. The author discusses the context that would make this a hard sell to a court. Moxie Marlinspike has been around a long time, and he hasn't been particularly discreet about his opinions about law enforcement.
> a metric ton of innovative software gets built by people who do not have the resources to hire attorneys for every decision that they make, and it's really unreasonable on a societal level to expect every Open Source dev, teenager, single-founder entrepreneur, etc. to have the resources to keep legal counsel on hand. The majority of people in the US can't just go out and talk to a lawyer whenever they want.
Have you tried? I think there's a misconception that lawyers are resources that will only talk to you if you can verify you have $1M in the bank. Even when I was a poor student I found that I could phone up just about any lawyer and get 15 minutes of their time. This is often enough to get the gist of whether whatever I want to do is going to be legally risky. Most attorneys are nice enough to tell you what you're about to do is incredibly stupid (assuming it is, in fact, incredibly stupid) without charging you for the privilege.
And come on, we're not talking about writing a new game or a container orchestrator or some new ML algorithm here; we're talking about technology that clearly has a strong relationship to law enforcement and has a legacy of adversarial practices. Let's practice a little common sense here.
> Moxie Marlinspike has been around a long time, and he hasn't been particularly discreet about his opinions about law enforcement.
I did read the article, I don't personally see the intent that the author is attributing (although I understand how other people might, and again, this is all separate from the CFAA concerns). The author is claiming that Cellebrite's users are synonymous and interchangeable with law enforcement. But that's clearly not the case, or else Moxie wouldn't have a copy of the software since he's not a cop.
Signal never mentions law enforcement in the original post; the only mention they make of governments is of non-US authoritarian countries, and it's not illegal in the US to build software that Turkey dislikes. The only users that Signal's post specifically supports by name are journalists and activists -- in other words, not criminals under US investigation.
And this is part of what weirds me out about this conversation. When you start talking about how Moxie is obviously trying to hack police departments because he's criticized them in the past, the implication is that my opinions about law enforcement could block me from writing secure code. I believe in police accountability, and I publicly backed Apple's position during the San Bernardino case, which, according to multiple police spokespeople, apparently means that I love terrorists and hate America. Does their framing of that position mean I'm not allowed to build secure software now?
You don't see it as problematic that being critical of the government would mean that you have less legal leeway to write secure code? "Hasn't been particularly discreet about his opinions about law enforcement" reads to me as "he's going to face increased legal scrutiny and have fewer legal protections purely because of 1st Amendment-protected speech."
> Have you tried? I think there's a misconception that lawyers are resources that will only talk to you if you can verify you have $1M in the bank.
It's entirely possible that I'm bad at looking around for stuff like this. Every specialist law office I've seen charges more than a hundred dollars an hour, but... I'm not going to pretend I'm an expert on this stuff, at all. I'm not an expert on anything we're talking about.
> And come on, we're not talking about writing a new game or a container orchestrator or some new ML algorithm here; we're talking about technology that clearly has a strong relationship to law enforcement and has a legacy of adversarial practices. Let's practice a little common sense here.
I'm not sure I follow: is your argument that security code is in a separate category from other Open Source software? I don't understand what you're implying. Signal is a messaging app; shouldn't ordinary people be able to build those?
There aren't a ton of consumer-facing projects I can build that won't have to care about security and privacy. You don't think that games, or music storage/tagging, or backup systems have to care about this stuff? It's not only banks that do encryption; any system that touches user data should be able to protect that data from criminals.
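To make that concrete, here's a minimal sketch of what "ordinary software protecting user data" can look like. This is a hypothetical illustration using Python's third-party `cryptography` package, not anything from Signal's codebase:

    # Hypothetical sketch: at-rest encryption in an ordinary consumer app,
    # using the third-party `cryptography` package (pip install cryptography).
    from cryptography.fernet import Fernet

    # In a real app the key would come from the OS keychain or be derived
    # from a user passphrase, not generated fresh on every run.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Even a music tagger or a backup tool ends up doing something like this.
    token = fernet.encrypt(b"user's playlist and listening history")
    assert fernet.decrypt(token) == b"user's playlist and listening history"

The point being: the same primitives a bank uses are table stakes for any app that stores user data.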