
Signal’s encryption algorithm is fine. The problem is the environment in which it runs: a consumer device connected to the general internet (and it’s hard to believe that someone who does this installs patches promptly). He’s one zero-day or unwise click away from an adversary being able to read those messages and potentially send their own. Signal’s disappearing-message feature at least limits how much old traffic is exposed, but it runs afoul of government records laws.

The reason the policies restrict this kind of traffic to government systems isn’t that anyone thinks those systems are magically immune to security bugs; it’s that entire teams of actually-qualified professionals monitor them and proactively secure them. His phone is at risk from, say, a dodgy SMS/MMS message sent by anyone in the world who can get his number, potentially needing nothing more than a commercial spyware license, while his classified computer on a secure network can’t even receive traffic from those people, has a locked-down configuration, and is monitored so that a compromise would be detected a lot faster.

That larger context is what really matters. What they’re doing is like a bank owner handing the business to his drunken golf buddy, whose first move is to start keeping the ledger in his car because it’s more convenient. Even if the buddy is totally innocent and trying to do a good job, he’s taking on extra risk he isn’t equipped to handle, for no benefit to anyone else.

An obvious issue that I noticed. He sent the exact same message to two different group chats.

I assume he copy-pasted the message on his unsecured device.

How many apps had access to that text in his clipboard?
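
For a sense of how low that bar is: on Android, for instance, any app that currently has focus can read the clipboard with a few lines and no permission prompt (since Android 10, only the foreground app and the keyboard can; before that, background apps could too). A minimal sketch, not any particular piece of spyware:

    // Minimal sketch: how little it takes for a foregrounded Android app to
    // read whatever was last copied. No permission dialog is involved;
    // Android 10+ only blocks *background* reads.
    import android.app.Activity
    import android.content.ClipboardManager
    import android.content.Context
    import android.util.Log

    class ClipPeekActivity : Activity() {
        override fun onWindowFocusChanged(hasFocus: Boolean) {
            super.onWindowFocusChanged(hasFocus)
            if (!hasFocus) return
            val clipboard = getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
            val clip = clipboard.primaryClip            // whatever the user last copied
            val copied = if (clip != null && clip.itemCount > 0)
                clip.getItemAt(0).coerceToText(this)
            else null
            // A hostile app would exfiltrate this; here it just gets logged.
            Log.d("ClipPeek", "clipboard contents: $copied")
        }
    }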

To me this isn't a technical problem with Signal, it's an opsec problem, and that's quite a lot harder to explain to people.


Yikes. Especially if it's been near a Windows PC. If I have link-to-PC switched on then I have a shared clipboard between phone and PC...

Or if it's an iPhone & Mac pair it probably synced with iCloud.

Surely they don't have iCloud on their devices though...


Signal does have a forward feature which would look the same but I don’t know if he uses it.

> Signal’s encryption algorithm is fine.

At least in the case of the leak the culprit was the UX, no?

Suppose a user wants the following reasonable features (as was the case here):

1. Messages to one's contacts and groups of contacts should be secure and private from outside eavesdroppers, always.

2. Particular groups should only ever contain a specific subset of one's contacts.

With Signal, the user can easily make the common mistake of trying to add a contact who is already in the group. But in this case the Signal UI autosuggested a different contact, displaying initials for that new contact that matched those of a current group member.

Now the user has unwittingly added another member to the group.

Note in the case of the leak that the contact was a bona fide contact-- it's just that the user didn't want that particular contact in that particular group. IIRC Signal has no way to know which contacts are allowed to join certain groups.
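
To make that last point concrete, here's a hypothetical sketch (none of this is Signal's actual API or data model) of the kind of per-group allowlist a client would need in order to refuse the add, even when the wrong contact happens to display the same initials:

    // Hypothetical sketch, not Signal's actual API or data model: a group
    // that carries its own allowlist and refuses members who aren't on it.
    data class Contact(val number: String, val displayName: String)

    class Group(val name: String, private val allowedNumbers: Set<String>) {
        private val members = mutableSetOf<Contact>()

        fun add(contact: Contact): Boolean {
            if (contact.number !in allowedNumbers) {
                println("Refusing to add ${contact.displayName} to '$name': not on this group's allowlist")
                return false
            }
            return members.add(contact)
        }
    }

    fun main() {
        // Made-up numbers and initials, for illustration only.
        val group = Group("ops-planning", allowedNumbers = setOf("+15550001", "+15550002"))
        group.add(Contact("+15550002", "A. B."))   // intended member: added
        group.add(Contact("+15559999", "A. B."))   // same displayed initials, different person: refused
    }

And as noted above, Signal has nothing to build that allowlist from; it would have to come from some external source of truth about who is cleared for which conversations.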

I don't know much about DoD security. But I'm going to guess no journalist has ever been invited to access a SCIF because they have the same initials as a Defense Dept. employee.


My understanding was that the journalist's phone number was accidentally added to the existing contact of a trusted user through the following process:

1. Journalist emailed trusted user seeking comment on something. This email contained the journalist's cell phone number in the signature block.

2. The trusted user forwarded this email to the fool with Signal.

3. The fool's iPhone suggested adding the journalist's cell phone number to the trusted user's contact.

4. The fool accepted this, perhaps blindly.


You are right. Here's the discussion from a month ago:

https://news.ycombinator.com/item?id=43601213

This iPhone feature exists probably to save a few taps when people text you their new phone number.

But it fails to account for the fact that this is not the only reason phone numbers happen to be in texts.
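
A toy illustration of why that kind of heuristic misfires (not Apple's implementation, just the general shape of the problem): a detector that surfaces any phone number found in message text can't tell a signature block in a forwarded email from an intentional "here's my new number."

    // Toy illustration, not Apple's implementation: a naive "phone number
    // found in this message" detector has no way to distinguish a signature
    // block in a forwarded email from a deliberately shared new number.
    val phoneNumber = Regex("""\+?\d[\d\s().-]{7,}\d""")

    fun suggestedNumbers(message: String): List<String> =
        phoneNumber.findAll(message).map { it.value }.toList()

    fun main() {
        // Made-up forwarded email for illustration.
        val forwardedEmail = """
            FYI, see below.
            ---------- Forwarded message ----------
            Happy to discuss on the record.
            J. Reporter | Staff Writer | cell +1 555 867 5309
        """.trimIndent()
        // The signature-block number surfaces exactly like an intentional one.
        println(suggestedNumbers(forwardedEmail))   // [+1 555 867 5309]
    }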


Definitely - that kind of context is critical. Signal, iMessage, etc. are designed to let you securely connect to people you just met and don’t share much with other than a phone number. The DoD has the opposite problem: they have a list of people they trust enough to have access and blocking anyone not on that list is a major feature. Beyond the fact that both are sending messages, these problems are less alike than they seem at first.

Well apparently at this point it’s his phone, his wife’s phone, his lawyer’s phone, maybe anyone in the inner circle of people he cares about.

>> The problem is the environment in which it runs

Too deep. The problem is the physical environment: the room in which the machine displays the information. Computer and technological security means nothing if the information is displayed on a screen in a room where anyone with a camera can snap a pic at any time.


That’s valid in general, but the specific case being discussed is an official military facility with strict access control, and I would assume it’s regularly swept for bugs.

Well, you can always retrieve the messages from the keylogger, er, swiftkeyAIsendsItToCloud plugin.


