
[flagged]



Interesting. Could you provide a better alternative? Preferably one even remotely as user-friendly as Signal/WhatsApp?


I found SimpleX recently (https://simplex.chat/), which got my attention because it uses a unique account schema that doesn't have any individual account identifiers. It's got some interesting features. Not sure about user-friendliness, since I don't use other messaging apps and so don't know what to expect.

It's mostly just interesting to me that they did away with the username entirely and they instead have users connect exclusively through shared secrets like they're Diffie and Hellman.
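For anyone unfamiliar, the core idea of connecting through a shared secret instead of a username looks roughly like this. A minimal Python sketch of plain X25519 Diffie-Hellman key agreement, just to illustrate the concept; this is not SimpleX's actual protocol, and it needs the "cryptography" package:

    # Two parties derive the same secret without any account identifier
    # ever existing. Illustrative only; not SimpleX's real protocol.
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Each side shares only its public key (e.g., inside an invitation link).
    alice_pub = alice_priv.public_key()
    bob_pub = bob_priv.public_key()

    # Both ends independently compute the same shared secret, which can
    # then seed the actual message-encryption keys.
    assert alice_priv.exchange(bob_pub) == bob_priv.exchange(alice_pub)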


I don't have a good answer. Personally, I'm using Jami for secure communications, but I won't pretend it replaces Signal in terms of user-friendliness, especially compared to back when Signal allowed SMS and secure communications in the same app!


Threema comes to mind.


Following the conversation down, it sounds like what you're really saying is that Signal stores sensitive information encrypted with PIN+SGX, which is controversial. And maybe you have a good argument for why it's bad (and I'm uneasy with it myself). But I think people don't like that you made the assumption for them that PIN+SGX is bad.


Even if everyone agreed that the system was secure (and they absolutely don't; see for example the links below),

https://web.archive.org/web/20210126201848mp_/https://palant...

https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...

I think we should all agree that outright lying to users on the very first line of their privacy policy page is totally unacceptable.


You cited this so I think this is what you mean:

"Signal is designed to never collect or store any sensitive information."

I interpret this, I think reasonably, to not include encrypted information. For that matter they collect (but probably don't store) encrypted messages. The question is, does PIN+SGX qualify as sufficiently encrypted? This line is a lie only if it does not.

Sorry, I only skimmed those articles; I don't want to read them in depth. But it sounds like they are again ultimately saying "PIN+SGX is not secure enough".


> I interpret this, I think reasonably, to not include encrypted information

I disagree, since attacks and leaks can happen (and have happened) that could compromise that data. Signal was already found to be vulnerable to CacheOut. Even ignoring that, guessing or brute-forcing a PIN is all anyone would need to get a list of everyone a Signal user has been in contact with. Just having that data (and worse, keeping it forever) is a risk that absolutely should be disclosed.
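To make the brute-force point concrete, here's a rough Python sketch. The KDF, salt, and storage format are my own stand-ins, not Signal's actual scheme; the point is just that a 4-digit PIN is only 10,000 possibilities, so anyone who gets the stored blob out of the enclave can walk the entire keyspace quickly, slow KDF or not.

    import hashlib

    def derive_key(pin: str, salt: bytes) -> bytes:
        # A deliberately slow KDF only multiplies the per-guess cost;
        # it can't compensate for a keyspace this small.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

    def brute_force(stolen_key: bytes, salt: bytes) -> str | None:
        for guess in range(10_000):        # every possible 4-digit PIN
            pin = f"{guess:04d}"
            if derive_key(pin, salt) == stolen_key:
                return pin
        return None

    salt = b"example-salt"
    stolen = derive_key("4821", salt)      # pretend this leaked from the server
    print(brute_force(stolen, salt))       # recovers "4821" in seconds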

> I don't want to read them in depth. But it sounds like they are again ultimately saying "PIN+SGX is not secure enough".

That was my conclusion back when all this started. The glaring lie and omissions in their privacy policy were just salt in the wound, though charitably, it might be a dead canary intended to help warn people away from the service. Similarly, dropping the popular feature of allowing unsecured SMS/MMS and introducing a crypto wallet nobody asked for might also have been done to discourage the app's use.


Okay, so you not only take issue with PIN+SGX, you think that any encryption scheme (at least from Signal) isn't secure enough. Your point still comes down to "they are storing sensitive information in a form that is ostensibly encrypted but still subject to attack (in the opinion of XYZ reputable people...)".

My point is only that the headline of your point was "they are lying about not storing sensitive information". That leaves out a very important part of your point. IMO it makes the claim seem sensationalized and starts you off on the wrong foot.


That's fair, I can see how someone could feel that way.


"I interpret this, I think reasonably, to not include encrypted information"

Why? Encrypted information is still sensitive information.


Maybe via metadata? The size of the information, etc. Do you mean that they should have a caveat about that?

Or if you want to be literal, you have to say that they're storing sensitive information even if it's encrypted. But by connotation that phrase implies that someone other than the user could conceivably have access to it. So for all any user could care, they just as well are not storing it. Do you mean that they should rephrase it so it's literally correct?

Or do you mean that it's actually bad for them to be collecting safely encrypted sensitive data? Because if so, you literally cannot accept any encrypted messenger because 3rd parties will always have access to it.


Yes, I think they should rephrase it so that it's literally correct. Personally, I have a very high trust in the safety of Signal's encryption and security practices. But privacy policies aren't for the Signals of the world; they're for the ad networks and sketchy providers.

For example, many ad networks collect "safely encrypted" email addresses, but are still able to use that information to connect your Google search result ad clicks with your buying decisions on Walmart.com. Whether something is "safely" encrypted is a complicated, contextual decision based on your threat model, the design of the secure system in question, key custody, and lots of other complicated factors that should each be disclosed and explained, so that third parties can assess a service's data security practices.

Signal is a great example of a service that does an excellent job explaining and disclosing this information, but the fact that their privacy policy contradicts their public docs lessens the value of privacy policies.
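To make that concrete: the usual trick is a deterministic transform, so the "encrypted" values still match across services and work as a stable cross-site identifier. This is my own toy sketch in Python, not any particular ad network's code:

    import hashlib

    def hashed_id(email: str) -> str:
        # Deterministic: the same email always produces the same output.
        return hashlib.sha256(email.strip().lower().encode()).hexdigest()

    ad_click = hashed_id("alice@example.com")    # seen by the search engine
    purchase = hashed_id(" Alice@Example.com ")  # seen by the retailer

    # Nobody "decrypted" anything, yet the two events are now linked.
    assert ad_click == purchase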


Okay that's fair. But as I said to autoexec, if your point includes that you don't rely on the encryption to be safe, you should probably include that in your point. A lot of people probably don't share that as a prior. (I suspect that's why autoexec was downvoted and flagged).


A ciphertext is not sensitive information. If your ciphertext can't be exposed to an adversary, your cryptography is fundamentally broken.


You can't make that statement blindly without knowledge of the entire cryptosystem and threat model. For example, to me, an encrypted version of my email address, as used by many ad networks to do retargeting, is still sensitive information if it lets Walmart serve me ads based on my Google search history.


I'm a big Signal user, yet I'm not fully convinced it isn't directly involved with intelligence agencies. That's all to say: this sounds like FUD, but I think it should be taken seriously. Out of curiosity, where have you read this?


It was a bit of a controversy when the change happened:

See https://web.archive.org/web/20210109010728/https://community...

https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...

Note that the "solution" of disabling pins mentioned at the end of the article was later shown to not prevent the collection and storage of user data. It was just giving users a false sense of security. To this day there is no way to opt out of the data collection.


Obligatory request to provide a source to back up some serious claims?


If you're a signal user and didn't know about this already, that should tell you everything you need to know about signal.

See https://community.signalusers.org/t/proper-secure-value-secu...

Then read the first line of their terms and privacy policy page which says: "Signal is designed to never collect or store any sensitive information." (https://signal.org/legal/)

Signal loves to brag about the times when the government came to them asking for information only to get turned away because Signal never collected any data in the first place. They still brag about it. It hasn't actually been true for years, though. Now they're collecting the exact info the government was asking for, and they're protecting that data with a not-very-secure/likely-backdoored enclave on the server side and (even worse) a PIN on the client side.


I see a link to a forum where an anonymous participant says

“Since a recent version of Signal data of all Signal users is uploaded to Signal’s servers. This includes your profile name and photo, and a list of all your Signal-contacts.”

They then link to a Signal blog (2019) explaining technical measures they were testing to provide verifiably tamperproof remote storage.

https://signal.org/blog/secure-value-recovery/

I’m not equipped to assess the cryptographic integrity of their claims, but 1) it sounds like you’re saying that they deployed this technology at scale, and 2) do you have a basis to suggest it’s “not-very-secure or likely backdoored,” in response to their apparently thoughtful and transparent engineering to ensure otherwise?


The communication Signal put out was extremely confusing and unclear, which caused a lot of issues. They avoided answering questions about the data being collected and instead focused everything on SVR (see https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...)

The problems with the security of Signal's new data-collection scheme were discussed at the time:

https://web.archive.org/web/20210126201848mp_/https://palant...

https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...

You'll have to decide for yourself how secure PINs and enclaves are. But even if you thought they were able to provide near-perfect security, I would argue that outright lying to highly vulnerable users by saying "Signal is designed to never collect or store any sensitive information." on line one of their privacy policy page is inexcusable, and not something you should tolerate in an application that depends on trust.


> 2) do you have a basis to suggest it’s “not-very-secure or likely backdoored,” in response to their apparently thoughtful and transparent engineering to ensure otherwise?

The forum post explains this:

> This data is encrypted by a PIN only the user can know; however, users are allowed to create their own very short numeric PIN (4 digits). By itself this does not protect data from being decrypted by brute force. The fact that a slow decryption algorithm must be used is not enough to mitigate this concern; the algorithm is not slow enough to make brute forcing really difficult. The promise is that Signal keeps the data secured on their servers within a secure enclave. This allows anyone to verify that no data is taken out of the server, not even by the Signal developers themselves, not even if they get a subpoena. At least that is the idea.

> It is also not clear if a subpoena can force Signal to quietly hand over information which was meant to stay within this secure enclave.

That should be very concerning for activists/journalists who use Signal to maintain privacy from their government. A subpoena plus a gag order means the data ends up in the hands of the government, presuming Signal wants to keep offering its services to the population of the country in question.


I wanted to add that there was also the cease-and-desist case against the Signal-FOSS fork that tried to implement an open server.

In my opinion Briar is where it's at, but because there's no data collection it's a pain to do a handshake or manage contacts.


Signal-FOSS is still around, right? Got a link to some of the drama? I'm curious to see what their grounds were. Did they just object to the use of their name?


There is still Molly as a fork, but no idea how hardened it actually is.

After Moxie's statement at the time I kind of ditched everything regarding Signal's ecosystem. I understand the business perspective of it, but it's kind of pointless trying to say this is open source when it's illegal to press the Fork button on GitHub, you know.

https://github.com/mollyim/mollyim-android


Also, Signal's spam filtering isn't open source on the server side. They literally have code that reads your messages and checks whether or not they're spam, and you can't see what it does or how it's written.

Couple this with Signal being the preferred messaging app in Five Eyes countries, as advised by their three-letter agencies. If you think those agencies are going to advise a comms platform they can't track, trace, or read, you obviously don't understand what they do.


While it seems to be true that it’s not open-source, they claim (in strong terms) that they use techniques other than reading the message to make that assessment:

https://signal.org/blog/keeping-spam-off-signal/

They point out that the protocol’s end-to-end cryptographic guarantees are still open and in place, and verifiable as ever. As far as I can tell, they claim that they combine voluntary user spam reports and metadata signals of some sort:

> When a user clicks “Report Spam and Block”, their device sends only the phone number that initiated the conversation and a one-time anonymous message ID to the server. When accounts are repeatedly reported as spam or network traffic appears to be automated, we can issue “proof of humanity” checks to suspicious senders so they can’t send more messages until they’ve completed a challenge. For example, if you exceed a configured server-side threshold for making requests to Signal, you may need to complete a CAPTCHA within the Signal application before making more requests. This approach slows down spammers while allowing regular messages to continue to flow.
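In code, the rate-threshold-plus-challenge flow they describe could be as simple as something like this rough Python sketch. The threshold, names, and structure here are my guesses, since the server-side code isn't public:

    from collections import defaultdict

    REQUEST_THRESHOLD = 100                  # assumed, not Signal's real limit
    request_counts: defaultdict[str, int] = defaultdict(int)
    pending_challenge: set[str] = set()

    def handle_request(sender_id: str) -> str:
        if sender_id in pending_challenge:
            return "challenge_required"      # e.g., a CAPTCHA in the app
        request_counts[sender_id] += 1
        if request_counts[sender_id] > REQUEST_THRESHOLD:
            pending_challenge.add(sender_id)
            return "challenge_required"
        return "ok"                          # message contents never inspected

    def complete_challenge(sender_id: str) -> None:
        pending_challenge.discard(sender_id)
        request_counts[sender_id] = 0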

Does that seem unreasonable? Am I missing places where people have identified flaws in the protocol?


Sounds quite fishy :( . Any specific proof, in addition to everything that's been said so far? I've checked the links; they don't really prove anything...


Here are links to additional discussions from the time of the change: https://community.signalusers.org/t/mandatory-pin-is-signal-...

One of the few articles that talked about it at the time: https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...

One of the many reddit posts by confused users who misunderstood the very unclear communications by Signal: https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...


Can you elaborate? I'm semi-familiar with the Signal protocol but I'm not sure what you are referring to here.


See https://community.signalusers.org/t/proper-secure-value-secu...

Then read the first line of their terms and privacy policy page which says: "Signal is designed to never collect or store any sensitive information." (https://signal.org/legal/)


I was going to discount most of this. It appears this link is missing or private now. What did it say?


Fixed the link. Just to be safe, here's an archive of the page: https://web.archive.org/web/20210109010728/https://community...



