I really hope that some of the issues raised over the last few weeks make it out of our tech bubble into the consumer world. By now all of us know pretty well that this software is probably never going to be trustworthy and shouldn't be used for any purpose that requires confidentiality or authenticity, but I'm afraid this is overshadowed by their "military-grade" marketing efforts aimed at people outside of this scene.
One of the Telegram apps for Android used Google Maps over plain HTTP. The issue was fixed within an hour, and the update, containing several other security-related improvements, is on its way to Google Play.
At the moment we are working together with security experts on a code review of our client applications. We rewarded the person who discovered the bug and are designing a bug bounty program, which will be rolled out soon.
Good on you for being so quick to implement these bug fixes.
Others on HN have commented that an iterative process is probably not the best way to approach crypto, due to the high risk to the end user.
It seems obvious to me that nothing in this world is ever bug-free, and there is no such thing as foolproof where time is involved. You might as well accept this, and actually embrace it. It seems to me Telegram are embracing it very well.
Telegram is not robust at any point in time, but it is antifragile, since it benefits from shocks to become stronger over time. Like the Hydra's heads: you can cut them off, but they will grow back twice as numerous. This is actually much better than robustness; it just doesn't look like it, because heads being cut off is more memorable and media-friendly than heads growing back.
Of course, there is the argument that such high claims should not have been made on buggy software. But it is because such high claims were made, and because crypto people got annoyed about them, that everyone has been trying to break it, thus rendering it more foolproof.
It's annoying, but it's clearly working: at this rate of improvement, I'd be surprised if the product weren't pretty damn good in just a few months' time. If anyone has doubts over the current version, well, just don't use it in life-threatening situations until you're reasonably confident about it being fit for your purposes, which is an assessment that will also depend on the person or institution you're trying to avoid, and the quality of resources they have at their disposal.
Would you be happy with bridges, or elevators, using an iterative process for safety design?
How about training doctors? Instead of building up through school (frogs and a sheep's lung), university (pigs, human corpses), and medical school (more corpses, watching surgery, assisting surgery), would you just let them learn on open-heart surgery?
> If anyone has doubts over the current version, well, just don't use it in life-threatening situations
That's good advice. How is Telegram advertising itself?
On the homepage:
> Telegram messages are heavily encrypted and can self-destruct.
> Telegram keeps your messages safe from hacker attacks.
In their FAQ:
> Telegram is the fastest and most secure messaging system in the world.
> Secret chats are meant for people who really want secure messaging.
etc.
This is nice wording, and I appreciate them saying this:
> Telegram is more secure than mass market messengers like WhatsApp and Line. We are based on the MTProto protocol (see description and advanced FAQ), built by our specialists, employing time-tested algorithms, to make security compatible with high speed delivery and reliability. We are continuously working with the community to improve the security of our protocol and clients.
It contains worrying bits ("built by our specialists"), but is mostly okay.
Your point about bridges and elevators actually illustrates my own point very well. Bridges and elevators may be quite safe today, but they weren't when people first started making them. Some bridges collapsed under various forms of pressure, and better bridges were designed as a result. They all necessarily made the claim to be safe until they were shown not to be.
So, yes, even bridges were built using the iterative process. It's just that the iterative process started long before you were born, and you found the world as it is without seeing the iterations that occurred before.
The same is true for doctors. There used to be all kinds of theories about how to cure the plague, or tuberculosis, or diseases we now know how to cure with a pill. Some of them worked, others didn't. Iteration at work, just over a longer timeline than your own lifetime.
This would make sense if cryptography were a young field, but this iteration and improvement has been going on for thousands of years (and modern digital cryptography has been developing for the better part of half a century). It is known how to implement cryptography securely. Just as you would expect bridges built today to stay up, and doctors working today to be properly trained, you should expect cryptography implementations to be sensible and secure, or at least not to carve a new, experimental path when people's lives are potentially at stake.
On the other hand, think about the benefits we get from seeing a bug in the software, and then seeing that Telegram have fixed it within the hour. Until a bug is shown and fixed, you don't even know whether it exists or not. So, you have doubt. But once it's exposed, and fixed, your attention is brought to an aspect of the software that you now know is good. The doubt is reduced. Which is a good thing.
That assumes that there are people with enough expertise and time to point out these flaws, and that the company actually listens to them. Something like an http/https grep is easy enough to do and doesn't require deep technical knowledge of how crypto works and should be designed, but someone analyzing their entire algorithm and architecture for free and pointing it out to them? Forget about it. Especially since their "bounty" program has very specific parameters for what is acceptable to earn any prize money.
So far, Telegram have been listening closely to people pointing out errors, and have fixed those errors promptly.
>for free and point it out to them? Forget about it.
It doesn't appear to be for free: aside from their bounty program, Telegram have been rewarding various troubleshooters with pretty decent ex gratia payments in bitcoin. In the article linked to this title, the first comment was from Telegram, asking the author to contact them for a reward.
They haven't been transparent about the amounts or about the parameters of the initial bounty. That does not mean the algorithm is safe; rather, it means that exposing a vulnerability in exactly the way the authors specified may not be worth the effort for the reward.
That can apply to other apps and products, but much less so to crypto tools. It's basically one of the reasons why new crypto algorithms only start being used after 5 years of being in the wild, and many more in pre-release research.
You allow outgoing HTTP in your "secure" app? For ___location information, which can put someone's safety or life in danger if they really buy all your marketing? Why allow outside requests from your app in the first place? What if DNS is compromised and someone is overriding google.com from a local tower/wifi DNS and sending all Google Maps traffic to their own server instead? Shouldn't you be proxying calls to these outside services through your internal ___domain as API calls, with a way to verify that someone isn't hijacking and imitating that connection as well? This is all very much security 101 stuff, and I would have expected much better from an app marketing itself as a simple, secure crypto solution.
How the F* is it possible that your app's reason for existence depends on proper security, and yet you miss a use of HTTP where only HTTPS should be allowed? A vulnerability like that could be found with a simple grep over the source code. But even that seems to be too much effort.
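As a sketch of the kind of check being described: a scan for plain-HTTP URL literals in a source tree can be a dozen lines. Everything here (the directory path, the file extensions) is illustrative, not Telegram's actual layout, and a hit is a candidate for manual review, not automatically a vulnerability:

```python
import os
import re

# Match string literals containing plain-HTTP URLs. Some hits
# (e.g. XML namespace identifiers) are harmless, so treat the
# output as a review list rather than a verdict.
HTTP_URL = re.compile(r'http://[^\s"\']+')

def find_plain_http(root):
    """Walk a source tree and report every plain-HTTP URL literal."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith((".java", ".kt", ".xml")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                for lineno, line in enumerate(f, 1):
                    for match in HTTP_URL.findall(line):
                        hits.append((path, lineno, match))
    return hits

# Example: flag everything under a hypothetical app/src for review.
for path, lineno, url in find_plain_http("app/src"):
    print(f"{path}:{lineno}: {url}")
```

The same check is a one-liner with `grep -rn 'http://' app/src`; the point is that it costs nothing to run in CI.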
It may be worth pointing out that the Telegram FAQ claims there are two Android apps and that both were developed through a contest (I assume by third-party developers).
Let's try and not turn every flaw of every Telegram app into the OMG-look-at-these-morons fiesta. This is an issue with a specific version of one out of many apps, not a protocol issue and it hardly warrants the near-hysterical "NEVER" in the title.
You would be right if the Telegram developers hadn't started with ridiculous security claims ("military-grade", "unbreakable"). Their marketing implies you can use Telegram for sensitive communication, which in practice you cannot.
If you are a political dissident and use the app, you might get jailed or killed due to the false sense of security. Secure IM is no longer about hiding your diary from your sister.
Even if you are using HTTPS to call the Google Maps API, it is useless if the NSA can access the contents in plain text inside Google's datacenter. Why not just remove it?
Why such a feature is even enabled in a "secure" application is an even better question. After all, you may be dealing with compromised endpoints or evil maids, so revealing your ___location is stupid.
If Telegram had hired any decent security company to go over their design, then this bug would have been caught, because this is a rookie mistake. One of the most basic security requirements for any app is that all network traffic is protected via SSL, so that would've been one of the first things that a decent appsec company would've checked.
Telegram could hire an appsec company to analyze their design, but they're choosing not to.
Agreed; the fact that no credible security firm was hired to analyze their entire architecture and implementation suggests something is wrong here. Either that or gross negligence, and neither possibility is reassuring.
I think all the hate towards Telegram might be good for them, provided that they have thick enough skin to suffer through it. They may yet emerge as the bulletproof communication tool they wish to be.
As for me, anything that kills Skype is worth rooting for.
As tptacek explained very well yesterday, this is not the way crypto systems are built:
> Be honest with yourself. Crypto doesn't get beta-tested into resiliency. Strong systems start out strong. If you're building something because its your dream to thwart the NSA, don't kid yourself into thinking that you'll get there by first protecting people's Warcraft clans.
Yes, and don't forget that if the NSA gets direct access (by threat, payment, or persuasion) to their infrastructure, these bold statements mean nothing. Secure communication will NOT be centralized.
One man's beta-testing is another man's peer review. They are fundamentally the same process, so, sure, you can iterate your crypto design into better shape by releasing a production app and attracting reviewers with over-the-top marketing claims. It's certainly more conventional to request feedback during the design phase, but that doesn't mean doing it ass-backwards is not an option. In the end it's the underlying motivation that matters.
Requesting feedback only once the app is released to the masses, who actually believe it's secure and doesn't send their ___location data in clear text, can have huge consequences, much larger than a small visual bug. It's a core flaw in what they're promising, and you can't iterate over that. What if a bank did that with its login system, iterating over fatal bugs and exposing all of its customers to attack? The loss of credibility and the fines would quickly put it out of business.
tl;dr: if a geolocation is sent via a Telegram "secret chat" (end-to-end encrypted), the Telegram Android app will use the clear-text Google Maps API to display the map, thus revealing the ___location to passive adversaries.
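To make the leak concrete: the coordinates travel as ordinary query parameters, so over plain HTTP any passive observer on the path can read them straight out of the request line. A minimal sketch (the URL shape follows Google's Static Maps API; the exact parameters the app used are an assumption):

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical request the app might build to render a shared ___location.
lat, lon = 52.5200, 13.4050
params = {"center": f"{lat},{lon}", "zoom": "15", "size": "400x400"}
url = "http://maps.googleapis.com/maps/api/staticmap?" + urlencode(params)

# Over plain HTTP the full request is visible on the wire, so an
# eavesdropper recovers the coordinates trivially:
seen = parse_qs(urlparse(url).query)
print(seen["center"][0])  # prints 52.52,13.405
```

With `https://` the query string is inside the TLS tunnel; an observer still sees the destination host, but not the coordinates.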
According to a comment, this issue is already fixed in the source, and an update for Google Play is in the works.
I'm not quite sure why programmers release crypto apps as "SECURE", instead of "We'd like this to be secure, so we're releasing this beta edition with a bug bounty. Don't use it for safety critical applications yet."
Look, I'm not saying it's not concerning. I'm saying this does not imply anything on the design of the application. If the only lesson I can get from this is "to use SSL", then the design lesson wasn't very illuminating. Get it?
"None" is a bit strong - after all, we're talking about people who have assured us that their boffins are smart enough to roll a new cryptosystem who then turn around and fail Security 101.