Awesome to see Filippo continuing on as the crypto maintainer.
I can't imagine I'm alone in not knowing much or anything about the crypto/* packages, other than they exist, have appropriately cryptic names, and seem to be respected implementations.
Filippo, if you ever find yourself having the time or energy, a "go crypto for dummies" blog series would be a great read :)
Something really important that's going under the radar is TLS fingerprinting [1].
Multiple servers are using this now, including some requests to subdomains on google.com, googleapis.com, Cloudflare, and others. I keep reporting this [2][3], and no one seems to care. If a server blacklists your client's fingerprint, whether it's cURL or Go's "net/http", you can no longer make requests to that server with that client. Period. Any HTTP client that wants to be robust should be thinking about this.
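To make the mechanism concrete, here is a minimal, hypothetical sketch of how a Go server could derive a JA3-style fingerprint from the ClientHello. The hashing format is simplified for illustration and is not the exact algorithm any of the services above use, but the crypto/tls fields and the GetConfigForClient hook are real.

    package main

    import (
        "crypto/md5"
        "crypto/tls"
        "fmt"
        "strings"
    )

    // fingerprint hashes a few ClientHello fields, roughly in the spirit of JA3.
    func fingerprint(hello *tls.ClientHelloInfo) string {
        var parts []string
        for _, v := range hello.SupportedVersions {
            parts = append(parts, fmt.Sprint(v))
        }
        for _, cs := range hello.CipherSuites {
            parts = append(parts, fmt.Sprint(cs))
        }
        for _, c := range hello.SupportedCurves {
            parts = append(parts, fmt.Sprint(uint16(c)))
        }
        return fmt.Sprintf("%x", md5.Sum([]byte(strings.Join(parts, "-"))))
    }

    func main() {
        cfg := &tls.Config{
            // The server sees the full ClientHello before the handshake
            // completes, which is all it needs to block unknown fingerprints.
            GetConfigForClient: func(hello *tls.ClientHelloInfo) (*tls.Config, error) {
                fmt.Println("client fingerprint:", fingerprint(hello))
                return nil, nil // nil means "keep using this config"
            },
        }
        _ = cfg // wire into http.Server{TLSConfig: cfg} or tls.NewListener
    }

The point is that everything the fingerprint hashes is sent in the clear by the client, and Go's stock ClientHello looks nothing like a browser's.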
This kind of flexibility is a non-goal of crypto/tls. We have a TLS stack with one of the best security track records because we implement an opinionated subset of the specification, amongst other things. Moreover, fingerprint evasion is a cat-and-mouse game we can't sustain in the six months Go release cycle.
That doesn't mean I don't care! I was just talking with a friend about this the other day, and I suggested it should be possible to make a small, easily maintained patch that focuses on chasing the fingerprint of one well-known browser. He implemented https://github.com/hellais/utls-light in that spirit, which looks like a viable solution to me.
Anyway, I think matching TLS fingerprints to HTTP User-Agent strings is a valid abuse prevention technique. Rejecting any non-browser fingerprint is bad, and websites should get pushback for that, but I am skeptical that's something they can reliably do without breaking any time Chrome flips a field study. TLS is not _that_ rusted shut.
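For anyone curious what the mimicry approach looks like in practice, here's a rough sketch using the uTLS fork (github.com/refraction-networking/utls), the project utls-light is a slimmed-down take on. The identifiers follow uTLS's documented API; treat this as an illustration of the technique, not something crypto/tls itself supports.

    package main

    import (
        "fmt"
        "net"

        utls "github.com/refraction-networking/utls"
    )

    func main() {
        raw, err := net.Dial("tcp", "example.com:443")
        if err != nil {
            panic(err)
        }
        defer raw.Close()

        // HelloChrome_Auto asks uTLS to shape the ClientHello like a recent
        // Chrome release, so the server-side fingerprint matches the browser's.
        conn := utls.UClient(raw, &utls.Config{ServerName: "example.com"}, utls.HelloChrome_Auto)
        if err := conn.Handshake(); err != nil {
            panic(err)
        }
        defer conn.Close()
        fmt.Printf("negotiated TLS version 0x%x\n", conn.ConnectionState().Version)
    }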
I feel like this answers one particular possible end goal rather than the problem.
The problem in this case being that the user is not able to specify and guarantee a cipher list.
If there's a good reason for not allowing it, it should be spelled out for dummies (like me), and left at that. If there's not a good reason, perhaps it should be considered.
Both the bug report and the reply here instead argue about TLS fingerprinting, which, while brought up in the bug report, is not explicitly what's being reported as the problem.
Keeping the cipher list out of the user's hands follows a modern best practice in security API design: remove configuration both at the protocol level (where possible) and from the end user, as both are constant sources of security problems.
For TLS in particular, providing a safe, known-secure set of defaults that the language can change as guidance shifts and cipher suites rotate in and out or are blacklisted for problems is necessary to meet the goal of usability without sacrificing security. Adding user configuration introduces a footgun, a place where the user can break the security properties of the system. It also isn't free to implement: maintaining the configuration option, and then explaining how exposing it isn't a vulnerability every time someone misconfigures their system and breaks TLS, is undesirable and has the potential to give Go's crypto packages a bad name.
Note that nowhere was there a hard no, we'll never implement this, but so far, the use case for needing it is fingerprint bypassing, and that doesn't outweigh the general concerns.
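For reference, here's roughly what crypto/tls does and doesn't expose today: the minimum version and the TLS 1.0-1.2 cipher suites are configurable, while the TLS 1.3 suite list and the overall ClientHello shape are fixed by the library. A sketch; check the crypto/tls docs for your Go version.

    package main

    import "crypto/tls"

    func main() {
        cfg := &tls.Config{
            MinVersion: tls.VersionTLS12,
            // CipherSuites applies to TLS 1.2 and below only; the TLS 1.3
            // suites are chosen by crypto/tls and can't be restricted here,
            // and the ClientHello layout isn't configurable at all.
            CipherSuites: []uint16{
                tls.TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
                tls.TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
            },
        }
        _ = cfg // pass to tls.Dial or http.Transport{TLSClientConfig: cfg}
    }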
If a server blacklists your TLS fingerprint, you can no longer use that server with that client. The only workaround is a client that can configure its own fingerprint. What you're saying is akin to "let's remove the client's ability to set HTTP headers", i.e. it's just nonsense. If the server rejects every request because of the client's TLS configuration, and the client has no agency over that configuration, then the client has no usability.
> Note that nowhere was there a hard no, we'll never implement this
I would say the maintainer closing the issue is a pretty clear signal for that:
The issue was closed for bypassing fingerprinting, a use case that the maintainers feel is not a valid precursor to this feature. Again, that isn’t a no to the feature under any circumstance, just a statement that this use case is insufficient.
More generally, if a server wants to fingerprint you, there are a variety of additional fields and changes that will flow from that as it's an arms race, one this library's authors are not inclined to keep adding features and configuration for. There exist other libraries for this purpose, some linked in the discussion. So no, it's not like saying you can't set your own HTTP headers at all; it's more akin to saying that if your needs fall outside the 99% use case, you may need to go elsewhere.
> The issue was closed for bypassing fingerprinting, a use case that the maintainers feel is not a valid precursor to this feature. Again, that isn’t a no to the feature under any circumstance, just a statement that this use case is insufficient.
I implore you to open a new issue then, under the guise of your valid use case. I am sure that it will stay open, and that it will not be closed as a duplicate /s
I still wonder why Filippo tried to frame Dan Bernstein's FOIA lawsuit in a negative way [0]. I see it as important, and given the NSA's history with Dual_EC_DRBG I think it's perfectly valid to suspect influence from them in the post-quantum crypto standardization process, with the goal of pushing cryptography they can break.
The NSA has been caught with their hands in the cookie jar so many times that they deserve extreme suspicion, if not outright distrust.
Some may argue that DJB has personal or social failings, but he has never been caught compromising a crypto system. Any individual who did wouldn't be credible in the community anymore; a government agency, however, is constantly welcomed back to the table.
> Accusing scientists of taking bribes or (previously) plagiarism is wildly unprofessional.
This statement is either naive or purposefully misdirecting the audience. Anyone working in security knows that insider compromise includes a lot more than just monetary bribes - even a paragon of ethics has friends, family and their own personal safety they care about.
Time will tell who is right. I find it weird that they are pushing back on this. There's plenty of poor quality paper publishing with shoddy peer-review. We need more peer reviewers who are openly critical, not less.
One thing is not trusting the NSA, another is being asked to dismiss the work of well-known independent academics because they engaged in an open selection process run by NIST, without any objective proof of technical issues.
Again, the FOIA is good, the framing and FUD is harmful and part of a pattern that might be hard to see outside the community.
> One thing is not trusting the NSA, another is being asked to dismiss the work of well-known independent academics because they engaged in an open selection process run by NIST, without any objective proof of technical issues.
The selection process wasn't as open and transparent as one would like, given the importance of cryptographic standards. I think we agree there, as you also think that the FOIA is good - which means that there needs to be more transparency.
It took six years until there was proof that Dual_EC_DRBG really had a backdoor, which had been suspected from the beginning. Shouldn't we be extra cautious this time? If there were a backdoor again, it might take years before it is discovered.
> Again, the FOIA is good, the framing and FUD is harmful and part of a pattern that might be hard to see outside the community.
Could you explain which pattern you see there, for those of us outside the community? So far I don't see harmful FUD or baseless framing; I see concerns that I feel are valid because of the lack of transparency and the NSA's history of interfering with NIST's standardization processes. Why do you think these concerns are harmful?
> I find it more efficient to work in topic-scoped batches, so I can load context on a protocol and codebase once and use it to land multiple changes.
This is my favorite way of writing software as well. My current gig has a ton of microservices, and when a feature comes up that requires changing one of them, I much prefer to make a couple of other, smaller changes along with it that help keep the service operational and easier to maintain.
One issue is that this often brings out the yak-shaving, but I think it's a fair tradeoff and helps reduce the time burden of doing large migrations.
If yak-shaving in this case refers to polishing, I don't believe it's a bad thing per se for a crypto library - or any standard library, for that matter.
Interesting preface, I thought. What's the benefit of leaving Google to maintain something he could have maintained while still working at Google? Just wondering, not questioning.
I imagine he had his reasons... but quitting in order to keep doing what you were already being paid to do at such a large org sounds strange. Maybe I misread the preface.
> Being employed as a full-time maintainer by a big company pays better but is not much healthier, both organizationally and individually. Executives and promotion committees start asking "what is it that we pay you for exactly?", and suddenly you're spending more and more time proving your work is important, and less and less time doing it. The workload increases as the project grows, but the team struggles to get more resources, no one gets promoted, and people burn out and leave or change roles. I've seen this play out across multiple companies and ecosystems, over and over.
If I can be cynical, then I'd say it's quite a clever strategy by Google to get people who actually care about the product, and who can bring meaningful contributions, to work for free.
> I've seen this play out across multiple companies and ecosystems, over and over.
I don’t see what’s Google-specific about any of this. Certainly one can argue that it’s a massive global conspiracy coordinated across the multiple companies alluded to…
A key reason from the "I'm leaving Google" tweets [1]:
I believe Open Source will change one way or another, so I'm putting my (lack of) money where my mouth is, and doing what I think can best catalyze the change I want: becoming a professional, independent Open Source maintainer myself.
I hope this person is not the only one working on or responsible for Go's crypto package(s). I mean, since it's open source, all contributions CAN be checked and will have public scrutiny, but a single person being ultimately responsible makes me uncomfortable.
The standard library is very unlikely to include, or at least expose, non-final standards. I personally plan to toy with implementing Kyber soon, and plan to produce an experimental age [0] plugin. If a TLS experiment were to gain traction, I don't exclude we might support it in the standard library, since it would be negotiated and we'd be free to drop it later on.
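For context on what such a plugin would hook into, here's a small sketch of age's Go API (the [0] reference): a plugin recipient, post-quantum or otherwise, would implement the same age.Recipient and age.Identity interfaces used here. Treat it as an illustration, not a statement about the eventual Kyber design.

    package main

    import (
        "bytes"
        "fmt"
        "io"

        "filippo.io/age"
    )

    func main() {
        // Generate an X25519 identity; a plugin would supply its own
        // Recipient/Identity pair instead.
        identity, err := age.GenerateX25519Identity()
        if err != nil {
            panic(err)
        }

        var ciphertext bytes.Buffer
        w, err := age.Encrypt(&ciphertext, identity.Recipient())
        if err != nil {
            panic(err)
        }
        io.WriteString(w, "hello from the age file format")
        w.Close() // finalizes the header and the last payload chunk

        r, err := age.Decrypt(&ciphertext, identity)
        if err != nil {
            panic(err)
        }
        plaintext, _ := io.ReadAll(r)
        fmt.Println(string(plaintext))
    }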