
Valid concerns about op-sec and personal responsibility aside, I think this is another example of why "security at the expense of usability comes at the expense of security". Official DoD communications equipment sucks, so people use the less secure, more usable encrypted communications platform when they feel they can get away with it.

Maybe the DoD should work on developing some internal Android and Signal forks that focus on adding additional critical security controls without impacting usability. There's an obvious desire path here.

They are using unofficial comms to stay off the record and unaccountable; it has nothing to do with ease of use.

That's possible I suppose, but do you have any evidence of that or is it just your personal biases causing you to assume the worst motivation you can imagine must be the correct one?

I know personally that given the choice I'd probably rather use Signal than whatever messaging system the DoD contractors managed to come up with. And private conversations between senior military officials over encrypted DoD communication channels probably aren't FOIAable anyway.


Perhaps, but the simplest answer is often the correct one. People are exactly who they appear to be.

But the simple answer is that the devices suck and people don't want to use them. This is going to be more and more true as time goes on, because new people coming in will be used to the creature comforts of their personal equipment. I'm not defending anyone here; the people in power need to be held to a higher standard than some rando citizen on the street.

> That's possible I suppose, but do you have any evidence of that

Yes, in the chat where a reporter was accidentally present, many of the messages were set to disappear. I don't know why anyone would do that if not to avoid recordkeeping laws.

> The images of the text chain show that the messages were set to disappear in one week.

https://apnews.com/article/war-plans-hegseth-signal-chat-inv...

Further, Project 2025 suggests bypassing federal record keeping legislation by simply holding in-person meetings without record.

https://www.youtube.com/watch?v=xxe55mU4DA8

Oddly, the Project 2025 training videos that the members of the executive cabinet have presumably seen say _not_ to delete messages or set them to auto-delete, _because_ that would violate federal record keeping legislation.


IIRC, DoD uses Wickr RAM for TS messaging and Teams for non-sensitive comms.

Both are fairly "meh" WRT usability, but neither is so awful that people should be breaking the law over it.


Wickr is only supposed to be used for unclassified info like health data.

They have a completely separate internet for TS/SCI (JWICS https://en.wikipedia.org/wiki/Joint_Worldwide_Intelligence_C...)


I thought there was a build/version of Wickr for DoD-like usage. Maybe not.

I think Wickr RAM is specific to DoD... still not for anything S/TS

That is not the worst motivation I can imagine.

Tell us the worst one you can imagine, please.

The worst one is obvious: the people involved are foreign agents.

The real answer is that none of the people Hegseth wants to show off to have access to those networks.

> when they feel they can get away with it.

It's not just this. Security involves compromises and trade-offs. Humans will be stupid humans and reuse passwords, install better-but-insecure software, never update, etc. It's an old story.

In the year 2025, if communication with any other human on the globe isn't as simple as opening an app and typing, then people will find another way, because there are about a thousand better ways.

So I doubt they are trying to get away with anything. They're just preferring the trivial option over one that probably involves a physical token, slow biometrics, a 15-second logout, or whatever other arduous security features the government comms have. Just like any human would.

Perhaps this will force the government COMSEC people to re-evaluate their practices.

Updated to add: I'm not defending their practices, just giving a likely explanation. Blaming the users is not always the best way to evaluate a security failure.


I would hope that when it comes to OpSec, SecDef, DoD, NSA, etc, don't act "Just like any human would."

All humans act like humans would. From a security standpoint, it is a mistake to assume otherwise in any context.

https://www.google.com/search?q=computer+security+human+natu...


It's partly true. They even had Android builds that ran on General Dynamics' OKL4 and Green Hills' INTEGRITY RTOS. Signal could be ported to that. They could fund their own separation layer, which any vendor could use, if they wanted.

I think big companies' influence on purchasing decisions (aka corruption) drives a lot of this.


The dude has a staff of 30 people whose whole job is to connect him to literally anyone he wants to communicate with -- you're telling me that the usability of a concierge service with more than two dozen staffers is inferior to using Signal in a building with shitty cell service?

I've wondered about this quite a bit and imagine there's got to be a "telephone"-like aspect (the game of message distortion), where if some of the communication had to be explained, even a little pushback might change the outcome. For example, a human intermediary presented with "send these details to these people" might respond with "are you sure this person should have access to this?", ultimately preventing a bad/illegal action. Avoiding this kind of accountability, even just to a communications staffer, seems like a way to avoid the subtle steering that happens when people are faced with conflict they don't want to address, or have run out of psychological budget to address.
