The article describes the Justice Dept. doing exactly what has been so widely predicted on HN and elsewhere. Doesn't seem like good timing on the government's part to publicly announce the intention to seek numerous iPhone "unlockings". Rather it plays into Apple's argument about unleashing a torrent of court-ordered demands which will have adverse consequences for security.
Legal minds should weigh in, but I'm thinking the only effective remedy is going to be congressional action: passing a law that defines the limits of court discretion regarding forcing firms to assist in breaching their carefully constructed security/privacy systems.
This news prompts me to write my representative and senators and strongly urge them to support enacting this form of protective legislation. If there's enough of an outcry from the electorate there's a far greater chance of putting sensible policies in place. The fact it's an election year can only make voicing our concerns all the more effective.
It makes total sense that the government would use its forensic capabilities whenever the need arises. The question at hand is: "can the All Writs Act compel a company to proactively defeat its own security measures?" The answer will apply to all cases.
The more important question is "Should the government be able to access citizens' digital data with a court order? And if so, how can that be enabled without compromising the general security of the device?"
I suspect we'll ultimately wind up with mainstream device manufacturers maintaining some kind of per-device master key that they turn over when ordered.
This will enable governments, including foreign ones, to access data in the typical case. Sophisticated bad guys will continue to use zero-knowledge software, which governments will attack by other means. The per-device master keys will be compromised eventually, which will force enterprises to upgrade their fleets, but normal users won't care.
> Should the government be able to access citizens' digital data with a court order? And if so, how can that be enabled without compromising the general security of the device?
No. And that's both impossible and a massive compromise.
This case helps us tackle that first question. Here the murderer's personal phone and computer hard drives were destroyed — rendering them "above/beyond the law". Just because some data is digital doesn't place it in some special legal realm more important than shreddable/burnable paper or the air secret conversations were spoken into. There are fundamental limits to recoverability. If technology companies are to be forced to maintain vulnerabilities because governments see all their customers as potential terrorists, the industry is doomed.
The real problem here is that although terrorism will never touch the average citizen anywhere near the extent of other tragedies like illness, accidents or natural disasters, the media treat it like it's the single most important issue — making people fear for their lives is good business. I'd defend freedom of speech to the death, but the news business just seems too much like racketeering. We need to fight the fear.
> terrorism will never touch the average citizen anywhere near the extent of other tragedies like illness, accidents or natural disasters
This argument ascribes zero weight to the injustice of terrorist attacks. Your logic--that a death is a death--does not admit distinguishing between someone dying in a freak accident, someone being killed by a drunk driver, and someone being murdered in cold blood. It's all the same.
You're ignoring a very fundamental aspect of human psychology: people view a death very differently based on the intent of those doing the killing. Unlike murder, terrorism isn't just an attack on one person. It's an attack on the values, religion, economy, and lifestyle of a whole society. That's why people weigh it so heavily.
Possibly.
An individual murder is unlikely to have an impact on you. A serial killer will often impact a demographic in a city. A mass murderer will often impact a demographic in a large area or country. A terrorist will usually target a demographic in many countries, but on a much smaller scale than a mass murderer.
I currently think a mass murderer would be a greater threat to the world collectively, but the terrorist triggers the fear that any ___location might be attacked. Hence, while much less destructive, terrorism has the "me" factor that pulls at the heartstrings of society at large.
They can search my phone and computer with a court order. What they should not be able to do is force companies to compromise proper encryption so they always have the ability to find something.
That's not the only angle here. Apple was asked to aid the FBI in attacking a phone, not to design bad crypto. (They may have also been asked to design bad crypto, but that's not what is happening here.)
Except that the authorities do want companies to use bad crypto. The only reason they've had to fall back on demanding an attack vector is because they haven't yet been able to force their preferred solution (bad crypto) to be implemented.
Demanding an attack vector should be seen as the same concept as demanding bad crypto, because the intent behind the request is the same. They're trying to convince us that these are different requests, but the end result is the same. A workaround to attack good security is the same as having bad security to begin with. I can't imagine why anybody would think that "bad crypto" and "attack vector" are not very nearly the same thing.
But what you are forgetting is that Apple has been fully compliant and cooperative throughout this investigation. The problem with building a backdoor into a highly encrypted security system is that it gives others pathways to find the same backdoor. If other hackers knew there was a for-sure way to gain access and hack an iPhone, they would find that path. With today's plethora of technology, a line needs to be drawn in order to protect our privacy. We hold so many personal details inside our phones, and if by some chance the backdoor were to be released, chaos and panic would run rampant. I can understand completely why Apple deems this process "too dangerous".
The pathway is obvious: build a signed image that lets you guess unlimited passwords at maximum speed. Apple doesn't have to do it to make it apparent it would work. The avenue is already in use:
As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable.
The special "backdoor" Apple has access to:
Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware.
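To make the idea concrete, here's a minimal sketch of that kind of signature gate, using Ed25519 via the Python cryptography library. The key names and the single vendor-held signing key are assumptions for illustration only; this is not Apple's actual scheme, just the general shape of "device only boots vendor-signed firmware".

```python
# Minimal sketch (illustrative assumptions, not Apple's real implementation).
# A device ships with the vendor's public key baked in; it will only accept
# firmware images whose signature verifies against that key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side: the private signing key never leaves the vendor.
vendor_private_key = Ed25519PrivateKey.generate()
vendor_public_key = vendor_private_key.public_key()   # burned into the device

firmware_image = b"...new firmware blob..."
signature = vendor_private_key.sign(firmware_image)   # only the vendor can produce this

# Device side (e.g. in DFU mode): refuse anything the vendor didn't sign.
def device_accepts(image: bytes, sig: bytes) -> bool:
    try:
        vendor_public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_accepts(firmware_image, signature))       # True: vendor-signed
print(device_accepts(b"third-party image", signature))  # False: signature doesn't match
```

This is why the FBI needs Apple's cooperation rather than loading their own image: without the private key, no one else can produce a signature the device will accept.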
Because they can't. A court order can't let them fly. Nor can it compel you to build them wings. (Though you may have to buy them a can of Red Bull.)
The police were already searching you and your house so we enacted rules to try to control that. Those rules didn't enable the searching - they placed restrictions on the applicability of evidence to reduce the desire to search improperly.
If they can get in. Suppose your house is made of some material (10' thick steel?) which cannot be broken into without the use of a nuke (bear with me...). Is the government allowed to nuke that neighborhood just to get into the house?
>I suspect we'll ultimately wind up with mainstream device manufacturers maintaining some kind of per-device master key that they turn over when ordered.
It will be impossible to secure that database. Any number and size of bribes would be worth spreading around to gain a dump of it. Once I have a dump, you're up shit creek, because you're faced with changing every key of every device you've ever sold to resecure them. However, any key-changing mechanism you create is an exploitation vector criminals can use to randomly scramble their "master key" and not report the new one to you, and again, any certs you use to secure this process will become very cost-effective to bribe for. So we're left with a situation where one breach screws everyone, and it hurts legitimate users way more than the criminals it is targeted at.
Back doors are back doors man. Anyone can use the door handle, once it exists.
These kinds of master keys already exist, so this is not something hypothetical. I am talking about the private keys of trusted root certification authorities. If they were as easy to get as you say, nobody would use TLS, since it would be useless.
You're vulnerable to MitM if adversaries have issued fraudulent certificates for sites that you use. And we know that this has happened, more than once.
> The more important question is "Should the government be able to access citizens' digital data with a court order? And if so, how can that be enabled without compromising the general security of the device?"
The government already does have access to these citizens' data. That data simply has the quality of having been encrypted.
If I have printed encrypted text and lock it in my home safe, when the government gets a court order to search my house, they'll have complete access to those documents. They can't compel me to decipher those documents though, it's on them to break the encryption.
It's the same situation here. Here, Apple may be able to perform that decryption service for the government.
The question to me is, should the government be legally able to press a person or firm into service?
I think a better analogy for this sort of thing is an unknown ___location. It's like you took some unencrypted text and hid it in a tree in a national forest. Everyone strongly suspects that the unencrypted text existed in the past. It's just that only one person knows how to access that text in the present.
The problem with the locked box analogy is that it implies that the ciphertext is in some way equivalent to the plain text. In practice that isn't really true. If you don't have the key, you don't actually have the plain text. Nothing says you can't look for it if you want. You could even hire a company to help you look. Can you force a company to look if that search would be against their business interests?
Ah, I'm not using the safe in the above as an analogy for the ciphertext; in both cases I'm talking about encrypted text being stored (printed physical form or digital data form).
Your scenario is wonderfully worded and poses a great dilemma. I personally believe that the government should not be able to press a person or firm into service. In today's world of technology, we tend to forget that what we have on our phones is simply an electronic copy of what would once have been on paper. "Printing out" our phones in the encrypted form you refer to would not be of any use to anyone but the primary owner of the data. Just because we have the means to decipher documents does not mean we should, and it should ultimately be up to the person/firm to decipher said documents.
If the encrypted data is stored on a hard drive in that safe, the government can compel production in some cases, blocked mainly by a legal quirk involving the Fifth Amendment:
IANAL, but I think your original hypothetical would be roughly analogous. I think the difference in this Apple case is essentially the amount of effort required (producing a password versus actually writing [a modicum of] new OS software), but I could be wrong.
My non-lawyer understanding there was that, in cases where the government has already seen the evidence, decryption is not self-incrimination. But in general, excepting that, they can't compel the owner, given the possibility that doing so would force the owner to self-incriminate.
The other question is if it's legal for a company to design their devices so that it is technically impossible for them to comply.
Is there any practical reason why the secure enclave could not be programmed such that any firmware update to the enclave requires the PIN, and erases the key if the PIN is not provided? That way Apple is still able to make updates to the enclave, but cannot be compelled to bypass security restrictions on locked devices.
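A rough sketch of the policy being proposed, in Python. Every name here is invented for illustration; this says nothing about how the real Secure Enclave update path works, it just shows the "update without PIN wipes the key" idea.

```python
# Hypothetical policy sketch: enclave firmware updates require the user's PIN,
# otherwise the protected key is destroyed. All names are illustrative only.
import secrets

class HypotheticalEnclave:
    def __init__(self, pin: str):
        self.pin = pin
        self.master_key = secrets.token_bytes(32)  # protects the user's data
        self.firmware_version = 1

    def update_firmware(self, new_version: int, supplied_pin=None):
        if supplied_pin == self.pin:
            # Owner consented: keep the key, apply the update.
            self.firmware_version = new_version
        else:
            # No valid PIN: the update still goes through, but the key is
            # erased first, so new firmware can't be used against existing data.
            self.master_key = None
            self.firmware_version = new_version

enclave = HypotheticalEnclave(pin="1234")
enclave.update_firmware(2)           # forced update, no PIN supplied
print(enclave.master_key)            # None: data is gone, but nothing was bypassed
```

Under such a policy Apple could still ship fixes to consenting owners, while a compelled update to a locked device would only ever destroy the data it was meant to expose.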
Pure speculation: I wonder if v2 of the Secure Enclave was going to do just that, but Apple wanted to make sure there weren't glaring software flaws in v1 before they lock themselves out of the chip for good.
It would be difficult to design a law in a Napoleonic system that disallowed that (designing their devices so that it is technically impossible for them to comply), although in the American system I guess it would be possible.
We've come close to this already (mandated Clipper chip). Which would have been a disaster, in many areas.
IIRC, the ITAR rules even prohibited systems that lacked crypto, but that had pluggable APIs which permitted crypto to be added later (e.g., by the end-user). It's hard to see how legislation to the same effect would pass constitutional muster, but it's a gamble, and the administration has little to lose.
>Should the government be able to access citizens' digital data with a court order?
I don't think that's a good question at all. Regardless of the ruling, they aren't going to up and declare encryption illegal. You can't throw a legal requirement at math to stop working.
This is dangerously naive. You think the government can't do something illogical? You just wait for it. You think the government can't outlaw the use of something, even if it's an ineffectual half-measure that does more harm than good? Were you alive during the "War on Drugs"?
If you've studied history, you know they passed Prohibition, which banned alcohol. It did not work, as bootleggers and gangsters would make their own alcohol and sell it at hidden clubs. Stores would sell fruit juice next to sugar and yeast, and all the customer had to do was find a way to mix them to make alcohol.
They eventually had to repeal Prohibition.
When they banned drugs, they did the same thing. It is Prohibition all over again. Criminals and gangs got into dealing drugs for money, and people can't give them up. The war on drugs doesn't work because the CIA makes money from drug lords, and gangs and criminals make a lot of money selling drugs on the street. I managed to grow up without getting into drugs, but during the 1970s and 1980s there was a lot of peer pressure to do drugs with others when I was growing up. It had become a social thing.
Many prisoners are in prison for using drugs, so our prison system is overcrowded as a result. All the war on drugs did was imprison drug users, not go after the gangs and dealers.
The government doesn't respect privacy rights. So they violate privacy in order to crack the encryption on an iPhone via a backdoor, a master key, or a modified iOS that removes the passcode penalties and brute-forces the passcode. They only see getting the data off the iPhone to use as evidence, and don't care that it causes iPhone users to lose their privacy and makes iPhones less secure.
The government isn't logical about these things; judges and politicians are for the most part not tech-savvy enough to understand the technology and the need for privacy. They studied law and other subjects, not technology, and in most cases don't understand how it works. They see getting evidence on terrorists as a priority and don't care what rights it violates. They could pass a law that presses criminal charges against Apple employees and management for not helping out, and try to force them to do what they say. We faced the same issues in the war on terror, which required domestic spying and collection of metadata via the Patriot Act.
> You can't throw a legal requirement at math to stop working.
Well, that hasn't stopped them trying in the past [1] - and we now live in a world where, hilariously/terrifyingly, certain numbers are just flat-out illegal [2].
In my mind, a large part of the problem comes from the fact that the people making these rules not only have no understanding of the underlying science, but they aren't even expected to _try_ and understand it, even at a cursory level.
Which in itself might not actually be so bad, except we've seen examples time and time again (especially in the UK) where ministers' feelings or beliefs trump expert opinion and/or hard science [3]
Not only that, seeing "the government" as a single entity is at best naive.
Can a schoolteacher access a citizen's digital data with a court order? Can the FBI access a citizen's digital data with a court order? Can a court access a citizen's digital data with a court order?
Those questions are fundamentally different and have varying support in lawmaking.
We can. But we'd still reduce the accidental gun deaths tremendously—as well as killings by people who weren't previously criminals. Yes, we probably won't get the guns away from criminals—there are too many guns, and they've been around for too long. But we'll still reduce the deaths.
> "Should the government be able to access citizen's digital data with a court order? And if so, how can that be enabled without compromising the general security of the device?
IMO yes and here is how I'd implement this given a secure enclave:
1) Have Apple generate a device specific key that unlocks the device without pass code.
2) The key gets encrypted with an Apple held master key and printed on the inside of the device, e.g. a sticker on the chipset. Note that at this point, the device is as secure as before if we assume that someone who gets the Apple certifier key can exchange the secure enclave software at will.
3) Apple should ensure that the unencrypted key gets deleted from all of their systems after it was printed, to ensure plausible deniability.
4) The process for accessing the data on a specific device becomes simple - law enforcement needs to be in possession of (A) the device and (B) a court order. Apple will decrypt the key if forced by court order.
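A minimal sketch of that escrow idea in code, with symmetric encryption standing in for whatever real scheme would be used; the key names and the use of Fernet are assumptions for illustration, not a concrete proposal for how Apple should implement it.

```python
# Sketch of the proposed per-device escrow scheme (illustrative assumptions only).
from cryptography.fernet import Fernet

# Steps 1-2: at manufacture time, generate a per-device unlock key and encrypt
# it under an Apple-held master key. Only the encrypted blob is printed inside
# the device; the plaintext is discarded (step 3).
apple_master_key = Fernet.generate_key()          # held only by Apple
device_unlock_key = Fernet.generate_key()         # unlocks this one device
escrow_blob = Fernet(apple_master_key).encrypt(device_unlock_key)
del device_unlock_key                             # "deleted from all of their systems"

# Step 4: with physical possession of the device (the sticker) and a court
# order, Apple decrypts the blob to recover that one device's key.
recovered_key = Fernet(apple_master_key).decrypt(escrow_blob)
print(recovered_key == Fernet(apple_master_key).decrypt(escrow_blob))  # True
```

Note that everything still hinges on the secrecy of the single Apple-held master key, which is exactly the weakness the replies below focus on.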
I think we can assume that, up till now, Apple has its private keys protected on some disconnected machine in an HSM, perhaps multiple HSMs, that are only accessible with very restricted access. When there is a gold iOS build, some key personnel will go through a security procedure to get the build signed.
Now suppose that such a procedure is put in place, and you get requests from US courts and Western European courts[1]. Once it's there, there will probably be hundreds, perhaps thousands of daily requests. It suddenly becomes infeasible to keep the same security procedures. More people need access to the master key to handle all the requests for which there is a court order. Or even worse: there has to be a semi-automated procedure to handle such requests.
In any case, any such procedure will weaken the security of iOS devices significantly. Since Apple's master key is so extremely valuable, it will be an extremely attractive target for attacks (in the semi-automatic scenario); or it's easier to find a corrupt person to leak the keys.
Since people store anything from personal photos, access to bank accounts, to business plans on their phones, we should move into the direction of making crypto systems stronger, not intentionally weakening them.
[1] Let's avoid the slippery slope that this would trigger for not-so-nice regimes for a moment. Although, once such a backdoor is there, it will happen.
I concede, these are good points. I'd like to add though, that it would be better overall to have a solution that at least tries to limit access to specific devices, rather than Apple being forced by the government to build in general backdoors. But I can't really think of a better solution. Maybe have a hierarchy of keys, one per some serial number suffix, each in different servers in different buildings and accessed by different people, so if one gets lost it at least only affects a fraction of devices on the market?
I think that we as a society (though I am not American) should decide first whether we think backdoors for court orders are worth the risk at all.
Let's be honest with ourselves, the chance to be killed by terrorists is minuscule compared to other factors that are under human control (air pollution, traffic incidents, homicides).
If you are looking at non-terrorist criminal activity, one has to wonder if you don't have more serious problems than unlocking a bunch of iPhones when your homicide rates[1] or traffic-related deaths[2] are higher than nearly any other Western country.
Terrorism and crime are just easy distractions to get more power.
And if the Apple master key is ever revealed or stolen, everyone's phone can be decrypted. So yeah, this plan still compromises the general security of the device.
As is the case with Apple's software certifier key. AFAIK this is what we currently have to assume since Apple doesn't specify how the update process for the secure enclave works. If this is the case it doesn't change anything, the certifier key is already like a master key.
Replacing the firmware (Secure Enclave or not) with a version that doesn't include rate-limits and auto-wipe isn't the same as having a crypto backdoor. Right now, your data is still protected by the complexity of your passphrase. This is not true for your scenario.
No. Whatever backdoor you enable for the US government can be used by other governments. When traveling internationally I would rather just deal with the minuscule increase in terrorism risk in exchange for a phone that can't be decrypted by anyone but me.
> "can the All Writs Act compel a company to proactively defeat its own security measures?"
I think there's a qualifier on there that you're missing. It's not just if the All Writs Act can compel a company to undermine their security measures. It's "if a company can undermine their security measures, can the All Writs Act compel them to do so?"
I think the answer is, arguably, "yes."
But that's a meaningful distinction because the 5C lacks trusted execution, and newer iPhones have it. I just read an article where Apple claims the precedent set in this case could force them to undermine phones with trusted execution. And that's probably true, too, IFF they can undermine it themselves.
But the All Writs Act certainly cannot bar a company from creating something that cannot be undermined.
So they'll make firmware that can't be modified without wiping the device.
I wonder if there's a way of using software licensing to bypass the issue. Consider this. What if there was a piece of encryption software where the copyright was held in the UK (i.e. not in the US). The key thing about this encryption library: it has a special license. One of its clauses says that it may not be edited, it may only be compiled by clang, may only be distributed in the form that results from a good-faith compilation under clang, and that - as a condition of use - the distributor may not implement workarounds. The US court would have no jurisdiction for ruling on the actions of the copyright owner. And Apple could defend its behaviour by saying that intellectual property law means that it is not within its rights/mandate to modify the included software.
What is it you are seeking a 'citation' for? This is so obvious nobody with a legal background would question it, let alone write about it. The gp is so silly it doesn't warrant discussion. Law is not a logic game with a closed rule set, it's nonsensical to reason based on that assumption.
The best way (IMO) is to make it unbreakable, and off by default. Then the people who care about privacy can have it, and the justice department can be happy most of the time.
But I am sure a large portion of the people involved in activities where the government wants their info will have it turned on. I'm sure if Snowden's phone was encrypted they wouldn't go "Oh well, ~90% of Americans don't have encryption turned on, so we'll just be happy with that."
Snowden is a poor example here. He worked in intelligence, a field the government would logically and rightly expect to value encryption. Thus, Snowden would fit in by using encryption. He'd stand out if he didn't.
Perhaps the middle part of the argument is missing?
People who care about privacy == people who use encryption. People who use encryption already have all of their communications flagged and stored, if the rumors are true.
I disagree. This surely applies to the tech sector, but what about the wider world? Many people have all sorts of privacy habits (close curtains, have tall fences and hedges, they shred their mail or don't discuss wages, etc) yet don't understand or practice anything similar in the digital world.
It's not actually his phone. It was his "work phone" and it belongs to the San Bernardino County Department of Public Health, who is very willing for the FBI to read the data.
> "Rather it plays into Apple's argument about unleashing a torrent of court-ordered demands which will have adverse consequences for security."
I wonder if that would have negative economic consequences for the US, e.g. tech companies that value privacy moving to Europe or elsewhere and customers avoiding US tech products.
If so, I imagine that would probably be the best reason for the US government to not proceed with this?
> Doesn't seem like good timing on the government's part to publicly announce the intention to seek numerous iPhone "unlockings"
That was my immediate gut reaction: that it looks bad because Apple's objections are proving true.
But now I wonder if the timing maybe works to their advantage, and the admission they want to use phone cracking everywhere is perhaps a very smart strategy. To technologists and privacy advocates, this might look like an admission of guilt. To the public at large, and the entire country, this might actually be a mountain of evidence that phone cracking is absolutely necessary, something we can't do without if otherwise phones are going to become unreadable by the "good guys".
For better or worse (and IMO it's worse) this might be a very strong argument with very good timing.
Do we know what the software running on the secure enclave can do, exactly? If the software itself doesn't have access to the master key (i.e. can't read it directly), and if the slow authentication is inherent in running the algorithm on the chip, rather than having been slowed down artificially, then even an update to the secure enclave would not compromise the security of the system.
Do we know exactly what the secure enclave software can do and what it can't do?
> if the slow authentication is inherent in running the algorithm on the chip, rather than having been slowed down artificially
There is no known cryptographic algorithm that provides inherently increasing times. The slowdown is entirely artificial, and it is believed that an update could remove it.
I was under the impression that the algorithm (running on the secure enclave?) takes at least 200ms. Yes, obviously longer times are artificial, but my question is whether the 200ms baseline is artificial or not.
Of course this doesn't really matter at all if modified software could just read the key from the chip. That is the most interesting question. So, can it? Or can the software only provide data to the chip, to do the algorithm in hardware, but the software can't read the key burned (?) into the chip.
Their hash algorithm takes ~80ms per attempt. All crypto happens on a dedicated AES engine. Files are encrypted with a key derived from UID (device key which cannot be extracted via firmware) and passphrase.
With a sufficiently complex passphrase, it's not a problem. With a typical 6-digit PIN, disabling the artificial delay and auto-wipe (which is possible via firmware updates) is all you need for a successful brute-force attack.
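For a feel for the numbers, here's a back-of-the-envelope calculation assuming the ~80 ms per-attempt figure above and no artificial delays or auto-wipe. These are rough, worst-case figures for illustration, not measurements of any real device.

```python
# Rough brute-force estimates assuming ~80 ms per guess once the artificial
# delay and auto-wipe are removed (ballpark figures, not measurements).
SECONDS_PER_GUESS = 0.08

def worst_case(keyspace: float) -> str:
    seconds = keyspace * SECONDS_PER_GUESS
    return f"{seconds / 3600:.1f} hours ({seconds / 86400:.1f} days)"

print("4-digit PIN :", worst_case(10**4))    # ~0.2 hours
print("6-digit PIN :", worst_case(10**6))    # ~22 hours
# A modest 8-character passphrase over ~62 symbols is a different story:
print("8-char pass :", worst_case(62**8))    # on the order of hundreds of thousands of years
```

Which is the point being made here: removing the rate limits only matters because typical PINs have a tiny keyspace; a real passphrase keeps the data out of reach regardless.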
> “What we discover is that investigation into one crime often leads into criminal activity in another, sometimes much more serious than what we were originally looking at,”
They want access so they can go fishing too? They're really not doing a good job of sticking to the 'necessary and proportionate' line.
So they are publicly stating that they are not really interested in just solving the case and prosecuting the offender. Instead, they want to see what else they can stick to the man.
I'm sure if you just dig deep enough, you will find some crime in everyone's data. Guilty until proven innocent.
>We are to look upon it as more beneficial, that many guilty persons should escape unpunished, than one innocent person should suffer. The reason is, because it’s of more importance to community, that innocence should be protected, than it is, that guilt should be punished; for guilt and crimes are so frequent in the world, that all of them cannot be punished; and many times they happen in such a manner, that it is not of much consequence to the public, whether they are punished or not. But when innocence itself, is brought to the bar and condemned, especially to die, the subject will exclaim, it is immaterial to me, whether I behave well or ill; for virtue itself, is no security. And if such a sentiment as this, should take place in the mind of the subject, there would be an end to all security what so ever.
Amazing. Either I'd never seen this quote before, or didn't understand the huge negative implications of punishing the innocent on society.
If innocence isn't held in the highest regard, then society itself collapses as people no longer deem it necessary to act in an ethically- and morally-superior manner.
I see many signs in modern society that this maxim was ignored.
they came first for the nouns,
And I didn't speak up because I wasn't a noun;
And then they came for the personal pronouns,
And I didn't speak up because I wasn't a personal pronoun;
And then they came for the gerunds,
And I didn't speak up because I wasn't a gerund;
And then . . . they came for me . . .
"The judge has indicated skepticism over the government’s demands. Initially, Apple agreed to a formal order to help the Justice Department gain access to Mr. Feng’s phone, but Judge Orenstein balked, questioning whether the All Writs Act could be used that way. He invited Apple’s lawyers to raise objections."
I think that's a misunderstanding on the part of the author.
The judge's initial order was issued ex parte -- meaning the judge issued it to the justice department without Apple lawyers being present to argue against the order. Instead, as it was issued ex parte, the judge invited Apple lawyers to respond to the order after the fact by offering reasons it was an undue burden.
Essentially, Apple wasn't there to agree or disagree to it. The judge encouraged Apple to object to it in recognition of the fact that the order was issued ex parte, not to nudge them out of inaction.
Precisely, and it's worth mentioning that ex parte filings are where a lot of dubious things happen, as that format by definition doesn't really have a traditional adversarial format. Civil forfeiture is the other big example.
This sounds exactly right. From an article I posted earlier[1]:
> In interviews with BuzzFeed News Wednesday, the former officers with the FBI and NSA acknowledged that U.S. intelligence agencies have technology that has been used in past intelligence-gathering operations to break into locked phones. The question, they added, was whether it was worthwhile for the FBI to deploy that technology, rather than setting a precedent through the courts.
This article seems to confirm that Law Enforcement is going to do its best to set a legal precedent.
> The fact that they can resist the order and in a such a public way feels a little bit theatrical, given what we know about how these things work.
I wondered that too. I read that Apple wanted to keep the debate quiet, and that it was the FBI who wanted to have a public debate.
Anyway, here we are having the "public debate about security vs. privacy on computers"
FBI sympathizers complain that some people are talking in extremes, and that the issue needs a balanced, nuanced approach like Bill Gates has offered.
Tech experts tell them that encryption can't be outlawed and the FBI is deaf to it
Regardless of whether we're talking about zero knowledge devices or not, now is the time to have the debate, because compelling Apple to act here is one step away from telling them they can't legally create a completely secure device. If we do this, so will China and other oppressive regimes, and it will hurt the global cause of free speech irrevocably.
> Anyway, here we are having the "public debate about security vs. privacy on computers"
What I think we're having is a debate about access vs. security.
The government wants access to be more important than information security.
Apple, and I guess people educated on the topic who aren't the government, want information security to be more important than access.
If one actor has a magic key to a system, n actors have that key, because of logic and the fallibility of human systems. The government has proven too often that information security is not its chief concern, and we should expect they'll not perfectly protect the capability they're asking for either.
Do we care to keep information safe, or "safe"? I think that's what the legislature is going to have to decide.
That's commonly assumed, but it's not quite accurate. It's not about "data originating from a foreigner". Instead, the criterion[1] is "foreign data" or "data that traveled over a foreign connection". It's not the person that must be foreign, but the data or the specific wire on which it travels.
Several sources have explained how the NSA captures domestic data when it travels internationally, such as email (Gmail) that is stored or processed at a foreign data center. This probably allows almost any data to be captured with some clever packet re-routing.
[1] this may be from one of the infamous "new interpretations" of FISA/etc
I am not a lawyer (and not even an American), but from my perspective, I don't see how they can lose. They have a warrant. I used to work on telephone switches, and we added code that would allow someone with a warrant to record conversations all the time. We even had a generic facility for it.
The main difference, of course, is that adding an ability to allow someone to intrude on privacy for a service is different than adding functionality to a product that allows someone to intrude on privacy. The service provider can act as a gatekeeper to ensure that a warrant is provided. If you make a special build that anyone can load onto a device, then, of course, there is no gatekeeper for the warrant.
The key is that I am aware of no law that requires a gatekeeper. Services ask for warrants out of respect for their customers. Often they don't respect their customers and provide access without a warrant. It's not like the software we added to the telephone switches requires some kind of special key or anything. There is no oversight and there has never been any oversight.
The thing is that (as an American in US) you are under no obligation to hand over your cell phone, unless the police have a warrant. Once they have a warrant, then I don't think you have any recourse. As we have seen, you can be found in contempt of court if you do not reveal your password anyway.
So from the perspective of the law, I think it is relatively straightforward. It's going to have to go all the way up to the Supreme Court. I imagine that the Supreme Court would choose to hear this case. Essentially you will have to argue that compelling a company to create an exploit to satisfy a warrant constitutes an unreasonable search if that exploit can also be used to perform illegal searches. I think the case would be heard, but I doubt very much that the argument will succeed. If the exploit is constructed on the condition that it is only used to satisfy the warrant in question, I'm pretty sure that it will be acceptable. We all know that such a condition need not be followed, since the FBI et al. don't care if the evidence they gather is admissible in court. However, I doubt it will matter legally.
It is not settled law in the U.S. whether being forced to hand over a password is "contempt of court" or actually self-incrimination protected by the Fifth Amendment.
Wiretaps were ordered into the U.S. phone and internet networks via CALEA and FCC rulings, which basically meant that all network and phone switches internationally would have the back door. (See what happened to Greek politicians thanks to that.) But the phone portion of those rules came from a very deliberate act of Congress. (Don't get me started on CALEA and the Internet...)
The All Writs act is vague as hell.
And there are definitely laws that require legal process; your telephone records, for one. That's why the Congress passed a law retroactively immunizing AT&T, Verizon, et. al. for helping the NSA collect communications data on American citizens in plain violation of communication privacy laws.
In this case, NOTHING is straightforward. Apple could win on First Amendment grounds; the assistance asked for could be determined to be too burdensome. The Fourth Amendment may not even be an issue.
I'm not even sure this will go to the Supreme Court. If the feds lose, they may well choose NOT to appeal on purpose so they can continue browbeating companies with All Writs act in other jurisdictions and under seal.
That's still not "settled." That's one state court ruling. Other states could rule differently. A federal court in that district could rule differently.
That said, if your threat model involves anyone technically sophisticated or any government actor, I would suggest not relying solely on a fingerprint ID to control access to a device.
The legal question here is really whether the All Writs Act can be used to compel development of new products with specific features law enforcement desires, because that's what it boils down to: they want a custom iOS build with custom security-bypass features that can be loaded onto any phone they want to snoop into.
And when framed like that -- government ordering a private company, without negotiation or contract or even agreed-upon-in-advance compensation, to develop a product -- I don't think there's a chance in hell of it standing up. The question is whether the court that ends up hearing the final appeal will frame it like that.
How is it different than getting a phone company to install a wire tap?
What you describe is not what's being asked for. What the FBI wants is a tool to get into this phone. They will definitely want to get into more phones in the future, but that's not the same as a single tool that gets into any phone. It can be a new one every time.
If it wasn't possible to make such a tool, I'd be a lot more outraged. But it is. Tell me why that can't be done.
The difference is simply that telecommunications are heavily regulated (see CALEA) where if you want to be a service provider you have to provide wiretap functionality.
Personal computers in general are not covered by any such regulation. It is legal to create and use encryption on your own device without building in a government backdoor. For now.
Ironically, the biggest threat to an unregulated computer future is the massive centralization of device control via platform DRM, in the hands of just a few companies like Apple.
CALEA was extended to ISPs once the industry coalesced into a much smaller number of central players.
The difference is that tools and equipment for wiretaps already exist. Tools that bypass the security features of more recent versions of iOS don't. Hence the FBI's desired insecureOS would be a new product, developed under compulsion from the FBI and the courts.
> Essentially you will have to argue that compelling a company to create an exploit to satisfy a warrant constitutes an unreasonable search if that exploit can also be used to perform illegal searches.
From the majority opinion in another All Writs case that made it to the Supreme Court[1]:
"[The lower court] was apparently concerned that sustaining the District Court's order would authorize courts to compel third parties to render assistance without limitation regardless of the burden involved and pose a severe threat to the autonomy of third parties who for whatever reason prefer not to render such assistance. Consequently the Court of Appeals concluded that courts should not embark upon such a course without specific legislative authorization. We agree that the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed."
Key point: unreasonable burdens may not be imposed
The order also said:
"The order provided that the Company be fully reimbursed at prevailing rates, and compliance with it required minimal effort on the part of the Company and no disruption to its operations"
So the FBI could be expected to reimburse Apple for weakening its product and future earnings.
More details on that specific case as it relates to Apple vs. FBI are available from a post by a computer crime law professor [1]
I don't think the question turns on unreasonable searches. Apple isn't being searched here.
Instead, the question is whether the All Writs Act grants sufficient power to compel a company, third-party to the action, to build a tool for breaking into its product. This is different to your telephone interception example, because that interception is authorised by a specific statute, not using a general writ issued under the All Writs Act.
The argument here is executive overreach - that they should be asking the legislature to pass a law specifically to require this kind of assistance, rather than using the All Writs Act.
The thing is, Apple made a phone with a backdoor. It has a backdoor already, unlocked by Apple's software signing keys. If Apple wants to continue to do that, but wants to mostly respect their customers' privacy, they need to create a facility to unlock it selectively upon seeing a warrant. It really isn't that hard to do, since they can make firmware updates that only work on specific devices. It can and probably will be argued that this is exactly like wiretapping, where it is up to the telecom to actually demand a warrant before giving the government access.
It would be neat if the Supreme Court would say this is unconstitutional on some ground, because that would seem to apply to other forms of wiretapping. They aren't fundamentally different. I'm sure the first time the government asked a company to help them install wiretaps, it seemed like a big deal.
I think some people are being intellectually dishonest when they say a backdoor can't be created without sacrificing security for everyone else. It can be done such that your privacy on an iPhone is just as good after a backdoor is created as it was before. Which is to say, not perfect because Apple had the key either way. If you want real security use a good passcode and keep it in your head.
You misunderstand. Apple can't unlock the phone with the signing keys, all they can do is make it easier to brute-force the passcode. If this person had used a long passphrase instead of a 4-digit code, even this software hack wouldn't help.
Orin Kerr, a computer crime law professor and writer, tried to guess over two posts [1] [2]. He did not give an answer but he has not finished writing his thoughts on the subject.
One thing he hasn't talked about yet is how the FBI would reimburse Apple.
The most recent relevant court case that used the AWA was US vs. New York Telephone. Part of the court decision states:
"The order provided that the Company be fully reimbursed at prevailing rates, and compliance with it required minimal effort on the part of the Company and no disruption to its operations."
In his analysis, Orin simplifies and assumes that the FBI could pay for Apple's services. I wonder if they could. Apple is being asked to weaken its product, and despite the government claiming otherwise, I think Apple knows best about whether their product is being weakened or not.
I still don't understand why Apple needs to save the FBI from its own incompetence in asking the government office that managed the iPhone to reset the password.
It seems to me that unless tech companies come together to defend Apple, we may see (unregulated and unaccountable) government become a very big part of all facets of computing.
I can't think of any way in which the FBI's incompetence would have legal implications for Apple's obligations. How would that work?
If they're able to help, and can be compelled legally, why would the FBI having previously had the ability to get the data but losing it to a mistake change their obligation?
The point is that the legality of the court order compelling Apple does not depend on whether or not the FBI made a mistake in other investigative approaches. If the order is illegal, then it doesn't matter whether the FBI needs the access it provides or not, it's still illegal. On the other hand, if the order is legal, then the FBI needing it due to a mistake doesn't suddenly make the order illegal.
Not the same. This is more like if the FBI wanted to investigate a space murder and demanded that Virgin Galactic build a special shuttle to get the FBI's equipment to the station, something only Virgin Galactic can do because they're the only ones who have the key to the space station.
And Space Station Technical Services had a key to the space station, but the FBI ordered them to throw it away some time before contacting Virgin Galactic and saying, "Put people to work building us this special shuttle, because otherwise we can't get in."
Then let's say the FBI handled a confiscated SSD drive wrongly, and some of the communication circuits of one of the flash chips holding the data got fried as a result. Your only bet to get data off that chip is the manufacturer of the chip. Could the FBI compel them to try?
But that might affect the reasonableness of the request. If the FBI can do it with off the shelf tools, it's less reasonable to ask someone else to do it than if there's no other option.
I know this is going to sound crazy, but there are parallels here between Apple vs. the FBI, and Superman vs. Batman.
Superman represents an unchecked power, and Batman finds this unchecked power to be unacceptable as a risk, in case the power is ever used against humanity.
Superman vs. Batman, they are making a new movie about that.
Superman caused a lot of damage in the Man of Steel movie, and many people died as a result of his fight with Zod, which ended with him breaking Zod's neck and killing him. The movie could have gone a different way if he had asked his father's hologram about Zod, and any weaknesses he might have, during the 24 hours he had to think it over. He could then have used the craft he came to Earth in to destroy Zod's ship's engines, defeating him without any property damage or killing innocent people.
But anyway, Batman always has a plan for taking down any super-powered being in case they go rogue or get mind-controlled. He stores the plans on his computer and makes technology or finds items that can weaken them or take away their powers.
For Superman he has a battlesuit and kryptonite. The battlesuit gives him strength and power to fight Superman, and the kryptonite weakens Superman so Batman can fight him.
Yet in The Dark Knight, Batman used a program that turned everyone's cell phone into a spying device, which gave him the ___location of the criminals the Joker employed and where the victims were. That's sort of abusing his own powers for domestic spying and making cell phones insecure by infecting them with an exploit that installs his own backdoor to get access to the cameras etc. to scan for things.
Given the powers these two characters have, Superman would kill Batman while watching a movie with Lois Lane and eating popcorn - without Lois realising he was gone for a split second.
ps. Not big on DC/Marvel comics/movies, they're all the same to me but keep some f*cking consistency plz.
"But anyway Batman always has a plan in taking down any super powered being in case they go rouge or mind controlled. He stores them on his computer and makes technology or finds items that can weaken them or take away their powers."
That's why DC invented Kryptonite, so Superman would have a weakness. And Batman is familiar with Kryptonite, and possesses some to use in a fight against Superman.
I'm under the impression that Android devices have had optional full disk encryption[0] for a long time, and that encryption isn't dependent upon cloud synchronization.
The plan was to have it enabled by default, but that decreased disk performance drastically because of limitations of the encryption hardware. So they scrapped that plan, and it's now only an option. Since it was optional, I doubt a majority of users looked at it, let alone enabled it. That's probably why Android doesn't figure in these discussions at all.
Having said that, high-end devices [1] from Marshmallow onward (6.0+) are supposed to have encryption enabled by default, according to the Android Compatibility Definition Document [2] so we'll probably see Android playing a bigger role in the encryption debate in the future. Budget phones are exempt from the requirement.
[1] - For device implementations supporting full-disk encryption and with Advanced Encryption Standard (AES) crypto performance above 50MiB/sec
At least it's easy to set separate boot and unlock passwords on Android. I'm not sure how likely it is for the police to keep the phone turned on, but to remove the SIM on most of them you'd have to remove the battery, which kills all chances of decryption. Even when kept on, dumping PoP DDR isn't a walk in the park..
Yes, but it is opt-in and requires the user to have the phone fully charged and plugged in for an hour while the process is performed. Few people bother doing it. The encryption hardware in most Android phones is also not as fast as the iPhone's, so there is a performance hit from encrypting.
I believe that by default Microsoft keeps a copy of your BitLocker key and is happy to provide it to the police when requested. Also, see Bill Gates recent comments on the topic.
"In a sweeping November report on encryption, Manhattan District Attorney Cy Vance wrote that Google can “reset the passcodes” on some Android phones without full encryption, when served with a search warrant. “This process can be done by Google remotely and allows forensic examiners to view the contents of a device,” according to the report."
Selective quoting FTW! It devolves into a he-said/she-said conversation.
> "Google has no ability to facilitate unlocking any device that has been protected with a PIN, Password, or fingerprint," [Android’s security chief, Adrian Ludwig] wrote. "This is the case whether or not the device is encrypted, and for all versions of Android."
Also note it talks about being able to unlock phones without full encryption. Which yes, understandable. But I'm interested in specifics at the level that we've been talking about with iPhones - eg, on phones with appropriate TPMs (like the new Nexus devices), is it possible to bypass?
EDIT: Reading the source post [1] more, it notes that
- Google has no ability to facilitate unlocking any device that has been protected with a PIN, Password, or fingerprint. This is the case whether or not the device is encrypted, and for all versions of Android.
- Google also does not have any mechanism to facilitate access to devices that have been encrypted (whether encrypted by the user, as has been available since Android 3.0 for all Android devices, or encrypted by default, as has been available since Android 5.0 on select devices).
- There are some devices that have been configured to use a "pattern" to unlock. Until Android L, "pattern" unlock did provide a recovery option with the Google account. This recovery feature was discontinued with Android L.
- The lost pattern recovery feature never applied to PIN or Password so if you are on an earlier model device and don't want to use the pattern recovery feature, you can switch to a PIN or Password and it will be disabled.
Two-thirds of Android devices in use are still running versions of Android older than L which allow Google to remotely bypass the unlock pattern on demand from the authorities. Companies are still releasing new low-end smartphones with these older Android versions - for example, Walmart recently started selling several $20 smartphones with Android 4.4
At least now all doubt is formally removed that they want to use the precedent set in the SB case for more mundane requests. Hopefully this should make Apple's PR fight easier.
I think I need someone to explain this to me because none of this hype makes any sense.
I am usually wary of oversimplifying these sorts of controversies, but this one does seem exceedingly simple to me. It goes like this:
Is it possible, even with Apple's help, to break the encryption of <insert device model here> without knowing the encryption key, given that the passphrase provides reasonable entropy?
If the answer is other than a flat, unambiguous "no," enjoying the consensus of the scientific and security communities, then that device is simply not secure, right?
...and, to extrapolate just a bit: when devices that are secure by this definition are in the mainstream (and my understanding is that even current iPhones, unlike the one at issue in this case, are), then this is entirely moot, right?
The specific technique/functionality they are asking for in this case most likely only works on the 5c, but there are also reports that the auto-wipe on the 6s which is handled by the secure enclave could also be disabled on a locked phone.
Even if both of these particular vulnerabilities are closed, it's almost a guarantee that additional vulnerabilities exist which Apple could be forced to exploit.
The next best example is perhaps police forcing Apple to wiretap iMessages in real-time. They have tried countless times and Apple has said they do not have the capability and refused to create it. But it is clearly technically possible from the design. Closing that vulnerability would require significant changes to the iMessage UX. Some people argue we can't trust Apple and they should make those changes in any case, but obviously Apple is willing to impose some level of trust in their own infra in order to gain UX advantages.
I wish I had a timeline that showed me the frequency of stories coming out on this.
I am only spitballing here, but does anyone else think the heat has risen for federal law enforcement to set some precedents on this stuff now that Antonin Scalia has died? My hunch is that with another liberal judge on the Supreme Court there may be a push to have some of this type of case heard at the highest court.
> I am only spitballing here, but does anyone else think the heat has risen for federal law enforcement to set some precedents on this stuff now that Antonin Scalia has died?
Scalia often ruled against law enforcement's attempts to abuse their search powers.
On any case where he would have had a deciding vote, the result will now be 4-4. That means that the lower court decision stands, that there is no national precedent (appeals court decisions are only binding precedent within their circuit), and that the Supreme Court is very likely to take up the case again when there's a 9th judge. Essentially as if they had never taken the case. I doubt the current members of the Supreme Court would agree to waste their time hearing a case where they know it'll end up 4-4.
It seems hard for me to believe that Apple now has at least 10 instances where the government is trying to force them to decrypt their phones, and this hasn't happened with Google or MS yet.
I don't recall ever hearing a big standoff with MS refusing to decrypt a Windows desktop or server.
Is encryption just that much more common on Apple devices? or is Apple just the first ones to make this all public?
Note that this is the open-source, made-by-one-guy-in-his-spare-time version, so it has some caveats. But I'll bet you dollars to donuts that the three-letter agencies have their own, more capable version.
That is a disingenuous representation of how the attack works. It attacks OPSEC, not BitLocker itself. Any full-disk encryption is "vulnerable" to this kind of attack.
Not really. Full-disk encryption using Pointsec/other commercial offerings, or as you typically do it on Linux with LUKS+dm-crypt, asks for the passphrase before the OS has loaded any FireWire drivers. In that case a fully shut-down computer is not vulnerable to this attack, i.e. you have protection against evil maids, thieves, the FBI, etc.
But with BitLocker, it only requires a password at Windows login, and by then all the FireWire etc. drivers are up and running. So you have no protection for computers that are stolen/seized by law enforcement.
IIRC BitLocker with pre-boot authentication mitigates DMA attacks. Most Windows hardware doesn't come with FireWire or Thunderbolt ports nowadays. Microsoft recommends pre-boot auth for devices with DMA ports.
These are fair points. But for businesses in particular, it's a problem, since many skip (or are unaware of the need for) pre-boot auth, and business laptops still pack FireWire ports, if not on the laptop itself, then surely on the docking station.
Maybe this whole thing will turn out to be a giant Streisand Effect that gets even more people using encryption and calls out the companies who aren't doing a good job.
Part of Apple's winning strategy here is that they claim they don't sell your data to advertisers. Google and MS are a different story. This means that Google/MS have a much bigger incentive to store your data in "the cloud" so that they can read it. As Apple has always claimed privacy as one of their features, by fighting this, they stick to that narrative.
It wasn't Apple's choice really. This is just the first time they've been asked in a public trial. Who knows how many times they (and others) have been asked in a FISA court.
Remember that time TrueCrypt got hijacked by a three-letter agency and shut down while recommending BitLocker? That makes Microsoft's willingness to cooperate clear.
"This is a specific case where the government is asking for access to information. They are not asking for some general thing, they are asking for a particular case" --Bill Gates
They are asking for 10 particular cases, each of which requires a warrant. The point stands that this is not an incident of mass or unwarranted surveillance.
> The Manhattan district attorney, Cyrus R. Vance Jr., foreground, and New York City’s police commissioner, William J. Bratton, behind him, say they have about 175 iPhones they have been unable to unlock.
Wait, how many have they been able to unlock and how? Sheer luck?
As long as the "delete after ten failed attempts" feature is off, you can try as many times as you want. It'll be slow going (the phone institutes delays after a bunch of failures), but I'm sure feeding SSNs, phone numbers, birthdays, etc. into it gets decent returns.
>> still works even with the “Erase data after 10 attempts” configuration setting enabled. Our initial analysis indicates that the IP Box is able to bypass this restriction by connecting directly to the iPhone’s power source and aggressively cutting the power after each failed PIN attempt, but before the attempt has been synchronized to flash memory
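A toy simulation of the reported trick, purely illustrative (the class, timings, and behavior here are my own assumptions, not taken from the IP Box or from iOS itself):

    import random

    class SimulatedLockScreen:
        """Toy model where the failed-attempt counter is only written to
        flash *after* a wrong guess is evaluated, leaving a window in which
        cutting power keeps the failure from ever being recorded."""
        def __init__(self, pin):
            self._pin = pin
            self.failures_recorded = 0   # what would persist across reboots

        def try_pin(self, guess, cut_power_before_sync=True):
            if guess == self._pin:
                return True
            if not cut_power_before_sync:
                self.failures_recorded += 1   # the write the attack races against
            return False

    phone = SimulatedLockScreen(f"{random.randrange(10_000):04d}")
    for guess in (f"{i:04d}" for i in range(10_000)):
        if phone.try_pin(guess):
            print(f"recovered PIN {guess}; failures recorded: {phone.failures_recorded}")
            break

With the counter never committed, the 10-attempt wipe never fires, which is exactly what the quoted analysis describes.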
> Apple has in a number of cases objected to the Justice Department’s efforts to force its cooperation through a 1789 statute known as the All Writs Act
"(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.
(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction."
--
Note that this is not a law regulating any form of encryption or communication security, or spelling out what a company has to do to assist law enforcement; plenty of laws on such topics have been fought over, proposed, discussed and introduced through the years, like CALEA. This is just "we can demand anything we want."
The issue is whether this Act should be allowed to be used in such contexts. A precedent here could even render the normal lawmaking process unnecessary. Note there's nothing specific in that sentence from 1789. Who needs laws if anything goes?
In the US, these are three distinctly different things:
1. A person's "papers," memories and life details typically live as data on their personal smartphones and other devices. Currently, a phone/laptop seems to be treated like pocket lint or other personal property, so a warrant is typically not needed to access its contents, e.g., airport enhanced security screening, a traffic stop, etc.
2. Important papers may reside in a home in a safe or filing cabinet. A warrant or probable cause is needed to search a home.
3. A search warrant is not needed to search a vehicle, only probable cause. All kinds of road vehicles including van conversions and RVs are considered vehicles, not homes, even for the folks who dwell in them, and so there is currently little protection.
The fourth amendment needs an amendment to explicitly include one's personal electronic devices, hosted servers and cloud data in order to be congruent with the spirit of the law, because LEAs clearly do not respect sensible boundaries.
Furthermore, Apple's refusal is mostly a protest stance, and moot considering the offensive LE tools industry jumps with glee at every opportunity to provide solutions... if the 10-try wipe counter is unencrypted, it will be broken by third parties. (I hope this is not the case, and that it is an encrypted token of some sort.)
> if the 10-try wipe counter is unencrypted, it will be broken by third parties
It can't be encrypted with the same key as the rest of the hard disk, or it wouldn't be possible to increment it on a failed decryption attempt. But even if it is, you can always restore a previous state from backup.
To be honest, I still don't understand why the FBI can't brute force it (other than the time it would take), and how Apple's assistance could possibly help, if the encryption was done well.
There's nothing that makes the digital data on a phone any different or more private than the physical data in a filing cabinet in a home or office, both of which require the same level of judicial oversight to obtain (your first point is factually incorrect with regard to this case, as the FBI does have a warrant to search this phone).
Right, and they're in full ownership of that phone. They can do whatever they want with it. The issue here is they want to force Apple to create something brand new against their will. The analogy is a lot like those TSA-approved luggage locks that have master keys floating around, so that the locks really only keep out law-abiding people, as anyone who wants one can get it off eBay.
This is exactly the argument that the 'man on the street' is going to think of. There are reasonable checks and balances in most cases for searching things like houses or wire tapping phones, so people need some good arguments as to why it shouldn't apply to a phone's data as well.
Can't a legal hack be to move core security development out of US jurisdiction, and let Apple USA lease tech from Apple Iceland/Switzerland/Ireland/etc., as is done with taxes today? This would make Apple USA unable to fulfill any court order demanding changes to source code.
Question which I can't get out of my head: It would stand to reason that the ruling on this case would affect other commercial phone and/or OS manufacturers, but how would it reflect on some OSS projects, where the coders are not necessarily employees of the project?
It wouldn't have any impact, and that's the point.
To block all encryption apps, you'd need to censor the internet and check every digital device at the border. The government needs to understand that this is the implication of the direction in which it wishes to take us.
We are talking about physical phones produced by Apple, to which law enforcement has physical access. From my understanding this is more about Apple as a hardware manufacturer, who just happens to also produce the software for the same phone.
I fail to relate the scenario we have now to a software project (though no doubt the FBI will try to).
With hardware you typically have a clear manufacturer. However things might get interesting if more people make their own hardware using open-source designs.
> The judge has indicated skepticism over the government’s demands. Initially, Apple agreed to a formal order to help the Justice Department gain access to Mr. Feng’s phone, but Judge Orenstein balked, questioning whether the All Writs Act could be used that way. He invited Apple’s lawyers to raise objections.
This is curious, because I was under the impression (due to the lawyers on HN) that this is basically an open and shut case in the eyes of the law: the government has the ability to compel Apple to act. Why would the judge think differently?
(I'm genuinely curious as to the legal aspects of this and not taking a side either way.)
It's tricky. The All Writs Act clearly must have some boundaries as to reasonableness and the due process of the defendant.
What Apple is being asked to do here isn't simply to unlock a phone. They're being asked to use their engineers, money, and expertise to build a tool to defeat the very encryption they developed.
It's not clear the All Writs Act enables the government to simply order a search warrant recipient to create new technology for them.
There is also a First Amendment angle: as a result of the previous crypto wars, code is considered speech, so the act of forcing a developer to write code could be considered compelled speech.
From Wikipedia[1], apparently there is already quite a bit of existing case law for the All Writs Act:
"In the case U.S. v. New York Telephone Co. 434 U.S. 159 (1977), the Supreme Court established a three-factor test for the admissible application of the All Writs Act: the party ordered to perform an action cannot be too far removed from the case, the government's request cannot impose an undue burden on that party, and the party's assistance is necessary."
My guess would be that "undue burden" is probably what the judge is pondering. Possibly "the party's assistance is necessary" (it doesn't say "convenient"!) seems interesting in this case as well, if there are other means to break into the phone (NSA, John McAfee[2]).
I've already made my legal and political opinions in other threads on the topic and won't rehash them here, especially since so many other people are making the points better than I did.
However, here's something I haven't thought of, though I sort of hate to boil the thing down to a business proposition. The fact is that iPhone is a massive business. What all is a company allowed to claim as "burden" in the discussion of undue burden?
Let's say the FBI wins, and Apple is forced into this. Then the narrative in the mind of the public is that Apple has back-doored their phones and made them insecure.
Apple loses literally billions of dollars per quarter for some amount of time until they can repair the PR damage.
Is the loss of, say, 50 billion dollars in revenue over the next calendar year something a reasonable person would call "undue burden?"
What about other ancillary effects that cost either direct money or productivity? There are rumors of something like an iPhone 6c that is scheduled to be released perhaps soon. (Supposedly a revamped 4" phone like the 5s but with the latest hardware) After all this hubbub about a potentially insecure 5c, who is going to go buy a 6c without wondering if it has the same problems?
Casual tech watchers don't understand the nuts and bolts of this situation. They hear some things, read some things and go along with the popular media consensus.
If the alleged 6c were actually going to be launched in a couple of months, the branding, production, packaging, marketing would all have been bought and paid for already.
Does having to recalibrate the launch of a new product and all the costs that might incur count as "undue burden?"
Etc., etc. Maybe they need a significant portion of the iOS team to do this, and the work causes delays in the next version of iOS; the iPhone 7 has to be pushed back, misses the holiday quarter, and again, lost sales accounting for billions.
I think the potential impact on Apple's bottom line could honestly be taken into consideration of burden. Curious about what other people think. Is money just not talked about in these considerations?
> Is the loss of, say, 50 billion dollars in revenue over the next calendar year something a reasonable person would call "undue burden?"
I wonder the same. In another comment in this thread I mention that it's not just the undue-burden test that applies under the AWA according to U.S. v. New York Telephone; the FBI is also expected to pay for any work they ask Apple to do.
Yahoo was forced to deal with many requests from the NSA in 2008, only those dealings were unknown to the public until recently. [1] The last 8 years have not treated Yahoo too well.
It seems when you fight the government on privacy, you lose. It's better if you roll over and wag your tail like Microsoft
Another twist to this is that the hypothetical burden is not being borne by Apple, it's being borne by Apple shareholders. I might personally own 1 Apple share, so it's not a huge burden to me if the company loses $50B in revenue and the value of that share goes down 10%. But what about some individual who has invested $1M of her own money, her entire life's savings, in Apple shares and is about to retire and pay the bills by selling shares over time? I would speculate there would be hundreds of thousands of shareholders (millions?) who would be burdened as a result of that hypothetical loss of $50B in revenue.
Might criminals, paranoids and privacy extremists wishing for their phones never to be cracked just choose an 11-digit passcode? I hear this would take too long for a computer to crack.
Everyone else can choose a 4-digit code, and still enjoy very good security up until the point they are wanted by the FBI.
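As a rough illustration, assuming on the order of 80 ms per on-device guess (the ballpark figure Apple has published for its passcode key derivation); treat these as order-of-magnitude numbers only:

    # Back-of-the-envelope worst-case brute-force times at an assumed ~80 ms per guess.
    PER_GUESS_SECONDS = 0.08

    def worst_case_days(keyspace_size):
        return keyspace_size * PER_GUESS_SECONDS / 86_400

    print(f"4-digit PIN:  {worst_case_days(10**4) * 24 * 60:8.1f} minutes")  # ~13 minutes
    print(f"6-digit PIN:  {worst_case_days(10**6):8.2f} days")               # ~0.9 days
    print(f"11-digit PIN: {worst_case_days(10**11) / 365:8.0f} years")       # ~250 years

Which is roughly why a long passcode puts on-device guessing out of reach on its own, while a 4-digit code only survives thanks to the retry limits being disputed here.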
People are confusing the fight for unbreakable encryption with this new fight to keep manufacturer-specific passcode retry attempts nice and secure.
The very fact we have this dependency between the encryption and the retry system is a weakness probably deserving of attention.
The question at hand is whether or not the DOJ can use the AWA to compel a company to weaken its product. It's unclear as yet whether the court will consider this too burdensome for Apple, or too expensive for the FBI to be able to reimburse Apple for any costs.
Apple's argument is that this will set a precedent, and the FBI will ask to unlock many phones in the future, possibly even to the point of preventing Apple from creating a phone that is not unlockable. Further, that the creation of the proposed hack will by its very nature put the whole iPhone ecosystem at risk if and when it gets out into the wild. The more requests that the FBI makes, the more people Apple will need to train to service such requests, and the more risk that there is a leak.
The analogy here, although dramatic, would be the atom bomb. Hillary even alluded to the need for a Manhattan-like project to circumvent encryption, although I don't think she thought that would be received as a bad thing.
The FBI's argument is that they're only asking Apple to unlock a single iPhone. The DOJ is unwilling to comment on how many iPhones law enforcement across the country would like to use this on, and defers to local officials to answer that question. They are pretending to focus on this one phone while knowing full well the value of what they're after. They wouldn't call up the AWA for a single phone.
Obama is unwilling to draw up legislation with Congress about this issue given the nature of the public's attitude about mass surveillance, particularly during an election year. They're probably terrified this issue would fracture the public vote, and then there's no telling who we will elect as the next President. So, they directed the FBI to use the AWA.
I'm aware of all that, your summary wasn't needed, and is lacking your personal opinion! "I think" is not a bad thing to say once in a while.
I'll take a guess that your position is identical to Apple's recent letter on the matter.
I think Apple should help unlock these phones, with the condition that such help may be impossible in future versions of the OS. Who wouldn't want future versions of iOS to prevent these requests from being possible even with Apple's intervention? How that can be achieved I don't know. Perhaps some fancy new hardware chip that kills the phone at any sign of tampering. I'd vote for that, most people would, but the FBI would hate it.
The crucial point is getting as much of the public on side as possible. Most people would support increased security and privacy measures for their phones. They would feel threatened by legislation denying Apple or others the right to improve security for customers, meaning such legislation would unlikely pass.
By fighting this, Apple are putting themselves in the awkward position of "not helping a criminal investigation", and it's not easy to get everyone on side when you're seen as unhelpful.
It's a bit like chess, and Apple might have done better to make their move at a later time, first helping with these iPhones, then shutting the door on that option in later OS releases. "Sorry, it's encrypted inside and out, no way in, that's how good the security is, because that's what our customers wanted" would be impressive, and hard to defeat with new legislation.
So Apple should comply now, and make a better chess move later. I could be wrong, but that's what I think for now :)
Was your original question rhetorical? In case it wasn't clear by my summary, no, we should not be satisfied with a longer passphrase. We should be concerned about the precedent-setting nature of this case.
> your summary wasn't needed
This is an open discussion. Nobody's forcing you to read my summary.
> "I think" is not a bad thing to say once in a while.
I think that is implied by the fact that I wrote all that. Anything that I haven't cited is my personal take on the issues.
> I think Apple should help unlock these phones, with the condition that such help may be impossible in future versions of the OS
Nobody believes the FBI will adhere to this condition. They can always change their minds down the road.
> It's a bit like chess
It is like chess, you're right. The move has been made and the hand removed from the piece. Time to act, carpe diem. The best thing to do now is educate people about encryption and potentially make this an election issue at some point in the future. Right now the public has nothing to do with this case. It's going to be decided in a court among experts, lawyers and judges.
> Apple should comply now, and make a better chess move later
That's fine, it's your opinion. I disagree, and would add that now is the right time to take a stand while this is in the public spotlight. It will take time to educate the public about encryption and we might as well start now. Compelling Apple to circumvent its own security will make us less safe. There is a black market where exploits are bought by intelligence communities, and if Apple creates this exploit, there's a chance it will fall into the wrong hands.
Suppose Apple loses and tells its engineers to produce this update. What happens if those engineers all refuse? Can they be found individually in contempt of court?
I don't think so. But Apple could face some hefty fines, and they would offer engineers more money to avoid such fines.
If every engineer refused to do it at any price, Apple would cease to exist.
Around 2007-2008, Yahoo faced a $250,000-per-day fine for non-compliance with the NSA's PRISM program [1], according to documents released in 2014 [2].
Note also that the CEO and co-founder Jerry Yang left his position at the end of 2008, after which time, it seems, they became compliant and the orders stopped.
IANAL, but I believe the burden is on Apple, not individual employees.
Apple would, however, have to show that they are making their best effort, and so would be forced to fire or transfer those engineers to other positions and hire engineers willing to do the job.
In the article: "In a report covering the first six months of 2015, Apple said it had received nearly 11,000 requests from government agencies worldwide for information on roughly 60,000 devices, and it provided some data in roughly 7,100 instances."
So an iCloud backup negates the need to force Apple to unlock?
Any way to back up an iOS device online with a third party, encrypted using a public key?
> Any way to back up an iOS device online with a third party, encrypted using a public key?
Not as seamless as the iCloud backup. Essentially you need to back up each service itself and restore it all manually when you need to.
Not that Android is any better in this regard; it works pretty much the same.
It's unfortunate as there has been a huge focus on device security, mostly for the payment capability but the actual data on the thing? Easily backed up and extracted via government request.
Between this and the on-going case with Microsoft Ireland, would anyone like to speculate on the impact to US tech companies' ability to compete internationally if the US Justice Department wins?
There may not be a foreign equivalent for US companies working on complex or enterprise problems.
Europe is often the second-highest revenue-generating region for US tech companies, after the US itself. Do European businesses care about this? Would this cause them to adopt a lesser alternative? Does this comply with EU Model Clauses that govern regulated verticals like defense, finance, academia, healthcare, etc.?
I am not a lawyer, but I can't imagine the EU Model Clauses would allow for something like this.
And just plain criminals, with political motives or not - passing a law never stopped those. "It would be unfortunate if the data from your lost iPhone somehow got leaked..."
I still like BG's argument (and I do consider myself a hacker, haha).
"It is no different than [the question of] should anybody ever have been able to tell the phone company to get information, should anybody be able to get at bank records. Let’s say the bank had tied a ribbon round the disk drive and said ‘don’t make me cut this ribbon because you’ll make me cut it many times’.“
That is a ridiculous and misleading argument and Bill Gates should be ashamed of having made it.
The difference is that ribbon isn't hard to cut and so the bank wouldn't have to develop a whole new type of scissors to cut it and doing so doesn't weaken the security of a bunch of other bank records secured with similar ribbon.
Apple isn't an arm of military intelligence. They shouldn't be compelled to act as one, particularly when the task would take a month, yes.
It's not just about time. Part of Apple's product is security. The DOJ is asking them to weaken their product. This could cost Apple their business in the long run as foreign companies enter the fray and offer a secure product that Apple is no longer allowed to produce.
Keep in mind that there's a black market where hacks and exploits for various systems are sold by hackers and bought by intelligence agencies. If this software is created by Apple, there's a chance it could get out there and end up in the wrong hands.
We live in a Western society, not China. If more than 50% of the population wants the phone to be cracked it should be done. That's what we all agreed on by living in a democracy. Law is formed by the wishes of the majority.
I'm not a lawyer. Is the FBI asking something from Apple which is not legal? Then go to court.
Exactly. So why are we asking Apple to do something which Obama rebuked China for doing last year? [1]
> If more than 50% of the population wants the phone to be cracked it should be done
This isn't up to the public, it's up to the courts, who've been asked by the DOJ to consider the issue.
The public only comes into play around election time when there's a chance to vote in a new President. And, it's likely this decision will be made before Obama leaves office.
> Law is formed by the wishes of the majority
No. Law is created by elected officials who are tasked with studying the law more closely than the general public.
I can't imagine a country where 80% of the population is against abortion and it would still be legal. In some indirect way, law reflects what a population wants. Even the constitution can be changed with enough votes in parliament (at least in the Netherlands; I don't know about the US).
So the question is still open, is there any legal ground in what the FBI is asking from Apple?
And if there is, should they cooperate? And what would be the alternative, leave America?
America is ruled by corporations. It is unjust, but can you imagine Apple, an American company, being fined or hobbled in a way that hurt its sales in China/India or set off its decline?
The only time an exception was ever made was to break up monopolies, about a hundred or so years ago.
Otherwise it's perfectly OK to be an American corporate citizen and challenge the law.
China did try to draft a law to compel companies to create this kind of back door for their government.
President Obama told President Xi that China would not be able to do that if they wanted to do business with the US [1]
As far as I know, the iPhone still sells in China, so that law never made it through. If the US allows this to happen, you can bet China will demand the same.
Well, the spin game is guaranteed to ramp up. Sycophants are already laying the groundwork for the idea that Apple and Cook would have to accept some of the blame for any upcoming terrorist attacks. I am amazed they haven't tried the kiddie-porn route yet.
Still another issue is, if they were compelled to create it, they could be compelled to surrender it too. With that, it's a matter of weeks or months before it gets leaked to a criminal organization or country.
My long-term concern is: would we ever know if they were compelled to change iOS to insert a backdoor that gets pushed to our phones? Even if we did, how long before carriers are required to lock users out for not updating?
This is pure hypothetical, but I've wondered what would happen if Apple were to lose in court, then make this custom iOS signed to work only on the specific phone in question, and provide the installer to the FBI with a bill for $5 million in development time. Then destroy all source for this special version, stating that they don't feel comfortable having it in their source tree since a rogue employee might leak it. Then the next time there is a phone to crack, they build the custom version again and provide it to the FBI with another $5 million bill for services rendered, since all development would have to be done from scratch. That would at least set a bar that law enforcement would have to pay, rather than assuming that once the cost is borne, future versions are cheap, like $10k.
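For what it's worth, a minimal sketch of what "signed to work only on the specific phone" could look like: sign the image together with a device-unique identifier, so the signature only verifies for that one device. Purely illustrative (hypothetical names, using the Python "cryptography" package), not Apple's actual personalization format:

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.exceptions import InvalidSignature

    # Stand-in for the vendor's firmware-signing key; the matching public key
    # is what a device's boot chain would trust.
    signing_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = signing_key.public_key()

    def personalize(image: bytes, device_id: bytes) -> bytes:
        """Sign the image bound to one device's identifier (hypothetical scheme)."""
        return signing_key.sign(image + device_id, padding.PKCS1v15(), hashes.SHA256())

    def device_accepts(image: bytes, device_id: bytes, signature: bytes) -> bool:
        """What the device would check before installing the image."""
        try:
            public_key.verify(signature, image + device_id, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    sig = personalize(b"custom-ios-build", device_id=b"ECID-0000000001")
    print(device_accepts(b"custom-ios-build", b"ECID-0000000001", sig))  # True
    print(device_accepts(b"custom-ios-build", b"ECID-9999999999", sig))  # False: other devices reject it

Of course that only limits where the build runs; it doesn't stop a court from ordering the same work again, which is the point of the billing idea above.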
Well. "Hello, Mr. X and Mr. Y, we hear that you're the ones who built the previous version. We'll taking you on a vacation to Gitmo, so you can build us another one. You're welcome."
The discussion about encryption is getting completely out of context, like 'encryption' is something magical.
We're talking about cracking a 4 digit access code of a phone, which is extremely easy. Apple knows this, that's why they set a digital booby trap which fires after 10 tries.
So the real discussion should be, "Can the government force a company who placed a booby trap, to remove that same trap if needed?"
Whether this is a digital trap or a bomb placed on a doorknob is not important.
> We're talking about cracking a 4 digit access code of a phone, which is extremely easy.
Technically, it's obviously not just a 4-digit access code that prevents the FBI from unlocking the phone; otherwise they wouldn't demand that Apple produce a whole new version of iOS and set up special access just to enable that.
And even more importantly, the legal basis they claim to have is the "we can do anything" sentence from 1789:
There's nothing there about any obligation of companies to do something specific, certainly not about changing their own products or making new ones.
What is encryption? It's a method of denying access without the correct key (for as long as it takes for the data to become worthless - not necessarily forever). Actually destroying the data is one of the ways to deny access.
NB: if it were so easy, we wouldn't be having this discussion. So, apparently there are some mitigation techniques which help in case of weak passphrases, and make it not-so-easy, no?
Yes, but for Apple it is easy. Just change one line of code, recompile, and patch the phone's firmware. Then some FBI intern can try 10,000 lockscreen codes and they're in. No big deal.
My whole point is, this is not a technical issue. It is easy. It is a legal issue. Can the government force a company to do such a thing? Especially when the impact on society is zero. The firmware upgrade is not released outside of Apple, no other phones get it.
It feels almost like a publicity stunt to me. Apple being the underdog in the fight against the big evil government.
If they don't want to cooperate and there is no legal basis for forcing them to, then don't cooperate.
"The firmware upgrade is not released outside of Apple, no other phones get it." - that's the non-obviously haaaard part. The one that would fail, sooner or later.
Let's end drug prohibition first, and then discuss whether to hack the remaining phones, if any. This whole fiasco has nothing to do with terrorism.
Alternatively, introduce a restriction that this form of forced labor can only be compelled in terrorism cases where lives are in imminent danger. How many phones will be left to hack?
That or they will move on to the next narrative, cyber crime or human trafficking or any number of crimes that would be sooo much easier to solve if we gutted the Constitution a little bit more.
I'm confused. From what I understand Apple has been doing this for some time. I even remember someone saying that Apple did this many times before. Even if Apple isn't the one doing this, there are also plenty of people who are familiar with low-level vulnerabilities of iPhones who can essentially do the same tasks.
Apple has not done "this" before. Apple may have unlocked phones for older models (with less security), but with the upgrades to iOS for versions 5 and 6, this is a much different ask.
The FBI is hoping to have Apple develop a new iOS that does not automatically wipe the device after <x> invalid password attempts, then use their signing keys to push a deployment of that operating system onto this specific phone.
Nobody else has access to Apple's signing keys, ergo nobody has the ability to do this on Apple's behalf.
Yes but they have unlocked phones before and they will conceivably continue to do it. It's not about the difficulty of the task, it is about the principle.
I haven't heard of the FBI asking to get Apple's keys before, but that is crazy.
Ostensibly, they made the security increases in iOS to prevent from being able to comply with these types of requests, ensuring that their software was as secure as possible.
Aside from that, a Fourth Amendment search or seizure cannot generally compel someone to open a door. The usual logic is that it allows agents entry; the trade-off of letting them in is that you don't have to replace your door afterwards.
This isn't a matter of standing aside while the agents effect the search, it's a whole different thing. Put into (what will assuredly be a bad) analogy, whereas a physical property search involves opening a door, or standing aside while the cops break down the door, this scenario is more akin to demanding that Apple build an entirely new house, one without doors, then removing the old building and installing the new building in its place so that the cops can enter.
With devices running iOS 9 there are no known vulns that Apple or anyone else can exploit to unlock or get data off of a locked device. The government wants Apple to create a special, signed version of iOS that can be loaded onto iOS 9 devices that will allow it to bypass security measures designed to prevent brute-force unlock attempts and hopefully never fall into the wrong hands (because that would never happen, right?)
Here is my take on the recent Apple vs. Justice saga:
Being the cynical fuck I am, a former action arm of the dark side, I have been telling friends and family for years that they should assume anything with a cellular modem in it is potentially compromised by a nation-state or above actor (yes, "above" nation state exists... It's called the deep state, you fool).
I automatically assume that such publicity is actually closer to a honeypot to entice foolish mid-level criminals into thinking iBrain devices are "secure", when I think they probably have multiple backdoor avenues in place.
Of course, I'm just the HN resident conspiracy theorist, so it's probably just me being paranoid...
I think it's important to dispel this fiction that Apple will be "unlocking" anything.
To use the word "unlock" seriously blurs the lines of what's going on here. They're merely asked to flash it with software that removes a delay in submitting passcodes and removes the wiping function after ten failures.
That's not unlocking it.
If Apple complies with the order, the FBI will still be getting an encrypted iPhone back, and they'll still have to sit around and try to decrypt it.
Today the FBI has the resources to desolder the chip and put it into all sorts of controllers where they can prod and poke at the encrypted data.
What if this guy used a long passcode? They're still going to try to get in, and the only difference is that they'll move the heavy lifting off of the iPhone and try to crack it with a beefy computer. And to do that, they'll have to lift the chip off the SoC anyway.
My point is that when people say Apple will "unlock" the phone, it insinuates that the only thing standing between the FBI and the data is Apple. And that isn't true. Even if Apple complies, they're not guaranteed to get in. Furthermore, Apple could comply and the FBI might still find themselves pursuing an angle they're already capable of. The FBI are going through all these court hearings and process for the sake of trying 0000-9999.
In other words, this is obviously bullshit on the side of the FBI. The question is why they're doing that. I suggest that it's not about legal precedent, because newer iPhones can't be undermined like this and the All Writs Act can't compel Apple to stop producing such devices; it can only (arguably) compel them to undermine devices if it's within their reach. (IANAL, so please call me out if I'm wrong about that.)
I suspect it's about PR, because when they lose they can throw their hands up and the news pundits will scream about terrorists winning in our courts. Washington will then push their backdoor legislation that they've been asking for over the past few months.
Legal precedent isn't what they're after, IMHO. The legal precedent that would be set wouldn't be applicable to where things are headed. They're looking for public appeal.
I think when a layman hears "unlock" they're more likely to picture a scene from a police detective drama when the investigators ask a landlord to unlock a deceased tenant's apartment so that they can investigate.
If it were rephrased as "Hello Mr. Landlord, we're ordering you to open a machine shop and pay several employees to develop a new lockpicking device that could be copied infinitely for free and which would allow us, or anyone really... a North Korean agency, an organized crime syndicate, or a jealous and abusive ex-lover perhaps, to more easily break into any of the homes of billions of people around the world with or without reason or warrant?" then the layman might perceive it a little differently.
That seems to be the message that Apple's trying to communicate, but I don't know if they're getting it across clearly enough for most people.
That's where your analogy breaks down. This hardly makes it easier. If North Korea or FBI or NSA or a criminal organization wanted to break into your iPhone, and this backdoor existed and was distributed, they would have to externally flash it with firmware and then hook up a controller that tricks the digitizer so that they can brute force the passcode, trying each one synchronously on a shitty mobile processor SoC.
Is that really less complicated than externally reading the memory itself and using John the Ripper and a supercomputer? Is it meaningfully less complicated?
What's wrong with the letting the government get access to the phones of a few potential criminals/terrorists?! I don't get what all the fuss is about.
I don't fully trust the government, but I trust it far more than I trust Apple.
Sometimes I feel like the whole Snowden thing is just an excuse for big corporations to keep all their data and analytics practices to themselves outside of the scrutiny of the government.
Since when did the government become the enemy? There is something really twisted happening behind the scenes here.
Big corporations are manipulating us into thinking that the government is not to be trusted. But think about it; the government doesn't care about making a profit.
Without the government, the masses have no voice. I would gladly help society and let the government look through my phone if it will help prosecute a criminal.
Is this a serious question? There are volumes upon volumes providing viable, defensible answers. The problem is that you have to empathize with the under-privileged and unrepresented.
It's not about the government getting access to a tool like this; Apple specifically doesn't want to create a tool that can easily defeat its own security, only to then hand it over to the FBI. The government has, on numerous occasions, shown it has some pretty horrible practices in securing information, and this kind of tool being let out into the wild is bad for everyone (not just Apple).
Why risk opening that Pandora's box? If the tool doesn't exist, it can't be exploited by bad actors.
What tool? Apple can just give the government the private key for that specific phone. Done.
Everyone else is still safe. Safer, I might add.
So long as there is a clear process for the government to get access to specific keys for specific phones.
If Apple is CAPABLE of building such a tool (and use it for themselves), then I think the government should have access to it too.
Apple does not have the key for that phone. No one does.
What the FBI is asking Apple to do is write software that will turn off the "wipe after 10 wrong passcodes" feature of iOS, so that the passcode can be brute-forced.
Setting aside the government's interest in such a tool, imagine the interest from hackers.
Consider that in 2011, someone hacked into RSA to steal info about their tokens, just so that they could then hack into Lockheed to steal top-secret info.
Now imagine someone hacks into Apple (very possible to happen) and steals the security-defeating software code to install on other iPhones.
Though I find it hard to believe that Apple doesn't already keep some sort of key(s) to unlock individual phones or to turn off this "wipe after 10 wrong passcodes" feature.
Facebook (and pretty much every other internet company on earth) keeps password hashes and salts in their databases, so in theory the government could already brute force the vast majority of our personal data from these websites.
At least with a phone, the government has to physically get a hold of it in order to brute force the phone and read the data.
No need to brute force Facebook or most other hosted services, because very few of them store user data encrypted at rest.
Passwords control access to features of the web application, but employees of the company can just go around that and get the data off the server directly.
iPhones running iOS 8 or higher are different: they do encrypt data at rest, and create the key by combining device-specific info with the passcode that the user creates. So without that passcode, there's no chance to decrypt without brute forcing.
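A minimal sketch of that idea, purely illustrative (PBKDF2 standing in for Apple's actual hardware-entangled construction, and the passcode is made up):

    import hashlib
    import os

    # Stand-in for a secret fused into the device's silicon; it never leaves the hardware.
    DEVICE_UID = os.urandom(32)

    def derive_data_key(passcode: str) -> bytes:
        # The iteration count is tuned so each guess costs real time on the device.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

    key = derive_data_key("4931")   # hypothetical passcode
    print(key.hex())

    # Copying the flash storage doesn't help: without DEVICE_UID the derivation
    # can't be reproduced off-device, so any brute force has to run on the phone itself.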
If you're asking what tool FBI wants (special weakened version of iOS), and suggesting Apple hand over a non-existent private key, then you don't understand the basics of the dispute.