Hacker News
Injecting malware into iOS devices via malicious chargers (blackhat.com)
89 points by klausa on June 3, 2013 | 35 comments



As people commented on Reddit, this is not significantly different from any jailbreak using a computer. This is USB we are talking about, and the malicious charger in this case is a full-featured Linux computer.

So yes, they are installing unsigned software on an iPhone by plugging it into a USB socket on a computer. That is just normal jailbreaking, and has been done forever, hasn't it?


I think the big difference here is that people willingly plug their phones into airport kiosk chargers, share chargers, use car-based USB plugs, and generally don't even think of it as an attack vector.

On another note, I've been amazed by the number of people who let others plug phones into their computer for a quick charge.

I'll let people borrow my wall charger if I'm sitting there but no one plugs anything into my computer.


>no one plugs anything into my computer.

I believe the same, including vice versa for that matter. (That is, I would never plug a phone into somebody's computer.)


Thank you for pointing this out.

The researchers are using a Beagleboard device for the exploit. Based on what they've said to the press so far, it sounds like a no-interaction tethered jailbreak (if it's truly functional on all Apple devices) or at best an untethered jailbreak on limited Apple devices.

If this presentation is claiming the ability to install software on ALL non-jailbroken Apple devices running the latest OS version that survives a reboot of the device (once it's detached from the charger), then that's potentially worth more than any "malicious USB charger" talk (since the jailbreak community is currently unable to do this).


Dock speakers connect to the iPhone through the same interface. The high-end ones have ARM cores, they have WiFi, they might even run Linux. There is a theoretical chain of trojan --internet--> PC --wifi--> speaker --usb--> iPhone.

This is, of course, insanely unlikely today. But in a few years, when everything has a 1GHz ARM core and an internet connection, we very well might start seeing malware that jumps devices and infects entire households.


I believe "normal" jailbreaking requires the phone to be unlocked. This works without any user action. When the phone is locked everything is supposed to be encrypted and secured.


Not really; in fact, I believe some methods even reboot the system into a recovery mode without any user interaction. So, given a vulnerability, there are definitely ways of interacting with the phone without it being unlocked.


But do they start that process with it being locked? It's one thing to inject code while unlocked and another to take control of a locked phone.


The whole locked-phone thing is just a graphical and OS-level feature. If the OS is vulnerable to remote injection, the exploit might just as easily bypass the locked state.


I'm unfamiliar with the exact process of jailbreaking; if a phone is unlocked but plugged into a separate computer, can you essentially run arbitrary code on the phone if you control the computer it's connected to?


>All users are affected, as our approach requires neither a jailbroken device nor user interaction.

That's the bad part! Somebody screwed up big time if the package manager does not insist on user permission for installs initiated without a proper authentication token. (Google does this for app installs from the store over the Internet, but both you and your device obviously need to be logged in to your Google account for that to work.)


The iOS "package manager" does not insist on user permission for such installs because it simply does not allow them at all. Or rather, it's not intended to allow them. Obviously they found a bug, but it's not a matter of forgetting to ask the user. Apple does not intend for this kind of thing to be possible even if the user wants it.


Well they have to allow transfer and install of apps over that channel (30-pin/USB or Lightning/USB) right? (How else could things such as app updates via iTunes/USB work?)


Yes, but only stuff with the right digital signature, so Bob's Happy Exploit App need not apply.
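The gating idea described above can be sketched in a few lines. A caveat: real iOS code signing uses asymmetric (RSA) signatures chained to Apple's CA, not a shared secret; this minimal sketch uses HMAC from Python's standard library purely to keep the example self-contained, and all names in it are hypothetical.

```python
import hashlib
import hmac

# Hypothetical device-side key. (Real iOS verifies asymmetric signatures
# against Apple's certificate chain; a shared key is a simplification.)
DEVICE_KEY = b"hypothetical-signing-secret"

def sign(package: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """What a legitimate build pipeline would attach to a package."""
    return hmac.new(key, package, hashlib.sha256).digest()

def install(package: bytes, signature: bytes) -> bool:
    """Device-side check: refuse anything whose signature doesn't verify."""
    expected = hmac.new(DEVICE_KEY, package, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

legit = b"app-binary-v1"
ok = install(legit, sign(legit))        # properly signed payload: accepted
evil = install(b"bobs-happy-exploit", sign(legit))  # wrong payload: rejected
```

The point is that the check happens on the device, so a malicious host on the other end of the USB cable cannot simply push arbitrary code unless it finds a bug in the verifier itself.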


From the same page:

> The vulnerability involves discrepancies in how Android applications are cryptographically verified & installed, allowing for APK code modification without breaking the cryptographic signature; that in turn is a simple step away from system access & control.

Bugs happen.
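The discrepancy quoted above (the 2013 "master key" class of Android bugs) boiled down to the signature verifier and the installer resolving duplicate ZIP entries differently. A minimal sketch, assuming a naive verifier that checks the first entry with a given name while the extractor keeps the last:

```python
import io
import zipfile

# Build an archive with two entries named "classes.dex": the original
# (signed) payload first, an attacker's payload second.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("classes.dex", "original signed code")
    z.writestr("classes.dex", "attacker code")  # emits a duplicate-name warning

z = zipfile.ZipFile(buf)

# A naive verifier that hashes the *first* entry per name sees the
# signed payload...
first = next(i for i in z.infolist() if i.filename == "classes.dex")
verified = z.read(first).decode()

# ...while an extractor that keeps the *last* entry per name installs
# the attacker's payload (zipfile resolves names this way).
installed = z.read("classes.dex").decode()

# verified != installed: the signature check passes, attacker code runs.
```

Any time two components parse the same container with different rules, this kind of gap can open up, which is why "bugs happen" is the right summary.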


> (Bugs happen)

Yeah that's the point - having a closed device doesn't magically make it more secure. FTA -

> Apple iOS devices are considered by many to be more secure than other mobile offerings.

Also, the Android bug is a different class: the vulnerability description doesn't really say what is required to be able to modify the APK in transit, which is key to exploiting the bug. From the sparse description it sounds like somebody needs to do an SSL MITM, or the user needs to install an APK from an untrusted source and get fooled into thinking that since its signature matches it must be from the original author. (Just to be sure: failing to detect APK modification is horrible, but whether or not it is easily exploitable is a different thing altogether.)

In the iOS charger case, it's clear that it's just a matter of plugging your device into a malicious charger.


Or I can put an APK on the store or on XDA with the exploit already in it.


What would that accomplish? The user would still need to find your APK, trust you, want your APK for some reason, and then install it. Here you are relying on a high level of user stupidity. It's not as if this bug lets you log in to some other developer's account and replace the original APK.


The potential for injecting malware via chargers was described in 2011 at DefCon, although it looks like they didn't actually infect devices.

http://krebsonsecurity.com/2011/08/beware-of-juice-jacking/


Seems like it might be handy to travel with a tiny "USB sanitizer" dongle that simply passes through power and not data. Does such a thing exist?

Edit: not quite a dongle, but something like http://amazon.com/dp/B009W34XMM should fit the bill, no?

Edit 2: http://amazon.com/dp/B0042LF23I looks exactly right, although build quality appears to be an issue.
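The principle behind such a "USB condom" is simple: a standard USB 2.0 A connector has four pins, and only two of them (VBUS and GND) carry power, so a charge-only adapter passes those through and leaves the data pair unconnected. A toy model of that wiring decision (pin names per the USB 2.0 pinout; the function is hypothetical):

```python
# USB 2.0 A-connector pins: VBUS (+5 V) and GND carry power,
# D+ and D- carry data.
POWER_PINS = {"VBUS", "GND"}

def sanitize(host_pins: set[str]) -> set[str]:
    """Connect only the power pins through to the device side.

    (Many real charge-only adapters additionally short the device-side
    D+/D- pair together so the device detects a dedicated charger and
    draws full current, per the USB Battery Charging 1.2 convention.)
    """
    return host_pins & POWER_PINS

connected = sanitize({"VBUS", "D+", "D-", "GND"})  # {"VBUS", "GND"}
```

With no data lines, there is simply no channel over which a malicious host could speak USB to the phone.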


That would be trickier with Apple's proprietary connector, since the pins are dynamically assigned.


I don't understand. As long as the extension passes power through, it shouldn't matter whether the other end of the cable you plug into it is a Dock or Lightning connector, right? The reviews seem to indicate as much, since people report success charging iPads with them.


Ah, I thought you meant a pass-through connector you would plug into your phone.


Yes - it is called the "Apple 12W USB Power Adapter" and costs $19 ;-)


Of course if you can use your own brick there is no issue. But there are times when all I have easy access to is a random USB port, and not wall power.


Apple iOS devices are considered by many to be more secure than other mobile offerings.

This sounds like weasel words (e.g., "some people believe"), and even if it's true, why does this misconception exist?


Because Android allows a hell of a lot more access with their apps than iOS does. iOS aims for extremely locked-down security, Android aims for more openness.


Rather than being used for evil, this sounds like an unpublished API that could be used to sideload apps without needing to jailbreak the device. Excited to learn more.


So far as I can tell, this uses the standard method for sideloading apps via a provisioning profile, but speaks iOS's USB device protocol directly instead of relying on the MobileDevice libraries from Apple.


Oh, that's it? That's not really news, then. How are they hiding it from the user though? I would expect that would require modifying and restoring the SpringBoard plist to the device, but surely the user will notice if their phone goes into a restore session.


So someone has to pay Apple money to get the provisioning profile in the first place...


If I understand the page correctly this is a description of a talk that will be presented in just under 2 months.

What happens in this situation? Would Apple try to get them not to give the talk? Will Apple patch the problem in the meantime? Does anyone get sued?


Apple could try, but there's no reason they couldn't give the talk. Apple could patch the problem, that's a possibility. Why would anyone get sued? It's completely legal for a security researcher to test their own devices. These guys aren't exploiting people on a widespread basis, they're researchers presenting a new exploit. That's what the Black Hat conference is all about.


I don't know anything about the law; I just assumed that this kind of thing isn't something big companies really want spread around, and I've gotten used to living in a world where legal action, or at least the threat of it, is a default response. Good to hear it's not like that in this situation!


This seems like a genius idea.



