Forum Home
PC World Chat
 
Thread ID: 137568 2014-07-21 23:00:00 Security researcher suggests 600M iOS devices have Apple-created backdoors for data Geek4414 (12000) PC World Chat
Post ID Timestamp Content User
1379825 2014-07-21 23:00:00 Stock manipulation a day before 3rd quarter earnings report tomorrow?

Security researcher suggests 600M iOS devices have Apple-created backdoors for data
Gigaom By Kevin C. Tofel 1 hour ago

finance.yahoo.com

How secure is the data on your iPhone or iPad? A little less secure than you perhaps thought, according to Jonathan Zdziarski, who has a slideshow of findings that may surprise you. A security researcher with several books to his credit, Zdziarski suggests that 600 million iOS devices have built-in backdoors and undocumented services put in place by Apple.

Zdziarski’s slides came to light on Monday through ZDNet and were used in a recent conference talk he gave called “Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices.”

The full set of slides is available as a PDF download here, but some of the highlights include:
• Library and cache files are not encrypted, although since iOS 7, third-party documents are.
• Some of the undocumented services in iOS — “lockdownd,” “pcapd” and “mobile.file_relay” — can get at encrypted data for access over USB and perhaps through a cellular connection (a rough sketch of a paired host querying a device follows this list).
• Third-party forensic software companies that know how to access data through these backdoors are selling their services to law enforcement agencies.
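To make the exposure concrete, here is a minimal sketch of what any previously paired desktop can ask a device over USB, shelling out to the open-source libimobiledevice tools (idevicepair, ideviceinfo), which talk to the same lockdownd service. The tool names are real; the flow is an illustrative assumption, not Zdziarski's actual test harness.

```python
import subprocess

def run(cmd: list[str]) -> str:
    """Run a libimobiledevice CLI tool and return its stdout."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(result.stderr.strip() or f"{cmd[0]} failed")
    return result.stdout

# Confirm this host already holds a pairing record for the attached device.
# 'idevicepair validate' performs the lockdownd handshake using that record.
print(run(["idevicepair", "validate"]))

# With a valid pairing, lockdownd answers detailed property queries over USB
# with no prompt or notification shown on the device itself.
print(run(["ideviceinfo"]))
```

Neither call should produce any prompt on the device once a pairing record exists; that silence is what the rest of the thread turns on.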

Zdziarski suggests these services are part of iOS by design, perhaps so that Apple can comply with legal requests for data from the government. It’s difficult to say, of course, and Apple hasn’t commented on the original story.

I can understand why Apple might want such software loopholes; it makes it easier to provide such data if it ever needs to. And generally, Apple has put security issues at the forefront of iOS: sure, there have been occasional security holes found, but the company is quick to deal with them. It also offers a number of security features for personal and enterprise use: full device encryption, sandboxed applications, app code signing, and a secure boot chain.
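For readers unfamiliar with the last item, a secure boot chain simply means each boot stage verifies the next before handing over control. Here is a toy sketch of the idea, with bare hashes standing in for the hardware-rooted public-key signatures a real chain uses; nothing here is Apple's actual implementation:

```python
import hashlib

# Toy model of a verified boot chain: each stage carries the expected digest
# of the next stage and refuses to run anything that doesn't match. Real
# chains use signatures rooted in hardware, not bare hashes.

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

bootloader = b"low-level bootloader image"
kernel     = b"kernel image"

# Trust anchors: each stage's expected digest, provisioned ahead of time.
expected = {"bootloader": digest(bootloader), "kernel": digest(kernel)}

def boot(stage_name: str, image: bytes) -> None:
    if digest(image) != expected[stage_name]:
        raise SystemExit(f"{stage_name}: verification failed, halting boot")
    print(f"{stage_name}: verified, handing over control")

boot("bootloader", bootloader)   # Boot ROM verifies the bootloader...
boot("kernel", kernel)           # ...which verifies the kernel, and so on.
```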

Zdziarski thinks this situation is still a breach of customers’ trust, however, mainly because these services are undocumented and not mentioned to consumers. I can see his point: People don’t like to be surprised by learning that services have long been running on their personal devices without their knowledge.
Geek4414 (12000)
1379826 2014-07-21 23:18:00 So in summary, mobile phones arent secure
Not really a surprise , surely.
99% of users dont seem to care, look at all the uneeded apps downloaded & installed without a 2nd thought
How many ph's have no passkey to get into them.
1101 (13337)
1379827 2014-07-22 04:29:00 How many phones have no passkey to get into them?

85% I believe is the statistic, which is why they're beefing that up with Android Wear and the like. In Android L, if your phone is within a meter or two of your trusted device, you don't have to enter a passcode. Phone too far from you, and the passcode kicks in.
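That proximity rule boils down to a threshold on an estimated distance, derived in practice from Bluetooth signal strength. A minimal sketch of the gate; the 2 m radius and the RSSI-to-distance model are illustrative assumptions, not Google's actual parameters:

```python
# Toy model of a "trusted device nearby" unlock gate, loosely in the spirit
# of Android L's proximity unlock. All numbers here are made-up stand-ins.

UNLOCK_RADIUS_M = 2.0  # hypothetical "within a meter or two"

def estimated_distance_m(rssi_dbm: float, tx_power_dbm: float = -59.0,
                         path_loss_exponent: float = 2.0) -> float:
    """Crude log-distance path-loss estimate from a BLE RSSI reading."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def requires_passcode(rssi_dbm: float | None) -> bool:
    """No trusted device seen, or it's too far away -> fall back to passcode."""
    if rssi_dbm is None:
        return True
    return estimated_distance_m(rssi_dbm) > UNLOCK_RADIUS_M

print(requires_passcode(-50.0))  # strong signal, close by -> False
print(requires_passcode(-80.0))  # weak signal, far away  -> True
print(requires_passcode(None))   # no trusted device seen -> True
```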
Chilling_Silence (9)
1379828 2014-07-22 08:15:00 I can understand why Apple might want such software loopholes; it makes it easier to provide such data if it ever needs to. And generally, Apple has put security issues at the forefront of iOS: sure, there have been occasional security holes found, but the company is quick to deal with them. It also offers a number of security features for personal and enterprise use: full device encryption, sandboxed applications, app code signing, and a secure boot chain.

Just ignore the part where the NSA can stick whatever backdoor it likes on them... ;)
Agent_24 (57)
1379829 2014-07-23 00:21:00 Shady . . .


www.zdziarski.com/blog/

Apple Responds, Contributes Little

Posted on July 21, 2014 by Jonathan Zdziarski

In a response from Apple PR to journalists about my HOPE/X talk, it looks like Apple might have inadvertently admitted that, in the classic sense of the word, they do indeed have back doors in iOS, though they claim that the purpose is for “diagnostics” and “enterprise”.

The problem with this is that these services dish out data (and bypass backup encryption) regardless of whether “Send Diagnostic Data to Apple” is turned on, and regardless of whether the device is managed by an enterprise policy of any kind. So if these services were intended for such purposes, you’d think they’d only work if the device was managed/supervised or if the user had enabled diagnostic mode. Unfortunately this isn’t the case, and there is no way to disable these mechanisms. As a result, every single device has these features enabled, there’s no way to turn them off, and users are never prompted for consent to send this kind of personal data off the device. This makes it much harder to believe that Apple is actually telling the truth here.

Apple’s seeming admission to having these back doors, however legitimate a use they serve Apple, has unfortunately opened up some serious privacy weaknesses as well. We know, from the Snowden leaks via Der Spiegel, that the NSA has penetrated target desktop machines to later access iPhone features. We also know that desktop machines are often seized by law enforcement, and with that pairing record data, investigators can access the data on the device using these services – even if backup encryption is turned on. Pairing records can be stolen in a number of different ways, ranging from a shared coffee shop computer, to an ex-lover whose computer you used to trust, to leaving your phone unlocked at a bar (Apple should know something about this), or countless other scenarios – all giving the attacker perpetual access to your device via USB, and usually WiFi, until you wipe the device. It is only recently that iOS even added a trust dialog; prior to this, your device would automatically pair with anything you plugged it into.
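Since a copied pairing record is the crux here, one concrete self-check is auditing which hosts hold pairing records on your desktop. The sketch below scans the directories where lockdown records are commonly reported to live; the paths and plist keys are assumptions drawn from public write-ups, so verify them on your own machine (reading these directories typically requires admin rights):

```python
import plistlib
from pathlib import Path

# Commonly reported locations of lockdown pairing records (one .plist per
# paired device, named by the device UDID). These paths are assumptions
# from public documentation, not an authoritative list.
CANDIDATE_DIRS = [
    Path("/var/db/lockdown"),                # macOS
    Path("/var/lib/lockdown"),               # Linux (libimobiledevice)
    Path(r"C:\ProgramData\Apple\Lockdown"),  # Windows
]

for directory in CANDIDATE_DIRS:
    if not directory.is_dir():
        continue
    for record in directory.glob("*.plist"):
        try:
            with record.open("rb") as fh:
                data = plistlib.load(fh)
        except (plistlib.InvalidFileException, PermissionError) as exc:
            print(f"{record}: unreadable ({exc})")
            continue
        # HostID is the key usually cited for these records; the full key
        # list is printed as-is so nothing is assumed about the schema.
        print(f"{record.name}: HostID={data.get('HostID', '?')}, "
              f"keys={sorted(data)}")
```

Deleting a record you don't recognize should force that host to pair again (and, on iOS 7 and later, pass the trust dialog) before it can talk to your device.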

Obviously, Apple realized that pairing in and of itself offered very little security, as they added backup encryption to all backups as a feature – something that also requires pairing to perform. So Apple doesn’t trust pairing as a “security” solution either. And for good reason: it wasn’t designed to be secure. It is not two-factor; it is not encrypted with a user passphrase; it is simply “something you have” that gives you complete, unfettered access to the phone. And it can be had as easily as copying one file, or created on the fly via USB. It can be used if law enforcement seizes your computer; it can be stolen by someone hacking in; it is by all means insecure. But even with the pairing record, I would have expected the data that comes off my device to be encrypted with the backup password, right? These services completely bypass this.

I understand that every OS has diagnostic functions, but these services break the promise Apple makes with the consumer when they enter a backup password: that the data on their device will only come off the phone encrypted. The consumer is also not aware of these mechanisms, nor are they prompted in any way by the device. There is simply no way to justify this massive leak of data from these services, delivered without any explicit consent by the user.

I don’t buy for a minute that these services are intended solely for diagnostics. The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?
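For contrast, the consent-respecting behavior argued for here is easy to sketch: refuse without an explicit yes, and never emit plaintext once a backup password is set. A schematic example; every name here and the key-derivation choice are hypothetical, not Apple's design:

```python
# Schematic of a diagnostics export that behaves the way the post argues it
# should: explicit user consent, and output encrypted under the user's
# backup password. All names are hypothetical; requires 'cryptography'.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def export_diagnostics(payload: bytes, user_consented: bool,
                       backup_password: str | None) -> bytes:
    if not user_consented:
        # No silent paths: without an explicit yes, nothing leaves the device.
        raise PermissionError("user declined diagnostics export")
    if backup_password is None:
        return payload  # no backup password set: plaintext was the user's choice
    # Derive a key from the backup password and encrypt; there is deliberately
    # no code path that bypasses this once a password exists.
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=200_000)
    key = base64.urlsafe_b64encode(kdf.derive(backup_password.encode()))
    return salt + Fernet(key).encrypt(payload)
```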
Geek4414 (12000)