His work on iOS security is quite interesting, but he seems determined to spin everything for maximum publicity rather than, well, accuracy or truth, which is a shame. For example, in that blog post he writes about pcapd and developers:
"Lets start with pcapd; I mentioned in my talk that pcapd has many legitimate uses such as these"
Yet in the slides for his talk[1] under theories he writes:
"Maybe for Developers for Debugging? No."
There are many examples like this in his writing, where pertinent facts are left unsaid in order to wring the maximum melodrama out of a particular statement.
On top of that, he seems to continually avoid the point that enabling these requires physical access to the device (for the pairing process to have a machine marked as trusted). If someone has physical access, enabled debug[2] features are probably the least of your worries.
Anyway, rant over. It just annoys me that genuinely interesting information often seems to be spun by personalities to give it artificial gloss these days, making it all feel a bit slimy and self-serving.
>Yet in the slides for his talk[1] under theories he writes:
>"Maybe for Developers for Debugging? No."
Followed by six bullet-point reasons why this isn't a general excuse for all of the backdoors -- it's mentioned in reference to all of his findings, not specifically pcapd (which is only mentioned on 2 consecutive slides out of 60, separated from this statement about debugging by 15 slides).
Your comment is far more misleading than what he's written.
I was only providing an example for pcapd rather than all of the items he is classing as backdoors. The entire slide is:
Maybe for Developers for Debugging? No.
- Actual developer tools live on the developer image, and are only available when Developer Mode is enabled
- Xcode does not provide a packet sniffing interface for developers
- Developers don’t need to bypass backup encryption
- Developers don’t need access to such sensitive content
- Apple wants developers to use the SDK APIs to get data
- There are no docs to tell developers about these “features”
To me, all those points seem to be provided to systematically deny legitimate uses for pcapd, which is contrary to the blog entry where he states, "I mentioned in my talk that pcapd has many legitimate uses". However, it's entirely possible I'm reading it wrong.
As I mentioned, there is good information in there. Adding extra, potentially misleading fluff is unnecessary and counterproductive, to my mind. That's just my opinion though.
Since I no longer work for the company, I'll mention that I've worked with the team at Apple that used pcapd on the iPhone. It was an extremely valuable tool for finding and testing issues.
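For context, the sanctioned path for outside developers to do something similar is host-side, via Apple's rvictl tool (it ships with Xcode), rather than talking to pcapd on the device directly. A minimal sketch, assuming a Mac with Xcode installed and a placeholder UDID:

    import subprocess

    UDID = "0123456789abcdef0123456789abcdef01234567"  # placeholder; use your device's UDID

    # Create a remote virtual interface (rvi0) that mirrors the device's traffic.
    subprocess.run(["rvictl", "-s", UDID], check=True)
    try:
        # Capture from the virtual interface with stock tcpdump (needs root);
        # stop the capture with Ctrl-C.
        subprocess.run(["sudo", "tcpdump", "-i", "rvi0", "-w", "capture.pcap"])
    finally:
        # Tear the virtual interface down again.
        subprocess.run(["rvictl", "-x", UDID], check=True)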
The fact that the other major mobile OS gets 98% of the mobile malware (according to studies) makes this point about the "nonchalant attitude" rather weak...
> The fact that the other major mobile OS gets 98% of the mobile malware (according to studies) makes this point about the "nonchalant attitude" rather weak...
No it does not.
It's not acceptable when any company is nonchalant about any security problem on their device, product or service.
And, taking Android as an example, Google is very open about the malware and malicious app problem[1] -- and takes steps to help mitigate said problem.
Apple is just straight-up telling users it's not a problem. There is a key difference here.
> I thought this required me to unlock my phone and say 'I trust this computer'
It does, but only once, and then it's almost impossible to "un-trust" the computer without wiping your phone. Besides, you have probably already "trusted" your home computer -- which could have been compromised between then and now and used as a vehicle for attack.
And, as some commenters have mentioned, when you plug into a new device -- an airplane USB charging port, for example -- the phone may repeatedly ask you to accept a pairing with the other side (the USB charger/device), and an accidental tap on the wrong button can leave the door open.
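For the curious: the computer's half of that trust relationship lives on a Mac as one pairing-record plist per device UDID under /var/db/lockdown, so you can at least see which pairings a given host holds. A minimal sketch that lists them (assumes a Mac; reading the records generally needs root):

    import glob
    import plistlib

    # One plist per paired device UDID; each holds the certificates that
    # let this host talk to lockdownd on the device.
    for path in glob.glob("/var/db/lockdown/*.plist"):
        with open(path, "rb") as f:
            record = plistlib.load(f)
        print(path, sorted(record.keys()))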
That 98% figure is caused primarily by users disabling the default-enabled restrictions on third-party, non-Play-Store apps, particularly in the pirated-app space.
The attitude you exhibit towards iOS doesn't fly when Linux advocates mention that Windows is the target of 98% of PC malware.
And maybe it's true: 98% of the PC malware is targeted at Windows and 98% of the mobile malware is targeted at Android. I certainly take advantage of the Windows malware situation by running Linux, and not running any malware checkers.
Apple is a respectable company who cares about their brand image, so they're obviously allowing only high-profile adversaries' malware on your device! </s>
I think that buying an Apple device is implicitly consenting to have all of your communications monitored and not to have access to your own data. In other words, I don't think this is a big deal, but there's also no need to spread FUD against the people who are specifically pointing it out.
edit: It's in the EULAs. I didn't think that I was saying something controversial :) I always underestimate people's level of denial...
I don't know what the situation is with Windows Mobile or any of the other mobile OSes, but with the big two it goes like this:
iOS: You get updates for your device for a decently long time after you buy it, even after new devices come out.
Android: You stop getting updates (all updates: security fixes are NOT backported) somewhere between 6 months and 2 years after getting your device, assuming it's a brand-new product line. If it's not the latest and greatest, there's a chance it's already out of date and won't be receiving any updates at all. Even if you do get updates, depending on the device they may arrive months or even years after release.
So, ignoring any other mobile OSes, your choice is "a few known issues and an attitude problem" with iOS or "walking around with well-publicised, wide-open security vulnerabilities in your pocket" with Android.
Only if you compare certain aspects of it to certain (other!) aspects of other manufacturers.
That was the point. Saying "oh but other manufacturers do other things badly" needlessly polarizes the argument and detracts from Apple's fuck-up, which is the topic of discussion here.
Of course not providing security updates after a relatively short time is a bad thing to do, but how is that even relevant to the backdoors in this article?
(edit: changed "Android" to read "manufacturers", as em3rgent0rdr rightly pointed out the complaint doesn't have anything to do with the OS, but with the manufacturers providing locked devices with Android on them. With Apple/iOS these just happen to be the same party)
Actually, if you use an Android open-source-based distro like CyanogenMod or OmniROM, you will likely be able to get nightly updates continuously. Android is a code ecosystem... you should really be directing criticism about the lack of updates at the individual manufacturers, such as Motorola or Samsung or HTC, if you are comparing to Apple.
You just described significant portions of the security industry, which runs on maximizing the fear and FUD factor.
It's not just true of computer security. It's really true globally of the entire "security" sector, from infosec to police to the global "national security" defense/intelligence industry and so forth. Step 1: frighten, step 2: sell protection, step 3: profit.
Not saying there aren't risks out there, just that the industry markets itself through bombast and sometimes exaggerates them.
Back to the infosec realm, the simple truth is that the only absolutely secure system is one that is off and the only absolute privacy is in your own head (maybe). Everything else is a matter of degrees of risk, and the curve is hockey stick shaped. It's relatively easy to mitigate the big risks, but that leaves a long tail of small risks and small vulnerabilities that require an exponentially increasing amount of effort and inconvenience to deal with.
Every industry trumps up the usefulness of its product; it's called marketing. It's on the consumer to cut through the marketing-speak and understand what they actually need to pay for.
Marketing can degrade into con artistry. It isn't always, but it certainly can.
That being said, I think in the long run people appreciate it if your marketing isn't scummy.
(Not directly aimed at the original article, though I do think there's a lot of deceptive, confusing, and overly hyperbolic FUD in the security sector.)
It's also on each industry to inform truthfully and honestly. As the work to "cut through the marketing-speak" can obstruct business activity, there are already various laws in place to punish a too-liberal interpretation of the word "marketing".
Well, yes, but without this huge marketing effort on the side of the infosec industry, we'd still be running the web on http instead of https, and we'd still have to convince people that XSS is bad and not just a "neat trick".
Those are just the first two things from the top of my head that IMO rightly took great effort to be taken seriously by the mainstream (developers and consumers alike).
Also I disagree with the "Uncertainty & Doubt" part of FUD. Security researchers are generally extremely clear about what exactly the issues and risks are, with few exceptions when required for responsible disclosure.
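To make the XSS example concrete, here's a minimal, hypothetical sketch (the search_results handler is invented for illustration) of the bug class that needed all that evangelism to be taken seriously:

    from html import escape

    def search_results(query: str) -> str:
        # Unescaped interpolation -- f"<p>Results for {query}</p>" -- would let
        # a query like "<script>send(document.cookie)</script>" run in every
        # visitor's browser. Escaping untrusted input closes the hole.
        return f"<p>Results for {escape(query)}</p>"

    print(search_results("<script>alert(1)</script>"))
    # -> <p>Results for &lt;script&gt;alert(1)&lt;/script&gt;</p>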
No, definitely not. I was partially ranting :) I couldn't find a recording of any talks he gave, only the slides, so he may have covered that in those.
So if I understand correctly, anyone who steals or temporarily has physical access to your iPhone can access your data, including account passwords (email, etc.), which makes the encryption on the device completely useless.
If that's not a serious back door, I don't know what is... the melodrama seems in order.
[1] https://pentest.com/ios_backdoors_attack_points_surveillance...
[2] Debug if you're Apple, Back Doors if you're Mr. Zdziarski