
Apple Backtracks On iPhone Photo Scanning—For Now


Apple’s latest shock update is definitely a reason to buy the new iPhone 13 and upgrade to iOS 15 as soon as you can. But Apple’s fight with Google and Facebook has taken a sudden twist, and the future shape of your iPhone is now genuinely at stake.

By now you’ll know that Apple has delayed the addition of an iCloud Photos CSAM filter and an iMessage photo filter to iOS 15. This backtrack was becoming as inevitable as the backlash Apple’s decision triggered in the first place. The iPhone 13 launch has been hugely tainted and Apple has failed to shake the negative messaging. This is not what we’ve come to expect when the stage lights dim, and a glitzy Apple launch begins.

Apple is quite simply a victim of its own success here. We pay a premium for our iPhones because we believe the hype—privacy, security, control. The fact that Apple’s decision to run client-side monitoring prompted genuine questions over whether users should stick with the brand shows just how fundamental an issue this quickly became.

This is a disaster for Apple—while hitting pause will be PR’d as listening to user feedback, the reality is that there are fundamental issues that can’t be resolved without a complete reversal of its plans. And worse for Apple, the company has now lined up behind its major rivals, following their own embarrassing reversals this year.

Three privacy backtracks inside just a few months—on that at least we can take some comfort. The privacy lobby is honing its skills, aligning its voice. Concerted campaigns have made a material difference when in the past that would not have been the case.


First we watched Facebook reverse its insistence that two billion WhatsApp users accept new data sharing terms of service or lose access to the app. That decision, which was as badly communicated as Apple’s CSAM update, prompted regulatory threats and complaints, a viral user revolt which catapulted Telegram and Signal into the headlines, and a WhatsApp PR recovery blitz that’s still ongoing.

Then we had Google retreat, after it secretly enrolled millions of Chrome users into its poorly designed FLoC trial. Again, the privacy lobby was incensed. Google claimed it was all fine, before quietly ending the trial and even more quietly admitting that the privacy fears were justified. Google backtracked. FLoC V1 was killed, and the company said it was taking time out to consider its next steps.

And now Apple. Just as Google said on FLoC that “it has become clear that more time is needed across the ecosystem to get this right,” Apple is now saying on CSAM that it has “decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

And so, Apple is now keeping company with Facebook and Google—the world’s leading data harvesters—on a hasty privacy backtrack. This really isn’t a good look. And like its rivals, Apple needs to ask itself some serious questions as to how it got this so wrong.

But a billion iPhone users need to watch closely now, because there is a key difference to the other backtracks—what Apple does next will shape the iPhone’s future.

Facebook needs to monetize WhatsApp—it runs the world’s largest messenger to generate revenue and profits, whatever the quirky, privacy-based social media posts might suggest. WhatsApp’s end-to-end encryption may have been a philosophy when it was privately held, but under Facebook’s ownership it’s a marketing USP.

Similarly, Google needs some form of user tracking in Chrome. The company can’t sell targeted ads without the data-fueled, algorithmically honed ecosystem that enables the buying and selling of ads and the measurement of their success. That’s why Google has given tracking cookies a reprieve and why you should quit Chrome.

Apple doesn’t need to run on-device filtering on iPhones. Yes, it might need to up the ante on its CSAM screening, but that can be done with cloud screening, matching the industry norms with some additional Apple cleverness. That would prompt almost no resistance compared to its Plan A—others do the same, and it should use technology to keep such filth off its servers, whether mandated by law enforcement or not.

It is notable that when it transpired that Apple already screened some iCloud Mail for CSAM, there was no real backlash and no reaction; users seem fine with that approach. “Privacy means people know what they’re signing up for, in plain English. That’s what it means,” Steve Jobs said in 2010. “Ask them. Ask them every time. Make them tell you to stop asking them, that they’re tired of you asking them.” Apple, take note.

Apple also doesn’t need to run machine learning client-side on iMessage, issuing warnings to minors that send or receive sexual images. This is a poorly conceived idea on every level and should be left to gather dust on a shelf.

And so, when Apple does “release these critically important child safety features,” those need to have materially, completely and fundamentally changed, or this backlash will trigger again. The company is never going to persuade users raised on promises of privacy and control to accept monitoring software lurking behind their home screens.

Apple’s plans for CSAM scanning and iMessage filtering were technically complex and fraught with what-ifs. Many of the complaints were spurious. Yes, comparing hashes of watchlist images with user images could lead to false positives, and yes, there is a far-fetched risk that the system could be abused to compromise users’ photo albums. But in among the multitude of worries, four issues are very real and will remain a serious concern unless and until Apple confirms it plans to shelve or rearchitect its plans.
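To see why hash matching can misfire, consider a toy sketch of perceptual hashing. This is not Apple’s NeuralHash—the hash function, the tiny 4x4 “images” and the distance threshold below are all simplifications for illustration—but it shows the core mechanic: hashes of visually similar images are compared by closeness, not exact equality, which is precisely what opens the door to false positives.

```python
# Toy perceptual-hash sketch (NOT Apple's NeuralHash): a flat list of
# grayscale pixels is reduced to one bit per pixel (brighter than the
# image's mean or not), and matching tolerates small bit differences.

def average_hash(pixels):
    """One bit per pixel: set when the pixel exceeds the image mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(candidate, watchlist_hashes, threshold=2):
    """Flag a candidate whose hash is 'close enough' to any watchlist
    hash. Closeness rather than equality is what makes false positives
    on unrelated but similarly structured images possible."""
    h = average_hash(candidate)
    return any(hamming(h, w) <= threshold for w in watchlist_hashes)

# A hypothetical watchlist image: alternating dark/bright pixels.
watchlist_img = [10, 200, 30, 220, 15, 210, 25, 230,
                 12, 205, 28, 225, 18, 215, 22, 235]
# A completely different image with the same bright/dark layout
# collapses to the same hash -- a false positive.
lookalike     = [50, 180, 60, 190, 55, 185, 65, 195,
                 52, 182, 58, 192, 56, 188, 62, 198]

watchlist = [average_hash(watchlist_img)]
print(matches(lookalike, watchlist))  # prints True
```

Real systems use far more robust hashes and server-side thresholds before any human review, but the structural point stands: any similarity-based matcher trades some false positives for resilience to cropping and re-encoding.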

First, the threat that governments might press Apple into accepting compromises of these systems to comply with local laws or appease local lawmakers. Apple cannot just bat this away given its track record in China with iCloud data hosting and App Store censorship. In the real world, China is China and Apple is a profit-generating entity.

Apple needs to address this with more than PR before it returns to the table with new ideas. China is a major Achilles heel for the company here; its track record is not good.

Second, the client-side compromise to iMessage on the iPhone, not technically breaking end-to-end encryption but a boon to lawmakers who have asked for just such endpoint “backdoors.” Once the technology is in place, it becomes much harder to fight the ethics of terrorism or violent crime classifiers, and impossible to brush off law enforcement hawks with the claim it’s technically impossible.

End-to-end encryption remains under threat, and it’s Apple that is opening the door to some form of compromise. This is what incensed WhatsApp. The industry had seemed united, and then Apple announced iMessage client-side screening out of nowhere. Don’t believe the technical hand-waving here. An endpoint compromise added by the developer of an end-to-end encrypted platform is a potential backdoor. Period.

Third, Apple is still reeling from Pegasus and the claims that NSO’s spyware was able to exploit vulnerabilities in iMessage and other Apple platforms to target its devices. Adding complex monitoring systems to users’ devices would inevitably add another attack surface. Apple cannot claim otherwise, that’s just not credible.

Fourth and arguably most critically, there is the intangible user issue. Apple’s move has clearly undermined the feeling of privacy and security it has worked so hard to foster across its user base. This was entirely predictable—you can’t work for years to build up a fortress device mantra and then prick the bubble without consequences.

It’s this fourth issue that Apple can’t fight with more promises of audits and expert assurances and technical papers. It turns out that users just don’t like the idea of this. If Apple wants such technology on its platforms, it should add it to its own servers. It will have the same effect, but it keeps monitoring away from users’ phones.

“This delay clearly demonstrates that Apple are listening to the many groups against the move,” says ESET’s Jake Moore, “but unless it drops the idea altogether, the backdoor remains a threat to users in the future. It is vital that users’ phones are protected and kept safe from any prying eyes regarding privacy or third-party threats.”

The privacy lobby has had a field day, not because monitoring and filtering such content is unusual or wrong, but because this is Apple and if even your iPhone can be subject to such seemingly Orwellian interference, then is anything still safe?


“This is a direct response to the outcry from users and civil society,” EFF’s Eva Galperin tweeted on the backtrack. “We're not done, but this is a reminder that collective action moves the needle.”

Apple’s take-away is more complex. It knows that even had it persisted with the updates there’s not much chance of droves of users switching to Android, which is less private and less secure on almost every level. But there would have been questions about iMessage and iCloud, and Apple would be inviting competing cloud platforms to pitch its users. It’s likely that Apple’s all-important services revenue would have taken a hit, more so than its devices carrying all that controversial monitoring software.

Thankfully, we can now update to iOS 15 without worrying about these complex additions. That means we will get the benefit of Private Relay in Safari, Mail Privacy Protection, improved (one can only hope) iMessage security after Pegasusgate, and enhanced (potentially) Face ID given our masked alter egos.

Apple has been landing successful blows on Facebook and Google all year. Restricting access to its users’ data, adding transparency to apps’ data harvesting, adding privacy improvements to Safari over Chrome and Mail over Gmail. The company won’t like that it’s just repeated their mistakes. Fortunately for Apple, what it needs to do next is easy and obvious, quickly acknowledging the strength of feeling, confirming that what happens on your iPhone really does (and will continue to) stay on your iPhone.
