
Apple Toys with Privacy
Terry H. Schwadron
Aug. 8, 2021
We have a continuing problem balancing privacy against crime-fighting, particularly when Big Tech is involved.
Apple put its thumb on the scale this week by announcing it is testing a system that will enable the company to flag child porn images being uploaded to iCloud storage and report them to authorities.
As with decisions among the Big Tech firms to try to balance free speech against misinformation on social media, the company announcement has privacy advocates buzzing about the slippery slope toward authoritarianism. As an issue, it is bigger or smaller depending solely on whether one believes individual liberty or social duty must triumph in the end.
If you believe that child exploitation is a problem, this was an intelligent, socially responsible move by Apple. If you insist on total privacy with no exceptions, this was a surrender that we will come to regret as government finds more and more reason to search your phone, computer files and keystrokes with abandon.
As usual, there are a bunch of asterisks to add here. For openers, Apple is not unique; as CNBC notes, Google has been able since 2008 to identify illegal images on its services, and Facebook said in 2019 that it removed 11.6 million posts related to child nudity and child sexual exploitation in just three months. Facebook subsidiary WhatsApp, which also uses end-to-end encryption for some of its messages, has faced pressure to provide more access to people’s content to prevent child exploitation.
Apple describes its system as an improvement because its approach learns as little as possible about the images on a person’s phone or cloud account while still flagging illegal child pornography: it scans unique information corresponding to image files and compares it with a database of known child sexual abuse material maintained by the National Center for Missing & Exploited Children. Another feature, which parents can enable, scans iMessage images sent or received by accounts owned by a minor for sexually explicit material.
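To make the mechanics concrete, here is a minimal sketch in Python of that kind of hash-matching pipeline. The names, the SHA-256 stand-in, and the threshold value are illustrative assumptions, not Apple’s implementation: Apple’s published design uses a perceptual hash it calls NeuralHash, so altered copies of an image still match, plus a cryptographic private set intersection so the device never learns which images, if any, matched.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of fingerprints of known child
# sexual abuse material, such as the one NCMEC maintains. In Apple's
# design the device only receives a "blinded" (encrypted) form of this
# list, so it can never read the raw entries.
KNOWN_BAD_HASHES: set[str] = set()

# Apple says an account is flagged for human review only after a
# threshold of matches accumulates; the exact value here is a guess.
MATCH_THRESHOLD = 30

def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for one image file.

    SHA-256 is a stand-in that matches only byte-identical files; the
    real system uses a perceptual hash so that resized or re-encoded
    copies of an image still produce a match.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag_account(uploads: list[Path]) -> bool:
    """Return True if this batch of uploads crosses the match threshold."""
    matches = sum(1 for path in uploads if fingerprint(path) in KNOWN_BAD_HASHES)
    return matches >= MATCH_THRESHOLD
```

The threshold is the notable design choice: requiring multiple independent matches before any human review is Apple’s stated safeguard against a single false positive flagging an account.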
Specific technology aside, Apple and others have made it a mainstay of their business to protect the individual’s right to privacy, even at the cost of denying the government and law enforcement information that could lead to arrests related to mass shootings or terrorism charges.
Slip-sliding
The worry, of course, is that if it is kiddie porn today, tomorrow Apple and the others will find reason to identify other information sought by various authorities — up to political speech or documents related to government dissent. Consider not only left-leaning protest movements, but the current attempts to determine who was behind the Jan. 6 uprising at the U.S. Capitol.
Still, no one wants to defend child exploitation.
The Electronic Frontier Foundation (EFF), which has supported Apple’s policies on encryption and privacy, slammed the new policy as a “backdoor,” or a system built to give governments a way to access encrypted data. “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the nonprofit said in a blog post.
Apple itself sees the new system as part of its privacy-protecting tradition: eliminating illegal content while still protecting user privacy. Apple also claims the system can’t be repurposed for other kinds of content.
Since 2019, Apple has used the marketing slogan, “What happens on your iPhone, stays on your iPhone.” Apple CEO Tim Cook has addressed the “chilling effect” of knowing that what’s on your device may be intercepted and reviewed by third parties, adding that a lack of digital privacy could prompt people to censor themselves even if they have done nothing wrong. Over time, Apple has introduced more services to hide user IP addresses and locations, for example.
Betrayal
So, privacy advocates see the new system as a betrayal of principles.
From a practical point of view, without privacy guarantees, business deals involving credit cards, digital banking, health-care information and the like are at risk for the Apples of the world. So, it is unlikely that this decision came without a lot of review.
It is clear that these kinds of policies are as much about reputation as fact.
The Electronic Frontier Foundation makes clear it thinks this will hurt both reputation and principle.
“Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy,” EFF argues. “To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
EFF says that even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption and open the door to broader abuses. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change,” the group says, citing new rules in India and Ethiopia.
Apple’s changes would enable more such screening, takedown, and reporting, the group says, adding, “The abuse cases are easy to imagine: Governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.”
Once again, public debates are framing complex issues as either/or, with a lot of spin on the desired ideological outcome — though this one cuts across other political divides.
In general, what makes all this go are databases of previously identified material, against which incoming messages or uploads to storage are compared. That, of course, is the real issue here.
The tools are almost never the problem. It is what we do with them that matters.
##