No Longer Concerned About Privacy? Apple Opens Backdoors to iPhones to Detect CSAM

By Rafia Shaikh


Apple has announced planned changes to its operating systems that sound like a massive privacy nightmare. The plans have raised concerns across the industry, but the company argues it is acting to protect children and limit the spread of Child Sexual Abuse Material (CSAM).

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)," the company writes. The efforts include new safety features in Messages, detection of CSAM content in iCloud, and expanded guidance in Siri and Search.

The two main points of concern are:

  1. Apple plans to add a scanning feature that will check all photos as they are uploaded to iCloud Photos against the database of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC); a simplified sketch of this matching follows the list.
  2. It will also scan all iMessage images sent or received by child accounts (accounts designated as owned by a minor) for sexually explicit material. Apple will warn the child if they try to send or receive sexually explicit photos and can notify the parent.
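
To make the first mechanism concrete, here is a minimal Python sketch of what hash-based matching looks like in principle. Every name in it is a stand-in: a real system would use a perceptual hash (which maps visually similar images to the same digest) rather than SHA-256, and the hash list would come from NCMEC, not an empty placeholder.

    import hashlib

    # Toy illustration of hash-based matching -- NOT Apple's actual
    # implementation. perceptual_hash and KNOWN_CSAM_HASHES are
    # placeholders invented for this sketch.

    def perceptual_hash(image_bytes: bytes) -> str:
        """A real perceptual hash tolerates resizing and recompression;
        SHA-256 is used here only to keep the sketch runnable."""
        return hashlib.sha256(image_bytes).hexdigest()

    KNOWN_CSAM_HASHES: set = set()  # placeholder for the NCMEC-provided list

    def matches_known_csam(image_bytes: bytes) -> bool:
        """Membership check run on-device before a photo reaches iCloud."""
        return perceptual_hash(image_bytes) in KNOWN_CSAM_HASHES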

These features are expected to arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

Apple breaks its own security and privacy model to protect children but might put them at further risk

The iPhone maker plans to scan images for known CSAM using its neuralMatch algorithm, which has reportedly been trained on 200,000 sex abuse images collected by the NCMEC. According to reports, every photo uploaded to iCloud will be given a "safety voucher," and if a certain number of photos are marked as suspect, Apple will be able to decrypt all the flagged photos and, if they are illegal, pass them on to the NCMEC.
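
In other words, a single match reportedly does nothing on its own; review is gated on an account crossing a match threshold. Below is a rough Python sketch of just that counting logic, with an invented threshold value (Apple has not published one); the real vouchers are cryptographic objects, not booleans.

    MATCH_THRESHOLD = 10  # hypothetical value for illustration only

    def process_upload(voucher_matches: list, is_match: bool) -> str:
        """Record this upload's match flag and decide the outcome."""
        voucher_matches.append(is_match)
        if sum(voucher_matches) >= MATCH_THRESHOLD:
            return "threshold crossed: flagged photos become decryptable for review"
        return "below threshold: vouchers stay encrypted"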

Apple argues that this is being done with user privacy in mind.

"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations," the company writes. "Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."

However, security researchers, while supportive of efforts to fight child abuse, are concerned that Apple is effectively handing governments worldwide a way into user data, one that could be stretched beyond what Apple currently plans, as is the case with all backdoors. While the system is purported to detect child sexual abuse material, it could be adapted to scan for other text and imagery without users' knowledge.

"It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops." - Ross Anderson of the UoC.

By creating this backdoor for the US government, Apple - which likes to make everyone believe it's at the forefront of user privacy - also invites other governments to make their own demands. While this is being done in the US for now, it paves the way for similar and even more targeted demands from governments elsewhere.

Security researchers around the globe have been writing about why this is effectively the end of privacy at Apple, since every Apple user is now treated as a criminal until proven otherwise.

"You can wrap that surveillance in any number of layers of cryptography to try and make it palatable, the end result is the same," Sarah Jamie Lewis, executive director at Open Privacy, wrote.

"Everyone on that platform is treated as a potential criminal, subject to continual algorithmic surveillance without warrant or cause."

The Electronic Frontier Foundation has published a detailed argument calling this move a "backdoor to your private life."

"Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it," the digital rights firm wrote, adding that backdoor is always a backdoor regardless of how well-designed it may be.

"But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." - EFF

The new features are concerning even without government meddling and could prove life-threatening for queer kids. Kendra Albert of Harvard's Cyberlaw Clinic tweeted that these algorithms are going to overflag LGBTQ+ content, including transition photos. "Good luck texting your friends a picture of you if you have 'female presenting nipples,'" Albert tweeted.

Matthew Green, a cryptography professor at Johns Hopkins University, said that Apple is starting this launch with photos that are not end-to-end encrypted, so it supposedly doesn't hurt user privacy, "but you have to ask why anyone would develop a system like this if scanning E2E photos wasn't the goal." Not to forget that these systems rely on databases of "problematic media hashes" that users can't review.

Green also reminded everyone that while Apple's privacy-forward image has brought it a lot of good press and consumer trust, this is the same company that dropped plans to encrypt iCloud backups because of the FBI.

Apple has shared full details of these new changes in this document. While Apple may be well-intentioned, the iPhone maker is not only breaking its promises of security and privacy but is also leaving users to rely on their governments not to misuse this access to their personal data - governments that don't have a good track record there.

As the EFF says, what Apple is doing isn't just a slippery slope; it's "a fully built system just waiting for external pressure to make the slightest change."
