April 24, 2024

Apple Gets Slammed by Tech Experts & Organizations for Its Child Safety Feature

Last week, Apple announced a child safety plan intended to help fight the growing number of child abuse cases in the US. The software is being rolled out in collaboration with the National Center for Missing and Exploited Children (NCMEC). The iPhone maker will deploy child safety features aimed at restricting the spread of Child Sexual Abuse Material (CSAM).

Apple’s new software is still in the testing phase and is expected to become available later this year. NCMEC’s CEO John Clark said that “these new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.” He also called the move a “game changer” in the statement. However, security researchers and other tech experts have opposed Apple’s update. Privacy advocates warned that the new software could be used to censor other content on users’ devices.

Apple’s new software is not a unique selling point for the company. It uses a hashing system to flag and report sexually explicit images on iCloud. Similar technology has been in use by the search engine giant Google since 2008 to spot illegal images, and in 2019 the social media behemoth Facebook said it had deleted 11.6 million pieces of inappropriate content in only three months. Apple, however, says its software uses a combination of hardware and mathematics to learn as little as possible about the images on Apple devices.
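Apple has not published implementation details beyond this high-level description, but the core idea of checking images against a database of known hashes can be sketched simply. The Python example below is a hypothetical, simplified illustration only: it uses a plain SHA-256 digest as a stand-in, whereas Apple’s system reportedly relies on a perceptual “NeuralHash” and on-device cryptography, and the KNOWN_HASHES set and helper names here are invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes. In the real system these values
# would come from NCMEC and would be perceptual hashes, not SHA-256 digests.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """Report whether a file's hash appears in the known-hash database."""
    return file_hash(path) in KNOWN_HASHES
```

A cryptographic digest like this only matches byte-identical files; the privacy debate centers on perceptual hashing, which matches visually similar images and therefore requires scanning photo content on or near the device.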

Privacy Experts Slam Apple

“Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow,” NSA whistleblower and Freedom of the Press Foundation president Edward Snowden tweeted. The Electronic Frontier Foundation (EFF) said in a post that the software rollout is a decrease in privacy for iCloud Photos users. The organization has long praised Apple for its privacy policies, but not this time. “Apple is planning to build a backdoor into its data storage system and its messaging system,” the blog post reads.

The messaging platform WhatsApp has also criticized Apple’s move. Facebook’s subsidiary WhatsApp is known for its end-to-end encryption, which prevents leaks or screening of users’ data. “I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no,” WhatsApp head Will Cathcart said in a tweet. He also posted a letter Apple had once sent to its customers addressing their privacy concerns, writing that those “words were wise then, and worth heeding here now.”

Apple VP Sebastian Marineau also acknowledged in an internal memo that users would be “worried about the implications” of the systems. An open letter to Apple signed by major organizations, including GigaHost, ThinkPrivacy, WebTorrent, and Mojeek, said, “Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.” The Center for Democracy & Technology (CDT) published an article titled “Apple’s Changes to Messaging and Photo Services Threaten Users’ Security and Privacy.”

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” said Greg Nojeim, co-director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.” Meanwhile, Ross Anderson, professor of security engineering at the University of Cambridge, said, “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops.”