
Apple abandons plan to scan devices for CSAM

Apple is abandoning plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM), following backlash from critics who decried the feature's potential privacy implications.

Apple first announced the feature in 2021, with the goal of helping to combat child exploitation and promote safety, issues the tech community has increasingly focused on. But it soon put the rollout on hold amid a wave of criticism, saying it would “need more time in the coming months to gather input and implement improvements before releasing these all-important child safety features.”

In a public statement Wednesday, Apple said it has “decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired. (Apple did not respond to CNN’s request for comment.)

Instead, the company is refocusing its efforts on developing its Communication Safety feature, which it first made available in December 2021, after consulting experts for feedback on its child protection initiatives. The Communication Safety tool is an opt-in parental control feature that warns minors and their parents when incoming or outgoing image attachments in iMessage are sexually explicit and, if so, blurs them.

Apple was criticized in 2021 for its plan to offer a different tool that would start checking iOS devices and iCloud photos for child abuse imagery. At the time, the company said the tool would turn photos on iPhones and iPads into unreadable hashes, or complex numbers, stored on the user’s device. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the photos were uploaded to Apple’s iCloud storage service.
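For illustration only, the sketch below shows the general “hash on device, then compare against a known-hash list” flow described above. It is not Apple’s NeuralHash: the function names are hypothetical, and a cryptographic SHA-256 digest stands in for the perceptual hash Apple described, which is designed so that visually identical images produce the same value.

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple's NeuralHash is a perceptual hash, designed so that
// visually identical photos map to the same value. A SHA-256 digest is used here
// purely as a stand-in to show the "hash locally, compare at upload" flow.
func hashForPhoto(_ photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical set of known hashes, as would be supplied by a body such as NCMEC.
let knownHashes: Set<String> = []

// Before a photo is uploaded to iCloud, its locally computed hash is checked
// against the known-hash set; a match would flag the photo for review.
func shouldFlagBeforeUpload(_ photoData: Data) -> Bool {
    knownHashes.contains(hashForPhoto(photoData))
}

// Example usage with placeholder data (prints "false" since the set above is empty).
print(shouldFlagBeforeUpload(Data("example image bytes".utf8)))
```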

Many child safety and security experts praised the effort, recognizing the ethical responsibilities and obligations a company has for the products and services it creates. But they also called the effort “deeply concerning,” largely because part of Apple’s screening process for child abuse images would be performed directly on users’ devices.

In a PDF published on its website outlining the technology, which it called NeuralHash, Apple attempted to address concerns that governments could also force Apple to add non-child-abuse images to the hash list. “Apple will refuse any such request,” it stated. “We have faced requests to build and implement government-mandated changes that degrade user privacy in the past, and have vehemently denied those requests. We will continue to reject them in the future.”

Apple’s announcement that it is scrapping its plans for the tool came as the company announced several new security features.

Apple plans to expand end-to-end encryption of iCloud data to include backups, photos, notes, chat histories and other services, in a move that could further protect user data but could also create tension with law enforcement officials around the world. The tool, called Advanced Data Protection, will allow users to keep certain data safer from hackers, governments and spies, even in the event of an Apple data breach, the company said.


