Apple plans to scan iPhones for child abuse imagery

By Qatar Day - August 08, 2021


Apple Inc (AAPL.O) is planning to install software on U.S. iPhones that will scan for child abuse imagery, the Financial Times reported on Thursday, citing people familiar with the matter.

Earlier this week, the company detailed its planned system, called "neuralMatch," to academics in the United States via a virtual meeting, the report said, adding that the plan could be announced publicly as soon as this week.

https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f5411f

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices.

Apple detailed its proposed system, known as "neuralMatch," to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.

The automated system would proactively alert a team of human reviewers if it believes it has detected illegal imagery; the reviewers would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.

Apple confirmed its plans in a blog post, saying the scanning technology is part of a new suite of child protection systems that would "evolve and expand over time". The features will be rolled out as part of iOS 15, expected to be released next month.

"This innovative new technology allows Apple to provide valuable and actionable information to the National Center for Missing and Exploited Children and law enforcement regarding the proliferation of known CSAM [child sexual abuse material]," the company said.

"And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account."
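In broad terms, the described approach amounts to matching photo fingerprints against a database of known CSAM hashes and only escalating an account to human review once enough matches accumulate. The Python sketch below is a heavily simplified illustration of that threshold-matching idea under assumed names and values; the hash function, database contents, and threshold are placeholders, and it does not reflect Apple's actual protocol, which reportedly relies on an on-device perceptual hash and cryptographic techniques so that the company learns nothing about accounts below the threshold.

```python
# Illustrative sketch only: NOT Apple's design. A simple threshold-based
# matcher against a set of known image hashes, with placeholder values.

import hashlib
from typing import Iterable

# Hypothetical database of hashes of known abuse imagery, as would be
# supplied by a clearinghouse such as NCMEC (placeholder value).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical threshold: no account is flagged until this many matches accumulate.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here simply SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photos: Iterable[bytes]) -> int:
    """Count how many photos hash to a value in the known-hash set."""
    return sum(1 for photo in photos if image_hash(photo) in KNOWN_HASHES)


def should_flag_for_review(photos: Iterable[bytes]) -> bool:
    """Flag an account for human review only if matches reach the threshold."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

A real deployment would use a perceptual hash that tolerates resizing and re-encoding rather than an exact byte hash, and would perform the matching in a way that keeps both the hash database and individual match results hidden until the threshold is crossed.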

Reporting by Akanksha Rana in Bengaluru; Editing by Arun Koyyur
