
Apple update will check iPhones for images of child sexual abuse


Apple on Thursday said that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to iCloud. ― SoyaCincau pic

SAN FRANCISCO, Aug 6 — Apple on Thursday said that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to iCloud.

The software tweak to Apple’s operating systems will monitor pictures, allowing Apple to report findings to the National Center for Missing and Exploited Children, according to a statement by the Silicon Valley-based tech giant.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM),” Apple said.

The new technology will let the operating system match photos on a user's device against a database of known CSAM images provided by child safety organizations, and flag them as they are uploaded to iCloud, Apple said.
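Apple's statement describes the matching only at a high level; its published technical summary refers to an on-device perceptual hash (NeuralHash) that is compared against the database before upload. The sketch below is purely illustrative of matching fingerprints against a known-hash list and is not Apple's implementation: the hash entries, file paths and function names are hypothetical, and a plain SHA-256 digest stands in for a perceptual hash.

```python
import hashlib
import sys
from pathlib import Path

# Hypothetical fingerprint database standing in for the list of known CSAM
# hashes that child safety organizations would supply. Entries here are
# placeholders, not real values.
KNOWN_HASHES = {
    "0" * 64,  # placeholder digest
}

def fingerprint(path: Path) -> str:
    # A plain SHA-256 digest of the file bytes. Apple's design instead uses a
    # perceptual hash (NeuralHash) so that resized or recompressed copies of
    # the same image still match.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(paths: list[Path]) -> list[Path]:
    # Return only the files whose fingerprint appears in the known database,
    # i.e. the images that would be flagged as they head to iCloud.
    return [p for p in paths if fingerprint(p) in KNOWN_HASHES]

if __name__ == "__main__":
    candidates = [Path(arg) for arg in sys.argv[1:]]
    for match in flag_before_upload(candidates):
        print(f"would flag {match} before upload")
```

A byte-exact digest like the one above only matches identical files, which is why Apple's published design relies on a perceptual hash and, according to the company, only reports an account once a threshold number of matches is reached.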

The feature is part of a series of tools heading to Apple mobile devices, according to the company.

Apple’s iPhone messaging app will additionally use machine learning to recognise sexually explicit photos and warn children and their parents when such images are received or sent, the company said in the statement.

And personal assistant Siri will be taught to “intervene” when users try to search topics related to child sex abuse, according to the company. — AFP



