Apple to scan devices, iCloud for child pornography

Tech Tuesday

LANSING, Mich. (WLNS) – In an effort to protect children, Apple is unveiling a new program that will scan for child pornography on Apple devices. However, some privacy advocates are concerned.

The tool, known as "neuralMatch," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children, along with law enforcement, will be notified.

Apple also plans to scan users' encrypted messages for sexually explicit content as a child safety measure, a move that has also alarmed privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. It also said that its software would “intervene” when users try to search for topics related to child sexual abuse.

In order to receive warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.

But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes, such as the surveillance of protesters by the government.
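To illustrate the idea in the simplest terms, the sketch below shows fingerprint-based matching against a list of known entries. It is not Apple's NeuralHash implementation; it uses an ordinary cryptographic hash (Python's hashlib.sha256) as a stand-in for a perceptual hash, and the "known" entries are made-up bytes, purely to show why only images already in the database can produce a match.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # A plain SHA-256 stands in for the perceptual hash Apple describes;
    # a real perceptual hash tolerates resizing and recompression,
    # which a cryptographic hash does not.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints for known images (in Apple's system
# these come from NCMEC; here they are derived from stand-in bytes).
known_images = [b"known-image-1", b"known-image-2"]
known_fingerprints = {fingerprint(img) for img in known_images}

def flag_for_review(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint matches a known entry."""
    return fingerprint(image_bytes) in known_fingerprints

print(flag_for_review(b"known-image-1"))  # True: fingerprint is in the database
print(flag_for_review(b"family-photo"))   # False: unknown images are not flagged
```

The same property that makes this privacy-preserving in one sense, that the system only compares fingerprints rather than inspecting image content, is what researchers worry about: whoever controls the list of fingerprints controls what gets flagged.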

Authoritarian regimes, for example, could use the technology to identify dissidents or other groups facing government persecution.

The Associated Press contributed to this report.
