Apple postpones child abuse detection system after strong opposition

After strong opposition from privacy activists, Apple has backed down on its plan to launch software that detects child pornography and sexual abuse photos on the iPhone.

The company said it would postpone, and possibly modify, the new system, which had originally been scheduled to launch this year.

Apple said in a statement: “We have decided to take more time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

One of the proposed features involves matching files uploaded to iCloud Photos from a user’s iPhone against a database of known child sexual abuse images.
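
At a high level, that matching step can be pictured as computing a fingerprint of each photo before upload and checking it against a set of known fingerprints. The sketch below is a deliberately simplified illustration, not Apple’s design: Apple’s published proposal reportedly used a perceptual hash (“NeuralHash”) plus cryptographic blinding so that matches are not revealed on the device, whereas this toy version uses a plain SHA-256 digest and a local, hypothetical database.

```python
# Toy sketch of hash-database matching (illustrative only; not Apple's system).
import hashlib
from pathlib import Path

# Hypothetical database of hex digests of known images. A real deployment
# would use a perceptual hash so resized or re-encoded copies still match.
KNOWN_IMAGE_HASHES: set[str] = set()

def image_digest(path: Path) -> str:
    """Fingerprint the raw file bytes with SHA-256 (a simplification)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True if the photo's digest appears in the known-image database."""
    return image_digest(path) in KNOWN_IMAGE_HASHES
```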

But the new controls, announced last month, caused widespread alarm among privacy and human rights organizations, who fear that a tool that scans images on the iPhone could be abused by authoritarian regimes.

The American Civil Liberties Union warned that any system that detects data stored on mobile phones could also be used against activists, dissidents and minorities.

“Given the wide-ranging interests of governments around the world, we cannot be sure that Apple will always resist requests to scan iPhones for additional selected material,” Daniel Kahn Gillmor, a technologist at the ACLU, said last week. “For all iPhone users, these changes are a step towards worse privacy.”

Apple’s change of course frustrated some child protection activists. Andy Burrows, head of child safety online policy at the UK charity NSPCC, said the move was “incredibly disappointing” and that the company “should stick to its position.”

Apple’s initial proposal was welcomed by officials in the United States, the United Kingdom and India, but it angered Silicon Valley, coming amid delicate negotiations between the technology industry and regulators over how to tackle online child abuse.

The head of WhatsApp called it “very worrying.” The Electronic Frontier Foundation, a Silicon Valley digital rights organization, said it was a “shocking shift” for users who had relied on the company’s leadership on privacy and security.

In an email circulated within Apple, child safety campaigners dismissed the complaints of privacy activists and security researchers as “the screams of the minority.”

Apple spent several weeks defending its plan, saying it involved “state-of-the-art” encryption technology to ensure that the company itself could not see any images stored on customers’ devices.

It stated that the system would be used only for child protection, and that a human review team, combined with a minimum number of images that must be detected before an account is flagged, would all but eliminate the possibility of error or abuse.
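
To illustrate the threshold rule described above, here is a minimal hypothetical sketch: an account is only surfaced to human reviewers once the number of matched images crosses a minimum. Apple’s published design reportedly used threshold secret sharing so that nothing about matches is learnable below the threshold; the plain counter here shows only the flagging logic, not the cryptography, and the threshold value is an assumption for illustration.

```python
# Minimal sketch of the threshold-then-human-review rule (illustrative only).
MATCH_THRESHOLD = 30  # assumed value for illustration; not confirmed here

def should_escalate_to_human_review(match_count: int,
                                    threshold: int = MATCH_THRESHOLD) -> bool:
    """Surface an account for human review only once matches reach the threshold."""
    return match_count >= threshold
```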

But Craig Federighi, Apple’s senior vice president of software engineering, admitted that introducing the child pornography detection system at the same time as a separate tool, which can warn parents if their children receive sexually explicit photos through iMessage, had caused confusion.

“It is clear that a lot of the messaging got confused in terms of how people understood it,” Federighi told the Wall Street Journal last month. “In hindsight, introducing these two features at the same time was a recipe for this kind of confusion.”


