
Apple delays controversial child safety plans


The company now plans to take a few more months to gather feedback and make improvements before launching the features, which drew fierce criticism from privacy advocates.

In a surprise announcement Friday, Apple said it would take more time to improve its controversial child safety tools before introducing them.

More feedback wanted

The company says it intends to gather more feedback and improve the system, which had three key components: scanning iCloud photos for CSAM (child sexual abuse material), on-device scanning of messages to protect children, and search guidance designed to protect children.

Ever since Apple announced the tools, it has faced a torrent of criticism from individuals and rights groups around the world. The argument the company seems least able to address is the potential for repressive governments to force Apple to monitor for more than CSAM.

Who watches the watchers?

Edward Snowden, who is charged in the US with leaking classified information and is now a privacy advocate, warned on Twitter: “Make no mistake: if they can scan for child pornography today, they can scan for anything tomorrow.”

Critics said these tools could be exploited or extended to support censorship of ideas or to threaten free thought. Apple’s response, that it would not expand the system, struck many as naive.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the company said.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content,” the Electronic Frontier Foundation responded.

Apple listens to its users (in a good way)

In a statement widely released to the media (on the Friday before a US holiday, when bad news is often buried) announcing the delay, Apple said:

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It’s a move the company had to make. In mid-August, more than 90 NGOs contacted the company in an open letter asking it to reconsider. The letter was signed by Liberty, Big Brother Watch, the ACLU, the Center for Democracy & Technology, the Center for Free Expression, the EFF, ISOC, Privacy International and many more.

The devil in the details

The organizations noted several weaknesses in the company’s proposals. One stands out above the rest: the fact that the system itself could be abused by abusive adults.

“LGBTQ+ young people on family accounts with unsympathetic parents are particularly at risk,” they wrote. “As a result of this change, iMessages will no longer provide privacy and confidentiality to those users.”

There are also concerns that Apple’s proposed system could be expanded. Sharon Bradford Franklin, co-director of the CDT Security and Surveillance Project, warned that governments “will ask Apple to scan for and block images of human rights abuses, political protests and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

Apple’s defenders argued that the company was trying to preserve the overall privacy of user data while building a system that could catch only illegal content. They also pointed to the various safeguards the company had built into the system.

Those arguments did not win out, and Apple’s executives surely saw the same kinds of comments on social media that the rest of us did, comments reflecting a deep distrust of the proposals.

What’s next?

Apple’s statement did not say. But given that the company has spent recent weeks meeting with media and stakeholders across its markets on this issue, it seems logical that the second iteration of its child protection tools will try to address those concerns.