
Apple To Expand Child Protection System To Third-Party Apps


Apple’s plan to scan iOS users’ content across the board for nudity and child sexual abuse material has triggered a wave of criticism. Now the company has made clear that the system is intended for even broader use: the scanner could also be applied to third-party apps.

Apple Doubles Down Despite Criticism

Hardly any change to iOS in recent years has drawn criticism as sharp as the current project. With iOS 15, iPadOS 15 and, in part, macOS 12, Apple wants to introduce new functions that screen pictures locally for nudity and child sexual abuse material and check them before they are uploaded to iCloud. The company frames this as protecting both children and privacy, and has so far dismissed concerns as “misunderstandings”.
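As far as the iCloud part of the system has been publicly described, the check amounts to comparing a perceptual hash of each photo against an on-device database of known CSAM hashes before the upload proceeds. The following Swift sketch is a deliberately simplified illustration of that idea, not Apple’s actual NeuralHash implementation; every type and function name in it is hypothetical:

```swift
import Foundation

// Simplified illustration only: Apple's real system uses NeuralHash and
// cryptographic techniques such as private set intersection. Here the
// "perceptual hash" is supplied by the caller and matched against a
// plain local set of known hashes.

struct KnownHashDatabase {
    private let knownHashes: Set<Data>

    init(knownHashes: Set<Data>) {
        self.knownHashes = knownHashes
    }

    func contains(_ hash: Data) -> Bool {
        knownHashes.contains(hash)
    }
}

enum UploadDecision {
    case allow          // no match, the upload proceeds unchanged
    case flagForReview  // match against known material, handled by policy
}

/// Runs the local check on a single image before it is uploaded.
/// `perceptualHash` stands in for a hash that stays stable under
/// resizing or re-encoding, unlike a cryptographic hash.
func checkBeforeUpload(imageData: Data,
                       database: KnownHashDatabase,
                       perceptualHash: (Data) -> Data) -> UploadDecision {
    let hash = perceptualHash(imageData)
    return database.contains(hash) ? .flagForReview : .allow
}
```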

By its own account, the company even considers it “desirable” to roll out the “child protection” functions more broadly and to make them available to third-party apps at a later date. Apple has not yet provided details, but extending the system to any app with an upload function does not appear to pose a technical problem. No exact timeline is known, but Apple typically opens up features from its own apps to third-party developers via APIs within a reasonable time.
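Apple has published no such API, so any concrete shape is speculation. Purely as an illustration, a third-party integration might look like the following Swift sketch, in which the protocol, error type, and function names are all invented for this example:

```swift
import Foundation

// Entirely hypothetical: no such third-party framework exists today.
// The sketch only shows where a mandated on-device check could sit in
// an app's own upload pipeline.

protocol ChildSafetyScanning {
    /// Returns true if the image matches known CSAM material.
    func matchesKnownMaterial(_ imageData: Data) async -> Bool
}

enum UploadError: Error {
    case flaggedBySafetyScan
}

/// Wraps an app's existing `send` step with the hypothetical scan.
func upload(_ imageData: Data,
            scanner: ChildSafetyScanning,
            send: (Data) async throws -> Void) async throws {
    // The check runs locally; only a match would trigger whatever
    // blocking or reporting policy the platform mandates.
    if await scanner.matchesKnownMaterial(imageData) {
        throw UploadError.flaggedBySafetyScan
    }
    try await send(imageData)
}
```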

Others Could Follow

Finally, according to the MacRumors report, it is also conceivable that the CSAM (“Child Sexual Abuse Material”) detection system will be extended to apps that upload photos to platforms other than iCloud. Apple could thus encourage large providers such as Snapchat, Instagram, or WhatsApp to make the appropriate adjustments. It will be interesting to see how the heated discussion around the topic continues to shape the company’s decisions.