Apple says it will refuse government demands to use new child safety system for surveillance
- Apple says it will refuse any government demand to repurpose the new system for other criminal or national-security investigations.
- This comes after Apple announced a new Child Sexual Abuse Material (CSAM) detection system coming to US Apple devices in iOS 15.
- The feature automatically scans photos on-device before they are uploaded to iCloud.
- Apple's response follows concerns that the feature could enable surveillance or be expanded to scan for non-CSAM images.
- Full story here.