Table of contents
- What is CSAM Apple?
- Is Apple still scanning for CSAM?
- Apple CSAM features
- Apple CSAM iCloud detection
- Apple anti-CSAM communication
- Apple CSAM controversy
- Final thoughts
What is CSAM Apple?
Apple CSAM refers to Apple’s photo-scanning tool designed to detect child sexual abuse material (CSAM) in Apple users’ iCloud accounts. It doesn’t check other device data.
Named “CSAM detection,” the tool was announced in August 2021; it would not only identify evidence of CSAM but also report the users storing it. The goal was to protect children from being exploited and limit the spread of CSAM.
Did you know?
More than 99.5% of the reports received by the National Center for Missing & Exploited Children’s CyberTipline in 2022 were related to CSAM. That’s nearly all reports.
Is Apple still scanning for CSAM?
No, Apple is not scanning iCloud for CSAM. Apple abandoned its plans to release a CSAM scanning tool for its devices and iCloud in December 2022, following widespread concerns about the tool’s threat to user privacy, even though Apple maintained that the tool was designed to protect it.
Apple’s decision to cancel the CSAM detection tool wasn’t an abrupt one. The first pushback came in September 2021, roughly a month after the tool was announced. According to Wired, Apple paused the CSAM scanning project under pressure from researchers and digital rights groups, who worried that the tool could be abused to compromise users’ privacy and security. Needless to say, most people would find the idea of Apple going through their photos scary, creepy, and invasive.
However, in a statement to Wired, Apple revealed that instead of launching the CSAM detection tool, it would expand its communication safety features. These features curb the spread of CSAM at the source by preventing children from accessing and distributing it.
Apple CSAM features
Apple’s CSAM features fall into two categories: CSAM detection in iCloud and anti-CSAM communication. Both aim to tackle child sexual abuse content.
Apple CSAM iCloud detection
Apple’s CSAM iCloud detection would have done exactly what the name suggests: identify content, mainly photos, depicting child sexual exploitation in a user’s iCloud account. The tool was designed to focus only on pictures stored in iCloud and wouldn’t have worked if iCloud was disabled.
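For readers curious about the mechanics, Apple’s published design relied on comparing image “fingerprints” (perceptual hashes) against a database of known CSAM hashes, with an account flagged for human review only after a match threshold was crossed. The Swift sketch below is a simplified, hypothetical illustration of that idea; the hash values, threshold, and matching logic are placeholders, not Apple’s actual NeuralHash technology.

```swift
import Foundation

// Conceptual sketch only -- NOT Apple's NeuralHash or its real matching code.
// It illustrates the general idea Apple described: compare image fingerprints
// against a database of known-CSAM hashes and flag an account only after a
// match threshold is crossed. All values below are made-up placeholders.

let knownHashes: Set<String> = ["a1b2c3", "d4e5f6", "0f9e8d"]   // hypothetical hash database
let userPhotoHashes = ["1a2b3c", "d4e5f6", "778899", "0f9e8d"]  // hypothetical user photo hashes

// Apple's published design used a match threshold to reduce false positives;
// the number here is arbitrary, chosen for illustration.
let reportingThreshold = 2

let matches = userPhotoHashes.filter { knownHashes.contains($0) }

if matches.count >= reportingThreshold {
    print("Threshold reached: \(matches.count) matches would trigger human review.")
} else {
    print("Below threshold: \(matches.count) match(es), nothing flagged.")
}
```

In other words, the system wouldn’t have “looked at” photos the way a person does; it would have checked mathematical fingerprints against a fixed list, which is also why critics worried about what else could end up on that list.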
Although well-intentioned, Apple CSAM detection was scrapped due to privacy concerns. However, Apple continues to tackle CSAM through its anti-CSAM communication features.
Apple anti-CSAM communication
Rolled out in December 2021, Apple’s anti-CSAM communication features target CSAM in Spotlight search, Safari search, and Siri. They also help prevent children from receiving and sending CSAM via Messages on their Apple devices by showing them warnings.
Parents and caregivers must opt in to use Apple’s anti-CSAM communication features. Not only do they prevent the sharing and viewing of on-device content that sexually exploits children, but they also provide resources to report it. Because Apple’s anti-CSAM communication features are available on an opt-in basis, Apple users who don’t opt in don’t have to worry about their privacy being infringed upon.
Apple CSAM controversy
In 2021, Apple came under fire when it revealed its plans to scan users’ iCloud accounts for pictures containing CSAM. Digital rights and privacy advocates were at the forefront of those who decried the privacy risks of the initiative.
The main concern was that governments and bad actors could abuse the tool to invade Apple users’ privacy. Given how invasive a tool like CSAM detection could be, it’s understandable why privacy experts and advocacy groups had concerns.
It’s the same reason some users wonder, ‘Is Apple Pay safe?’ Apple users’ data can be compromised in various ways, including Apple ID scams and fake security warnings. Incidentally, you can stop fake Apple security warnings to avoid inviting malware onto your iPhone or iPad.
Although ‘if you’re not doing anything wrong, you have nothing to worry about’ may sound reassuring, nobody can guarantee that Apple’s scanning technology won’t eventually be used for other content, too.
Tip
Use Clario’s AntiSpy Setup and Scan tools to adjust your privacy settings and detect malware, so bad actors have no chance of scanning or stealing the unpublished or unshared information stored on your phone.
Final thoughts
Initiatives that aim to protect children shouldn’t spark controversy, but sometimes it’s not that simple. When those initiatives involve prying into people’s personal lives and data on devices like the MacBook Pro and iMac, implementing them gets tricky.
That’s the situation Apple found itself in, and that’s why it shelved CSAM detection to protect users’ privacy. Fortunately, the company continues to tackle CSAM through its anti-CSAM communication features, which are less invasive. However, Apple users should still play their part, for example by avoiding iOS 17 sideloading to prevent other cyberattacks.