Today, data privacy activists are furious at Apple’s ‘appalling’ plans to automatically scan iPhones and cloud storage for child abuse images. They accuse the tech giant of creating a backdoor to users’ personal data.
- New safety tools revealed to protect children and limit the spread of child sexual abuse material
- These measures will initially be implemented in the US only, according to the tech giant.
- Apple intends to make the technology available in the UK and around the world soon.
- Security experts called the plan ‘absolutely unacceptable’ and ‘regressive’.
The new safety tools can also examine photos sent via text messages, protecting children from ‘sexting’ by automatically blurring images that Apple’s algorithm detects could be child sexual abuse material (CSAM).
According to the iPhone maker, the new detection tools are designed to protect privacy and prevent the tech giant from seeing or scanning a user’s photo album. Instead, the system will search securely on the device for matches against a list of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organizations.
Child safety campaigners have encouraged tech giants to take more steps to stop illegal images from being shared. However, privacy concerns about the policy are growing.
Critics warn that the policy could open the door to spying on iPhone users, and that false positives could wrongly flag innocent parents who share photos of their children.
Others worry that totalitarian governments with poor human rights records could use the technology to prosecute gay people in countries where homosexuality is a crime.
Although the initial measures are being implemented in the US only, Apple plans to make the technology available soon in the UK and in other countries around the world.
Ross Anderson, a professor of security engineering at Cambridge University, has called the plan ‘absolutely unacceptable’. Alec Muffett, a privacy activist and security researcher who has previously worked for Deliveroo and Facebook, described the proposal as a ‘huge and regressive’ step for individual privacy.
Anderson stated that it was an absolutely terrible idea because it would lead to widespread bulk surveillance of all our laptops and phones.
Campaigners are concerned that the plan could be easily modified to detect other material.
The tech giant revealed a trio of new safety tools in an effort to protect children and reduce the spread of child sexual abuse material (CSAM).
The new Messages system will warn children when they receive sexually explicit photos. It will blur the image, reassure the child that it is okay not to view it, and present them with helpful resources.
The new plans will also warn parents who have linked family accounts.
It will also inform children that their parents will receive a notification if they choose to view the image.
Apple stated that similar protections will apply if a child attempts to send a sexually explicit picture.
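Pieced together from Apple’s description, the Messages flow amounts to a simple on-device decision: blur and warn first, show the photo only if the child chooses, and notify a linked parent only in that case. The Swift sketch below is an illustration only – the names (flaggedAsExplicit, childWantsToView, parentIsLinked) are hypothetical stand-ins, and Apple has not published the real classifier or APIs.

```swift
// A sketch of the flow described above, not Apple's actual code.
// flaggedAsExplicit, childWantsToView and parentIsLinked are hypothetical stand-ins.
struct IncomingPhoto {
    let flaggedAsExplicit: Bool   // result of the on-device image classifier
}

func deliver(_ photo: IncomingPhoto,
             childWantsToView: () -> Bool,
             parentIsLinked: Bool) {
    guard photo.flaggedAsExplicit else {
        print("Photo shown normally")          // nothing flagged: normal delivery
        return
    }
    // The image is blurred and the child is warned and shown helpful resources.
    print("Photo blurred; warning and resources shown")

    // The child is told that viewing the photo will notify their parents.
    if childWantsToView() {
        print("Photo shown at the child's request")
        if parentIsLinked {
            print("Notification sent to linked parent account")
        }
    }
}

// Example: a flagged photo arriving on an account with a linked parent.
deliver(IncomingPhoto(flaggedAsExplicit: true),
        childWantsToView: { true },
        parentIsLinked: true)
```

The notable design choice, as Apple describes it, is that the classification, blurring and warning all happen on the device itself, with a notification sent to parents only if the child decides to view the image.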
The company will also be able to use new technology to identify known CSAM images in iCloud Photos and report them to law enforcement authorities.
Siri and Search will now offer new guidance, which will direct users to useful resources when they search for CSAM.
According to the iPhone maker, the new detection tools are designed to protect privacy and do not allow the firm to scan or see a user’s photos.
Instead, the system will search on the device for matches against a list of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organizations.
This will only occur if a user attempts to upload an image to their iCloud Photo Library.
Apple stated that it would only be able to manually review flagged images for harmful content once a threshold of matches is exceeded. Only then could it send a report to child safety organizations.
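Apple has not published the underlying code, but the matching step it describes can be pictured roughly as in the Swift sketch below. This is an illustration only: the real system is said to use a proprietary perceptual hash and encrypted ‘safety vouchers’, whereas this stand-in simply compares ordinary SHA-256 digests against a set of known fingerprints, and the names (OnDeviceMatcher, checkBeforeUpload, reviewThreshold) are hypothetical.

```swift
import CryptoKit
import Foundation

// A sketch only, not Apple's implementation. Apple describes a proprietary
// perceptual hash plus cryptographic vouchers; this stand-in just compares
// ordinary SHA-256 digests against a set of known fingerprints.
struct OnDeviceMatcher {
    let knownHashes: Set<String>   // fingerprints supplied by child safety organizations
    let reviewThreshold: Int       // matches required before any manual review is possible
    private(set) var matchCount = 0

    init(knownHashes: Set<String>, reviewThreshold: Int) {
        self.knownHashes = knownHashes
        self.reviewThreshold = reviewThreshold
    }

    // Runs only when a photo is about to be uploaded to iCloud Photos.
    // Returns true only once the account has exceeded the review threshold.
    mutating func checkBeforeUpload(_ imageData: Data) -> Bool {
        let fingerprint = SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()

        guard knownHashes.contains(fingerprint) else {
            return false           // no match: nothing is flagged, nothing leaves the device
        }
        matchCount += 1
        return matchCount > reviewThreshold
    }
}

// Example: two placeholder fingerprints and an arbitrary threshold of 30 matches.
var matcher = OnDeviceMatcher(knownHashes: ["ab12...", "cd34..."], reviewThreshold: 30)
let needsReview = matcher.checkBeforeUpload(Data("example image bytes".utf8))
print(needsReview)   // false: this photo matches nothing on the known list
```

In this picture, an unmatched photo is never flagged and nothing about it leaves the device, and even matched photos cannot be seen by a human reviewer until the account crosses the threshold.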
The new tools will be available later this year as part of the iOS 15 and iPadOS 15 software updates. They will initially be introduced in the US only, with plans to expand over time.
The company stated that the new CSAM detection tool would apply only to photos stored in iCloud Photos, and would not allow the firm, or anyone else, to scan images in a user’s camera roll.
This is the latest update from Apple that aims to improve safety and privacy for iPhone users. It follows a series of security updates earlier this year, which were designed to reduce third-party data collection.