Apple launches new tools to fight child sexual abuse material

Apple Inc. stakes its reputation on privacy. The company has long limited how mobile apps gather data and has fought law enforcement agencies seeking user records. For the past week, however, Apple has been defending itself against accusations that its upcoming iOS and iPadOS updates will undermine user privacy.

The debate started on Thursday, when Apple announced a suite of new tools intended to protect children and reduce the spread of child sexual abuse material (CSAM) online. According to the company, its goal is to create technology that empowers people, enriches their lives, and helps them stay safe; it wants to help protect children from predators, who use various communication tools to recruit and exploit them, and to limit the spread of CSAM. The new tools nonetheless drew heavy criticism, in part because Apple left important details out of its original documents.

These child-safety features can be understood in three parts –

Siri and Search on Apple devices running iOS 15 later in 2021 will surface organizations and resources for victims of abuse, and will also intervene when users search for CSAM-related queries. These interventions explain to the user that interest in this topic is harmful and point to resources from partners for getting help. The changes are coming later this year as updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

In Messages, communication safety aims to detect sexually explicit and sensitive content sent through the messaging app. Users who are part of an iCloud family group and aged 17 or under who try to send such content will be warned about the nature of the media; for children aged 12 and under, parents will also be sent a notification that sensitive media has been shared. When a sensitive photo or video is received, it is hidden from view, and a full-screen notice is shown if the user attempts to see it. Again, a notification is sent to parents if the user is 12 or under.
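As a rough sketch of that decision flow (with the age cut-offs paraphrased from this article, and every type and function name a hypothetical placeholder rather than Apple's actual API), the logic might look something like this:

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    in_icloud_family: bool

def handle_sensitive_media(account: ChildAccount, is_sensitive: bool) -> dict:
    """Decide what the Messages UI should do for a flagged incoming or outgoing item."""
    if not (is_sensitive and account.in_icloud_family and account.age <= 17):
        return {"blur": False, "warn": False, "notify_parents": False}
    return {
        "blur": True,                          # hide the media behind a full-screen notice
        "warn": True,                          # explain the nature of the content first
        "notify_parents": account.age <= 12,   # parents are notified only for younger children
    }

# Example: a 10-year-old in an iCloud family group receiving flagged media.
print(handle_sensitive_media(ChildAccount(age=10, in_icloud_family=True), is_sensitive=True))
# {'blur': True, 'warn': True, 'notify_parents': True}
```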

The third feature scans iCloud Photos images for known child sexual abuse material and reports matches to Apple's moderators, who can pass them on to the National Center for Missing and Exploited Children (NCMEC). According to the company, the feature was designed to detect illegal content while still protecting user privacy: an unreadable database of hashes provided by NCMEC is stored on the device itself.

Before an image is stored in iCloud Photos, an on-device matching process compares that image against the known CSAM hashes. The matching is powered by private set intersection, a cryptographic technique that determines whether there is a match without revealing the result.

More details about Apple’s new system –

A typical CSAM scan runs remotely, on files stored on a server. Apple’s system instead checks for matches locally, on the iPhone or iPad. On a device with iCloud Photos enabled, a tool called NeuralHash converts each picture into a hash: a string of numbers that captures the unique characteristics of an image but cannot be reconstructed to reveal the photo itself. The device then compares these hashes against a stored list of hashes from the National Center for Missing and Exploited Children.
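To give a sense of how hash-based matching works in general, here is a toy sketch using a simple average hash; it is nothing like Apple's actual NeuralHash, and the file name and known-hash values are placeholders:

```python
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Toy perceptual hash: shrink, grayscale, threshold each pixel against the mean.
    Similar images map to similar bit strings, but the original pixels cannot be
    recovered from the hash."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

# Hypothetical on-device check against a stored list of known hashes.
known_hashes = {0x0F0F0F0F0F0F0F0F}               # placeholder values, not real data
photo_hash = average_hash("photo_to_upload.jpg")  # placeholder file name
print(photo_hash in known_hashes)
```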

If Apple’s system finds a match, the phone generates a safety voucher that is uploaded to iCloud Photos. Only after a certain threshold number of vouchers accumulates can they be decrypted and flagged to Apple’s human moderators, who review the photos and determine whether they contain CSAM. According to the company, the system only looks at images synced with iCloud, not those stored solely on the phone: if iCloud Photos is disabled, the scanning system does not run, NeuralHash does not execute, and no vouchers are generated.
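The threshold behaviour can be illustrated with textbook Shamir secret sharing, used here purely as a stand-in for Apple's more elaborate threshold scheme; the prime field, the account key, the share count, and the threshold value are all illustrative:

```python
import random

PRIME = 2**61 - 1   # toy prime field for the demo
THRESHOLD = 3       # hypothetical review threshold, not Apple's actual number

def split_secret(secret: int, n_shares: int, threshold: int = THRESHOLD):
    """Shamir secret sharing: any `threshold` shares can reconstruct `secret`."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Each matching photo's voucher carries one share of the account's decryption key.
account_key = 123456789
vouchers = split_secret(account_key, n_shares=10)

print(reconstruct(vouchers[:THRESHOLD - 1]) == account_key)  # False (with overwhelming probability)
print(reconstruct(vouchers[:THRESHOLD]) == account_key)      # True: enough vouchers to open for review
```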

Put another way, the matching process is powered by private set intersection, a cryptographic technique that determines whether there is a match without disclosing the result. The device then creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image, and this voucher is uploaded to iCloud Photos with the picture.
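As a very rough illustration of the idea behind private set intersection (not Apple's actual protocol, which is far more involved), here is a toy Diffie-Hellman-style sketch; the group parameters, keys, and hash values are all placeholders:

```python
import hashlib
import secrets

# Toy multiplicative group; a real deployment would use a proper elliptic-curve group.
P = 2**127 - 1   # a Mersenne prime
G = 3

def blind(item: bytes, key: int) -> int:
    """Hash the item into the group, then raise it to a secret exponent."""
    h = int.from_bytes(hashlib.sha256(item).digest(), "big")
    return pow(pow(G, h, P), key, P)

server_key = secrets.randbelow(P - 2) + 1
client_key = secrets.randbelow(P - 2) + 1

server_set = [b"known-hash-1", b"known-hash-2"]        # placeholder known hashes
client_set = [b"known-hash-2", b"holiday-photo-hash"]  # placeholder device hashes

# Each side blinds its own items, then the other side blinds them a second time,
# so only double-blinded values g^(h(x)*a*b) are ever compared.
double_server = {pow(v, client_key, P) for v in (blind(x, server_key) for x in server_set)}
double_client = {pow(v, server_key, P) for v in (blind(x, client_key) for x in client_set)}

print(len(double_server & double_client))  # 1 -> one item in common, found without exposing the rest
```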

Some privacy and security experts have praised the updates, but other experts and advocacy groups have criticized them. In their view, the Messages and iCloud changes create surveillance systems that run directly on phones and tablets, which could provide a blueprint for breaking secure end-to-end encryption: the scanning is limited for now, but it could lead to more troubling invasions of privacy in the future.

End-to-end encryption makes data readable only to the sender and receiver; the company running the app, and anyone else, cannot see it. Without it, companies hold keys to the data, which lets them grant access to law enforcement or scan files. Apple’s iMessage uses end-to-end encryption, but iCloud Photos does not. That leaves a door open for specific kinds of surveillance, including the client-side scanning Apple is accused of adding.

Apple has denied that the Messages feature amounts to client-side scanning. The company says Messages remains end-to-end encrypted and that no details about specific message content are released to anyone, including parents; Apple never gains access to the communications. It has also denied that it is scanning all photos for CSAM: the system only scans pictures that users upload to iCloud, and for users who have disabled iCloud Photos, the feature will not work, the company clarifies. In a further clarification, the company said it can scan iCloud Photos images synced via its own app or third-party services. It acknowledged that iCloud Photos is not end-to-end encrypted, so such scans could be run efficiently on its servers, but it argues its on-device system is more secure.

The benefits of the new feature –

These features can strengthen child-protection efforts and help address the spread of CSAM. iMessage has a sexual-predator problem: grooming is an active threat on the platform and often involves sending children sexually explicit images, and the new features aim to disrupt it. Apple’s statement reads, “What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy,” and continues, “We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.”

The new technology lets Apple provide valuable, actionable information to law enforcement and the National Center for Missing and Exploited Children about the proliferation of known CSAM. It does so while offering significant benefits over existing techniques, since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.
