Apple announces new protections for child safety: iMessage features, iCloud Photo scanning, more

Apple today announced a trio of new initiatives aimed at protecting children on iPhone, iPad, and Mac: new communication safety features in Messages, detection of known CSAM in iCloud Photos, and updated knowledge information for Siri and Search.

Apple stresses that, while the new program is ambitious, protecting children is an important responsibility, and it promises to continue to evolve and expand these efforts over time.

Messages

The first announcement is a new communication safety feature in the Messages app. Apple explains that when a child who is part of an iCloud Family receives or attempts to send sexually explicit photos, the child will see a warning message.

Apple explains that when a child receives a sexually explicit image, the image will be blurred and the Messages app will warn that it may be sensitive. If the child taps “View Photo,” a pop-up message explains why the image has been marked as sensitive.

The pop-up also informs the child that, if they choose to view the image, their iCloud Family parent will be notified, and it contains a link to additional help.

Similar protections apply when a child attempts to send sexually explicit photos: the child is warned before the photo is sent, and parents can receive a message if the child chooses to send it anyway.

Apple further explains that Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit.
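As a rough illustration only, the sketch below shows how an on-device classifier score might gate the blur-and-notify flow described above. The classifier stub, the 0.9 threshold, and the function names are assumptions made for the example; Apple has not published these details.

```python
from dataclasses import dataclass

@dataclass
class AttachmentDecision:
    blur: bool                    # blur the image in the conversation view
    warn_child: bool              # show the sensitivity warning
    notify_parent_on_view: bool   # notify the iCloud Family parent if viewed

def explicit_score(image_bytes: bytes) -> float:
    """Stub for a hypothetical on-device model returning a probability in [0, 1].
    A real implementation would analyze the image entirely on the device."""
    return 0.0

def evaluate_attachment(image_bytes: bytes, is_child_account: bool,
                        threshold: float = 0.9) -> AttachmentDecision:
    flagged = explicit_score(image_bytes) >= threshold
    return AttachmentDecision(
        blur=flagged,
        warn_child=flagged and is_child_account,
        notify_parent_on_view=flagged and is_child_account,
    )
```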

Apple says the feature will arrive “later this year to accounts set up as families in iCloud” in updates to iOS 15, iPadOS 15, and macOS Monterey. The feature will initially be available in the US.

CSAM detection

Second, and perhaps most notably, Apple is announcing new steps to combat the spread of child sexual abuse material (CSAM).

Apple will now be able to detect known CSAM images stored in iCloud Photos, a feature that leaked in part earlier today. Matches will be reported to the National Center for Missing and Exploited Children (NCMEC), which serves as a comprehensive reporting center for CSAM and works in close collaboration with law enforcement.

Apple stresses that its method of detecting CSAM is designed with user privacy in mind.

Apple explains:

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes.

If there is an on-device match, the device creates a cryptographic safety voucher that encodes the match result.
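Purely as an illustrative sketch, the snippet below captures the “match on device, record the result in a voucher” flow described above. Apple’s actual system uses a perceptual image hash and private set intersection (the “PSI” Benny Pinkas refers to below) so the server cannot read individual results; the plain SHA-256 digest, the set lookup, and the type names here are stand-in assumptions.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    image_id: str
    matched: bool   # in Apple's design this result is encrypted and unreadable
                    # by the server until the account crosses the match threshold

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: a real perceptual hash tolerates resizing
    # and re-encoding, which a cryptographic digest like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def make_voucher(image_id: str, image_bytes: bytes, known_hashes: set) -> SafetyVoucher:
    """Run the on-device check before upload and record the result in a voucher."""
    return SafetyVoucher(image_id=image_id,
                         matched=image_hash(image_bytes) in known_hashes)
```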

A technique called threshold secret sharing ensures that Apple cannot interpret the contents of the safety vouchers unless the account crosses a threshold of known CSAM content. For example, if a secret is split into 1,000 shares and the threshold is 10, the secret can be reconstructed from any 10 of those shares; with only nine shares, nothing is revealed about the secret.
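As a minimal sketch of this kind of threshold scheme, the following implements Shamir-style secret sharing over a prime field and shows that any 10 shares recover the secret while 9 do not. The prime, the share counts, and the function names are illustrative choices, not Apple’s published parameters.

```python
import random

P = 2**127 - 1  # prime modulus for the field (a toy choice for the sketch)

def split(secret: int, n: int, t: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(1, P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(secret=123456789, n=1000, t=10)
print(reconstruct(shares[:10]))  # 123456789: any 10 shares recover the secret
print(reconstruct(shares[:9]))   # a meaningless value: 9 shares reveal nothing
```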

Apple is not disclosing the specific threshold it will use. Once that threshold is reached, Apple will manually review the report to confirm the match, disable the user’s account, and send a report to the National Center for Missing and Exploited Children.

The threshold technology is critical to ensuring that accounts are not incorrectly flagged. Users who believe their account has been mistakenly flagged can appeal to have it restored. Apple says the error rate is less than one in one trillion accounts per year.
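To see why a threshold helps, here is a back-of-the-envelope illustration, not Apple’s published analysis: if each stored photo had some small, independent chance of a false match, the probability that an account accumulates enough false matches to cross the threshold falls off extremely fast as the threshold rises. The per-image rate and photo count below are made-up values for the example.

```python
from math import exp, factorial

def false_flag_probability(per_image_rate: float, photos: int, threshold: int,
                           extra_terms: int = 50) -> float:
    """P(at least `threshold` false matches among `photos` images), using a
    Poisson approximation to the binomial tail (reasonable for small rates)."""
    lam = per_image_rate * photos
    term = exp(-lam) * lam ** threshold / factorial(threshold)
    total = 0.0
    for k in range(threshold, threshold + extra_terms):
        total += term
        term *= lam / (k + 1)   # P(X = k+1) = P(X = k) * lam / (k + 1)
    return total

# Assumed values, purely for illustration.
for t in (1, 5, 10, 30):
    p = false_flag_probability(per_image_rate=1e-6, photos=10_000, threshold=t)
    print(f"threshold {t:>2}: ~{p:.1e} chance of a false flag per account")
```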

Apple’s matching only applies to photos stored in iCloud Photos. The company positions this as different from approaches that constantly scan all of a user’s photos in the cloud.

Apple’s implementation is technically complex; the company has published additional technical documentation with more details.

Apple says the feature will be available only in the United States at first, but the company plans to expand it to other countries eventually.

Siri and Search

Finally, Apple is rolling out upgrades to Siri and Search.

Apple is expanding guidance in Siri and Search to offer additional resources to parents and children, including help with filing a report of CSAM or child exploitation.

Siri and Search are also being updated to intervene when users search for queries related to CSAM. These interventions will explain that interest in the topic is harmful and problematic, and will provide resources from partners for getting help.

The Siri and Search updates will also be available with iOS 15, iPadOS 15, and macOS Monterey.

Testimonials

John Clark, President & Chief Executive Officer of the National Center for Missing & Exploited Children: “Apple’s expanded protection for children is a game changer. These safety measures can save the lives of children who are lured online and whose horrifying images are shared in child sexual abuse material. The National Center for Missing & Exploited Children knows that this crime can only be combated if we remain steadfast in protecting children, and that is only possible with the help of technology partners such as Apple. Privacy and child protection can co-exist. We are proud to work with Apple and look forward to working together to make the world a safer place for children.”

Julie Cordua, CEO of Thorn: “Thorn believes in the right to online privacy, including for children whose sexual abuse has been recorded and distributed online. Apple’s pledge to deploy technology solutions that balance privacy with digital safety for children brings us a step closer to justice for victims whose most distressing moments were disseminated online, a step closer to a world where every online platform with an upload button is committed to the proactive detection of CSAM in all environments, and a step closer to a world where every child gets to just be a kid.”

Stephen Balkam, Founder and CEO of the Family Online Safety Institute: “We support Apple’s ongoing improvement of child online safety. Given the challenges parents face in protecting their children online, it is vital that tech companies continually improve their safety tools.”

Eric Holder, former Attorney General: “The recent rise in online child sexual abuse material demands innovation from technologists. Apple’s new effort to detect CSAM is a major milestone, demonstrating that child safety does not have to come at the cost of privacy. Apple is showing its commitment to privacy while making the world a better place for children.”

George Terwilliger, former Deputy Attorney General: “Apple’s announcement is a welcome step toward helping parents and law enforcement in their efforts to prevent harm to children from CSAM purveyors. Apple’s expanded efforts in reporting CSAM will help law enforcement better identify and stop dangerous people in society.”

Benny Pinkas, professor in the Department of Computer Science at Bar Ilan University: “The Apple PSI system strikes an excellent balance between privacy and utility. It will greatly aid in identifying CSAM content while maintaining user privacy and keeping false positives to a minimum.”

Mihir Bellare, professor at UC San Diego: “Taking steps to limit CSAM is a laudable effort, but its implementation needs some care. Done naively, it requires scanning the photos of all iCloud users. Our photos are personal, recording events, moments, and people in our lives, and users expect them to remain private. Reciprocally, the database of CSAM photos should not be made public or become known to users. Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints.”

David Forsyth, Chair in Computer Science at the University of Illinois at Urbana-Champaign College of Engineering: “Apple’s approach to privacy is superior to any other I know […] My judgment is that this system will likely significantly improve the chances that people trafficking in [CSAM] are located; this should benefit children. Harmless users should experience little to no loss of privacy, because visual derivatives are disclosed only if there are enough matches to CSAM photos, and only for the images that match known CSAM photos. Given the accuracy of the matching system and the threshold, it is very unlikely that any photos not matching known CSAM photos will be revealed.”
