<p>An upcoming Apple iOS update will allow parents to protect their children and help them learn to navigate online communication in Messages.</p>
<p>The second developer beta of iOS 15.2 includes support for the new communication safety feature in Messages.</p>
<p>With this update, Apple's Messages app will be able to use on-device machine learning to analyse image attachments and determine whether a photo being shared is sexually explicit, reports TechCrunch.</p>
<p>This technology does not require Apple to access or read the child's private communications, as all the processing happens on the device.</p>
<p>"It's a Family Sharing feature for parents to opt in," the report noted.</p>
<p>If a sensitive photo is discovered in a message thread, the image will be blocked and a label will appear below it stating "this may be sensitive", with a link to tap to view the photo.</p>
<p><strong>Also Read | <a href="https://www.deccanherald.com/business/could-apple-s-child-safety-feature-backfire-new-research-shows-warnings-can-increase-risky-sharing-1035525.html" target="_blank">Could Apple’s child safety feature backfire? New research shows warnings can increase risky sharing</a></strong></p>
<p>If a child chooses to view the photo, another screen will appear with more information about sensitive photos and videos.</p>
<p>The new Messages tool arrives against the backdrop of child safety features that Apple delayed following negative feedback.</p>
<p>The planned features included scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), a Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.</p>
<p>Apple confirmed that feedback about the plans from customers, non-profit and advocacy groups, researchers and others had prompted the delay, giving the company time to make improvements.</p>
<p>Following their announcement, the features were criticised by a wide range of individuals and organisations, including security researchers, privacy whistleblower Edward Snowden, Facebook's former security chief and politicians.</p>
<p>Apple has since attempted to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and giving interviews with company executives.</p>
<p>The suite of child safety features was originally set to debut in the US with an update to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.</p>
<p>It is now unclear when Apple plans to roll out these "critically important" features.</p>