Apple has introduced pornographic content detection on iOS, iPadOS and macOS to combat child sexual abuse. The problem is serious: since the PKP (movement control order) was implemented more than 500 days ago, the number of cases has reportedly increased. The problem of unsolicited pornographic images has also grown, with women being sent images of private parts. This issue has lingered for a long time without any serious effort to combat it.
Apple today announced it will introduce pornographic image detection features in the Messages and iCloud apps to combat the spread of Child Sexual Abuse Material (CSAM) across its devices. The machine learning system is built into the iOS 15, iPadOS 15 and macOS Monterey operating systems, which will be released later this year.
This machine learning system will analyze images sent to child users via Messages to determine whether they contain pornographic material. Flagged images will be blurred and a warning shown to the user. Parents of children included in the family account will in turn receive notifications of potentially pornographic content received by their children.
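The flow described above can be sketched as a simple on-device decision: classify the incoming image, and if the recipient is a child account and the score crosses a threshold, blur the image, warn the user, and queue a parent notification. This is a minimal illustration only; the classifier, its threshold, and the field names here are all assumptions, not Apple's published implementation.

```python
from dataclasses import dataclass

# Hypothetical threshold; Apple has not published its model's details.
EXPLICIT_THRESHOLD = 0.9

@dataclass
class IncomingImage:
    sender: str
    explicit_score: float  # assumed output of an on-device classifier, 0.0-1.0

def handle_message_image(img: IncomingImage, is_child_account: bool) -> dict:
    """Sketch of the Messages flow: blur and warn for child accounts,
    and flag the event so a parent notification can be sent."""
    if is_child_account and img.explicit_score >= EXPLICIT_THRESHOLD:
        return {"blurred": True, "warn_user": True, "notify_parent": True}
    # Adult accounts, and images below the threshold, pass through untouched.
    return {"blurred": False, "warn_user": False, "notify_parent": False}
```

Note that everything happens on the device itself; in this sketch no image data leaves the handler, mirroring Apple's claim that scanning is done locally.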
Next, images on the Apple device will be scanned by the same machine learning system to detect images of child sexual abuse that have been reported to the National Center for Missing and Exploited Children (NCMEC) in the United States. When an image on a user's device or in their iCloud account matches the NCMEC database, the authorities will be contacted. According to Apple, this combination of machine learning and image matching is accurate enough that an error will occur for only one in every trillion users.
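The matching step amounts to comparing a fingerprint of each image against a database of fingerprints of known abuse images. The toy sketch below uses a plain SHA-256 hash as the fingerprint; this is a deliberate simplification, since Apple's real system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, wrapped in a cryptographic matching protocol. The database contents here are, of course, placeholder bytes.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint: an exact hash. A real system would use a
    # perceptual hash so that near-duplicates of a known image still match.
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder database of fingerprints of known, reported images.
known_fingerprints = {fingerprint(b"known-reported-image-bytes")}

def matches_database(image_bytes: bytes) -> bool:
    """On-device check: compute the image's fingerprint and compare it
    against the known database; only matches would ever be reported."""
    return fingerprint(image_bytes) in known_fingerprints
```

The key privacy property the article describes follows from this shape: an ordinary photo produces a fingerprint that appears nowhere in the database, so nothing about it is ever surfaced.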
Finally, if an Apple device user searches for how to report CSAM via Siri and Search, the device will provide accurate information. Users who search for CSAM itself will be warned that the content they are seeking is illegal, and given suggestions on where help can be obtained to treat this unnatural sexual desire.
For now, all of the above features will be provided only to users of iOS 15, iPadOS 15 and macOS Monterey devices in the United States. Apple insists user privacy is still protected because scanning is done entirely on the device, with no data accessible to Apple. Only child sexual abuse content that matches the NCMEC database will be reported to the authorities.
You have read "Pornographic Detection on iOS".