Apple is introducing a new feature to help reduce sexual abuse of children

Apple has always been concerned about user safety and has worked to give iOS users strong tools for protecting their data. Now the company has introduced another new feature, called Child Safety, which will arrive with an update to iOS 15. With this update, Apple is introducing technology that can identify known child sexual abuse imagery on Apple devices.

Every picture is checked on the device before it is uploaded to iCloud, and only images that pass the check are synced as usual. If the system detects a problem with an image, the upload will be flagged and the user notified. An iOS device is unlikely to misidentify an image, because the system only matches pictures whose content has already been flagged and registered in a database of known abuse material.
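To make the idea concrete, here is a minimal Python sketch of matching an image fingerprint against a database of known, flagged images. Apple's actual system uses a perceptual hash called NeuralHash together with cryptographic protocols, not a plain file hash, so everything below (the database, function names, and use of SHA-256) is a simplified, hypothetical illustration of the concept.

```python
import hashlib

# Hypothetical database of fingerprints of known, already-flagged images.
# In Apple's real system this is a database of NeuralHash values supplied
# by child-safety organizations, not SHA-256 digests.
KNOWN_FLAGGED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image. A real system uses a perceptual
    hash that tolerates resizing and re-encoding; SHA-256 only matches
    byte-identical files and is used here purely for simplicity."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    """Check the fingerprint against the database of flagged images."""
    return image_fingerprint(image_bytes) in KNOWN_FLAGGED_HASHES

if __name__ == "__main__":
    sample = b"example image bytes"
    print(matches_known_image(sample))  # False: not in the database
```

Because matching is done only against a fixed database of known material, ordinary personal photos have no counterpart to match and are never flagged by this mechanism.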

In addition, an account will only be reported once the number of matched images exceeds a certain threshold set by the company. Apple assures users that once it receives a notification of CSAM (child sexual abuse material) in iCloud from a user's phone, it will conduct a manual review to confirm the matches, report the account to the NCMEC (National Center for Missing and Exploited Children) for further action, and immediately deactivate the user's account.
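The threshold logic itself is simple to picture: matches are counted per account, and nothing is escalated for human review until the count crosses the limit. A minimal sketch follows; the class name and the threshold value are hypothetical, not figures Apple has confirmed for this release.

```python
class MatchCounter:
    """Counts flagged matches for one account and signals escalation only
    once a threshold is exceeded, mirroring the idea that a single match
    never triggers a report."""

    THRESHOLD = 30  # hypothetical value for illustration

    def __init__(self) -> None:
        self.match_count = 0

    def record_match(self) -> bool:
        """Record one match; return True once the account should be
        escalated for human review."""
        self.match_count += 1
        return self.match_count > self.THRESHOLD

counter = MatchCounter()
escalate = False
for _ in range(31):
    escalate = counter.record_match()
print(escalate)  # True only after the threshold is exceeded
```

The point of the threshold is to make a false report extremely unlikely: one coincidental match changes nothing, and only a sustained pattern of matches ever reaches a human reviewer.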

Considering how many child sexual abuse cases have surfaced recently, this is a big step for Apple, but it is not the only one. Apple is also developing similar safety features for iMessage, aimed at protecting its youngest users. The company is building a feature that can blur or hide any images it deems inappropriate for young people, especially images received from unknown numbers.


The feature then tells the user about the content, points them to helpful resources, and reassures them that it is fine to leave the image unopened. If the young person views the image anyway, their parents will be notified immediately. All of this should help curb widespread abuse and harassment, and it marks a big step forward in what Apple's platform can do to protect children. It is encouraging to see the mobile giant take action.
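Put together, the iMessage flow the article describes is a simple decision: blur and warn first, notify parents only if the image is viewed. The sketch below captures that flow; all names are hypothetical, since Apple has not published an API for this feature.

```python
from dataclasses import dataclass

@dataclass
class IncomingImage:
    sender_known: bool
    flagged_explicit: bool  # result of an assumed on-device classifier

def handle_incoming_image(image: IncomingImage, child_account: bool,
                          parental_alerts: bool) -> str:
    """Sketch of the warn-and-notify flow described above."""
    if not (child_account and image.flagged_explicit):
        return "show image normally"
    # Blur the image, warn the child, and offer resources.
    action = "blur image, warn user, offer resources"
    if parental_alerts:
        action += "; notify parents if the image is viewed"
    return action

print(handle_incoming_image(IncomingImage(False, True), True, True))
```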

