Apple is introducing a new feature to help reduce sexual abuse of children

Apple has always been very concerned about the safety of its users and has tried to provide the best tools to ensure the data security of iOS users. This time, Apple has introduced another new feature, called Child Safety, which will arrive in an update to iOS 15. As part of this feature, Apple has introduced technology that can identify child sexual abuse images on Apple devices.

Every picture is checked on the device before it is uploaded to the cloud, and only images that pass the check are moved to iCloud without being flagged. If the system detects a problem with an image, the upload is flagged and the user is notified. Your iOS device is unlikely to flag the wrong image, because the system does not try to interpret the photo itself; it only matches the image against content that has already been identified and registered in a database.
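Apple has not published the full details of its matching algorithm (known as NeuralHash), but the general idea of comparing an image's fingerprint against a database of already-registered material can be sketched roughly as follows. The hash function, database contents, and function names here are placeholders for illustration, not Apple's actual implementation.

```python
import hashlib

# Placeholder: the real system uses a perceptual hash (Apple's NeuralHash),
# which tolerates resizing and re-encoding. A cryptographic hash is used here
# only to keep the sketch self-contained and runnable.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints for content that has already been
# identified and registered by a child-safety organization.
KNOWN_FLAGGED_FINGERPRINTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if this image matches a registered entry in the database."""
    return image_fingerprint(image_bytes) in KNOWN_FLAGGED_FINGERPRINTS

if __name__ == "__main__":
    sample = b"example image data"
    print("flagged before upload:", is_flagged(sample))
```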

In addition, an account is only reported when the number of matched images exceeds a certain threshold set by the company. The company assures its users that once it receives a notification of suspected CSAM (child sexual abuse material) uploaded to iCloud from a user's device, it will conduct a further check to confirm the match, report it to the NCMEC (National Center for Missing and Exploited Children) for further action, and immediately deactivate the user's account.
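The threshold mechanism described above can be illustrated with a short sketch: matches are only escalated to a further check and an NCMEC report once an account crosses a minimum number of matched images. The threshold value and the escalation steps below are assumptions made for illustration, not Apple's published parameters.

```python
from collections import Counter

# Hypothetical threshold: an account is only escalated once this many of its
# images have matched the database.
MATCH_THRESHOLD = 30

account_matches = Counter()  # matched-image count per account

def record_match(account_id: str) -> None:
    """Count one matched image for the account and escalate at the threshold."""
    account_matches[account_id] += 1
    if account_matches[account_id] >= MATCH_THRESHOLD:
        escalate(account_id)

def escalate(account_id: str) -> None:
    # Placeholder for the steps the article describes: a further check,
    # a report to NCMEC, and deactivation of the account.
    print(f"{account_id}: further check -> report to NCMEC -> disable account")

if __name__ == "__main__":
    for _ in range(MATCH_THRESHOLD):
        record_match("user-123")
```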

Considering how many child sexual abuse cases have come to light recently, this is a big step for Apple, but it is not the only move. Apple is also developing similar safety features in iMessage, aimed at its youngest audience. The company is working on a feature that can hide any images it deems inappropriate for young users, especially images received from unknown numbers.

The feature then warns the young user about the content, points them to helpful resources, and reassures them that it is fine if they no longer want to open the image. The young person's parents will also be notified immediately. All of this should help prevent widespread abuse and harassment, and it shows children and adults alike that Apple has taken a big step forward. It is encouraging that this mobile giant has taken action.

