Facebook is one of the most popular social media applications in the world. The tech giant has taken a significant step toward ensuring the privacy and safety of its users by launching a Transparency Center on Wednesday, which aims to give users the resources they need to understand how the platform enforces honesty and transparency. Facebook said the Transparency Center will explain its policies, including how and when they are updated, how human reviewers and automated technology identify inappropriate content on its apps, and how removal of that content is managed.
Transparency Center staff will also monitor the platform for misinformation, remove content found to violate the rules, and publish data and metrics on these enforcement actions. Guy Rosen, Facebook's Vice President of Integrity, also released several related reports. He shared the Community Standards Enforcement Report for the first quarter of 2021 to assure the public that the company is making the right decisions and taking daily steps to keep users safe on its platforms. The report covers enforcement of 12 policies on Facebook and 10 on Instagram, to help educate the public on how the tech giant applies its guidelines. Rosen shared prevalence figures: the percentage of content views containing violating material was just 0.03% to 0.04% for adult nudity on both Facebook and Instagram, 0.05% to 0.06% for hate speech on Facebook, and 0.03% to 0.04% for violent and graphic content on Facebook versus 0.01% to 0.02% on Instagram. He noted that this reflects a marked improvement over the 23.6% figure recorded in 2017.
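Prevalence figures like these are easiest to read as views per 10,000. As an illustrative sketch only (the helper function below is hypothetical and not Facebook's actual methodology or API), prevalence is simply violating views as a share of all content views:

```python
def prevalence_pct(violating_views: int, total_views: int) -> float:
    """Share of content views that contained violating material, as a percentage."""
    return 100.0 * violating_views / total_views

# A 0.05% hate-speech prevalence means roughly 5 violating views
# out of every 10,000 content views.
print(f"{prevalence_pct(5, 10_000):.2f}%")  # 0.05%
```

Reporting views rather than raw post counts is what lets these numbers stay comparable across platforms of very different sizes.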
He also shared first-quarter enforcement figures for other problem areas on the flagship platform: 8.8 million pieces of bullying and harassment content, 9.8 million pieces of organized hate content, and 25.2 million pieces of hate speech were removed from Facebook. Similar removals on Instagram included 5.5 million pieces of bullying and harassment content, 324,500 pieces of organized hate content, and 6.3 million pieces of hate speech. The company has also continued to remove false information about the pandemic from its beginning to the present.
Marc Fiore shared the intellectual property section of the transparency report, covering the second half of 2020, in which he discussed the steps taken against counterfeiting and copyright infringement on Facebook and the data behind them. According to Fiore, 99.7% of all counterfeit removals on Facebook were proactive, meaning the content was removed before anyone reported it; this amounted to 335,765,018 pieces of suspected counterfeit content. On Instagram, the proactive rate was 82.8%, or 2,696,272 pieces. For copyright removals, Facebook's proactive rate was 77.9%, covering 9,822,070 pieces of content, while Instagram's was 59%, or 2,170,529 pieces.
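The proactive percentages above compare removals made before any user report against all removals. A minimal sketch of that arithmetic (a hypothetical helper, not Facebook's reporting code):

```python
def proactive_rate_pct(proactive_removals: int, total_removals: int) -> float:
    """Percentage of removals that happened before any user report."""
    return 100.0 * proactive_removals / total_removals

# E.g., if 997 of every 1,000 counterfeit takedowns happened before
# a user report, the proactive rate is 99.7%.
print(f"{proactive_rate_pct(997, 1_000):.1f}%")  # 99.7%
```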
Fiore said the company also protects intellectual property in several other ways, including robust proactive detection systems that combine machine learning, signals of suspicious activity, and direct intelligence from rights holders. In the subsequent report on government data requests, Facebook Vice President and Assistant General Counsel Chris Sonderby reported that government requests for user data rose from 173,592 to 191,013 over the past six months, an overall increase of about 10%.
The United States led in requests for user data, followed by India, Germany, France, and Brazil. Finally, Mike Clark, Director of Product Management, addressed a question Business Insider raised in April, when information on about 530 million Facebook users was found publicly available on an unsecured website. Clark said Facebook has taken steps to prevent "data scraping" of the platform, including assigning roughly 100 people to focus on detecting, investigating, and blocking unauthorized data collection.