Facebook Had Less Hate Speech, Violence at the End of 2020, Took Proactive Steps Against Bullying
The latest Facebook community standards enforcement report claims that changes made to the Facebook News Feed have measurably reduced the prevalence of hate speech, violence and nudity on the platform.

Facebook has published its latest Community Standards Enforcement report, covering the final quarter of 2020. The report details key statistics for problems that have long plagued the Facebook platform, and includes information on 22 policy areas across both Facebook and Instagram. Some of the key areas detailed in Facebook’s latest report include hate speech, violence and graphic content, and nudity on Facebook. The report also details the percentage of “proactive” actions taken by Facebook to improve the experience on its platforms, an area where the company has often been accused of doing too little.

The latest transparency report by the company has a largely positive undertone about content on Facebook and Instagram, and the group credits a number of changes with helping it offer a better platform. In terms of numbers, Facebook now says that the prevalence of hate speech on the platform dropped from about 11 views per 10,000 content views in July-September 2020, to about 7 views per 10,000 in October-December 2020. Over the same period, the prevalence of violent and graphic content dropped from 7 views per 10,000 to 5, and that of nudity and adult content dropped from 6 views per 10,000 to about 3.
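For context, prevalence is a rate rather than a raw count: it estimates how many out of every 10,000 content views on the platform were views of violating content. Below is a minimal sketch of that arithmetic in Python; the function name and the sample view counts are illustrative assumptions, not figures from Facebook's report.

```python
# Illustrative sketch: prevalence as views of violating content per 10,000 views.
# The function name and sample figures are hypothetical, not from the report.

def prevalence_per_10k(violating_views: int, total_views: int) -> float:
    """Estimated views of violating content per 10,000 content views."""
    return violating_views / total_views * 10_000

# Hypothetical sample: 11 of every 10,000 views in Q3 vs 7 in Q4.
q3 = prevalence_per_10k(violating_views=11_000, total_views=10_000_000)
q4 = prevalence_per_10k(violating_views=7_000, total_views=10_000_000)
print(f"Q3 2020: {q3:.0f} per 10,000; Q4 2020: {q4:.0f} per 10,000")
```

Measuring prevalence by views, rather than by pieces of content, weights a violating post by how widely it was actually seen, which is why News Feed ranking changes can move the number even if the same posts exist on the platform.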

Detailing these headline reductions in some of Facebook’s key problem areas, Guy Rosen, vice president of Integrity at Facebook, said in a statement, “Our improvements in prevalence are mainly due to changes we made to reduce problematic content in News Feed. Each post is ranked by processes that take into account a combination of integrity signals, such as how likely a piece of content is to violate our policies, as well as signals we receive from people, such as from surveys or actions they take on our platform like hiding or reporting posts. Improving how we use these signals helps tailor News Feed to each individual’s preferences, and also reduces the number of times we display posts that later may be determined to violate our policies.”
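Rosen's description suggests a ranking pipeline that blends a model's predicted violation likelihood with negative feedback from users. The sketch below is purely illustrative of that idea; the signal names, weights and the demotion formula are assumptions for the sake of the example, not Facebook's actual system.

```python
# Hypothetical sketch of blending "integrity signals" into feed ranking.
# Signal names, weights and the demotion formula are all assumptions.
from dataclasses import dataclass

@dataclass
class PostSignals:
    engagement_score: float  # base relevance score for this user
    p_violation: float       # model-estimated chance the post violates policy
    hide_rate: float         # fraction of viewers who hid the post
    report_rate: float       # fraction of viewers who reported the post

def ranked_score(s: PostSignals) -> float:
    """Demote posts likely to violate policy or drawing negative feedback."""
    integrity_penalty = 0.6 * s.p_violation + 0.25 * s.hide_rate + 0.15 * s.report_rate
    return s.engagement_score * (1.0 - integrity_penalty)

posts = [
    PostSignals(engagement_score=0.9, p_violation=0.7, hide_rate=0.2, report_rate=0.1),
    PostSignals(engagement_score=0.6, p_violation=0.05, hide_rate=0.01, report_rate=0.0),
]
# The borderline post ranks below the benign one despite higher raw engagement.
for p in sorted(posts, key=ranked_score, reverse=True):
    print(f"{ranked_score(p):.3f}", p)
```

The net effect Rosen describes follows from this kind of demotion: likely-violating posts are shown fewer times, so fewer views of them occur before moderators ever make a formal ruling.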

The full Facebook report details its statistics in numerous areas. For Facebook, the policy areas detailed in the Community Standards Enforcement report include child nudity and sexual exploitation, dangerous organisations, fake accounts, bullying and harassment, regulated goods, and suicide and self-injury, alongside the aforementioned sections. Similar policy areas apply to Instagram as well. Alongside statistics on whether each problem has become more or less prevalent, Facebook has also highlighted its proactivity in certain areas, the most important of which include bullying and harassment, and suicide and self-harm.

In terms of proactive actions, Facebook states that through the last three months of 2020, 6.3 million pieces of bullying and harassment content were flagged and acted on, with 49 percent of these found proactively by Facebook before users reported them. Of Facebook’s actions, 4.43 lakh pieces of content were appealed against by users, following which 41,000 of these decisions were reversed by Facebook on the basis of the user appeal. Hate speech saw a far larger volume of enforcement: Facebook states that during this period, 26.9 million pieces of content were flagged, removed or otherwise acted on, of which 97.1 percent were found by Facebook before users reported them. Of these actions, close to a million were appealed against by users, and close to 50,000 of those appeals were accepted and the decision reversed by the Facebook group.
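The proactive rate Facebook reports here is a simple ratio: of all content acted on, the share found by Facebook's own systems before any user reported it. A quick sketch of the figures quoted in this paragraph, in Python; the function name is illustrative.

```python
# Sketch of the "proactive rate" and appeal-reversal arithmetic above.
# Function name is illustrative; figures are those quoted in the report.

def proactive_rate(found_proactively: float, total_actioned: float) -> float:
    """Share of actioned content found before any user report, in percent."""
    return found_proactively / total_actioned * 100

# Hate speech on Facebook, Q4 2020: 26.9 million actioned, 97.1% proactive.
proactively_found = 26.9e6 * 0.971
print(f"{proactive_rate(proactively_found, 26.9e6):.1f}% proactive")

# Appeals: roughly 1 million appeals, about 50,000 reversed on appeal.
print(f"{50_000 / 1_000_000:.0%} of hate speech appeals led to a reversal")
```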

The report presents similar figures for Instagram. For sexual content and nudity, it states that 11.5 million pieces of content were acted on, of which 96.5 percent were found proactively by Instagram before users reported them. Following these actions, 1.41 lakh pieces of content were restored by Instagram retrospectively, without appeals. Bullying and harassment, which has been a major area of concern for Instagram, saw 5 million pieces of content acted on in the last three months of 2020. Proactive detection was less prolific here, with about 20 percent of this content first flagged by users rather than found by Instagram itself.

On the whole, the Community Standards Enforcement report also details exactly how Facebook defines each of these problem areas, and what standards apply to them. The report is aimed at helping users understand exactly what Facebook considers problematic, which in turn can help them understand why particular bans were effected. Rosen states that in 2021, Facebook aims to “share additional metrics on Instagram and add new policy categories on Facebook,” which could add further value to the company’s transparency efforts. Facebook also credits “changes” to its algorithms, as well as wider involvement of its content moderation team, for the increased proactive action rates on both Facebook and Instagram. There are, however, sizeable gaps that the company will likely look to fill in the months to come.
