Facebook Tightens Its Hate Speech Detection and Updates Policy for Upcoming Myanmar Elections


Facebook and its CEO Mark Zuckerberg are frequently under the media spotlight for controversial reasons. Recently, Zuckerberg came under scrutiny over allegations that the platform censors conservative media outlets’ pages.

A House Judiciary subcommittee has also grilled him over the ongoing “antitrust” concerns surrounding Big Tech, and Facebook has long been criticized for the hate speech and fake news that are rampant on the platform.

Critics believe this fake news phenomenon helped fuel President Trump’s rise to power. This time, however, Facebook appears to be tightening its grip on the issues plaguing the platform.

On Tuesday, the company revealed that it is preparing for Myanmar’s general elections, due in November. It is deploying algorithms to detect hate speech and misleading content on the platform.

Facebook says it is taking this action to prevent the spread of content that could lead to violence. In a blog post, it said it will flag content that spreads misinformation or incites violence, and the policy updates will allow it to remove any content that could damage the “integrity” of the electoral process.

Facebook writes in the blog post:

For example, we would remove posts falsely claiming a candidate is a Bengali, not a Myanmar citizen, and thus ineligible. We also recognize that there are certain types of content, such as hate speech, that could lead to imminent, offline harm but that could also suppress the vote. 

We have a clear and detailed policy against hate speech, and we remove violating content as soon as we become aware of it. To do this, we’ve invested significantly in proactive detection technology to help us catch violating content more quickly. We also use AI to proactively identify hate speech in 45 languages, including Burmese.

Facebook has also partnered with two organizations in Myanmar to verify the public pages of political parties and figures on the platform. Third-party fact-checkers including BOOM, AFP Fact Check, and Fact Crescendo are working with Facebook to verify the authenticity of content.

Facebook recently took action against almost 300,000 pieces of violating content related to the Myanmar election, nearly six times the amount it removed in Q1.

To slow the spread of rumors, Facebook is limiting message forwarding to five people at a time. The limit is starting in Myanmar and will soon roll out globally.

The elections, due in November, will be only the second democratic general election since military rule ended in the country.

What do you think of Facebook’s policy updates? Let us know in the comments, and please support us by following our website.
