Social media has been playing a significant role in recent times, as it allows users to express themselves freely and be heard. However, some accounts exploit these platforms to distribute harmful content such as hate speech, violence, child nudity, terrorism and more. Facebook, for its part, has been working hard for years to take down such violating content.
Recently, Facebook released the fourth edition of its Community Standards Enforcement Report for 2019. It shared data on how many posts and accounts were taken down from Facebook and, for the first time, Instagram between April and September this year, and the numbers are shocking: approximately 75 million posts and 3.2 billion fake accounts were removed for violating the rules.
Facebook also said that it is easier to track down such content on Facebook than on Instagram. The company has greatly improved its ability to detect content that violates its Community Standards, claiming that it can now remove much of it even before users report it.
On this front, Facebook has added a new page to help users understand how its Community Standards apply to different types of content and where the company draws the line. The page notes that Facebook won't delete controversial content, so that users can openly debate it and exchange views among themselves.
Facebook also mentioned that Stories, which vanish after 24 hours, have made it more difficult for the company to track down and remove misleading content. In addition, its plan to encrypt messaging on Instagram and Facebook would make it harder for law enforcement to go after child exploitation.
Let us know what you guys think about this report on our Facebook page! Stay tuned for more tech news on TechNave.com.