
Facebook published the fourth edition of its Community Standards Enforcement Report this week. The report details the volume of content that violated its policies, including the number of fake accounts detected and blocked, a figure that has risen since last year, and millions of posts depicting child abuse and suicide.
According to the report, Facebook took down 3.3 billion fake accounts over the whole of 2018, and 5.4 billion in just the first nine months of 2019. Comparing the January-September windows, removals climbed from roughly 2.1 billion in 2018 to 5.4 billion in 2019, more than two and a half times as many. Keep in mind that these figures do not cover every fake account created, only those caught by Facebook's automated systems or flagged through manual review.

Meanwhile, removals of content violating child nudity and exploitation policies on Facebook and Instagram increased over the last two quarters. Facebook states that such content is taken down before it reaches 10,000 views on either platform.
Relatedly, law enforcement officials are concerned about Facebook's plan to give users more robust privacy by encrypting its messaging services, warning that the move would hamper their efforts to fight child abuse.
FBI Director Christopher Wray said last month that the changes would turn the platform into a “dream come true for predators and child pornographers”.