The biggest social media platforms, Facebook, Instagram, and Twitter, have something in common: the presence of content showing child nudity and sexual exploitation.
Reports from Sprout Social in 2020 show that Facebook, Instagram and Twitter have 2.6 billion, 1.8 billion, and 326 million users, respectively.
Between January and June 2019, Twitter took down 244,188 accounts for content related to child sexual exploitation, according to the platform’s 2019 Transparency Report.
The posts Twitter took down did not involve only child nudity and sexual exploitation. The company also removed posts containing adult nudity and sexual activity; bullying and harassment; terrorism, organized hate, and hate speech; drugs and firearms; suicide and self-injury; and violent and graphic content.
Twitter also removed 115,861 accounts for terrorism-related content, a 30% drop from the previous reporting period.
Twitter cited a “year-on-year decrease in the number of accounts promoting terrorist content […].” The company also reported that, of the 244,188 accounts it took down for child exploitation, 91% were flagged proactively by detection technology such as Microsoft’s PhotoDNA, which matches uploads against known child exploitation images.
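PhotoDNA itself is proprietary, but the general idea behind such tools, comparing a perceptual hash of an uploaded image against a database of hashes of known abuse images, can be sketched in a few lines. The sketch below uses the open-source imagehash library purely as an illustration; the hash database, distance threshold, and file name are hypothetical placeholders, not PhotoDNA’s actual values.

```python
# Illustrative sketch of perceptual-hash matching, the general technique
# behind tools like PhotoDNA (which itself is proprietary).
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known banned images.
# Real systems hold millions of entries supplied by clearinghouses.
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1b4b4e5e50f0f")}

# Hypothetical Hamming-distance threshold: small edits (resizing,
# re-encoding, cropping) change a perceptual hash only slightly.
MAX_DISTANCE = 5

def matches_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    # "upload.jpg" stands in for an incoming upload.
    print(matches_known_image("upload.jpg"))
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized or re-encoded, which is what allows platforms to catch altered copies of known material rather than only exact duplicates.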
Facebook Takes Action to Remove Explicit Posts
Facebook, on the other hand, removed 11.6 million child abuse posts between July and September 2019, according to the BBC.
For the first time, Facebook released data related to suicide and self-harm after 14-year-old Molly Russell took her own life in 2017, the BBC further reported. Russell’s father later found a large amount of graphic material about self-harm and suicide on her Instagram account.
The BBC indicated that 22.5 million posts related to suicide and self-harm were removed between July and September 2019.
A recent Facebook report showed an increase in the number of posts removed due to child nudity and sexual exploitation in 2020.
The report found that 8.6 million posts were taken down between January and March 2020, and that Facebook removed another 9.5 million between April and June 2020.
Facebook flagged most of that content before users reported it.
More recently, Sky News reported that Facebook is responsible for a whopping 94% of the 69 million child sex abuse images reported online, according to the U.K. Home Office.
Instagram Follows Facebook’s Lead
Meanwhile, Instagram, which Facebook bought for $1 billion in 2012, has a similar problem.
In the first quarter of 2020 (January to March), Instagram took down 1 million posts related to child nudity and sexual exploitation, and another 480,000 in the second quarter (April to June).
Instagram also removed 1.3 million posts related to suicide and self-injury between January and March 2020, and another 275,000 between April and June 2020.
However, with companies like Facebook reportedly making billions of dollars in revenue per year, it is clear that social media shows no signs of shutting down.
So stay vigilant: there is still a dark side to these platforms that advertise themselves as a way to “connect” people.
Written by: J. Laura