Facebook Tests New Tools to Curb Child Exploitation On the Platform

The tools try to stop searches and reshares of inappropriate child content.

Facebook has introduced two new tools to curb child exploitation and harassment on its platform.

The first tool is a pop-up that appears when users search for keywords related to child exploitation: it displays information on imprisonment laws for viewing illegal content, resources for offender-diversion programs, and a warning that the account may be removed. The second tool is a safety alert that notifies users “who have shared viral, meme child exploitative content about the harm it can cause” and warns of the legal repercussions of sharing this type of material.

These two tools were introduced after Facebook conducted a study to understand “how and why people share child exploitative content on Facebook and Instagram”. The study found that over 90% of reported content was the same as, or similar to, previously reported content. It also found that 75% of those resharing exploitative content did not have malicious intent – they wanted to raise awareness of the subject or were joking about it in poor taste.

Facebook also updated its child safety policies regarding inappropriate commentary on photos of minors:

“We will remove Facebook profiles, pages, groups and Instagram accounts that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in the image”.

Additionally, Facebook users can directly help prevent child exploitation: when reporting inappropriate content targeted at minors, they can select ‘involves a child’ under the ‘Nudity and Sexual Activity’ reporting category.

With over 500,000 online predators estimated to be active daily, it is important for companies such as Facebook to stop content related to child exploitation from circulating on their platforms. Moreover, with 50% of victims of online sexual exploitation being between 12 and 15 years old, social media platforms have a particular responsibility to protect their under-18 users.