The social network is singled out once again after an investigation shows that Facebook moderates very few images depicting child abuse.
On Facebook, moderation is regularly singled out by critics. Whether it is the regulation of political advertising, the promotion of firearms, incitement to hate speech or, more recently, a VIP list exposed by the Wall Street Journal allowing certain celebrities to evade automatic moderation, the list goes on, as do the scandals surrounding it.
This time, a new issue has ignited the powder keg. According to a New York Times investigation citing an internal training document, images of child abuse are reportedly left largely unmoderated on Mark Zuckerberg’s platform.
Why age matters
When child abuse is reported, Facebook has great difficulty accurately determining the age of the children involved. When in doubt, the New York Times explains, “young people are treated like adults and the images are not reported to the authorities.” A loophole that lets a great deal of child abuse content slip through every year.
The document brought to the newspaper’s attention instructs moderators to treat as adults anyone whose age cannot be determined with certainty. A serious problem, since such a video then slips through the net of the moderators, but also of the authorities. Under normal procedure, all images of abuse must be submitted to the National Center for Missing and Exploited Children (NCMEC), while content involving adults is generally simply removed from the platform.
Facebook plays the privacy card to justify itself
That said, Facebook is far from the worst performer when it comes to moderating problematic content involving minors. Each year, it reports several million videos of suspected child sexual abuse to the competent authorities. However, the New York Times investigation contends that this weakness in age recognition also means a significant number of sexual assaults go unreported.
By relying on a more than 50-year-old identification method based on “the progressive phases of puberty”, Facebook allegedly lets thousands, perhaps millions, of child abuse videos slip through the cracks. For its part, the company defends its moderation system tooth and nail: according to Antigone Davis, Facebook’s head of safety, treating users as adults when in doubt simply serves to protect Internet users’ privacy. Especially since Meta could be held legally liable in the event of a false report.