Facebook could be underreporting child sexual abuse content due to a change in its content moderation policy.
According to a leaked corporate training document, Meta, the parent company of Facebook, Instagram, Messenger and WhatsApp, has instructed content moderators for its platforms to ‘err on the side of an adult’ when uncertain about the age of a person in a photo or video, as reported by The New York Times.
While it’s the responsibility of tech companies to monitor content on their platforms for child sexual abuse material (CSAM), activists claim the policies aren’t doing enough to protect children.
American companies are legally required to report any child abuse content to the National Center for Missing and Exploited Children (NCMEC).
Tech companies employ content moderators to decide if content flagged for potentially containing child sexual abuse should be reported.
A Facebook training document directs content moderators to treat a person as an adult when they cannot determine that person’s age in a photo or video suspected to be CSAM, the report said.
The policy was reportedly made for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August.
Facebook is responsible for 94% of the 69 million child sex abuse images reported by US tech firms, making it a leader in detecting child sexual abuse content.
However, concerns about mistakenly accusing people of posting such content have resulted in a policy change that could allow photos and videos of child abuse to go unreported.
In practice, when a content moderator cannot identify whether the subject of a suspected CSAM photo is a minor or an adult, they are instructed to assume the subject is an adult, allowing more images to go unreported to NCMEC.
Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults.
While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors.
Studies have found that children are physically developing earlier than they have in the past. Also, certain races and ethnicities enter puberty at younger ages, with some Black and Hispanic children, for example, doing so earlier than Caucasians.
Each day, moderators review millions of photos and videos from around the world to determine whether they violate Meta’s rules of conduct or are illegal.
Last year, the company made nearly 27 million reports of suspected child abuse.
Metro.co.uk has reached out to Facebook for comment.