Facebook Will Intensify Efforts to Tackle Revenge Porn

The social network has launched new tools intended to address the sharing of so-called revenge porn. They allow users to report intimate pictures posted without the subject's consent. Reported pictures are flagged to "specially trained Facebook representatives", who review them and remove them if they violate Facebook's community standards.

The platform will also apply "photo-matching technologies" to photos it already knows are being shared without the subject's consent. The approach resembles the PhotoDNA image-hashing system used to identify child abuse imagery and terrorist content and prevent further sharing. Accounts that share such images can be disabled.
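To give a rough idea of how this kind of hash-based matching works in general, here is a minimal sketch in Python of an "average hash" and a lookup against a set of known hashes. This is only an illustration of the technique, not Facebook's or PhotoDNA's actual implementation; the function names, the 8x8 hash size, and the distance threshold are all assumptions made for the example.

# Illustrative perceptual-hash matching (an "average hash"), in the spirit of
# systems like PhotoDNA; NOT Facebook's actual code.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual hash.

    The image is shrunk to size x size grayscale pixels; each bit records
    whether a pixel is brighter than the mean, so near-duplicates
    (recompressed or lightly edited copies) produce similar hashes.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def is_known_image(upload_hash: int, blocked_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload if its hash is close to any previously reported hash."""
    return any(hamming_distance(upload_hash, h) <= threshold for h in blocked_hashes)

Comparing compact hashes rather than raw pixels is what makes it practical to check every new upload against a large database of previously reported images.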

British representatives of the company note that sharing private sexual images or video without consent was made an offence in the UK two years ago. Recent reports reveal that more than 200 people were prosecuted for such offences in the first year after the law's introduction.

Overall, revenge porn is treated differently across jurisdictions. In most cases, the key question is whether the image is a "selfie" (in which case a copyright claim may be brought) or was taken by the person who posted it. In California, for example, revenge porn is defined as posting explicit images taken "under circumstances where the parties agree … that the image shall remain private".

Back in 2016, Facebook was sued by a Belfast 14-year-old over the publication on the platform of a naked image of her, which was allegedly extracted from her through blackmail. Facebook argued that it had repeatedly removed the photo from the page on which it was posted, but the image was reposted each time and was never permanently blocked. The newly introduced photo-matching technologies would address this problem by preventing the reposting of an already-blocked photo. On the other hand, they cannot pre-emptively block the first posting, since they rely on matching images against a pre-existing database. This means that users still need to flag a revenge porn image first.
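To make that limitation concrete, a minimal sketch of such a workflow, reusing the hypothetical helpers above and in no way reflecting Facebook's internal systems, might look like this: the very first upload passes because the block list is empty, and only after a reviewer confirms a report does the image's hash join the list.

# Hypothetical moderation flow built on the hash functions sketched above.
blocked_hashes: set[int] = set()


def handle_report(image_path: str) -> None:
    """A reviewer confirms a report, so the image's hash joins the block list."""
    blocked_hashes.add(average_hash(image_path))


def handle_upload(image_path: str) -> bool:
    """Return True if the upload is allowed, False if it matches a blocked image."""
    return not is_known_image(average_hash(image_path), blocked_hashes)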

Facebook also removes consensually posted nudity, since nudity is against its terms of service. However, the company has repeatedly been accused of applying this rule too broadly, for example after removing images of breastfeeding mothers and mastectomy survivors.


Posted by: SaM