Facebook Inc said on Friday it would use artificial intelligence (AI) to fight the spread on its social networks of intimate photos shared without permission, sometimes called "revenge porn."
The new technology adds to a pilot program that requires trained employees to review offensive images.
"Through the use of the machine learning and artificial intelligence, we can now proactively detect nude images or videos that are shared without permission," said the social media giant in a note posted on his blog Facebook Newsroom
"It means we can find this content before someone reports it," the company added.
A member of Facebook's community operations team would review the content found by the new technology. If it is found to be an offensive image, Facebook would remove it or disable the account responsible for sharing it, the company added.
The "revenge porn" It refers to the exchange of sexually explicit images on the Internet, without the consent of the people who appear on them, to extort or humiliate them. It disproportionately affects women, who are sometimes attacked by their ex-partners.
Facebook will also launch a support hub called "Not Without My Consent" on its Safety Center page for people whose intimate images have been shared without their consent.
The company works with at least five outside vendors in at least eight countries on content review, according to a Reuters tally. As of December, it had about 15,000 contractors and employees working on content review.