This Chatbot Aims to Steer People Away From Child Abuse Material

Since March this year, each time someone has searched Pornhub’s UK website for a word or phrase that could be related to child sexual abuse material (CSAM), a chatbot has appeared, interrupting the attempted search and asking whether the user wants help with their behavior. During the first 30 days of the system’s trial, users triggered the chatbot 173,904 times.

Read full article on Wired – Threat Level
