A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. Although the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • CaptainEffort@sh.itjust.works
    7 months ago

    Like how video games supposedly normalize violence? Are you going to go shoot a bunch of people because GTA exists?

    Ffs guys what year is this? Thought we were past this silly mindset.

    • archomrade [he/him]@midwest.social
      7 months ago

Deciding that you’re going to pull someone out of their car and clap them with a rocket launcher has a significantly higher situational barrier than finding yourself in a close relationship with a child who trusts you enough that you can abuse them in a moment of impulse.

      • CaptainEffort@sh.itjust.works
        7 months ago

        You think abusing a child is easier than, say, punching someone in the face as you would do in video games?

        Dude if you genuinely think that I’d recommend reaching out to someone…

In all seriousness though, way to take the most extreme video game example possible to dismiss my point. Video game violence can have an extremely low “situational barrier”, but that doesn’t mean video games will make you do those things.