• Clbull@lemmy.world · 10 months ago (edited)

      So they paid Kenyan workers $2 an hour to sift through some of the darkest shit on the internet.

      Ugh.

        • SacrificedBeans@lemmy.world · 10 months ago

          I’m sure there’s some loophole there, maybe between countries’ laws. And if there isn’t? Hey, we’ll make one!

        • Clbull@lemmy.world · 10 months ago

          Isn’t CSAM classed as images and videos that depict child sexual abuse? Last time I checked, written descriptions alone didn’t count, unless they were being forced to look at AI image-generation prompts describing such acts?

          • Strawberry@lemmy.blahaj.zone · 10 months ago

            That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.

            This is the quote in question. They’re talking about images.

        • smooth_tea@lemmy.world · 10 months ago

          I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any option other than to deal with this material?

        • Meowoem@sh.itjust.works · 10 months ago

          They could be working with the governments of relevant countries to develop filters and detection systems.

      • JonEFive@midwest.social · 10 months ago

        No, you’re right, you should be. We don’t want to normalize this shit, it should continue to shock and offend.

        These are the dark sides of modern technology. The kids working in cobalt mines. The workers being paid pennies to categorize data so disturbing that it is traumatic even to read it. I can’t imagine how the people who have to look at the pictures can do it.

        I feel like I could handle some dark text here or there, but if I had to do it for 40-50 hours a week? Hundreds of passages every day. That would warp me pretty quickly.

    • GenesisJones@lemmy.world · 10 months ago

      This reminds me of an NPR podcast from 5 or 6 years ago about the people who get paid by Facebook to moderate the worst of the worst. They had a former employee giving an interview about the manual review of images that were CP and rape-related shit, iirc. Terrible stuff.