While I think this is a bit sensationalist, any company that allows user-driven generative AI, especially one as open as permitting LoRAs and arbitrary checkpoints, needs very good protection against synthetic CSAM like this. To the best of my knowledge, only the AI Horde has taken this sufficiently seriously until now.

  • HubertManne@kbin.social · 7 months ago

    So far I'm the only commenter who is fine with this. The problem with CSAM, to me, is children being molested. If it's art, or stories, or basically made up and not reality, then I'm pretty much fine with anything. I may not want to consume it myself, but I don't see a problem with it.

    • x4740N@lemmy.world · 7 months ago

      I agree with you

      With fictional content there is no child involved, and there certainly isn't anything living involved.

      I just wish governments in countries that have made this class of fictional images illegal would go after real child molesters instead of the makers and consumers of fictional images where no living being is involved.

      People who consume and make fictional content aren't harming anyone.