• nucleative@lemmy.world · 6 months ago

    Well thought-out and articulated opinion, thanks for sharing.

    If even the most skilled hyper-realistic painters were out there painting depictions of CSAM, we’d probably still label it as free speech because we “know” it to be fiction.

When a computer rolls the dice against a model and imagines a novel composition of children’s images combined with what it knows about adult material, it does seem more difficult to label the result as entirely fictional. That may be partly because the source material may have been real, even if the final composition is imagined. To be clear, I don’t mean models trained on CSAM; I’m thinking of models trained to know what both mature and immature body shapes look like, as well as adult content, with the algorithm left to figure out the rest.

    Nevertheless, as you brought up, nobody is harmed in this scenario, even though many people in our culture and society find this behavior and content to be repulsive.

To a high degree, I think we can still label an individual who consumes this type of AI content a pedophile, and although being a pedophile is not in and of itself illegal, it comes with societal consequences. Additionally, pedophilia is a DSM-5 psychiatric disorder, which could be a pathway to some sort of consequences for those who partake.