Really easy to see where this is going.
“open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation’s top prosecutors. (It’s worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)”
Paid software that can be reined in so it doesn’t compete with Netflix and Disney is fine; the open source stuff is Satan’s spawn.
The easy solution would be to go after the ones who distribute the pictures; this is only about keeping the gravy train going.
Legally this is going to be a mess. In theory I agree that photorealistic CSAM should be illegal (mainly because it won’t be long before AI generations are completely indistinguishable from photos, and we can’t just ignore real child abuse), but how do you define photorealism or CSAM when the subjects literally don’t exist? I figure if this kind of thing ever hits the courtroom there will be wildly different verdicts and sentences.
I agree with this, but don’t have much hope of anything passing. They didn’t outlaw underage hentai, so I feel like this is an uphill battle they’ll give up on.