How do they "undress" underage people without having this data on their servers to train the models?
AI crawlers are scraping every site. Every site. Random public-but-unlisted hobby sites are getting scraped and spiking users’ data usage. There was a Lemmy post about someone who had that experience just yesterday.
Think of how much child porn is stored on public sites that are shared in private groups. Also consider that Facebook is the largest distributor of child sex abuse material. These models are absolutely trained on child porn.
“Also consider that Facebook is the largest distributor of child sex abuse material”
Why is this not all over the news too? A rhetorical question. Sadly, I think we all know the answer by now.
Instagram literally has “mommy daughter” accounts that are 100% CP fetish material, and every single comment is by an older man
Meta doesn’t give a FUCK as long as you drive engagement
Disgusting
For the same reason the media pushed the AI bubble. Silicon Valley holds their chosen boys…
People posting photos online of their kid in the bath, at the beach, etc. with reckless abandon maybe.
As far back as I can remember (and that’s quite far these days), we’ve kept telling people not to post pictures of their kids online, or at least as little as possible. Way before the Facebooks and way before the LLM craze, so people couldn’t mess with them. Guess 20 years of heads up wasn’t enough.
Can confirm. I was a mod on one minor social media site. Once banned a group that claimed to be “nudist”. More than half of the photos featured underage children. This shit happens more often than we think.
They don’t need to.
Take pictures of normally dressed children, combine them with pictures of naked adults. Now you have CP.
I haven’t seen these pictures, so I can’t say how well or badly it works, but if that were all they had, the results would be more or less wrong. Kids’ bodies are quite different from adults’.
On the other hand, plenty of pictures of naked/semi naked kids in a non sexual context can probably be found online already, so it’s not inconceivable that their model had plenty of references to use anyway.