AI-generated naked child images shock Spanish town of Almendralejo (bbc.co.uk)
33 points by basisword on Sept 24, 2023 | hide | past | favorite | 40 comments


Generally there is a distinction between deepfaking and AI generated or am I mistaken here ?

Is there a preferred nomenclature ?


Deepfake utilises AI, no? Either way I feel like in the context of a national news article AI is being used in a colloquial sense.


From the AI industry perspective, I'd call it "deepfake" if it's generated as the original deepfake method did, by taking a real image or video and replacing only the face, and "AI generated" if the whole image was generated.

But perhaps this is a technical distinction that will go away soon, or perhaps has already disappeared in colloquial usage.


I always used it literally:

“Deepfake”: a fake generated by a DNN, or tool chain comprised primarily of such.


While from technical perspective there are multiple quite different methods to create images, I don't think that there is (or should be, or can be) a clear boundary for social and legal purposes.

It's clear that these are a bunch of generated fake images with a face that looks like the impersonated victim's and a naked body. Does it matter if that face was copy-pasted from an existing photo of that person, or manipulated by a skilled artist to look like that person, or AI-generated to look like that person? And with respect to the impersonated victims, does it matter if the naked body came from a real photo of someone else, or was drawn by a skilled artist, or was generated by AI?


I agree the intent remains the same in either case. In this case the intent is to create a nude likeness of an individual using some degree of original material, so that the final creation appears indistinguishable from a genuine product.

In terms of law the technical solution used to accomplish the likeness should not be relevant.


Genuinely curious – if instead of AI these images were created by other children with Photoshop, would that be of equal concern?

I feel ashamed to admit this, but I didn't like one of my teachers in high school and as a joke one evening I photoshopped her into some photos (a few of which were sexual in nature). Worse, I shared these images with a few of my friends as a joke. Obviously we didn't have AI back then, but I was okay at Photoshop so the images looked semi-realistic, I suppose.

Looking back I didn't even think I was doing anything wrong because I had seen this done quite a bit. I didn't like my teacher and I was expressing my dislike for her with the mean images I created. If you look online you often find people doing similar things to politicians and celebs they dislike. I'd argue this should be illegal, but so long as the person is famous enough it seems this is a form of socially acceptable bullying, which I've expressed some anger about on here before: https://news.ycombinator.com/item?id=36737515 Perhaps the most egregious example of this I can think of is in Kanye West's "Famous" music video, in which he created a life-like naked mannequin of Taylor Swift without her consent, seemingly just to be cruel. But I guess that's fine because she's an adult and famous.

Expanding on my point here, I think most of us can agree that it's acceptable to draw a picture of a naked child (or even take a photo of a naked child) depending on context. For example I think most people would agree there are appropriate artistic contexts for a drawing featuring a naked child, and similarly parents often take photos of their child in the bath or naked around the house.

Whenever I think about this issue I always feel the problem is purely the intention and whether or not it was consensual. If the intention is to cause hurt and it's not consensual then I think it is deeply wrong. It's got much less to do with whether the image contains a naked child because otherwise context wouldn't matter. I also don't see how anyone could argue that there is anything inherently wrong or shameful about the body of a child.

In which case I think treating this like child pornography would be inappropriate, since the issue with child pornography is much more to do with the fact that a child is being coerced (or forced) into doing things which they cannot consent to. This isn't happening here. In terms of harm this is far more similar to drawing a picture of someone naked against their will and sharing that drawing. However, I do accept the realism here makes this worse and therefore it should be punished more harshly, but either way this is still purely an issue of bullying.

Whoever did this deserves serious punishment, but treating them like a pedophile would be wrong – especially if it was a classmate of similar age who created these images. And I'll add: if we want to stop kids doing this then we should probably not normalise the sharing of nonconsensual photos featuring adults either. It's utterly absurd that they could have created a naked photo of Taylor Swift with this AI and that would have been totally fine, but because it's someone from their class it's a crime. Why?

I'd be interested in hearing if and why others disagree with anything I've said here. I really do think we obsess about these issues when there's a child involved, meanwhile we're fine with adult women being victims of exactly the same behaviour. If this behaviour is wrong then let's stop it completely. You simply should not be allowed to share a photo of a person without their consent. The only exception should be if that person was in public when the photo was taken.


[dupe]



Thanks for the links. Didn't see that at the time. Edit: wish I hadn't checked those threads out - some absolutely awful takes. There's a serious lack of empathy in our tech bubble at times.


It was posted on BBC, that's why it got more attention. I posted this article yesterday too, having searched for it and not found it posted.


The discussion is over there on the elpais articles. Days ago. It's just a note to go there.


This is why AI is going to be regulated. It’s fairly obvious by now.


Yeah, good luck with that. The genie is out of the bottle. At best you can restrict its public use (and also good luck with proving whether an image is AI-generated or not), because even if they outlaw it somehow, you still only need a GPU, a ~4 GB model file and less than 100 MB of software to make whatever image.

Are we going to ban GPUs or monitor their use? Does that sound like a good idea?


Even if we banned GPUs, the images could still be generated (albeit more slowly) on a CPU.


What kinds of regulation do you anticipate helping here?

Let's assume that generating these images is as simple as downloading a model from the internet and running it on the same hardware you use to play Fortnite.

How do you prevent stuff like this without going full "we need to backdoor all encryption because terrorism" moral panic mode?


Photoshop didn't get regulated when people started photoshopping people's heads onto strippers. Rather, we used existing laws to punish individuals who did it, like those against defamation. And for the more obvious and benign cases we quickly stopped caring altogether.


Whoever is generating and distributing these images should be charged with distributing child pornography. Doesn't matter if the images are AI generated.

That's my initial response, anyway. I'll concede that it's not well thought out. For one, it isn't comprehensive enough to solve the problem of the perpetrator being halfway around the world.


I think the problem is that the most likely perpetrators are the teenage boys the girls are in school with. It seems like exactly the sort of thing teenage boys would do without fully understanding the damage never mind the legality. In that sense it feels very tricky to deal with - saying that, it needs a solution. Being a kid these days seems pretty awful. I thought it was bad when your mistakes could be caught on camera and put on Facebook. How much worse can it get?


I think the endgame is having a different understanding of privacy, modesty, etc. There's no way this is going to go away or be regulated away somehow. Heavy handed punishment of young kids who generate images just creates more problems (though I imagine we'll go through that phase). Eventually (in a generation or two) it will equilibrate and nobody will take the pictures seriously or be interested in making them. There's novelty now, it will go away.

I can't see any other realistic direction this will go.


I can’t see this happening. People have been saying this for a long time. On top of that, a lot of young girls are going to go through a lot of pain in the meantime - hoping for societal change seems negligent.


I'd take issue with the statement about no one taking them seriously or being interested in making them. It seems like a pedophile's dream.


It's not tricky at all, if adults would just dislodge their heads from their behinds and take the minimum possible amount of responsibility.

Smartphones must be prohibited for under-16-year-olds.


Yes. After all the alcohol limit of 21 is a massive success and leads to both people under 21 never drinking alcohol, and people over 21 being very responsible drinkers, enjoying one or two drinks in the evening instead of getting blackout drunk.

In light of these successes we should ban smartphones for under-16-year-olds. Computers too, after all those can also be used to access AI tools. Anyone who says that adults taking the bare minimum of responsibility has anything to do with parenting, instead of taking agency and responsibility from teenagers, is just a small-government naysayer.


I realize that users of this site might be inclined to think that non-perfect solutions are not acceptable. However, in the real world, all solutions are non-perfect. Take alcohol and tobacco limits, for instance: they are a success, even if they don't totally prevent children from consuming those drugs.

A smart device ban would be similar to the ban of those substances. Not terrible, not great, but much better than status quo.


I think you've found the solution! We could probably solve more problems by banning things!


Well no, but we can totally solve the problem of under 16-year olds ruining their and other people's lives with these devices.


It’s not the devices, it’s the internet. I also wouldn’t be surprised if the deep fakes were actually created on a PC.


If we could reliably ban children from the internet, I would probably argue for that solution. I think it's more feasible to ban the devices.


That opens the whole "why is child pornography illegal" question we already have with the legal status of drawn depictions of child pornography.

The most obvious answer is that it's about the harm to the subject of the pictures. Most child pornography is made through exploitation of minors, so we just forbid the whole category. Fictional child pornography, like a drawing (or an AI generated image) doesn't suffer from that, so doesn't have to be outlawed. That's largely the position of the US justice system for example.

Some countries go further, arguing about the impact of child pornography on society, especially on pedophiles. Pedophiles seem to get worse, not better, by consuming child pornography, so that gives reason to outlaw it altogether, no matter how clearly fictional it is. That also gives room for lots of subtlety, like when a Swedish court ruled that a manga expert could keep a drawing that would in other cases be illegal child pornography. Similarly, the fact that the case in this article is child pornography made by minors for minors could factor in.

In Spain specifically, the line is drawn at a certain level of realism. Real porn of real children is illegal, so are things that look exactly like it, manga levels of unrealism are legal, but somewhere between there's the line. Where these AI images fall on that line would be interesting, but impossible to judge without seeing them and having good knowledge of the Spanish legal system.


In 2015, Spain amended its law to specifically protect its citizens from this type of activity (known as "pseudopornography"). See https://www.boe.es/diario_boe/txt.php?id=BOE-A-2015-3439.


Child pornography is one of the worst forms of human depravity, because in order to satisfy one person's messed-up desires, a child is subjected to unspeakable sexual violence. We're conditioned to protect our young for lots of obvious reasons, so this triggers an understandable and entirely justified visceral response.

This particular situation is ... different. It's clearly still causing pain to children. It's using their likeness without their consent and in a sexually violent way.

But ... I can't get behind the idea of equating it to child pornography.

It should absolutely be considered a crime, and come with its own set of punishments for those found guilty.

Again, making it absolutely clear that I personally find this act to be vile, unacceptable and highly antisocial, I also think that it should be punished much less severely than producing/distributing .. err ... "actual" child pornography...?

We treat manslaughter and murder as different things, perhaps that's a suitable analogy here?

This also seems similar to the whole issue of deepfaked porn involving celebrities. When folks said "AI is gonna usher in societal problems we have no idea how to deal with", I never imagined it would get this bad, this quickly.


It depends on whether the AI model was trained on CSAM or not, right?

If it was, then crime. If it wasn’t then no child was harmed and in a free thinking liberal society we don’t punish thought crimes.

And if AI models prevent people from committing actual harm to children, then isn’t this actually a win?

Humans and machines must be free to imagine. And as a society we must tolerate all art, even if it depicts something most people find gross. Consider, we have books, movies, and video games depicting killing, even though it’s illegal.


> one person's messed up desires

Philosophically speaking here, but this was said about atypical sexual preferences in the past.

Of course, distributing such images should be illegal. But perhaps generating them solves a problem?


Yeah, I've wondered that too.

I'm a fan of not yucking other people's yum, especially when it stays in the privacy of their own homes and doesn't infringe on the liberty, safety or wellbeing of others.

That said, if we've got folks who just straight up like child pornography (which we do, and always will for as long as we remain a race of homo sapiens, sadly), would the ability for them to consume this kind of generated content actually help? Or would it encourage these people to then go further and prey on real human children?

I simply have no idea. I grew up with violent video games. I've had violent thoughts. But blowing someone's brains out in a game has never motivated me to do it in real life. I think that whole era of moral panic was silly. But human psychology is complicated, this could be very different.


Well... it's certainly interesting to ponder. As a personal anecdote, I had zero parental supervision growing up and I spent an absurd amount of my formative years in the dark corners of the early Internet. During that time I got hooked on pornography, the effects of which I deal with to this day. If I replay the scenario but insert the possibility of stumbling across what amounts to a pedophilia creation machine, I... really don't want to think about it...


Creating pornography featuring the likeness of anyone, child or adult, should automatically be classified as a crime, similar to revenge pornography laws.

Creating child pornography that does not feature the likeness of someone living or dead should be prosecuted under obscenity laws, but not as child abuse, since by definition no children were abused.


> I'll concede that it's not well thought out.

It doesn’t take into account whether there’s a victim or not. In a free thinking liberal society we don’t punish thought crimes because the concept is absurd. It’s what allows us to have diversity of art, literature, thought, etc. Fantasy != reality.

Today we allow all manner of “unspeakable” acts to be portrayed and imagined: war, murder, sexual abuse, speeding, gambling, fraud, you name it we can write, draw, think, and talk about it. There’s nothing fundamentally special about portraying a minor in a lewd way in that sense.

So I think any call to heavily punish people for a new crime should be framed in the context of: who’s the victim and what harm are we preventing? If there is no victim then it’s much harder to build a case that there’s harm.


>who’s the victim

The article discusses 20 victims.


Thisnudechilddoesnotexist.com?



