Editorial

10 years after The Fappening, literally everyone can recreate it. And it's as dark as it gets

[Image: an artificial woman in a red dress]

It's the end of summer 2014. Everything is nice and sweet, Lumia smartphones are still a thing (kinda), and Microsoft is just a few weeks away from officially announcing a big Windows project codenamed Threshold.

Then, an internet bomb drops. Hundreds of private, often nude, photos of celebrities – Jennifer Lawrence, Rihanna, Kim Kardashian, and Kate Upton, to name just a few – flood the pages of 4chan and Reddit. Over the following weeks, more batches of spicy material appear.

To this day, the event is most commonly known in internet culture as "The Fappening."

All the private content was stolen from the victims' iCloud accounts. Apple denied allegations of a security breach, saying the accounts were accessed via a spear-phishing campaign – a targeted operation designed to lure login credentials out of specific celebrities. In the meantime, Google was busy deleting all the naughtiness from its search results.

Over the next four years, several men associated with The Fappening pleaded guilty to unauthorized access to a protected computer, with sentences ranging from nine to 34 months.

Celebrities learned their lesson about online privacy the hard way (and, hopefully, bystanders learned from their mistakes, too).

[Image: Google search results for deepfake pornography]
Recently, Google was criticized for showing AI-generated pornographic content in its search. Bing and DuckDuckGo were blamed for doing the same.

Fast-forward to 2024, and Bing, alongside Google and DuckDuckGo, is criticized for showing websites with fake celebrity nudes in search.

No, it's not the original Fappening, nor is it another major leak. It's because, in 2024, you don't have to wait until some celebrity gets caught up in a privacy breach. In 2024, you can simply create your very own Fappening.

As weird as it sounds, this is the reality of a new digital era dominated by generative artificial intelligence and the creativity – and often the maliciousness – of its users. An era in which you have to count the fingers on a person's hand before you let a picture on social media impress you.

If somebody wants to see their favorite celebrity naked today, they can simply ask AI and get a satisfactory (no pun intended) result. All of that privately, in the comfort of their home, without anyone noticing and without worrying about legal consequences.

Or, even easier, they can visit one of the dedicated websites sharing exactly this kind of video, with celebrities' faces put onto their "body doubles."

The incredible capabilities of generative AI tools – which will only improve as time goes by – along with their broad availability and ease of use, have opened a Pandora's box of their own.

No, I don't want to scare you with some world-threatening artificial general intelligence (AGI) predictions. People can already do, today, much of the harm that AGI is predicted to cause. Generative AI is just one more powerful tool in their hands for doing exactly that.

Man, I'm tired of talking about the downsides of AI all the time. And I can imagine you're tired of reading about them every other day. But it is what it is. Artificial intelligence has a huge number of positive applications that might improve our everyday lives. Still, it brings just as many negatives, and so far, it looks like we are not ready to tackle many of them.

After all, that's the perennial problem with cutting-edge technologies: most of the issues are addressed only after the fact. In this particular case, the ongoing race for technological superiority is not helping.

[Image: the open letter asking to pause AI training for six months]

But let's get back to AI and its naughty users. Swapping the faces of an adult film star and a celebrity is fairly easy. Some of you might think it hurts no one (the celebrities would probably disagree). But the very same technology can be used for genuinely cruel stuff.

Have you ever heard of revenge porn? I'm sure many, if not most, of you have. A quick example as a summary: a couple breaks up, the guy is really not over it, and he posts his ex-girlfriend's or ex-wife's private photos online to humiliate her.

Let's assume those are just a few photographs that were meant only for the eyes of her loved one. Now imagine she finds online a full-length video of herself doing it with a bunch of strangers – an event that never occurred, yet the person at the center of the action looks convincingly like her.

You don't want to experience something like that. But many have, or will…


This whole topic goes beyond porn and toxic relationships, though. The malicious uses of AI might even have society-wide implications, for example, when used in a political race (2024 presidential election, I'm talking about you!).

It's easy to attack your opponent and accuse them of something discrediting. And it's much easier to do so when you can back up your claims with a convincing, anonymously shared video of your opponent doing or saying exactly that. But we could go on like this all night, right…

To summarize – yes, times have changed, and technology has evolved significantly since 2014. I would love to offer you a silver-bullet solution, but I'm not even going to pretend that I have one. We are just observers; we can't really do much about it.

However, I am not implying that artificial intelligence is bad. Not at all. In fact, widely available and broadly applicable artificial intelligence might be one of the greatest technological advances of this century. But every great technology comes with downsides and opportunities for misuse.

So, what we can do is stay aware of the situation as it is right now and keep a sense of what might come in the future. We are still only in the innocent phase of AI, where people are just learning about its potential and what everyday users can do with it – good or bad. And no matter how many safeguards tech companies put in place and how much governments regulate the use of AI, there will be a lot of bad anyway. A lot…
