On January 27, 2025, a new wave of AI-generated fan edits took social media by storm, sparking discussion among K-pop fans and raising ethical concerns about digital privacy and artist protection.
Unlike traditional deepfakes, which involve superimposing a celebrity’s face onto another person’s body, these edits manipulate still images to create eerily realistic scenarios—showing idols hugging, kissing, or intimately interacting with fans.
The AI technology behind these edits is becoming increasingly sophisticated, making it difficult to distinguish between real and fake images. Similar deepfake edits of K-pop idols like BTS' Jungkook, TXT's Yeonjun, and BLACKPINK's Jennie have gone viral online since 2024.
Some agencies, like HYBE and JYP Entertainment, have previously taken legal action against deepfake content, malicious edits, and AI-generated material misrepresenting their artists. However, current policies may not be comprehensive enough to address this new wave of AI-powered fan edits.
Many K-pop enthusiasts have criticized this emerging trend, describing it as "horrifying" and "dystopian." One fan called a deepfake edit of TXT's Yeonjun, which showed him kissing a fan, "horrifying," writing on Reddit:
"I feel bad for Yeonjun. Finding a fake realistic version of yourself doing something, no matter how tame, seems horrifying and dystopian. In the case of this video specifically, some fantasies need to stay in your head. It’s better for all of us, especially the idol themselves."
In response to the growing controversy, many K-pop fans started raising awareness about the responsible use of AI. Some fan communities on Reddit took it upon themselves to report inappropriate edits and discourage their circulation.
"It's incredibly dangerous to the person on personal lvl n to the environment - one could argue AI industry will turbocharged climate change, to a point of no return," a Reddit user wrote.
"I just pity the people who need to do these sorts of things just to make them happy or entertained. Admiration should not equate to delusion. Can't people love their faves without feeling the need to possess them?" another netizen remarked.
"Fanfiction gonna get real crazy now lmao. On a more serious note, AI only going to keep getting better. I expect governments worldwide to put in some huge restrictions," another netizen stressed.
Others criticized online users who employ deepfake or AI technology to create such "edits," calling them invasive of K-pop artists' privacy.
"I honestly enjoy fanfiction but these kind of video depictions make me sick. Bc it’s not going to be so obvious and fake. This one looks fake it would fool many and in another couple months, ai is just going to look better and better. It’s gross how it will be misused. And the govts will do nothing," a netizen wrote.
"AI is harmful in many ways. claiming to 'love' someone and violating them using AI is crazy work," another netizen wrote on X.
"Istg i saw that earlier and sat there in shock," another netizen added.
South Korea's rampant deepfake and AI explicit edits target K-pop idols and actresses
Since August 2024, South Korea has been grappling with a surge in digital s*x crimes, particularly the proliferation of deepfake p*rnography. In response, President Yoon Suk-yeol called for intensified efforts to "eradicate" these offenses, emphasizing the situation's urgency.
Deepfakes are digitally manipulated media that use artificial intelligence to create hyper-realistic but fabricated content. In the context of p*rnography, this technology superimposes the faces of individuals onto explicit images or videos without their consent.
This invasive practice has become alarmingly prevalent in South Korea, with reports indicating that many victims and perpetrators are minors.
According to the BBC, the issue gained significant attention after unverified lists of schools with potential victims circulated online in August 2024. This led to widespread fear, prompting many young women to remove personal photos from social media platforms.
AP News reported in September 2024 that in response, President Yoon ordered officials to "root out these digital s*xual crimes," initiating a seven-month special police crackdown scheduled to continue until March 2025.
In October 2024, PBS News reported that the South Korean entertainment industry, particularly K-pop idols, was notably affected by deepfake p*rnography. A report by the U.S. cybersecurity firm Security Hero highlighted that South Korean idols and actresses constitute more than half of the individuals featured in deepfake p*rn worldwide.
In September 2024, the Korea Herald reported that major K-pop agencies, such as YG Entertainment, took legal action against the creators and distributors of such content. Over 200 K-pop idols and actresses were reportedly featured in the illegal deepfake edits.
On September 2, 2024, YG Entertainment announced its commitment to actively monitor and delete illegal videos, emphasizing that it is "taking all possible legal actions" to protect its artists.
“We are actively monitoring this widespread and malicious illegal activity, and making efforts to delete such illegal videos. We are also taking all possible legal actions. We will continue to respond vigorously and strictly to all illegal activities that seriously harm the reputation of our artists,” YG Entertainment said.
The South Korean government is also working with lawmakers to strengthen legislation against deepfake-related crimes. Recent legal revisions have made it illegal to watch or possess deepfake p*rnography, with violators facing up to three years in prison or fines up to 30 million won (approximately $22,870).
Additionally, the maximum sentence for producing and distributing such content has been increased from five to seven years.