
Deepfake deluge enrages South Korea - but anonymity challenges prosecution

By: Sophie Perryer

September 4, 2024

The Korea Communications Standards Commission holds an urgent meeting on digital sex crimes after news of a deepfake image scandal broke on August 28, 2024. (Source: Reuters)

Within hours of going live on August 26, a crowdsourced map of South Korea had already been populated with red pins, painting the country in a sea of scarlet. Each pin ostensibly represented the location of a school whose pupils or staff were victims of deepfake pornographic image generation. 

The map was a representation of a scandal that has gripped South Korea in recent days, as anonymous users on X, formerly Twitter, shared screenshots of Telegram groups where these deepfake images and misogynistic messages were allegedly being shared. Some accounts, many with bios describing themselves as "Korean feminists," sought to amplify the allegations to a broader audience by superimposing English translations beside the original Korean characters in the screenshots.

Those amplification efforts have engendered a widespread response; numerous local and global media outlets have covered the story, South Korea's national police launched an investigation into the images being shared, and the country's President Yoon Suk-Yeol called an emergency cabinet meeting to discuss the scourge of digital sex crimes. 

However, the anonymity of Telegram users and those sharing screenshots of the groups makes verification complex — and could undermine efforts to bring perpetrators to justice. 

How were these groups discovered?

On August 24, an anonymous X user shared a post that stated, "Breaking: Another Nth Room Crime in Korea," accompanied by two siren emojis. Nth room refers to a 2019 digital sex crimes scandal in South Korea where a user nicknamed "god god" was found to be engaging in cybersex trafficking through Telegram groups.

The X post, which has been viewed more than 18 million times as of September 4, went on to detail allegations of Korean men creating Telegram chatrooms and sharing images of their female family members. These were then reportedly composited with AI-generated images of nude bodies, creating deepfakes of hundreds of women and girls. According to reporting by South Korean media, one of the channels creating deepfake images on demand has more than 220,000 subscribers.

What about the map?

An anonymous user shared a link to the deepfake map on X on August 26, with a caption in Korean explaining it had been created to collate a list of all the schools affected by the deepfake scandal. This followed other online claims that the deepfake pornographic images made through Telegram were generated using publicly available photographs of South Korean students at elementary, middle, and high schools. If true, that would mean some of the girls whose images were repurposed were as young as eight.

A screenshot of the map with red pins showing locations of schools whose pupils and staff had allegedly been targeted by deepfake pornographic imagery. (Screenshot/Logically Facts)

Given the crowdsourced nature of the map and the anonymity of users contributing to it, its authenticity cannot be verified. However, the Ministry of Education said it has so far received reports of deepfake sex crimes at 196 schools across the country, 179 of which have been referred to police for further investigation.

At the time of publication, the map appeared to have been removed, and its web address returned an error page.

The Telegram accounts

Using screenshots of the accounts reshared on X, Logically Facts was able to locate and access two Telegram accounts that appear to function as deepfake image generators.

One of the accounts has a profile image of a nude woman's torso with what appears to be an Instagram logo superimposed on top. The account's bio reads, "Send the photo to me, let me do that for you," followed by a winking emoji. 

The way the account presents itself leads us to believe it would generate deepfake pornographic imagery in the manner described by users on social media. We elected not to test it due to safety and privacy concerns over how a test image could be reused.

A screenshot of a Telegram account that appears to generate deepfake nude images. (Screenshot/Logically Facts)

The scale at which these deepfake nude images are circulating in Telegram groups is challenging to assess. It's not clear how many pupils or teachers were allegedly affected at each of the 179 schools referred to police, and it's near-impossible to assess how many women and girls may have had their images taken without their consent or lifted from public social media profiles to generate such images.

South Korean news agency Yonhap reported it had located one Telegram group distributing these deepfake nude images with 3,500 members. However, Logically Facts has not been able to verify this independently.

It's also unclear to what extent any of the deepfake images had been shared on other platforms outside of Telegram. Logically Facts previously identified at least one X account that appeared to be sharing the sort of deepfake pornographic images which were circulating on Telegram, depicting the faces of women likely of South Korean origin with AI-generated nude bodies. However, X has deactivated this account.

A screenshot of an X account that had been sharing deepfake pornographic images but has since been suspended. (Screenshot/Logically Facts)

The challenges of verification

The double layer of anonymity, covering both the Telegram users sharing deepfake images and the X accounts posting screenshots of them, has complicated efforts by local media and authorities to investigate these allegations.

Many of the screenshots being shared on X do not show the names of the Telegram groups in which the messages allegedly appeared, and even if they did, closed groups require a moderator's approval before their contents can be viewed. It's also unclear whether the anonymous X accounts sharing the claims belong to users personally affected by deepfake images or whether they are resharing the stories second- or third-hand.

Raphael Rashid is a freelance journalist in Seoul who has been covering this story. "Not to diminish what's happening, but there's so many secondary, third and fourth-factor sourcing going on," he told Logically Facts. "I do think a lot of these Telegram rooms have been grabbing pictures and victimizing people — [but] we don't know how many, we don't know what's real and what's not, or who is the victim."

Telegram has even less user accountability than X and has been criticized repeatedly for allowing users to create anonymous accounts. These can be set up on a device without a SIM card, using a blockchain-based anonymous number, effectively impeding any investigation if those accounts subsequently engage in criminal behavior.

Logically Facts contacted Telegram to ask whether it was aware of these groups' presence and what action it had taken against them. 

A spokesperson told us, "Telegram actively combats harmful content on its platform, including illegal pornography. Telegram's moderators proactively monitor public parts of the app, use AI tools and accept user reports in order to remove millions of pieces of content each day that breach Telegram's terms of service."

Social context

South Korea has long contended with online sexual abuse, the 2019 "Nth room" case being just one example.

In 2021, Human Rights Watch released a report describing the country as a global leader in the use of spycams, or "molka," which are often planted in public bathrooms, hotels, or changing rooms to film women in compromising moments. The images are frequently published online without the victims' consent.

South Korean police say deepfake sex crimes have surged in recent years, with 297 cases reported in the first seven months of 2024, compared to 180 in the whole of 2023. Teenagers were responsible for more than 75 percent of deepfake video and image crimes in 2023, according to police data cited by the Korea Times.

Prosecuting these crimes is complex, partly because of the anonymity of the accounts sharing the images, but also because of legislative limitations.

Making and distributing pornographic deepfake images carries a potential jail term of up to five years and a fine of up to 50 million won ($37,500), but no law currently exists to punish those who download or pay for the deepfake images. Children between the ages of 10 and 14 also cannot be held criminally responsible under South Korean law, a complicating factor in this case given the proportion of teenagers involved in creating and distributing explicit deepfake imagery.

South Korean authorities are now pushing for the maximum jail term for making and sharing the images to be increased to seven years. The government is also appealing to Telegram to establish a "hotline" to increase collaboration between its moderators and law enforcement. It's hoped that a multilateral approach will bring an end to the networks distributing these deepfake images, which have so far evaded accountability.

Speaking at a televised cabinet meeting as the scale of the scandal emerged, South Korea's President Yoon set out his government's stance on the issue. "It's an exploitation of technology while relying on the protection of anonymity," he said. "It's a clear criminal act."
