
Privacy, free speech and misinformation: The Telegram dilemma

By: Ankita Kulkarni

September 6 2024

Telegram app has attracted scrutiny for its lack of content moderation, allowing the spread of conspiracy theories, mis- and disinformation, and reportedly the organization of criminal activities. (Source: Logically Facts)

On August 24, 2024, Telegram’s founder and CEO, Pavel Durov, was arrested in Paris over allegations that his messaging app was being used for illicit activities such as drug trafficking and distribution of child sexual abuse material.

The app has long attracted scrutiny for its lax content moderation, which has allowed the spread of conspiracy theories, mis- and disinformation, and reportedly the organization of criminal activities. So how did the platform become a go-to application for the easy dissemination of misinformation and extremist content?

In the following sections, we take a closer look at what makes Telegram a minefield of misinformation and whether anything can be done to counter it.

Hotbed of misinformation

Following Durov’s arrest, a fake video claiming that the UAE had frozen the purchase of 80 fighter jets from France was widely shared on social media. We traced the claim back to Telegram and found it to be part of a larger campaign producing fake videos bearing the logos of mainstream media outlets to lend them legitimacy.



Screenshot of viral posts claiming that UAE had frozen the purchase of 80 fighter jets from France due to Durov's arrest. (Source: X/Telegram/Modified by Logically Facts)  

The messaging app has also been used to spread anti-Muslim hate speech. For instance, a recent post on an Indian Telegram channel targeted the Muslim minority community: a video showing some boys attempting to unbolt a railway track was shared with the claim that it was from India, accompanied by the term “Rail Jihad.” The video elicited hateful comments from users, but we found that it was unrelated to India and showed an incident reported in Pakistan.


Screenshots of posts circulating false information with religious hate. (Source: Telegram) 

Another gruesome video, of a woman being sexually assaulted, was widely shared on the platform at the peak of the unrest in Bangladesh with the claim that it showed a Hindu woman being assaulted at Dhaka University by Muslim men. However, we found that the video was old and showed an unrelated incident reported in 2021 in Bengaluru, the capital of the south Indian state of Karnataka.

Such claims, which have communal overtones, often perpetuate hate and religious animosity between groups. In the aforementioned case, the clip originated from a Telegram channel called “Islamic Army - latest version” and eventually found its way to other platforms like X, Facebook, and Instagram. 

Screenshot of viral posts claiming that a Hindu woman was sexually assaulted at Dhaka University by Muslim men. (Source: X/Screenshot/Modified by Logically Facts)

Channels with thousands of followers like ‘Katie Hopkins,’ ‘Concerned Citizen,’ and ‘Wide Awake Media,’ known for spreading misinformation on COVID-19 vaccines and climate change denial, also play a significant role in disseminating falsehoods.

“During the COVID-19 pandemic, Telegram almost certainly increased in popularity amongst anti-lockdown and anti-vax online communities, broadly due to this lack of moderation and the removal of key figures from conventional social media platforms. Over time these communities have very much morphed into conspiracy groups, far-right, or a combination therein, a perfect storm for false information,” said JJ Robson, investigations regional lead (U.K.), Logically.

Another investigation by Logically, in 2021, noted that an account named “GhostEzra” was running the largest QAnon Telegram channel, with 300,000 subscribers, and was “spreading increasingly extreme antisemitic propaganda.”

While these are some of the recent examples, in the past, the app has also been used to circulate extremist posts and propaganda. Reportedly, the app was used by the perpetrators of the November 13, 2015, attack in Paris, which left over 100 people dead. It was also used by the Islamic terror group ISIS to recruit people for the 2016 Berlin Christmas market attack and the 2017 Istanbul nightclub attack. In 2016, CNN reported that two French jihadists who killed a priest first met and coordinated their attack on Telegram.

BBC News reported that Russian intelligence also linked the app to the 2017 St. Petersburg metro explosion and the 2019 London Bridge attack, for which ISIS claimed responsibility. Most recently, the Kremlin stated that the gunmen who attacked a Moscow concert hall in March 2024 were also recruited via Telegram.

In India, Telegram has also been implicated in various cases of financial fraud — including the use of AI to create deepfakes — and exam paper leaks. The Securities and Exchange Board of India (SEBI) investigated a stock-price rigging racket on the platform, and major medical exams like NEET-UG were canceled due to a paper leak that was circulated on the app.

The perfect cover?

The application offers a “secret chat” option that allows only the conversation’s participants to read messages. These chats are end-to-end encrypted and include an optional “self-destruct timer” that deletes all messages and media from both the sender’s and the receiver’s devices.

“All secret chats in Telegram are device-specific and are not part of the Telegram cloud. This means you can only access messages in a secret chat from their device of origin. They are safe for as long as your device is safe in your pocket,” Telegram states on its website. 

Individuals can also anonymously join Telegram groups and channels by hiding their phone numbers and personal details. The application allows users to hide their phone numbers, and a note below the option says, “Users who add your number to their contacts will see it on Telegram if they are in your contacts.”

Stressing the need to ensure the highest security standards for users’ protection, public interest technologist Anivar Aravind opined, “I don't see an issue with strong security features like end-to-end encryption.  However, platforms that do not offer default end-to-end encryption, like Telegram, expose users to potential privacy risks. It's essential that platforms ensure the highest standards of security without shifting the burden to users through invasive background checks. Governments should focus on creating policies that protect individual rights rather than mandating mass surveillance. Social media platforms must comply with national laws without compromising users' privacy, but encryption should be the norm, not the exception."

The app also allows people to form groups “for up to 200,000 people or channels for broadcasting to unlimited audiences,” making it easier to send information with just a tap. This is unlike other platforms, such as WhatsApp, which allows a maximum of 1,024 members. 

These features could benefit users living under authoritarian governments, helping them express themselves and organize, but they have also been exploited by those who regularly disseminate misinformation, extremist content, and hate speech.

Amber Sinha, an information fellow at Tech Policy Press, said, “Telegram isn’t just used for exchanging messages. It's a tool for disseminating information and mobilizing people. However, there has been little commitment to content moderation, and Telegram has done very little to address or remove misinformation on its platform. There is no self-regulation that we see from Telegram.”

Sinha further noted, “There have been proposals to break encryption for traceability, but that is not a good idea as it could lead to totalitarian control and compromise free speech. If law enforcement wants access for an investigation, there is a lot of other metadata that they have access to which they can employ.”

In a recent statement following his arrest, Durov emphasized the need to balance privacy and security, saying the platform wants to ensure it is not “abused in countries with [a] weak rule of law.” The statement read, “Our experience is shaped by our mission to protect our users in authoritarian regimes. But we’ve always been open to dialogue. Sometimes we can’t agree with a country’s regulator on the right balance between privacy and security. In those cases, we are ready to leave that country.”

In the statement, Durov acknowledged that the platform has flaws but said it moderates content, publishes transparency reports, and communicates with NGOs on urgent issues. Referring to the app’s growth to a user base of 950 million, he said it has presented challenges “that made it easier for criminals to abuse our platform” but promised action.

The content moderation challenge

Telegram's rules on moderating mis- and disinformation shared on its platform are unclear. Users can report content through the app, but the reporting mechanism varies slightly across devices and content categories, a December 2022 report by the EU Disinformation Lab found.

On takedowns, the policy states, “Please note that this does not apply to local restrictions on freedom of speech. For example, if criticizing the government is illegal in some country, Telegram won't be a part of such politically motivated censorship. This goes against our founders' principles. While we do block terrorist (e.g. ISIS-related) bots and channels, we will not block anybody who peacefully expresses alternative opinions.”

Commenting on the need for content moderation, Sinha said, “I don't think Telegram has done much to counter misinformation. We've seen, in some cases, Telegram working with certain governments. But again, that has been limited to autocratic governments like Russia. I don't think it puts a lot of friction on it. So far, it has not been very receptive to discussions about the need for greater content moderation controls.”

Following terrorist attacks in Paris, Berlin, the U.K., and Istanbul, Telegram took down multiple ISIS-related public channels and agreed to set up a group of moderators. The platform also runs a channel called ISIS Watch, which posts daily updates on the number of groups and channels it has taken down.

Screenshots of the Telegram channel ISIS Watch. (Source: Telegram) 

Sinha underscored the need for an international consensus on content regulation that goes beyond government-provided data models.  “At the core, we require platforms to demonstrate a clear commitment to their policies and provide transparency about how they enforce them. Right now, all of this happens behind the black box, and we don't have access to how it's happening, so we can't audit their decisions.”  

“Platforms need to establish clear standards and be more transparent about their enforcement. I think some cases are kind of complex; the content might not qualify as hate speech but can still be discriminatory or offensive toward certain religions. When such content starts going viral, platforms need to assess and limit its reach to prevent harm,” he added. 

Sinha suggested adopting features implemented by other messaging apps, such as WhatsApp's labeling of messages that have been forwarded multiple times. Emphasizing the need for better reporting mechanisms to flag misinformation directly on the platform, Sinha said, “Telegram could encourage users to exercise caution when sharing politically sensitive or potentially harmful content by providing automated feedback based on on-device analysis, ensuring that users verify the information they are about to share without compromising their privacy. So those are, I think, some automated AI solutions that can be facilitated on any platform.”

(Editor’s Note: We have reached out to Telegram, and the story will be updated if and when we receive a response.)

(Disclaimer: Founded in 2017, Logically is Logically Facts’ parent company. It combines artificial intelligence with expert analysts to tackle harmful and manipulative content at speed and scale.)

(Edited by Sanyukta Dharmadhikari)
