Pavel Durov’s legal battle with French prosecutors took the Telegram founder by surprise. But some of the app’s avid users have long warned him of the need for less-freewheeling management.
Among the warnings: More than two dozen organizations wrote a letter to Durov in late 2021, asking him to implement transparent content rules and policies, develop ways for people to communicate with his company and add an appeals process for frustrated parties. Durov didn’t respond, the organizations said.
The organizations’ users, focused primarily on human rights, journalism advocacy and internet freedom, had embraced Telegram as a tool to organize protests and share information, but they were frustrated. In the letter, they cited Telegram’s “opaque and arbitrary decisions” around human-rights compliance and lack of responsiveness to reports of abuses on its app, as well as a lack of many basic security functions.

Now Durov is defending himself against charges that he was complicit in distributing child pornography, illegal drugs and hacking software, following his arrest late last month. French authorities also charged him with refusing to cooperate with European authorities to address illegal content.
“As Telegram grew, you can justify less and less the absence of meaningful processes and policies,” said Natalia Krapiva, senior tech legal counsel at Access Now, a group that advocates for human rights on the internet and that spearheaded the 2021 letter.
Durov this month rejected what he called efforts to portray Telegram as “some sort of anarchic paradise,” and said it removes millions of harmful posts and channels daily.
A Telegram spokesman said strengthening its moderation capabilities is the company’s main priority for 2024. “There is plenty of work to do,” he said.
As Telegram’s popularity increased, its user count rose above 950 million. The platform also went from a place for dissidents, journalists and activists to a home for terrorist groups and other extremists, including Islamic State, al Qaeda and neo-Nazis. The criminal groups were drawn to many of the same features that had excited Telegram’s mainstream users, including massive groups and controls on who can see a user’s phone number.
“Telegram serves as ground zero for a lot of the worst things you see on the internet,” said Isabelle Frances-Wright, director of technology and society at the think tank Institute for Strategic Dialogue.
In the 2021 letter, the organizations, which included Access Now and the Committee to Protect Journalists, referred to cases where users coordinated on Telegram to plan the assassinations of activists in Iraq, including with bounties of $1,000 in exchange for information on the location of the activists and their families.
More broadly, researchers often trace violent and criminal content on other platforms such as X to Telegram groups where the content originated. In these Telegram groups, Frances-Wright said, she has seen kill lists of notable Jewish people in the U.S. and Islamic State support groups that distribute how-to manuals for making explosives.
While all platforms struggle with nefarious actors and hateful content, she said the scale on Telegram is on a different level: “It’s the ease with which you can find this content, and the sheer volume of it on Telegram.”
The Telegram spokesman said the app’s moderators remove content that encourages violence, such as target lists and instructions for making weapons, and added that the app doesn’t tolerate groups that exist to promote violence.
Telegram has made efforts through the years to combat some of the most dangerous content on its platform. The company has said it developed artificial-intelligence tools and hired teams of moderators in addition to forbidding certain types of content.
Researchers, though, continue to find Telegram hosting dangerous content related to terrorism, child sexual-abuse material, drugs and hacking.
Durov ignored warnings and requests from governments about problematic content on the platform, unless they threatened the platform’s future.
In 2017, when Indonesia’s Ministry of Communication and IT blocked Telegram, Durov said his company would remove all terrorist-related content reported by the Indonesian government. In 2019, a unit of European Union police agency Europol said Telegram was collaborating with it to address terrorist abuse of the platform.
In 2022, when Brazil blocked Telegram because the company hadn’t responded to the country’s requests to remove certain content, Durov initiated conversations with the country. During the Covid-19 pandemic, Durov said his platform was working with various health departments to improve information on its platform.
Following French authorities’ move to charge Durov, Telegram updated its website to add that users can report illegal content with a button that forwards messages to moderators.
The changes appear to contradict other policies on Telegram’s website. “All Telegram chats and group chats are private amongst their participants,” Telegram’s website says. “We do not process any requests related to them.”
Another criticism is that Telegram doesn’t use “end-to-end” encryption on all of its chats by default, something the groups had asked for in their 2021 letter. Such technology prevents people other than the sender and receiver from having access to the content of the message.
Users often leave the end-to-end encryption features turned off, assuming that all messaging on the app is already end-to-end encrypted, according to John Scott-Railton, senior researcher at Citizen Lab, a research group at the University of Toronto that investigates cyberattacks on journalists and dissidents.
“This misunderstanding is extremely widespread,” Scott-Railton said.
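The end-to-end guarantee described above can be illustrated with a short sketch. This is a deliberately simplified toy in Python’s standard library, not Telegram’s actual protocol (which the article does not detail) and not a secure scheme for real use; real messengers rely on vetted protocols. The point it shows is structural: when only sender and receiver hold the key, a server relaying the message sees ciphertext it cannot read.

```python
# Toy sketch of the end-to-end principle: the relaying server never
# holds the key, so it can forward messages but not read them.
# NOT a real or secure cipher -- purely illustrative.
import hashlib
import os


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the shared key and a per-message nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    """XOR the plaintext with the keystream; return (nonce, ciphertext)."""
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct


def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Reverse the XOR using the same key and nonce."""
    return bytes(c ^ k for c, k in zip(ciphertext, keystream(key, nonce, len(ciphertext))))


shared_key = os.urandom(32)  # known only to sender and receiver
nonce, ct = encrypt(shared_key, b"meet at noon")
# The server relays only (nonce, ct); without shared_key it learns nothing
# about the text. The receiver, holding the key, recovers it:
plaintext = decrypt(shared_key, nonce, ct)
```

Non-end-to-end encrypted chats differ in exactly this respect: the service operator holds (or can derive) the decryption key, so the content is readable server-side, which is what the misunderstanding Scott-Railton describes obscures.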
Write to Georgia Wells at georgia.wells@wsj.com