Brazil vs. X and France vs. Telegram... vs. users


Pavel Durov is not the world’s most sympathetic detainee. He is a billionaire tech founder whose messaging app, Telegram, has been a safe harbor for school shooters and neo-Nazis. Under his direction, Telegram has operated like a free-for-all, allowing all manner of content to flow through its servers, all in the name of user privacy. “Our right for privacy is more important than our fear of bad things happening, like terrorism,” Durov has said.

French authorities took a different view. Last month they arrested Durov, charging him with criminal offenses and barring him from leaving the country. The charges are part of an investigation into Telegram’s alleged role in drug trafficking, child exploitation, and other illegal activities, as well as the company’s failure to comply with requests from law enforcement. In a statement, Telegram wrote, “It is absurd to claim that a platform or its owner are responsible for abuse of that platform.”

Another absurdity surrounding Telegram is the widespread misunderstanding that it is an encrypted messaging service. Unlike competitors such as WhatsApp, iMessage, and Signal, which apply end-to-end encryption by default, Telegram’s encryption is neither comprehensive nor trustworthy. End-to-end encryption is available only for one-on-one chats, and even then users must enable it manually. Moreover, most of Telegram’s activity happens in group chats, where encryption is not even an option. And the meager encryption Telegram does offer has never been subjected to a third-party audit to verify that it truly is confidential, privacy-respecting, and secure.

These massive shortfalls in security and user privacy put human rights defenders, activists, journalists, and other marginalized groups at a disadvantage. Where civil society groups have failed to persuade Durov and Musk, they have triggered real consequences in the form of regulatory crackdowns. France’s drastic actions against Telegram come after the prolonged, carefully crafted regulation of content moderation practices under the EU’s Digital Services Act. The DSA is a landmark approach to platform governance: it is designed to increase accountability and transparency for online platforms, ensuring safer digital spaces by holding companies responsible for illegal content and protecting users’ rights.

Accountability also looms large in the recent news that, for privacy and political reasons, Brazil blocked X, blocked all VPNs in case Brazilians tried to reach X anyway, and, for good measure, imposed financial sanctions on Elon Musk’s other company, Starlink. Musk is not a sympathetic figure either, and Brazilian regulators justified the country-wide block of the platform by saying that, at Musk’s direction, X failed to comply with local content moderation laws and flagrantly avoided accountability by closing its office there. That VPNs were also blocked highlights the collateral damage done when the regulation of speech is counterproductive to privacy and, well, to speech itself.

In Durov’s case, the encryption Telegram offers, flimsy as it is, may have played a role in the extreme action taken. French authorities have charged him with two offenses related to the unlicensed use of cryptology tools, citing an obscure French law. This suggests that France may be leveraging an old law in a new context to exert pressure on Telegram. While most countries have relaxed import/export controls on encryption because of its importance for security and human rights, arresting Durov because Telegram did not meet its declaration obligation to the French cybersecurity agency ANSSI may be stretching a technicality. We have seen growing hostility to end-to-end encryption worldwide; under Hungary’s EU Presidency, the #chatcontrol proposal would backdoor WhatsApp, Signal, iMessage, and other encrypted messaging apps.

The global context of Durov’s arrest points to a growing discord between national security, privacy, and corporate responsibility. International laws and regulations, as well as those in hostile nations, have tech CEOs figuratively and literally circumnavigating the globe.

But Europe is not a hostile nation. Durov’s arrest over Telegram’s content moderation practices spectacularly undermines the Digital Services Act, highlighting an ever-increasing threat of legal trouble on top of the efforts platforms must already make to meet regulatory standards.

This does not mean that X and Telegram should be immune from accountability. Both have ignored the concerns of human rights organizations about their inadequate content moderation and security. For Telegram in particular, this comes against the backdrop of compliance with government requests from authoritarian regimes such as Iran, Saudi Arabia, and Russia. If Telegram took its duty of care as a social media platform seriously, it would take stronger action against harms. Indeed, when faced with a two-day ban in Brazil, Telegram was at last forced to change its content moderation practices around disinformation.

The arrest comes at a time when Europe grapples with its own contradictions—championing privacy on one hand, while increasing demands for access to encrypted communications on the other. This raises troubling implications for any platform that prioritizes privacy and implements end-to-end encryption. Imposing legal liability on platform intermediaries for all activities occurring on them incentivizes intrusive surveillance. Some platforms choose to limit their access to user data through end-to-end encryption and metadata minimization, strategies that shield both the platform and its users from legal risks. 

Despite this, it seems unlikely that other CEOs will be arrested for implementing end-to-end encryption. Both X’s and Telegram’s problems stem from the next-to-nil, inconsistently applied moderation of widespread criminal and terrorist activity, a model that places the burden on user reporting. Both have also made a “brand” of refusing to cooperate with authorities.

But the actions of the French government should come under scrutiny as well: human rights groups have said that archaic cryptography laws can be used as a cudgel for political ends, putting privacy, free expression, and the rights to associate and protest at risk.

The United Nations Office of the High Commissioner for Human Rights addressed the case in a press briefing yesterday, in what seems to be an effort to move it forward productively, not for Durov and Telegram, but for a human rights community split between those who want CEOs to comply with local norms and those worried about surveillance and censorship. Ravina Shamdasani, OHCHR spokesperson, said that her office might work to set out “the parameters within which these situations should be looked at.”

Ultimately, the ongoing legal issues faced by X and Telegram illustrate the need to move away from “one size fits all” megaplatforms. Telegram, in particular, must both implement robust encryption and expand its content moderation practices beyond inadequate user reporting, which could reduce its exposure to legal challenges while protecting user privacy. These issues also underscore the importance of balancing abuse mitigation with the preservation of free expression and privacy, acknowledging that different contexts have their own definitions of acceptable conduct.

While the cases have yet to find clarity, what is apparent is that users are caught between unaccountable corporations and overreaching governments. What we need is a public reckoning of how well the largest companies are balancing the demands of governments with the rights of their users, and of whether governments are being careful enough not to sacrifice human rights and democratic ideals in their pursuit of the upper hand.
