The suit amounts to a pressure tactic to get Apple to act against Telegram as it already has against Parler, a social media site that swelled with calls for violence and insurrection ahead of the Capitol siege, according to researchers. Apple and Google have both booted Parler from their app stores over its lax moderation policies, and Amazon Web Services stopped hosting the site on the same grounds last week, knocking it offline. Telegram offers both closed, private chat rooms and public groups that anyone with the app can join.
Filing the suit was the Coalition for a Safer Web, a nonpartisan group that advocates for technologies and policies to remove extremist content from social media, and the coalition’s president, Marc Ginsberg, a former U.S. ambassador to Morocco. They complained about Telegram’s role in hosting white supremacist, neo-Nazi and other hateful content, and argued in the lawsuit that such content puts Telegram in violation of Apple’s terms of service for its app store.
A similar suit is planned against Google, said the coalition’s lawyer, Keith Altman.
“Telegram stands out by itself as the superspreader [of hateful speech], even compared to Parler,” Ginsberg said in an interview.
Ginsberg, who is Jewish, asserts in the suit that Telegram’s anti-Semitic content puts him in peril. He argues that his ownership of an iPhone gives him standing to sue Apple in federal court to require that the company enforce its terms of service, which bar hate speech and incitement to violence on apps carried in the App Store.
The suit, filed in U.S. District Court for the Northern District of California, alleges negligent infliction of emotional distress and violation of the California business code, and seeks unspecified compensatory damages and an injunction requiring Apple to remove Telegram from its app store.
Apple spokesman Fred Sainz and Telegram spokesman Mike Ravdonikas did not immediately respond to requests for comment.
The siege of the Capitol was widely discussed and instigated on social media and messaging apps, including Parler and Telegram. Supporters of President Trump also celebrated the attack as it happened and called for more violence in the days leading up to Wednesday’s inauguration. Suing Apple gives the coalition a way to seek action against Telegram, which, as a service based overseas, may be difficult to reach through U.S. courts.
Telegram, which says it operates from Dubai, was developed by Russian internet entrepreneur Pavel Durov. The app is popular with people who want to keep their communications shielded from autocratic regimes and with others seeking online privacy. Durov himself has clashed with the Russian government over censorship and encryption.
But Telegram also has a reputation as the go-to app for terrorist and hate groups. For years, it was used by Islamic State militants to communicate and spread propaganda, until European police worked with Telegram in 2019 to take down accounts associated with the group.
The company has resisted calls to do the same for right-wing accounts that post racist and anti-Semitic messages. Telegram took down some prominent public groups that were calling for violence, but many other feeds remain active on the service. Some police officials have also said the migration from Parler to Telegram has made it harder for them to monitor extremists and prepare for potential attacks.
The prospects for the suit’s success are uncertain. Section 230 of the Communications Decency Act gives online platforms broad immunity from responsibility for most of the content that they host.
Daphne Keller, who researches platform regulation at Stanford Law School, called the lawsuit a long shot, but an interesting one. She said it bears similarity to suits that seek to force platforms to reverse their decisions to remove apps and social media accounts. “This flips the script,” she said, by seeking to force Apple to remove an app.
But the lawsuit may face insurmountable barriers because Apple’s terms of service are broad, giving the company leeway on how it treats apps, she said. Aside from Section 230, Apple’s decision to keep Telegram on its platform is protected by the company’s right to free speech.
Apple doesn’t require apps such as Telegram to rid their services of inappropriate content. Rather, they must have a “method for filtering” it out and a way for users to report it. They also must provide contact information and be able to block “abusive” users from the service.
But Apple is vague about the methods of content moderation it requires. In the past, Apple has let apps stay in the store even when customers complained about their content. In 2019, when The Washington Post found reports of unwanted sexual content, racism and bullying on chat apps — some of them used by children — Apple allowed the apps to remain on the store because, it said, they used some content moderation and other safeguards.
When Apple removed Parler from the App Store, it said in a letter that was provided to The Post that Parler’s content moderation policies were not good enough. “While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues,” Apple told Parler.
Google also removed Parler from its app store, saying it had warned Parler about its lax moderation before making the decision to suspend it.
Ginsberg sent a letter to Apple chief executive Tim Cook in July calling on him to address white-nationalist, anti-Semitic and violent speech on Telegram and to “hold TELEGRAM’s financial feet to the fire.”
He wrote, “Because of the increasing prevalence of Russian and Eastern-European anti-Semitic extremist neo-Nazi groups utilizing TELEGRAM, CSW commenced an in-depth research investigation earlier this year into its role. Our research revealed serious instances whereby TELEGRAM’s end-to-end encryption service was enabling.”
Ginsberg said he got no reply to the letter.