Federal officials warned law enforcement agencies this spring that domestic extremists had used TikTok in the lead-up to the Jan. 6 riots on the Capitol, including by promoting bringing guns to Washington that day, according to an internal government document — highlighting authorities’ growing concern over violent content on the video app.
In the April 19 briefing reviewed by POLITICO, the Department of Homeland Security’s Office of Intelligence and Analysis said American extremists used the Chinese-owned social media platform to recruit people to their causes, as well as share “tactical guidance” for terrorist and criminal activities.
The analysis — shared with law enforcement agencies nationwide — comes as federal authorities and lawmakers examine the role that social media companies like TikTok played in the Capitol riots, which left five people dead.
The DHS alert shows concern that TikTok — already under scrutiny for possibly sending people’s data to China, accusations the company denies — has become a hotbed of extremist activity and that law enforcement will have to pay closer attention to a platform more associated with viral dance videos than far-right radicalism.
The transparency watchdog Property of the People obtained the document through an open records request and shared it with POLITICO. The group has been conducting a wide-ranging Freedom of Information Act-based investigation looking into the Jan. 6 attack on the Capitol.
In a response to POLITICO, TikTok said it is working to counter extremism. “There is absolutely no place for violent extremism or hate speech on TikTok, and we work aggressively to remove any such content and ban individuals that violate our Community Guidelines,” spokesperson Jamie Favazza said in an email.
Until now, security agencies have paid more attention to the likes of Facebook and YouTube for their roles as potential breeding grounds for hate speech and real-world violence. But as young people turn away from their parents’ social media networks to newer upstarts, platforms like TikTok are emerging as sources of concern and potential radicalization.
DHS said that it issued the alert specifically about TikTok, in part, because of “some Homeland Security stakeholders’ limited awareness of its functionality.”
While the document shows that DHS is trying to keep up with extremists’ preferred practices and platforms, it may still be behind the curve.
Seamus Hughes, the deputy director of the Program on Extremism at George Washington University, said American national security agencies have struggled to keep up with changes in social media.
“The extremism research field itself is pretty slow on TikTok,” said Hughes, whose organization receives some DHS funding. “There’s something to be said about the demographics of researchers — they tend to skew older. Very few can hear the first five seconds of a TikTok video and know what song that’s referencing.”
Hughes said TikTok is very efficient in bringing extreme content to its users, and that it is “awash” with videos promoting the QAnon conspiracy theory. (TikTok announced a ban on QAnon content last year.)
“The TikTok algorithm is so good that before you know it, you’re on a domestic violent extremism spiral,” he said.
The DHS report made a similar point.
“TikTok’s application layout and algorithms can unintentionally aid individuals’ efforts to promote violent extremist content,” the report says.
“A user’s account may have zero followers but could have substantial viewership on some videos, which could aid violent extremist TikTok users in evading TikTok’s content moderation efforts,” it continues.
The DHS document flags several instances of extremist posts promoting violence throughout 2020 and in the lead-up to the insurrection at the Capitol. Ahead of the Jan. 6 riots, one TikTok user posted a video encouraging those attending the protests to bring firearms. Other users shared videos in early- to mid-2020 with instructions for sabotaging railroad tracks, accessing the White House through tunnels, and interfering “with the U.S. National Guard during riots,” the alert said, citing DHS and law enforcement reports.
Groups involved in the Jan. 6 violence used a range of digital platforms to share debunked claims about electoral fraud and organize on that day. The House select committee investigating the attack has asked numerous tech companies, including Facebook, Twitter, Parler and TikTok, to hand over reams of internal documents so policymakers can understand their role in the violence.
But the DHS alert, issued months after those protests, shows its concern about extremism on TikTok is growing.
The DHS alert said a U.S. intelligence center also found evidence that foreign extremists use TikTok, including a pro-ISIS group that posted an English-language video in August 2020 with instructions for “manufacturing explosive compounds.” And it said that in October 2019, “ISIS militants abroad posted videos from 24 TikTok accounts depicting ISIS militants with corpses, guns, and other individuals declaring their support for religiously motivated violence and ISIS,” citing reports from information-sharing hubs used by law enforcement.
The DHS document added that TikTok took down those accounts after a newspaper flagged them.
The department also cited the conviction of a Pakistani imam in Paris who had promoted terrorism on the platform by calling for violence against non-Muslims and lionizing the terrorists who attacked journalists at Charlie Hebdo, a French magazine, according to local media reports.
Both domestic and foreign groups “are exploiting standard features on the platform to evade the platform’s detection and removal efforts,” concluded the document. OODA Loop, a website run by a global strategic advisory firm, first reported on the document and made portions available.
The DHS document sketches an overview of the extremist threat for the country’s law enforcement agencies. It says domestic and foreign groups have been active on the platform since at least 2019 and “have used TikTok to recruit adherents, promote violence and disseminate tactical guidance for use in various terrorist or criminal activities.”
The five-page analysis, though, is limited in scope, mostly providing a high-level view of how the app operates and how extremists use the platform to post violent or hateful material.
In recent years, domestic security agencies have increased their focus on combating the rise of white supremacist and far-right groups based in the United States. But the U.S. counterterror apparatus still focuses far more attention on countering foreign groups. The DHS document highlights how federal agencies are waking up to the potential threat of domestic and international groups using the network to radicalize people.
White supremacists, neo-Nazis and Islamic extremists have flooded TikTok in recent years, often using some of its signature features — like the ability to splice multiple videos into the same post — to create viral content promoting antisemitic and anti-LGBTQ+ messages. In June, the Institute for Strategic Dialogue, a think tank that tracks online extremism, found more than 1,000 such videos within a one-month period, including one showing a replica of the Auschwitz concentration camp built with the Minecraft video game and others lionizing fascist leaders from the 1930s.
TikTok subsequently removed all of those videos.
In the first three months of 2021, TikTok said it removed more than 90 percent of posts that broke its content policies within 24 hours of such material being posted. The company took down more than 61 million videos for policy and guideline violations during that period, it said earlier this year, adding that this amounted to “less than 1% of all videos uploaded on TikTok.”
Still, the Chinese-owned app’s struggles to tamp down content boosting white supremacists and domestic terrorists contrast with its robust approach to other controversial content.
Compared with other social media platforms, TikTok has a record of aggressively using content filters and automated algorithms to wipe material deemed problematic before it can gain a large online following. And its content moderation practices have created a string of controversies.
A top TikTok executive told British lawmakers last year that the app had previously censored content related to what she called “the Uyghur situation” in an effort to keep conflict off the platform. Beijing’s repression of Uyghurs and other Muslim minorities living in China has drawn condemnation from human rights groups. The executive later walked back her testimony, saying she misspoke.
The Intercept reported last year that TikTok executives had told moderators to “suppress posts created by users deemed too ugly, poor, or disabled for the platform.” A TikTok spokesperson told The Intercept those rules were “an early blunt attempt at preventing bullying” and are no longer in place. TikTok has also apologized for suppressing LGBTQ+ content, as Reuters reported.
And MIT Technology Review recently detailed an episode in which a TikTok product allowed content touting Nazism and antisemitism but automatically removed posts saying “Black lives matter” and “supporting black success.”