In late February, just after Russia invaded Ukraine, my Twitter feed began filling up with videos showing Black exchange students being refused passage on trains fleeing the escalating conflict, while white students had no problem boarding. But not everyone was buying it. In a now-deleted tweet, Teen Vogue Editor-in-Chief Versha Sharma shared a Washington Post article reporting on the spread of disinformation, and added, “as videos go viral, a reminder about verifying sources before sharing—and a reminder that Russia disinfo ops have specifically targeted Black people in the past with fake accounts and media.”
“The videos and the ppl are real,” Q. Anthony Omene, creator of RZNWA Media, who had been amplifying the voices of those speaking out about anti-Black racism in Ukraine, shot back in response. Omene and others raising concerns were not part of a disinformation campaign, he asserted, but rather real people who had been in contact with African students who were struggling to escape. “People were accusing me of being a Russian bot,” he tells me, wondering whether his internationalist politics and identity as a communist played a role in why both white and Black users distrusted him. “I still get called to this day a Russian disinformation agent.”
When Kimberly St. Julian-Varnon, a historian at the University of Pennsylvania who focuses on Black identity in the Soviet Union, delved into the conversation, she too came under fierce attacks. Media outlets were asking her for the provenance of one of the videos, so St. Julian-Varnon posed the question on Twitter: “do we have a source for this video?” The question triggered something she wasn’t quite expecting: “People came after me,” she recalls. “People were like, ‘You’re being racist,’ ‘You’re white,’ ‘You’re supporting white supremacy.'”
The bigotry that African students faced at the border was in fact real. But Sharma and St. Julian-Varnon’s suspicions weren’t unfounded: Bots, particularly those from Russia, have been known to pose as Black people on Twitter. Online, the existential question of who someone truly is becomes fertile ground for chaos agents to further ignite painful interracial tensions. This discord serves to perpetuate racism, which has always been founded on lies. As feminist researcher and blogger Sydette Harry says, “Racism is disinformation. Racism has been a form of this since almost every single iteration of mass communications, especially public [ones].”
This manipulation of Blackness for political gain was on full display during the 2016 election. According to a 2018 New York Times report, Russia’s Internet Research Agency (IRA), which fueled the country’s influence campaign and was owned by a close Putin ally, “created a dozen websites disguised as African American in origin, with names like blackmattersus.com, blacktivist.info, blacktolive.org and blacksoul.us.” A report by the Senate Intelligence Committee on Russia’s interference found that, on Facebook, more than 66 percent of the IRA’s advertising content contained the word “race”; five of its top 10 Instagram accounts were focused on African Americans and issues pervading Black communities; its Twitter page was full of stories about race and racism, such as the NFL kneeling protests; and 96 percent of its YouTube content was devoted to “racial issues and police brutality.”
Yet the 2016 election wasn’t the first time Russians had targeted African Americans. This particular subset of the culture war has been brewing for years, tracing back to the knotted web of political agendas on both sides of the world.
Back in the 1920s and ’30s, when African Americans were fleeing by the thousands to the North to escape racial terrorism in the South, the USSR saw an opportunity. Although anti-Semitism was very much present, racial hierarchies were not as entrenched in the Soviet Union as they were in the United States, according to Meredith Roman, an associate professor at SUNY Brockport, who specializes in comparative Soviet and African American history. In fact, under Lenin, an official motto was “Workers and oppressed peoples of all countries, unite!”
In Moscow, organizations such as Comintern (Communist International) reached out to African Americans and Black people throughout the diaspora to spread communism and galvanize resistance to their capitalist oppressors. These organizations often created posters to spread their messages. One showed two images: an illustration of a Black man in chains beneath the Statue of Liberty with the caption “under capitalism,” next to an image of a happy, diverse crowd with the caption “under socialism!”—the idea being that Black people would be equal in the USSR. This propaganda worked. Increasing numbers of African and African American students went to the USSR to study. As economic opportunities narrowed for American blue-collar workers in the 1930s, Soviet companies recruited them to help power their new factories. Disenchanted intellectuals also made the journey.
“Paul Robeson, W.E.B. Du Bois—they made frequent trips to the Soviet Union and saw that there was good in having this country, a superpower, essentially being able to expose the anti-Black racial violence of the so-called ‘leader of the free world,’” Roman explains. But as the Soviet Union began to collapse, neo-Nazi groups proliferated throughout Eastern Europe. Starting in the early ’90s, attacks became more frequent, not only on people of African descent but also on Asians and non-Slavic groups such as people of the Caucasus. (People of the Caucasus were called chernye, which means “black” in Russian, whereas Black people were called negry. To put it simply, people of the Caucasus were treated similarly to those who are Black in the West.)
Nevertheless, the USSR’s strategic fanning of racial animus was a Cold War feature, notably reaching the world of sports. In his 1999 book, The Sword and the Shield, historian and former KGB officer Vasili Mitrokhin detailed how, during the lead-up to the 1984 Olympics in Los Angeles, Soviet intelligence officers masqueraded as the KKK by forging letters and sending them to African and Asian countries to dissuade the Olympic committees of those nations from sending their best athletes. One of the letters stated, according to NPR: “The Olympics–for the whites only…African monkeys! A grand reception awaits you in Los Angeles! We are preparing for the Olympic games by shooting black moving targets.”
Decades later, when Donald J. Trump was running for president in the 2016 election, Russia saw an opportunity again. This time, it wasn’t through anti-racist propaganda but quite the opposite: digital Blackface to spread alt-right ideologies around the world.
The foundation for foreign agents to meddle with online identities had already been set in the United States. In 2013, New York Times bestselling author and feminist Mikki Kendall started a popular Twitter hashtag called #SolidarityIsForWhiteWomen as a critique of the support given to self-proclaimed male feminist Hugo Schwyzer, who had recently admitted that he criticized Black feminists as a way to boost his career. One of his marks was Sydette Harry. In her words, “[Schwyzer] admitted that he was outright targeting us because it was easy to get white women to ignore racism and to ignore whatever we were saying because of our big, bad attitudes.”
The hashtag was hugely popular. But soon, 4chan users began to masquerade as Black women on Twitter as a way to further fracture digital feminism. “They would fake being Black women and start fights with white women,” Harry says. After attention around the hashtag faded, the fake accounts stuck around, only to show up again during the Gamergate controversy the following year. The trolls used the accounts as “cover for when they were targeting Black women,” says Shireen Mitchell, technology analyst and founder of Digital Sisters/Sistas Inc., and when they were “showing up in gamer spaces pretending to be Black.”
How do bots and trolls so successfully impersonate Black people? As André Brock, an associate professor at Georgia Tech’s School of Literature, Media, and Communication, says, “Black online discussions are internal yet not private; anyone can see it. White folks don’t have to physically venture to Black enclaves to consume Black culture.”
Of course, Blackness has been co-opted in American culture for over a century. The appropriation of African American vernacular in particular goes back to the 1920s. “Think about the jazz age, the beatniks in the 1950s and hipsters and so on,” says Taylor Jones, a quantitative social scientist and linguist who focuses on African American English (AAE). “This is what linguists call ‘covert prestige.’ The idea is that they [white people] have an air of cool with things that are problematically associated with African Americans.” Online, Blackness becomes an exportable product that is accessible to anyone, which is why Black people are vulnerable to co-optation: Their identities are seen as accessories, elements to be adopted or discarded whenever one wants. “Digital capacity to encode Blackness is what makes it exploitative,” Brock says. If a chaos agent believes that they can mimic Black speech, then they in turn can become Black online.
These efforts aren’t always successful: Many Black Twitter users can spot bots and trolls from a mile away. “They don’t think we’re intelligent. They don’t pay attention to grammatical or syntactical rules or rules of interaction with audiences,” Brock explains. But bots are able to masquerade as Black via AAE because although Black people can spot the inaccuracies, the wider public cannot. The study of AAE is a fairly recent academic field, and its relative infancy, compounded by racism, is why many people and algorithms—bots, trolls, Twitter users, luddites, and the larger American public—don’t know, or respect, its complex system of rules, structures, morphologies, and discursive strategies. If the public cannot understand the dialect being badly mimicked, they cannot mobilize to thwart these targeted attacks on Black people. Online, people can claim to be a member of any marginalized group, Jones explains, “and it works because if you don’t interact with anybody from that group on a regular basis, then how would you be able to evaluate it?”
Digital fraud is such a pervasive phenomenon that even those who are Black are using it to insult other Black people who do not perform their Blackness in a way that is arbitrarily acceptable. “Twitter has become increasingly political and there’s a really vocal contingent of folk that have become gatekeepers,” Brock asserts. In his words, “The easiest charge to level against somebody online is to say not only are you not down with the struggle, you’re not even a real person.” Both Omene and St. Julian-Varnon—the former a communist (a stance antithetical to mainstream American politics) and the latter an empiricist who asked for verification of a video depicting racism (and therefore not “down with the struggle,” St. Julian-Varnon reflects)—were seen as being fake or complicit in upholding white supremacy, simply because their perspectives ruffled feathers online.
Russian disinformation agents, the far right, and any online troll will continue to capitalize on our rifts until we acknowledge racism and its sociopolitical implications. “As long as we ignore our history and its real legacies in producing racial inequalities in terms of policies and practices,” Roman, the SUNY historian, says, “the Kremlin [is] going to try and exploit these vulnerabilities not with the purpose of offering refuge but to challenge the United States and make it look bad.” Of course, Twitter does not allow for a nuanced discussion of Blackness on any side of the world. “There’s levels to this,” St. Julian-Varnon laments, “and of course when you’re on Twitter, you can’t explain things in 280 characters.”
So how do we rectify this issue? Mitchell, the technology analyst, says the first step is to acknowledge that the majority of this online impersonation and targeting happens to Black women. “We are not admitting that,” she says. “We culturally, societally, don’t have empathy when Black women are being harmed. Algorithms eventually know who to protect and who not to. The algorithms know not to protect Black women.” No site can protect the marginalized if its creators do not value their voices or safety.
Harry gave me a more practical tip: “If you are ever finding that you have an account that is very inauthentic and you are worried about what to do next, don’t automatically block the account. If they have under 100 followers, go to who they follow and block every locked account.” The accounts that are locked, the ones she can’t see into, are usually the ones that coordinate malicious activity, she points out, so blocking them will help dampen their power.
Aside from that, experts like Harry and Mitchell are not optimistic about the tide changing. When Harry has raised concerns about online impersonation and harassment while speaking on panels about digital technology and social media, she has felt ignored or undervalued for her insight. “I’m not sure if it’s actually fixable,” she says, “the damage we’ve done to the discourse around how to find truth online.”