Public Radio International
South Sudan became the world’s newest country in 2011. But since breaking from Sudan, it’s been riven by its own internal conflicts between clan groups, minor warlords and government factions.
Earlier this month, the United Nations World Food Program confirmed that three of its workers had been killed there. The violence has gotten so bad that a senior British official took the rare step, for a foreign government, of calling it outright tribal "genocide."
Meanwhile, internet monitors are watching closely. Online hate speech and fake news posts appear to be inciting some of the real-world violence, according to researchers and activists.
“There’s a huge potential for genocide using the mechanism of social media to drive the conflict,” says Stephen Kovats, a founder of #DefyHateNow, a nonprofit working to counter online hate speech in South Sudan.
In South Sudan, “inflammatory rhetoric, stereotyping and name calling have been accompanied by targeted killings and rape of members of particular ethnic groups, and by violent attacks against individuals or communities on the basis of their perceived political affiliation,” says a November report by Adama Dieng, the United Nations special adviser on the prevention of genocide. “The media, including social media, are being used to spread hatred and encourage ethnic polarization.”
The ability of rumor or hate speech on social media to drive violent outcomes isn’t limited to South Sudan. In December, a 28-year-old North Carolina man, Edgar Maddison Welch, marched into a Washington, DC-area pizza restaurant with an assault rifle and opened fire (no one was hurt). He later told investigators he believed Democratic political figures were operating a child sex ring out of the restaurant’s basement. He was acting on false reports promoted by talk radio host, Donald Trump booster and “alt-right” provocateur Alex Jones.
Earlier this month, street brawls spawned online memes, like the viral image of a far-right rioter punching an antifascist woman in the face, which have spread widely and encouraged yet more violence.
Figuring out how to solve the problem of hate speech and fake news in South Sudan matters, and not just for the South Sudanese.
The PeaceTech Lab, a DC-based nonprofit focused on using technology to identify and disrupt the drivers of conflict, has been monitoring the spread of hate speech on social media in South Sudan since 2015 (the lab is also a partner of #DefyHateNow). Researchers have been able not only to identify hate speech on the country's social media but also to tie its amplification directly to real-world consequences.
“The South Sudanese are very politically active people, and with the advance of social media they’ve become ever more active outside of the country,” says Theo Dolan, director of the PeaceTech Lab Africa in Nairobi.
According to the PeaceTech Lab’s research, much of the hate speech is created by members of the South Sudanese diaspora in the United States, Canada, United Kingdom, Kenya and Uganda targeting people back in their home country. (That said, most diaspora and refugee South Sudanese are likely not promulgating hate speech online.)
The most telling examples occurred in early October 2016, when a series of buses was attacked on highways in South Sudan's Equatoria region; at least 31 people were killed in the ambushes.
“Although some of the attacks were instant and indiscriminate, survivors of the deadly incidents claim the attackers were looking for members of the Dinka tribe,” a PeaceTech Lab report says.
But by tracking the frequency and types of online hate speech terms being used in memes and social media posts, the PeaceTech Lab’s researchers were able to identify a pattern: Around the same time as the bus attacks, there was a spike in posts containing the name of a mobile network, MTN, along with words and phrases associated with South Sudanese hate speech.
“MTN is one of the four mobile networks in South Sudan with the slogan ‘everywhere you go.’ In this context, Dinkas are said to be everywhere like the MTN mobile service,” according to PeaceTech. The report said that the attackers aimed to exact revenge for the killing of civilians by government forces in Equatorian villages.
Using media to make a generalized call for the extermination of an ethnic group is not a new phenomenon in Africa. Radio Télévision Libre des Mille Collines was used to incite Hutu forces to violence against Tutsis during the Rwandan genocide in 1994. Similar to how social media posts in South Sudan refer to Dinkas as “MTN,” the broadcasts in Rwanda referred to members of the Tutsi tribe as “cockroaches.”
In another instance, the website SouthSudanNation.com published a fake news article claiming that a general, Paul Malong, intended to “massacre Equatorians” using a “sinister and devious plan” in July 2016. “This article, on a seemingly legitimate news site [although it is not one] came out at a time when social media was erupting with info about Malong’s ‘threat’ to Equatorians,” says Dolan of PeaceTech. The false story about the massacre was then amplified through posts on WhatsApp, YouTube and Facebook “to mobilize others to take up arms to counter the ‘attack.’”
Many viral posts have also featured images or footage from conflicts outside the country erroneously labeled as South Sudanese, some bearing forged Associated Press logos.
The combination of fake news, amplified by social media networks and seasoned with virulent hate speech, can be deadly.
South Sudan “has continued to spiral into violence and economic ruin and now famine,” Dolan says. But “a lot of people, our own [US] diplomats included, still feel we can address this hate speech issue. … The holy grail for us as peace builders and technologists is we want to prevent conflict from taking place. What we’re trying to do is connect the dots between online hate speech and violence on the ground.”
To help connect these dots, PeaceTech Lab has been simultaneously building a lexicon of South Sudanese hate terms and actively monitoring social media activity in the country to understand how online speech drives events on the ground.
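To make the approach concrete, here is a minimal sketch of how lexicon-based monitoring of this kind might work, using a tiny hypothetical term list and a simple day-over-day spike test. The term list, thresholds, and function names are illustrative assumptions; PeaceTech Lab's actual lexicon is far larger and context-sensitive ("MTN," after all, is also just a phone company), and its tooling is not published in this form.

```python
from statistics import mean, stdev

# Hypothetical, tiny stand-in for a hate-speech lexicon.
# A real lexicon would hold many terms plus contextual rules.
LEXICON = {"mtn"}

def daily_counts(posts_by_day, lexicon):
    """Count how often lexicon terms appear in each day's posts."""
    counts = []
    for posts in posts_by_day:
        total = 0
        for post in posts:
            words = post.lower().split()
            total += sum(1 for w in words if w in lexicon)
        counts.append(total)
    return counts

def flag_spikes(counts, threshold=2.0):
    """Flag days whose term counts far exceed the running baseline.

    A day is flagged when its count exceeds the mean of all prior
    days by more than `threshold` standard deviations (with a floor
    of 1.0 on the deviation to avoid flagging tiny fluctuations).
    """
    flagged = []
    for i in range(2, len(counts)):
        history = counts[:i]
        mu, sigma = mean(history), stdev(history)
        if counts[i] > mu + threshold * max(sigma, 1.0):
            flagged.append(i)
    return flagged
```

In this toy version, a sudden surge of "MTN" mentions, like the one researchers observed around the October 2016 bus attacks, would show up as a flagged day against the quieter baseline that preceded it.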
International observers first began to see connections between online hate speech and real world violence in South Sudan in early 2015, according to Kovats. While literacy rates are very low in the country and only an estimated 1 percent of its people have internet access, a full 35 percent of South Sudanese youth have access to social media. And the vast majority of people in the South Sudanese diaspora are online and have become primary sources of online hate speech fueling the conflict there.
“The guys by the side of the highway who are killing people all have commanders and chains of command around them,” Kovats says. “Those chains of command, without exception, end with people who have mobile phones and are also online.” Those commanders are subsequently connected to clans, political factions or criminal gangs “who are attempting to gain control of a region or a particular resource,” he says. Invariably, “there is always a money connection to the people in the US or Egypt or Canada.”
In other words, violence in South Sudan is becoming big business for some people, and social media is an easy lever to create violent outcomes that are favorable to their financial and political interests.
“Regardless of how illiterate the guy with the gun is, it’s very, very easy in the ethnic tinderbox of South Sudan to hype up one community using social media from the outside,” Kovats explains. “You say things like, ‘You’d better watch out tonight or else your neighbors are going to come and slaughter you,’ which is the best way to get them to go slaughter their neighbors.”
The webs of political and clan influence in South Sudan are incredibly complex; despite a population of only around 9 million, the nation is estimated to have as many generals as the United States. "They're all little warlords," Kovats says. "It's a very tight-knit group of communities where there are so many refugees and diaspora spread all over the place [around 5 million] that everybody's got an uncle somewhere."
One consequence is that rumors and fake news spread on social media often gain credibility by being spread within kinship groups.
Earlier this year, BuzzFeed News asked South Sudanese web users about their apparent online calls for violence. Those users included a diplomat in Washington, who refused to recant his remarks and downplayed reports of genocide.
Hate speech and fake news can be combated, however. Kovats says #DefyHateNow is focusing on teaching social media literacy and promoting positive counter-messaging and events online.
“The first thing anybody does when you get a computer is you figure out what Facebook is,” Kovats says. “Part of the mitigation strategy is creating awareness about how information moves in social media and how it affects people in communities.”
Another component of the group’s effort to combat internet hate is helping communities organize real world activities, particularly cultural events, that they can broadcast and share online, as a way to fill the information void and drown out false or hateful content.
“The more visibility you can get for South Sudanese who are against the conflict — rather than keeping that space open for the agents of conflict — the better,” says Kovats.
You can follow Benjamin Reeves at @bpreeves.