Can Technology Prevent Genocide? A Case for Virtual Fear-Inoculation

By Dara Barlin

Rwanda, Nazi Germany, Bosnia, and Darfur conjure up images of genocide and ethnic cleansing so horrible they're difficult to even comprehend. Yet in many places around the world, the potential for mass murder based on religion, ethnicity, or political persuasion continues to grow just beneath the surface. How do these large-scale killings begin? And is there any way to stop them? The UN sought to take on these tough questions during a recent panel discussion analyzing one of the most powerful ingredients of genocide: hate speech.

At the panel, World Policy Institute senior fellow and founder of the Dangerous Speech Project, Susan Benesch, provided some critical insights. She looked at the link between hate speech and genocide through a historical lens and identified a number of factors that could help determine the likelihood of hate speech actually leading to violence.

One of these factors focuses on the level of influence the speaker has, and the level of audience susceptibility to dangerous speech. If a well-respected speaker starts spouting hate messages to populations that are not prone to questioning authority, or are in dire circumstances (i.e. living in a state of extreme hunger, fear, economic depression, etc.), or both—the likelihood that the speaker can incite large groups to harm other human beings increases. 

Another factor is the use of dehumanizing language, for example, when influential leaders call the 'other' group rats, pigs, or cockroaches. This is an intentional effort to make people of another community seem less human, and more like creatures that are generally detested, making it easier to cast them as a target to be wiped out. For those who think this strategy is obsolete, Egypt, North Korea, and the United States are just a few of the many places where leaders have used this same type of rhetoric with regard to Jews, South Koreans, and Muslims as recently as this year.

One of the most critical ingredients for genocide, though, is what Susan calls “accusation in the mirror.” This is where the speaker tells the audience that the opposing group is seeking to annihilate them, when in fact it is the speaker that is seeking to incite violence. By convincing audiences that they are in immediate danger, the speaker induces extreme fear that triggers the “fight or flight” mode. This is when the rational brain shuts down, mob mentality rules, and the recipe for a large-scale slaughter is in place. 

A number of political leaders have sought to step in and shut down dangerous speakers by imposing political sanctions or, in some cases, shutting down the ability of dangerous speakers to reach their audiences.  However, neither strategy has typically been successful. And the right to freedom of expression limits many efforts to confine any speaker’s content. It’s a slippery slope, and one doesn’t want to be on the wrong side in either direction.

Susan suggests that instead of trying to shut off a dangerous speaker’s microphone, we should be focusing on what she calls “inoculation”—addressing the issue before (often right before) a speaker begins speaking. This means telling people what the speaker will be saying ahead of time, and putting a spotlight on the fear tactics he’ll be using to play on their emotions. If people know in advance that this speaker is going to say the other side is seeking to annihilate them, and that these accusations are false and are just an effort to scare them into raising fists and arms, then the audience is much less likely to buy the story from the speaker.

The inoculation messages that go out are unlikely to change everyone's mind. But even if just a small number of people feel enough doubt about the veracity of the claims in the hate speech and decide to seek out more rational information themselves, that is potentially enough to undermine the public will and mob mentality needed to begin genocidal campaigns.

The strategy for fear-inoculation is similar to disease inoculation, in which doctors identify the disease causing harm to large numbers of people. They then inject people with a weakened form of the pathogen (a vaccine), which prompts the body to create antibodies that strengthen the immune system's ability to successfully attack the disease. In the case of fear-inoculation, the disease is the set of fear tactics that spread through individuals and groups. The vaccine is administered by giving people small bits of knowledge about the specific language and strategies the disease contains. And the individual's antibodies against the disease are rational thought. The stronger the antibodies, the easier it is to stop the disease before it starts.

This strategy makes one wonder whether we could prevent genocide by engaging in fear-inoculation campaigns. Could the answer to stopping the mass killing of thousands of people around the world really be that simple?

Both logistically and politically speaking, the idea of having people hand out flyers at rallies to disseminate information seems untenable. The use of technology, however, could be the groundbreaking difference. Twitter, Facebook, and other social networks, or even text messaging in regions with low smartphone access, could provide a rapid information-sharing network right before dangerous speakers approach the microphone.

Groups that have just heard a speaker in one setting can predict what he will say in other settings and give people in neighboring areas a heads up. Since the inoculation messages would travel through friendship networks, as social networks do, people receiving them are more likely to listen, trust, and not fall for the speaker's fear tactics.

From an organizing standpoint, this is not a hard strategy to pull off. Similar inoculation strategies have been used successfully in other fields, such as union organizing, harm reduction, and domestic violence prevention.

In the case of genocide prevention, a small group of local individuals who seek peaceful outcomes (i.e., clergy, social justice activists, etc.) can attend a few speeches to identify the patterns of speech. The group can then develop a few concise inoculation messages to spread via Twitter, text, or Facebook before future speeches in other villages and towns.

These inoculation messages are not complex. In fact, they can be quite simple.

Example Inoculation message:

The speaker is going to tell you that the {target group leader} has declared they are going to burn our homes down while we are sleeping next week. This is a lie to scare you into going to war, which benefits the leaders of this movement but will only bring harm to our village and our families.  {Target group’s leader} is not seeking to attack.  He has said he, like us, wants to find peaceful solutions.  See for yourself at  Don’t let fear overpower your heart. Use your common sense.

At its core, the strategy seeks to call on people's rational brain, pushing them to look beyond fear and hate and seek out the truth for themselves. The inoculation messages themselves may need to be more nuanced to reflect the local situation. And more work is needed to develop neutral sources of information that local communities can turn to. But we now have the technology and the strategies to engage in efforts like this at low or no cost, and the gain could mean tens or hundreds of thousands of lives saved.

We know that social media has been used to promulgate fear and hate. But we have not even begun to capitalize on new media as a tool for promulgating rationality and a culture of peace. The potential is there.

Dara Barlin is the founder of A Big Project, an initiative aimed at using art and music to address current global issues.

[Photo courtesy of Shutterstock]
