Jake is here to warn us:
No need to look to large governments and nuclear power plants for apocalyptic mishaps. The Apocalypse could come via your smartphone.
By Jake Sonderman
“Those who can make you believe absurdities, can make you commit atrocities.” – Voltaire
At first, saying social media will cause the apocalypse makes me sound like a middle-aged technophobe. But let me walk you through how social media amplifies hate and falsehood, and how that amplification can lead to real-world violence and chaos. To start, consider how propaganda has always been a vital part of dictators’ politics.
The year is 1919, and Hitler has just become the 55th member of the German Workers’ Party. Within two years, he takes over the party, by then renamed the Nazi Party. Two years after that, he attempts to overthrow the government of Bavaria and is thrown in prison. From prison, he writes Mein Kampf, a half biography, half manifesto that foretold his intentions of genocide. In 1932, Hitler lost the presidential election, but in 1933 he was named Chancellor (roughly the equivalent of a prime minister) by President Hindenburg of Germany. From there, Hitler gained the overwhelming support of the German public and took control of the government. But how did this failed middle-class artist gain so much popularity and power?
Sometimes, I think we forget, since we all despise Hitler, that he was loved by the public when he took power and continued to garner support well into the war. Many historians agree that it was Hitler’s speaking skills that propelled him to power. His ideas were far from original, and he had neither status nor a family of status. In the 1920s and 30s, though, radio became common in many households. This multiplied the reach of Hitler’s speeches and message, and certainly expedited, if not enabled, his rise in public support (livescience.com).
Obviously, we could not have stopped the radio from being invented and used by Hitler, but today we have something much bigger than radio: we have social media. Social media amplifies content that is likely to get more views and attention. According to a recent MIT study of tweets, fake news and falsehoods spread significantly faster and to more people than accurate information, largely because of the “novelty and the emotional reactions” that go along with conspiracy theories and false information (sciencemag.org). Oftentimes, hate speech and hate groups are tied to conspiracy theories.
One type of hate that has gained steam in the past decade is Islamophobia. Sacha Baron Cohen (Borat, Ali G) recently gave a speech at the Anti-Defamation League summit in which he recounted a story from one of his shows. Sacha Baron Cohen’s whole act is to play a ridiculous character who talks to real people and tries to get them to show who they really are. In his show Who Is America?, he went into an Arizona town posing as a woke developer and proposed at a town hall that the town let someone build a big mosque, declaring that it would bring millions of dollars of revenue to the city. This outraged the citizens and led one man to proudly proclaim: “I am racist, against Muslims” (adl.org).
There have been multiple instances of Islamophobia on Facebook. The most famous and atrocious incident was in Myanmar. Facebook has 18 million members in Myanmar, and it has become “synonymous with the internet” for many citizens. The Myanmar military took advantage of this and started multiple propaganda accounts under the guise of celebrities and news sources, which gained millions of views and followers. The false accounts frequently posted complete fabrications about the country’s minority Rohingya Muslim population. Tensions boiled over when the military messaged (through Facebook) many known Buddhist and anti-Muslim figures and said that an attack from the Rohingya was imminent. This led to what the UN called a “cultural genocide” of the Rohingya Muslims, leaving 10,000 dead and 700,000 displaced and disenfranchised (The New York Times). If Facebook had had an effective monitoring system and had deleted those accounts, the genocide would likely have been avoided or been less extreme.
Facebook’s monitoring, though, has changed over time. An in-depth New Yorker article tackled Facebook’s questionable moderating system. At first, Facebook employed a small monitoring group that was left to its own judgment about what was acceptable and what was not. The group was driven by the sentiment, “If something makes you feel bad in your gut, take it down” (The New Yorker). As Facebook grew and calls for monitoring increased, the company moved international monitoring to a large Dublin facility. Not many have come forward about what really goes on behind the scenes in Dublin, but those who have tell a dark story. In training, moderators are told that if they are unsure whether a post violates Facebook’s community guidelines, they should let it stay up. The supervisors give examples. One is a meme of a white mother holding her little girl underwater with the caption, “When your daughter’s first crush is a little Negro boy.” According to the supervisor, that image “implies a lot, but does not actually attack the . . . boy” or his ethnicity (The New Yorker). So, the post stands. Another example the supervisors show is the username “Killall [slur for LGBTQ+ people].” According to the supervisor, “L.G.B.T. is an idea” unless used with pronouns, so that stays up as well. This story ends with many of the moderators leaving the company and suing Facebook for $52 million for emotional damage (The New Yorker). The article implies Facebook no longer worries about morals, only about negative PR.
Hate is what starts genocides and major conflicts, and social media is amplifying hate and conspiracy theories. None of us expect these companies to be perfect, but we have a chance now to stop future conflicts, wars, and genocides. Sure, World War II was not the Apocalypse. But if we had another world war, would it end in the Apocalypse?
Apocalypse Editor: Erik Bearman
Sources:
https://www.livescience.com/54441-how-hitler-rose-to-power.html
https://www.bbc.com/news/technology-44883743
https://science.sciencemag.org/content/359/6380/1146
https://www.newyorker.com/magazine/2020/10/19/why-facebook-cant-fix-itself