By Marc Morial
“It is possible that users assumed that language use that could potentially cause a ban or suspension on the platform in the past was no longer a concern. Additionally, anticipation of an unmoderated platform was potentially a source of excitement for certain Twitter users. By sharing epithets, it suggests that certain users were celebrating a reduction in perceived speech constraints on the platform. Regardless, the data conclusively shows that there is correlation between Musk’s arrival and a broader perceived acceptability to posted hostile content on Twitter.”
— Montclair State University Center for Strategic Communication study, “From the Tweets to Hate Speech: Increases in Twitter Racist Content After Elon Musk’s Acquisition”
In the 12 hours after Elon Musk purchased Twitter, the use of the n-word spiked by 500%.
ADL’s Center on Extremism identified a coordinated campaign to spread antisemitic content, launched by users of the largely unmoderated 4chan message board, that resulted in more than 1,200 antisemitic tweets and retweets in the 24 hours after Musk’s takeover.
Musk himself posted a conspiracy theory about the attempted murder of Paul Pelosi; the tweet was shared and liked tens of thousands of times before he deleted it.
This painful and shocking increase in hate prompted me, along with NAACP President and CEO Derrick Johnson, and National Action Network President and Founder Rev. Al Sharpton, to request a meeting with Musk to address our concerns and to understand his plan to protect our communities from abuse by those who seek not simply to express controversial views, but to harm us and undermine democracy.
In our letter to Musk, we wrote, “In flippantly declaring that, ‘the bird is freed,’ you might have unwittingly freed people to unleash the worst of human nature with communities of color and religious minorities bearing the greatest burden.”
We share Musk’s professed belief in the importance of free speech, but the fact remains that online hate speech, misinformation, and disinformation, posted by users intent on sowing social and political chaos, have grave consequences for democracy, civil rights, and public safety.
The white supremacist who murdered 10 people in a Buffalo supermarket in May was inspired by conspiracy theories posted on 4chan. Content on white supremacist websites fueled the massacre of nine Black worshippers at Charleston’s Mother Emanuel Church in 2015. The accused attacker of Paul Pelosi, the subject of Musk’s own false tweet, had posted “a mix of bloody images and hateful screeds aimed at a variety of targeted groups including Jewish, Black and trans people, as well as Democrats,” according to The Washington Post.
“Whether you realize it or not, as the new leader of Twitter, you have new responsibilities, and one of those responsibilities is to ensure your platform is not used to harm people and the nation as a whole,” we wrote to Musk. “Another responsibility is to ensure your own words and behavior do not cause harm, especially to the communities of color and other underserved communities who have been longtime users and who have made the company what it is today. You have not shown a willingness to meet these responsibilities thus far, but we have hope and are willing to work with you to do so in the future.”
Twitter needs strong content moderation standards to foster a safe and healthy online environment. Yet Musk plans to fire fully half of the platform’s workforce, according to Bloomberg.
“We strongly urge you to maintain content moderation teams that are tasked with creating and implementing policies that provide a baseline for prohibiting content designed to threaten and harass people of color and religious minorities and suppress votes,” we wrote. “Indeed, we encourage you to strengthen these policies as there has been a rise in dangerous rhetoric and violent acts that threaten our communities’ ability to vote and otherwise fully participate in our society without fear for our safety.”
Lapses in content moderation are especially concerning with the midterm elections just days away. National security officials fear misinformation campaigns could ignite violence at the polls on Election Day. Disinformation campaigns waged on Twitter and Facebook in 2016 and 2020 sought to depress Black voter turnout and sow social and political discord.
We wrote, “We implore you to show immediate leadership by directly addressing the spike in hate speech that occurred over the weekend and to discourage vigorously and clearly those who would be influenced by your voice from using such speech in the future and from engaging in violence against anyone.”
Marc Morial is president/CEO of the National Urban League.