After years of treating President Donald Trump’s inflammatory rhetoric with a light touch, Facebook and Instagram are silencing his social media accounts for the rest of his presidency. The move, which many called long overdue following Wednesday’s deadly insurrection at the U.S. Capitol, is also a reminder of the enormous power that social-media platforms can wield when they choose.
Facebook and Instagram said Thursday they will bar Trump from posting at least until the inauguration of President-elect Joe Biden. Twitter said Thursday that it’s still evaluating whether to lift or extend what started as a 12-hour lockdown of Trump’s account.
It remains unclear how the platforms will handle Trump once he leaves office and is no longer shielded from enforcement of most rules by his status as a world leader. And some critics saw the moves as cynical efforts by the companies to position themselves for a post-Trump future.
“Mark Zuckerberg is enacting this temporary suspension not to protect our democracy but rather to protect Facebook’s power and profits,” Rashad Robinson of Color of Change, a group that has pushed tech companies to do more to rein in hate speech, said in an emailed statement. “Now that a Democratic majority in the Senate is set, it’s suddenly convenient for Facebook to suspend Trump.”
In announcing the unprecedented move, Facebook founder Mark Zuckerberg said the risk of allowing Trump to use the platform is too great following the president’s incitement of a mob on Wednesday. Zuckerberg said Trump’s account will be locked “for at least the next two weeks” and possibly indefinitely.
“The shocking events of the last 24 hours clearly demonstrate that President Donald Trump intends to use his remaining time in office to undermine the peaceful and lawful transition of power to his elected successor, Joe Biden,” Zuckerberg wrote.
Trump has repeatedly harnessed the power of social media to spread falsehoods about election integrity and the results of the presidential race. Platforms like Facebook have occasionally labeled or even removed some of his posts, but the overall response has failed to satisfy a growing number of critics who say the platforms have enabled the spread of dangerous misinformation.
In light of Wednesday’s riot, however, Zuckerberg said a more aggressive approach is needed. “The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” he wrote.
Instagram, which is owned by Facebook, will also block Trump’s ability to post on its platform. YouTube, owned by Google, also announced more general changes that will penalize accounts spreading misinformation about voter fraud in the 2020 election, with repeat offenders facing permanent removal from the platform.
Twitter locked President Donald Trump’s accounts for 12 hours after he repeatedly posted false accusations about the integrity of the election. That suspension was set to expire sometime Thursday; the president had not yet resumed tweeting as of late Thursday morning.
A company spokesman said the company could take further action as well. “We’re continuing to evaluate the situation in real time, including examining activity on the ground and statements made off Twitter,” the spokesman said.
But the platforms continued to face criticism from users who blamed them, in part, for creating an online environment that led to Wednesday’s violence.
“Today is the result of allowing people with hate in their hearts to use platforms that should be used to bring people together,” singer and actress Selena Gomez wrote on Twitter to her 64 million followers. “You have all failed the American people today, and I hope you’re going to fix things moving forward.”
Thomas Rid, a Johns Hopkins cyberconflict scholar, tweeted “kudos and respect” to Zuckerberg and Facebook shortly after the announcement that Trump’s account would be locked for two weeks.
“Clearly the right move,” Rid said. “Consistent incitement to political violence is not acceptable. Twitter should do so as well.”
A message left with the White House on Thursday morning was not immediately returned.
Twitter and Facebook both temporarily locked Trump’s accounts Wednesday, the most aggressive action either company has yet taken against the president, who more than a decade ago embraced the immediacy and scale of Twitter to rally loyalists, castigate enemies and spread false rumors.
While some cheered the platforms’ actions, experts noted that the companies’ actions follow years of hemming and hawing on the dangerous misinformation and violent rhetoric Trump and his supporters have spread, contributing to Wednesday’s violence.
The flashpoint on Wednesday was a video Trump posted on Twitter more than two hours after protesters entered the Capitol, interrupting lawmakers meeting in an extraordinary joint session to confirm the Electoral College results and President-elect Joe Biden’s victory.
Republican lawmakers and previous administration officials had begged Trump to give a statement to his supporters to quell the violence. He posted his video as authorities struggled to take control of a chaotic situation at the Capitol that led to the evacuation of lawmakers and the death of four people.
While Trump told supporters that “you have to go home now,” he also repeated false claims about voter fraud affecting the election. He then added: “We can’t play into the hands of these people. We have to have peace. So go home. We love you. You’re very special.”
Twitter, Facebook and YouTube all said they removed the video Wednesday, citing its misinformation or dangerous rhetoric.
In a statement Thursday morning, Trump said there would be an “orderly transition on January 20th” and acknowledged defeat in the election for the first time. His aides posted the statement on Twitter because the president’s account remained suspended.
Monica Stephens, a professor at the University at Buffalo who studies social media, said it made sense for Facebook and Twitter to try lighter forms of curbing misinformation in the months leading up to the election. “They’re getting flak from both sides of the political aisle,” she said.
Trump’s ardent supporters have flocked to Parler, Gab and other “free speech” social media sites that cater to conservative voices. Some were used Wednesday by the people who stormed the Capitol.
“Large amounts of the population and press were not anticipating the violence and the vitriol yesterday because it was happening behind closed doors of Parler and Gab instead of Facebook and Twitter,” Stephens said. “If they entirely shut out this type of speech it is still going to happen, it is just going to happen where it isn’t as read.”
Social media companies, many of which are already under heightened government scrutiny over their business practices, “want to make sure that people in power have a good view of them because they would like not to be regulated,” said Shannon McGregor, an assistant professor of journalism and media at the University of North Carolina.
But now that the ban has happened, companies like Facebook and Twitter may find it harder to resist calls for banning other political figures who incite violence. “Because they resisted and resisted but now they’ve done it, it is hard to walk that back,” she said.
Timeline
So, how did we get here? Here’s a look back at Facebook’s steps and missteps over the past four years.
Nov. 10, 2016: Days after Trump’s election, Facebook CEO Mark Zuckerberg calls the idea that “fake news” on Facebook had influenced the election “a pretty crazy idea.” He later walks back the comment.
December 2016: Facebook says it will hire third-party fact-checkers to combat misinformation.
April 27, 2017: Facebook publicly acknowledges that governments or other malicious non-state actors are using its social network to influence national elections, in line with U.S. government findings of Russian interference.
October 2017: Facebook says ads linked to a Russian internet agency were seen by an estimated 10 million people before and after the 2016 election.
November 2017: Ahead of congressional hearings on election interference, Facebook ups that estimate, saying Russian ads fomenting political division potentially reached as many as 126 million users.
Jan. 4, 2018: Zuckerberg declares his 2018 resolution is to “fix” Facebook.
March 2018: Evidence grows that Facebook campaigns were used to steer the U.K. toward Brexit.
April 2018: Zuckerberg testifies before Congress and apologizes for the company’s missteps, as well as fake news, hate speech, a lack of data privacy and foreign interference in the 2016 elections on his platform.
May 2018: Democrats on the House intelligence committee release more than 3,500 Facebook ads created or promoted by a Russian internet agency before and after the 2016 election.
July 2018: British lawmakers call for greater oversight of Facebook and other platforms.
July 2018: After Facebook warns of skyrocketing expenses due in part to beefing up security and hiring more moderators, its stock price suffers the worst drop in its history. Its shares don’t recover until January 2020.
Sept. 5, 2018: Facebook and Twitter executives pledge before Congress to defend against foreign intrusion.
October 2018: Facebook invites the press to tour a newly created “war room” for combating election-related misinformation in what is largely seen as a public relations move.
October-November 2018: Ahead of the 2018 U.S. midterm election, Facebook removes hundreds of accounts, pages and groups for suspected links to foreign election interference.
Feb. 18, 2019: In a scathing report, British lawmakers call for a mandatory code of ethics and independent overseers for social media platforms, specifically calling out Facebook for technical design that seems to “conceal knowledge of and responsibility for specific decisions.”
May 2019: Facebook declines to remove a video manipulated to show House Speaker Nancy Pelosi slurring her words. The altered clip is shared millions of times.
October 2019: Facebook unveils new security systems designed to prevent foreign interference in elections.
November 2019: Facebook opens a new misinformation “war room” ahead of U.K. elections.
May-June 2020: Facebook declines to remove Trump posts that suggest protesters in Minneapolis could be shot. Zuckerberg defends his decision in a Facebook post. Facebook also declines to take action on two Trump posts spreading misinformation about voting by mail. Some Facebook employees resign in protest.
June 2020: Facebook says it will add labels to all posts about voting that direct users to authoritative information from state and local election officials. This includes posts by the president.
July 8, 2020: A quasi-independent civil-rights audit criticizes Facebook’s “vexing and heartbreaking decisions” with respect to civil rights and election misinformation, including Trump’s tweets on voting by mail.
August 2020: After years of a hands-off approach, Facebook restricts the conspiracy movement QAnon, but doesn’t ban it outright.
Sept. 3, 2020: Facebook curbs political ads, although only for seven days before the U.S. election.
Oct. 6, 2020: Facebook bans all groups that support QAnon.
Oct. 7, 2020: Facebook further limits political ads, readies more labels for candidate posts that prematurely declare victory or contest official results, and bans the use of “militarized language” in connection with calls for poll watching.
Nov. 3, 2020: Despite fears over security in the runup to Election Day and social media companies bracing for the worst, the election turns out to be the most secure in U.S. history, federal and state officials from both parties say — repudiating Trump’s unsubstantiated claims of fraud.
Nov. 5, 2020: Facebook bans a large group called “Stop the Steal” that supporters of President Donald Trump were using to organize protests against the presidential vote count. Some members had called for violence, while many falsely claimed that Democrats were “stealing” the election from Republicans. It also bans the hashtag by the same name, though not before that and similar terms are mentioned nearly 120,000 times on websites and social media platforms, per Zignal Labs.
November 2020: Facebook and Twitter largely fight back against Trump’s baseless claims of election fraud by slapping labels on his posts, such as “Official sources called this election differently.” Critics say the measures don’t go far enough and false claims that the election suffered widespread fraud continue to thrive online, pushed by Trump and his supporters.
Nov. 17, 2020: The CEOs of Facebook and Twitter assure Republicans at a Senate hearing that they will take vigorous action against election disinformation. Zuckerberg and Twitter’s Jack Dorsey defend the safeguards against the use of their platforms to spread falsehoods and incite violence in the contest between Trump and Biden.
Jan. 6, 2021: In an unprecedented step, Facebook and Twitter suspend President Donald Trump from posting to their platforms following the storming of the U.S. Capitol by his supporters. Twitter jumps first, and Facebook and Instagram follow in the evening, announcing that Trump wouldn’t be able to post for 24 hours following two violations of its policies. Experts note that the companies’ actions follow years of hemming and hawing on Trump and his supporters spreading dangerous misinformation and encouraging violence.
Jan. 7, 2021: Zuckerberg announces that the risk of allowing Trump to use the platform is too great following the president’s incitement of a mob that stormed the U.S. Capitol a day earlier. Zuckerberg says Trump’s account will be locked “for at least the next two weeks” and possibly indefinitely.