Shipping News and Reviews

What the two-year Trump ban on Facebook does and what it doesn't

Donald Trump's Facebook ban will last at least two years, the company announced on Friday. Facebook said that the former president's actions on Jan. 6, which contributed to a violent mob storming Capitol Hill and staging a riot that resulted in five deaths, were "a severe violation of our rules," and that the suspension reflects a new set of protocols for handling public figures during periods of civil unrest.

Facebook added that the two-year sanction represents a period of time "long enough" to be a significant deterrent to Trump and other world leaders who might post similar content, as well as long enough to allow a "safe period of time" after the acts of incitement. However, Facebook has not yet made a final decision on the future of Trump's account. The company said that after two years it would reassess whether there is still a risk to public safety and the potential for civil unrest.

"We know that any penalty we apply – or choose not to apply – will be controversial. There are many people who believe it was not appropriate for a private company like Facebook to suspend an outgoing president from its platform, and many others who believe Mr. Trump should have immediately been banned for life," Nick Clegg, the company's vice president of global affairs, said in a blog post, later adding, "The Oversight Board is not a replacement for regulation, and we continue to call for thoughtful regulation in this space."


The announcement comes after Facebook's Oversight Board, a group of policy experts and journalists the company appointed to handle its toughest content moderation questions, upheld the platform's suspension of the former president's account. In May, the board ruled that Facebook should not have suspended Trump indefinitely and had to make a final decision within six months. The board also said that Facebook needed to clarify its rules about world leaders and the risk of violence, among other things.

"The Oversight Board is reviewing Facebook's response to the board's decision in the case of former US President Donald Trump and will offer further comment once this review is complete," the board's press team said in response to Facebook's announcement on Friday. Later in the day, the board said in a statement that it was "encouraged" by Facebook's decision and would monitor the company's implementation.

Facebook now says it will fully implement 15 of the board's 19 recommendations. It also responded to the board's request for more details on its newsworthiness exception, a policy Facebook has used – albeit rarely – to give politicians a free pass to post content that violates its rules. Facebook now says it will label posts granted these exceptions and treat politicians' posts more like regular users' posts.

These decisions by Facebook will have a huge impact not only on Trump's account, but also on national politics in the United States for the foreseeable future. At the same time, they signal that the company has retained its decision-making power over what politicians are ultimately allowed to post on the platform. Facebook is disclosing more details about the rules it could use to punish politicians who violate its community guidelines, potentially increasing transparency. Still, Facebook has the final say on enforcement, including what counts as newsworthy and stays on the platform versus what violates its community guidelines and is removed.

Facebook still decides who gets a free pass according to its rules

In Friday's announcement, Facebook said it was changing one of its most controversial policies: allowing content that violates its rules but is important enough to public discourse to stay online, often because it was posted by a politician. Some call this the "newsworthiness exception" or the "world leader exception." Now Facebook is changing the rules to make the exception more transparent and less unfair. But the company still retains the power to decide what happens the next time a politician posts something offensive or dangerous.

Trump was the inspiration for this exemption, which Facebook first adopted in 2015 after the former president (then a candidate) posted a video of himself saying Muslims should be banned from the US. The newsworthiness exception was officially announced in 2016 and has long been controversial because it creates two classes of users and posts: those who have to abide by Facebook's rules, and those who don't and can post offensive and even dangerous content.

In 2019, the company added more details. Nick Clegg, Facebook's vice president of global affairs and communications, said Facebook would assume that anything a politician posts on its platform would be of interest to the public and should stay up – "even if it would otherwise violate our normal content rules" – as long as the public interest outweighs the risk of harm.

The policy is also believed to serve as a handy shield for Facebook to avoid disputes with powerful people (like the President of the United States).

Despite all the controversy and confusion it has caused, Facebook says the newsworthiness exception is rarely used. In 2020, Facebook's independent civil rights audit reported that Facebook had used the exemption only 15 times in the previous year, and only once in the US. On Friday, Facebook amended its earlier statement to the Oversight Board, saying it had only technically applied the standard to Trump once, for a video Trump posted from one of his 2019 rallies. Although Trump was rarely a beneficiary of the policy, the Oversight Board said in May, in its ruling on his account suspension, that Facebook should address the ongoing confusion around it.

Now, Facebook says that politicians' content will be analyzed for violations of its community guidelines – and weighed against the public interest – like any other user's. While that means the formalized exception for world leaders is gone, the power over much of what actually stays on and comes off Facebook remains where it has always been: in Facebook's hands.

Facebook won't investigate how the platform contributed to January 6th

After the deadly January 6 riot, many pointed to the role of social media platforms, including Facebook, in escalating the violence. Critics of Facebook said the insurrection showed why Facebook should examine not only its approach to Trump's account, but also the algorithms, ranking systems, and design decisions that may have helped the rioters organize.

Even the Facebook Oversight Board, an independent body set up by Facebook to serve as a sort of court for the company's toughest content moderation decisions, recommended such a review to Facebook. Earlier this week, allies of the Biden administration asked the company to follow that recommendation and conduct a public review of how the platform may have contributed to the insurrection.

There is good reason to believe that Facebook's platform contributed to the events of January 6th. At the very least, it owes the public a full, independent, and thorough investigation, with the results made public. It's the least the company should do.

– Katy Glenn Bass (@KGlennBass) June 4, 2021

But Facebook isn't doing that, and it appears to be deflecting the responsibility. The company instead points to a separate research effort focused on Facebook, Instagram, and the 2020 US election, which Facebook says could also include an investigation into what happened at the Capitol.

"Responsibility for January 6, 2021 rests with the insurrectionists and those who encouraged them," the company said in its Friday decision, adding that independent researchers and policymakers are better placed to examine the role of social media in the insurrection.

"We also believe that an objective review of these events, including contributing societal and political factors, should be conducted by elected officials," the company wrote, adding that it would continue to cooperate with law enforcement. Republicans, notably, have all but ruled out the possibility of a bipartisan commission on January 6th.

Facebook may never make a final decision on Trump

Facebook is delaying, perhaps forever, a final decision on Trump himself. For now, Facebook plans to suspend Trump for at least two years, which means he would regain his account in early 2023. The ban precludes Trump from using the platform to weigh in on the 2022 midterm elections, in which his posts could have boosted (or hurt) the hundreds of Republican candidates for the House of Representatives.

Still, the two-year ban isn't a final decision on whether Trump can return to Facebook. That means it is still unclear whether the former president will have access to the platform should he run for president again. It also leaves open the question of what it would actually take for a politician to be permanently booted from the platform.

Many are frustrated that Facebook hasn't permanently banned Trump. It is possible he could be back on the platform in time to run for president in 2024, and Facebook obviously knows that. "If that takes 2 years, what can you do to get a lifetime ban," wrote one employee in an internal post, according to BuzzFeed. Civil rights groups responding to the decision called it insufficient and described Trump's possible return to the social network as a threat to democracy. Some think the decision proves once again that lawmakers need to step in and regulate social media.

Trump, for his part, seems extremely dissatisfied with Facebook's decision. "Facebook's ruling is an insult to the record-setting 75 million people, plus many others, who voted for us in the rigged 2020 presidential election," Trump said in a statement released on Friday. "They shouldn't be allowed to get away with this censoring and silencing, and ultimately, we will win. Our country can't take this abuse anymore!"

It's not clear what Trump's return to Facebook would look like. Facebook has said the policy is partly intended to discourage politicians from breaking its rules again, but Facebook's current suspension hasn't stopped the former president from spreading election conspiracy theories on other platforms. Facebook hinted that Trump could potentially return once things are more stable, but it often seems that Trump himself is a major cause of the instability.

Notably, Trump won't be posting on Facebook until 2023 at the earliest, and the company has some shiny new rules. But overall, Facebook once again retains the power to decide what happens next.

Update, June 4, 6:10 p.m. ET: This piece has been updated with further analysis.
