With Trump reinstatement, Meta finds new ways to punish world leaders

With its announcement that it would be reinstating former president Donald Trump’s Facebook and Instagram accounts, Meta warned that it could take a stronger stance against world leaders who post content that exacerbates civil unrest or breaks its rules.

The social media giant is expanding the range of interventions that it can deploy to fight dangerous rhetoric from Trump and other public figures — with new measures to decrease the visibility of provocative posts.

“We are moving into a chapter that [we] are going to be looking at this in a more nuanced way [than] whether … to leave it up or take it down,” said Katie Harbath, an outside technology consultant and former public policy official at Meta. “I’d like to see these more nuanced options.”

For months, Meta has been at the center of a wide-ranging debate over how world leaders who post problematic content should be handled. Democrats and progressive groups urged Meta to extend its suspension of Trump, arguing that his habit of spreading false accusations about election fraud was dangerous to the American people. Republicans and some free speech activists argued that a politician such as Trump deserves a platform on the country’s most popular social media networks.

Meta Global Affairs President Nick Clegg attempted to strike a middle ground on Wednesday when he argued that the company doesn’t want to interfere in the democratic process but that there should be limits to what politicians are allowed to post.

“The public should be able to hear what their politicians are saying — the good, the bad and the ugly — so that they can make informed choices at the ballot box. But that does not mean there are no limits to what people can say on our platform,” he wrote in a blog post. “When there is a clear risk of real world harm — a deliberately high bar for Meta to intervene in public discourse — we act.”

Under Meta’s new rules, Trump and other leaders whose accounts were reinstated after a suspension will face harsher penalties sooner for repeat offenses or posts that cause real-world harm. For starters, Trump and other public figures — a term Meta said includes government officials, politicians and users with more than 1 million followers — can be suspended for between one month and two years for a first violation, depending on its severity.

Controversial conservative pundits such as Ben Shapiro, with 8.9 million followers, and Jordan Peterson, with 1.9 million, could qualify as public figures under the policy.

Meta also unveiled new remedies to address posts from previously suspended public figures that don’t violate the company’s rules but could lead to harmful events, such as the Jan. 6, 2021, attack on the U.S. Capitol. The company said that for those borderline posts, and for rule-breaking content it deems newsworthy, it may leave the content on the public figure’s page but keep it out of followers’ news feeds. Meta may also remove the reshare button from those posts, stop them from being recommended, or bar them from running as ads.

The idea that tech companies should expand the way they fight the spread of baseless claims and violent and hateful rhetoric on their platforms has been gaining traction in academic circles for years. Some academics argue that social media platforms don’t have to limit themselves to deciding whether to take something down; they can let public figures post problematic content while preventing it from being shared with a large audience.

“Free speech does not mean free reach,” Renée DiResta, a researcher from the Stanford Internet Observatory, wrote in the technology publication Wired in 2018. “There is no right to algorithmic amplification.”

Some of that work is already happening at tech companies. Meta, for instance, will slow the spread of misinformation and hoaxes that normally wouldn’t be taken off its platforms because they aren’t about voting, the U.S. Census or the coronavirus. Even embattled Twitter owner Elon Musk seized on the popularity of that idea in November when he justified the company’s decision to reverse its bans on the Babylon Bee and Canadian psychologist Jordan Peterson. He said users wouldn’t be able to find “Negative/hate” tweets unless they specifically sought them out.

“New Twitter policy is freedom of speech, but not freedom of reach,” Musk tweeted at the time.

The Oversight Board, an independent group of experts, academics and politicians that oversees Meta’s content moderation decisions, praised the company for making “significant progress” on crafting penalties that are proportionate to the violation and introducing new ways to limit the distribution of content that doesn’t break the rules but could lead to offline harm.

But experts and activists say there are plenty of unanswered questions about how Meta will implement the new approach to moderating leaders such as Trump. The Oversight Board, which is funded by Meta, called on the company to be transparent about how it implements the new guardrails.

One question is whether the social media giant has the infrastructure to move quickly enough to implement the rules in situations in which the content could lead to imminent harm, said Laura Murphy, a former American Civil Liberties Union executive who oversaw a two-year civil rights audit of the company in 2018.

“Will they act quickly?” Murphy asked. “And how are they going to deal with close cases because a lot of politicians will figure out how to test the boundaries of the new system?”

Stanford law professor Nathaniel Persily, who studies free speech and technology issues, said it’s also not clear whether demoting the reach of Trump’s dicey Facebook posts would blunt their impact because they are likely to gain attention elsewhere.

“So much of the power of his social media feeds is his ability to then set an agenda for other people, especially the legacy media,” Persily said. “It’s not clear that demotion has the same impact on someone like him as it would for the average person.”

The new remedies also aren’t likely to solve all of Meta’s political problems, especially because members of differing political parties and groups are likely to spar over which remedy the company should use in heated moments.

“Everyone is going to have differing viewpoints about what they think the penalties should be,” Harbath said. “That’s just another Thursday for Meta and these companies.”