# Months after Russian invasion, Meta is tweaking its content policies

Nearly a year after Russian forces invaded Ukraine, Facebook parent company Meta is tweaking its content moderation strategy over the bloody conflict.

The most recent change removed the Azov Regiment, a Ukrainian far-right military group, from the social media giant’s list of dangerous individuals and organizations. That change will allow members of the Azov Regiment to create accounts on Facebook and Instagram and post content without fear of it being removed unless it breaks the company’s content rules. The move will also enable other users to explicitly praise and support the group’s work.

The shift in policy follows months of scrutiny over how the social media giant is drawing the line between supporting free expression about the war and mitigating rhetoric that could lead to dangerous or violent consequences offline.

Meta’s Oversight Board, an independent body of academics, activists and experts who review Meta’s content moderation decisions, has argued in recent months that the company has gone too far in squashing content that criticizes authoritarian governments or leaders.

Historically, the Azov Regiment has been controversial. It is among Ukraine’s most adept military units and has battled Russian forces at key sites, including the besieged city of Mariupol and near the capital, Kyiv.

But the group’s connections to far-right nationalist ideology raised concerns that it was attracting extremists. When Russian President Vladimir Putin cast his assault on Ukraine as a quest to “de-Nazify” the country, seeking to delegitimize the Ukrainian government and Ukrainian nationalism as fascist, he was partly referring to the Azov forces.

In this case, Meta argues that the Azov Regiment is now separate from the far-right nationalist Azov Movement. It notes that the Ukrainian government has formal command and control over the unit.

Meta said in a statement that other “elements of the Azov Movement, including the National Corps, and its founder Andriy Biletsky” are still on its list of dangerous individuals and organizations.

“Hate speech, hate symbols, calls for violence and any other content which violates our Community Standards are still banned, and we will remove this content if we find it,” the company said.

Mykhailo Fedorov, Ukraine’s minister of digital transformation, praised Meta’s decision and singled out Meta’s president for global affairs, Nick Clegg, the former British deputy prime minister.

“Means a lot for every Ukrainian. New approach enters the force gradually,” Fedorov tweeted on Jan. 19. “Big contribution @nickclegg & his team in sharing truthful content about war.”


Last summer, Fedorov had complained in a letter to Clegg that Meta’s use of automated content moderation systems unfairly blocked Ukrainian media organizations from sharing accurate information about the war at a time when Russian propaganda was proliferating online. During the early stages of the war, Fedorov had also pressured Apple, Facebook and other companies to build a “digital blockade” against Russia.

Meta’s decision on Azov is not the only recent change to the company’s rules. Earlier this month, the Oversight Board announced it had overturned a decision by Meta to remove a Facebook post protesting the Iranian government’s treatment of women, including Iran’s strict compulsory hijab laws.

The decision involved a post displaying a cartoon of Iran’s supreme leader, Ayatollah Ali Khamenei, in which his beard forms a fist grasping a woman who wears a hijab and has chains around her ankles. The Farsi caption called for “marg bar,” or “death to,” the “anti-women Islamic government” and its “filthy leader Khamenei.”

Facebook removed the post, citing its call to violence, though it later restored the post under an exception for newsworthy content after the Oversight Board agreed to hear the appeal.

In its ruling, the Oversight Board said that in some contexts, “marg bar” is understood to mean “down with.” The board argued that Meta didn’t need to apply a newsworthiness exception because the post hadn’t broken the company’s rules in the first place, and said the rhetoric in the post was being deployed as a “political slogan, not a credible threat.”

“The Board has made recommendations to better protect political speech in critical situations, such as that in Iran, where historic, widespread protests are being violently suppressed,” the board wrote in its ruling. “This includes permitting the general use of ‘marg bar Khamenei’ during protests in Iran.”

The Oversight Board deliberated as scores of Iranians protested the death of Mahsa Amini in the custody of Iran’s notorious “morality police.”

In November, the Oversight Board also overturned Meta’s decision to remove a Facebook post that likened Russian soldiers who invaded Ukraine to Nazis. The Oversight Board said the Facebook post — which included the image of what appeared to be a dead body and quoted a poem calling for the killing of fascists — did not violate the company’s content rules or its responsibility to protect human rights.

After the Oversight Board selected the case, Meta rescinded its previous decision to remove the post for violating its rules against hate speech, which bar users from posting “dehumanizing” content about groups of people. Later, the company applied a warning screen to the photograph, alerting users that the content might be violent or graphic. The board’s ruling overturned Meta’s decision to put a warning screen on the post, and the company said at the time that it would review other posts with identical content to determine whether to take action.

In the early weeks of the war, Meta decided to allow some calls for violence against Russian invaders, creating an unusual exception to its long-standing hate speech rules that prohibit such language. Clegg wrote in an internal post that the company would refer the guidance it issued to moderators to the Oversight Board, according to a copy of the post viewed by The Washington Post.

Later, Meta withdrew its request for the Oversight Board to review its approach to content about the war, citing “ongoing safety and security concerns.” That prompted criticism from the Oversight Board.

“While the Board understands these concerns, we believe the request raises important issues and are disappointed by the company’s decision to withdraw it,” the board said in a statement at the time.
