Meta, the parent company of Facebook, has agreed to remove three posts found to incite violence, contain Islamophobic content, and support the violent riots that spread across the UK following a knife attack in Southport last year. The Oversight Board, which reviews Meta’s content moderation decisions, ruled that the posts must come down, stating that each had “created the risk of likely and imminent harm” and “should have been taken down.”

The riots erupted in July 2024 after a knife attack in Southport left three young girls dead and ten other people injured. The unrest was inflamed by misinformation circulating on social media about the attacker’s identity, including unfounded claims that he was an asylum seeker who had recently arrived in the UK by small boat. This misinformation stoked anti-Muslim and anti-immigrant sentiment, contributing to the violence that followed on the streets.

Meta initially allowed the contested posts to remain online after they were assessed by automated moderation tools; none received human review. After users who reported the posts appealed, the Oversight Board took up the case. In its ruling, the board criticised Meta for its delay in activating the Crisis Policy Protocol (CPP) and in designating the UK a high-risk location, noting that these measures took effect only on 6 August 2024, more than a week after the attack and after the harmful posts had already been published.

“The content was posted during a period of contagious anger and growing violence, fuelled by misinformation and disinformation on social media,” the Oversight Board commented. “Anti-Muslim and anti-immigrant sentiment spilled onto the streets.” The board further expressed concern about Meta’s slow deployment of crisis measures, emphasising that a more prompt response was necessary to prevent the amplification of dangerous content.

Meta had earlier acknowledged that leaving one of the posts visible on Facebook was an error and removed it, but maintained that the other two could stay up. Following the board’s ruling, a Meta spokesperson said, “We regularly seek input from experts outside of Meta, including the Oversight Board, and will act to comply with the board’s decision.”

The spokesperson also described the company’s response during the summer unrest: “In response to these events last summer, we immediately set up a dedicated taskforce that worked in real time to identify and remove thousands of pieces of content that broke our rules — including threats of violence and links to external sites being used to co-ordinate rioting.” Meta reported removing 24,000 posts from Facebook and Instagram for violence and incitement during this period, along with 12,000 posts that breached its hate speech policies.

Meta has committed to responding to the full recommendations of the Oversight Board within 60 days, in accordance with the board’s bylaws. The enforcement actions and board oversight highlight the ongoing challenges social media platforms face in moderating content during periods of social unrest and misinformation.

Source: Noah Wire Services