Meta Platforms said on Thursday it has resolved an error that caused violent and graphic videos to flood Instagram’s Reels feeds worldwide.
The company did not disclose the cause of the glitch but confirmed that users were shown the material even with the “sensitive content control” setting enabled, a filter meant to screen out exactly this kind of content.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson said.
It remains unclear how many users were impacted.
The incident comes as Meta faces growing scrutiny over its content moderation practices. Just last month, the company scrapped its U.S. fact-checking program on Facebook, Instagram, and Threads — platforms with over 3 billion users globally.
While Meta prohibits violent and graphic content, it allows exceptions for videos that raise awareness about issues like human rights abuses and conflict. In recent years, the company has increasingly relied on automated moderation tools — a shift expected to accelerate following the fact-checking rollback in the U.S.
Criticism of Meta’s content policies has mounted over time. The company was previously accused of failing to curb the spread of violent content during the Myanmar genocide, promoting eating disorder-related posts to teens on Instagram, and allowing misinformation to spread during the COVID-19 pandemic.
This latest glitch raises fresh concerns about Meta’s ability to balance personalized content recommendations with user safety.