Groups were a major source of misinformation on the platform.
This story originally appeared on Engadget
Facebook stated that groups devoted to health-related issues are no longer eligible to appear in recommendations. The update is part of the company’s recent efforts to combat misinformation.
“To prioritize connecting people with accurate health information, we’re starting to stop displaying health groups in recommendations,” Facebook wrote in a statement. Users will still be able to search for these groups and invite others to join, but the groups will no longer appear in suggestions.
Facebook groups, especially those dealing with health-related topics, have long been a problem for the company. Anti-vaccine groups, for example, have been linked to QAnon and COVID-19 misinformation, often spread through the platform’s own algorithmic recommendations. Mark Zuckerberg recently said the company will not remove anti-vaccine posts the way it does COVID-19 misinformation.
Regarding QAnon, Facebook says it is taking an additional step to curb the spread of groups related to the conspiracy theory by “reducing their content in the news feed.” The company previously removed hundreds of groups associated with the movement, but did not completely eliminate its presence.
Image: Facebook via Engadget
Facebook is also now archiving groups that no longer have an active administrator. “In the next few weeks we will begin to archive groups that have not had an administrator for a while,” the company writes. Going forward, Facebook will suggest admin roles to existing group members before archiving a group.
Facebook notes that it is penalizing groups that repeatedly share false claims debunked by its fact-checkers, and that in the past year it has removed more than a million groups for repeatedly violating its rules.
However, critics have long said that Facebook does not do enough to police groups on its platform that have been linked to disinformation, harassment, and threats of violence. The company came under fire last month for failing to remove a Wisconsin militia group that organized an armed response to protests in Kenosha until the day after a deadly shooting. And several Facebook groups have been blamed for hampering the emergency response to the devastating Oregon wildfires by spreading unsubstantiated conspiracy theories about how the fires started. Facebook eventually began removing those claims after emergency services asked people to stop sharing rumors.