Facebook is targeting users who consistently share misinformation
The company plans to demote posts from these users, pushing them toward the bottom of the News Feed.
Facebook users who repeatedly share misinformation will soon face limitations, the social media giant announced in a press release on Wednesday, 26 May.
Accounts that continue to share fake news will be demoted by the company, and their posts will be moved lower in the News Feed.
In addition, users will see a warning before liking a Page that has repeatedly shared content flagged by fact-checkers.
These changes are being implemented as the platform aims to reduce the spread of misleading or false information about COVID-19, vaccines, and other topics.
The company will also notify people when they share false content. The notification includes the fact-checker's article debunking the claim and gives users the chance to delete the post.
Facebook has yet to say how many times a user must share misinformation before their account is demoted, or how demoted users can restore their standing.