Facebook is targeting users who consistently share misinformation

The company plans to push down any posts released by these users to the bottom of the news feed.

Stacey Tay


Published: 27 May 2021, 1:41 PM

Facebook users who repeatedly share misinformation will soon face limitations, the social media giant announced in a press release on Wednesday, 26 May.

Accounts that continue to share fake news will be demoted by the company, with their posts moved lower down in News Feed.

In addition, users will see a warning if they are about to like a page that has previously shared content rated false by fact-checkers.

These changes are being implemented as the platform aims to reduce the spread of misleading or false information about COVID-19, vaccines and other topics.


These pop-up warnings allow Facebook users to make a more informed decision on whether they want to follow a page. PHOTO CREDIT: FACEBOOK


The company will also notify people when they share false content. The notification includes the fact-checker's article debunking the claim and gives users the chance to delete the post.

Facebook has yet to share details on how many times a user has to share misinformation before their account is demoted, or how demoted users can restore their privileges.
