Facebook is taking a new approach to help users stay informed within the app by sending dedicated alerts to users who have interacted with posts identified as containing false information, with an initial focus on COVID-19 updates.


Report By Fast Company

[Facebook] will now send notifications to anyone who has liked, commented on, or shared a piece of misinformation that’s been taken down for violating the platform’s terms of service. It will then connect users with trustworthy sources in an effort to correct the record.

[Image: Facebook misinformation alert — Source: Social Media Today]

As seen in the image above, the new alerts use specific wording to let users know why they are being notified.

We removed a post you liked that had false, potentially harmful information about COVID-19.

The notification will show details about the removal and explain why the content was taken down.

Report By Platformer

For the interview study, eight of 15 participants said that platforms have a responsibility to label misinformation and we’re glad to see it. The remaining seven took a hostile attitude towards labeling, viewing the practice as “judgemental, paternalistic and against the platform ethos.”

One of the study participants stated:

I thought the manipulated media label meant to tell me the media is manipulating me.

The new labels will give each user more specific context, which may prompt more users to rethink their sharing habits. Another study released this year found that when only some posts are labeled as fact-checked or disputed, users tend to assume that any unlabeled stories are accurate, even if they are completely false.

According To The Report

In Study 1, we find that while warnings do lead to a modest reduction in perceived accuracy of false headlines relative to a control condition (particularly for politically concordant headlines), we also observed the hypothesized implied truth effect: the presence of warnings caused untagged headlines to be seen as more accurate than in the control. In Study 2, we find the same effects in the context of decisions about which headlines to consider sharing on social media.

Conclusion

Facebook is now working to make these notifications more specific, but they could still create a false sense of security: users may assume Facebook will alert them to every false report and become less inclined to verify information for themselves. It is a difficult balance to strike, and with vaccines rolling out around the world, Facebook is doing everything it can to restrict the spread of misinformation about the virus.
