Facebook Provides New System to Help Better Inform Readers of News Content


Amid ongoing discussion about how Facebook was used to spread misinformation ahead of the 2016 U.S. Presidential Election, The Social Network continues to ramp up its efforts to improve news literacy and stop the distribution of fake (or ‘false’, as Facebook calls it) news content.

In its latest measure, Facebook is adding a new information button that users can tap to find out more about a particular news story in their News Feed.

When you tap the new information icon at the bottom right of a post’s feature image, you’ll see a pop-up with contextual information about the publisher, related articles on the same topic for additional context, and a map of where the post has been shared most widely.

The contextual information is pulled from the publisher’s Wikipedia entry, which should help underline the legitimacy of the source. If the publisher doesn’t have a Wikipedia entry, nothing will show up (as you can see in the second example in the video). This may help users better ascertain the validity of the source, though as TechCrunch notes, Wikipedia entries can also be manipulated (Facebook says this is rare, and that it’s relying on Wikipedia to uphold its quality standards).
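Facebook hasn’t detailed how this lookup works behind the scenes, but for context, Wikipedia does expose a public REST endpoint that makes this kind of check straightforward. The sketch below is not Facebook’s pipeline, just a minimal Python illustration of the summary lookup (the publisher names are arbitrary examples):

```python
import requests

# Wikipedia's public REST API endpoint for page summaries.
WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

def publisher_context(publisher_name):
    """Fetch a short description of a publisher from its Wikipedia entry.

    Returns None when no entry exists, mirroring the behavior described
    above, where the info pop-up simply shows nothing.
    """
    title = publisher_name.replace(" ", "_")
    resp = requests.get(WIKI_SUMMARY.format(title=title))
    if resp.status_code == 404:  # no Wikipedia entry for this publisher
        return None
    resp.raise_for_status()
    return resp.json().get("extract")  # the entry's lead paragraph

print(publisher_context("The New York Times"))    # short publisher summary
print(publisher_context("Nonexistent Outlet X"))  # None
```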

The links to alternative coverage have been active since April, though only for news posts generating significant discussion across the platform. Under the new system, Facebook will provide related links for all news posts via an automated system.

And then there’s the share map: a heatmap-style presentation of where the article has been shared most, which also appears to include, at the bottom, a list of your connections who’ve shared it.

I’m not exactly sure what the map is designed to show, but presumably, if an article about U.S. politics were trending and the heatmap showed the majority of shares occurring in Russia, that might raise suspicions.
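Facebook hasn’t said whether it runs any such check itself, but the suspicion described above is easy to express in code. A hypothetical sketch, with the share data and the “expected country” rule both invented for illustration:

```python
from collections import Counter

def share_hotspot(share_countries, expected_country, threshold=0.5):
    """Flag an article whose shares cluster outside its expected audience.

    share_countries: one ISO country code per recorded share (invented data).
    Returns the dominant country if it differs from the expected one and
    accounts for more than `threshold` of all shares; otherwise None.
    """
    if not share_countries:
        return None
    top_country, top_count = Counter(share_countries).most_common(1)[0]
    if top_country != expected_country and top_count / len(share_countries) > threshold:
        return top_country
    return None

# A U.S. politics story shared mostly from Russia would be flagged:
shares = ["RU"] * 70 + ["US"] * 20 + ["DE"] * 10
print(share_hotspot(shares, expected_country="US"))  # -> RU
```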

It’s an interesting addition, and one which will no doubt give readers the opportunity to better inform themselves and avoid sharing fake news. The question is whether they’ll bother. Sure, having this data immediately available should help dispel rumors, but much of the sharing of misinformation seems to be done by users who already distrust the media, and Facebook itself. They may see this as yet another measure to control the spread of information. And of course, many people will remain unaware of the new icon, or won’t bother to tap it even if they know it’s there.

But Facebook needs to do all it can. This week, the company came under increased scrutiny after handing over to Congress more than 3,000 ads purchased by Russia-linked groups, which were reportedly designed to ‘disrupt the 2016 U.S. election and further divide an already polarized nation’.

Of course, these weren’t news posts but targeted ads. Still, the issue highlights how Facebook can be used to spread misinformation, and with 45% of Americans now getting at least some of their news content from The Social Network, the company needs to act.

In terms of broader impacts, it’ll be interesting to see whether the new system helps or hinders smaller publishers. Many smaller organizations won’t have Wikipedia pages, which could count against them, while the ‘Related Articles’ listings could either help or hurt distribution. Facebook says the links displayed will be “from a wide variety of publishers that regularly publish news content on Facebook that get high engagement with our community.”
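Facebook hasn’t published its selection logic, but the quoted criteria (same topic, publishers that regularly post news, high engagement) suggest a simple filter-and-rank step. A hypothetical Python sketch, with every field name and threshold invented for illustration:

```python
def related_articles(candidates, topic, min_engagement=1000, limit=3):
    """Pick related links per the quoted criteria: same topic, publishers
    that regularly post news, and high engagement. All fields are invented.
    """
    matches = [
        a for a in candidates
        if a["topic"] == topic
        and a["publisher_posts_news_regularly"]
        and a["engagement"] >= min_engagement
    ]
    # Rank by engagement, keeping one link per publisher for variety.
    matches.sort(key=lambda a: a["engagement"], reverse=True)
    picked, seen_publishers = [], set()
    for a in matches:
        if a["publisher"] not in seen_publishers:
            picked.append(a)
            seen_publishers.add(a["publisher"])
        if len(picked) == limit:
            break
    return picked

posts = [
    {"publisher": "Outlet A", "topic": "election", "engagement": 5000, "publisher_posts_news_regularly": True},
    {"publisher": "Outlet B", "topic": "election", "engagement": 2500, "publisher_posts_news_regularly": True},
    {"publisher": "Outlet A", "topic": "election", "engagement": 1800, "publisher_posts_news_regularly": True},
    {"publisher": "Blog C",   "topic": "election", "engagement": 9000, "publisher_posts_news_regularly": False},
]
print([a["publisher"] for a in related_articles(posts, "election")])  # -> ['Outlet A', 'Outlet B']
```

The one-link-per-publisher rule here is just one way to read “a wide variety of publishers”; Facebook may weigh variety quite differently.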

That exposure may help smaller publishers expand their reach, or it could further restrict it. Or it may have little effect, if users don’t bother tapping the info icon.

It’s impossible to say what effect the update could have, but it’s good to see Facebook taking practical measures to dispel false reports – whether users take to them or not. 


