Facebook’s live-streamed Zuckerberg speech got a big thumbs up



Facebook does not show all the comments in real time on popular live broadcasts because the volume is too high, said Tucker Bounds, a Facebook spokesperson. To choose what does appear, Facebook relies on a number of “ranking signals” to filter out low-quality comments. The signals include how much people interact with a comment and whether it is “engagement-bait,” according to a Facebook post from earlier this year. The system applies to all comments on public pages and posts with large followings, and nothing about it was unique to Zuckerberg’s speech, Bounds added.
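Facebook has not published the formula behind those ranking signals, so the sketch below is only an illustration of how a weighted comment filter of the kind described might work; the signal names, weights and threshold here are assumptions, not Facebook’s actual system.

```python
# Hypothetical sketch of a ranking-signal comment filter (not Facebook's real system).
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    likes: int             # interaction signal: reactions on the comment (assumed)
    replies: int            # interaction signal: replies to the comment (assumed)
    engagement_bait: bool   # flag from a separate classifier (assumed)

def quality_score(c: Comment) -> float:
    """Combine interaction signals and penalize engagement-bait (weights are made up)."""
    score = 1.0 * c.likes + 2.0 * c.replies
    if c.engagement_bait:
        score -= 50.0
    return score

def visible_comments(comments: list[Comment], threshold: float = 5.0) -> list[Comment]:
    """Show only comments whose score clears the threshold, highest-ranked first."""
    ranked = sorted(comments, key=quality_score, reverse=True)
    return [c for c in ranked if quality_score(c) >= threshold]
```

Under a scheme like this, low-interaction or baiting comments are not deleted; they simply fail to clear the bar for display in the live ranked stream.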

Some people on Twitter noticed that the comments streaming alongside Zuckerberg’s talk were almost universally positive. Viewers had left more than 45,000 comments on the live stream by the time it ended, and the ones shown during the speech primarily thanked Zuckerberg and Facebook. More negative comments became visible after the talk ended, once the option to sort comments by most recent appeared.

Sad or angry emoji faces were also rarely shown on Zuckerberg’s public stream, where animated reactions floated by in real time; thumbs-up, heart and laughing icons dominated. The reason, Bounds says, is that the system shows emoji in proportion to how often each reaction is used rather than one-for-one, a necessity for a highly watched stream on which more than 80,000 emoji reactions were shared. Angry faces were clicked around 900 times during the broadcast, while the thumbs-up was used 77,000 times, according to statistics on the video feed.
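The proportional display Bounds describes can be pictured as weighted sampling: instead of animating every click, the stream draws a fixed number of icons with probabilities matching each reaction’s share of the total. The sketch below assumes that mechanism; only the roughly 900 angry and 77,000 thumbs-up counts come from the reported figures, while the “other” bucket and the display budget are placeholders.

```python
import random

# Reported counts: ~77,000 thumbs-up and ~900 angry reactions out of more than
# 80,000 total; "other" lumps the remaining reactions together (an assumption).
reaction_counts = {"like": 77_000, "angry": 900, "other": 2_200}

def sample_reactions(counts: dict[str, int], display_budget: int = 100) -> list[str]:
    """Pick a fixed number of reactions to animate, weighted by how often each was used."""
    total = sum(counts.values())
    return random.choices(
        population=list(counts),
        weights=[n / total for n in counts.values()],
        k=display_budget,
    )

# Angry reactions make up roughly 1% of the total, so an angry face appears
# only about once per hundred animated icons floating across the stream.
print(sample_reactions(reaction_counts))
```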

During the 35-minute talk at Georgetown University, Zuckerberg passionately defended the platform’s approach to free speech, saying “I believe we must continue to stand for free expression,” and that the cost of not allowing free speech, in all its messiness, is “too great.”

It’s not Facebook’s first run-in with confusing moderation. The company has been under fire most recently for its policy of allowing political ads that are misleading or contain lies. It’s also been criticized for helping spread disinformation ahead of elections and allowing hate speech. Facebook has tested out different fixes for its problems, such as using artificial intelligence to identify hate speech, adding fact checks to controversial posts and minimizing how many people see something that’s been identified as untrue.

The commenting system is an example of how Facebook uses technology to limit how many people see problematic posts while sticking to its free-speech policy. Content is not necessarily removed from the site unless it violates Facebook’s policies, for example as a threat of violence or hate speech. In this case, comments that the automated system identified as lower quality were simply made less visible.
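Expressed as simple logic, the approach described here treats removal and reduced visibility as separate outcomes. This is only a schematic, with the policy and quality checks reduced to placeholder booleans rather than real classifiers:

```python
# Schematic of the remove-vs-downrank split described above (not Facebook's code).

def moderate(violates_policy: bool, low_quality: bool) -> str:
    """Return the action taken on a comment or post."""
    if violates_policy:      # e.g. a threat of violence or hate speech
        return "remove"      # taken off the site entirely
    if low_quality:          # e.g. engagement-bait or little interaction
        return "downrank"    # stays on the site but is shown to fewer people
    return "show"            # surfaced normally in the ranked stream
```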

As for the comments that did surface during the stream, they were mostly grateful for the site and its CEO. “Thank you for Facebook, it is a powerful applications bringing friends and families together,” said Eunice Maku Ayiku-Nartey from Ghana.

“Thanks for creating such an amazing platform that makes it easier and convenient for us to [reach] out to our loved ones and friends without stress,” said Mary Gabby.




