Facebook relies on editors’ judgment for trending news feed, documents show
But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its “trending module” headlines – the list of news topics that shows up on the side of the browser window on Facebook’s desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.
In interviews with the Guardian, three former editors said they had indeed inserted stories that were not visible to users into the trending feed in order to make the experience more topical. All denied personal bias, but all said the human element was vital.
They admitted the role of human judgment in part because the company’s algorithm did not always produce the best possible mix of news.
+Note: We would love to do a thorough commentary on this, but the above highlights the issues in outline form. To our minds, the real question is not whether a story suddenly appears out of nowhere or is being suppressed; it is who actually uses the trending tabs, and why the module is always days behind. We had already noted anecdotally, during our own usage, the insertion of branded pages into the news feed: a story shows up in your actual feed because it has been tagged and you “like” that page. As a result, many pages we had previously “liked” were quickly unfollowed.
On the whole this seems a tempest in a political teapot, yet it does highlight something we’ve discussed before, as the last quote shows: if algorithms can’t curate the best mix on their own, then humans are obviously still needed.