The Current Outrage at Facebook Misses a Huge Point

Source: Conservative Review | May 11, 2016 | Robert Eno

The current conservative outrage at Facebook misses a larger point. Facebook users were led to believe that trending stories were selected by a computer algorithm, not by human interference. The betrayal of that trust is a much broader problem than finding out that humans let their bias affect their work. If the controversy leads to more transparency from Facebook, that will ultimately be a good thing.

….

According to the company, Facebook has over one billion daily unique users. That is one billion trending news sections, “personalised[sic] for each user, based on their likes and those of their friends.” It is impossible for any set of humans to personalize anything for over one billion users a day. That’s where the algorithm comes in.

At its core, an algorithm is a set of instructions or rules for solving a problem. Algorithms are the basis for just about everything a computer does. Facebook uses algorithms to determine what is trending. That is how it is able to personalize trending news topics for over a billion users a day. Most algorithms run without human involvement. Therein lies the problem. Facebook made it seem like its users were getting a non-curated product. Nowhere in its original announcement of the feature did it mention human involvement in curation.
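To illustrate what "algorithm" means in this context, here is a toy personalization routine that ranks topics by likes. This is a purely hypothetical sketch for explanation only; Facebook's actual trending algorithm is not public, and the function name, weights, and inputs below are invented for this example.

```python
from collections import Counter

def trending_topics(user_likes, friend_likes, top_n=3):
    """Rank topics by combined like counts from a user and their friends.

    Hypothetical illustration: the real Facebook algorithm is unknown,
    and these weights are invented for the example.
    """
    scores = Counter()
    for topic in user_likes:
        scores[topic] += 2          # weight the user's own likes more heavily
    for likes in friend_likes:
        for topic in likes:
            scores[topic] += 1      # each friend's like counts once
    # most_common returns topics sorted by descending score
    return [topic for topic, _ in scores.most_common(top_n)]
```

A routine like this runs with no human in the loop, which is the kind of automated personalization users assumed they were getting.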

….

The bigger story was released by Gizmodo a week earlier. It was a look at the inner workings of the Facebook “trending” operation as a whole. Gizmodo wrote that Facebook contractors told them, “we choose what’s trending.” Users were led to believe something different. It is this lack of openness that causes a lack of trust in the Facebook product among its user base.

….

Stocky goes on to explain that the reviewers — who up until now were not actively publicized — have the discretion to disregard stories that are fake, duplicate topics, hoaxes, etc. That sounds fair, but the part about insufficient sourcing may be the loophole through which a human, and their bias, can seep in. While he argues that bias cannot seep in, he admits that reviewers take a subjective look at whether news is properly "sourced."

After his Facebook post, it was revealed that Stocky is a maxed-out donor to Hillary Clinton's presidential campaign.

The key takeaway here is that when you announce a product feature and pretend it is something it is not, it will come back to bite you. That is exactly what has happened here. Facebook users were under the assumption that trending news was based on what they and their friends shared, not on what a dozen people in an office in Manhattan thought was important.
