You’ve already heard the news: Facebook is in the hot seat after admitting that its trending news section is not simply the output of an algorithm but is also shaped by its human newsroom editors. Worse yet, the allegations suggest that those editors have been intentionally suppressing right-wing news stories about Rand Paul, Donald Trump and the pro-life lobby so that less widely read left-leaning stories rise to the surface.
The fuss started last week with a feature article in Gizmodo. The investigation reported that “Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential ‘trending’ news section…” Just a week later, Mark Zuckerberg (who had apparently been on an extended leave to spend time with his new baby) is set to meet with 15 “prominent conservatives” to explain the company’s approach to news curation and defend its methods. As the Wall Street Journal reported earlier this week, “Attendees will include TV and radio host Glenn Beck, Donald Trump adviser Barry Bennett, Mitt Romney’s former digital director Zac Moffatt, CNN commentator S.E. Cupp, American Enterprise Institute President Arthur Brooks and Fox News’ Dana Perino, a Facebook spokeswoman said.”
But are algorithms necessarily unbiased? Also, why can’t Facebook take a stand like any other news outlet?
The real issue here is a big data question: is big data really less biased than human talent?
From Facebook’s trending news to recruitment and hiring (notably, a growing number of high-tech companies, including Google, are using algorithms to recruit and select employees), there’s an assumption that algorithms can do what humans can’t: make decisions without bias. In fact, there are several problems with this assumption. First, while Facebook’s algorithm may be able to determine which stories people are reading most at any given moment, one has to bear in mind that only some people are on Facebook and actively using the platform. Put bluntly, it is a mistake to assume that the platform’s active users reflect the broader public.
On the flip side, there is also an assumption that trained journalists necessarily lack objectivity. It may be true that no one is perfect: journalists’ subjective biases do, on some level, taint what they write and how they approach stories. Yet, at least at major news outlets like the New York Times, there are considerable checks and balances in place to ensure that everything from hard news to columns has some degree of objectivity. Even in columns, where opinion is allowed, all claims must be verified, which means fact checkers must be able to back up any claim with at least two solid sources.
So who is more or less biased? An algorithm or a human? Or is it simply that biases surface in different ways depending on whether you’re relying on an algorithm or human? If so, Facebook’s reliance on both algorithms and human talent to curate their trending news may in fact be a welcome compromise.
While Facebook has spent most of the past week scrambling to explain that its algorithm may from time to time get a bit of help from its human newsroom curators too, another controversy has come to the surface and once again, it focuses on the behavior of its newsroom staff.
On May 17, an article by a former Facebook contractor who worked in the company’s trending news department appeared in The Guardian. Published anonymously, presumably because the writer fears retaliation from the company, the article suggests that the real problem is not bias in the platform’s trending news feed but rather bias in the workplace itself. As the anonymous writer stated:
Most, if not all, of what you’ve read about Facebook’s Trending team in Gizmodo over the past few weeks has been mischaracterized or taken out of context. There is no political bias that I know of and we were never told to suppress conservative news. There is an extraordinary amount of talent on the team, but poor management, coupled with intimidation, favoritism and sexism, has resulted in a deeply uncomfortable work environment. Employees I worked with were angry, depressed and left voiceless – especially women.
Indeed, the anonymous writer went on to explain that women’s voices are routinely shut down in the Facebook newsroom and that, over the past two years, many women have left, even though Facebook compensates journalists at above-average rates. While Facebook has denied allegations that its newsroom is a hostile environment for women employees, this emerging story reminds us once again that the tech world’s problems are pervasive and complex.
Addressing bias in the tech world is something that will likely not be solved by high-tech solutions, such as big data, alone.