
FACEBOOK is biased. That's true. But not in the way conservative critics say it is.

The social network's powerful news feed is programmed to be viral, clicky, upbeat or quarrelsome. That's how its algorithm works, and that's how it determines what more than a billion people see every day.
Algorithms in human affairs are generally complex computer programs that crunch
data and perform computations to optimize outcomes chosen by programmers.
Such an algorithm isn't some pure sifting mechanism, spitting out objective answers in response to scientific calculations. Nor is it a mere reflection of the desires of the programmers.
We use these algorithms to explore questions that have no right answer to begin with, so we don't even have a straightforward way to calibrate or correct them.
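To make the point concrete, here is a minimal sketch, not anything resembling Facebook's actual code, of how a feed "algorithm" amounts to optimizing toward parameters someone chose. The post fields and the weights are invented for illustration; the bias lives in the weights a programmer picked, not in any law of nature.

```python
# Illustrative sketch only: a toy feed ranker. The weights are
# programmer-chosen parameters -- change them and the "neutral"
# output changes with them.

def engagement_score(post, weights):
    """Score a post by programmer-chosen weights."""
    return (weights["clicks"] * post["clicks"]
            + weights["likes"] * post["likes"]
            + weights["comments"] * post["comments"])

def rank_feed(posts, weights):
    # Highest predicted engagement first; nothing here measures
    # truth, fairness or importance.
    return sorted(posts, key=lambda p: engagement_score(p, weights),
                  reverse=True)

posts = [
    {"title": "Nuanced policy analysis", "clicks": 40, "likes": 5,  "comments": 2},
    {"title": "Outrage-bait headline",   "clicks": 90, "likes": 60, "comments": 80},
]
weights = {"clicks": 1.0, "likes": 2.0, "comments": 3.0}

ranked = rank_feed(posts, weights)
print([p["title"] for p in ranked])
# The outrage-bait post wins under these weights (score 450 vs. 56).
```

With different weights, a different post "objectively" rises to the top, which is exactly why "surfaced by an algorithm" settles nothing.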
To defend itself against the charges of bias stemming from the "trending topics" revelation, Facebook said that the process was neutral, that the stories were first "surfaced by an algorithm." Mark Zuckerberg, the chief executive, then invited the radio host Glenn Beck and other conservatives to meet with him on Wednesday.
But "surfaced by an algorithm" is not a defense of neutrality, because algorithms aren't neutral.
Algorithms are often presented as an extension of natural sciences like physics or biology. While these algorithms also use data, math and computation, they are a fountain of bias and slants, of a new kind.
If a bridge sways and falls, we can diagnose that as a failure, fault the engineering, and try to do better next time. If Google shows you these 11 results instead of those 11, or if a hiring algorithm puts this person's résumé at the top of the pile and not that one, who is to say definitively which is correct and which is wrong? Without laws of nature to anchor them, algorithms used in such subjective decision making can never be truly neutral, objective or scientific.
Programmers do not, and often cannot, predict what their complex programs will do. Google's internet services are billions of lines of code. Once these algorithms with an enormous number of moving parts are set loose, they interact with the world, and learn and react. The consequences aren't easily predictable.
With algorithms, we don't have an engineering breakthrough that's making life more precise, but billions of semi-savant mini-Frankensteins, often with narrow but deep expertise that we no longer understand, spitting out answers here and there to questions we can't judge just by numbers, all under the cloak of objectivity and science.

If these algorithms are not scientifically computing answers to questions with objective right answers, what are they doing? Mostly, they optimize output to parameters the company chooses and, crucially, under conditions the company also shapes. On Facebook, the goal is to maximize the amount of engagement you have with the site and to keep the site ad-friendly. You can easily click on "like," for example, but there is not yet a "this was a challenging but important story" button.
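The missing button matters because an optimizer can only reward what the system measures. A hypothetical sketch, with invented numbers, of how a story readers privately valued loses to one they merely clicked "like" on:

```python
# Hypothetical illustration: "likes" is the only signal the system
# records, so optimizing it cannot see the value of a story readers
# found important but didn't enjoy enough to "like."

stories = [
    {"title": "Fun quiz",                 "likes": 500},
    {"title": "Hard investigative piece", "likes": 40},
]

# The optimizer ranks by the only thing it can measure.
winner = max(stories, key=lambda s: s["likes"])
print(winner["title"])  # the quiz wins, whatever readers found important
```

Whatever isn't captured as a signal simply doesn't exist for the ranking, which is one way the measurement choices, not neutrality, decide the outcome.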
Software giants would like us to believe their algorithms are objective and neutral, so they can avoid responsibility for their enormous power as gatekeepers while maintaining as large an audience as possible. Of course, traditional media organizations face similar pressures to grow audiences and host ads. At least, though, consumers know that the news media is not produced in some neutral way, nor is it above criticism, and a whole network, from media watchdogs to public editors, tries to hold those institutions accountable.
The first step forward is for Facebook, and anyone who uses algorithms in subjective
decision making, to drop the pretense that they are neutral.