
Beyond Facebook's Impact on Political Polarization in the U.S.

Any time scientists at a company purport to have done a study involving said
company in any way, the public has good reason to be suspicious of the reported
conclusions. Were the folks running the company really intent on providing credible
information, they would use independent scholars (i.e., not being compensated by
the company). Such a management would want to obviate even the appearance of
a conflict of interesttheir desire to provide the public with an answer being so
strong. So the management at Facebook may not have been very invested in
providing the public an answer to the question: how much influence do users
actually have over the content in their feeds? In May 2015, three Facebook data
scientists published a peer-reviewed study in Science Magazine on how often
Facebook users had been exposed to political views different from their own.1 The "scientists" concluded that if users mostly see news and updates from friends who support their own political ideology, it's primarily because of their own choices, not the company's algorithm.2 Academic scholars criticized the study's methodology
and cautioned that the risk of polarized echo chambers on Facebook was
nonetheless significant.3 I was in academia long enough to know that
methodological criticism by more than one scholar is enough to put an empirical
study's findings in doubt. Nowadays, I am more attentive to the broader implications
of the echo-chamber criticism.
The entire essay is at Beyond Facebook's Impact.

1 Alexander B. Howard, "Facebook Study Says Users Control What They See, But Critics Disagree," The Huffington Post, May 12, 2015.
2 Ibid. I put the quotes around "scientists" to make the point that the conflict of interest renders the label itself controversial as applied to the study's investigators.
3 See, for example, Christian Sandvig, "The Facebook 'It's Not Our Fault' Study," Multicast, Harvard Law School Blogs, May 7, 2015.
