The algorithms that determine what we see on social media platforms wield a lot of power, especially when it comes to the news people see. Facebook made big news recently when the company tweaked its algorithm. And did you know some real news stories are not written by humans, but by algorithms?
In this episode of Policy 360, Kelly Brownell discusses the promise and peril of algorithms with Phil Napoli. Napoli was recently awarded a fellowship by the Carnegie Corporation of New York to explore this topic more closely.
Phil Napoli’s upcoming book is titled Media Technocracy: The Rise of Algorithmic News and the Future of the Marketplace of Ideas.
First, what do you mean when you talk about social media algorithms?
There are some folks today who still aren’t aware of the extent to which our News Feeds are filtered. A lot of these platforms … made the decision that [the news feed we see is] not going to be chronological, that it would be curated, that it would be filtered on the basis of criteria related to our previous behavior patterns: what types of stories were we clicking on; what types of stories were we liking; what types of stories were we sharing; what types of stories were the people in our social network reacting to in all these different ways?
All that becomes a treasure trove of data that can be analyzed to determine what we do and do not see in our News Feeds.
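The curation Napoli describes can be pictured as a scoring-and-sorting step over engagement data. The following sketch is purely illustrative: the signal names, weights, and scoring formula are assumptions for the sake of the example, not Facebook's actual algorithm.

```python
# Toy illustration of engagement-based feed ranking.
# All fields and weights below are hypothetical, not any platform's real system.

def score(story, weights):
    """Weighted sum of a user's past engagement signals for one story."""
    return (weights["click"] * story["clicks"]
            + weights["like"] * story["likes"]
            + weights["share"] * story["shares"]
            + weights["network"] * story["friend_reactions"])

def rank_feed(stories, weights):
    """Order candidate stories by predicted engagement, not chronology."""
    return sorted(stories, key=lambda s: score(s, weights), reverse=True)

stories = [
    {"title": "Local election results", "clicks": 2, "likes": 1,
     "shares": 0, "friend_reactions": 3},
    {"title": "Celebrity gossip", "clicks": 9, "likes": 6,
     "shares": 4, "friend_reactions": 12},
]
weights = {"click": 1.0, "like": 2.0, "share": 3.0, "network": 0.5}

for s in rank_feed(stories, weights):
    print(s["title"])
```

Even in this toy version, the story with more historical engagement is pushed to the top of the feed, which is the dynamic the interview goes on to connect to exposure to partisan and false content.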
The fact that algorithms affect what we see on social media – is it a problem?
Does it tend to increase the likelihood that we are actually exposed to falsity? There is some evidence of that, because, unfortunately, there is a fairly consistent relationship between partisanship as it relates to news and falsity. … As long as that linkage exists, there is … concern that this process does increase our likelihood of being exposed to false information and decreases our likelihood of being exposed to the accurate information that might refute the false information.
What does this news from Facebook mean?
One, it means that they would like to be a less prominent source of news for people than they have become.
What was interesting was that a few days after the announcement about diminishing the role of brands and news outlets in the News Feed, they also announced another new program that was going to put users in the position of evaluating the trustworthiness of individual news sources. Those evaluations would become a determining factor in the algorithm that places news in your News Feed.
[Facebook is] really approaching the issue of how they operate as a distributor of news from a few different [angles] and seem to really be trying to reconfigure the dynamics of how they present news to us.
What do you think this all means for policy?
That is a really good question. I tend to use the term governance in this context, because I don’t think this is the kind of subject [that] we want government alone (government agencies) dealing with.
It would be very interesting … if we had a federal agency that was tasked with policing the activities of social media platforms and dissemination of news.
What I really advocate is a real multi-stakeholder approach in which not only are users involved in [vetting news in] some ways, perhaps, but also … a much more robust self-regulatory apparatus is created in which [platforms like Facebook] perhaps vet their algorithms according to particular public interest values that have [thus far] not really been part of how these algorithms work, and in which more stakeholders, perhaps even news media and academic researchers, are involved.
[I imagine] the whole process becomes more participatory.
Computer algorithms can actually write stories?
Yes. If you read a story about a college sport outside the major sports, or if you read an earnings report story, these days the odds are that that story was written by an algorithm….
It can produce a summary of a baseball game on the basis of the box scores. It can produce earnings report stories on the basis of the statistics in those reports.
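At its simplest, this kind of system fills a prose template from structured data. The sketch below is a hypothetical minimal version of generating a game recap from a box score; the field names and template are assumptions, and real news-generation systems use far richer templates and language pipelines.

```python
# Hypothetical sketch: filling a recap template from box-score data.
# Data fields and wording are illustrative, not any vendor's actual system.

def game_recap(box):
    """Turn a structured box score into a one-sentence recap."""
    if box["home"]["runs"] > box["away"]["runs"]:
        winner, loser = box["home"], box["away"]
    else:
        winner, loser = box["away"], box["home"]
    return (f"{winner['team']} beat {loser['team']} "
            f"{winner['runs']}-{loser['runs']}; "
            f"{winner['top_hitter']} led the way with "
            f"{winner['top_hits']} hits.")

box = {
    "home": {"team": "Bulls", "runs": 5, "top_hitter": "J. Doe", "top_hits": 3},
    "away": {"team": "Hawks", "runs": 2, "top_hitter": "A. Roe", "top_hits": 2},
}
print(game_recap(box))
# → Bulls beat Hawks 5-2; J. Doe led the way with 3 hits.
```

Because the inputs are structured and repetitive, domains like sports recaps and earnings reports are exactly where this templating approach works well, which matches the areas Napoli identifies next.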
Those tend to be the primary areas where this is happening, but the companies that do this kind of work also have expressed some ambitions to see the nature of this sort of algorithmically generated news reporting expand.
Can we tell the difference if a story is written by a computer program?
You know, what’s interesting is that the studies that have been done on that have found, generally, no. There might be some subtle ways in which we rate a story slightly differently, but in terms of our being able to identify which one was or was not written [by an algorithm] – absolutely not.