Democracy in computational conditions
In early September 2016, Aftenposten, the Norwegian conservative daily newspaper, shared on its Facebook account one of the most iconic images from the Vietnam War – Nick Ut’s photograph of a naked, screaming child running away from napalm bombs. Facebook censored the post. First, Facebook sent Aftenposten an email saying that, because of the nudity, it violated Facebook’s policies and requested it be either pixelated or taken down. Second, without any correspondence with Aftenposten, Facebook took the picture down. Third, when the journalist responsible for posting the image tried to dispute Facebook’s action, he was banned from posting for twenty-four hours. Aftenposten’s editor-in-chief, Espen Egil Hansen, was angry and unsettled by Facebook’s intervention. He wrote an open letter to Facebook founder Mark Zuckerberg. The issue exploded into the media all over the world and is still sizzling weeks later.
What is the big deal? After all, ultimately, Facebook reversed itself and allowed publication of the photo. But, Hansen did not let the issue drop because he thinks Facebook’s censorship is a systematic problem and not just an isolated incident. Hansen describes the problem like this: “An increasing part of the population states that Facebook is their main deliverer of information about what is going on in the world. Zuckerberg is de facto the most powerful editor-in-chief on the globe. [He] mainly exerts his editorial responsibility by means of advanced algorithms that control what information we get to see and what we don’t.”
I endorse Hansen’s call for a public debate with Zuckerberg, but he is mistaken to think that Zuckerberg is a powerful news editor. Zuckerberg is more like media mogul Rupert Murdoch: he does not edit news; rather, he owns the means of its distribution. Like Murdoch, if something goes wrong with his media company, he might be held accountable for its editorial practices. It is Facebook’s algorithms that are the powerful editor, not Zuckerberg. Consequently, we as a public need to see the software and have some means to edit or change Facebook’s algorithms.
So, certainly this should be a public debate. Zuckerberg claims that Facebook is a technology company and not a media company, but events like this one make that a debatable claim. Hansen’s point is that even if Facebook does not produce news content, it is an incredibly powerful editorial force because it distributes so much news content. Discussion and debate will make us more knowledgeable about how and when Facebook distributes or censors news.
But we need to go beyond discussion and debate and engage in issues of design and implementation. For the good of democracy, the public needs to weigh in on the design and implementation of Facebook’s algorithms and also those of many other companies and governments.
Hansen’s objections raise questions about how news censorship and distribution are conducted through algorithms. Throughout the world, analogous concerns have been raised about surveillance, privacy, voting, warfare, and the financial markets – all now inextricably entangled with software algorithms. In the United States, algorithms have become primary decision makers in the legal system, taking humans out of the process of terminating individuals’ welfare benefits; targeting people for exclusion from air travel; identifying parents believed to owe child support and instructing state agencies to file collection proceedings against those parents; purging voters from the rolls; and deeming small businesses ineligible for federal contracts.
You likely encounter the power of algorithms every day: when your credit card is accepted or declined; when you book a flight or buy something online; when you use a search engine, like Google. All these activities and more are mediated by software. Despite the seriousness of these algorithmic alterations to our governments and our everyday lives as citizens, the interrogation of technology’s role in democracy has largely been left to experts. It is now time for a larger public to participate.
Clearly I am asking for a lot, but what I have in mind dovetails with both the emerging priorities in education to teach children to program (e.g., starting fall 2016, French elementary school children will be taught to program) and with what free and open source software advocates have been demanding for years: to see the code, especially the code that implements algorithms essential to the conduct of our everyday social and political lives.
Schools need to teach children how to read and write code. But, literacy is not enough. We need adult citizens who are more than literate; who are fluent in software so that they can critique and redesign algorithms. How can we know if Facebook’s algorithms are making reasonable decisions if we do not know what alternative algorithms are plausible? We can argue that what Facebook currently does is bad for democracy, but that does not resolve Hansen’s question. How should Facebook be operating otherwise?
For instance, it is educational to know that, until a few years ago, the cutting edge of machine vision was designing an algorithm to recognize cats in YouTube videos. In other words, just getting a computer to pick out a given object in an image remains difficult research. Imagine, then, how hard it will be to write an algorithm that might be acceptable to Hansen: an algorithm to identify what is and what is not a culturally and politically important documentary image. But just because it is hard does not mean we should not try!
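To make the difficulty concrete, here is a minimal sketch, in Python, of two hypothetical moderation rules. Nothing in it reflects Facebook’s actual system: the scores, threshold, and function names are all invented for illustration. It shows why a blunt, single-threshold rule censors an image like Nick Ut’s photograph, while even a slightly more contextual rule – one that weighs documentary value – might not. The hard, unsolved part is producing the scores themselves.

```python
# A deliberately naive content-moderation rule, sketched to show why
# blunt thresholds misfire. All scores and thresholds are hypothetical.

def naive_moderation(nudity_score: float, threshold: float = 0.8) -> str:
    """Remove any image whose estimated nudity score exceeds a fixed threshold."""
    return "remove" if nudity_score > threshold else "keep"

def context_aware_moderation(nudity_score: float,
                             newsworthiness_score: float,
                             threshold: float = 0.8) -> str:
    """One plausible alternative: weigh documentary importance against the
    nudity score before removing. Both scores are assumed to come from
    upstream classifiers that this sketch does not implement."""
    if nudity_score > threshold and newsworthiness_score < 0.5:
        return "remove"
    return "keep"

# The 'Napalm Girl' photograph, scored hypothetically: a high nudity
# signal, but also very high documentary importance.
print(naive_moderation(0.9))                # -> remove (the blunt rule censors it)
print(context_aware_moderation(0.9, 0.95))  # -> keep (the contextual rule does not)
```

The sketch is trivial on purpose: the two `print` lines show that everything interesting is hidden inside the scores, which is exactly the machine-vision and cultural-judgment problem the paragraph above describes.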
Transparent government has been a democratic ideal at least since the Enlightenment. We need to have access to the codes and workings that govern us. Today, those include software algorithms too. We must see the source code of software algorithms. But, we need to do more than see: we need more than a window. We also need a door: a way into Facebook’s algorithms so that we can critique and change them. If we work on this together, we will have a chance to make algorithms match our democratic ideals.