Rethinking democracy in the era of big data

By: Julie Paquette


Christopher Wylie, the Canadian whistleblower of the Facebook–Cambridge Analytica scandal, made public the political data manipulation that happened during the 2016 United States presidential election.

As far as we know, the data of more than 80 million account holders were collected to target voters and influence votes in favour of the Donald Trump campaign.

This powerful data technology, Wylie reported, was also allegedly used by the “Leave” camp during the Brexit campaign. People already lack trust in democratic institutions; this scandal is likely to reinforce that skepticism and can be seen as a threat to democracy.

To grasp what this scandal reveals on a political scale, we must understand what these data represent and what ethical and political concerns they raise.

We now have the capacity to collect and store a massive amount of data ‘forever.’

Technically, this means everything you do online can be collected, stored and analyzed to reveal patterns and to ‘personalize’ your web experience. As Viktor Mayer-Schönberger (professor of Internet Governance and Regulation at Oxford) highlights, this technology goes hand in hand with “the quest to quantify and understand the world.” Big Data (the name for this massive amount of data) is often understood as a duplication of reality, as if A (the reality) were equal to A’ (the collected data).

Even if we know this is only partially true (not everything is collected yet, and some data are corrupted), the belief in this equation has a perverse effect: A’ is tending to become A, which means our reality is becoming the collected data.

By extension, there is no room for what exists outside the Big Data world. If you do not participate online, you are, in a way, no longer part of this reality.

Moreover, the amount of collected data is so large that if you belong to a minority, even if you participate, you fade away in this massive flow of information.

As highlighted by Antoinette Rouvroy in her work on algorithmic governmentality, what we used to call the ‘virtual’ no longer reflects the unfulfilled potential or the excess of possibility.

The virtual is the data and is reduced to pure functionality.

In addition, the collected data are not only quantifiable – they are valuable.

As has been said many times, data is the oil of the 21st century, and our participation in these web platforms is making this digital economy highly profitable.

Some may say that this data can be used for the greater good, such as in health research, where it has been shown, for example, to improve transplant matches.

But as Cathy O’Neil points out in her book Weapons of Math Destruction, this can also increase inequality and threaten democracy.

In a way, we could say that Big Data is a pharmakon, a concept first developed by Plato, as it can denote both a poison and a cure.

Ethical and political issues abound with Big Data: What is the value of our consent? Is accepting the terms and conditions (which, honestly, none of us look at) enough? Does the right to privacy still matter and, if so, who should protect that right (the state, corporations, individuals)? What is the value of privacy? Is it ethical to make a profit from it and, if so, who should benefit from that profit? Is Big Data a threat to diversity?

Doesn’t the so-called personalization of the web experience also create an echo chamber where we are subjected to similar content, which could create a nasty feedback loop and render the very idea of pluralism obsolete? Also, shouldn’t those in power be accountable?

The algorithms that process these data are mostly opaque. We must remain on our guard.

As Astra Taylor details in her book The People’s Platform, “The more customized and user friendly our computers and mobile devices are, the more connected we are to an extensive and opaque circuit of machines that coordinates and keep tabs on our activities; everything is accessible and individualized, but only through companies that control the network from the bottom up.”

Most of us would agree that democracy is something we should protect.

A democratic process should include some basic principles, such as procedural fairness, accountability, division of power and autonomy, and be by the people, for the people.

Some might respond to the lack of confidence in our institutions by promoting an online democratic platform (liquid democracy), but those who favour this idea often also defend the notion that a perfect democracy would be instant, direct and without mediation.

They also assume that 100 per cent participation is the ideal.

We can ask: Aren’t we caught up in the A = A’ situation mentioned above? If you saw the movie The Circle, you already have an idea of how dystopian this dream can be.

We must remember that a democratic process takes time, and that going against this basic fact risks undermining the importance of discussion between people who do not think alike (at least at first glance).

Bringing Big Data into democracy, whether through Cambridge Analytica or through a liquid-democracy platform, is risky. Reflecting on the ethical and political concerns that any kind of digital democracy raises might be the first step in protecting us from the worst of what democracy has engendered in the past.

Never again.


Julie Paquette is from the School of Public Ethics, Saint Paul University.