I’m deleting Facebook, for your benefit as much as mine

All of us contribute to making algorithms dangerously accurate

This is not about mental health, although that alone is (or should be) a good enough reason for deleting your Facebook account. It’s also not about “living life in the moment”, or “returning to a simpler time”. This will not be a holier-than-thou sermon to the effect of “I’m deleting Facebook, and if you know what’s good for you, you will too”. This is a response to the revelations surrounding Cambridge Analytica, who used data from Facebook to compromise the integrity of the democratic process worldwide.

In a curiously prescient art film from 1973 titled “Television Delivers People”, text scrolls over a blank background, proclaiming that “The viewer is not responsible for programming…you are the end product.” In the case of social media, to say we are only products is an oversimplification. Our data is the product. And we, as consumers, produce it. We are individually and collectively responsible for our own ‘programming’, though we can’t choose what form it takes, or how it is delivered.

We’ve known for over a year that we can’t trust Facebook to be responsible – not with the headlines they show us, not with the research they conduct on us, and certainly not with our data. Whilst treating their user base as a combination of lab rats and golden geese, they have repeatedly proven themselves arrogant, duplicitous, and incompetent. At a time when security breaches at tech companies are commonplace and black markets for stolen data are multi-million-dollar affairs, these opinions are not controversial.

But this was no security breach. Facebook didn’t fail to defend our data; they simply sold it. In effect, they sold us. The fields of psychometrics and quantitative psychology, combined with vast quantities of user data, allowed governments and companies such as Cambridge Analytica to manipulate us in unprecedented ways and to, as Mark Turnbull so charmingly put it, “drop the bucket further down the well than anybody else, to understand…those really, deep-seated underlying fears, concerns”. Turnbull is not talking about individuals here. He is talking about aggregates and averages. This is the law of large numbers at work: an individual person is very difficult to predict, but a large group of people is far easier. Every reaction I choose, every video I watch, and every meme I tag my friends in contributes to an algorithm’s accuracy in targeting not only me, but also people like me. I produced the data. Facebook sold the data. Cambridge Analytica used the data. Now a demagogue sits in the White House, and Britain is leaving the EU. Who is responsible?
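To make that statistical point concrete, here is a toy simulation of the idea (the numbers are invented for illustration, not drawn from any real model): guessing any one person’s leaning from weak signals is barely better than a coin flip, but estimating how a large group leans can be done to within a fraction of a percentage point.

```python
# Illustrative sketch only: a made-up example of the "law of large numbers"
# point above. Predicting one person is nearly a coin flip; estimating the
# average behaviour of a large group is very precise. All figures invented.
import random

random.seed(0)

TRUE_RATE = 0.53        # hypothetical share of a group leaning one way
GROUP_SIZE = 100_000    # hypothetical audience size

# Each person "leans" one way with probability TRUE_RATE.
people = [random.random() < TRUE_RATE for _ in range(GROUP_SIZE)]

# Individual prediction: always guess the majority lean.
# You are right only about 53% of the time - barely better than chance.
individual_accuracy = sum(people) / GROUP_SIZE

# Aggregate prediction: estimate the group's lean from a modest sample.
# The estimate lands within about a percentage point of the true rate.
sample = random.sample(people, 5_000)
estimated_rate = sum(sample) / len(sample)

print(f"Guessing one person correctly: ~{individual_accuracy:.0%} of the time")
print(f"Estimated group lean: {estimated_rate:.1%} (true value {TRUE_RATE:.0%})")
```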

Of course Cambridge Analytica is. They knowingly misused data meant for academic research purposes, and gleefully provided Steve Bannon with a weapon with which to wage his culture war.

Of course Facebook is. They’ve played fast and loose with our data in a way that, at best, indicates gross negligence.

But I am too. The data I produced for Facebook refined every algorithm it was fed into. This is true not only of Cambridge Analytica, but of all the other companies and governments (I’m looking at you, Russia) that run the same kind of analysis. My data is being misused, and I am, however indirectly, responsible for its effects. Similarly, people who don’t get immunised against disease are responsible for the (now alarmingly rising) death toll from preventable diseases. To protect the health of the herd, every animal has to pitch in and get vaccinated. I can’t be certain that Facebook (or blackhat hackers) won’t sell my data to another company or government looking to manipulate an election. The only solution I see is to delete my data, in the hope that it knocks a fraction of a percentage point of accuracy off the algorithms trying to scare someone into voting for a racist, or a misogynist, or a fascist, or some combination of the above. If enough people did the same, those algorithms would stop working entirely.

I have read arguments that claim there is no need to quit Facebook entirely. They point out how embedded the platform is in our lives, how difficult it is to delete your data, and how much positive potential it holds. Initially, I found them compelling. But then indignation kicked in. How could I let myself be held hostage by photos (or “memories”, as Facebook insidiously terms them) which I could easily download? Wouldn’t a new, professional account, devoid of anything but the bare minimum of personal data, be sufficient for work? Surely the process of deletion couldn’t be more difficult than navigating Weblearn for the first time? (It isn’t.) Yes, Facebook’s potential to bring people together is unparalleled. But the source of its strength is also its greatest weakness – it can only function effectively with a huge number of users. We can easily envision a social network built upon principles of transparency, privacy, and integrity (perhaps based on blockchain technology). Viewed as a consumer choice, quitting Facebook gives tech companies a tangible incentive, measured in user numbers, to build that network. It falls to us to create that incentive, to show that this is a status quo we will not, under any circumstances, tolerate. #DeleteFacebook.
