Aleksandr Kogan: now we begin to understand the implications of his work. Photograph: Universal News and Sport

What price ethics for software designers in the poisonous era of Cambridge Analytica?

John Naughton

The programmers behind data analytics have unleashed forces they could never have imagined

On 12 September 1933, Leo Szilard, an unemployed Jewish physicist who had fled Nazi Germany, was walking down a street in Bloomsbury, London. He was brooding on a report in the Times that morning of a lecture given by Ernest Rutherford the previous day, at the annual meeting of the British Association for the Advancement of Science. In that lecture, the great physicist had expressed scepticism about the practical feasibility of atomic energy. Szilard stopped at a traffic light at the junction of Russell Square and Southampton Row. “As the light changed and I crossed the street,” he later recalled, “it suddenly occurred to me that if we could find an element which was split by neutrons and which would emit two neutrons when it absorbs one neutron, such an element, if it were assembled in sufficiently large mass, could sustain a nuclear chain reaction”.

This epiphany, which is recounted in Richard Rhodes’s monumental book The Making of the Atomic Bomb, was the key insight that led eventually to Hiroshima and Nagasaki. And after those atrocities, Szilard “felt a full measure of guilt for the development of such terrible weapons of war; the shape of things to come that he had first glimpsed as he crossed Southampton Row in 1933 had found ominous residence in the world partly at his invitation”.

Thus do ideas change the world. As the furore about Cambridge Analytica raged last week, I thought about Szilard and then about three young Cambridge scientists who brought another powerful idea into the world. Their names are Michal Kosinski, David Stillwell and Thore Graepel, and in 2013 they published an astonishing paper, which showed that Facebook “likes” could be used to predict accurately a range of highly sensitive personal attributes, including sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender.

The work reported in their paper was a paradigm of curiosity-driven research. The trio were interested in social media as a phenomenon and had a hunch about how unintentionally revealing its users’ online activities could be. They found a way of confirming their hunch. Since they are smart, they doubtless understood how valuable it could be to, say, advertisers.

What they might not have appreciated, though, was the power this conferred on Facebook. But one of their colleagues in the lab obviously did get that message. His name was Aleksandr Kogan and we are now beginning to understand the implications of what he did.

In a modest way, Kosinski, Stillwell and Graepel are the contemporary equivalents of Szilard and the theoretical physicists of the 1930s who were trying to understand subatomic behaviour. But whereas the physicists’ ideas revealed a way to blow up the planet, the Cambridge researchers had inadvertently discovered a way to blow up democracy.

Which makes one wonder about the programmers – or software engineers, to give them their posh title – who write the manipulative algorithms that determine what Facebook users see in their news feeds, or the “autocomplete” suggestions that Google searchers see as they begin to type, not to mention the extremist videos that are “recommended” after you’ve watched something on YouTube. At least the engineers who built the first atomic bombs were racing against the terrible possibility that Hitler would get there before them. But for what are the software wizards at Facebook or Google working 70-hour weeks? Do they genuinely believe they are making the world a better place? And does the hypocrisy of the business model of their employers bother them at all?

These thoughts were sparked by reading a remarkable essay by Yonatan Zunger in the Boston Globe, arguing that the Cambridge Analytica scandal suggests that computer science now faces an ethical reckoning analogous to those that other academic fields have had to confront.

“Chemistry had its first reckoning with dynamite; horror at its consequences led its inventor, Alfred Nobel, to give his fortune to the prize that bears his name. Only a few years later, [in May 1915] its second reckoning began when chemist Clara Immerwahr committed suicide the night before her husband and fellow chemist, Fritz Haber, went to stage the first poison gas attack on the Eastern Front.

“Physics had its reckoning when nuclear bombs destroyed Hiroshima and Nagasaki and so many physicists became political activists – some for arms control, some for weapons development. Human biology had eugenics. Medicine had Tuskegee and thalidomide, civil engineering a series of building, bridge and dam collapses.”

Up to now, my guess is that most computer science graduates have had only a minimal exposure to ethical issues such as these. Indeed, it’s possible they regard ethics as a kind of hobbyhorse for people who don’t have enough to do. Hackers, in contrast, have more important stuff on their plates, such as refining that algorithm for increasing user “engagement” by exploiting human foibles or finding a neat way of combining GPS co-ordinates with Facebook “likes” to suggest a nearby gluten-free bakery. And all the while enabling their employers to laugh all the way to the bank.

What I’m reading


Trump charged over Twitter blocks
Is the Twittersphere more like a virtual town hall or an informal convention? Donald Trump blocks people who annoy him on Twitter. Now, the New Yorker tells us, he finds himself in court facing blockees (if that’s the right word) who argue that, by blocking them, he has violated their First Amendment rights.

Mr Glass calling to fix your drains…
Does Having a Day Job Mean Making Better Art? is a lovely essay in the New York Times by Katy Waldman about artists who support their work with other jobs. I didn’t know that the composer Philip Glass, for example, once worked as a plumber.
