It’s not social media, it’s us!

It’s not technology that is bringing about populism and political outcomes many of us don’t agree with. It’s people.

Unsurprisingly, drawing a causal link between the means of communication and today’s political landscape has become quite common. Thanks to intense news media coverage, the general public is getting more media-savvy. People discuss algorithms, the power of Facebook, the impact of Twitter, and they draw their conclusions. It’s all the Russians’ fault, they say, or: it’s all because of the filter bubbles.

But sorry, this is wrong. While social media and its ad-driven business model do plenty to amplify radical voices, and while Russia’s role in all this deserves investigation, let’s be clear. It’s not technology that is bringing about populism and political outcomes many of us don’t agree with. It’s people.

It was members of a British political elite, deluded by dreams of self-importance and a long-gone empire and hunting for a quick win among disappointed voters, who pushed for Brexit. They were noisily supported by traditional British media, which had far more impact than Twitter could ever dream of. It was the already polarized American society, with its undercurrents of racism and sexism, that led to the election of Donald Trump. And it was the fear of losing out in a fast-changing world that led people to hope for quick fixes and easy solutions, proposed by politicians who capitalize on those fears. Yes, people again.

This is not to downplay the effects of social media. The powerful platform companies had better hurry to understand the mechanisms that amplify hate speech, bullying, extremism and the like, and get a grip on them; fortunately, bad publicity and looming regulation have got them started. But one of the biggest dangers of the digital age is the temptation to shed human responsibility. As more and more processes are handed over to algorithmic decision-making, it will get easier for decision-makers to retreat from the line of fire. A qualified candidate doesn’t get invited to a job interview, a mortgage is denied? Must have been the algorithm. A blatant lie gets shared all over the place? Well, it just showed up on the timeline. A military drone targets civilians? Ouch, the programming must have gone awry.

But let’s remember: it is humans who code software in the first place. It is humans who neglect to pay attention to the rules it follows. It is humans who decide to share or ignore lies. It is humans who come up with lies in the first place. And it is humans who decide to go to war.

In the age of artificial intelligence, it is vitally important that human intelligence lend AI the values we would like to see in our societies, and that we keep reviewing those values in the light of public debate. There also need to be strong mechanisms of appeal for anyone who feels mistreated by software. All this becomes even more important when algorithms write algorithms; here, exit mechanisms need to be put in place.

No one is infallible. Judges, leaders, doctors, editors: none of us are always right. But as long as we feel responsible, there is much we can do to base our decisions on the best and latest information, and on our shared social values. And we should definitely use software to help us make these decisions, rather than let the software do the decision-making for us.

This column was published by NewsMavens on 23rd November 2018