Social media giants are mired in controversy. But kneejerk reactions from politicians should be avoided, writes Alex Krasodomski-Jones
The social media honeymoon is over. A decade ago, these disruptors were a breath of fresh air. Dressed in denim and thick-rimmed glasses, they freed our societies from ancient monopolies of power and media and speech. They were vanguards for democratic revolution in the Middle East. Connecting people together was a good in and of itself, perhaps even the ultimate good.
The scandal splashed across front pages last month – that Facebook data had been used to target political messaging – was just the latest in the litany of complaints that has come to characterise public perceptions of big tech. Accusations of harbouring illegal content, political manipulation, extremist use of platforms, hate speech and online abuse now pile up in policy inboxes and newsrooms on a daily basis.
The recent report by the Committee on Standards in Public Life was predictably scathing. Its chair, crossbench peer Paul Bew, castigated the platforms as the ‘most significant factor’ in driving the harassment and abuse of members of parliament and candidates during the 2017 general election. Our own research at Demos has, over the years, tracked abuse targeting women, minority ethnic and religious groups, the disabled, the LGBT+ community, victims of accidents and terror attacks, politicians, Lily Allen and so on.
One famous Silicon Valley motto is ‘move fast and break things’, and governments around the world are starting to consider whether fixing some of what has been broken might actually be a good idea.
How do you fix a problem like Facebook? It is not easy, and it is right to remember that. Kneejerk reactions make for bad technology policy. We cannot forget that moderating a platform as huge as Facebook is an enormous technological and resourcing challenge. We cannot hand over the responsibility for policing our public commons entirely to a Californian advertising company. We cannot forget that algorithms are not perfect: defining ‘abuse’ is hard enough for us humans, let alone the machines.
The solution floated most often is to treat these websites as publishers, not platforms: to force them to take responsibility for every piece of content their users upload, abusive or otherwise. I remain sceptical of this solution. Leaving aside the technological difficulty and scale of the task – even if they got it 99 per cent right, that is still millions of mistakes to drag up at select committee meetings – removing all limitations on liability threatens the very core of social media’s business model. Regulating these companies out of existence feels extreme.
Instead, the government should demand far greater transparency of the digital commons. Backed up by a clear and realistic framework of what is and is not acceptable, we must be able to measure the health of our online spaces, and demand that those standards are maintained. As it stands, we rely on the generosity of big technology to reveal their inner workings. That has to change.
Alex Krasodomski-Jones is a researcher at the Centre for the Analysis of Social Media at Demos. He tweets @akrasodomski