Maria Miller is a former Culture Secretary and is MP for Basingstoke.
I want 2021 to be the year that we finally grasp the nettle of online abuse – to create a safer, more respectful online environment, that will lead to a kinder politics too.
The need has never been greater. Abuse, bullying, and harassment on social media platforms are ruining lives, undermining our democracy, and splintering society.
As an MP, I have had to become accustomed to a regular bombardment of online verbal abuse, rape threats, and even death threats. In this I am far from alone. Female colleagues across the House are routinely targeted online with abusive, sexist, threatening comments. As Amnesty has shown, black female MPs are most likely to be subjected to unacceptable and even unlawful abuse.
And while women and people from an ethnic minority background are more likely than most to receive abuse online, they are not alone. Hate-filled trolls and disruptive spammers consider anyone with a social media presence to be fair game: one in four people have experienced some kind of abuse online, and online bullying and harassment have been linked to increased rates of depression, anxiety, and suicide.
While the personal impact of online abuse is intolerable, we must not underestimate the societal effect it is having. Research by the think-tank Compassion in Politics found that 27 per cent of people are put off posting on social media because of retributive abuse. We cannot have an open, honest, and pluralist political debate online in an atmosphere in which people are scared to speak up.
Which is why I am working cross-party with MPs and Peers to ensure that the upcoming Online Harms Bill is as effective as possible in tackling the scourge of online abuse.
First, the Bill must deal with the problem of anonymous social media accounts. Anonymous accounts generate the majority of the abuse and misinformation spread online, and while people should have the option to act incognito on social media, the harm these accounts cause must be addressed.
I support a twin-track system: giving social media users the opportunity to create a “verified” account by supplying a piece of personal identification, and the ability to filter out “unverified” accounts. This would give choice to verified users while continuing to offer protection to those, such as whistleblowers, who want to access social media anonymously.
The public back this idea. Polling by Opinium for Compassion in Politics reveals that 81 per cent of social media users would be willing to provide a piece of personal identification (most probably a passport, driving licence, or bank statement) to gain a verified account. Almost three in four (72 per cent) believe that social media companies need to take a more interventionist role to wipe out the abuse on their platforms.
Of course, this approach would need to be coupled with enforcement, and I believe that can be achieved by introducing a duty of care on social media companies, along the lines suggested in the Government’s White Paper.
For too long, they have escaped liability for the harm they cause by citing legal loopholes, arguing they are platforms for content not producers or publishers. The legal environment that has facilitated social media companies’ growth is not fit for purpose – it must change to better reflect their previously unimaginable reach and influence. Any company that sells a good to a customer already has to abide by health and safety standards, and there is no reason to exempt social media companies. Any failure by those companies to undertake effective measures to limit the impact of toxic accounts should result in legal sanctions.
Alongside a duty of care, we need more effective laws to give individuals protection, particularly when it comes to the posting of images online without consent. Deepfakes, revenge pornography, and up-skirting are hideous inventions of the online world. I want new laws to make it a crime to post, or threaten to post, an intimate image without consent, and for victims to be offered the same anonymity as others subjected to a sexual offence, so that the law no longer has to play continuous catch-up as new forms of online abuse emerge.
Finally, the Government should make good on its promise to invest an independent organisation with the power and resources to regulate social media companies in the UK. All the signs suggest that Ofcom will be asked to undertake that role, and I can see no problem with that proposal as long as the regulator is given truly wide-ranging and independent powers, and personnel with the knowledge to tackle the social media giants.
In making these recommendations to Government, my intention is not to punish social media companies or to stifle online debate. Far from it. I want a more respectful, representative, and reasonable discourse online. So let’s work together over the coming 12 months to make this Bill genuinely world-leading in the protection it will create for social media users, the inclusivity it will foster, and the respect it will engender.