Nat Wei: Will Sunak or Truss finally stop a Conservative Government tabling un-Conservative legislation?

27 Jul

Lord Wei is a Conservative member of the House of Lords. He is a co-founder of Teach First, a social entrepreneur, and a former government adviser.

With the hustings now underway, there will no doubt be lots of questions about how we combat the cost of living, lower tax (eventually), control immigration, and make the most of Brexit.

But another key question is this: how can we make sure that a Conservative government actually delivers conservative legislation? Or, at the very least, stops tabling laws drafted by civil servants (often in collaboration with our political opponents) which directly attack Conservative principles such as freedom of speech and thought, or the right to privacy.

One glaring example of this is how the Schools Bill, authored effectively with the public backing of Labour politicians such as Lord Soley, seeks to prevent parents from fulfilling their duty to educate their children as they see fit. It massively empowers local authorities to monitor how parents raise their children, imposing inspections that will bring state intrusion deep into homes and the lives of families.

This is bad enough on its own terms. But worse is what it will empower politicians to do next.

Should a future government seek to change the national curriculum to further teach woke ideas and oppose conservative ones, our own Schools Bill as currently drafted will remove the ability of parents to take their children out of school and educate them at home.

It does this by enabling officials to inspect what is being taught and then, for the slightest, subjective reason, force parents to put their children back into school using an attendance order. If they refuse to comply, they would face a hefty fine and then up to 52 weeks' imprisonment, without recourse to appeal.

(There are already unjustifiable cases of this happening today, even before this particular bill has passed.)

How did we get to a point where such a draconian, statist, and anti-freedom law even gets onto the desk of our Education Secretary unvetted – let alone makes it to Second Reading in the House of Lords?

Many distinguished previous Conservative education secretaries, such as Lord Baker, have highlighted how abhorrent this Bill is, with its centralising tendencies and lack of detail.

Sadly, it is not the only piece of such legislation in the pipeline. Others, such as the Online Harms Bill, are also proceeding through Parliament. In each case laudable aims are harnessed to bad laws with unintended consequences, many of which will lead to the unjust curtailment of ancient freedoms.

So if you have a question to ask at one of the hustings this summer, why not ask the candidates what they are going to do about the Schools Bill and others like it, which seek to curtail freedoms in ways that are not only un-Tory, but which could be weaponised against Conservatives and our beliefs one day very soon?

True Conservatives believe in our ancient rights to be free: free to spend money and to put it to use how we see fit, free to take control over our sovereignty, and free to operate within the rule of law which should protect our freedoms of speech, privacy, and belief.

Which of the candidates to be our next prime minister truly believes in freedom? And which one will fight to make sure our laws guarantee these freedoms rather than take them away?

Danny Stone: The Online Safety Bill is a necessary curb on inflammatory internet content

22 Jun

Danny Stone MBE is the Chief Executive of the Antisemitism Policy Trust.

Last week, Toby Young and I spoke to a group of Conservative MPs about the Online Safety Bill. You’d expect that the two of us would have extremely different views on the nature of the Bill, but I was surprised at the level of agreement we shared in different areas of it, which leads me to think that some of the concerns raised about freedom of speech in relation to the Bill are more about prioritising issues than underlying principles.

First, it’s important to have a shared understanding of what the Bill actually does. It does not require material to be deleted unless it is illegal. When it comes to legal but harmful content, only the largest companies are required to act. The Bill says these larger companies must risk assess for legal but harmful content and develop Terms and Conditions to address it.

However, ‘takedown’ – that is, the removal of content, which is mentioned only once in the Bill – is one of a number of measures that a company can deploy in dealing with legal harms. There is no duty to remove content or to censor legal speech, and no penalty associated with failure to take action, so long as companies are consistent in their approach.

If the Bill became an Act, as drafted, platforms would detail the risks associated with their business, choose their harmful content policy and enforcement mechanisms, and be required to deploy these consistently. We, as adults, could then make an informed, risk-based choice on the services we wish to use.

This is a systems-based approach, not focusing on the content but the delivery of it and systems responses to it. As for the harms themselves, they will no longer be determined by Silicon Valley, but instead in Westminster.

The Bill doesn’t require action on offensive speech, despite the fact that we go further in limiting expression in other areas. Ofcom is required in its regulation of our TV and radio content to set “generally accepted” standards to protect the public from “offensive & harmful material”; the Prevent programme involves assessment of legal speech when addressing extremism. Even in Parliament, MPs aren’t allowed to call one another liars in the chamber.

Parliamentarians have long, therefore, considered that some legal speech must be addressed. National polling shows the public support this approach. However, the Bill focuses in on the systems.

The systems behind what we see on user-to-user services arise from active choice. They aren’t neutral, with a default guarantee of freedom; they amplify or minimise harms, including antisemitism. Though social media is free to use, and a ‘leveller’ in some respects, platforms provide the opposite of the inclusivity, good faith and discourse ethics central to Habermas’s concept of the public square.

The owners monetise and monopolise our expression data. Platforms gamify discourse, using human psychology to enable traffic-hungry design, driving people to increasingly inflammatory content.

Rules are made and applied inconsistently, without transparency. In a real town square we don’t find 20,000 people shipped in by the council to attack the speaker whilst others do a deep dive on their background, in a bid to share their address, endanger their family and damage their employment prospects.

What we do find in a real town square are basic protections – no trip hazards, no over-amplified PA systems – and that is the kind of protection this Bill seeks to provide online. The public square has been subjected to disruption, and companies – and let us not forget, they aren’t opposing the Bill – are crying out for regulation, having had nearly two decades to get it right themselves. They have failed to deliver unshackled speech online; the terms of engagement are unequal.

That inequality is often not considered in debates about the Bill. When speaking to the MPs, I argued that we define free speech too narrowly. Online harm leads to departure and isolation from online spaces. The great focus is censorship when, in fact, so many are already suppressed or marginalised.

Research shows individuals avoid political discourse for fear of harassment. Moreover, the spaces we inhabit are already far from diverse. The Pew Research Center, for example, found the most active 25 per cent of Twitter users produce 97 per cent of all tweets.

In these potentially monotone spaces, the distribution of legal harms is sometimes encouraged. Let us not pretend this is simply offensive speech. Praise for terrorists, when it doesn’t directly encourage violence, is a legal harm. Making your profile name ‘gas the k**es’ is also legal but harmful, and so are denying the Holocaust and peddling pseudo-racial science.

This type of material can lead, and has led, to offline harm. The mass killer responsible for the murders in Buffalo, New York, talked about the way in which he had been radicalised by material online. Is it any wonder so many voices stay away from these horrible online spaces?

A systems-based approach, which the Bill aspires to, is more likely to preserve free speech and to put democratic rights at its centre. Legally harmful content should be viewed as a design problem. It’s not that it cannot be there or must be deleted – though I obviously have strong views about whether it should be – but rather that platforms should be incentivised to ensure their systems minimise its promotion or ask users to stop and think.

The Bill creates a regulated marketplace of harm. It doesn’t force legal content removal – the penalties for posting such content are determined by social media companies, not the state.

Both Toby Young and I had concerns about the Bill. We were worried that the Secretary of State (rather than parliament) is given too many powers to dictate what legal harms are, about the degree of detail being left to secondary legislation, and about platform inconsistencies in takedown (in relation to free speech and harmful materials respectively).

However, my contention was that the aspect of the Online Safety Bill requiring action on ‘legal but harmful’ materials addresses longstanding market failures. My focus is on the harm caused by platforms. His appeared to be ensuring free speech is not damaged through acting on such harms.

The systems focus should enable us to reach a happy medium, in my view – one not bogged down in discussion of content. The debate helped me to feel reassured that this is viable. The legal but harmful provisions simply recognise a reality which, if ignored, will require future legislation anyway, when the next round of system design flaws leads to offline harm.

Profile: Chris Philp, charged with the nightmarish task of seeing the Online Safety Bill through the Commons

15 Apr

You can’t make a silk purse out of a sow’s ear. This, however, is the task to which Chris Philp will from next Tuesday have to apply himself as he strives to see the Online Safety Bill through the Commons.

It is expected to be the most amended Bill in history, for everyone who has had anything to do with the legislation admits that it is in an unsatisfactory state, with terms like “a bloody nightmare” often used.

The Online Safety Bill sets out to regulate the internet. This means anyone who has ever been annoyed by something which happened to them online has views about what it should ban or at least ameliorate, which in turn means virtually every MP and peer.

John Whittingdale, a former Culture Secretary, told ConHome it is quite wrong that only one day, Tuesday, has been allowed for the Second Reading, and observed that it really needs two.

Whittingdale pointed out that on Tuesday there are likely to be statements on Ukraine, Downing Street parties and energy, which means those who want to speak on “this hugely important and hideously complicated Bill will get about 30 seconds each”.

At the heart of the legislation is an unresolved struggle between free speech – the right, under the law, to publish whatever one wishes on the internet – and the proposal to remove “legal but harmful” content.

As the Bill goes through its Committee stage, Philp will be charged with the task of deciding which amendments the Government intends to accept, and which it opposes.

This will require a grasp of the detail, which he is universally agreed to possess, just as he did in his previous ministerial post at the Home Office, which included the vexed question of cross-Channel migration.

It will also, however, require the ability under pressure to shape incompatible elements into a coherent whole which can command parliamentary and public confidence, and here one simply does not know how he will get on.

His officials find he masters his brief with almost miraculous speed, but is deficient in social skills, and is not the kind of person who says at the end of an arduous day, or to whom one might oneself feel able to say, “Shall we go for a drink?”

If Philp succeeds, he will be marked out as a rising star. If he fails, and antagonises parliamentarians as he fails, the role of scapegoat awaits him, even though the whole venture was set in motion four years ago by Theresa May, along with various other pious aspirations which are easier said than done, such as the Net Zero target and the ban on conversion therapy.

When Nadine Dorries, since 15th September 2021 Culture Secretary, and her sidekick Philp, appointed the next day Parliamentary Under-Secretary of State for Tech and the Digital Economy, appeared in November before the Joint Committee on the Draft Online Safety Bill, the following exchange took place:

The Chair, Damian Collins: “Thank you very much. You say that you have been looking at progressing the Bill since you were appointed as Secretary of State. By that, would it be fair to assume that, as far as you and the department are concerned, the Bill as published in draft form earlier this year is not the Government’s final word on the legislation?”

Nadine Dorries: “No, it is not the Government’s final word. It is not my final word. I have been pushing on a number of areas, which I hope to be able to highlight this morning. It is not the final word because of the work that you have been undertaking. I want to reassure you that we are awaiting your recommendations as soon as possible, and we will be looking at them very seriously indeed. At the risk of saying too much, I want to reassure you that they will be very carefully and very seriously looked at. I see this as very much a joint effort on behalf of all of us.”

So the Government is open, or claims it is open, to being pushed around: an additional incentive for both the Commons and the Lords to try to push Philp around.

Insiders say the legislation is already festooned like a Christmas tree: “Nadine keeps hanging more and more things on it.”

Dorries says this is “the most important piece of legislation to pass through Parliament” in her 17 years in the House, and “has to be watertight”:

“In my previous role as Minister for Mental Health and Suicide Prevention for two years, I made a point of meeting with the parents of children who had lost their lives, had taken their own lives. I cannot put into words how devastating it is to sit down with parents of children who have taken their own lives needlessly. It was not that they went online and looked for the means to do so, but because algorithms took them in that direction, whether it was to pro-anorexia sites, suicide chatrooms or self-harm sites.”

This is one of the harms which the giant tech companies will be required to take reasonable steps to prevent. So Philp has got to produce a Bill which will stand up not only to parliamentary scrutiny, but to the world’s top lawyers, employed by Facebook and Google.

One danger is that the big tech companies, which will be liable under the Act to fines of up to ten per cent of their global turnover, will err on the side of caution, and will censor anything which might conceivably cause harm. To some extent, this is already happening.

It is easy enough to agree that children should not be encouraged, by algorithms which guide them to the wrong sites, to commit suicide.

But what about adults who wish to discuss climate change, or the best way to treat a mysterious new virus which has just emerged in China? “Legal but harmful” could result in the censorship of various ideas which are regarded with horror in Silicon Valley, but which in Britain we wish to be free to discuss.

Are Mark Zuckerberg and Nick Clegg to be the arbiters of thought in Sheffield and Swansea?

Ofcom will be given the task of implementing the Act. It will draw up a code of practice, which the tech companies will have a duty either to follow, or to show they have matched. “The point of bite is at the duty level,” Philp told the joint committee.

“We must also remember that we have given Ofcom teeth, some may say fangs,” Dorries added.

Dorries and Philp stand shoulder to shoulder. When John Nicolson (SNP, Ochil and South Perthshire) tried to rough up Dorries, Philp asked: “Are these questions designed to scrutinise the Bill or personally to attack the Secretary of State?”

And Dorries soon afterwards said of Philp: “I know he is very keen to give you the technical answer. I am so glad he is here.”

But to the conundrums posed by the Bill, there will not be technical answers.

Nor will Philp be able, as has been his inclination in his career so far, simply to follow with ostentatious fidelity the party line.

There is, as yet, no party line. On the one side are MPs like David Davis and Steve Baker who are vigilant defenders of free speech.

On the other are figures like Dorries who voice the desire of parents everywhere, and especially in seats captured from Labour in 2019, to have their children protected from perverts and pornographers, and their grandmothers from online fraudsters.

And there are other powerful interests which Philp will be under pressure to accommodate. Many Remainer MPs are obsessed with disinformation, to which they attribute their defeat in the 2016 referendum. The Home Office is keen, for reasons of national security, to end encrypted messaging.

British newspapers want to take revenge on the Californian tech giants which have stolen their advertising revenues.

In an attempt to conciliate the newspaper industry, the Bill includes special protections for journalism, a term which is hard to define in the age of the citizen journalist.

Nor is the Daily Mail easy to conciliate on a long-term basis. Last month Philp wrote a piece for it in which he said:

Social media sites currently operate under no one’s rules but their own.

This has led to an online world where teenagers’ lives can be ruined by cyberbullying, suicide is encouraged, vulnerable people are radicalised by terrorists, kids are exposed to pornography and racist bile is shared without consequence.

What’s worse – a lot of this vile stuff is actively promoted to huge audiences via algorithms simply because it makes social media firms more money.

The case for regulation couldn’t be clearer: We have a moral duty to make big tech take action and clean up the internet once and for all. As a father, nothing could be more important to me…

Trusted news sites such as MailOnline will be exempt from the Bill’s provisions, including its reader comment sections which inspire such lively debate.

Ofcom will hold tech giants to account with tough powers to issue multi-billion-pound fines and block them in the UK.

I cannot be alone (the style is infectious) in finding something repugnant in a Government minister, or even a regulator devised and perhaps leant upon by the minister, deciding which news sites are “trusted”.

Where do questions of good taste and manners end, and the “harms” which the Bill is supposed to avert begin? That is an impossible border to draw, especially as it is fluid.

Boris Johnson became Prime Minister in part because of his genius for saying and writing things which were in poor taste, and for which the prigs wished to condemn him, but which most fair-minded people could see ought in a free society to be allowed.

How is Philp to make sense of that kind of provocation, and that kind of toleration? It is a matter more of instinct than of legal definition. The Bill is in danger of setting out to define the indefinable.

When the Daily Mail is angry with Philp, as assuredly it will be one day, it will turn him over. He will have arrived as a politician when that newspaper denounces him on its front page as an enemy of freedom.

Philp, born in 1976, was educated at St Olave’s Grammar School, in Orpington in Kent. He read physics at University College, Oxford, became a successful businessman, in 2006 took a council seat off the supposedly impregnable Camden Labour Party, but at the 2010 general election fell 42 votes short of defeating Glenda Jackson, the Labour incumbent, in Hampstead and Kilburn.

He had worked immensely hard to win the seat, but took defeat with good grace, and in 2015 was returned for Croydon South, after which he said in his maiden speech:

“People in Croydon South believe that hard work and enterprise are the best ways of combating poverty and promoting prosperity. Businesses such as the Wing Yip Chinese supermarket on Purley Way and BSW Heating in Kenley are the lifeblood not just of our economy but of our society. I share those values. Over the past 15 years, I have set up and run my own businesses in this country and overseas. I set up my first business when I was 24. I started by driving the delivery van myself, and eventually floated that company on the stock market. My grandfather also drove a delivery van and he grew up in Peckham. I think he would be very proud, if he were still with us, to see his grandson speaking on the Floor of the House today.”

All good stuff, but one detects a kind of compelled agreement which will not be available as he sets out to pilot the Online Safety Bill through the Commons.

Dr Sarah Ingham: We must do far more to protect children on, and from, the Internet

1 Apr

Sarah Ingham is author of The Military Covenant: its impact on civil-military relations in Britain.

Mass murder in Myanmar; intensive care beds full of unvaccinated Covid-19 patients; insurrection at the US Capitol; teenagers sent down rabbit holes of content promoting self-harm, eating disorders and suicide …

Given the human cost of the internet, as listed by the Joint Committee on the Draft Online Safety Bill, it must be wondered whether the techies in Silicon Valley realised they were creating a digital Pandora’s Box, unleashing horrors upon us. So much for “Do No Evil”.

Trying to bring some accountability and regulation to the online world and tech giants, a fortnight ago the Government unveiled the latest draft of the bill, some five years after it was originally proposed.

As with most government measures these days, it comes with a side-order of Carlsberg-type hyperbole: instead of “probably the best lager in the world”, the bill will ensure “the UK is the safest place in the world to be online”.

Much of the impetus for the bill is the safety of children. Evidence given to the Joint Committee is indeed troubling, particularly in connection with the ease with which children can access extreme pornography. Age Verification, a Conservative manifesto commitment back in 2015, is long overdue, even if, like parental locks, SafeSearch and privacy settings, it will probably be easily cracked by determined youngsters.

Almost absent from any debate over the harm to children caused by social media access is the role of parents and schools. While initiatives from the Child Exploitation and Online Protection subsidiary of the National Crime Agency are welcome, specific online safety education for children aged 4 to 7 raises some questions – mostly about the carelessness of their carers.

Mumsnet, perhaps the only organisation guaranteed to strike fear into the hearts of all politicians, offers parents practical advice about online safety. Concerns over the impact of the online world on children have been likened to earlier generations’ worries about books, the theatre, TV and computer games.

But how helpful is it to compare children’s terrestrial television, where content and ads have always been highly regulated, with, for example, the algorithm-heavy YouTube Kids? Last summer, Google announced it would restrict the targeting of advertisements to minors. Better late than never.

Over the past quarter of a century, our lives have been transformed by the tech revolution. With being digitally off-grid almost unthinkable, it must be increasingly difficult for parents to resist a pastel-coloured tablet for their toddlers, especially if, along with a shatterproof casing and Peppa Pig, it promises educational apps.

Ofcom’s Children and Parents: Media Use and Attitudes Report 2020/21 says that most children aged 3-4 had been online (82 per cent), with nine in ten of them using video sharing platforms such as TikTok. Almost half of these pre-schoolers had their own tablet, but about one third used a laptop or mobile phone to access the online world.

Despite 99 per cent of parents stating they had some form of supervision in place when their child was online, 41 per cent did not directly supervise by, for example, sitting beside them.

With three in ten parents saying it is hard to control their pre-schooler’s screen time, it is unsurprising that in the battle over tech, those with older children – especially gaming-addicted teens – wave the white flag. Although the majority are aware of the various parental controls, only around a third use them, according to Ofcom: perhaps they are the same third who allow their children aged 5-12 to use social media, despite most tech companies’ minimum age requirement of 13.

The Department for Digital, Culture, Media and Sport has reported that “80 per cent of 6- to 12-year-olds have experienced some kind of harmful content online”, while last September the NSPCC suggested that the online sexual abuse of children had surged by 78 per cent in four years. This of course coincided with lockdown, which increased our dependency on tech.

Anyone born after 1996 is a “digital native”, growing up with the internet. The advent of the iPhone (2007) and iPad (2010) revolutionised access to the digital universe, that dizzying kaleidoscope of search engines, messaging services and social media, of family Zoom sessions, Uber-booking, ISIS executions, puppy videos and hardcore porn. Exponential access since the noughties has coincided with a rise in suicide, self-harm and mental health issues among teenagers, especially girls, who are prey to what has been dubbed ‘Snapchat dysphoria’.

“Self-regulation of online services has failed,” concluded the Joint Committee in December, highlighting the tech giants’ business model based on data harvesting and microtargeted advertising. But perhaps all of us adult users of Meta, Google, Twitter and the other digital platforms have colluded in that failure?

“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving. Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age,” said Nadine Dorries, the Culture Secretary, introducing the bill.

Not only are children’s seat belts buckled but they must use child seats in the rear of a vehicle until they are 12 or 135cm tall. “Keep out of reach of children” is found on most household cleaning products: why not on devices offering online access?

The Netflix documentary The Social Dilemma ends with some of Silicon Valley’s leaders making clear how fanatical they are about keeping their children off-line or, at minimum, severely limiting their screen-time. As drug-dealers say, never get high on your own supply.

Last month Katharine Birbalsingh, Social Mobility Commissioner and headteacher, told the Irish Times that parents should get their children off phones. “They should not be having a smart phone until they are 16 … and do not give them unsupervised access to the internet.” In 2018, France banned mobile phones, tablets and smart watches for students under 15 in schools.

The Online Safety Bill is a welcome start, but given the huge range of issues it covers, from online fraud to hate speech, via fake news, cyber flashing, ‘Zach’s law’, disinformation and trolling, is it too unwieldy? Doesn’t Ofcom have enough to do without being handed oversight of all this as well?

Many are uneasy about the impact the proposed bill will have on freedom of expression – a subject which surely deserves entirely separate consideration and legislation. As Elon Musk tweeted last week, “Free speech is essential to a functioning democracy.”

In the context of online harm we need the Tesla chief to get tweeting – about parents and carers outsourcing their responsibilities for their children’s safety to the state.

Bryn Harris: Free Speech is an afterthought for the Online Safety Bill

22 Mar

Dr Bryn Harris is the Chief Legal Counsel of the Free Speech Union.

The Online Safety Bill has been laid before Parliament. Ministers, including Nadine Dorries last week, have worked hard to persuade voters that the Bill contains important safeguards for free speech online. Are they right?

Even those being generous would resoundingly answer ‘no’. The Bill is informed by a desire to protect freedom of speech, but largely does the opposite.

We should give the government its due. The Bill imposes free speech obligations on online providers where previously there were none. The big social media platforms will no longer have wholly free hands. They will be under free-standing obligations to implement processes that protect political speech (or ‘content of democratic importance’) and journalistic content. If they do not, users can complain and Ofcom can take action. This is a considerable improvement.

This Bill, however, fundamentally concerns the prevention of ‘harm’, not the protection of free speech (hence the name). When the ‘safety’ duties are engaged alongside the free speech duties, the balancing exercise will skew decisively towards harm prevention – concrete action must be taken in relation to harmful or illegal content, but social media companies are only asked to ‘have regard’ to free speech, which is the weakest of the legal duties.

The Bill thus enshrines in statute the illiberal approach all too familiar to the Free Speech Union, with free speech treated as an afterthought. The liberal philosophy of the English common law, with its starting point of a presumption of liberty unless a specific rule says otherwise, is reversed. Online platforms will start by asking whether a user has harmed someone. Only much later will they ‘have regard’ to that user’s freedom of speech.

The Bill has also become worse during its journey from a White Paper three years ago. Whereas the previous draft required platforms to ‘minimise’ illegal content, they will now have to ‘prevent’ users from encountering illegal content, where necessary by removing it.

This tougher duty will likely result in over-removal by providers, because risk-savvy providers, fearful of potentially huge fines (ten per cent of a company’s annual global turnover), will be cautious. In cases where a free speech duty and a safety duty are competing, removing content that might be harmful will be the safer option – the free speech duty is weak and easily complied with (even where content is removed), whereas complying with the safety duties requires action. The box-ticking requirement to ‘have regard’ thus imposes no effective deterrent against over-removal.

The duty regarding ‘content that is harmful to adults’ has also worsened. Providers will have four options in dealing with such content: removal, restricting access, preventing promotion, or actively promoting it. The liberal option – leave it be and let adults make their own choices – isn’t available. The only option that isn’t censorious – ‘recommend or promote content that you believe to be harmful’ – is so undesirable that no platform will choose it.

Nevertheless, a new clause on ‘user empowerment duties’ is welcome. It allows adults to choose whether or not they wish to be exposed to harmful content on sites like Twitter. But the choice is illusory and the reality is paternalistic – an adult won’t be free to see everything unadulterated, including the ‘harmful’ stuff, because platforms are virtually certain to remove, restrict or downgrade harmful content. Users will be free to choose, so long as they choose not to be ‘harmed’.

However, users will have a right to sue for breach of contract if providers remove or restrict content contrary to their terms of service. This should allow users to resist providers that fail to ‘take into account’ the protections for political speech and journalistic content. It remains to be seen if these duties will genuinely restrain the instinct to over-remove content.

Also welcome are new restraints on the Secretary of State’s power to dictate what kinds of content providers must police. The categories of ‘priority’ illegal content are now stated baldly by the Bill, and are what one would expect. When it comes to content that is harmful to adults, the Secretary of State will have the power to lay a statutory instrument specifying what lawful speech social media companies will be forced to remove. It remains to be seen how censorious Nadine Dorries will be, but even if she is relatively restrained, this Bill is a hostage to fortune. It empowers a future Secretary of State at DCMS to come up with their own Index Librorum Prohibitorum.

All analysis of the Bill is speculation: we’ll only know its impact once it becomes law, and providers and Ofcom begin to implement it. What is unusual is that ministers seem to be aware of the pressures that are likely to turn the Bill into a censor’s charter.

The Culture Secretary accepts that a culture of censorship already exists among the platforms which she proposes essentially to entrust with deciding what to remove. Ministers seem to be aware that a repeat of the Trump Twitter ban would be disastrous. They must also know that the huge fines and even criminal sanctions that could be imposed under the Bill are virtually certain to drive excessive risk-aversion. So why is the Government introducing a Bill so likely to thwart freedom of speech?

I suspect the answer lies in an unwillingness to address a very difficult but fundamental conceptual problem – a government cannot protect free expression while also trying to prohibit harmful speech. To govern is to choose: ministers and lawmakers must show leadership and tackle the question of whether we should prioritise liberty or paternalism, or we will continue to muddle through a mess of contradictions.

Free people do not live their lives under a rulebook’s control, still less one which vexatious political activists will be able to weaponise. This Conservative Government should be true to its convictions and use this Bill to force the social media companies to do more to protect free speech.