Well-meaning, but so badly done as to be dangerous.
That’s the most generous interpretation of the Online Safety Act 2023, and the 2024 amendments to the Investigatory Powers Act.
The online world is complex, and wide open to misuse. The Internet brings many upsides, but it exposes people to a propaganda regime so powerful that the Nazis could never even have dreamed of it, combined with a surveillance setup so sinister that, had they thought of it, the Stasi would have held a week of celebratory parties.
Internet use generates the data needed to keep very close tabs on users, catalogue their data, see what they access, what they say, who they are. Social media allows this very simply – indeed, the sale of that very data is what allows Facebook et al. to be free to use.
Staying safe online requires curbs – not on users, but on companies and governments – so that privacy is maintained to a reasonable degree. We use encryption – where communications are scrambled, preventing snooping – to keep data safe, and to keep conversations, emails and web browsing private.
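For the non-technical reader, here is a toy sketch of what "scrambled" means in practice. It uses a deliberately simplified XOR cipher – real systems use vetted algorithms such as AES, but the principle is the same: without the key, the bytes are indistinguishable from noise.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte; applying the same
    # operation twice with the same key restores the original.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))  # random key, as long as the message

ciphertext = xor_cipher(message, key)
assert ciphertext != message                    # scrambled on the wire
assert xor_cipher(ciphertext, key) == message   # only the key recovers it
```

Anyone intercepting `ciphertext` without `key` learns nothing about the message – that is the whole point, and it is why "snooping" and "strong encryption" cannot coexist.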
Almost always, this privacy is a good thing – law-abiding citizens going about their business, entitled to privacy as a basic human right, whilst violating no laws. Obviously, for a very small minority of users, this can result in illegal misuse of varying degrees.
Government is then tempted to rush in, and try to “keep people safe”. This sounds sensible – clearly, if content is harmful then we should do something about it – check communications, and check content. But, what’s harmful? Porn? Fake news? Terrorism? Suicide? Offensive content?
This is a minefield. Go back 20 years, and there was broad consensus – child porn bad, legal porn not ideal but fair enough, fake news – no such thing, terrorism bad, suicide bad, offensive content – just don’t read it.
Today, the right to cause offence is very much on shaky ground, and almost anything is being included. We have anti-hate speech laws, and we seem to have lost the ability to work out what’s a joke, what is venting, and what is serious.
That context is crucial. The Online Safety Act imposes a duty of care on content platforms to ensure that children see nothing illegal, nor anything which is "legal but harmful". OK, what's that?
God knows, say the content providers, but what we can do is scan absolutely everything on the whole platform and remove anything which (a) might be accessed by children (so pretty much all content, then) and (b) might possibly be harmful to them. And if we get it wrong, we'll be fined absolutely vast sums, so we'd better be pretty vicious in our definition of "legal but harmful".
Which means… we’ll censor pretty much anything that isn’t The Approved Woke Narrative. Couple this with the impending banter bouncers, and all of a sudden, we can’t speak freely in pubs or online because it *might* fall foul of the law. That’s a pretty big deal.
And the real kicker? If children want to access content we'd prefer they didn't see, all they have to do is pop over to the App Store (or Google Play store – don't get grumpy with me, Google!) and download a free VPN, at which point they can bypass all of this.
So, we’ve removed free speech, paid for with millions of lives in World War Two, and in exchange, we saved… nobody.
“But wait!” I hear you cry, “We can surely have discussions over private message? *That* isn’t going to be seen by children or cause offence?”
Ah yes, but you might do something else illegal, so that’s covered, too. The Online Safety Act contains provision for Ofcom to effectively outlaw end-to-end encryption, while the updated Investigatory Powers Act adds further curbs: it introduces Notification Notices, forcing vendors to notify government of any encryption-related changes, and a behind-closed-doors mechanism to force vendors to break encryption for UK law enforcement. Crucially, any appeal against these notices does not delay their enforcement.
The outcome? Apple has already withdrawn Advanced Data Protection – its end-to-end encrypted iCloud storage – from the UK, because technically, encryption is either present or it is not. There is no “the user, and the Police” access setting, because of the way the maths works.
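To make the “either present or not” point concrete, here is a minimal sketch, using only the Python standard library and a toy encrypt-then-MAC construction (real systems use vetted AEAD ciphers, but the all-or-nothing property is the same): decryption with the exact key returns the message; decryption with any other key is refused outright. The maths offers no third outcome to hand a third party.

```python
import hmac, hashlib, secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key (toy stream cipher).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, message: bytes) -> bytes:
    ct = bytes(m ^ k for m, k in zip(message, keystream(key, len(message))))
    tag = hmac.new(key, ct, hashlib.sha256).digest()  # integrity tag
    return tag + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    tag, ct = blob[:32], blob[32:]
    if not hmac.compare_digest(tag, hmac.new(key, ct, hashlib.sha256).digest()):
        raise ValueError("wrong key: decryption refused")  # all or nothing
    return bytes(c ^ k for c, k in zip(ct, keystream(key, len(ct))))

alice_key = secrets.token_bytes(32)
blob = encrypt(alice_key, b"private chat")
assert decrypt(alice_key, blob) == b"private chat"  # right key: full access
try:
    decrypt(secrets.token_bytes(32), blob)          # any other key: nothing
except ValueError:
    pass
```

A “user and the Police” setting would require a second key that also decrypts – and any such key works for whoever holds it, lawful or not. That is why vendors remove the feature rather than weaken it.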
“So what?” says almost everyone. This is a big deal – Apple, like most cloud vendors, backs up your data to the cloud. Very handy – if your phone is stolen, then the new one will magically have all the data, just as before. Your whole life is on your phone, or your computer, or your tablet, so the chances are, it’s now on the cloud, too.
End-to-end encryption is the thing that means it stays *your* data. Without it, that data can be read by anyone at the vendor, or by snooping governmental eyes, granted that power by the Investigatory Powers Act 2024.
“OK yes, but if you have nothing to hide, it doesn’t matter!” Sure, except that (a) data breaches are a certainty – take it from this cyber security professional, it’s a matter of when, not if, someone breaks in and steals data from every company, government and institution, and (b) that old chestnut – ‘legal but harmful’ – can be levied against your data by AI run by government, meaning that anything you say can and will be dug up by machine and used against you.
Perhaps not now, perhaps not in a few years, but we didn’t think leading academics would be censored by government during the Covid pandemic either, and yet here we sit.
These measures are authoritarian, dressed up as safety. The real travesty is that both the Online Safety Act, and the groundwork of the Investigatory Powers Act, were arranged by the previous Conservative government.
We need to admit that, as conservatives, we got this one wrong. We must stand against these laws and commit to their immediate repeal. It is absolutely right to seek to protect citizens from potential harms of the internet, but there are better ways. We need to go back to the drawing board on this, and this time, work with the technology, rather than trying to ignore the realities of the world.