New Clause 9 - Offence of failing to comply with a relevant duty

Online Safety Bill – in a Public Bill Committee at 12:45 pm on 15th December 2022.


“(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) Where the provider is an entity and the offence is proved to have been committed with the consent or connivance of—

(a) a senior manager or director of the entity, or

(b) a person purporting to act in such a capacity,

the senior manager, director or person (as well as the entity) is guilty of the offence and liable to be proceeded against and punished accordingly.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section—

a ‘director’, in relation to a body corporate whose affairs are managed by its members, means a member of the body corporate;

‘relevant duty’ means a duty provided for by section 11, 14, 18, 19, 21 or 30 of this Act; and

‘senior manager’ has the meaning given in section 89(4) of this Act.”—(Nick Fletcher.)

Brought up, and read the First time.

Nicholas Fletcher (Conservative, Don Valley)

I beg to move, That the clause be read a Second time.

It is a pleasure to serve under your chairmanship, Dame Angela. If you will allow, I want to apologise for comments made on the promotion of suicide and self-harm to adults. I believed that to be illegal, but apparently it is not. I am a free speech champion, but I do not agree with the promotion of this sort of information. I hope that the three shields will do much to stop those topics being shared.

I turn to new clause 9. I have done much while in this position to try to protect children, and that is why I followed the Bill as much as I could all the way through. Harmful content online is having tragic consequences for children. Cases such as that of Molly Russell demonstrate the incredible power of harmful material and dangerous algorithms. We know that the proliferation of online pornography is rewiring children’s brains and leading to horrendous consequences, such as child-on-child sexual abuse. This issue is of immense importance for the safety and protection of children, and for the future of our whole society.

Under the Bill, senior managers will not be personally liable for breaching the safety duties, and instead are liable only where they fail to comply with information requests or willingly seek to mislead the regulator. The Government must hardwire the safety duties to deliver a culture of compliance in regulated firms. The Bill must be strengthened to actively promote cultural change in companies and embed compliance with online safety regulations at board level.

We need a robust corporate and senior management liability scheme that imposes personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on the directors and senior managers of financial institutions, and those responsible individuals face regulatory enforcement if they act in breach of such duties.

The Joint Committee on the draft Online Safety Bill, which conducted pre-legislative scrutiny, recommended that a senior manager at or reporting to board level

“should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.”

Some 82% of UK adults would support the appointment of a senior manager to be held liable for children’s safety on social media sites, and I believe that the measure is also backed by the NSPCC.

There is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. The Government have repeatedly argued against the designation of a specific individual as a safety controller for some understandable reasons: an offence could be committed by the company without the knowledge of the named individual, and the arrangement would allow many senior managers and directors to face no consequences. However, new clause 9 would take a different approach by deeming any senior employee or manager at the company to be a director for the purposes of the Bill.

The concept of consent or connivance is already used in other Acts of Parliament, such as the Theft Act 1968 and the Health and Safety at Work etc. Act 1974. In other words, if a tech platform is found to be in breach of the Online Safety Bill—once it has become an Act—with regard to its duties to children, and it can be proven that this breach occurred with the knowledge or consent of a senior person, that person could be held criminally liable for the breach.

I have been a director in the construction industry for many years. There is a phrase in the industry that the company can pay the fine, but it cannot do the time. I genuinely believe that holding directors criminally liable will ensure that the Bill, which is good legislation, will really be taken seriously. I hope the Minister will agree to meet me to discuss this further.

Damian Collins (Chair, Draft Online Safety Bill Joint Committee) 1:15 pm, 15th December 2022

I want to speak briefly on this amendment, particularly as my hon. Friend the Member for Don Valley referenced the report by the Joint Committee, which I chaired. As he said, the Joint Committee considered the question of systematic abuse. A similar provision exists in the data protection legislation, whereby any company that is consistently in breach could be considered to have failed in its duties under the legislation and there could be criminal liability. The Joint Committee considered whether that should also apply to the Online Safety Bill.

As the Bill has gone through its processes, the Government have brought forward the commencement of criminal liability for information offences, whereby if a company refuses to respond to requests for information or data from the regulator, that would be a breach of their duties; it would invoke criminal liability for a named individual. However, I think the question of a failure to meet the safety duty set out in the Bill really needs to be framed along the lines of being a systematic and persistent breach, as the Joint Committee recommended. If, for example, a company was prepared to ignore requests from Ofcom, use lawyers to evade liability for as long as possible and consistently pay fines for serious breaches without ever taking responsibility for them, what would we do then? Would there be some liability at that point?

The amendment drafted by my hon. Friend Sir William Cash is based on other existing legislation, and on there being knowledge—with “consent or connivance”. We can see how that would apply in cases such as the diesel emissions concerns raised at Volkswagen, where there was criminal liability, or maybe the LIBOR bank rate rigging and the serious failures there. In those cases, what was discovered was senior management’s knowledge and connivance; they were part of a process that they knew was illegal.

With the amendment as drafted, the question we would have is: could it apply to any failure? Where management could say, “We have created a system to address this, but it has not worked on this occasion”, would that trigger it? Or is it something broader and more systematic? These failures will be more about the failure to design a regime that takes into account the required stated duties, rather than a particular individual act, such as the rigging of the LIBOR rates or the giving of false public information on diesel emissions, which could only be done at a corporate level.

When I chaired the Joint Committee, we raised the question, “What about systematic failure, as we have that as an offence in data protection legislation?” I still think that would be an interesting question to consider when the Bill goes to another place. However, I have concerns that the current drafting would not fit quite as well in the online safety regime as it does in other industries. It would really need to reflect consistent, persistent failures on the part of a company that go beyond the criminal liabilities that already exist in the Bill around information offences.

Angela Eagle (Labour, Wallasey)

Just to be clear, it is new clause 9 that we are reading a Second time, not an amendment.

Caroline Ansell (Conservative, Eastbourne)

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, up to 10% of qualifying worldwide revenue or £18 million, whichever is greater. This will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock around board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Even today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that very content pushed to them by the algorithm within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that nothing short of liability will give this new responsibility real force. If there is board-level liability, that will drive priorities and resources, which will bring about the kind of change we are looking for. I look forward to what the Minister might share today, as this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.

Rachel Maclean (The Minister of State, Home Department)

I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.

Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.

My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.

I want to mention Apple and the update that it had been planning for quite a few years: an update that would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause the update to improve it, Apple now says that it has stopped it altogether and that it is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK law enforcement agencies, have expressed serious concerns, because it makes investigations and prosecution more challenging. Not all of us are technical experts, and I do not believe that we are in a position to judge how legitimate it is for Apple to have this pause. What we do know is that while there is this pause, the risks for children are still there, proliferating online.

We understand completely that countering this material involves a complicated balance and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will merely failing to comply with an information request, which is what is in the Bill now, be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he reassure us as to how he thinks that the Bill can protect so many children—it is far too many children—from this material online?

Alex Davies-Jones (Shadow Minister, Digital, Culture, Media and Sport)

Labour supports new clause 9, as liability is an issue that we have repeatedly raised throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liability for directors who failed to comply with their duties. This would be an appropriate first step in ensuring a direct relationship between the senior management of platforms and companies, and their responsibilities to protect children from significant harm. As we have heard, this measure would drive a more effective culture of awareness and accountability in relation to online safety at the top of, and within, the entire regulated firm. It would go some way towards ensuring that online safety was at the heart of internal governance structures. The Bill must go further to actively promote cultural change and put online safety at the forefront of business models; it must ensure that senior managers understand that keeping people safe comes before any profit. A robust corporate and senior management liability scheme is needed—one that imposes personal liability on directors when they put children at risk.

The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties. It would establish a framework for personal criminal offences punishable through fines or imprisonment. It would mean that providers committed a criminal offence if they did not comply with certain duties.

We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrence, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.

This new clause tries to criminalise non-compliance with the Bill’s duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, potentially removing large volumes of innocuous content and so stifling open debate.

Kirsty Blackman (Shadow SNP Spokesperson, Cabinet Office)

Does the Minister not think that the freedom of speech provisions and the requirement to stick to terms of service, which he has put in as safeguards against that, are strong enough, then?

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I come back to this point: I think that if people were threatened with personal legal liability, that would stifle innovation and make them over-cautious in their approach. That would disturb the balance that we have tried to achieve in this iteration of the Bill. Trying to keep internet users, particularly children, safe has to be achieved alongside free speech and not at its expense.

Further, the threat of criminal prosecution for failing to comply with numerous duties also runs a real risk of damaging the attractiveness of the UK as a place to start up and grow a digital business. I want internet users in the future to be able to access all the benefits of the internet safely, but we cannot achieve that if businesses avoid the UK because our enforcement regime is so far out of kilter with international comparators. Instead, the most effective way to ensure that services act to protect people online is through the existing framework and the civil enforcement options that are already provided for in the Bill, overseen by an expert regulator.

Going forward, companies will need to regularly assess the risks that their services pose to users, including ahead of any major design or functionality changes, and put in place proportionate systems and processes to mitigate those risks. It is only when companies thoroughly understand the risks arising from their services that they will be able to take proportionate action to keep users safe. This approach will fundamentally change the way tech services operate. It will mandate that services and tech executives properly consider risks and user safety from the get-go, rather than as an afterthought once a product is already open to users.

If platforms fail to comply with their enforceable requirements, Ofcom will be able to use its range of strong enforcement powers, including fines. By court order, it will be able to take business disruption measures and block sites from operating in the UK. Make no mistake about the substance of those fines: up to 10% of a company’s global turnover. No matter how big the company is, 10% is 10%; it is still a massive proportion of operating costs that will be removed. Our approach will ensure that providers are held to account and that swift action is taken to keep users safe, whether by bringing the platform into compliance or through stronger measures.

Senior tech executives can already be held criminally liable under the Bill for failing to take reasonable steps to ensure their company properly complies with Ofcom’s information requests. That includes failing to ensure that their company responds fully, accurately and on time; failing to ensure that their company does not provide false information; failing to ensure that their company does not provide encrypted information that Ofcom cannot understand; and failing to ensure that their company does not destroy or alter information required by Ofcom.

If we start to widen the scope of senior management liability in the Bill, we start to come up against problems quickly. For a criminal offence, a precise statement of the prohibited behaviour must clearly be set out—in other words, that a particular act or omission constitutes the criminal offence. In this case, a failure to comply with the relevant duties listed in the new clause would depend on a huge number of factors. That is because the Bill applies to providers of various sizes and types. In most areas, the framework is flexible, rather than prescriptive: it does not prescribe certain steps that providers must take. That means that it may be difficult for individuals to foresee exactly what type of conduct constitutes an offence, and that can easily lead to unintended consequences and to tech executives taking an over-zealous approach to content take-down for fear of imprisonment.

My hon. Friend the Member for Folkestone and Hythe talked about health and safety, LIBOR and diesel emissions, which have been raised here and in the main Chamber. There is a big difference between what we are talking about and those examples. On health and safety, LIBOR and the cover-up of diesel emissions, there is a far closer connection to personal conduct; the Bill contains broader measures.

My hon. Friend the Member for Eastbourne talked about having an industry-specific way of delivering the responsibility and liability. This is the industry-specific way. We are making sure that the approach is proportionate and that executives have to co-operate with Ofcom at every stage. It is pre-emptive as well as reactive. It ensures that, when Ofcom assesses their risk assessments, their approaches to algorithms and so on, it has all the facilities it needs to check that what they are doing is the right approach. If there are complaints and systemic failings within the platform’s regime, they need to comply; they must not cover them up or hinder Ofcom’s investigation.

On the TikTok algorithm, the development of an algorithm is quite remote from personal conduct. It is not easy to make an individual criminally liable for it, not least because algorithms tend to be developed by hundreds if not thousands of people in different continents. To boil that down to one person is incredibly difficult.

We also heard about the example of Apple. There is no way that, through this legislation, we are banning, or creating back doors in, end-to-end encryption; there is no safe back door, frankly, so if we did that, we could kiss goodbye to open banking and any number of things that we use daily. I may be wrong, but my understanding of the Apple product that was mentioned is that it would involve scanning pretty well everything that a person had in their iCloud, so it would be a sledgehammer to crack a nut, although clearly a really important nut. If Apple will not bring that forward, we would expect it and other platforms to bring forward something else that is effective specifically against terrorism content and child sexual exploitation and abuse.

For the reasons that I have given, I strongly believe that the Bill’s approach to enforcement will be effective. It will protect users without introducing incentives for managers to remove swathes of content out of fear of prosecution. I want to make sure that the legislation gets on the books and is proportionate, and that we do not start gold-plating it with these sorts of measures now, because we risk disrupting the balance that I think we have achieved in the Bill as amended.

Nicholas Fletcher (Conservative, Don Valley) 1:30 pm, 15th December 2022

I appreciate the Minister’s comments, but from what my hon. Friends the Members for Folkestone and Hythe, for Eastbourne, and for Redditch said this morning about TikTok—these sorts of images get to children within two and a half minutes—it seems that there is a cultural issue, which the hon. Member for Pontypridd mentioned. Including new clause 9 in the Bill would really ram home the message that we are taking this seriously, that the culture needs to change, and that we need to do all that we can. I hope that the Minister will speak to his colleagues in the Ministry of Justice to see what, if anything, can be done.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I forgot to respond to my hon. Friend’s question about whether I would meet him. I will happily meet him.

Nicholas Fletcher (Conservative, Don Valley)

I appreciate that. We will come back to this issue on Report, but I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

Question proposed, That the Chair do report the Bill, as amended, to the House.

Angela Eagle (Labour, Wallasey)

It is usual at this juncture for there to be a few thanks and niceties, if people wish to give them.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I apologise, Dame Angela; I did not realise that I had that formal role, but you are absolutely right.

Angela Eagle (Labour, Wallasey)

If the Minister does not want niceties, that is up to him.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

Dame Angela, you know that I love niceties. It is Christmas—the festive season! It is a little bit warmer today because we changed room, but we remember the coldness; it reminds us that it is Christmas.

I thank you, Dame Angela, and thank all the Clerks in the House for bringing this unusual recommittal to us all, and schooling us in the recommittal process. I thank Members from all parts of the House for the constructive way in which the Bill has been debated over the two days of recommittal. I also thank the Doorkeepers and my team, many of whom are on the Benches here or in the Public Gallery. They are watching and WhatsApping—ironically, using end-to-end encryption.

Angela Eagle (Labour, Wallasey)

I was just about to say that encryption would be involved.

Alex Davies-Jones (Shadow Minister, Digital, Culture, Media and Sport)

I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.

Kirsty Blackman (Shadow SNP Spokesperson, Cabinet Office)

I thank you, Dame Angela, as well as Sir Roger for chairing our debates. Recommittal has been a very odd and unusual process; it has been a bit like Groundhog Day, discussing things we have discussed previously. I very much appreciate the hard work of departmental and Ofcom staff that went into making this happen, as well as the work of the Clerks, the Doorkeepers, and the team who ensured that we have a room that is not freezing—that has been really helpful.

I thank colleagues from across the House, particularly the Labour Front-Bench spokespeople, who have been incredibly helpful in supporting our amendments. This has been a pretty good-tempered Committee and we have all got on fairly well, even though we have disagreed on a significant number of issues. I am sure we will have those arguments again on Report.

Angela Eagle (Labour, Wallasey)

There being no more obvious niceties, I add my thanks to everybody. I wish everybody season’s greetings and a happy Christmas.

Question put and agreed to.

Bill, as amended, accordingly to be reported.

Committee rose.

Written evidence reported to the House

OSB113 HOPE not hate

OSB114 Samaritans

OSB115 Jeffrey Howard, Associate Professor of Political Theory and Director of the Online Speech Project, School of Public Policy, University College London

OSB116 Open Rights Group

OSB117 Meta