My Lords, I am pleased to introduce this debate on the report of the Communications and Digital Committee, Free for All? Freedom of Expression in the Digital Age. I am very grateful to our outstanding committee staff. Our clerk was Alasdair Love and our policy analyst was Theo Demolder. Rita Cohen once again provided them and us with invaluable support and Dr Ella McPherson provided expert advice throughout the inquiry.
I am grateful too to noble Lords on the committee, many of whom are speaking today and all of whom brought great experience and expertise to this report. The committee is thriving under the fine leadership of my noble friend Lady Stowell of Beeston; I am very much looking forward to her contribution today.
This report was published under my chairmanship in July last year, since when there have been many significant developments and changes in digital regulation and more widely. I was privileged to sit on the Joint Scrutiny Committee for the Online Safety Bill which reported at the end of last year.
Having heard the debate on the demonstrations in Iran, we have to reflect that free speech is still something to be cherished and something that brave people are dying for today. Freedom of expression is about not being prevented from speaking one’s own mind. It is the bedrock of free societies. Although it is subject to important legal limits, including against the incitement of violence and defamation, we must remember what Lord Justice Warby referred to in one judgment as
“the well-established proposition that free speech encompasses the right to offend, and indeed to abuse another.”
It was evidence taken during our previous inquiry on the future of journalism that led us to turn to freedom of expression. We had heard about how the market power of Google and Facebook was threatening media freedom. I am very pleased that the committee is continuing to champion the media and pursue our recommendation for an Australian-style mandatory bargaining code to ensure that publishers receive fair compensation for the use of their content.
It was clear from the outset of this inquiry that there are two major problems online. The first is the dissemination by platforms of the worst kind of content: that which is either illegal or harmful to children. The other problem is the opposite: platforms removing legitimate content or treating some political viewpoints more favourably than others. Among many examples we heard about were Twitter banning Donald Trump while still allowing Ayatollah Khamenei to incite violence against Israel and praise jihadi groups; and Facebook choosing to treat a New York Post story as misinformation, with no evidence, at the same time as taking no action against Chinese state media when they spread lies about the genocide in Xinjiang.
At the core of these twin problems of aggressive promotion of harmful content on the one hand and overremoval of posts on the other is the dominance of the big platforms. Their monopoly of power means that they do not have to respond to users’ concerns about safety or free speech. These companies have monopolised the digital public square, shutting out new entrants that might be able to provide better services.
Tough competition regulation would unleash the power of the market to raise standards. It is a central part of the approach that we recommend in our report and we concluded that it was urgent. The delay in bringing forward legislation on the Digital Markets Unit is disappointing. I hope the Minister will agree that swiftly fixing broken markets to increase competition is the right and indeed Conservative thing to do.
There are two other pillars to the holistic approach we recommend which have not received enough attention. One is digital citizenship initiatives. Schools and public information campaigns both have a role to play in improving online behaviour. One person’s abuse of their right to freedom of expression can have a chilling effect on others, leaving them less able to express themselves freely. There is now much evidence that it is women and girls who are most often silenced by others online. However, regulation is not the only answer here. Alongside really joined-up, consistent citizenship initiatives, an improvement in our public discourse would be a good start. Lord Williams of Oystermouth told us that “abrasive and confrontational styles” of discussion
“do not come from nowhere.”
Indeed. Politicians and other public figures should be setting a better example, showing that we can disagree while respecting those we are arguing with and not condemning as extremists those who have different viewpoints from our own.
The other pillar is regulation of the design of the biggest platforms. Freedom of expression is the right to speak out, but there is no corresponding obligation on others to listen. We called for users to be empowered with tools to filter the type of content they are shown. Everyone has their own individual sensitivities and preferences and only they, if they are an adult, can really decide what they want to see. I am glad that the Government have gone some way in implementing this with new clauses in the Online Safety Bill, which I will come to in a moment.
It is not the existence of individual pieces of content which in some circumstances and to some people can be harmful that is the problem, but the way in which algorithms serve up that content in unrelenting barrages. The devastating impact of these business models was laid bare in the astonishing evidence at the inquest into the death of Molly Russell, which we would never have seen were it not for the persistence and courage of her father, Ian Russell. The horrendous material that was targeted, promoted and recommended to Molly changed her perception of herself and her options. Seeing the systemic nature of her abuse in the coroner’s court will help us to take action to save lives, and I hope that Ian and his family find some comfort in that.
Design regulation means ensuring that the largest platforms’ content-creation algorithms, choice architecture and reward mechanisms are not set up to encourage users’ worst instincts and to spread the most unpleasant content the most quickly. Such measures would get to the heart of those business models, which centre on keeping users logged in and viewing adverts for as long as possible—even if that means stoking outrage.
We should be taking different approaches to protect children and adults. For adults, we want a space where they do not encounter manifestly illegal material but can control their own online environment. That means insisting that platforms put power in users’ hands, as I have described: an approach that allows adults, in effect, to create their own algorithms through mechanisms such as interoperability.
When it comes to children, we want to protect them from content that is not appropriate for their age, but surely we want more than that. We should be aspiring to an online environment that is positive and enriching, and which helps them to grow and learn safely: a space where their privacy is respected and where every stage of the design process puts these objectives ahead of the financial interests of the platform.
It is obvious, then, that platforms and other online services need to know the age of their users. The way in which they do this and the degree of certainty they would need will depend on the risk of children using the service and the risk of children encountering harmful material or design features if they do. That is why, while I will passionately champion free speech when we come to the Online Safety Bill, I will also support the call led by the noble Baroness, Lady Kidron, for a set of standards for age-assurance technology and approaches that preserve privacy. Well-designed and proportionately regulated age assurance is the friend, not the enemy, of free speech.
I have outlined the approach favoured by the committee in its report, and now I turn to the Bill’s approach. We have been told repeatedly by officials and Ministers that the Online Safety Bill is simply about platforms, systems and processes, rather than content. This is incorrect. These are systems and processes to remove content. Their compliance with the legislation will be judged according to the presence of content, even if a single piece of content would not be enough for a platform to be deemed non-compliant.
The “legal but harmful” duty has been the subject of so much debate. Its supporters are right that it is not straightforwardly a duty to remove content; it is about platforms choosing a policy on a given type of legal but harmful content and applying it consistently. However, this is not nearly as simple or as innocuous as it sounds. The vagueness of the concept of harm gives Ofcom significant definitional power. For example, a statutory instrument might designate information which has an adverse physical or psychological impact as a priority category which platforms must include in their terms and conditions. A platform that said that it would not allow such information could be penalised by Ofcom for not removing content which the regulator feels meets this standard but which the platform does not, because the platform either does not believe it is untrue or does not believe it is harmful.
When we asked why it would not be simpler to criminalise given harms, part of the response was that many of those legal harms are so vague as to be impossible to define in law. It is not clear why that would not also make them impossible to regulate. As a committee, we have always felt it a crucial point of principle to focus on the evidence in front of us, and when we did, on this issue, a consensus quickly emerged that the “legal but harmful” provisions are unworkable and would present a serious threat to freedom of expression. They should be removed from the Bill.
We also raised concern about the duty to remove illegal content as currently drafted. The problems with this duty have not received nearly as much attention as the “legal but harmful” duty, but might, I fear, be significantly more dangerous. Unlike with “legal but harmful”, this is straightforwardly a duty to remove content. Of course no one wants illegal content online. If it really is illegal, it should be removed, but we are asking platforms to make decisions which would otherwise be left to the courts. Prosecuting authorities have the time and resource to investigate and examine cases in great detail, understanding the intent, context and effect of posts. Platforms do not. Neither platforms’ content moderation algorithms nor their human moderators are remotely qualified to judge the legality of speech in ambiguous cases.
The new communications offences in Part 10 of the Bill, which have their merits, show the problem most clearly. A platform will have to remove posts which it has reasonable grounds to believe are intended and likely to cause serious distress to a likely audience, having considered whether there might be a reasonable public interest defence. Even courts would struggle with this.
If we oblige platforms to remove posts that they have “reasonable grounds to believe” might be illegal, there is a real danger, surely, that they will simply remove swathes of content to be on the safe side, taking it down if there is the slightest chance it may be prohibited. There is no incentive for them to consider freedom of expression, other than some duties to “have regard for” its importance, which are currently much too weak. Legitimate speech will become collateral damage.
I do not pretend that we have all the answers to these concerns about how to ensure proportionality and accuracy in removing potentially illegal content, but I know that this is something the Government have been looking at. Can my noble friend tell us whether the Government acknowledge the concern about overremoval of legal content and whether consideration has been given to solutions which could include a clear and specific duty on Ofcom to have regard for freedom of expression in designing codes and guidance and using enforcement powers, or more fundamentally, a change in the standard from “reasonable grounds to believe” to “manifestly illegal”?
The committee in its report found the drafting of the Bill to be vague in parts, perhaps because it is born of a desire to find some way of getting rid of all the bad things on the internet while avoiding unintended consequences. As Susie Alegre, a leading human rights lawyer at Doughty Street Chambers, put it, the Bill is so unclear that
“it is impossible to assess what it will mean in practice, its proportionality in relation to interference with human rights, or the potential role of the Online Safety Bill in the prevention of online harms.”
Ofcom will be left to try to make sense of and implement it. Ofcom is rightly a very well-respected regulator, but it is wrong to hand any regulator such sweeping powers over something so fundamental as what citizens are allowed to say online. There is no analogy in the offline world.
Think of how contested BBC impartiality is. Imagine how much more furious the debate about Ofcom’s impartiality will be when both sides of a highly contested debate claim that platforms are wrongly taking their posts down and leaving their opponents’ posts up, demanding Ofcom take action to tackle what they see as harm.
The only winners from all this will be the likes of Facebook and Google. Having left their business models fundamentally unscathed, the Online Safety Bill will create obligations which only they can afford to deal with. New entrants to the market will be crushed under the compliance burden.
Before I conclude, on enforcement, it is sometimes said that the internet is a Wild West. It is not. We are right to put in place regulatory regimes across the digital landscape and, for all its flaws, this Bill is an important step. However, the report identified 12 existing criminal offences and a number of civil law protections that are already in place, and which are especially relevant to the online world. These offences already cover many of the behaviours online that we most worry about. The problem is not a lack of laws but a failure to enforce existing legislation. We called on the Government to ensure that existing laws are enforced and to explore mechanisms for platforms to fund this, and to require platforms to preserve deleted posts for a fixed period.
It will soon be time for this House to turn its attention to detailed scrutiny of the Online Safety Bill. I hope that noble Lords will find the committee’s report and today’s debate a useful preparation. I firmly believe that the approach that we suggest would make the internet safer and freer than would the current proposal. I would like to see an Online Safety Bill that focuses on platform design and content which is manifestly illegal, and which goes much further to protect children. It must also contain strong incentives for platforms not to take down legal content, including a prohibition on removing content from legitimate news publishers.
Parliament must provide ongoing scrutiny of the online safety regime, competition, and all areas of digital regulation, to help regulators do their jobs effectively and to ensure that their powers are never again so completely overtaken by changes in the digital world.
I look forward to hearing from my noble friend the Minister, and warmly congratulate him on his appointment. I am sure that he will approach this debate and the Online Safety Bill with characteristic depth of thought. I beg to move.
My Lords, I rise not because there is even more power in having “early” next to my name on the speakers’ list, but because the noble Lord, Lord Bassam, has had to withdraw—I hope for non-serious reasons. We will miss his contribution.
I sincerely congratulate the noble Lord, Lord Gilbert, and the committee, for an excellent report and, as he has indicated, a timely one, as we move to the Online Safety Bill in the very near future, hopefully. I also look forward to the noble Baroness, Lady O’Neill, following me. My mother, who was born in 1900 and left school at 13, was something of a philosopher herself, and used to tell me, “Sticks and stones will break your bones, but names can never hurt you.” That provided me with a certain resilience for my chosen profession of politics, but it is only partly true. Misinformation, fake news, and plain old-fashioned lies have been the prelude to tyranny, torture and murder throughout history.
Liberal democracies are particularly susceptible to such attacks. I am not talking about the Liberal Democrats but about that wave of parties in all free societies who believe in the freedom of speech that the noble Lord, Lord Gilbert, referred to, in a free Parliament and in the rule of law upheld by an independent judiciary. They are particularly susceptible because they have built into their DNA a certain tendency towards tolerance and freedom of speech, and a reluctance to claim absolute certainties. I miss from these Benches today the late Lord Russell. Conrad would say, in response to a particularly dogmatic colleague, “I wish I could be as sure about one thing as the noble Lord is about everything.”
I have time to make only three short points. First, I commend the four regulators—Ofcom, the ICO, the CMA and the FCA—for the work they do to consult and co-ordinate, and I urge them to extend this to further protect the rights of citizens and consumers. I associate myself with the call from the noble Lord, Lord Gilbert, for the early establishment of the Digital Markets Unit.
Secondly, digital citizenship should be a central part of the government media literacy strategy and be properly funded. I served on the Puttnam committee, which gave pre-legislative scrutiny to the 2003 Communications Act. We recommended that Ofcom give priority to digital literacy as a way of equipping the citizen and democratic structures for the new digital age. I am afraid that this is still work in progress, and I support the report’s recommendation that Ofcom assist in co-ordinating digital citizenship education between civil society organisations and industry.
Thirdly, the Government’s response contains lots of good intentions and box-ticking, but big tech will be judged, rather like the big energy companies on climate change, not by its ability to tick boxes or do its equivalent of greenwashing, but by what it actually does to address these very real problems. That is why I strongly support the report’s recommendation that a Joint Committee of both Houses be established to consider the ongoing regulation of the digital environment.
My old mentor, Jim Callaghan, was fond of saying, “A lie can be halfway round the world before truth has got its boots on”. This is truer than ever today, and liberal democracies must equip themselves and their citizens to protect their institutions and values from a real and present danger. This report and debate are an important contribution to us getting right how we protect our freedom and values in the years ahead.
My Lords, this is a rich, detailed and informative report, yet one underlying issue has perhaps gone to the margins: the focus on freedom of expression. Nowadays, we often use the term “freedom of expression” as though it were a synonym for freedom of speech; this shift has been a feature of 20th-century discussions. Yet communication involves two parties: not merely those who express themselves, the originators, but the recipients. When human rights documents shifted their focus to freedom of expression rather than free speech, perhaps we did not notice that this marginalises the position of recipients and privileges originators. In short, there is a difference between expression and communication. Freedom of expression is not enough for a democratic culture in which free communication is respected and required.
As we well know, new communications technologies have often fundamentally disrupted communication. We can think all the way back to what Plato tells us of Socrates writing about writing, to realise how old this is. Similar things happened with the advent of printing and then, of course, of broadcasting. The remedies were often extremely slow, which is a salutary lesson for us in contemplating the recommendations of this report. How fast could it be done? How much of a change would it achieve?
This time, as I mentioned, we have new technologies that privilege the originators and expand their freedom of expression—at least in theory. That is no bad thing, but it might leave the recipients in a problematic position, receiving content from they know not where or whom. That is where the problem begins: we do not know who the originators of this communication are. Very often, this is a source of difficulty.
Unsurprisingly, some norms and standards that have mattered greatly for communication will be ignored if we are thinking mainly about freedom of expression. Norms that can be ignored might include—this is just a smattering; there are many others—honesty, accuracy, civility, reliability and respect for evidence. I could go on. Noble Lords will note that they are not only ethical but epistemic norms. These are the bedrock of good communication.
So, stressing the rights of originators too much is likely to land us with some difficulty. Digital communication empowers originators, and this can be at the expense of recipients. Let us remember that some of the originators are not you, me and our fellow citizens seeking to express ourselves, but tech companies, data brokers and other actors in the digital space who relish the thought that they have freedom of expression, because it enables them to do things they perhaps ought not to do.
It follows that remedying the situation will be multiply difficult and probably slow, but the one thing it must not be is a set of remedies that protect originators at the expense of recipients. Remedies must concentrate on removing the cloak of anonymity that currently protects so many originators and ensuring that what they do can be seen to be something they did. That means removing anonymity from the tech companies, the data brokers and indeed the many other sources that are polluting communication at present.
I suppose that this empowers some originators, but I doubt whether concentrating on those will get us there. The important thing is to regulate data brokers, tech companies, Governments and cartels: those who pollute the online space.
It is an honour to follow such a respected philosopher as the noble Baroness. Indeed, it was a privilege to join the committee under my noble friend Lord Gilbert’s excellent chairmanship, but that was not until after the inquiry was completed, so I cannot claim any input into this excellent report.
In January this year, I took on the daunting challenge of succeeding my noble friend as chair and maintaining the committee’s reputation for undertaking inquiries of relevance and impact. Clearly, I endorse the conclusions and recommendations of the committee’s report. I believe in freedom of speech—online or in the real world—and welcome the Government’s decision to look again at the most contentious element of the Online Safety Bill—which my noble friend has already referred to—which threatens to undermine that. But, like everyone else, I also care deeply about the protection of children from harm, and my concerns have only been reinforced by the recent inquest into the tragic death of Molly Russell.
Doing nothing when it comes to regulating the internet is not an option I would consider acceptable. The Communications and Digital Committee will reconsider the Online Safety Bill once the Government have announced how they plan to change it before it reaches your Lordships’ House. I am not going to comment further on the freedom of speech aspects of the committee’s report today. Instead, I want to emphasise the importance of the other half of the regulatory equation to which the Government, frustratingly, have not so far attached equal priority, even though, as my noble friend has said, it is just as important if we are to have a safe as well as economically healthy online world: legislating to tackle the dominance and overwhelming power of the big tech firms by allowing much-needed competition to them.
Chapter 4 of the committee’s report sets out most powerfully the case and urgent need for the Digital Markets Unit, which is part of the Competition and Markets Authority, to be put on a statutory footing and given ex-ante powers to intervene more effectively in these markets. My noble friend already referred to one of the key conclusions in chapter 4, which is about these platforms not being allowed to monopolise the digital public square. The report also recommends that the DMU should, where necessary,
“block mergers and acquisitions which would undermine competition.”
Earlier this year, determined to continue the good work started by my noble friend, the committee held accountability sessions with the Government and the CMA to maintain the pressure for action, including calling on the CMA to use its existing powers to their very limit while waiting for these long-promised and much-needed new powers. Since then, and to its credit, the CMA has been doing that, as evidenced by its recent ruling against Meta’s acquisition of Giphy—the GIFs that are used in tweets and different forms of social messaging. Noble Lords and others might shrug their shoulders and wonder, “So what? What’s the benefit of that?” Well, let me explain.
Had this acquisition been allowed to continue, Meta would have been able to increase its market power by denying or limiting other social media platforms’ access to these GIFs, thereby pushing people to Facebook, Instagram and WhatsApp, which already make up 73% of user time spent on social media in the UK—or it would have been able to change the terms of access to Giphy GIFs, requiring Twitter, TikTok and Snapchat to provide Meta with more data from UK users in return for their access. Disentangling Giphy from Meta will now be a slow and costly operation and a lot of the anti-competition damage will already have been done, but if the DMU had had ex-ante powers it would have been able to prevent the acquisition, or at least the integration, of the business until it had carried out its work.
The internet and the big tech firms have revolutionised our world, and they deserve huge credit for their innovation and the risks they have taken to make a success of their businesses and create opportunities for so many others. But we cannot ignore the damage they cause socially and economically because of the control and power they hold. This threat will only grow if there are no limits to their dominance and everyone else is forced to rely on them, whether as individuals, businesses or even nation states.
It cannot be right that a handful of powerful individuals or corporate entities with no democratic mandate can influence and shape our society and affect our social norms. We need to ensure that the Online Safety Bill does not inadvertently exacerbate that threat, and we need to accept that we will need to keep evolving regulation in this area. But the Government also need to recognise that, on its own, online safety legislation is not enough, and they must bring forward with equal if not more urgency the digital competition Bill. When my noble friend comes to wind up, could he explain why the Government have, so far, failed to recognise this? Could he also tell us what plans the Government have to bring forward this necessary legislation as soon as possible?
My Lords, I am grateful for the opportunity to speak in this debate, and to pay my thanks to the outgoing chair and, indeed, my obeisance to the incoming chair, as I seek to behave appropriately as a member of the committee.
My first point is an observation on how long it takes for a committee report to get its day in the Chamber. It is two years since we did this work. I think of our work on the future funding of the BBC, the future of Channel 4, the position of regulators and now our report on the creative industries and wonder just how old I will be by the time we get to the end of that list.
So it is good to have the report here. In a sense, rereading it with the advantage of two years’ space makes me aware of just how good a report it is. It makes as good reading now as it did then. The noble Baroness, Lady O’Neill, subtly made a point that I will take home and think about. Yes, we had the age-old debate about the need to wed ourselves to the idea of freedom of expression as a human right, but we also had impeccable debates about the misuse of people’s data.
They were two debates that were truly impeccable, each adumbrating a principle which we should stand by with every fibre of our being. It seems to me that, since one seems like an unstoppable force and the other an immovable object, it would need the wisdom of Solomon to decide in particular instances how to favour the rights of those who feel their privacy has been invaded over the advocates—of whom I am one—of freedom of speech. But originators and recipients will go home with me, and I shall think seriously about it.
The digital equivalent of the public square is how social media platforms have been described, and indeed they are, yet the irony is that they are controlled by private companies. Out of that paradox come all the difficulties that we are wrestling with as we seek to get legislation that deals with this complicated world.
The protection of children has been adequately mentioned, and so it should be. I heard the Minister at Question Time yesterday talk again and again about the fact that looking after the interests of children is the predominant feature of the Government’s mind as they take legislation forward in this area. So I hope that the 5Rights work done by the noble Baroness, Lady Kidron, will be incorporated in that thinking and play a major part. Age verification is what she is very concerned about. I believe that her foundation has made significant progress towards getting something that we could work with, and I hope she has assurance on that point.
Early in this report, we were pointing the way forward, presciently I think, towards the Online Safety Bill that will soon be before us—or will it be soon? It has been put off so many times. I have no idea when it will finally be taken on the Floor of the House of Commons. Looking towards such a Bill, we emphasised three considerations that we should take very seriously: the design of legislation, the nature of competition and the need for improved education in what the phenomenon of the internet and its applications means, not just in terms of helping children and adults to press the right buttons and activate the machinery to do their will, but in terms of understanding outcomes and what anonymous contributions to conversations—or are they conversations if the contributors are anonymous?—can lead to. Well, I am very glad that this is before us.
“If liberty means anything at all, it means the right to tell people what they do not want to hear.”
That is fair enough. I have stood at Speakers’ Corner in Hyde Park many a time and have had a fair few things hurled at me. However, I want to add as a corollary, “If liberty means anything at all, it means the right of people to tell me what I don’t want to hear”. I think that that might be a complementary way of looking at a very important principle.
My Lords, I shall seek not to go over my time. I congratulate the noble Lord, Lord Gilbert, and the committee on the report. It is very timely to debate it today—the day on which the EU’s Digital Services Act comes into force, and as we ourselves eagerly anticipate the Online Safety Bill. I want to make a short contribution on the basis of having spent a decade inside one of the platforms, making decisions about how to manage content.
We are here with the Online Safety Bill and the Digital Services Act because we, the politicians, do not trust private companies to make decisions about their platforms. The noble Lord, Lord Gilbert, outlined some of the reasons why that trust has evaporated. The position now is that we are taking power to ourselves to tell platforms how to manage content, as a condition of operating in the UK market, and we will delegate the day-to-day enforcement of those rules to our chosen regulator, Ofcom.
An important question that arises from this, which the report rightly focuses on, is whether we should instruct Ofcom to consider only illegal speech or to bring in a wide range of other types of harmful speech. Because of concerns about the regulator enforcing against speech that is legal, there is now an interest in whether the definitions of “legal” and “harmful” could be more closely aligned. Today, I want to make a necessarily condensed argument for why this would be a mistake, both as a matter of principle and as a practical matter.
Turning first to the principle, we often hear calls to align online and offline standards. In our real-world interactions, we do not rely solely on the law to manage speech behaviour; this is to build on some of the arguments made by the noble Baroness, Lady O’Neill. To take an example, I could cover myself in swastikas and hand out copies of Mein Kampf entirely legally in the United Kingdom. There is no law that prohibits me. Yet were I to try to do that in most public spaces, such as by going to a football ground, I would be stopped on the basis that the speech norms prohibit my doing that, rather than because I had broken the law. We have a gap between what is unacceptable speech and what is illegal speech. This is not a bug but a feature of our speech norms in the United Kingdom.
It would be a mistake to try to make all unacceptable speech illegal or, equally, to deem all legal speech acceptable and try to force platforms to carry it. We are left with a sustained situation where there will be a gap between what we as a population believe is acceptable and what the law outlines, and that is right. We want to keep the legal prohibitions—the criminalisation of speech—as minimal as possible.
Turning to the practical considerations, which the noble Lord, Lord Gilbert, again talked about, it is sometimes assumed that there is a bright line between legal and illegal content. My experience over many years is that there is no such bright line but many shades of grey. Again, to illustrate this with a specific example, many people would post on social media pictures of Abdullah Öcalan, the leader of the PKK, a proscribed terrorist organisation in the UK. Now, when someone posts that picture, are they supporting the peace process he is engaged in in Turkey? Are they supporting him as a terrorist? Are they supporting his socialist ideals or the YPG in Syria, which also looks to Abdullah Öcalan? There is no way to understand the purpose and intent from that photo, so you have to make a judgment. At one end of the spectrum, you could say, “Look, I am so worried about terrorist content that I am going to take down every picture of Abdullah Öcalan”, knowing that you will be taking down many forms of legal expression. At the other end, you could say, “I will leave them all up, and if I do so I know that I will be permitting some expressions of support for terrorism, or some illegality to take place.” There are of course many points in between.
We have an opportunity now to shift where those judgments are made in the new structure outlined in the Online Safety Bill. Platforms will have to respond to guidance and codes of conduct precisely on these issues of how they make judgments, and we, as Parliament, will have a role in setting that guidance and those codes of conduct, as Ofcom will bring them to us. We are moving into a world where decisions will not necessarily get any easier but will no longer be the sole preserve of the platforms. It is a benefit for public accountability that there will be an official government or parliamentary view expressed through Ofcom’s codes of conduct. Equally, we as Parliament—or the British establishment—will be responsible in future for the decisions made around content moderation. I fear that I may have jumped out of the platform frying pan into the regulatory fire by engaging from this side of the argument, but the Online Safety Bill will be a significant improvement.
My Lords, I declare an interest as a freelance TV producer. I had the honour of serving on the Communications Committee when this report was published. I too thank the noble Lord, Lord Gilbert, for his very able chairing of this inquiry.
The noble Lord, Lord Gilbert, suggested that the Government should amend the Online Safety Bill clauses on content that is legal but harmful to adults. I agree with the fears that these clauses will have an extremely deleterious effect on free speech. It is not just that the definition of this material is so vague, but that the Bill gives such dangerous powers to the Secretary of State to specify what is harmful by regulations. I support the recommendation in this report, which was then taken further by the Joint Committee on the Bill, to set up a parliamentary committee that would have the power to interrogate these changes further. I understand that the last Government were minded to drop these clauses. I would be grateful if the Minister would share with your Lordships’ House the new Government’s thinking on this issue.
I want to concentrate my speech on the later recommendations in the report. Recommendations 33 and 34 call for the Digital Markets Unit to be given statutory powers. It has been established for over a year and a half but still has not been given them. This could not be a more urgent issue. The big tech companies are still shockingly dominant. Your Lordships have heard this week of the falls in their share prices, but they still have enormous power in the markets.
In the tech ad market, this power is supreme. The CMA’s report into online platforms and digital advertising found that Google and Facebook, as it was then called, account for 80% of digital advertising spend. It declared that the market is “no longer … contestable”. Such dominance is an obvious threat to innovative start-ups. Even if they manage to get a share of the advertising revenue, they face the ever-present threat of being bought up, before they have grown to scale, by the big players, whose dominance is thereby enhanced.
The problem is that the CMA’s monopoly rules concentrate on consumer price benefit. Obviously, when so many of the services offered by the platforms are free, that does not apply. Instead, different metrics must be introduced which take into account how the platforms use data, consumers’ privacy and freedom of expression.
The Government’s response to the committee’s recommendation is to acknowledge that competition is central to unlocking the full potential of the digital economy. They promise to deliver reforms that will bring more vibrant markets, greater innovation and increased productivity. Who in this House does not agree with that?
I echo the noble Baroness, Lady Stowell, who asked why the Government have been so slow to enact these pledges. The Queen’s Speech dangled before your Lordships the hope of a draft digital markets and competition Bill, which promised to give the DMU statutory powers so that it can tackle tech companies’ abuse of their dominant positions. As the Government delay on this matter, regular businesses and consumers are losing out. The CMA suggests that they are losing £2.4 billion annually from the overpricing of the big platforms on ad sales alone.
Instead, the Government have used valuable legislative time to bring forward a media Bill which, although containing useful elements, promises to privatise Channel 4, a move driven by blind ideology rather than any business case. Can the Minister give the House an indication of when the digital markets Bill will come before it? I hope he will give us an assurance that goes beyond “when parliamentary time allows”.
I should also like to draw your Lordships’ attention to recommendation 42 of the report, which calls for a mandatory bargaining code to be set up to ensure fair negotiations between platforms and news publishers. Since 2010, over 265 regional newspapers in the UK have closed. Those that remain have seen their circulations collapse and this lost revenue is not being replaced by digital subscriptions. The industry faces an existential threat.
The big hope is that it can be resurrected digitally, as 38% of visits to news publishers’ websites came from links on Google or Facebook. However, at the moment the platforms get the content free or at very little cost, even though news content is one of the biggest drivers of traffic. The tech companies have made contracts with some newspaper publishers to pay for their content, but many say that the power imbalance is so great in the platforms’ favour that they are not being paid the true cost for using the content.
A bargaining code has already been introduced in Australia. It is not perfect because it is not sufficiently inclusive of regional players, and some people are worried about a mandatory contract for news content being imposed on the platforms. However, Rod Sims, the ex-head of Australia’s competition commission, told me that this has not happened and he had not been forced to use his powers. The threat of the imposition of a contract has changed the dynamic in the market enough to bring the platforms to agree an equitable price with news publishers for use of their content.
The report needs to see more of its recommendations taken up by the Government. There is still important work to be done if this country is to become a digital world leader. I urge the Minister to do all he can to ensure that there is legislation which protects freedom of expression and fosters a competitive digital market with a plurality of platforms in which those voices can be heard.
My Lords, I thank the committee for this report. Even though I do not agree with many of its recommendations, it was a real treat to read—like a great primer or literature review. There is so much of the Online Safety Bill to worry about in terms of free speech that it is hard to know where to focus, so I will just make a few points.
I was especially grateful to see a refreshingly nuanced approach in the report to misinformation, which I focused on the last time we discussed these issues. As research from Ofcom notes, many believe that the term “misinformation” is being
“weaponised for censorship of valid alternative perspectives.”
The report’s examples from the lockdown and Covid era are pertinent: for example, expert medical opinion—albeit a minority view—that challenged either the Government or the World Health Organization was labelled misinformation, deemed so by big tech fact-checkers with no scientific qualifications. As the report observes,
“There is a moral panic about ‘fake news’”, leading to “frightening overreactions” by Governments and big tech.
I was also glad that the report noted the broader context of what I think is in danger of becoming a moral panic about online safety. Concerns from free-speechers are grounded in the offline problems of cancel culture and the ever-growing attacks on, for example, academic freedom in universities—such that the Government are attempting to legislate to enhance free expression on campus at the same time as undermining free expression online.
I will add another offline context: there is a contemporary therapeutic ethos that posits safety—especially psychological safety—as trumping freedoms of any sort. I hope that the committee will look at this at some stage. We cannot discuss online harms without understanding that the concept of harm is an ever-expanding category.
Before I look at that, I will make one clarification: whenever I raise problems with the Bill, the justifications that come back at me always centre on children’s safety. I note that I would be happy if the Online Safety Bill confined its focus to children and the young. Instead, the Government use adult worries about children’s access to porn, self-harm and suicide—all legitimate worries—to introduce huge legislative changes that will affect adult freedoms, effectively infantilising citizens and treating us as dependent children in need of protection from each other’s speech.
The report tells us:
“Civilised societies have legal safeguards to protect those who may be vulnerable.”
The problem is when vulnerability gets discussed in relation to adults. In a therapeutic culture, vulnerability and victimhood are valorised and often incentivised because, if we present ourselves as fragile and vulnerable, we have a cultural currency and power not only to gain attention and support but to silence others. For example, the report is extremely helpful in deconstructing the whole concept of harm: the committee rightly rails against the illiberal notion of censoring “legal but harmful” material, and hopefully the Government will indeed drop that egregious clause. The whole premise of the Bill is based on the idea that speech online can be, and often is, harmful. The elastic use of the term “harm” makes it ill-defined and subjective, fudging physical harm with psychological harm—and it is no wonder that many now see words as violence.
The committee helpfully asked the Government whether the
“Bill’s definition of psychological impact has any clinical basis”.
The reply came back saying, “No”; it would be up to “platforms … to make judgements” about speech causing anxiety or fear. This is potentially disastrous, as terms such as “offensive”, “hate” and “misinformation”—with all their subjectivity—can be invoked by individuals to claim that something should be banned.
The report notes that, a few years ago, it was feared that
“its presence could ‘harm’ some attendees”.
Goodness knows what they would make of the harm of having Bishops in this place. Only this week, Cambridge University faculty heads apologised to students for “distressing” them by sending an email promotion for a “potentially harmful” talk. What caused such alarm? A talk by Sex Matters’ Helen Joyce entitled “Criticising gender-identity ideology: what happens when speech is silenced”—oh, the irony. Actually, much speech is silenced, online and offline, by deploying the language of psychology to suggest that speech, books and ideas are dangerous. Trigger warnings are put on lectures and literature to prevent post-traumatic stress disorder. PTSD is now invoked not where it is clinically diagnosed, after war or disaster, but for the potential harms caused by upsetting speech or words. So even if “harm” in the Bill is medically defined, it will not help, because psychological language is now frequently used to silence us.
My Lords, I join in paying tribute to the noble Lord, Lord Gilbert, for steering this excellent report through the committee. It was very much his idea; he was ahead of his time in alighting on this as a big gap in how we debate online safety regulation. Let it not be forgotten that I now have a new leader—my noble friend Lady Stowell—and I duly genuflect to her for the few months that I have remaining on this committee, which does some excellent work. It has been a wonderful way in which to be introduced to your Lordships’ House.
I echo a lot of the points made already by people who are extremely well informed in this arena. First, on the point made by the noble Baroness and the noble Lord, Lord Gilbert, about the Digital Markets Unit, it is obviously very important that we update competition regulation. It is interesting that the analogue Competition and Markets Authority still managed to take a swipe at Meta/Facebook and forced it to divest itself of Giphy, a site that produces lots of memes. No doubt the decision produced its own memes—but it has been a very bad week for Mark Zuckerberg, and I gather that he is now down to $38 billion in net worth. He has lost $100 billion and is now merely worth his age, 38, so we hope that it does not go any lower. I am not generally in favour of the competition authorities getting involved in these kinds of issues, but it is a salutary reminder that the acquisition of small companies, such as Instagram, can sometimes shut off competition at an early stage.
I also want to get off my chest this issue about digital citizenship. By the way, as mentioned in the register of interests, I work with Common Sense Media, a US not-for-profit organisation that promotes digital citizenship, as well as NewsGuard, which combats fake news sites. I find the phrase “digital citizenship” intensely annoying, because it has become completely meaningless. That is not to be rude about Common Sense Media, the organisation that I work for, which provides very useful videos and training for young people on how to handle online bullying, and so on. The phrase also means that we miss the point about how clunky technology still is. For me, the biggest change that we could make in digital citizenship would be to take the 120 pages of terms and conditions that you sign up to when you buy a new phone and turn them into five principles, so that you know exactly, in effect, what you are signing up for.
I would like to see the Government—and, indeed, this Minister, in the few hours he remains in his current post, although I am sure he will be moved to an equally good department—
I am only teasing about the endless reshuffle. My jokes do not always work in this place. I would love the Minister to say what the Government are doing to encourage technology companies to be more user-friendly. That may involve digital citizenship training for the Home Secretary, who I gather finds it difficult even to use email. Clearly, there are issues here for people of every level of experience.
At the heart of this debate is, of course, what we mean by “legal but harmful”. I completely agree with the noble Baroness, Lady Fox. I was hoping to disagree with her, because she is very provocative, but she is right to a certain extent about moral panic. The rise of Trump was actually aided by CNN more than anybody else. We sometimes load too much on to the platforms in terms of what they do. Nevertheless, I strongly support internet regulation of some kind. We need to make platforms accountable. The best example is a Twitter pile-on: when you face the most vile abuse—particularly as a politician—there is simply no way to get redress. There has to be a regulatory backstop to enable you to do that. But let us be clear: this is not broadcast regulation, and it is not going to take every tweet and adjudicate on it; it is systems regulation, and it is long overdue.
The final point that I would make to the Minister—and I shall not make another joke about him moving—is that I would love to hear what more the Government are doing on age verification and identity. It is such an important issue, and we simply cannot seem to get to a clear answer. It is about dealing with the issue of adult content, which the noble Baroness, Lady Fox, raised—and it seems unbelievable that we still do not have proper age verification procedures in place for this kind of thing.
Finally, as a committed remainer, I celebrate, along with the noble Lord, Lord Allan, the fact that these terrible EU bureaucrats who can barely tie their own shoelaces have managed to pass sensible internet and competition regulations much more quickly than us, while we see the Online Safety Bill tortuously stuck in the other place.
My Lords, like previous speakers I thank the committee for its excellent report. As someone said, it makes good reading and is a clear exposition of the issues. The obvious question is, where are we with the Online Safety Bill, which was mentioned in the Queen’s Speech but has since disappeared from sight? We were told it would be “as soon as possible” and clearly, the emphasis is on “possible” rather than “soon”. Much of the discussion, rightly, has been on the issues set out in the report, with a focus on protection of children, which I feel strongly about. I think back to my childhood and that of my children, who grew up in a pre-internet age, and I fear for my grandchildren, faced with the issues they now see on the internet. Legislation in that area is crucial.
However, I want to expand the discussion to reflect the expansion of the scope of the Online Safety Bill, because various other priority offences have been added to it, including fraud and financial crime. This is an important aspect of online safety: clearly, protection against financial crime should be a crucial part of the Bill, and I am glad the Government have accepted that. It is not clear to me exactly how it is going to work, because there is a particular problem. They have defined it as “fraud and financial crime”, but a lot of the harm that happens on the internet might not, strictly speaking, be found to be fraud or another form of financial crime. People can be harmed financially through reading material on the internet which, unless there is a prosecution through the courts, might not be counted as criminal. I hope the Minister will say something about how such things will be defined.
The issue of a duty of care has been mentioned, and a duty of care to protect people against financial harm would be an essential pillar of the Bill. In the phrase that is used, there should be safety by design, so that people are not misled into reading material that will cause them suffering.
My Lords, I too sat on the committee under the excellent chairmanship of the noble Lord, Lord Gilbert, and now under the excellent chairmanship of the noble Baroness, Lady Stowell.
The power to amplify, together with the volume and speed of the internet, has put power in the hands of individuals, organisations and tech companies, for better or worse. Now we are seeking to control the worse, but as we do so I counsel that we remember that the internet has given us the most extraordinary communication tool for ideas, for gathering others to our cause and for getting information around the world quickly, as well as avenues to contact the outside world for those in countries that do not have the miracle of free speech because their media are state-controlled. So getting the balance right between freedom of speech and the need to qualify it is a very important task. Of course, what is illegal offline is illegal online: that is the easy bit, and I guess that is where my preference lies, with very few exceptions.
As the noble Lord, Lord Gilbert, said, I want maximum controls in my own home. Put power in my hands—if I do not want to receive anonymous messages, I should be able to tap my screen and they should never bother me again. However, primarily, I want companies to be responsible for policing their content and Ofcom to regulate and act when companies do not comply with their codes. I would hope that that would be enough, as it has worked pretty successfully in broadcast and publication to date, but clearly the world has changed and we are in different territory. If something is likely to cause real, serious harm online, then, as the noble Lord, Lord Gilbert, said, it should be made illegal.
However, as we are to legislate against less obviously harmful content, let us have a very short list of what will qualify. I saw the list of priority content that was published. The list is not unreasonable; the unreasonable part is putting any such power into the hands of the Secretary of State and not Parliament as a whole. We cannot give the state control over our media.
Overarchingly, we must leave room for adults to make their own decisions. We do not have to view what we do not want to see. We need to be careful in any legislative fervour to guard against authoritarian creep, where the prohibition of what is truly harmful oversteps into a world where we are to be protected from absolutely anything we do not like or agree with—or, worse, anything that the Government do not like or agree with. That is really dangerous territory.
Free speech is one of the most precious of all human rights. It is the foundation of a democratic, open society. I am concerned that we are already seeing authoritarian creep in things we have taken for granted for years, such as some curtailments on the right to protest. It has always been recognised that the right of people to criticise Governments, laws and social conditions is fundamental to democracy. Of course, free speech presents great challenges—that is the point—but from Socrates on, the very best way to challenge ideas you disagree with has been to confront them by marshalling better ethics, reasoning and evidence. I worry that we have become risk-averse to a degree where we are disabling ourselves.
When we bring in the Online Safety Bill, we must guard against disabling future generations by overprotecting them from the realities of our existence. Civilisation is only skin deep; we need to be able to think, counter arguments and fight back with strength of mind. Life is dangerous and ideas can be challenging. With too much protection, we will create an inability to build resilience. Jonathan Haidt, the American social psychologist, cites the immune system: if you do not expose a human to various viruses or allergens, their immune system will not develop. We require a degree of exposure to stress to enable us to develop strength.
If we spend time only with people who agree with us, are like us or think like us—this is happening, as society is disaggregating into groups of the like-minded—we will be on a very dangerous road. Continuing to divide ourselves and narrow our circles to people, media or groups who agree with us is reductionist. It leaves us weak, suspicious and scared of the different. I am not fond of the term “snowflake”, because I think being sensitive to people’s feelings and sensitivities is a good thing. I disagree with the noble Baroness, Lady Fox; I think trigger warnings are fine—they are just putting the power in your hands.
Anthony Kapel “Van” Jones, an American news and political commentator, author and lawyer said:
“I don’t want you to be safe, ideologically … I want you to be strong.”
If we eradicate words, ideas and subjects that cause discomfort or give offence, we weaken ourselves. I am worried that power will be held too close to the state. We must sort out the chaff from the wheat but, more than that, we must not submit our intellect and freedoms to the mob.
My Lords, with the blurring of the edges between the physical, the digital and the virtual as technology advances into the metaverse and Web3, the committee is right to demand an immediate setting of standards. Merely to commend the members of the committee would be an understatement; I express my admiration for what is a thorough examination of this plethora of subjects.
Freedom of speech in any context is a valiant aspiration. The fact is that each aspect of our freedom has consequences and impact. None of us should be entitled to a set of freedoms that disregards the well-being of others, that is detrimental to others or that is predicated on harms to others. In the absence of a defined set of boundaries and values for that freedom, we will certainly need to consider guidance, although exploring the parameters of which common values can be regulated and safeguarded without defining them is a problem.
I am chair of the APPG on the Metaverse and Web 3.0. We have recently conducted meetings with the leading innovators and entrepreneurs in this space, those who are transitioning from Web2 to Web3 within the emerging technologies. Their overall view is that this decentralised space is vast and fast-moving, and that we may already be running late to regulate the industry. It is important to prevent conglomerates, elite one-man bandwagons like Facebook, Google and Twitter, from becoming the key holders of our data and of the future of our young without including these innovators as stakeholders and entrepreneurs within the sector; we are already at an advanced stage of building systems without any recourse to accountability and transparency. The concerns are well laid out on page 19 of the report, with the evidence presented to the committee. Our APPG wishes to add to the committee’s work by bringing together practitioners, academics and NGOs who are cognisant of the impact on young people playing Roblox, or on vulnerable young people stuck in virtual reality.
The possibility of innovation being a common good for society is immense, as I experienced this week at a WPP event where I stepped into virtual worlds of Singapore and South Africa. It was a powerful experience, but nothing can replace experiencing those countries’ air, beauty and interactions with their people.
Even in this new, decentralised arena, inclusion is not a reality for those who may most benefit from, and need, support, virtual or otherwise. The report eloquently emphasises the issues of content, but without an acceptable definition of false or harmful content it would be difficult for Ofcom or other regulatory bodies to take any action. Use of Facebook by Burmese extremists to spread hate against the Rohingyas may have assisted the brutal murder, torture and rape of hundreds of thousands of people. The Rohingyas have filed a suit against Facebook; no one knows what the outcome will be. Closer to home, the dirty tactics of Cambridge Analytica remind us that the world is an advertiser’s oyster when the selling of our data goes on without adequate public knowledge and education.
I recently tried to buy a photobook for my grandchildren on Google, and when the time came to pay, a pop-up suddenly said, “All your data, including email contact, will be available”. I was rather disturbed; I was flabbergasted. It seems that anything is acceptable in this space. I find the intrusion arising from the purchase of a single item rather unnerving, to say the least, and members of the public may be unwittingly agreeing to things without informed consent, with data often being sold onwards.
I have much to add on the social impact of this matter. Suffice it to say that, as a child protection officer, I have my antennae permanently tuned to the exploitation of children and vulnerable adults. I have witnessed too many times the devastating effect of child sexual abuse and exposure to pornography. The digital space is open for paedophiles to go beyond current imagination, allowing them to create a virtual reality of children raping children and of extreme violence against women, not to mention the demonisation of certain religious groups and of women; these are alarmingly rampant. A witness to the committee highlighted many of these issues.
It is refreshing that young developers and innovators are all too keenly aware of the issues, and I am confident that their work sets a good example. They are very keen to work in partnership with institutions and government, as well as NGOs, and many are acutely aware of their responsibility.
I just want to ask a question and then I am done. I appreciate the leniency of the House.
As devices become ever more accessible, how can we ensure freedom of speech for our citizens, considering the potential for hacking and stolen data? If much of our social activity moves to the metaverse, how can we safeguard users against cyberbullying and sexual exploitation, particularly of children? And how will the pornography industry be regulated and monitored as experiences in this space become more immersive?
My Lords, I also congratulate the noble Lord, Lord Gilbert, and members of the committee on producing such a thorough and thought-provoking report. I refer to my interests as set out in the register and declare that I spent some 20 years building a digital information company where freedom of expression—in our case, views and analysis on Governments around the world—was our lifeblood.
That said, my focus today is online safety, particularly for the young, among whom evidence shows that mobile access to digital media has led to deeply disturbing patterns of behaviour—not just in the well-documented areas of online hate, abuse and bullying but in the unintended contributions to increasing obesity, falling levels of physical activity and, in certain areas, declining levels of academic performance. This also raises a key question: has social media led to a decline in workplace productivity? It is debatable, but many employers, like me, believe that it has.
I believe we must go further and much faster than the draft Online Safety Bill suggests in providing stronger and more effective levels of protection to children. Yes, some of these measures will cause friction, a pet hate of digital platforms; some will restrict freedom of speech; some will impact revenues and profits; and some will depress usage, which is no bad thing in my view. However, preventing damage to both the mental and physical health of the young must be the absolute priority.
The ONS reports that 75% of our children spend three or more hours online a day at the weekend, with 22% spending more than seven hours a day. On school days, almost half spend more than three hours a day online. Allied to that, just 23% of boys and 20% of girls in this country meet the national recommended level of physical activity. One in five children starts primary school overweight or obese, rising to more than a third by the time they leave. More time online, less physical activity—what an unhealthy start to life.
As we know, anxiety and depression among both boys and girls have risen sharply over the last 20 years, as have self-harm and suicide rates. The young and vulnerable continue to have almost unfettered access to menacing websites promoting self-harm or “taking control of your life”, and this is not confined to the dark web. The need to protect our children is beyond question. How we do so is complex and challenging, and it ultimately requires a global set of principles for digital safety, because this is very much a multinational issue.
I will finish by touching on two further points raised in this report. The first is the urgent need for age assurance and age verification technologies, as others have flagged up today, which the draft Bill should address much more forcefully. Responding to a Question in this place yesterday, the Minister suggested that we should not rush in because these technologies are developing so rapidly. With respect, I find that a defeatist excuse for inertia. We should have acted in this area five years ago. TikTok is a prime example: it has a minimum age requirement of 13, which is laughably unenforced. Ofcom reports that it is used by 42% of our eight to 12 year-olds, which is almost certainly an underestimate. The British Board of Film Classification found that a deeply disturbing 51% of 11 to 13 year-olds have accessed pornography online.
Secondly, I wholeheartedly agree with noble Lords that digital citizenship, annoying though that term is, should be a central part of the Government’s media literacy strategy, but it requires structure and funding, as indeed does the equally important related need for health education. Teaching appropriate behaviour online—focusing on civility, inclusion and respect—has become a critical life skill, not just at primary and secondary school but at university and in the workplace. Let us embark on a joined-up and properly financed strategy to address this.
My Lords, as I am the last of the Back-Bench speakers, in the interests of catching up on time and because so many other noble Lords have expressed more eloquently than I could my own concerns about the unworkability of the “legal but harmful” duties and the need to protect children with proper age verification, I will make just one point that the report did not make.
As is well known, the Online Safety Bill has been five years in the making, during which time the tech world has moved on considerably. The report makes no mention of virtual private networks, yet with just two clicks on a VPN app, any user who wants to post to other adults freely outside of UK Government-censored social media can easily post to the rest of the world. That is, the rest of the world except China, whose outsourced-censorship methodology the Government are proposing to copy. In other words, those of us who value freedom of speech between adults over hurt feelings are reassured that technology has made whole sections of the Online Safety Bill redundant and irrelevant. However, we must improve age verification, as has already been mentioned, to protect children.
My Lords, I congratulate the Select Committee on yet another excellent report relating to digital issues and the noble Lord, Lord Gilbert, on his masterly introduction. It really has stimulated some profound and thoughtful speeches from all around the House. This is an overdue debate, as the noble Lord, Lord Griffiths, put it.
As someone who sat on the Joint Committee on the draft Online Safety Bill, I very much see the committee’s recommendations in the frame of the discussions we had in our Joint Committee. It is no coincidence that many of the Select Committee’s recommendations are so closely aligned with those of the Joint Committee, because the Joint Committee took a great deal of inspiration from this very report—I shall mention some of that as we go along.
By way of preface, as both a liberal and a Liberal, I still take inspiration from JS Mill and his harm principle, set out in On Liberty in 1859. I believe that it is still valid and that it is a concept which helps us to understand and qualify freedom of speech and expression. I was very interested in the speech of the noble Baroness, Lady O’Neill; like the noble Lord, Lord Griffiths, I think I need to take it away and think about the difference between freedom of speech and freedom of expression. Clearly, it is something of considerable importance conceptually. Of course, we see Article 10 of the ECHR enshrining and giving the legal underpinning for freedom of expression, which is not unqualified, as I hope we all understand.
There are many common recommendations in both reports which relate, in the main, to the Online Safety Bill—we can talk about competition in a moment. One absolutely key point made during the debate was the need for much greater clarity on age assurance and age verification, a point made by the noble Lords, Lord Griffiths, Lord Vaizey, Lord Gilbert and Lord Londesborough. It is the friend, not the enemy, of free speech.
The reports described the need for co-operation between regulators in order to protect users. On safety by design, both reports acknowledged that the online safety regime is not essentially about content moderation; the key is for platforms to consider the impact of platform design and their business models. Both reports emphasised the importance of platform transparency. Law enforcement was very heavily underlined as well, particularly by the noble Lord, Lord Gilbert, in his introduction. Both reports stressed the need for an independent complaints appeals system. Of course, we heard from all around the House today the importance of media literacy, digital literacy and digital resilience, from my noble friend Lord McNally and the noble Lords, Lord Griffiths and Lord Vaizey. Digital citizenship is a useful concept which encapsulates a great deal of what has been discussed today.
The bottom line of both committees was that the Secretary of State’s powers in the Bill are too broad, with too much intrusion by the Executive and Parliament into the work of the independent regulator and, of course, as I shall discuss in a minute, the “legal but harmful” aspects of the Bill. The Secretary of State’s powers to direct Ofcom on the detail of its work should be removed for all reasons except national security.
A crucial aspect addressed by both committees related to providing an alternative to the Secretary of State for future-proofing the legislation. I agreed with the noble Viscount, Lord Colville, and the noble Baroness, Lady Uddin, who talked about the metaverse, but the digital landscape is changing at a rapid pace—even in 2025 it may look entirely different. The recommendation—initially by the Communications and Digital Committee—for a Joint Committee to scrutinise the work of the digital regulators and statutory instruments on digital regulation, and generally to look at the digital landscape, was enthusiastically taken up by the Joint Committee.
The committee had a wider remit in many respects in terms of media plurality. I was interested to hear around the House—not only from the noble Lord, Lord Gilbert, but from the noble Baroness, Lady Stowell, in her intervention, and the noble Viscount, Lord Colville—support for this and a desire to see the DMU in place as soon as possible and for it to be given those ex-ante powers.
Crucially, both committees raised fundamental issues about the regulation of legal but harmful content, which has taken up some of the debate today, and the potential impact on freedom of expression. However, both committees agreed that the criminal law should be the starting point for regulation of potentially harmful online activity. Both agreed that sufficiently harmful content should be criminalised along the lines, for instance, suggested by the Law Commission for communication and hate crimes, especially given that there is now a requirement of intent to harm. I was not very clear from the intervention of the noble Baroness, Lady Fox, as to whether she even accepted that that could be regulated online.
Under the new Bill, category 1 services have to consider harm to adults when applying the regime. Clause 54, which is essentially the successor to Clause 11 of the draft Bill, defines content that is harmful to adults as that
“of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.”
Crucially, Clause 54 leaves it to the Secretary of State to set in regulations what is actually considered priority content that is harmful to adults.
The Communications and Digital Committee thought that legal but harmful content should be addressed through regulation of platform design, digital citizenship and education. However, many organisations argue, especially in the light of the Molly Russell inquest and the need to protect vulnerable adults, that we should retain Clause 54 but that the description of harms covered should be set out in the Bill. I take quite a degree of comfort here from the points made by my noble friend Lord Allan, as someone with experience within the industry.
Our Joint Committee said, and I still believe that this is the way forward:
“We recommend that it is replaced by a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill”, but that
“These definitions should reference specific areas of law that are recognised in the offline world, or are specifically recognised as legitimate grounds for interference in freedom of expression.”
We set out a list which is a great deal more detailed than that provided in the Bill.
We also diverged from the committee over the definition of journalistic content and over the recognised news publisher exemption, and so on, which I do not have time to go into but which will be relevant when the Bill comes to the House. But we are absolutely agreed that regulation of social media must respect the rights to privacy and freedom of expression of people who use it legally and responsibly. That does not mean a laissez-faire approach. Bullying and abuse prevent people expressing themselves freely and must be stamped out. But the Government’s proposals are still far too broad and vague about legal content that may be harmful to adults. We must get it right. I hope the Government will change their approach, though we do not yet know whether they will. I have not trawled through every amendment that they are proposing in the Commons, but I very much hope that they will adopt this approach, which will get many more people behind the legal but harmful aspects.
That said, it is crucial that the Bill comes forward to this House. The noble Lord, Lord Gilbert, pointed to the Molly Russell inquest and the evidence of Ian Russell, which was very moving about the damage being wrought by the operation of algorithms on social media pushing self-harm and suicide content. I echo what the noble Lord said: that the internet experience should be positive and enriching. I very much hope the Minister will come up with a timetable today for the introduction of the Online Safety Bill.
My Lords, I, too, am most grateful to the Communications and Digital Committee for its work in this area, and particularly thank its chair, the noble Lord, Lord Gilbert, for his work in that role. I also wish the noble Baroness, Lady Stowell, well in taking up the role. It seems an appropriate moment to call on the words of my noble friend Lord Griffiths, who talked about the need for the wisdom of Solomon in this report. I hope the noble Baroness finds that she too has the wisdom of Solomon, because this debate has shown us the need for that.
The contributions to this thoughtful debate today have shown the considerable tensions between protection from harms and privacy on the one hand, but also the great need to embrace the ever-developing and changing opportunities that the digital age brings us. This report has given your Lordships’ House a great opportunity today—albeit some time after the event—to consider a very important matter of our time, which is so deeply affecting so many different aspects of our lives. I am minded to recall that, in the course of a previous debate which I know a number of noble Lords present today took part in, the most reverend Primate the Archbishop of Canterbury wisely observed, when we were looking at the contemporary challenges to freedom of speech, that it is not just about having frank speech, it is about having fitting speech. As we discussed today, we are speaking about freedom of expression, and that, too, must be fitting.
The right to freedom of expression is absolutely balanced by the responsibilities held by government, media, technology and citizens. It is not an unrestricted right, and it is subject to legal limits. For example, while the UN General Assembly recognised all the way back in December 1948 that freedom of expression was a fundamental right to be universally protected, subsequent international agreements have recognised that there can and should be limits to this right.
Of course, the right to freedom of expression is already subject to a range of restrictions in law in this country, but, as noble Lords have said, we must align the legislation, the regulation, with the reality, and we must keep pace. As the noble Baroness, Lady O’Neill, said, the question of the balance of consideration between originators and recipients is a very important one. In particular, the cloak of anonymity worn by some originators cannot be used as a way to damage recipients.
This report highlights the very difficult balancing act which is faced by policymakers, and there is a significant body of evidence which demonstrates the types of harm witnessed online—but we must ensure that freedom of expression is not unfairly curtailed. In addition, as the noble Viscount, Lord Colville, said, we must also remember that it is important that we retain and develop a position as a digital world leader.
This report helpfully acknowledges that various regulators have roles in relation to different forms of online activity, but it also identifies concerns about the lack of overarching regulation covering social media and search services in the UK. Of course, unsurprisingly, many noble Lords have referred to the Online Safety Bill and the various delays to its progress through Parliament, and to the Government’s recent attempts to rewrite parts of it. Of course, it has been some time since the committee’s report and the Government’s response, and since the Joint Committee published its various recommendations for changing the draft legislation.
In the intervening period—and I say this with a certain concern that it may change—at the current tally we have seen three Prime Ministers, three Secretaries of State and three Lords Ministers, plus an assortment of junior Ministers in the Commons. The Bill has changed a lot, but the fundamental tension highlighted by this report remains that, for some, regulation on big tech firms cannot ever be strong enough, while for others any regulation is seen as anti-business, anti-free speech wokism.
So we look forward to welcoming the Bill to the Lords—I hope it will be soon. I hope that the Minister can give that assurance today, because I have no doubt that we will consider many of the issues raised by the committee during our deliberations, which I am sure will take a considerable amount of time.
We support the agenda to tackle online harms and much of what is in the Online Safety Bill. It is by no means perfect, but it would represent a significant step forward for the majority of internet users. The repeated delays to bring in important new safeguards have undoubtedly been disappointing. We are keen to get the legislation on to the statute book, but, as noble Lords today have again said, and as we said in the Chamber this week, the continued failure to act on age verification, which goes back many years, is really something that the Government should have put right. As the noble Lord, Lord Londesborough, rightly said, the Government have indeed failed to act when they could have. The tragic death of Molly Russell stands as a reminder to us all of the need to act, and to act swiftly.
Given the recent change in Administration, can the Minister confirm whether the Government intend to introduce any further changes to the Bill beyond those already published? Might some of the changes be welcomed by the committee? When might we see the Bill?
As other noble Lords have chosen to focus on specific recommendations, I will refer to the importance of improving users’ media literacy skills. One of the recommendations of the committee is that platforms should not be arbiters of the truth. The noble Baroness, Lady Featherstone, spoke of putting the power in our hands, but the ability to question and to interrogate is a crucial weapon in this.
The previous Minister and I had a number of exchanges on this important issue, but I remain unconvinced by the Government’s argument that the current duties on Ofcom are sufficient. There is indeed a strategy, but it is hard to see how that and the various education campaigns run by platforms are having the desired effect. We have seen the harms of disinformation and misinformation in recent years, particularly in regard to Covid vaccines. If the current approach to media literacy was working, those conspiracy theories would not have been as prevalent as they were.
Improving media literacy undermines those who spread misinformation—and that is what we need to do, because the best way to combat fake news is to teach people how to identify it. So could the Minister offer some comment on his view of the effectiveness of the various steps that have been taken or identified to be taken? Are they working and what still needs to be done?
We are of course this afternoon not going to solve all the issues the committee has raised, but this has been an extremely helpful holding debate as we wait for the Bill’s arrival. Once again, my thanks are due to members of the committee and to the chair for giving us that opportunity. I hope the Minister addresses many of the serious questions that were raised during the debate. I am sure we all agree that there is much work to do.
My Lords, I begin by thanking my noble friend Lord Gilbert for moving this debate on the committee’s report. I also thank noble Lords who are members of that committee for having the foresight to place digital regulation at the centre of public debate, especially in their report. Let me also thank all noble Lords, whether or not they are on the committee, for their contributions.
Before I turn to the specific recommendations made in the report, as noble Lords asked about one fundamental issue that lies at the heart of this debate—freedom of expression—I think it is worth looking at that. Your Lordships’ committee highlighted the importance of protecting freedom of expression online and, as was said by the noble Baroness, Lady Featherstone, this is an age in which the internet has brought huge opportunities for freedom of expression. It allows people from all over the world to exchange ideas at a speed and scale never seen before. We should not throw that out.
When I was lecturing on international business courses, we used to talk about this concept in academic terms as space-time compression leading to globalisation. This has been of huge benefit to mankind, and one of the challenges for countries where we have reasonably good internet access is how to spread that to the rest of the world. Sometimes that is via mobile devices, if the landlines are not good enough, but we should not forget the important progress we have made. We should also remember how we can harness the good side of that technology.
As a result, as my noble friend Lord Gilbert said, the largest tech platforms exercise great influence over public discourse. They determine what content people encounter online and can arbitrarily remove content, with no accountability and few routes for users to appeal. One of the interesting questions around this debate is that there are always tensions. We are talking about freedom of expression against security or safety, and also how we behave towards other people and who has the right to remove content or to be an arbiter. Sometimes we see a tension between property rights and freedom of expression, and we have to address how much we give those platforms, which can argue, “Well, it’s our space, we have a right to arbitrate on who can have that debate here”. We see that in the physical world as well, where certain schools and campuses ban speakers. There is a tension between freedom of expression and property rights. The number of issues just shows how difficult this is.
This is why the Online Safety Bill is so important. We will bring it back soon—as soon as possible. By that I mean sooner than possible, and “possible” is not “probable”, if that makes sense. I wish I could say more, but I am always warned by my officials to be very careful what I say, because of various processes. Noble Lords who have been in government will understand this.
For the first time, tech companies are going to be accountable to an independent regulator for the protection of children and tackling of illegal content, while also protecting freedom of expression. I am very grateful to the noble Lord, Lord Allan, for his points on the challenges and difficult issues that companies will have to overcome. It is not as simple as it sounds: we all want children to be protected, but it brings up lots of tensions and debate about how you do that and what the trade-offs are. But I am confident, having taken one Bill through this House, that we can rely on the wisdom of noble Lords to find an appropriate balance and address that tension. There is almost universal consensus on protecting children online but, as I said to the House yesterday, for adults we have to straddle that difficult tension between freedom of expression and protecting the vulnerable.
I hope that noble Lords will allow me to summarise some key changes to the Bill since the committee’s report. The noble Lord, Lord Davies, talked about fraud. That is covered under illegal content. I know that the committee made recommendations on content, and most noble Lords agree on the need to ensure that the Online Safety Bill includes strong protections against illegal content and criminal activity, while avoiding the removal of legal speech.
The Government have added provisions in the other place to establish how providers should determine whether content is illegal. We clarified how companies should determine whether content is illegal, protecting against both under-removal and over-removal of content, as the noble Lord, Lord Gilbert, alluded to. The Bill also includes strong protections for freedom of expression. Companies must have regard to freedom of expression when discharging their illegal content duties. I have no doubt that the noble Lord, Lord Allan, and I will have debates about what “due regard” means. Again, that is one of the issues we must address, and the largest platforms must set out what they are going to do to safeguard free speech.
The Government also welcome the committee’s endorsement of the importance of child safety. The strongest protections in the Bill remain those for children, but as the noble Lord, Lord Londesborough, said, how do we achieve that? How do we get there?
We have also addressed the committee’s concern that pornographic services were not captured in the Bill. We have made changes to require all websites which publish or host pornography to put robust checks in place to ensure that users are 18 years old or over. Again, as with many of these things, the question is how we deal with determined teenagers, who are often more tech-savvy than their parents and can run rings around them. We can put the best protections in place, but even the world’s best cybersecurity experts cannot stop hackers. So, we have to reduce this as much as possible, but I have to be honest: are we going to prevent the most determined and tech-savvy teenager from accessing content that we do not want them to access? That is a challenge, but we have to be honest about what we can and cannot do: what we can do through regulation, what companies themselves can do, but also what we can all do as society, as parents, as neighbours.
Let me turn to the committee’s recommendations on adult safety. We agree that platforms’ moderation decisions are inconsistent and opaque. That is why the duties in the Bill require major platforms to be transparent about and accountable for how they treat users’ content. We will continue to ensure that the Bill strikes the appropriate balance between safety and freedom of expression, though no doubt that debate will continue in this House. We have also added measures to give adults more control over who can contact them. Adult users will be given options to verify their identity—the noble Baroness, Lady Merron, asked about this—and to decide whether to interact with unverified users. We hope that this will empower adults to manage their personal online experience, while protecting the anonymity of those who may need it, such as victims of abuse. Again, there is a very difficult balance to strike: we must make sure that we can tackle those who are anonymous and malicious, but we also have to protect those who have to remain anonymous for fear of abuse turning into something worse.
A number of noble Lords, including the noble Lord, Lord Griffiths, mentioned a point that the committee rightly highlighted: the importance of platform design in keeping users safe online. We hope that the Bill will ensure that companies design their services to mitigate the risk of harm from illegal content, and to protect children. This has always been the policy intent. We clarified this in the other place by amending the Bill to include an explicit duty on companies to take measures relating to the design of their services. These changes will ensure that companies build in safety by design, managing the risk of illegal content and activity on their services, rather than mostly focusing on content moderation.
My noble friend Lady Stowell, the noble Viscount, Lord Colville, and others talked about digital markets regulation. The committee made a number of recommendations. The Government remain committed to establishing a pro-competition regime to boost competition in digital markets. We want to introduce new, faster, more effective tools to address the unique barriers to competition in digital markets. The Government will set out their plans for the new regime in a draft Bill during this legislative Session. As set out in the Plan for Digital Regulation, the Government are committed to ensuring that our regulators have the capacity and expertise to regulate effectively and proportionately.
The committee also recommended the creation of a new parliamentary Joint Committee to scrutinise the work of digital regulators. I am afraid I have to refer noble Lords back to the position the Government adopted in their response. The Government believe that such a permanent Joint Committee is unnecessary when we already have rigorous scrutiny provided by established committees, such as your Lordships’ committee and the DCMS Select Committee in the other place. However, the Government intend to work with Parliament to support scrutiny of the Online Safety Bill in a way that captures the skills and expertise in both Houses. We welcome further views during the passage of this Bill.
I turn to a number of the points raised specifically by noble Lords. I will start with my noble friend Lord Vaizey. I would like to ask him: what does he know that others do not know about the reshuffle? I hope this is not fake news to drive traffic to his podcast.
I should take this opportunity to pay tribute to my honourable friend Damian Collins for his expertise. I sat in on a fascinating meeting that the noble Baroness, Lady Kidron, organised last week with children’s groups. It was clear that he was on top of his brief. I have to admit that there will be a gap to fill, but I hope we will be able to fill it.
On that, I thank the noble Baroness, Lady Kidron, in her absence, for organising that round table, and the noble Lord, Lord McNally, and others who attended, for their comments. It was touching and moving, and gave me lots to think about. When I met Ian Russell, the father of Molly Russell, I said to him that we will do all we can to try to ensure this does not happen again. That is something I am sure noble Lords across the House agree on. We might disagree on how we do that, but let us keep that in mind as we go through the Online Safety Bill.
The noble Baroness, Lady Featherstone, was absolutely right: we have to equip our children to be robust enough to stand up to difficult arguments. I teach international politics. In my academic job, from which I am on leave of absence, my boss is a Marxist and I am a libertarian-minded Conservative, so we are at two different ends of the political spectrum. But we both agree that it is important not to indoctrinate our children but to expose them to arguments from across the political spectrum, and to let them decide and to argue and debate with each other. That gives them robustness, but it also allows them to think intellectually and develop. I agreed with the noble Baroness when she said that this is really important. We have to be very careful about mollycoddling our children and overprotecting them. We should expose them to arguments but also to the tools to argue back. I know that some noble Lords will disagree; the noble Baroness, Lady Fox, made remarks to that effect.
The noble Baroness, Lady O’Neill, made some fascinating points about respect and civility—I can tell why she is a philosopher. We also need to understand the issue of subjectivity. If someone says something and you are harmed, does that give you cause for redress? There is also an awful lot of hypocrisy in discussing freedom of speech. People often say that they are in favour of freedom of expression until they are offended, and then they are suddenly against it. I remember when I was in the European Parliament and there were the Danish cartoons of Muhammad. I am a practising Muslim. I was offended by some of the cartoons and I actually found some of them funny, but I did not think that they should be banned. I was happy to see the debate around them in a free society.
Then I took part in a debate and talked about the whiteness of the European political space, the lack of racial and ethnic diversity, some of the imperial ambitions of the EU and racism across the spectrum, including on the left, and I was asked to apologise because I had offended some people. The same people who extolled the virtues of freedom of expression were suddenly asking me to apologise because they did not like what I said. We have to be clear when we are concerned about something or are harmed or offended. We talk about freedom of expression: let us make sure we are consistent. Let us make sure not only that we feel free to say things, so long as they do not encourage violence against others, for example, but also that we are willing to be open to criticism in our own right. That makes for a stronger, more robust and more intellectually challenging society. From discourse comes liberty. That is an important point that we should not forget.
I can try to beat the clock. The noble Lord, Lord McNally, and the noble Baroness, Lady Uddin, talked about media literacy. It is a crucial skill for everyone in the digital age. Key media literacy skills are taught through a number of compulsory subjects in the national curriculum, but we need to make sure that it is always up to date; there are new challenges, and these curricula have to keep pace with them. We have the computing national curriculum, which builds digital literacy, and citizenship education—some noble Lords do not like the idea of that. We want to make sure that there is critical thinking in debates in relation to the proper functioning of democracy. The Department for Education is reviewing its Teaching Online Safety in Schools guidance and its non-statutory guidance, which provide advice and support on how to teach children to stay safe online. The DCMS and the Department for Education work closely to create a holistic, whole-of-government approach to supporting media literacy.
The noble Viscount, Lord Colville, asked about an Australian-style bargaining code. We are committed to defending media freedom and enhancing the sustainability of the press sector, and we hope that the pro-competition regime conduct requirements will improve transparency and allow large platforms to provide the businesses that rely on them with fair and reasonable terms. This will make an important contribution to the sustainability of the press. In addition, we are minded to pursue the use of a binding final-offer mechanism as a backstop to resolve challenging price-related disputes where needed. We will design the mechanism to boost competition in all digital markets and have been engaging with the Australian Government to understand the impact of their news media bargaining code on platforms and publishers. This regime presents just one aspect of the Government’s wider support for news publishers, and we will continue to consider all possible options in the interests of promoting and sustaining the sector. Once again, we are open to the wisdom and knowledge of noble Lords in this House on how we do that.
A number of noble Lords, including the noble Lords, Lord Strathcarron, Lord Vaizey and Lord Londesborough, asked about age verification. There will be clear requirements for companies to prevent children accessing harmful content, such as online pornography. Companies whose services are likely to be accessed by children will need to use a range of technology, including age verification, to comply with the new requirement. Age assurance and age verification have now been referenced in the Bill, which provides clear direction to Ofcom and companies about the measures we expect may be used where proportionate. The Bill will not mandate that companies use specific technologies to comply with their new duties. It is important that the Bill is future-proofed as much as possible: what is most effective today may not be effective in the future. Once again, noble Lords talked about issues such as VPNs and the ways around these measures, and there are other technologies that will challenge people’s safety. For example, I was told about face-scanning technologies and iris recognition for age verification, but is there something eerie about using that sort of technology? Do people feel concerned about that technology and the way the data is stored? Does it feel like a Big Brother society, or is it useful to society? There will be different views among noble Lords in this Chamber, but we have to understand the spectrum of views. We know that age-assurance technologies are developing rapidly and that their usage is growing.
The noble Lord, Lord Clement-Jones, talked about JS Mill. He knows that I am classically liberal-minded, so it is worth quoting Mill, who said that
“the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others”.
But there is disagreement over what is harmful, and JS Mill acknowledged that. When I was reading about this, I remember one paper saying that Mill does not say that the Government must always intervene to prevent one person harming another. Clearly, that is a philosophical discussion, and there are a number of interpretations of JS Mill, but it is important that we recognise some of those issues. I also thank the noble Lord, Lord Clement-Jones, for bringing that up so that I could digress into political philosophy.
This has been a fascinating debate. It has highlighted the arguments and tensions between online safety and freedom of expression, which I know we will return to during debates on the Online Safety Bill very soon. Let me once again thank all noble Lords for their wise contributions today and for exposing some of the challenges that we are going to face as we take that Bill through the Lords. I end by thanking the noble Lord, Lord Gilbert, for moving this debate. I look forward to continuing the debate and to working constructively with noble Lords as we chart our course through these new challenges.
I will be very brief. The internet, let us be clear, has given voice to many marginalised people and in so many ways has transformed our lives for the better. What we have seen today is a really serious and constructive debate about what we need to do to deal with the societal issues that have come with the digitalisation of the world that we live in.
I thank all noble Lords who gave such insightful contributions today, in particular my noble friend the Minister for his response, and especially the noble Baroness, Lady Merron, and the noble Lord, Lord Clement-Jones. What they demonstrated was that the House really wants to come together to fix these issues, and I hope that my noble friend will seek a cross-party approach to this legislation and engage the whole House in coming up with the solutions that we need to resolve these problems. Would he also thank his officials and the succession of Ministers who came to see us? His officials were very generous with their time.
I will also take this opportunity, on behalf of the committee, to thank Ofcom for engaging with us. I am confident that its people are the right people for this job; they will do an excellent job, and we need to hand them a seriously workable piece of legislation, while not forgetting our role as Parliament in asserting societal priorities as Ofcom moves forward with this task.