Examination of Witnesses

Product Security and Telecommunications Infrastructure Bill – in a Public Bill Committee at 9:29 am on 15 March 2022.

John Moor, Dave Kleidermacher and Dan Patefield gave evidence.

Graham Stringer (Labour, Blackley and Broughton), 10:19, 15 March 2022

Q We will now hear oral evidence from John Moor, managing director of the IoT Security Foundation; Dave Kleidermacher from Google and the Internet of Secure Things Alliance; and Dan Patefield, head of programme, cyber and national security at techUK. We have until 11.25 am for this session. Can I ask the witnesses to introduce themselves, starting with Dan?

Dan Patefield:

Good morning, everyone. I am Dan Patefield. I lead the cyber-security programme at techUK, which is the national trade association for the digital and technology sectors.

John Moor:

Just before I introduce myself, let me say that it is an honour to be here. This represents a milestone moment for me, seven years in the making. Seven years ago, I set out on this journey to understand what IoT cyber-security was about and its challenges, so I am honoured to represent our membership and the executive steering board. I am John Moor, managing director of the IoT Security Foundation.

Dave Kleidermacher:

Hi, everyone. My name is Dave Kleidermacher; hopefully you can hear me okay. I am the Google vice-president of engineering responsible for the security and privacy of the Android operating system, the Google Play app store, and “Made by Google” products including Pixel phones, Nest smart home products and Fitbit wearables. I am responsible for security and privacy, including the certification strategy for the company—how we assess and demonstrate compliance with security standards and privacy standards.

Graham Stringer (Labour, Blackley and Broughton)

Thank you. I will move straight to the Minister for questions.

Julia Lopez (Parliamentary Secretary (Cabinet Office), Minister of State)

Q Thank you, Mr Stringer, and thank you to all the witnesses who have come here today.

John, you rather touched on the challenge: this is an area that is very dynamic. All of us are learning what the security risks are, and in Government—which often moves very slowly—it is a particular challenge to manage such a dynamic, changing picture. That is why, in this legislation, we have set out some broad principles and basic requirements, but a lot of this has to be done through secondary legislation so that we can keep up to speed with all the changes that are going to be happening to connected devices, and some of the risks that will come with that. I think it would be very helpful if you could set out for the benefit of the Committee how this picture has changed over the past few years, where you think things will be moving, the extent to which connected devices will be in our homes in future, and some of the security risks that those devices will present.

John Moor:

When I started out seven years ago, I was invited to take a look by the chairman of the organisation I was working for at the time, the National Microelectronics Institute. He was the CEO of an IoT company. I confess, I had not seen what the challenge was, so when he invited me—“John, go and take a look at IoT cyber-security”—I thought, “Why me? What’s the challenge? Isn’t this thing just a tiny part of a well-established body of knowledge about cyber-security, and why me?” My background is in electronic engineering—semiconductors.

As it turned out, when I went and had a look, it did not take me very long to realise, “My goodness, there is a real problem here.” I remember that at the time, a word I was using often was “egregious”. As effectively a student coming into it, trying to understand the space, I looked at the evolution of computing, broadly speaking. In one era, we had computers—desktops, laptops—and we connected them up, and the security around those was pretty dire at one point, but we started to get on top of that. It is not perfect now, but it is a lot better than it used to be, and we are all very familiar now with doing security updates. The next phase was mobile. Mobile was not quite as bad as the era of PCs. It was better—still a few problems, but much, much better. Then we got to this thing called IoT, and it took a complete reset. It was totally egregious.

I come from the world of embedded systems engineering, and one of the first events we did was a summit we ran at Bletchley Park in 2015, just to do a landscape piece—just to try to understand it from chips to systems, bringing in the regulator. We had a representative of what was then the Communications-Electronics Security Group, but is now the National Cyber Security Centre, to try to understand where the issues were. Part of the problem, I think, is illustrated by what I learned there as an embedded systems guy. We had a pen tester there, and he said, “If a researcher comes knocking on your door, don’t turn him away.” I thought, “That is a really interesting thing. What is he talking about?” We were talking about vulnerability disclosure. For someone who comes from embedded, air-gapped systems, security was not a thing. It does not take you long to realise that when you start connecting things up, suddenly you expand this thing called an attack surface. Attackers can come from many sources, not in proximity to the thing that you are working on. Suddenly, you have this massive attack surface.

The whole idea of IoT—the internet of things—is about connecting things up, so by its very nature, you are vulnerable. These things can come at you from many angles. What does that mean? It means different things to different people. I tried to understand what this thing called security was about. I immersed myself in the security community and straight away I realised there were different groups. If people start talking to me about data, they are usually coming from a data security or information assurance-type background. If they talk to me about availability of systems—keeping systems up—they usually come from an operational technology background. What I mean by that is the sort of things we find in industry—process and manufacturing.

Then we have this thing called IoT. One of our board members expressed it very well. He called it the “invasion of IoT”. What I took from that is that this technology is coming at us, ready or not. We established in those early days that we needed to have a response. The need is now. We could not wait for new standards and regulation, which is why we set up the IoT Security Foundation. Our centre of gravity is in best practice. It is saying, “Can we help manufacturers who do not yet see that the very fact that they are starting to connect things up poses a risk?” They did not, but now we are in a much better state. The body is developing.

I am delighted to be here to talk about this regulation. More needs to be done, without a doubt. A seminal moment for me was at the very first summit that I talked about. We had the chief technology officer of ARM, a chap called Mike Muller, give a talk in which he said, “The ugly truth is this: you will get hacked.” That was quite an epiphany for me, because coming from an engineering background, we engineer our systems to be virtually perfect, but what we are witnessing now is that security is a movable feast that evolves. Out in the wild, things change. New vulnerabilities are discovered. Yes, you can do all you can to engineer it up front, but guess what? Once it is in the wild, this thing called resilience is so important. What that means, especially in terms of this regulation, is the software updating part and, in particular, the vulnerability disclosure. They are absolutely essential parts. That is part of what I have learned on the way.

I have come to refer to IoT security as a “wicked challenge”. By that I mean that I do not think we will ever perfectly fix it, because it is always moving, but we can address it. We can mitigate the risks to a level that we are comfortable with and can accept. Again, another phrase I learned is, “Don’t let perfect be the enemy of the good.” This is all good. This is progressive. This is what the world needs. Having been part of the regulatory process to get here today, it became apparent to me that getting regulation right is so difficult. It is so easy to get it wrong, but having gone through the process, this is a regulation that we can wholeheartedly back. We think it is absolutely the right thing. It takes a step; it gets us on that security journey. We often talk about an on-ramp of security. It is about maturity. In terms of regulation, this is a fantastic first step, but more will come. The way it has been set up is exemplary. We can evolve it over time as we have to ratchet up the security for the benefit of consumers and society. I hope that little ramble gives you some idea about my journey and where I think we are at.

Julia Lopez (Parliamentary Secretary (Cabinet Office), Minister of State)

Q It would be helpful if the other witnesses could also set out the context from their perspective. I am particularly interested in Google’s view, given that it is a company with vast resources and a lot of expertise. There is a challenge for smaller operators in how to fulfil basic security requirements; how do you think the basic set of requirements will help start that conversation with people who may not even have thought about the security of their devices before the legislative requirements come in?

Dave Kleidermacher:

Let me start by saying I am so appreciative of the leadership role that the UK Government have taken to help us get to a better place for IoT security. I have been working closely with the Department for Digital, Culture, Media & Sport and NCSC for the past couple of years leading up to this. I have worked on how to measure security in digital technology for almost 20 years, and I believe that the lack of transparency in what the security ingredients are for digital technology has been one of the headwinds facing the entire digital world, even before the IoT was called the IoT. Of course, the IoT has made it much more urgent that we address this.

I agree that the minimum requirements we are talking about here are a really good starting point, but as we move forward and look at the secondary legislation, the really big challenge is how we scale this. The question about smaller developers is something that I am quite concerned about. At Google, we build our own first-party products but we also develop global-scale platforms. On Android, we have many manufacturers of devices across all different price points. We have millions of app developers across the world with whom we connect and work in all sorts of different environments.

One of the biggest challenges is how to monitor and measure these requirements, and how to make that work for small businesses in particular. That is the area I have personally been putting a lot of time into over the past couple of years. How do we build and establish an actual practical mechanism or scheme for measuring security at scale? There are a lot of details that go into that, but at the end of the day, we need a hub and spoke model. I can give you an example of a failure mode. The UK is, again, taking a leadership role, but many countries are looking at similar kinds of ideas and legislative concepts. The problem is that if every single country decides to create its own testing scheme for how to measure this, imagine how difficult it would be to have, say, a webcam or smart display, and then go to each country and provide documentation, provide the test results, explain how it works and go through a testing mechanism for every country.

As an example, for our Nest Wifi products, Google has made public commitments and been transparent about our desire to have third-party independent security labs test the products and assess compliance with these common-sense requirements. We have been doing that for a while now. We certify all of our products that way, but then a couple of countries at the leading edge of this started to ask us to certify again against their own schemes, and we did. That was a lot of work, to test to one scheme and certify and then do the same for another country with a different set of rules. The product did not change at all; it did not get any better because we were already certifying it. However, the work and the cost of doing that were significant. If we scale that to the full IoT, to all the countries which are interested in this—they all should be—then you can imagine how quickly it breaks down.

The hub and spoke model is looking at how we can work together to build a public-private partnership where there are non-government organisations, typically well-regarded international standards bodies, which take the great standards that we are developing, such as the ETSI EN 303 645 international specification on security requirements, which the UK has led in developing, and translate that into a practical conformance regime. An NGO can take that specification and the test specification—a sister specification, ETSI TS 103 701—and test a product once to have it certified for use in all of the different nations which adopt the same standard. That is the trick to this—the hard part that has to be solved as we move forward.

Dan Patefield:

I think John and Dave have already mapped out the ever-growing risk landscape, so I will not reiterate that. From an industry perspective, there is clearly strong support for the ambitions of the Bill we have been discussing today, in implementing a minimum baseline that everyone should work to. Certainly, large swathes of industry are going beyond that, as Dave has outlined. I think I would join the other panellists in commending DCMS on the leadership that it has shown in developing the framework, not just with this legislation, but with the code of practice in 2018. I also commend it for playing a key role in developing the globally recognised standard in this space, EN 303 645—I always get that number wrong. The challenge that we have, and I am sure that we will come on to this, is that the code of practice—we supported its development and engaged industry in it—created an outline for best practice. However, it was never prescriptive; it was broadly focused. The practical challenge now is translating that into regulation that is workable for industry and consumers. I am sure we will move on to that, so I will leave it there.

Julia Lopez (Parliamentary Secretary (Cabinet Office), Minister of State)

Q Dan, you touched on the challenge about the need for simplicity, so that this very complex area is at least understood on a basic level—a general hygiene that everybody needs to apply. Ultimately, there is a need to thrash out a lot of this via secondary legislation. I wondered to what extent that basic requirement has helped you have conversations with other members of your organisation who may not have been aware of some of the challenges coming down the line. Also, does the basic three-point requirement that we will be introducing help the conversation with consumers about what they need to do, and some of the things that they need to be demanding of products when it comes to security?

Dan Patefield:

Going back to the code of practice, I am confident that across all 13 of those areas many companies have made good progress, and will continue to develop best practice that goes far beyond those requirements. I think it is a good approach to start with the three requirements that are included in the Bill; it is not the case that industry will be surprised by what comes out in secondary legislation. The practical challenge is translating the non-prescriptive code of practice into something that will be more prescriptive by definition.

There are a number of areas where I think there is more work to be done to smooth the path to compliance, if you like. We have got various elements. We have got the standard—that is not going to be a surprise. We know the security requirements—they are not a surprise. What we have not got is the boring bit—the technical specification that people in compliance teams within manufacturers are worried about. Quite often they have to then communicate that to their HQs—which are often in different parts of the world—and say, “We have got legal certainty that this is how it is going to work and this is how we achieve compliance”. That is the bit that we have not yet got.

Julia Lopez (Parliamentary Secretary (Cabinet Office), Minister of State)

Q Just one final question for Dave Kleidermacher. You talk in your submission about not having static labels, but live labels. Can you take me through how that would look in practice for the consumer?

Dave Kleidermacher:

It is a really important distinction, as we look at the so-called security ingredients in digital products. The analogy to food is a good one, but it also has its limits. What is good about it is that consumers deserve to have information at their disposal to be able to make better decisions about their health; in the case of food, that is their physical health, but in the case of digital technology it is their digital health. The concept that a consumer should easily be able to get a sense of the security status of a product is a very good idea. The main challenge, however, is that food contents do not typically change, so a printed label works okay. In the digital world, though, it could happen that you ship a product today and then there is a severe critical vulnerability, perhaps a hardware problem, that cannot effectively be mitigated or even patched. If that happens—even a day after you have shipped it; this is a worst-case scenario—then an attestation on a static label that the product is “secure” or meets these requirements could immediately be incorrect. In fact, it could be dangerously misleading and give consumers a false sense of security, so I believe that, while the ingredients label is essential, the consumer needs to have transparency and visibility here.

That label needs to be a live label. A simple example would be a QR code on packaging, although I am not sure how much consumers really go back to their packaging. We should also stress in-product experience wherever that is practical. It will not be practical in the case of every electronic product, but there is typically an app to manage many of our consumer IoT products. The app can provide an experience where the consumer can get the real-time, current status. That status can be as simple as a link that takes you to the certification page. As I mentioned earlier, we can have NGOs that establish the conformance programmes that we need to help to measure the security. It could just take you to the certification page to see the real-time status. If a product is deemed unsafe for use, it will become decertified, and the user will then know it.
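
For illustration only, here is a minimal sketch of how a companion app might resolve a “live label” into a real-time certification status, along the lines Dave Kleidermacher describes. The registry URL, endpoint and field names are hypothetical and are not part of any scheme mentioned in the evidence.

    import json
    from urllib.request import urlopen

    # Hypothetical certification registry endpoint (for illustration only).
    REGISTRY_URL = "https://certification.example.org/api/v1/products/{product_id}"

    def live_label_status(product_id: str) -> str:
        """Return a human-readable, real-time certification status for a product
        identifier, e.g. one decoded from a QR code on packaging or shown in a
        companion app."""
        with urlopen(REGISTRY_URL.format(product_id=product_id)) as response:
            record = json.load(response)
        # A product can be decertified after shipping if a serious, unpatchable
        # vulnerability is found, so the status is read at query time rather
        # than printed once on the box.
        if record.get("certified") and not record.get("revoked"):
            return "Certified against " + record.get("scheme", "an unknown scheme")
        return "Not currently certified - check the manufacturer's guidance"

    # Example use (hypothetical identifier):
    # print(live_label_status("example-smart-display-001"))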

Chris Elmore (Opposition Whip (Commons), Shadow Minister (Digital, Culture, Media and Sport))

Q Thank you, Mr Stringer. This is only for Mr Patefield, unless anybody else wants to come in, of course. You talked, in answer to the Minister, about implementation and getting to the specifics of how that is delivered. In your evidence you refer to manufacturers and retailers being concerned about the timescales of the Bill, specifically the 12 months. I wonder whether you could expand on that, as I think you wanted to in your previous answer, and specifically on how secure devices could become obsolete because of the speed at which the changes would have to be implemented within the 12 months of the Bill’s introduction.

Dan Patefield:

There are two points on the timescales. There is the point at which the grace period will begin. For industry, we strongly think that that should be when the regulatory framework is confirmed and we know who the regulator is. That is the point at which that countdown should start. There are different views in industry on how long an appropriate grace period would be. Obviously, DCMS has confirmed that it will be no less than 12 months. Once we see that technical specification, a lot of parts of industry will have interpreted the code of practice in such a way that complies, so that will not be a problem for them, but some might have an interpretation that the compliance framework rules out—for example, around passwords. They might have to go back, certainly for security requirement 1, and make a hardware change. For a lot of these products, the supply chains are enormously long. Take a projector coming over from Malaysia. That will be 15 weeks in transit, and eight weeks getting through the broader supply chain in the UK through distributors and re-sellers. That already reduces the 12 months to seven months for manufacture and design. That is the difficulty that some manufacturers might face.
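
As a rough check on the timeline just described, the short sketch below works through the witness’s illustrative figures (15 weeks in transit, eight weeks in UK distribution) against a 12-month grace period; the figures are his examples, not firm industry data.

    # Rough worked example using the illustrative figures given in evidence.
    WEEKS_PER_MONTH = 52 / 12

    grace_period_months = 12
    transit_weeks = 15
    uk_distribution_weeks = 8

    supply_chain_months = (transit_weeks + uk_distribution_weeks) / WEEKS_PER_MONTH
    remaining_months = grace_period_months - supply_chain_months

    print(f"Supply chain consumes ~{supply_chain_months:.1f} months")        # ~5.3 months
    print(f"Left for design and manufacture: ~{remaining_months:.1f} months") # ~6.7, roughly seven months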

To the obsolescence point, there are two points again. In terms of when this comes in, we have to communicate it to consumers in such a way that it does not cause them to think that any devices that they currently have are obsolete in any way. That is a communication piece. It is about DCMS and the Government broadening that out, and helping consumers to understand what the legislation is for. More broadly, I am sure that we will come to the timescales for security updates, but we do not want that to turn into some kind of perceived sell-by date. That is the minimum period we will give you security updates for, but the device is not useless after two or three years. Both those elements might lead to an increase in electronic waste and the kind of things that we want to avoid in a practical framework.

Graham Stringer (Labour, Blackley and Broughton)

Do either of the other two witnesses wish to comment?

Dave Kleidermacher:

I would like to make a quick comment. Especially as we look forward in time—beyond the minimum requirements to the larger set codified in ETSI EN 303 645, and to extended requirements even beyond that—there will be a desire in different vertical markets to have additional requirements. For example, on the Android side, a Google-certified Android device already meets the baseline requirements, so we are working with NGOs on how to define higher levels. For instance, the strength of a biometric is really important on a smartphone, and that is not currently covered by the baseline requirements.

As we go forward, there will be an increasing set of requirements, and there is a way to balance that challenge. You will always hear of some manufacturers, including smaller ones, that have more difficulty meeting a certain requirement in a certain timeframe, and one way to help balance that is by focusing more on transparency about whether the requirement is met, versus requiring that all those requirements be met. I like to say that transparency is the tide that raises all boats. That is the key.

To go back to our analogy with food, it is not that on a label it says that you cannot have more than 50 grams of something; it is that you can compare the number of grams of carbohydrates and other ingredients between products. If you look at EN 303 645 and all its provisions—there are many—you could ask manufacturers simply to attest as to whether those are met. Yes, I still believe that there are minimum requirements that are critical, but in as much as we run into some difficulties on timeframes, you could just ask them to state whether they meet those requirements. That transparency will still be really valuable for consumers. Again, the NGOs that are setting up those conformance schemes can take the attestations of yes or no across the requirements and translate that into a health score, if you will, to help consumers make better decisions.
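
A minimal sketch of the idea described here—per-provision yes/no attestations summarised into a single score—follows. The provision names and the simple unweighted scoring are hypothetical simplifications for illustration, not drawn from ETSI EN 303 645 or any real conformance scheme.

    from typing import Mapping

    def health_score(attestations: Mapping[str, bool]) -> float:
        """Return the percentage of attested provisions that are met (0-100)."""
        if not attestations:
            return 0.0
        met = sum(1 for satisfied in attestations.values() if satisfied)
        return 100.0 * met / len(attestations)

    # Hypothetical attestations from a manufacturer against a handful of provisions.
    example = {
        "no_universal_default_passwords": True,   # akin to security requirement 1
        "vulnerability_disclosure_policy": True,  # akin to security requirement 2
        "keep_software_updated": True,            # akin to security requirement 3
        "secure_communication": False,            # an extended provision, not met
    }
    print(f"Health score: {health_score(example):.0f}/100")  # prints 75/100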

Graham Stringer (Labour, Blackley and Broughton)

Thank you. John, did you wish to add anything?

John Moor:

Yes, I have a few points to make. First and foremost, most of my comments are about the here and now: what we are looking at, what is in front of us and the three requirements that are coming. Our assumption and that of our members is that, as we add to that, there will be an equally robust and rigorous process to determine what might follow. That is essential.

The labelling question is really interesting, along with certifications and attestations. All we can say about certification is that, under these conditions, on this day, in these tests, those requirements were satisfied. I have heard the discussion about food labelling schemes come up time and time again as a “We ought to do something like that”, but in our view that is not really practical.

One of the things that I had to get my head round when I came into this space was some people talking to me, saying, “Safety and security are the same, aren’t they, John?” I had never had to get my head around that in the past, but I thought about it for about an hour, and I concluded, “Actually, they are not the same.” They are not the same because safety is much more determinable. You can define the situation, the operating environment, the characteristics, the materials, etc., and you can figure out, “This is safe under these conditions.” The difference in security is that it is dynamic—there is a changing environment, there is a human adversary at the other end. We might consider something to be safe today, as David said, but that changes over time.

Where do we place our trust? Do we place it in the product? I do not know that we do. Do we want to be looking up thousands of products to see what the certificates are? Where we really place our trust is in the companies that provide those products. It is interesting that, of the three provisions that we are talking about, only one is really related specifically to the product, and that is passwords. The other two are really about the processes that are involved in the providers of the technology—vulnerability disclosure and keeping the software updated.

I do think that certification is useful, but it is not a panacea; it only goes so far. What we are really looking for is something that we would term “continuous assurance”. How do you do continuous assurance? That is the question for the industry to answer going forward, but some of the mechanisms that we have used in the past do not map well into a future world that is changing rapidly.

That is on the labelling front. It should be as simple as possible for consumers and for the producers of the technology. There is a discussion about whether we need another label. Certainly, many of our members favour integrating this into something that is already known. For example, could it become part of a CE labelling scheme, so that we add the security elements too? Those processes are well known.

Some of the discussions among our members about keeping software updated come down to considering what a reasonable period for updates is. If you make it too short, that commitment is almost meaningless, and consumers probably will not buy a product if the updates stop after, let’s say, only six months. If the period is too long, the company is carrying a financial legacy burden. What is the right point? I think we will find that out. Is it three years, five years, one year? We do not quite know yet. My own view is that it should be a length of time that is beyond the life cycle of the product. In that regard, it is variable and I do not know how that would quite be implemented, but that is what we have in front of us. For the here and now, this is what we are talking about; as for the future, we are assuming the same rigour.

In my view, security is an awful lot like quality. As we go into the digital world, we will see profound changes not only in the way that we use products, but also in how they are produced. We already know that: among our membership, whole engineering teams have been reconstructed. The selling of physical products must be reviewed too, because are we really buying a physical product? Often we are not; often we are buying a service. Do we actually own it? No, we don’t.

Those are things that we will be working out as we go forward. We must understand those limitations as we do that, because we do not want to be taking the past into the future when the future looks quite a lot different from the past.

Chris Elmore (Opposition Whip (Commons), Shadow Minister (Digital, Culture, Media and Sport))

Q One final question for techUK, about part 2. Lots of the organisations that you represent talk about the digital connectivity divide within cities and large towns, and about access to flats for upgrades and automatic upgrades. You have said that the Bill could go further to deal with overground infrastructure and automatic upgrades for flats to resolve the problem. Could you expand on that and tell us what your members say about the challenges they face, because this is not just about rural roll-out or semi-rural roll-out, but about changing infrastructure, including in boroughs such as Hackney and Camden—places where you would not automatically think there were connectivity issues?

Dan Patefield:

I will lead on that question. techUK would be happy to give more thoughts on that in a written submission, but it is not an area I focus on. Internally, we split the Bill; I lead on the cyber-security element and another colleague leads on telecoms infrastructure. I am happy to get that question answered in a written submission.

Graham Stringer (Labour, Blackley and Broughton)

If there are no other questions from Committee members, I thank our witnesses for their time and contributions. I am sure that when Committee members come to consider the Bill in detail they will find those comments very helpful. Thank you.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Adjourned till this day at Two o’clock.