Data protection impact assessment

Data Protection Bill [Lords] – in a Public Bill Committee at 3:15 pm on 15 March 2018.


Louise Haigh, Shadow Minister (Home Office) (Policing), 3:15, 15 March 2018

I beg to move amendment 142, in clause 64, page 37, line 2, leave out “is likely to” and insert “may”.

David Hanson (Labour, Delyn)

With this it will be convenient to discuss the following:

Amendment 143, in clause 64, page 37, line 2, leave out “high”.

Amendment 144, in clause 64, page 37, line 15, leave out “is likely to” and insert “may”.

Amendment 145, in clause 64, page 37, line 15, leave out “high”.

Amendment 146, in clause 65, page 37, line 19, leave out subsection (1) and insert—

“(1) This section applies where a controller intends to—

(a) create a filing system and process personal data forming part of it, or

(b) use new technical or organisational measures to acquire, store or otherwise process personal data.”

Amendment 147, in clause 65, page 37, line 23, leave out “would” and insert “could”.

Amendment 148, in clause 65, page 37, line 23, leave out “high”.

Amendment 149, in clause 65, page 37, line 44, at end insert—

“(8) If the Commissioner is not satisfied that the controller or processor (where the controller is using a processor) has taken sufficient steps to remedy the failing in respect of which the Commissioner gave advice under subsection (4), the Commissioner may exercise powers of enforcement available to the Commissioner under Part 6 of this Act.”

New clause 3—Data protection impact assessment: intelligence services processing—

“(1) Where a type of processing proposed under section 103(1) may result in a risk to the rights and freedoms of individuals, the controller must, prior to the processing, carry out a data protection impact assessment.

(2) A data protection impact assessment is an assessment of the impact of the envisaged processing operations on the protection of personal data.

(3) A data protection impact assessment must include the following—

(a) a general description of the envisaged processing operations;

(b) an assessment of the risks to the rights and freedoms of data subjects;

(c) the measures envisaged to address those risks;

(d) safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Part, taking into account the rights and legitimate interests of the data subjects and other persons concerned.

(4) In deciding whether a type of processing could result in a risk to the rights and freedoms of individuals, the controller must take into account the nature, scope, context and purposes of the processing.”

New clause 4—Prior consultation with the Commissioner: intelligence services processing—

“(1) This section applies where a controller proposes that a particular type of processing of personal data be carried out under section 103(1).

(2) The controller must consult the Commissioner prior to the processing if a data protection impact assessment prepared under section [Data protection impact assessment: intelligence services processing] indicates that the processing of the data could result in a risk to the rights and freedoms of individuals (in the absence of measures to mitigate the risk).

(3) Where the controller is required to consult the Commissioner under subsection (2), the controller must give the Commissioner—

(a) the data protection impact assessment prepared under section [Data protection impact assessment: intelligence services processing], and

(b) any other information requested by the Commissioner to enable the Commissioner to make an assessment of the compliance of the processing with the requirements of this Part.

(4) Where the Commissioner is of the opinion that the intended processing referred to in subsection (1) would infringe any provision of this Part, the Commissioner must provide written advice to the controller and, where the controller is using a processor, to the processor.

(5) The written advice must be provided before the end of the period of 6 weeks beginning with receipt of the request for consultation by the controller or the processor.

(6) The Commissioner may extend the period of 6 weeks by a further period of one month, taking into account the complexity of the intended processing.

(7) If the Commissioner extends the period of 6 weeks, the Commissioner must—

(a) inform the controller and, where applicable, the processor of any such extension before the end of the period of one month beginning with receipt of the request for consultation, and

(b) provide reasons for the delay.

(8) If the Commissioner is not satisfied that the controller or processor (where the controller is using a processor) has taken sufficient steps to remedy the failing in respect of which the Commissioner gave advice under subsection (4), the Commissioner may exercise powers of enforcement available to the Commissioner under Part 6 of this Act.”

Louise Haigh, Shadow Minister (Home Office) (Policing)

The amendments in my name, and in the names of my right hon. and hon. Friends, are all designed to strengthen the requirement to conduct impact assessments, and to require permission from the Information Commissioner for the purposes of data processing for law enforcement agencies. Impact assessments are a critical feature of the landscape of data protection, particularly where new technology has evolved. It is vital that we have in place enabling legislation and protective legislation to cover new technologies and new methods of data collection and processing.

Since the introduction of the Data Protection Act 1998, the advance of technology has considerably increased the ability of organisations to collect data, as we have discussed. The impact assessment as envisaged would be conducted where there are systematic and extensive processing activities, including profiling, and where decisions have legal effects, or similarly significant effects, on individuals. An assessment could also be conducted where there is large-scale processing of special categories of data, or of personal data relating to criminal convictions or offences, and where there is a high risk to rights and freedoms—for example, based on the sensitivity of the processing activity.

Given the breadth and reach of new technology, it is right that impact assessments are conducted where the new technology may present a risk, rather than a “high risk”, as envisaged in the Bill. That is what we seek to achieve with the amendments. New technology in law enforcement presents a unique challenge to the data protection and processing environment. The trialling of technology, including facial recognition and risk assessment algorithms, as already discussed, has not been adequately considered by Parliament to date, nor does it sit easily within the current legal framework. I do not doubt that such technologies have a significant role to play in making law enforcement more effective and efficient, but they have to be properly considered by Parliament, and they need to have adequate oversight to manage their appropriate use.

Facial recognition surveillance was mentioned in Committee on Tuesday. The Minister was right to say that it is being trialled by the Metropolitan police, but it has been trialled for three years running; I suggest that it is no longer a trial. It is also being used by South Wales police and other police forces across the country, particularly when policing large events. The Metropolitan police use it in particular at the Notting Hill carnival.

In September last year, the Policing Minister made it clear in response to a written question that there is no legislation regulating the use of CCTV cameras with facial recognition. The Protection of Freedoms Act 2012 introduced the regulation of overt public space surveillance cameras. As a result, the surveillance camera code of practice was issued by the Secretary of State in 2013. However, there is no reference to facial recognition in the Act, even though it provides the statutory basis for public space surveillance cameras.

Neither House of Parliament has ever considered or scrutinised automated facial recognition technology. To do so after its deployment—after three years of so-called trialling by the Metropolitan police—is unacceptable, particularly given the technology’s significant and unique impact on rights. The surveillance camera commissioner has noted that “clarity regarding regulatory responsibility” for such facial recognition software is “an emerging issue”. We urgently need clarity on whether the biometric commissioner, the Information Commissioner or the surveillance camera commissioner has responsibility for this use of technology. Our amendments suggest that the Information Commissioner should have scrutiny powers over this, but if the Minister wants to tell me that it should be any of the others, we will be happy to support that.

Clearly, there needs to be some scrutiny of this very important and invasive technology, which provides recommendations to law enforcement agencies to act, to stop and search and, potentially, to detain people. There are still no answers as to what databases law enforcement agencies are matching faces against, what purposes the technology can and cannot be used for, what images are captured and stored, who can access those images and how long they are stored for.

In 2013, the Government said that the Home Office would publish a forensics and biometrics strategy. Five years on, that strategy has still not been published. The deadline has been missed by quite some time. I appreciate that they have said that they will publish it by June 2018, but in the meantime many of these emerging technologies are being used with absolutely no oversight and, as the Minister said, no legal basis. That simply cannot be acceptable.

There are other issues with the use of facial recognition technology. It is used extensively in the United States, and several studies have found that commercial facial recognition algorithms have in-built biases and issues around demographic accuracy. In particular, they are more likely to misidentify women and black people. That might be because of bias coded into the software by programmers, or it might be because of an underrepresentation of people from black and minority ethnic backgrounds and women in the training datasets. Either way, the technology that the police are currently using in this country has not been tested against such biases.
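For illustration only: the testing being called for here reduces to comparing a matcher’s error rate across demographic groups. The sketch below is a minimal, hypothetical version of such a test; the data, group labels and function names are invented for the example and do not describe any system in police use.

```python
# Hypothetical sketch of a demographic-accuracy test for a face matcher.
# Each trial records the subject's demographic group, whether the system
# declared a match, and whether that declared match was in fact correct.
from collections import defaultdict

def wrong_match_share(trials):
    """trials: iterable of (group, declared_match, correct_match) tuples.
    Returns, per group, the share of trials with an incorrect declared match."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, declared, correct in trials:
        totals[group] += 1
        if declared and not correct:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Invented data showing the kind of disparity the studies cited above report.
trials = ([("group_a", True, False)] * 8 + [("group_a", True, True)] * 92
          + [("group_b", True, False)] * 1 + [("group_b", True, True)] * 99)
print(wrong_match_share(trials))  # {'group_a': 0.08, 'group_b': 0.01}
```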

Surely that testing is urgently needed when we consider the issues that the Home Secretary and the Prime Minister have tried to tackle around the disproportionate use of stop-and-search powers against black and minority ethnic populations, and the issues around trust in the police that that has engendered. Why are we not concerned about the same issues with this very invasive technology that could recreate those exact same biases?

The facial recognition software used by the South Wales police has not been tested against those biases either, but this is not just about facial recognition software. Significant technologies and algorithms are being used by law enforcement agencies across the country. We have already discussed the algorithm used to make recommendations on custody. Automatic number plate recognition has been rolled out across many forces—we will discuss a code of practice for that when we come to a later amendment. Fingerprint-scanning mobile devices have recently been rolled out across West Yorkshire police. I mentioned earlier, in relation to another amendment, that South Yorkshire police is now tagging individuals who frequently go missing.

It was brought to my attention this morning that South Yorkshire police and Avon and Somerset police have a technology that allows them to track the movements of mobile phone users within a given area and intercept texts and calls. These are called international mobile subscriber identity—IMSI—catchers. They mimic cell towers, which mobile phones connect to in order to make and receive phone calls and text messages. When they are deployed, every mobile phone within an 8 sq km area will try to connect to the dummy tower. The IMSI catchers will then trace the location and unique IMSI number of each phone, which can then be used to identify and track people.
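To make the mechanism just described concrete, here is a minimal sketch of the kind of record such a device accumulates as handsets attach to it. The field names and structure are assumptions for illustration; the actual systems are not publicly documented.

```python
# Hypothetical sketch of an IMSI catcher's log: every handset in range
# attaches to the dummy cell and is recorded, not only any intended target.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ImsiObservation:
    imsi: str           # unique subscriber identity broadcast by the handset
    seen_at: datetime   # moment the handset attached to the dummy tower
    signal_dbm: int     # received signal strength, usable for coarse location

def distinct_handsets(observations):
    """Every distinct phone swept up within the coverage area."""
    return {obs.imsi for obs in observations}

log = [ImsiObservation("234150000000001", datetime.now(timezone.utc), -70),
       ImsiObservation("234150000000002", datetime.now(timezone.utc), -85)]
print(len(distinct_handsets(log)))  # 2
```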

Those are all worrying invasions of the privacy of individuals who have not been identified by the police as being about to commit criminal activity and who are not wanted by the police or other law enforcement agencies. In that last example, they are simply people who happen to be within the 8 sq km area in which the police would like to track and intercept people’s phones.

It may be that every one of those technologies is being used proportionately and necessarily, and that we would all be happy about the way that they are being used. However, if there is no basis in law and no commissioner overseeing the use of these technologies, and if Parliament has never discussed them, surely this is the opportunity to ensure that that happens, to give people confidence that the police and other enforcement agencies will be using them proportionately and not excessively.

Furthermore, the police national database currently contains over 21 million images of individuals, more than 9 million of whom have never been charged with or convicted of any offence. The biometrics commissioner has already said that it is completely unacceptable for the Home Office to retain those images when it has no good reason to do so. Doing so would also be a clear breach of clause 47, which covers the right to erasure, when there is no reasonable need for the police national database to contain those images. That raises issues for facial recognition software, because if we are matching people’s faces against a database that has no legal right to hold those faces, that would already be a breach of the Bill, even unamended.

I hope the Minister will accept that there are good reasons for these amendments or, if she can, assure me that these existing and emerging technologies will be covered by the Bill, and that a relevant commissioner will provide oversight, both before any technology or new method of data collection and processing is rolled out by law enforcement, and afterwards, when an individual’s data rights may have been abused. We need clear principles about what purposes any of these technologies can and cannot be used for, what data is captured and stored, who can access that data, how long it is stored for and when it is deleted. I am not convinced that the Bill as it stands protects those principles.

Liam Byrne, Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy), 3:30, 15 March 2018

I rise briefly to support my hon. Friend’s excellent speech. The ambition of Opposition Members on the Committee is to ensure that the Government have in place a strong and stable framework for data protection over the coming years. Each of us, at different times in our constituencies, has had the frustration of working with either local police or their partners and bumping into bits of regulation or various procedures that we think inhibit them from doing their job. We know that at the moment there is a rapid transformation of policing methods. We know that the police have been forced into that position because of the pressure on their resources. We know that there are police forces around the world beginning to trial what is sometimes called predictive policing or predictive public services, whereby, through analysis of significant data patterns, they can proactively deploy police in a particular way and at a particular time. All these things have a good chance of making our country safer, bringing down the rate of crime and increasing the level of justice in our country.

The risk is that if the police lack a good, clear legal framework that is simple and easy to use, very often sensible police, and in particular nervous and cautious police and crime commissioners, will err on the side of caution and actually prohibit a particular kind of operational innovation, because they think the law is too muddy, complex and prone to a risk of challenge. My hon. Friend has given a number of really good examples. The automatic number plate recognition database is another good example of mass data collection and storage in a way that is not especially legal, and where we have waited an awfully long time for even something as simple as a code of practice that might actually put the process and the practice on a more sustainable footing. Unless the Government take on board my hon. Friend’s proposed amendments, we will be shackling the police, stopping them from embarking on many of the operational innovations that they need to start getting into if they are to do their job in keeping us safe.

Stuart McDonald, Shadow SNP Spokesperson (Immigration, Asylum and Border Control)

I will speak briefly in support of amendments 142 to 149, as well as new clauses 3 and 4. As it stands, clause 64 requires law enforcement data controllers to undertake a data protection impact assessment if

“a type of processing is likely to result in a high risk to the rights and freedoms of individuals”.

That assessment would look at the impact of the envisaged processing operations on the protection of personal data, at the degree of risk, at the measures to address those risks and at the possible safeguards. If the impact assessment showed a high risk, the controller would have to consult the commissioner under clause 65.
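To put in concrete terms the difference the amendments would make, here is a minimal sketch of the two trigger thresholds. The three-way risk grading and the function names are assumptions made for illustration; they are not the Bill’s own language.

```python
# Hypothetical sketch of the two trigger thresholds under debate.
from enum import Enum

class Risk(Enum):
    NONE = 0
    SOME = 1   # a real but not high risk to rights and freedoms
    HIGH = 2

def dpia_required(risk, as_amended):
    # Clause 64 as drafted: an assessment only where a HIGH risk is likely.
    # Amendments 142 to 145: an assessment wherever processing MAY pose a risk.
    return risk is not Risk.NONE if as_amended else risk is Risk.HIGH

def must_consult_commissioner(assessed_risk):
    # Clause 65 as drafted: prior consultation with the commissioner where
    # the completed assessment indicates a high risk.
    return assessed_risk is Risk.HIGH

print(dpia_required(Risk.SOME, as_amended=False))  # False under the Bill as drafted
print(dpia_required(Risk.SOME, as_amended=True))   # True under the amendments
```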

It is important to be clear that the assessment relates to a type of processing; nobody is asking anyone to undertake an impact assessment every time processing occurs. With that in mind, the lower threshold for undertaking an assessment suggested in the amendments seems appropriate. We should be guarding not just against probable or high risks, but against any real risk. The worry is that if we do not put these tests in place, new forms of processing will not be appropriately scrutinised. Facial recognition technology, which has already been discussed, is an apt example.

New clauses 3 and 4 do a similar job for the intelligence services in part 4, so they also have our support.

Darren Jones (Labour, Bristol North West)

I rise to support the amendments in the name of my hon. Friend the Member for Sheffield, Heeley. I had the pleasure of cross-examining Baroness Williams of Trafford, the Minister responsible for some of these issues, on the Select Committee on Science and Technology in our inquiry into the biometrics strategy and the reasons for the Government’s long delay in publishing that document. We had grave concerns about the delay to the strategy, but also about the fact that IT systems and servers in different forces operate in different ways, which makes things potentially very difficult.

The amendments would add safeguards to legitimate purposes—to prevent them from going too far. They should be welcomed by the Government and included in the Bill. There are a number of situations where, in this developing area of technology, which could be very useful to us as a country, as my hon. Friends have said, we need to ensure that the appropriate safeguards are in place. On facial recognition, we know from information received by the Science and Technology Committee that there is too high a number of facial records on the police national database and other law enforcement databases, when there is no legitimate reason for them to be there. We understand that it is difficult to delete them, but that is, with respect, not a good enough answer.

The Select Committee also heard—I think I mentioned this in an earlier sitting—that we have to be careful about the data that the Government hold. The majority of the adult population already has their facial data on Government databases, in the form of passport and driving licence imagery. When we start talking about the exemptions to being able to share data between different Government functions and law enforcement functions, and the exemptions on top of that for the ability to use those things, we just need to be careful that it does not get ahead of us. I know it is difficult to legislate perfectly for the future, but these safeguards would help to make it a safer place.

I will briefly mention the IMSI catchers, because the issue touches my constituency of Bristol North West. It was the Bristol Cable, a local media co-operative of which I am a proud member—I pay £1 a month, so I declare an interest—that uncovered some of the issues around IMSI catchers and the bulk collection of information. It is really important that, in debates such as those we have had on algorithms and artificial intelligence, we recognise that human intervention in, and human understanding of, some of these systems is difficult. Very few people understand how the algorithms or the systems actually work. As they become more advanced and learn and make decisions by themselves, human intervention or human understanding becomes ever more difficult.

In a situation where human resource is extremely stretched, such as in the police service, the tendency will understandably be to rely on the decisions of the systems within the frameworks that are provided, because there is not time to do full human intervention properly. That is why the safeguards are so important—to prevent things getting ahead of us. I hope the Government support the amendments, which I think are perfectly sensible.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

I have just a small correction. The hon. Member for Sheffield, Heeley said in error that the Home Office was holding on to the photographs. It is not the Home Office; it is individual police forces that hold them.

Louise Haigh, Shadow Minister (Home Office) (Policing)

No, it is on the police national computer. That falls under the responsibility of the Home Office, not individual forces.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

That is run by the police. I do not want the misapprehension to be established that there is an office in the Home Office in Marsham Street where these photographs are held on a computer. They are held on the police national computer, which is a secure system that requires security clearance to access. It is not completely accurate to say that the Home Office has possession of them.

I want to reassure the hon. Lady. The picture she painted was of various systems that are unregulated, but that is not the case. Where they involve the processing of personal data, they will be caught by the Bill and the 1998 Act. Other statutory provisions may also apply—for example, the provisions of the Police and Criminal Evidence Act 1984 relating to biometric information—and the surveillance camera commissioner will have a role in relevant cases. Facial recognition systems in particular are covered by the 1998 Act and the Bill, because they process personal data. Any new systems that are developed will be subject to a data protection impact assessment.

Law enforcement processing of ANPR data for the purpose of preventing, detecting, investigating and prosecuting crime will be conducted under part 3 of the Bill. When the data is processed by other organisations for non-law enforcement purposes, such as the monitoring of traffic flows, the data will be processed under part 2 of the Bill.

Part 3 of the Bill puts data protection impact assessments on a statutory footing for the first time. The purpose of such impact assessments is to prompt a controller to take action and put in place safeguards to mitigate the risk to individuals in cases in which processing is likely to result in a high risk to the rights and freedoms of their personal data. For example, under clause 64 the police will be required to carry out a data protection impact assessment before the new law enforcement data service—the next-generation police national computer—goes live. Clauses 64 and 65 faithfully transpose the provisions of the law enforcement directive, and the provisions in part 4 faithfully give effect to draft Council of Europe convention 108.

Amendments 142 to 145 would extend the scope of the requirements in clause 64 so that a formal impact assessment would have to be carried out irrespective of the likelihood or significance of the risk. That would place overly burdensome duties on controllers and strain their resources, with limited benefit to data subjects.

Louise Haigh, Shadow Minister (Home Office) (Policing), 3:45, 15 March 2018

I would be grateful if the Minister could confirm that all the examples we have raised today will fall under the “high risk” category in the Bill.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

I will deal with the definition of high risk in a moment. Clause 64 separates out the processing most likely significantly to affect an individual’s rights and freedoms, which requires an additional level of assessment to reflect the higher risk. The amendments would water down the importance of those assessments. That is not to say that consideration of the impact on rights and freedoms can be overlooked. It will, of course, remain necessary for the controller to carry out an initial assessment to determine whether a full impact assessment is required. Good data protection is not achieved by putting barriers in the way of processing; it is about considering the risk intelligently and applying appropriate assessments accordingly.

On the question of high risk, officers or data controllers will go through that process when considering whether a data protection impact assessment is required. I will write to the hon. Lady to clarify whether the examples she listed will be defined as high risk. The fact is that they are none the less regulated by various organisations.

Matt Warman (Conservative, Boston and Skegness)

The crucial point—I do not think the Opposition disagree with it—is that, although some things contain an element of risk, there are also huge benefits. Surely nobody wishes to do anything that prevents law enforcement from using hugely advantageous new technology, which will allow it to divert its resources to even more valuable areas.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

Indeed. A pertinent example of that is the development of artificial intelligence to help the police categorise images of child sexual exploitation online. That tool will help, given the volume of such offences now being committed across the world. It will also help the officers involved in those cases, because having to sit at a computer screen and categorise some of these images is soul-breaking, frankly. If we can use modern technology and artificial intelligence to help categorise those images, that must surely be a good thing.

Louise Haigh, Shadow Minister (Home Office) (Policing)

There is absolutely no argument over that. As a former special constable myself, I have no wish to put obstacles in the way of law enforcement. There is a particular need to develop technology to help digital investigations, and I think the Government have been delaying that. Human failures in those investigations have led to the collapse of several trials over the past couple of months.

The Minister says that the surveillance camera commissioner has a role, but the commissioner has said that there needs to be further clarity on regulatory responsibility: it is not clear whether the surveillance camera commissioner, the biometrics commissioner or the Information Commissioner has responsibility for facial recognition software. Does she accept that the Government urgently need to provide clarity, as well as guidance to the National Police Chiefs’ Council and police forces, about the use of this potentially invasive software?

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

Specifically on clause 64, which concerns the data protection impact assessment, the judgment as to whether the proposed processing is high risk must be a matter for the controller. On the face of it, many of the systems that the hon. Lady described in her speech will involve high risk, but with respect the decision is not for me to make as a Minister on my feet in Committee. We must allow data controllers the freedom and responsibility to make those assessments. They are the ones who make those decisions and who deal with what flows from them in terms of processing.

If the hon. Lady will write to me on the wider point about oversight by the surveillance camera commissioner and so on, I would be happy to take that up outside Committee.

Louise Haigh, Shadow Minister (Home Office) (Policing)

The issue about whether it is high risk is of course a matter for the data controller, but we are scrutinising this Bill, and the Minister is asking us to support a test of high risk. I am sure the whole Committee would agree that all the cases that have been suggested today involve an incredibly high risk. They involve deprivation of liberty and invasion of privacy. The idea that we would accept a definition of high risk that does not cover those examples is too much for the Opposition to support. That is why the amendment exists. We need to test exactly what the Government envisage in the definition of high risk.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

May I just clarify whether the hon. Lady intends to amend her amendment to list the various categories she mentioned in her speech? I have been very clear that high risk is defined as including processing where there is a particular likelihood of prejudice to the rights and freedoms of data subjects. I would be very cautious about listing examples in the Bill through an amendment because, as we have all acknowledged, criminality and other things develop over time. It would be very bold to put those categories in the Bill.

Louise Haigh, Shadow Minister (Home Office) (Policing)

No one is suggesting that such examples should go in the Bill. I appreciate that this is the Minister’s first Bill Committee, but the job of the Opposition is to test the definitions in the Bill and ensure that it is fit for purpose. My concern is that the threshold of high risk is set too high to capture these law enforcement uses, and will allow egregious breaches of individuals’ data rights, privacy rights and right to liberty. It is our job as the Opposition—there is nothing wrong with us exercising this role—to ensure that the Bill is fit for purpose. That is what we are seeking to do.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

I am extremely grateful to the hon. Lady for clarifying her role. My answer is exactly as I said before. High risk includes processing where there is a particular likelihood of prejudice to the rights and freedoms of data subjects. That must be a matter for the data controller to assess. We cannot assess it here in Committee for the very good reason put forward by members of the Committee: we cannot foresee every eventuality. Time will move on, as will technology. That is why the Bill is worded as it is, to try to future-proof it but also, importantly, because the wording complies with our obligations under the law enforcement directive and under the modernised draft Council of Europe convention 108.

Liam Byrne, Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy)

Does the Minister not have some sympathy with the poor individuals who end up as data controllers for our police forces around the country, given the extraordinary task that they have to do? She is asking those individuals to come up with their own frameworks of internal guidance on what is high, medium and low risk. The process she proposes has real potential to manufacture bureaucracy, which will be difficult for police forces. We are trying to help the police to do their job, and she is not making it much easier.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

Clause 65(2) states:

“The controller must consult the Commissioner prior to the processing if a data protection impact assessment prepared under section 64 indicates that the processing of the data would result in a high risk”.

There are many complicated cases that the police and others have to deal with. That is why we have guidance rather than putting it in statute—precisely to give those on the frontline the flexibility to say, “This situation has arisen, and we need to calibrate the meaning of high risk and take that into account when we look at the prejudice caused to a person or a group of people.” That is what we are trying to encompass. Presumably, that is what the Council of Europe and those involved in drafting the law enforcement directive thought as well.

Of course, there will be guidance from the Information Commissioner to help data controllers with those assessments, so that we get a consistent approach across the country. That guidance, rather than the face of the Bill, is the place to address these concerns.

Louise Haigh, Shadow Minister (Home Office) (Policing)

Can the Minister confirm that the Metropolitan police consulted the Information Commissioner before trialling facial recognition software? I appreciate that she might not be able to do so on her feet, so I will of course accept it if she wishes to write to me.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

I am afraid that I will have to write to the hon. Lady on that.

The intention behind this part of the Bill is not to place unnecessary barriers in the way of legitimate processing. Nor, we all agree, should we place additional burdens on the commissioner without there being a clear benefit. These provisions are in the Bill to address the need for an intelligent application of the data protection safeguards, rather than assuming that a one-size-fits-all approach results in better data protection.

Amendment 149 would insert a new subsection (8) into clause 65, which would permit the commissioner to exercise powers of enforcement if she was not satisfied that the controller or processor had taken sufficient steps to act on her opinion that intended processing would infringe the provisions in part 3. It is worth noting that the purpose of clause 65 is to ensure consultation with the commissioner before processing takes place. It is therefore not clear what enforcement the commissioner would be expected to undertake in this instance, as the processing would not have taken place. If, however, the controller sought to process the data contrary to the commissioner’s opinion, it would be open to her to take enforcement action in line with her powers outlined in part 6.

I do not know, Mr Hanson, whether we have dealt with new clauses 3 and 4.

David Hanson (Labour, Delyn)

New clauses 3 and 4 are being considered as part of this group, but they will not be voted on until consideration of the clauses of the Bill has been completed. If you wish to respond to them, Minister, you can do so now.

Victoria Atkins, Parliamentary Under-Secretary of State for the Home Department and Minister for Women

I am grateful; I will deal with them now. New clauses 3 and 4 would place additional obligations on the intelligence services. New clause 3 would require the intelligence services to undertake a data protection impact assessment in cases where there is

“a risk to the rights and freedoms of individuals”,

whereas new clause 4 would require the intelligence services to consult the Information Commissioner before any proposed processing. Neither new clause reflects the unique form of processing undertaken by the intelligence services, its sensitive nature or the safeguards that already exist.

I should stress that the “data protection by design” requirements of clause 103 are wholly consistent with draft modernised Council of Europe convention 108, which was designed to apply to the processing of personal data in the national security context, and which therefore imposes proportionate requirements and safeguards. Under clause 103, in advance of proposing particular types of processing, the intelligence services will be obliged to consider the impact of such processing on the rights and freedoms of data subjects. That requirement will be integrated into the design and approval stages of the delivery of IT systems that process personal data, which is the most effective and appropriate way to address the broad aim. Furthermore, clause 102 requires the controller to be able to demonstrate, particularly to the Information Commissioner, that the requirements of chapter 4 of part 4 of the Bill are complied with, including the requirement in clause 103 to consider the impact of processing.

The impact assessment requirements of the general data protection regulation and the law enforcement directive were not designed for national security processing, which is outside the scope of EU law. Given the need to respond swiftly and decisively in the event of terrorist acts or actions by hostile states, any unnecessary delay to the intelligence services’ ability to deal with such threats could clearly have serious consequences. The new clauses are therefore inappropriate and could prejudice the lawful and proportionate action that is required to safeguard UK national security and UK citizens. Having explained our reasoning behind clauses 64 and 65, I hope that the hon. Member for Sheffield, Heeley will withdraw her amendment.

Louise Haigh, Shadow Minister (Home Office) (Policing), 4:00, 15 March 2018

I remain concerned that the Bill leaves gaps that will enable law enforcement agencies and the police to go ahead and use technology that has not been tested and has no legal basis. As my right hon. Friend the Member for Birmingham, Hodge Hill said, that leaves the police open to having to develop their own guidance at force level, with all the inconsistencies that would entail across England and Wales.

The Minister agreed to write to me on a couple of issues. I do not believe that the Metropolitan police consulted the Information Commissioner before trialling the use of facial recognition software, and I do not believe that other police forces consulted the Information Commissioner before rolling out mobile fingerprint scanning. If that is the case, continuing with the existing arrangements in the legislation is not sufficient. I hope that before Report the Minister and I can correspond with a view to strengthening the measures. With that in mind, and with that agreement from the Minister, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 64 ordered to stand part of the Bill.

Clauses 65 and 66 ordered to stand part of the Bill.

Clause 67