My Lords, it always used to be said that reaching the end of your Lordships’ day was the graveyard slot. This is a bit of a vice slot. You are tempted by the growing number of people coming in to do a bit of grandstanding and to tell them what they are missing in this wonderful Bill that we are discussing. You are also conscious that the dinner hour approaches—and I blame the noble Baroness, Lady Hamwee, for that. All her talk of dining in L’Algorithme, where she almost certainly had a soup, a main course and a pudding, means that it is almost impossible to concentrate for the six minutes that we will be allowed—with perhaps a few minutes more if we can be indulged—to finish this very important group. It has only one amendment in it. If noble Lords did not know that, I bet that has cheered them up. I am happy to say that it is also a réchauffage, because we have already discussed most of the main issues, so I will be very brief in moving it.
It is quite clear from our discussion on the previous group that we need an ethics body to look at the issues that we were talking about either explicitly or implicitly in our debates on the previous three or four groups and to look also at moral and other issues relating to the work on data, data protection, automation and robotics, and everything else that is going forward in this exciting field. The proposal in Amendment 78A comes with a terrific pedigree. It has been brought together by members of the Royal Society, the British Academy, the Royal Statistical Society and the Nuffield Trust. It is therefore untouchable in terms of its aspirations and its attempt to get to the heart of what should be in the contextual area around the new Bill.
I shall not go through the various points that we made in relation to people’s fears, but the key issue is trust. As I said on the previous group, if there is no trust in what is set up under the Bill, there will not be a buy-in by the general public. People will be concerned about it. The computer will be blamed for ills that are not down to it, in much the same way that earlier generations always blamed issues external to themselves for the way that their lives were being lived. Shakespeare’s Globe was built outside the city walls because it was felt that the terribly dangerous plays that were being put on there would upset the lieges. It is why penny dreadfuls were banned in the early part of the last century and why we had a fight about video nasties. It is that sort of approach and mentality that we want to get away from.
There is good—substantial good—to be found in the work on automation and robotics that we are now seeing. We want to protect that but in the Bill we are missing a place and a space within which the big issues of the day can be looked at. Some of the issues that we have already talked about could easily fit with the idea of an independent data ethics advisory board to monitor further technical advances in the use and management of personal data and the implications of that. I recommend this proposal to the Committee and beg to move.