Thank you. In this area I need correcting quite a lot, so I welcome that intervention. For all that, the concern for children is picked up in the online harms White Paper: certain harms, when they affect children, are stipulated in the list of harms there.
I cannot forbear mentioning this morning’s news about Facebook. If it can put £5 billion aside to pay for the infractions which have occurred in its activities, and if for all that £5 billion it can report a 26% rise in its profits, we simply have to ask: are we in waters that are too deep for us to swim in? There are contradictory elements happening that I find very threatening and bewildering.
I want to go back for a moment. I have compared the list of harms on page 31 of the online harms White Paper—I will do it more systematically before the debate next week on that paper—with the harms, hinted at by the noble Lord, Lord Bilimoria, listed in the Plum report: individual, societal and economic. There are so many harms identified in the Plum report that do not figure at all in the list in the DCMS online harms report. We had a Question for Short Debate on this when the paper was published and I was on a wave of euphoria, because after all that Brexit stuff we were talking about real things again. I really was flying, but afterwards the noble Baroness, Lady Neville-Rolfe, said to me, “But there’s nothing about online harms for small and medium enterprises”. Then the noble Baroness, Lady O’Neill, came to me and said, “But there’s nothing in there about the harms for our democracy”. In the end, the paper needs to be more generic and to range more comprehensively across the spectrum than it currently does.
Let us look at the harms in the Plum report. There is,
“Digital advertising fraud … brand risk”,
“Inappropriate advertising … that is … offensive, explicit … or … contains malware”.
Under “Societal harms” there is,
“financial support for publishers of offensive or harmful content”.
There is also “Discrimination”, described as the use of targeted data that may inadvertently categorise people by gender, ethnicity and race.
There is a moment of confession coming up—wait for it. Every morning, I generally address the quick crossword from the Guardian newspaper, and if I do it very quickly I allow myself to do just a couple of exercises in solitaire. That really is a confession; I am trying to avoid addiction, and coming off it cold turkey is very difficult. But when I put those things on the screen, along with the crossword a very expensive car is advertised to me. I am a retired Methodist minister and when I came to this House, I came in my rusty Ford Fiesta. On what grounds of behavioural knowledge and profiling of me are they targeting me with a Bentley? When I come to solitaire, however, what do I find? It is ladies’ clothing. What in my life do they know that I would not want to share with Members of this House, for goodness’ sake? Is that clothing for my wife or some other woman, or for myself, if they think that I am interested in these garments? The more alarming thing still is that I have repeatedly pressed all the buttons that eliminate the advert from the screen, but no algorithm has yet picked up the fact that I am not interested in the advertising. If it is behaviourally driven it should have, but it has not; I still get the stuff anyway.
When Martin Moore wrote his book, he took us through all the stages that have produced this side of the internet. We must agree with the right reverend Prelate the Bishop of Durham and others who laud the democratic and communautaire aspects of the internet and what it makes possible for us. At the same time, I fear that the negative aspects—the underbelly or darknet—are becoming disproportionately controlling of the general aspects of this technology. One review said that, although Facebook started by saying that it did not want any advertising when it first launched in 2004, just before it went to the stock market in 2012 it went, according to Martin Moore,
“‘all out to create an intelligent, scalable, global, targeted advertising machine’ that gave advertisers granular access to users. And so it created the most efficient delivery system for targeted political propaganda the world had ever seen”.
I will read one final paragraph from the review of this remarkable book, because it points me to both what the internet can do and what is too often implicit in the very things it does well. If I read it, your Lordships will get the tale. It says:
“Actually, Google is already doing a very good job”,
at helping in the field of education. It continues that:
“By mid-2017, the majority of schoolchildren in America were using Google’s education apps, which of course track the activity of every child, creating a store of data that—who knows?—might come in useful when those children grow up to be attractive targets for advertising. In the near future, Moore points out, we might no longer have a choice: ‘It will be a brave parent who chooses to opt out of a data-driven system, if by opting out it means their child has less chance of gaining entry to the college of their choice, or of entering the career’”,
they aspire to. You are in because it is good for you, but being in makes you vulnerable to exploitation in the fullness of time. This is a matter about which we all must be concerned, because it affects us in incalculable ways. Even a Minister of Government as expert as the one facing me now will have to bow to the inevitable, as we stiffen our resolve to face this question head on and do something about it.