The noble Lord, Lord Gilbert, has set out very admirably the findings of our report, particularly our concerns about the opaque nature of advertising and the lack of diversity in the industry. There is also a palpable concern that current education policy does not meet the needs of the creative industries and that the Department for Education is perhaps not in listening mode; and that the looming spectre of Brexit threatens London’s status as the premier hub, and first-choice location, of such a lucrative industry.
Rather than go over the ground that has been so admirably covered, I would like to pick up on one key issue that I believe would have been in the report if it were being published now and is clearly on the minds of other noble Lords; it became a considerable concern as we went on to our second report, Regulating the Internet. I refer to the monetisation, commercialisation and commodification of children online. In doing so, I refer noble Lords to my interests as listed in the register.
Children make up one-third of all online users globally and are therefore, like all users, subject to the business models of the world’s most popular online platforms, much of whose value comes from the commodification of data. That business model is to harvest from users as much personal data as possible, then use that data to encourage them into behaviours and decisions likely to generate profit—that is, to advertise, market or otherwise make the user available to those who wish to have their attention. As a result, children are bombarded with targeted advertising, irrespective of whether it is in their best interests. The platforms they use are designed to keep them online for as long as possible, even to the point of addiction. This is why the majority of companies that provide online services are incentivised not to care if their users are underage. If a user creates data, they create value; if they create value, then they are old enough.
Since this goes far beyond what we traditionally understand as advertising, it is perhaps useful to consider how it plays out in practice. I would guess, though I cannot be certain, that not many of our small number have played Pokémon GO, a game that takes players out of the house to locate and collect virtual creatures in real-world places. But the chances are that even those who have played it—children included—do not know that the game’s real prize is not the collection of virtual creatures but, rather, the sale of the user’s location data to companies willing to pay.
The commercial arrangement between Pokémon GO maker Niantic and McDonald’s is the most prominent example of this. McDonald’s pays Niantic to place virtual Pokémon in its car parks and restaurants, thereby directing droves of oblivious children towards Big Macs, fries and chicken nuggets, exactly as the game intends. If this were an outlier, it would still be an affront, but targeting children is a growing norm.
In 2017, a leaked Facebook memo provoked outrage when it revealed that the company had given a presentation to advertisers demonstrating its ability to infer emotional states, in real time, from the posts and photos of millions of children, determining when they were feeling “stressed”, “nervous”, “overwhelmed”, “anxious” or “useless”. In other words, it was targeting children with advertising when they were at their most vulnerable.
This sort of profiling and targeting is a new frontier—not advertising as we once understood it, but using a child’s emotional state to help predict and shape their behaviour and then nudging them at the point they are most likely to respond. In more straightforward language, it is making them available to advertisers and marketeers at the precise moment that they are most vulnerable to the push of that commercial interest. This is not a fair fight.
Even if children are feeling their best, they are still vulnerable. Research on children’s cognitive development vis-à-vis advertising shows again and again that they are unable to spot native advertising—that is, advertising that adopts the look, feel and function of the media format in which it appears; it is designed to be indistinguishable from, and therefore to undermine, other content such as facts or news. No wonder Ofcom finds that only a fifth of eight to 11 year-olds and a third of 12 to 15 year-olds can differentiate between promotional and factual content, understand that prominent search results have probably been paid for, or identify and resist the nudge towards in-app purchase. The committee’s report correctly identifies that,
“many businesses exploit users’ data without informed consent”.
We must surely also ask whether it is appropriate to seek the consent of a child to treat them in any of these ways. Profiling, manipulating and targeting children is wrong in principle and harmful in practice.
The age-appropriate design code, launched in draft last week by the Information Commissioner, offers a new approach. It states that a child’s data must be processed only in circumstances where they are actively and knowingly engaged, and for purposes in their own best interests. This children’s code, as it is now nicknamed, will require online services—including the advertising sector—to reconsider how they treat children online by making them observe the norms and protections of childhood, including protecting children from economic pressure and exploitation. The code’s 16 provisions cover a number of interconnected aspects of data protection, such as high-privacy default settings, preventing sites from recommending material detrimental to children’s health and well-being, and strategies to minimise the gathering of data—since the very best way of avoiding abuse of a child’s data is not to take it in the first place. The code also covers data sharing, the security of connected toys and the promotion of commercial activities that fall short of the bar of being in the best interests of the child. Its 16 provisions effectively take children out of the excesses of the business model.
Since its publication on
There is nothing intrinsically wrong with the technology we are all using. On the contrary, within it lies the promise of a better and more equitable world. However, a greedy corporate culture has been allowed to develop and until now the sector has been given a free pass for the collateral damage of its model, including the monetisation, commercialisation and commodification of childhood. Rather than questioning whether businesses should protect their bottom line, we must reassert that protecting children should be everyone’s bottom line.
So does the Minister agree with me that innovation that does not include protecting the well-being of children is not worthy of the name, and that businesses in the sector, big and small, must put the best interests of children first when designing their products and services? Can he also confirm that the Government will stand firm behind the Information Commissioner, whose children’s code is much admired around the world as the first serious attempt to tackle the asymmetry of power between the tech sector and children, and resist the attempts of the commercial interests working furiously in the background to water it down?
Finally, if advertising now includes the ability to take a child out of their bedroom, out of their home and across town to a McDonald’s car park without their knowledge, understanding or informed consent, does the Minister agree with me that it is now time for society to formally uphold all the privileges, protections and legal frameworks that have defined childhood so far, irrespective of the nature of the service, who is paying for access to that child or where the owner is registered?