New Clause 14 - Inclusion of systems within the Algorithmic Transparency Reporting Standard

Public Authorities (Fraud, Error and Recovery) Bill – in a Public Bill Committee at 10:45 am on 18 March 2025.


“(1) For the purposes of this section, ‘system’ means—

(a) algorithms, algorithmic tools, and systems; and

(b) artificial intelligence, including machine learning

provided that they are used in fulfilling the purposes of this Act.

(2) Where at any time after the passage of this Act, the use of any system is—

(a) commenced;

(b) amended; or

(c) discontinued

the Minister must, as soon as reasonably practicable, accordingly include information about the system in the Algorithmic Transparency Reporting Standard.” —

This new clause would require the use of algorithms, algorithmic tools, and systems, and artificial intelligence, including machine learning, to be included within the Algorithmic Transparency Reporting Standard.

Brought up, and read the First time.

John Milne (Liberal Democrat, Horsham)

I beg to move, That the clause be read a Second time.

The new clause would require that the use of algorithms, algorithmic tools and systems, and artificial intelligence, including machine learning, should be included within the algorithmic transparency reporting standard. That standard, established by the Government, is supposed to be mandatory for all Government Departments. However, last November, The Guardian reported that not a single Whitehall Department has registered the use of AI systems since it was made mandatory.

Throughout debate on this issue, the Government have consistently downplayed the risk of using AI to trawl for suspect claimants, but if it really is that simple, why have so many organisations come out with concerns and opposition? That includes Age UK, ATD—All Together in Dignity—Fourth World, Amnesty International, Campaign for Disability Justice, Child Poverty Action Group, Defend Digital Me and Difference North East. I could go on: I have half a page, which I will spare the Committee from, listing organisations that have expressed concern. It is quite a roll call.

Governments can and will get things wrong; if history tells us anything, it tells us that. In June 2024, a Guardian investigation revealed that a DWP algorithm had wrongly flagged 200,000 people for possible fraud and error; it found that two thirds of housing benefit claims marked as high risk in the previous three years were in fact legitimate, but thousands of UK households every month had their housing benefit claims wrongly investigated. Overall, about £4.4 million was wasted on officials carrying out checks that did not save any money. We know that more mistakes will happen, no matter how hard we try to avoid them. I therefore ask the Minister to support the insertion of new clause 14 as a small measure of defence against future institutional failings.

Rebecca Smith (Opposition Assistant Whip, Commons)

As we have heard, Liberal Democrat new clause 14 would require the use of algorithms, algorithmic tools, and systems, and artificial intelligence, including machine learning, to be included in the algorithmic transparency reporting standard. I have obviously just heard the comments of the hon. Member for Horsham, but I would be interested to know precisely what the Liberal Democrats are aiming to achieve with this new clause and how such reporting would better enable the Government to crack down on fraud and error. Is that the intention behind the new clause?

Andrew Western (The Parliamentary Under-Secretary of State for Work and Pensions)

I share the support expressed by the hon. Member for Horsham for the algorithmic transparency recording standard as a framework for capturing information about algorithmic tools, including AI systems, and ensuring that public sector bodies openly publish information about the algorithmic tools used in decision-making processes that affect members of the public. However, I do not think the new clause is a necessary addition to the Bill, and I will explain why.

First, all central Government Departments, including the DWP and the Cabinet Office, are already required to comply with the standard as appropriate. We are committed to ensuring that there is appropriate public scrutiny of algorithmic tools that have a significant influence on a decision-making process with public effect, or that directly interact with the public. We have followed and will continue to follow the guidance published by the Department for Science, Innovation and Technology on this to ensure the necessary transparency and scrutiny.

Secondly, I remind the Committee that although the DWP and PSFA are improving their access to relevant data through the Bill, we are not introducing any new use of machine learning or automated decision making in the Bill measures. I can continue to assure the House that, as is the case now, a human will always be involved in decisions that affect benefit entitlement.

Thirdly, although I do not wish to labour the point yet again, I remind the Committee that the Bill introduces new and important safeguards, including reporting mechanisms and independent oversight, demonstrating our commitment to transparency and ensuring that the powers will be used proportionately and effectively. The DWP takes data protection very seriously and will always comply with data protection law. Any information obtained will be kept confidential and secure, in line with GDPR.

John Milne (Liberal Democrat, Horsham)

I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.