Amendment 111ZA

Employment Rights Bill - Report (3rd Day) – in the House of Lords at 4:47 pm on 21 July 2025.

Moved by Lord Clement-Jones

111ZA: After Clause 34, insert the following new Clause—
“Workplace AI risk and impact assessments
(1) Before implementing or developing an AI system which may have significant risks or impacts on employment rights and conditions in the workplace, an employer must conduct a workplace AI risk and impact assessment (a “WAIRIA”).
(2) A WAIRIA must be conducted under this section if there is a potential significant risk or impact on—
(a) the identification or exercise of rights;
(b) recruitment;
(c) work access or allocation;
(d) remuneration or benefits;
(e) contractual status, terms or conditions;
(f) mental, physical or psychosocial health.
(3) A WAIRIA conducted under subsection (1) must—
(a) document the intended purpose and functionality of the AI system;
(b) establish a process for undertaking the monitoring of significant risks and impacts;
(c) document the definitions, metrics and methods selected for the WAIRIA.
(4) Employers must review and update the WAIRIA—
(a) at least once every 12 months,
(b) whenever substantial changes are made to the AI system to which it relates, or
(c) when evidence emerges of unforeseen significant risks or impacts.
(5) The Secretary of State must require any Fair Work Agency to issue guidance on the conduct, disclosure and enforcement of WAIRIAs within 6 months of this section coming into force.”

Lord Clement-Jones, Liberal Democrat Lords Spokesperson (Science, Innovation and Technology)

My Lords, Amendment 111ZA seeks to introduce a requirement for workplace AI risk and impact assessments. This amendment is focused on addressing the profound and rapidly evolving impact of artificial intelligence systems on the modern workplace. There are many opportunities for its adoption but also risks and impacts. There is potentially massive job displacement: AI could displace 1 million to 3 million UK jobs overall. There are workplace skills gaps; more than half the UK workforce lacks essential digital skills, and the majority of the public has no AI education or training.

AI recruitment algorithms have resulted in race and sex discrimination. There are legal vulnerabilities. Companies risk facing costly lawsuits and settlements when unsuccessful job applicants claim unlawful discrimination by AI hiring systems. Meanwhile, AI adoption accelerates rapidly, and the UK’s regulatory framework is lagging behind.

Organisations such as the Trades Union Congress and the Institute for the Future of Work have consistently highlighted the critical need for robust regulation in this area. The TUC, through its Artificial Intelligence (Regulation and Employment Rights) Bill, drafted with a multi-stakeholder task force, explicitly proposes workforce AI risk assessments and emphasises the need for worker consultation before AI systems are implemented. It also advocates for fundamental rights, such as a right to a human review for high-risk decisions. IFOW similarly calls for an Accountability for Algorithms Act that would mandate pre-emptive algorithmic impact assessments to identify and mitigate risks, ensuring greater transparency and accountability in the use of AI at work. Both organisations stress that existing frameworks are insufficient to protect workers from the potential harms of AI.

When I spoke to a similar amendment—Amendment 149—in Committee, the Minister acknowledged this and said:

“The Government are committed to working with trade unions, employers, workers and experts to examine what AI and new technologies mean for work, jobs and skills. We will promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making … However, I assure the noble Lord, Lord Clement-Jones, that the Institute for the Future of Work will be welcome to make an input into that piece of work and the consultation that is going forward. I reassure the noble Baroness, Lady Bennett, and all noble Lords that this is an area that the Government are actively looking into, and we will consult on proposals in the make work pay plan in due course”.—[Official Report, 5/6/25; col. 878.]

This was all very reassuring, perhaps, but I have retabled this amendment precisely because we need more concrete specifics regarding this promised consultation.

The TUC and IFOW have been working on this for four years. Is it too much to ask the Government to take a clear position on what is proposed now? The Minister referred to the importance of proper consultation. This is a crucial area impacting the fundamental rights and well-being of workers right now, often without their knowledge, and AI systems are increasingly being introduced into the workplace. The Government therefore need to provide clarity on what kind of consultation is being undertaken, with whom they will engage beyond relevant stakeholders, and what the precise timescale is for this consultation and any subsequent legislative action, particularly given the rapid introduction of AI into workplaces.

We cannot afford a wait-and-see approach. If comprehensive AI regulation cannot be addressed within this Bill as regards the workplace, we need an immediate and clear commitment to provision within dedicated AI legislation, perhaps coming down the track, to ensure that AI in the workplace truly benefits everyone. I beg to move.

Lord Holmes of Richmond, Conservative

My Lords, it is always a pleasure to follow my friend, the noble Lord, Lord Clement-Jones, who, in his single Nelsonian amendment, has covered a lot of the material in my more spread-out set of amendments. I support his Amendment 111ZA and will speak to my Amendments 168 to 176. I declare my interests in the register, particularly my technology interests, not least as a member of the advisory board of Endava plc and as a member of the technology and science advisory committee of the Crown Estate.

I will take one brief step backwards. From the outset, we have heard that the Government do not want to undertake cross-sector AI legislation and regulation. Rather, they want to take a domain-specific approach. That is fine; it is clearly the stated position, although it would not be my choice. But it is simultaneously interesting to ask how, if that choice is adopted, consistency across our economy and society is ensured so that, wherever an individual citizen comes up against AI, they can be assured of a consistent approach to the treatment of the challenges and opportunities of that AI. Similarly, what happens where there is no competent regulator or authority in that domain?

At the moment, largely, neither approach seems to be being adopted. Whenever I and colleagues have raised amendments around AI in what we might call domain-specific areas, such as the Product Regulation and Metrology Bill, the data Bill and now the Employment Rights Bill, we are told, “This is not the legislation for AI”. I ask the Minister for clarity as to whether, if a cross-sector approach to AI is not being taken, a domain-specific approach is, as opportunities are not being taken up when appropriate legislation comes before your Lordships’ House.

I turn to the amendments in my name. Amendment 168 goes to the very heart of the issue around employers’ use of AI. Very good, if not excellent, principles were set out in the then Government’s white paper of 2023. I have transposed many of these into my Amendment 168. Would it not be beneficial to have these principles set in statute for the benefit of workers, in this instance, wherever they come across employers deploying AI in their workplace?

Amendment 169 lifts a clause largely from my Artificial Intelligence (Regulation) Private Member’s Bill and suggests that an AI responsible officer in all organisations that develop, deploy and use AI would be a positive thing for workers, employees and employers alike. This would be seen not as burdensome, a compliance exercise or a mere question of audit but as a positive, vibrant, dynamic role, so that the benefits of AI could be felt by workers right across their employment experience. It would be proportionate and right touch, with reporting requirements easily recognised as mirroring similar requirements set out for other obligations under the Companies Act. If we had AI responsible officers across our economy, across businesses and organisations deploying and using AI right now, this would be positive, dynamic and beneficial for workers, employees, employers, our economy and wider society.

Amendment 170 goes to the issue of IP, copyright and labelling. It would put a responsibility on workers who are using AI to report to the relevant government department on the genesis of that IP and copyrighted material, and the data used in that AI deployment, by which means there would be clarity not only on where that IP, copyright and data had emanated from but that it had been obtained through informed consent and that all IP and copyright obligations had been respected and adhered to.

Amendments 171 and 172 similarly look at where workers’ data may be ingested right now by employers’ use of AI. These are such rich, useful and economically beneficial sources of data for employers and businesses. Amendment 171 simply suggests that there should be informed consent from those workers before any of their data can be used, ingested and deployed.

I would like to take a little time on Amendment 174, around the whole area of AI in recruitment and employment. This goes back to one of my points at the beginning of this speech: for recruitment, there currently exists no competent authority or regulator. If the Government continue with their domain-specific approach, recruitment remains a gap, because there is no domain-specific competent authority or regulator that could be held responsible for the deployment and development of AI in that sector. If, for example, somebody finds themselves not making a shortlist, they may not know that AI has been involved in making that decision. Even if they were aware, they would find themselves with no redress and no competent authority to take their claim to.

Would the Minister not agree that this makes the case for at least the consideration of a recruitment and employment-specific regulator to be brought about through this Bill? If not, I would certainly prefer to have a light-touch, cross-sector AI authority which would ensure that, wherever individuals, workers, employees and citizens come across AI, they can have clarity, certainty and consistency in its application. In this instance, it would be in the area of recruitment, but the AI authority—agile, light-touch and, crucially, horizontally focused—would ensure clarity, certainty and consistency across all sectors of employment, our economy and society.

I will touch briefly on Amendment 176 and the algorithmic allocation of work. Again, this is already happening, often without employees even being aware that that is the case. What is the Government’s position on the algorithmic allocation of work? If this amendment is not to be considered and adopted, what is the Government’s approach to how this is currently occurring in our economy to workers right now, often with an extremely discriminatory and detrimental impact on those workers?

AI has such potential to transform employment for the benefit of workers, employers, businesses and our economy. It has the potential, if it is human-led AI, to drive productivity in a way that no other element of our economy is currently likely to do. Similarly, if we do nothing and continue with this wait-and-see approach to legislation and regulation, it is most likely that workers may often find themselves at the sharp end of the algorithmic allocation of work, AI in the workplace taking their data and numerous other issues, unable to avail themselves of the benefits.

This situation could be wholly averted if some of these amendments were considered and incorporated into the Employment Rights Bill. Better still, the Government should reconsider bringing forward a cross-sector AI regulation Bill. What we know fundamentally is that regulation is right: right for workers, right for employees, and right for all aspects of our economy and society. When I say that regulation is right, I mean the right size regulation. What we know from history, not least from recent history, is that right-size regulation is good for innovation, investment, citizens, creativity and our country. Would the Government be good enough to agree?

Lord Pitkeathley of Camden Town, Labour, 5:00 pm, 21 July 2025

My Lords, I am aware that many of the amendments in this group have a rather different focus from the points I wish to make. I acknowledge the amendments by the noble Lords, Lord Clement-Jones and Lord Holmes of Richmond. I believe they provide a valuable opportunity to reflect on the particular nature of working in tech and AI. This is, as has already been alluded to, a sector that makes a significant and growing contribution to the UK economy, and it is rightly seen as one of the priority strands of the Government’s modern industrial strategy.

As the rather scary AI 2027 forecast by Daniel Kokotajlo and others makes clear, developments in this space are accelerating incredibly rapidly and are already reshaping how we live and work. Even as I say that, I wonder whether I may have triggered an algorithmic alert somewhere—let us hope that parliamentary privilege covers some of it. AI is happening, regardless of how we feel about it, and the opportunity it provides makes it all the more important that firms are based and regulated here rather than elsewhere.

Jobs in this area tend to be highly skilled and well paid, but that does not mean workers do not need some protections. In many cases, the things that matter most are not issues such as minimum wage and paid leave but how easily people can move between companies, start their own ventures and work across several fast-growing enterprises. Here, it is non-compete agreements which pose a particular challenge. Understandable concerns over safeguarding intellectual property have led some firms to restrict employee movement, yet this comes at a cost to innovation, competition and the free flow of ideas that underpin these industries. I know the last Government carried out a review of these clauses in general terms, but no meaningful reform followed. Does the department have a view on how widespread these clauses now are, particularly in fast-moving and competitive sectors? Has any formal assessment been made of their impact on innovation, start-up activity, and people’s ability to move freely and fairly between roles?

I fully appreciate that this Bill is focused on establishing baseline rights for all workers rather than addressing sector-specific concerns. However, I hope the Minister can say something about how these challenges are being considered as part of the Government’s wider thinking on the future of work and on how we ensure that the UK remains a good place to innovate, as well as a fair place to work.

Lord Freyberg, Crossbench

My Lords, I support the timely and vital amendments tabled by the noble Lords, Lord Clement-Jones and Lord Holmes of Richmond, concerning the use of artificial intelligence in the workplace. These amendments, which cover transparency, accountability, consent, fairness and the protection of workers’ rights, speak to one of the central challenges of our time: how we align the rapid deployment of AI with the rights, dignity and agency of working people.

Just 11 days ago, a few of us, including the noble Lord, Lord Clement-Jones, had the privilege of attending the round table on aligning AI for human flourishing, hosted here in the House of Lords by the noble Baroness, Lady Kidron, and convened by Oxford University’s Institute for Ethics in AI and the Accelerator Fellowship Programme. It was led by Professor Yuval Shany and brought together leading international voices, including Professor Alondra Nelson, who designed the US Blueprint for an AI Bill of Rights, later embedded in President Biden’s executive order on AI.

That discussion made one thing clear: we are at a crossroads. As Professor Nelson put it at a recent AI action summit in Paris:

“We can create systems that expand opportunity rather than consolidate power for the few”.

If we are serious about that aspiration, we need laws that embed it in practice. I hope we will soon see legislation introduced in this House—an AI Bill of Rights rooted in the UK context—that reflects our democratic values, legal traditions and the lived realities of British workers. That will require leadership from the Government and support across parties, and I believe this House is well placed to lead the way.

That is precisely what the amendments tabled by the noble Lord, Lord Holmes, seek to do. Amendment 168 outlines the core principles employers must uphold when using AI on workers: safety, fairness, transparency, governance, inclusion and the right to redress. These are the bedrock of responsible innovation. Amendment 169 proposes the appointment of designated AI officers within organisations, ensuring that someone is directly accountable for the ethical and unbiased use of these powerful technologies.

Amendments 171 and 172 tackle perhaps the most urgent concern: consent. No worker’s data should be ingested by AI systems—or decisions made about their employment by algorithm—without their meaningful, informed opt-in. We are not speaking in abstractions; AI is already determining who is shortlisted, scheduled, surveilled or sidelined. These systems often operate in secret and carry forward the biases of the data they are trained on. If we do not act now, we risk embedding discrimination in digital form.

This is not the first time that this House has stood up for fairness in AI. On 12 May, and in subsequent ping-pongs on the data Bill, many of us voted in support of the amendments tabled by the noble Baroness, Lady Kidron, which called for transparency over copyright and AI. That debate too was about rights—to control one’s work, one’s data and one’s identity. The same principle is at stake here. If the UK is to lead on AI, we must lead not just in capability but in ethics. The amendments tabled by the noble Lord, Lord Holmes, are not radical but responsible; they bring our values into alignment with our technologies. I therefore urge all noble Lords to support them, even though it is highly unlikely that they will be accepted.

Lord Palmer of Childs Hill, Liberal Democrat Lords Spokesperson (Work and Pensions)

From these Benches, all I can say is that I echo those words. I hope that the Government have listened to the arguments about AI and will respond positively.

Lord Hunt of Wirral, Shadow Minister (Business and Trade)

My Lords, I too congratulate my fellow solicitor, the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes of Richmond on their amendments.

We are following up on the exchanges that took place at Question Time earlier today, when the Minister—the noble Lord, Lord Vallance—offered to give us a reading list so that we could peruse the subject during the vacation, when he explained that, sadly, the Government are not yet able to produce their consultation paper. When the noble Baroness the Minister sums up this debate, can she identify for us what her noble friend had in mind? We are anxious to make sure that we are up to date on these very important subjects.

AI technologies are evolving at pace, touching every corner of the economy, from manufacturing and logistics to retail, healthcare and particularly—as my noble fellow lawyer knows—professional services. In the context of work, AI offers real potential: it can support productivity, streamline processes and free individuals from repetitive and burdensome tasks. It may also, if properly deployed, open up new opportunities for people who have historically faced barriers to employment.

However, as the noble Lord, Lord Freyberg, just reminded us, alongside that, there are real concerns. He instanced a number of them, and they are set out in Amendment 168; they are about fairness, transparency, accountability and, indeed, the role of human oversight in the decisions that affect people’s lives and livelihoods. It is therefore important that we take a balanced, thoughtful approach.

The noble Lord, Lord Pitkeathley of Camden Town, pointed out, quite rightly, that a number of non-compete agreements are now emerging. We have to be aware that these could so easily stifle innovation, and this must be all about encouraging and stimulating innovation. Therefore, it is very important that we take a balanced, thoughtful approach. But we should not allow technological change to outpace our frameworks for fairness, ethics and employment rights.

In conclusion, AI is not a distant or abstract issue; it is here, evolving and shaping the future of work. I hope we can move forward in a way that is both pro innovation and firmly rooted in the values of fairness, dignity and accountability. We very much look forward to hearing the Minister’s thoughts on these subjects.

Baroness Jones of Whitchurch, Parliamentary Under-Secretary of State (Department for Science, Innovation and Technology), Parliamentary Under-Secretary of State (Department for Business and Trade), Baroness in Waiting (HM Household) (Whip), 5:15 pm, 21 July 2025

My Lords, I will begin with Amendment 111ZA, moved by the noble Lord, Lord Clement-Jones, and Amendments 168, 169, 171, 172, 175 and 176, tabled by the noble Lord, Lord Holmes, whom I thank for his engagement on these important issues.

I start by reassuring all noble Lords that we agree that AI should be deployed and used responsibly, including within the workplace. As the noble Lord knows, in January 2025, we published the AI Opportunities Action Plan, which included a commitment to

“support the AI assurance ecosystem to increase trust and adoption” of AI. One of the key deliverables in this area is the AI management essentials tool. We are developing this tool to support businesses, particularly SMEs, to implement good AI governance practices. Following public consultation earlier this year, I hope to update your Lordships’ House on the consultation response and an updated version of that tool soon.

Regarding these amendments, I remind noble Lords that our plan to make work pay makes it clear that workers’ interests will need to inform the digital transformation happening in the workplace. Our approach is to protect good jobs, ensure good future jobs, and ensure that rights and protections keep pace with technological change.

To be clear, we are committed to working with trade unions, employers, workers and experts to examine what AI and new technologies mean for work, jobs and skills. We will promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making. In response to the noble Lords, Lord Freyberg and Lord Hunt, of course we will put ethics and fairness at the heart of that.

I am keen to stress that we are taking steps to enhance our understanding of this area. This has included engagement and round-table events with a wide range of stakeholders and experts to help enrich our understanding. I reaffirm that we will consult on the make work pay proposals in due course.

The noble Lord, Lord Clement-Jones, asked what would be in the scope of the consultation. The consultation plan includes examining: what AI and new technologies, including automation, mean for work, jobs and skills; how to promote best practice in safeguarding against the invasion of privacy through surveillance technology, spyware and discriminatory algorithmic decision-making; and how best to make the introduction of surveillance technology in the workplace subject to consultation and negotiation with trade union or employee representatives.

The noble Lord, Lord Holmes, asked whether this was going to be domain-specific. As the noble Lord, Lord Hunt, just reminded us, this was dealt with in an Oral Question earlier this afternoon, when my noble friend Lord Vallance said that existing regulators will oversee most AI systems, supported by enhanced AI skills and cross-regulatory co-ordination through forums such as the Regulatory Innovation Office. Some cross-cutting issues will also be addressed in the planned consultation on AI.

Looking specifically at Amendment 171, let me reassure the noble Lord that we believe that data protection legislation provides sufficient protection for workers and individuals where their personal data is being used in line with the key data protection principles, including lawfulness, fairness and transparency. Consent is a lawful ground to process personal data. However, due to the power imbalance between the employee and employer, it is often inappropriate for employers to rely on consent from employees to process their data. This is why data protection law provides additional lawful grounds for carrying out such processing, such as legitimate interests. Therefore, we do not wish to limit data processing in these situations to consent alone. I also point out that while data protection principles establish the requirements that we expect the use of AI systems to adhere to, AI assurance provides ways to evidence that these requirements have been met in practice.

Amendment 170 tabled by the noble Lord, Lord Holmes, would require workers and employers to maintain records of data and intellectual property used in AI training and to allow independent audits of AI processes. As he will know, this issue was debated extensively during the passage through your Lordships’ House of the Data (Use and Access) Act 2025. Only last month I confirmed that we will publish a report, including on transparency in the use of intellectual property material in AI training, within nine months of Royal Assent to the Act, which will be due by 18 March next year. The Government have also committed to setting up expert stakeholder working groups to help drive forward practical, workable solutions in this area, alongside a parliamentary working group to engage with policy development.

Amendment 174 tabled by the noble Lord, Lord Holmes, proposes a review of the use of AI in recruitment and employment. As the noble Lord will be aware, last year the previous Government published detailed guidance on responsible AI in recruitment, which covers governance, accessibility requirements and testing. This was developed with stakeholders and relevant regulators, such as the Information Commissioner’s Office and the Equality and Human Rights Commission. Employers and recruiters may find this guidance useful to help integrate AI into their recruitment practices in a responsible way.

Furthermore, I am excited about the opportunities of AI in supporting the UK’s workforce, as well as creating jobs and growing our economy. However, we must also understand how it may affect the labour market, including any potential disruption. The AI Security Institute has begun assessing this issue, and I hope to be able to update your Lordships’ House on this as work progresses.

Regarding our position on general AI regulation and the establishment of a new AI regulator, we believe that AI is best regulated at the point of use by the UK’s existing sectoral regulators. As experts in their sector, they are in the best place to understand the uses and risks of AI in their relevant areas, and we will support them to do this. I emphasise that in response to the AI Opportunities Action Plan, we have committed to supporting regulators in evaluating their AI capabilities and understanding how they can be strengthened. I assure your Lordships’ House that we are committed to making sure that workers’ interests inform the digital transformation taking place in the workplace.

I am grateful to my noble friend Lord Pitkeathley for raising non-compete clauses. There has been extensive research and analysis in recent years looking at the prevalence of non-compete clauses in the UK labour market and their impact on both workers and the wider economy. Government research published in 2023 found that non-compete clauses were widely used across the labour market, with around 5 million employees in Great Britain working under a contract that contained a non-compete clause, with a typical duration of around six months. As my noble friend identified, this can adversely impact both the worker affected, through limiting their ability to move between jobs, and the wider economy, due to the impacts on competition.

It is often assumed that non-compete clauses are found only in contracts of high earners. However, research published last year by the Competition and Markets Authority found that while non-competes are more common in higher-paid jobs, even in lower-paid jobs 20% to 30% of workers believe that they are covered by non-compete clauses. The Government have been reviewing the research and work done to date on non-compete clauses, and I am pleased to be able to confirm that we will be consulting on options for reform of non-compete clauses in employment contracts in due course.

Finally, the noble Lord, Lord Hunt, asked for my suggested reading list following my noble friend’s kind offer earlier this afternoon. I can do no better than to recommend the excellent book by the noble Lord, Lord Clement-Jones, on AI. In that spirit, I ask the noble Lord, Lord Clement-Jones, to withdraw his Amendment 111ZA.

Lord Clement-Jones, Liberal Democrat Lords Spokesperson (Science, Innovation and Technology)

The noble Baroness nearly won me over at that point. I thank her. I feel like someone who was expecting a full meal but receives a rather light snack. I will explain why as we go through.

I thank the noble Lord, Lord Holmes. I feel that I am somewhat upstaging him by putting an amendment at the front of the group, but we have many common themes that we have both pursued over the years together. I agree with him on the desirability of a cross-sector approach. He is much more patient than I am and, in putting down individual amendments and hoping that the Minister will give satisfactory answers, he is clearly more optimistic than I am. Whether his optimism has been justified today, I am not so sure.

The Minister could not even acknowledge the work done by the TUC, which has been ground-breaking in so many ways. It has taken four years, so it is extraordinary that the Government are doing what they are doing. I acknowledge what the noble Lord, Lord Pitkeathley, had to say. I was not quite sure how it connected to AI, but he very cunningly linked the subject of non-compete clauses to innovation, which does link to AI. I was encouraged by what the Minister had to say about consultation on reform.

The noble Lord, Lord Hunt, reminded me that I was a solicitor. Unlike him, I no longer have a practising certificate, but there we are. He has much more stamina than I have. Non-compete clauses can be extremely important in making sure that know-how is preserved within an existing business. I thank the noble Lord, Lord Freyberg, for what he had to say on making sure that AI ensures human flourishing and that we preserve agency. That is what the amendments tabled by the noble Lord, Lord Holmes, and me are all about.

The Minister talked about an AI assurance ecosystem and the AI Management Essentials tool that there will be a consultation on, but I could not sense any intention to do anything other than a sort of voluntary approach. We have a lot of employment law that has developed over the years, but the Government seem to be allergic to doing anything with any teeth. She mentioned recruitment practices, but that again seems to be very much a voluntary approach. The AI Security Institute is not a regulator. I cannot feel that the Minister has given much more than the noble Lord, Lord Leong, gave last time. For instance, the Minister talked about consultation over the make work pay proposals. This involved talking about best practice on the adoption of AI and how best to deal with surveillance technology. Again, I did not sense any real intent to make sure that we have a new set of protections in the workplace in the face of AI.

I very much hope that, as time goes on, the Government will develop a much more muscular approach to this. As many noble Lords have said, AI presents a great number of opportunities in the workplace, but we absolutely do not want to see those opportunities overwhelmed by mistrust and a belief on the part of employees that AI presents unacceptable risks. We want employees to understand that, in the face of AI adoption, they have the right to be consulted and that there will be proper risk assessment of the introduction of these systems into the workplace, so that there is a proper, consensual approach to AI adoption.

I really do not feel that the Government are keeping up to date with the issues in this respect, and I am afraid that is rather reflected in some of the issues that we are going to talk about on Wednesday as well. In the meantime, however, I beg leave to withdraw the amendment.

Amendment 111ZA withdrawn.

Schedule 4: Pay and conditions of school support staff in England
