
Briefing: Data Protection and Digital Information Bill (V2)

Implications for the use of technology in the employment relationship
Action

The TUC calls on MPs to oppose the provisions of the Bill that will result in the removal or dilution of workers’ rights and to propose amendments based on the TUC’s Manifesto, Dignity at Work and the AI Revolution.

We call on MPs to oppose the following provisions in the Bill:

  • The reduced requirements for Data Protection Impact Assessments and removal of consultation rights
  • Reduced protections against automated decision making
  • New barriers to subject access requests
  • Changes to the “legitimate interest test” for lawful data processing
  • Reduced record keeping requirements
  • Relaxation of provisions on international data transfers
  • Reduced independence of ICO


Background to the Bill

The Data Protection and Digital Information Bill (V1) was introduced in the House of Commons on 18 July 2022. The second reading of the Bill (V1) was due to take place on 5 September 2022, but the Bill was withdrawn on that date.

The Bill (V2) was introduced to Parliament on 8 March 2023[2] by the Department for Science, Innovation and Technology (“DSIT”).

The Bills (V1 and V2) follow on from the government consultation, “Data: A New Direction”, to which the TUC submitted a response in November 2021[3]. The government responded to the consultation submissions in June 2022[4].

The Government asserts that the intention behind the Bill (V1) was to update and simplify UK data protection law in order to reduce regulatory burdens and encourage innovation.

Michelle Donelan, the Secretary of State for Science, Innovation and Technology, has described the Bill (V2) in similar terms.

On 29 March 2023 the government published its AI White Paper[5]. The paper suggests that the government will adopt a “soft law” approach to the use of artificial intelligence, placing a duty on regulators to enforce ethical principles, but without a statutory footing. The AI White Paper is relevant context for the Bill (V2): given that the government does not intend to legislate to address the use of AI at work, existing protections under the UK GDPR are all the more important.

Relevance to technology at work

Data plays a key role in the operation of algorithmic management systems at work.[6] An algorithmic management system typically involves technology-driven surveillance and data collection, an algorithm processing that data, and then an automated decision or other outcome.

Algorithmic management systems can be used for a wide range of functions, including directing and allocating work, making recruitment decisions and terminating employment or access to a platform.

Over the past two years the TUC has been carrying out a project with trade unions, investigating the impact of technology on workers and making proposals for change. 

In our report, Technology Managing People – the Worker Experience[7], we identified the risks and opportunities associated with technology being used to recruit and manage people at work.

The risks for workers include data protection infringements, a lack of control over data and understanding of how it is used, invasions of privacy, the blurring of work/home boundaries, negative impact on health and safety, as well as discrimination and other forms of unfairness, and an inability to challenge decisions made by technology. 

In the TUC’s report Technology Managing People – the Legal Implications, commissioned from Robin Allen QC and Dee Masters of Cloisters and the AI Law Consultancy[8], Allen and Masters identify key provisions in the UK General Data Protection Regulation (UK GDPR) that provide important potential protection and redress to workers where technology is being used in the employment relationship.

Article 1 of the UK GDPR emphasises the critical importance of data protection rights in the UK. It states that the purpose of the regulation is to protect “fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data”.

This is of particular relevance in the workplace, where there is often a significant imbalance of power between worker and employer. This imbalance is also reflected in the degree of knowledge and control exercised over personal data at work, with workers often having very little power over their data.

Protection of personal data has important implications for the individual: for example, in relation to the right to privacy, transparency and accountability, equality, the ability to understand and challenge decisions, and the ability to benefit from the commercial value of data.

Existing data protection legislation in the UK is not perfect. But the UK GDPR provides important protections for individuals, as well as some mechanisms to redress in part the imbalance of power over data at work and elsewhere.

These protections range from the limitations on lawful grounds for processing data, to the provisions on automated decision making and access to information about this (Articles 21 and 22 UK GDPR).

A recent decision of the Court of Appeal in Amsterdam[9], a “robo-firing” case (in which drivers successfully used provisions under the GDPR relating to data subject access rights and automated decision making), demonstrates the importance of the UK GDPR in relation to protections against algorithmic management.

If passed in its current form, the Bill (V2) will significantly amend the UK GDPR and in our view, reduce or remove some important protections for workers when technology is used to make decisions about them.

In addition, this dilution of rights takes place in the context of the proposals in the government’s AI White Paper, in which the government confirms it will not be providing alternative sources of statutory protection for individuals in relation to the use of AI at work.

Further, the Bill (V2) as currently drafted represents a significant missed opportunity to update the law to take account of the radical pace of technological change and the impact on workers’ rights.

Key provisions 

We call on MPs to oppose the following provisions in the Bill (V2):

Data Protection Impact Assessments: requirements reduced and consultation rights removed (Part 1, Section 17)

Data Protection Impact Assessments are a crucial process and consultation tool for workers and trade unions in relation to the use of technology at work.

Indeed, TUC affiliate unions have issued guidance on the use of DPIAs in the workplace, illustrating the importance of this process to workers.[10]

The UK GDPR (Article 35) currently provides that where data processing is likely to result in a high risk to the rights and freedoms of natural persons, a data controller must carry out a data protection impact assessment. The DPIA process is prescribed in detail in Article 35, with data controllers directed to take into account proportionality and the rights and interests of individuals. There is also an obligation for a data controller to seek the views of data subjects and their representatives on the intended processing:

“Where appropriate, the controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of processing operations.”

The Bill strips away a large part of the prescribed process for a DPIA, replacing it with a new, reduced requirement for a simple “assessment”. This removes particular protections in cases of profiling and processing of special category data, removes the emphasis on proportionality, and removes the emphasis on taking into account the rights, legitimate interests and views of data subjects and their representatives.

The proposed changes mean that workers and unions will be left with less influence over new technologies in the workplace, as DPIAs currently provide a right to consultation and scrutiny.

Automated decision making: reduced protections (Part 1, Section 11)

The UK GDPR (Articles 21 and 22) currently provides a right not to be subject to automated decision making which will have legal or similarly significant effect, unless the processing (use of data) falls under specific grounds for lawful processing.

This protection is important because automated decision-making means that human influence is absent in the taking of decisions that may have significant impact on people.

This amounts to a prohibition against automated decision making, except in three specific circumstances (where necessary for the performance of a contract, where consent is given and where provided for in national law).

And even in these cases, where the grounds for processing relate to performance of a contract or consent, there is a right to human review, as well as various information rights.

The Bill (V2) replaces Articles 21 and 22 with a series of provisions that remove the prohibition against automated decision making, relaxing the current protections under the UK GDPR and allowing automated decision making, provided certain safeguards are met.  There are some stricter safeguards where the ADM involves special category data.

Version 1 of the Bill stipulated that protections against automated decision making should only apply where there was no “meaningful human involvement”. Version 2 appears to introduce the presence of profiling as a factor to be taken into account when assessing “meaningful human involvement”.

We believe that Articles 21 and 22 provide vitally important protections against automated decision making and that it is important to keep the prohibition-based principle in place, even though the exceptions to this are in our view currently too uncertain and undefined.

As set out in our consultation response (see above), we believe that the operation of the exceptions to Articles 21 and 22 should be clarified to ensure that these Articles provide the greatest possible protection to individuals who are the subject of AI and ADM.

However, rather than diluting current protections (as will be the effect of the Bill V2) we suggest that the appropriate solution is to ensure that the existing exceptions are properly defined with statutory guidance prepared in consultation with unions, other members of civil society and stakeholders.

We suggest statutory guidance on Articles 6, 21 and 22 to clarify:

  • the circumstances in which an employer can lawfully process data on the basis that it is necessary to the employment contract or necessary to protect legitimate interests
  • the interplay between the lawful basis for processing and Articles 21 and 22
  • when Articles 21 and 22 can be disapplied (for example, guidance on the different levels of human intervention)

We also suggest that there is a universal entitlement to human review in relation to all decisions made in the workplace that are “high risk”. This should also include a right to in-person engagement, to preserve the importance of human connection and one-to-one communication.

We are concerned that the Bill (V2) reserves a right for the Secretary of State to determine further circumstances in which there is “meaningful human involvement” and stress the importance of clarity, certainty, and the involvement of parliament and civil society.  

Data subject access requests: a new barrier (Part 1, Section 7)

The right of data subjects to make an information access request is an important process for workers and their representatives. This process enables workers to gain access to personal data held about them by their employer and aids transparency over how algorithmic management systems are operating.

The Bill changes the prescribed justification for data controllers to decline subject access requests from “manifestly unfounded or excessive” to “vexatious or excessive”. Our concern is that this new test offers data controllers much wider discretion to decline subject access requests and that reasonable requests may increasingly be declined by employers.

Lawful processing: changes to the “legitimate interest test” (Part 1, Section 5)

The Bill (V2) introduces a list of “recognised legitimate interests”. This amendment to the UK GDPR will implement changes to the current application of the “legitimate interest” test for lawful processing.

At the moment, if a data controller is relying on the “legitimate interests” ground for lawful processing, they must usually carry out a balancing test, balancing the interests of the data controller and data subject. Under the terms of the Bill (V2) where a data controller can rely on one of the recognised legitimate interests, there will be no need for a balancing test. 

As outlined in our consultation response, we consider the balancing test (taking into account the rights of individuals) to be a crucial element of the legitimate interests ground for lawful processing. The protection afforded individuals as part of this balancing exercise is of the utmost importance where there may be an imbalance of power between data controller and data subject, as is the case in the workplace.

The TUC believes that the grounds for lawful processing require further clarification and in our AI Manifesto, we call for statutory guidance on the circumstances in which an employer can lawfully process data on the grounds that it is “necessary” to an employment contract, as well as “necessary to protect legitimate interests”.

Reduced record keeping requirements (Part 1, Section 15)

The Bill (V2) reduces data processing record keeping requirements further than version 1, stipulating that record keeping is only necessary in “high-risk” instances.

We believe that record keeping of data processing is an important requirement and that failing to maintain this may result in decisions made by technology becoming even harder to understand.

Reduction in independence of the ICO (Part 1, Sections 27-33)

The Bill (V2) introduces various reforms to the ICO and we have concerns that the appointments process, with direct involvement from the Secretary of State, may threaten the independence and objectivity of the ICO.

Relaxation of provisions on international data transfers (Schedules 5 and 6)

The Bill sets out a move away from adequacy of data protection regimes as the test for international data transfers, and focuses instead on whether a data controller has made a reasonable assessment of data protection requirements.

In addition, Article 44 of the UK GDPR is removed. Article 44 sets out that “All provisions in this Chapter [V] shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”

We believe these provisions will loosen the restrictions on international data transfers, reducing the protections against international transfers of worker data.

EU data adequacy

In June 2021 the EU reached a data adequacy decision with the UK, allowing data to flow freely between the UK and the EU because the UK’s GDPR-based rules guaranteed a high level of data protection. We are concerned that the Bill’s weakening of GDPR rules, and its infringements on the independence of the ICO, put this agreement at risk. Should the EU determine that the UK no longer provides a sufficiently high level of data protection, it could insist on additional checks and requirements for data transfers between the UK and the EU, which would place a significant additional cost on UK businesses and, in turn, put jobs and workers’ conditions at risk.

Trade and Cooperation Agreement

Under Article 387 of the EU-UK Trade and Cooperation Agreement (TCA), the UK government agreed not to reduce labour or environmental rights in a manner potentially affecting trade or investment. This stipulates that workers’ rights must not be reduced below the level they were at when the UK entered into the agreement in December 2020. The TUC’s legal opinion by Professor Federico Ortino notes that government plans to dilute GDPR rules may constitute a breach of this commitment, as they would reduce protections for workers’ rights that were in place in December 2020.[11] If the EU concludes that the UK has breached its commitments under the TCA, it would be able to impose penalties on the UK government, including fines or suspending UK access to EU markets. This would have a serious impact on jobs and workers’ pay and conditions.

New digital ethos

We also call on MPs to propose amendments to the Bill that secure a new digital ethos at work and greater protections for workers:

Protection against discrimination

The UK’s data protection regime should be amended to state that discriminatory data processing is always unlawful.

Equality Impact Audits in the workplace should be made mandatory as part of the Data Protection Impact Assessment (DPIA) process and made readily accessible to workers, employees and their representatives. Employers should also be obliged to publish DPIAs.

There should be joint statutory guidance on the steps that should be taken to avoid discrimination in consequence of AI and ADM at work. Statutory guidance should be developed with input from Acas, CBI, CDEI, EHRC, ICO, and the TUC.

Human review and engagement

There should be a comprehensive and universal right to human review of decisions made in the workplace which are high-risk.

There should be an express statutory right to personal analogue engagement – an ‘in-person engagement’ – when important, high-risk decisions are made about people at work.

Privacy

Although current law (Article 8 of the European Convention on Human Rights) protects workers against intrusive AI/ADM that infringes privacy, there is inadequate legally binding guidance for employers explaining how these rights actually work in practice. There should be statutory guidance for employers on the interplay between AI and ADM in relation to Article 8 and key data protection concepts in the UK GDPR.

Right to disconnect

There should be a statutory right for employees and workers to disconnect from work, to create ‘communication-free’ time in their lives.

Information

To ensure that a worker has ready access to information about how AI and ADM are being used in the workplace in a way which is high-risk, employers should be obliged to provide this information within the statement of particulars required by Section 1 of the Employment Rights Act 1996.

Employers should be obliged to maintain a register which contains this information, updated regularly. This register should be readily accessible to existing employees, workers, and job applicants, including employees and workers that are posted to sites controlled by organisations other than the employer.

Explainability

UK data protection legislation should be amended to include a universal right to explainability in relation to high-risk AI or ADM systems in the workplace, with a right to ask for a personalised explanation, along with a readily accessible means of understanding when these systems will be used.

Trade deals

No international trade agreement should protect intellectual property rights from transparency in such a way as to undermine the protection of employees and workers’ rights.

Statutory guidance on automated decision making and lawful processing

As outlined above, data protection law provides workers with some key protections, including a right to challenge data processing, as well as a right not to be subject to ADM, in specific circumstances. However, we need better guidance on how these protections operate in practice, to give more clarity to everyone at work.  We need statutory guidance on Articles 6, 21 and 22 of the UK General Data Protection Regulation.

This should include guidance on:

  • The circumstances in which an employer can lawfully process data on the basis that it is ‘necessary’ to the employment contract under Article 6(1)(b) of the UK GDPR.
  • The circumstances in which an employer can lawfully process data on the basis that it is ‘necessary’ to protect their legitimate interests or those of a third party.
  • The interplay between Article 6(1)(b) and (f), bearing in mind that the lawful basis for data processing dictates the extent to which Articles 21 and 22 can be invoked, and that these provisions include important safeguards in relation to the use of AI-powered technologies and ADM.
  • The circumstances in which Articles 21 and 22 can be disapplied.

Data reciprocity

Employees and workers should have a positive right to ‘data reciprocity’, to collect and combine workplace data.

The use of AI at work presents many opportunities for workers. Our proposals for data reciprocity will redress the imbalance of power over data at work and will also enable workers to benefit from AI-powered tools themselves, for example by analysing data to evidence and support trade union campaigning for better terms and conditions at work. Our proposals are intended to help trade unions and workers realise these opportunities.
