A snapshot of workers in Wales’ understanding and experience of AI

Ceri Williams
Policy officer - Wales TUC
Report type
Research and reports
Issue date
Equalities and wider social harms are a cross-cutting concern

Running throughout the varied experiences described above are cross-cutting concerns. The impact of AI on workers with protected characteristics is a notable example.

It is widely accepted that, in the absence of appropriate governance, AI tools used in hiring, firing and performance management can exhibit bias and produce discriminatory effects.

According to the House of Commons Science and Technology Committee,

“Algorithms, in looking for and exploiting data patterns, can sometimes produce flawed or biased ‘decisions’ … that can result, for example, in race or gender discrimination in recruitment processes.”

A specific concern was the deferral of judgement and decision-making to computer systems, which strips out contextually relevant information.

For the participants in our research, the use of de-contextualised data to aid or make decisions raised concerns about prejudice. A worker in the logistics sector described how algorithmically determined targets are applied as a blanket, "set regardless of age or disability", and how "people are managed out if they can't meet their target" regardless of other factors.

This appears to run contrary to the Information Commissioner's Office guidance on fairness in AI, which states that

“Context is key: the conditions under which decision-making takes place is equally important as the decision-making process itself.”

A trade union official said that he had seen how algorithmically determined targets could discriminate against people with protected characteristics.

“If someone is pregnant their ability to do work quickly may be lower,” he said, “but the algorithm doesn't take this into account and the worker will be penalised as a result”.

Wider social harms were raised by participants. A journalist cited concerns regarding editorial integrity and the health of the information ecosystem consumed by the public.

“When it comes to the data scraping which informs generative AI models, these are influenced by biases and prejudices in existing materials. If there is far less human involvement in creating news reports, then these biases will be reinforced. It’s quite scary and a big issue.”

The impact of AI on the Welsh language was discussed in relation to both workers and users of public services. Concerns were raised about the ability of large language models to properly understand informal written Welsh. Participants noted this could be a problem if public bodies relied on AI tools to analyse responses to public consultations, risking some people’s views being ignored.

Whether it relates to decision-making in the workplace or the creation of news articles, workers are concerned that AI could increase inequality in the labour market and in society. This concern leads into perhaps the biggest worry workers have about rapid technological change, and one that has been the topic of intense public discussion: job losses and displacement.

A photographer told us

“AI is being used to create photographic images. This is a threat to photographers’ work opportunities.”

A college lecturer asked whether his course notes would be used as “a source of material for AI software to make me redundant?”

One full-time official for a retail sector union reported a “massive” recent layoff of staff due to new technology, including AI, saying that

“at one time 1,000 people were working at a large superstore. Now it's 250 and they are mostly part time workers.”

This intersects with equalities concerns, as more vulnerable workers may be most affected.

A 2019 study by the Royal Society for the Encouragement of Arts, Manufactures and Commerce found that about 75,000 sales assistant and checkout operator jobs previously held by women had disappeared over the preceding seven years due to automation and e-commerce.

While men lost 33,000 such jobs over the period 2011 to 2018, those losses were largely offset by increases in warehouse and delivery driver roles. However, a retail worker union officer told us that many of these jobs are precarious and outsourced, making these workers particularly hard for union reps to reach and organise.

Recent research suggests that ‘professional occupations’ are most exposed to AI. The UK government found that this was the case

“particularly for those [jobs] associated with more clerical work and across finance, law and business management roles.”

The impact by gender is highly differentiated, with women generally significantly more exposed to AI-related impacts than men. Notably, these private sector industries tend to have very low union density.

Across all areas of employment, AI is having a significant and varied impact on a range of different workers. Therefore, as a next step, Wales TUC will continue to monitor the development and impact of AI in the workplace.