How workers at a further education college are taking the initiative on AI deployment

Wales TUC hosted a meeting with reps from a college

Artificial intelligence is having a dehumanising effect on workers as they are continuously monitored. What’s more, it’s leading to workers being deskilled, having their tasks restructured and sometimes being managed out of their jobs.

These were the views of an experienced group of trade union representatives and members in further education, shared as part of a series exploring workers’ views on digital technology and AI at work, hosted by the Wales TUC and supported by Dr Juan Grigera from King’s College London and Adam Cantwell-Corn from Connected by Data.

At a further education college in Wales, the union reps and members identified a number of concerns related to AI and digitalisation more broadly, and also outlined some strategies for responding. Education as a sector is going through a unique moment of transformation, encompassing not just the use of AI by management and the changes that digitalisation is bringing to the classroom but also its use by students. Questions, curiosity and concerns centred on the extent to which AI would augment, displace or deskill educators, and whether they could take it into their own hands too.

“AI could lead to fewer paid hours”

As in other sectors, job security emerged as one concern. With the prospect of widespread deployment of generative AI, the education sector could be facing redundancies or a shift in the tasks of specific roles, both in operations and within teaching. Participants expressed concerns about AI systems’ ability to prepare learning materials, carry out assessment and even deliver instruction. While fundamental questions persist about the quality of AI systems versus qualified educators, there could be strong incentives to displace or deskill the workforce in favour of much-hyped AI solutions.

Furthermore, there was concern about the erosion of skilled personnel: there is already a tendency to replace traditional lecturers with demonstrators, and this could be expanded – potentially with hybrid roles that combine teaching and demonstrating. The incentive for employers is obviously that these hybrid roles come with a lower salary than that of a standard lecturer.

"I have collected my data over many years – course materials and lectures. In terms of their content, they are still relevant and usable for classes. Now these are on Google’s server - will these serve as a source of material for AI software to make me redundant?"  (D, a lecturer)

For some of the participants, the introduction of EdTech without meaningful participation and clear governance mechanisms was an extension of an impenetrable bureaucracy that makes it difficult for college workers to have a say in the decisions that affect them.

“There is no getting around the lecturer”

"One of the biggest strengths in education is that they are not getting rid of human lecturers anytime soon. They can try to push more students into the classroom or to extend our working hours, though both have limits. The human touch is needed for students to engage and that is our strength." (A, a lecturer)

Reps also identified some important strengths in workers' positions. One key pillar is the existence of a national contract and workload agreement, along with the framework of national negotiations. These provide a safety net against management pressures and technological change. Moreover, the constraints imposed by Welsh qualification bodies, particularly those related to contact hours, have helped to limit the further extension of remote or asynchronous modes of teaching.

The assessment model is experiencing a shock – and so is the output of education

The emergence of Large Language Models (LLMs) like ChatGPT has brought about massive implications for the methods by which students are assessed. One notable concern raised was the potential for a decrease in the rigour of learning and the value of qualifications.

With the widespread use of LLMs, there is a possibility that reliance on AI-generated content might render some methods of assessment less effective – if these are not changed, qualifications could be awarded to students who do not genuinely possess the necessary skills or understanding.

While it was clear that it is the college's responsibility to devise a strategy in response, how it will do so remains to be seen. Furthermore, alongside questions about pedagogy and learning outcomes, there is significant concern about how liability or accountability will be determined.

“We have received wildly different advice on students’ use of AI. I was recently told by a manager in the college that we do accept essays prepared with AI by students and that we should discuss the content with a learner to check they do understand its contents. However, we were then told – in a letter from an awarding body – that accepting work which included elements prepared by AI would be ‘malpractice’. I am terrified for colleagues and myself. I have been in this job, and now I am in a situation where I might be punished if an external moderator finds me guilty of malpractice for not spotting the use of AI.” (S, a lecturer)

What if “we drive it”?

“AI is probably coming to the classroom, so with this pilot we will try to get hold of it. One of our opportunities is that we drive it.” (S, a lecturer)

In a context of stark uncertainty, UCU reps at the college have decided to take the lead. A rep-led pilot is under development for staff to use AI tools, with a focus on using AI to reduce workload. The pilot will aim to reduce lecturers’ ‘contact hours’ – the number of hours spent face to face with students – freeing lecturers to plan and prepare for classes and other academic work.

Reps want to explore the application of AI to the “schemes of work” that guide the educational curriculum. Starting from the principle that the younger generation could greatly benefit from AI assistance, the reps intend the pilot both to explore how to manage their own work and to responsibly facilitate students’ use of AI, fostering a more effective and supportive educational environment.

The reps do recognise the risks of further embedding AI – from it being co-opted by management to it creating further problems for workloads and admin. But they seem committed to getting on the front foot of technological change.
As one of the reps said: “Improve, don’t dehumanise – that should be the aim of any innovative technology!”

*This write-up is from the third workshop in the project. The first write-up is available here, and the second one here.