
University of Leeds’ traffic light system challenged as students admit widespread AI use

The University of Leeds’ traffic light system has come under scrutiny as students admit to widespread use of artificial intelligence (AI).

Implemented to outline appropriate AI use for assessed work, Leeds’ traffic light system categorises all assignments as red, amber or green.

The colours indicate whether students are allowed to use generative AI tools in their assessments, and what the boundaries are. Where a piece of work falls into the green or amber category, meaning some use of AI is allowed, students must still state what they used and how they used it.

However, students have told The Leeds Tab they use AI extensively, regardless of the colour their assessments are coded as, raising questions about the effectiveness of the policy.

As universities face mounting challenges in distinguishing student work from AI-generated content, the University of Leeds joins a growing number of higher education institutions developing policies which outline appropriate usage.

According to a survey by The Leeds Tab, 17 per cent of respondents admitted to using AI even in “red” assignments, with a further 17 per cent saying they sometimes bend the amber rules. Whilst the largest group of students said they never use AI for assessed work (39 per cent), 27 per cent of respondents appeared unaware of Leeds’ traffic light system entirely.*


In May, the BBC’s education reporter, Vanessa Clarke, visited the University of Leeds to see how the institution is grappling with the nationwide problem: how universities can tell the difference between work written by students and work written by AI.

Research has shown that software to detect AI use is becoming increasingly untrustworthy, with students reporting false accusations of academic misconduct and studies emerging which suggest that programmes flagging AI may discriminate against non-native English speakers.

BBC Radio 4 found the University of Leeds to be an especially interesting case study in how educational institutions can adapt to AI, due to flagship policies like the traffic light system, which are seen as “embracing” the technology. Leeds is among a number of universities that have signed up to a code of conduct to support staff and students in becoming “AI literate”.

According to statistics released earlier this year, 92 per cent of students now use some form of AI, so The Leeds Tab spoke to students about how AI fits into their studies, and whether they feel the university is adapting appropriately to such a seismic technological shift.

Danielle, whose name has been changed, said: “As a fourth-year student, I’m in a weird cohort because I’ve experienced uni pre and post AI, and I can honestly say now that I can’t believe I did the first two years of my degree without it.”

Danielle is a humanities student, and she says most of her work is assessed through coursework and essays, which, for her, are marked red on Leeds’ traffic light system. This means AI shouldn’t be used at all.

Asked whether she uses AI for work in this category, she said: “I use AI for all my assignments, from planning to actually writing stuff. I’d never just copy and paste what it says, because that’s how you get caught, but it’s so easy to just change words and phrases a bit and then it doesn’t get flagged.”

The Leeds Tab also questioned Danielle on whether she feels guilty about her usage, or whether she feels that students who use AI extensively should feel guilty at all.

The fourth-year student doesn’t believe anyone should feel shame regarding their use of large language models, like ChatGPT, because she views it as the university’s responsibility to adapt to what she calls “a new reality”. In her view, universities need to be ahead of the curve, changing the way work is assessed to counter students’ ability to misuse AI.

Isaac, whose name has also been changed, told The Leeds Tab he uses AI for his assignments. During online tests, he said he always has ChatGPT open so that he can check answers on there. He explained that this has meant he hasn’t had to do any of his readings for this year, “which has saved [him] a lot of time”.

Some of Isaac’s assignments are coded amber, meaning AI can be incorporated to an extent, and some are red. However, Isaac said he uses it regardless, and only uses the traffic light system to ascertain how cautious he needs to be about submitting AI-generated text.

“I don’t know why I wouldn’t be using it in all my work, especially when everyone else is and there’s no way to tell,” he added.

Not all students are on board with the shift, though. Grace, a final-year English language student, said that using AI for assignments equates to “laziness”.

“I’m not honestly saying there isn’t a place for AI; obviously there is. But, to be honest, I just don’t get why – for my course especially – you’d choose to study and then just offload all of your degree to a computer […] What have you learnt?” she said.

The Leeds Tab asked Grace how she would respond to the argument that students who are better equipped to use AI could be more likely to thrive in AI-incorporated workspaces, meaning she may be at a competitive disadvantage by choosing not to use large language models.

The English student dismissed this as a concern, saying that using AI to help structure essays was vastly different to using it to generate an essay entirely: “Students who haven’t developed any critical thought because they’ve barely even engaged with their degree will be the ones most likely to struggle in the workplace, and I don’t think that’s being discussed enough.”

The University of Leeds has not yet responded to requests for comment.

*Based on 157 respondents

Featured image via Canva
