More than half of undergraduates say they consult artificial intelligence programmes to help with their essays, while schools are trialling the technology in the classroom.

A survey of more than 1,000 UK undergraduates, conducted by the Higher Education Policy Institute (Hepi), found that 53% were using AI to generate material for work they would be marked on. One in four were using applications such as Google Bard or ChatGPT to suggest topics, and one in eight were using them to create content.

Just 5% admitted to copying and pasting unedited AI-generated text into their assessments.

Teachers are also seeking to use AI to streamline their work, with the Education Endowment Foundation (EEF) signing up secondary schools for a new research project into the use of AI to generate lesson plans and teaching materials as well as exams and model answers.

Dr Andres Guadamuz, a reader in intellectual property law at the University of Sussex, said it was no surprise that more students were adopting AI and suggested institutions needed to be explicit in discussing how best to use it as a study tool.

“I’ve implemented a policy of having mature conversations with students about generative AI. They share with me how they utilise it,” Guadamuz said.

“My primary concern is the significant number of students who are unaware of the potential for ‘hallucinations’ and inaccuracies in AI. I believe it is our responsibility as educators to address this issue directly.”

The Hepi survey found that one in three students using AI did not know how often it “hallucinates”, ie invents statistics, academic citations or book titles to fill in what it perceives to be gaps.

Guadamuz said essays handed in to him last year clearly contained unedited ChatGPT output, given away by the “boring” style in which they were written. But even as AI use has spread, the survey found few students were willing to submit its output unedited.
