
It’s been a busy time for QAA as we respond to the rise of generative AI. After three webinars attended by over 2,000 people, discussions with providers across the UK, a brief trip to Tbilisi to share best practice with European quality agency colleagues, and one piece of guidance published with another to follow imminently, we have been reflecting on the perspectives of the technical experts, academic staff and students we brought together. So, what have we learned?

All our event speakers agreed that banning generative AI in tertiary education, including in assessment, is not only unfeasible but undesirable. Detection tools lag behind the generative software itself; we will soon see this technology embedded in the tools we use every day, and the ability to use it effectively will become a desired skill in the labour market.

That said, we need to avoid scenarios where students and learners use AI to churn out work without effort or critical thought. In response, we should help students understand the limitations of these tools, appreciate the negative effect that overreliance can have on their own learning experience, and focus assessment on the learning process, not just the output.

This technology, the challenge it poses to higher education and its implications for academic integrity are not wholly new. During the first of our webinars, Dr Bronwyn Eager likened the emergence of ChatGPT to a fish asking ‘what is water?’. From spell check to learning analytics platforms, AI has been the ‘water’ all around us for some time. It’s the open-access element that is new.
