Lego version of Michael Knight and KITT

Image by EROL, available under a CC licence

Kenji Lamb, Delivery and Engagement Partner at CDN, discusses the impact of AI on education, with a focus on challenges in student coursework and recent guidance from the SQA and JCQ. He also offers practical tips for educators on navigating AI use in academic settings.

I used to open my GenAI talks with a slide of Michael Knight and KITT from the 1980s Knight Rider TV show – a symbol of the once-distant dream of talking, self-driving cars that captivated my wide-eyed teenage imagination. AI now underpins much of my work, from running deep research queries via Perplexity (as I can’t afford the ChatGPT equivalent) to talking through workshop experiences with the Copilot app* during my drive home.

Still – having technology in your pocket designed to answer every question (not always accurately) creates challenges. One of the issues highlighted in SQA’s latest report on GenAI was the worry that students were using AI to help them produce coursework. With Jisc and others advising that detectors aren’t the solution, lecturers need to rely on more traditional approaches, aided by policy set by their institutions and awarding bodies.

I wanted to share the discussion we’ve been having in presentations and workshops:

  • SQA’s position for the 2024/25 session is that candidates are not permitted to pass off AI-generated work as their own (i.e. plagiarism), and that AI cannot be referenced as a source (because of reliability issues). The wording in some documentation suggests that AI can’t be used at all, but this isn’t the case.
  • The Joint Council for Qualifications (JCQ) updated its AI guidance in February 2024. It clarifies that AI can be used for feedback and grading, as long as a human is still in the loop. Although AI detectors weren’t discounted, the emphasis was on ‘knowing your students’, with tips for recognising AI-produced content. Where AI use is permitted, referencing is key. The guidance also includes useful examples of malpractice, as well as a poster and presentation to help inform students.

The challenge persists in defining clear boundaries for AI use; my take is that AI should be treated like any other third party assisting learners with coursework. Peers, friends, and relatives have always offered help, but the limits of that kind of assistance have always been well-defined. The SQA addresses this in its Guidance on conditions of assessment for coursework (October 2023), detailing what constitutes ‘Reasonable Assistance’, with examples.

It’s important that we help prepare students (and ourselves) for an AI-enabled future. My advice would be:

  • If you haven’t already, find the time to play around with an AI model – Copilot is a good place to start, and you’ll learn a lot just by asking it questions and seeing what comes back!
  • Connect with colleagues at your college who are using AI. Feel free to reach out to me as well if you’d like to discuss further.
  • Consider the very real issue of age restrictions; MS Copilot can only be used by those aged 18 and over, which might prove a challenge in some contexts.
  • Once you’re comfortable using AI, review your college’s policies and consider collaborating with students to create a class-specific AI policy. Jisc has a good resource on Learner Guidance as a starting point.


*To access voice via the Copilot app, I log in with a personal MS account, so I don’t have the data protection offered by my work account; this means I don’t share private or sensitive information.
