JCQ Guidance: AI Use in Assessments

Posted on 13th April 2023

Posted by Chris Richards

Just before Easter, the Joint Council for Qualifications (JCQ) published a guide to Artificial Intelligence (AI) Use in Assessments. According to JCQ, this guidance is intended to provide teachers and assessors “with the information they need to manage use of AI in assessments” to protect the integrity of qualifications.

According to the wording of the document, centres are also required either to update their existing policies to cover the use of AI, or to draft and issue a new malpractice policy addressing it:

“Centres will already have agreed policies and procedures relating to assessment in place to ensure the authenticity of assessments. Centres must now ensure that these can also address the risks associated with AI misuse. Teachers, assessors and other staff must discuss the use of AI and agree their approach to managing students’ use of AI in their school, college or exam centre. Centres must make students aware of the appropriate and inappropriate use of AI, the risks of using AI, and the possible consequences of using AI inappropriately in a qualification assessment.”

We would highlight the importance of updating or drafting an AI policy to all exam officers and centre managers around the country. Given the timing of the announcement, centres may well be asked about their AI policy when inspected during the Summer 2023 exam season.

JCQ does offer several key points of advice for centres to follow alongside updating the malpractice and plagiarism policy to acknowledge the use of AI, including:

  • Explaining to students the importance of submitting their own independent work and giving clear guidance on how to reference and acknowledge AI use appropriately;
  • Ensuring teachers and assessors are familiar with AI tools, their risks and their potential for misuse;
  • Considering whether students should be required to sign a declaration that they have understood what AI misuse is and, if so, reinforcing the significance of this declaration and the consequences of making a false one.

The document also offers a number of points of guidance to support educators and centre staff to become aware of and prevent misuse. This is especially important when that misuse is considered to be plagiarism, for example if a student submits work that is not their own and has been written by a generative AI tool.

This advice includes sensible practices such as setting reasonable deadlines and allocating time, where appropriate, for work to be done in class under direct supervision. JCQ also recommends assessors examine intermediate stages in the production of work, if possible. Both can help teachers and assessors verify that the work is original.

Teachers and assessors are also advised to look out for potential indicators of AI use, such as if a student’s writing style becomes more complex than usual, if their spelling becomes Americanised, or if their references appear to be made up. These are sensible tips for the detection of AI use in the classroom or homework, but for assessment they will mainly be relevant for centre-assessed work where assessors have sufficient context for specific students.

JCQ’s guidance also suggests that assessors watch out for out-of-date information and that centres set topical, centre-defined tasks, probably because GPT-3 and GPT-3.5 were only trained on data up to September 2021. However, this advice may be undermined in future given the speed at which more sophisticated generative AI tools are being developed, with newer large language models likely to be trained on more recent data than those currently available.

While JCQ acknowledges that “there may be benefits to using AI in some situations”, the potential for misuse by students “either accidentally or intentionally” means centres should consider restricting access to online AI tools on their devices and networks. Though we would not offer blanket advice for every context, there are many circumstances where AI is beneficial in education. Where possible, we would advocate for teachers and learners to be educated on how best to use AI tools safely and ethically, rather than simply restricting access.

Most immediately, however, we would advise all centres to update or redraft their malpractice/plagiarism policies to acknowledge AI technology, and to reference JCQ’s new guidance document.

Chris Richards is Deputy Head of Curriculum at CENTURY, and a former Assistant Head of a secondary school responsible for Examinations and Assessment.

Learn more about how generative AI tools have fared with CENTURY’s assessment questions here.