Promoting academic integrity
Generative artificial intelligence (AI) tools can provide opportunities for students to develop higher-order critical thinking, analytical and evaluation skills, and can create efficiencies that allow students to focus more of their time on the skill being assessed. However, these tools can also make it easier for students to undermine the forms of assessment in use today.
Learning design approaches and processes for ensuring academic integrity need to be responsive to these opportunities and risks.
To support schools to operationalise the Australian Framework for Generative Artificial Intelligence in Schools, in particular its guiding statements on ‘learning design’ and ‘academic integrity’, this chapter is structured in 3 parts:
- setting expectations and building a culture of academic integrity
- designing assessments with consideration of generative AI tools
- identifying and responding to the inappropriate use of generative AI tools in assessments.
Setting expectations and building a culture of academic integrity
Academic integrity refers to the expectation that students behave honestly and ethically in the completion and assessment of student work. Maintaining academic integrity ensures students are assessed fairly and teachers, students and parents/carers get an accurate understanding of student progress to help guide further teaching and learning.
Schools can support a culture of academic integrity by providing students with clear expectations about values, responsibilities and behaviours related to learning and assessment, and by responding to breaches of academic integrity in a consistent and proportionate way.
Some schools already have local academic integrity or honesty policies to support a whole-school culture of academic integrity, or use the department’s acceptable use agreement templates to agree on behavioural expectations when engaging in digital learning activities. Existing acceptable use agreements can be updated to cover the use of generative AI tools.
Schools’ expectations about how students can use generative AI with academic integrity can include the following:
- definitions for academic integrity, cheating, and plagiarism
- age-appropriate guidance on citing and referencing generative AI tools (where appropriate, schools can adopt established referencing systems such as American Psychological Association (APA) style, or simplified approaches, such as those outlined in the above guidance on disclosure; a sample reference appears after this list)
- providing examples of academic misconduct using generative AI tools
- clearly outlining the consequences for academic misconduct using generative AI tools.
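For example, under APA style a reference for a generative AI tool can take a form like the following (the tool name, version and URL shown here are illustrative and will vary depending on the tool used):

OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat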
VCAA’s VCE and VPC Administrative Handbooks
While written for the VCE, the following sections of VCAA’s VCE Administrative Handbook can apply more broadly and provide examples of acceptable student behaviour: ‘Integrity of VCE school-based assessments’, ‘Rules for authentication of school-based assessment’ and ‘Strategies for avoiding authentication problems’. The following sections of VCAA’s VPC Administrative Handbook are also relevant: ‘Integrity of student work’ and ‘Authentication’.
Suggested actions: Setting expectations and building a culture of academic integrity
School leaders can develop acceptable use agreements (using the department’s templates) to ensure students are aware of expectations when engaging in digital learning activities.
Designing assessments with consideration of generative AI tools
Selecting the right approach for assessment
Teachers already have well-established rules and practices in place to ensure that student assessments are tailored to their purpose and administered fairly. For example, many assessments include time limits and rules about whether questions are provided beforehand, what notes and materials are allowed, and whether the assessment can be taken home for completion.
Similarly, teachers are encouraged to clearly establish whether and how generative AI should be incorporated into any given assessment, based on its purpose and context. This can reduce the scope for students to use generative AI tools uncritically and without acknowledgement.
Diagnostic and high-stakes summative assessments are likely suited to approaches that prohibit or limit the use of generative AI tools. These approaches are most appropriate where the use of generative AI tools may prevent accurate measurement of student understanding or inhibit fair comparisons between students.
Research projects and portfolios are likely suited to approaches that allow some use of generative AI, often with modifications to ensure that the intended skills and knowledge are fairly assessed. Where generative AI tools can enhance an assessment, for example by making it more efficient or more focused on measuring the intended skills or knowledge, teachers may consider encouraging or even requiring their use.
Options for assessment design
Prohibit the use of generative AI where it will prevent teachers from gaining an accurate understanding of student learning – for example:
- require that students complete work in class without access to a computer, or under supervision
- require that students write in their own words and authenticate this by signing an acknowledgement
- require that students complete their work using a department or school-supplied online word processor or collaboration platform, so that their progress can be monitored in real time, including whether text has been copied from a generative AI tool into a document.
Limit certain uses of generative AI while permitting others – for example:
- select assessments that cannot be fully completed by a generative AI tool, such as an oral presentation, but where research using generative AI tools is permitted.
Modify tasks to reduce the scope for students to use generative AI tools uncritically and without acknowledgement – for example:
- require that students use or refer to specific sources when completing an assessment task
- require that students submit different elements of a written task, including brainstorming, drafting, revising, and editing
- ask students to demonstrate or apply knowledge of a topic in relation to their own personal experiences
- use open-ended questions that require students to think critically and creatively and apply local context or information that a generative AI tool would not have access to.
Incorporate the optional use of generative AI tools – for example:
- require that students explain the reasoning behind their answer, so that the teacher can assess whether students truly understand and can apply what they have written
- require that students cite any use of generative AI tools, including (where relevant) the prompts entered and the outputs provided
- allow students to discuss or reflect on their work (for example, a small group discussion, or a video reflection completed individually)
- support students to critically analyse and evaluate the outputs of a generative AI tool, including asking them to refine and improve their prompt to produce higher quality outputs and/or asking them to cross-reference outputs with credible and verifiable sources.
Encourage the use of generative AI tools to support improved learning outcomes – for example:
- ask students to brainstorm and refine generative AI prompts which can help them gather information on the set topic
- encourage students to copy their draft assignments into a generative AI tool to ask for specific suggestions about how they can be improved.
Require the use of generative AI tools where using the tools is the purpose of the assessment – for example:
- require that students use generative AI tools so that their ability to craft inputs and generate outputs, or to critically analyse and evaluate outputs, can be assessed.
Suggested actions: Designing assessments with consideration of generative AI tools
School leaders can:
- review the school’s assessment design approach with teaching staff
- work with staff and the school community (for example, through professional learning communities) to define acceptable student use of generative AI tools in assessments.
Teachers can:
- design assessments with a clearly selected approach to whether and how generative AI tools can be used
- communicate how generative AI tools can or cannot be used in assessments.
Identifying and responding to inappropriate use of generative AI tools in assessments
School leaders and teachers are encouraged to identify and respond to academic misconduct involving generative AI tools consistently and in line with any local academic integrity policy.
Examples of academic misconduct using generative AI tools can include:
- plagiarism – students using generative AI tools to complete their schoolwork (for example, essays, presentations, images, music, and other assignments) and then presenting this work as their own without attribution
- cheating – students using generative AI tools to help them cheat in assessments (for example, by using generative AI tools to produce responses to questions on tests and quizzes).
Identifying when students are using these tools dishonestly can be challenging given the sophistication of generative AI outputs. Schools can reduce the risk of misconduct by designing assessments with consideration of generative AI tools and by authenticating student work where misconduct is possible.
Authenticating student work
Schools can identify instances where submitted work may not have been wholly completed by the student, including work that:
- is significantly different from the student’s usual level of performance
- is not in the student’s usual writing style
- contains inconsistencies, such as different writing styles or different text formats
- is similar to the output from popular generative AI tools when provided with the assessment question.
Teachers are encouraged to continue using well-established processes to authenticate student work, including asking students to explain how they completed an assessment and to discuss or present their understanding of the main ideas.
Using plagiarism detection tools
While plagiarism detection tools may have some capacity to identify the potential use of generative AI tools, they often provide inaccurate or inconclusive results, including false positives (flagging original work as plagiarised), false negatives (failing to detect actual plagiarism) and self-plagiarism flags (where students legitimately reuse their own work). Teachers are encouraged to use alternative methods for validating the integrity of student work, such as asking students to explain submitted work that does not seem representative of their previously demonstrated knowledge or skills.
Plagiarism detection tools may require the collection and storage of student data, such as typed essays which may contain sensitive content. Any use of plagiarism detection tools by staff, or students at the direction of staff, must comply with the Digital Technologies – Responsible Use and Privacy and Information Sharing policies. If schools elect to use a third-party plagiarism detection tool, they must follow the steps outlined in the ICT Software in Schools – Risk Assessment policy.
Suggested actions: Identifying and responding to the inappropriate use of generative AI tools in assessments
School leaders can:
- support staff to identify and respond to inappropriate use of generative AI tools in a consistent and proportionate way, for example, through discussions or examples presented within professional learning communities.
Teachers can:
- identify potential inappropriate uses of generative AI tools in assessments
- authenticate student work using established and transparent strategies
- respond to inappropriate uses of generative AI tools in assessment tasks in a consistent and proportionate way.