
Generative Artificial Intelligence

Appropriate use of generative AI tools

Teaching and learning about generative AI

Schools are encouraged to help prepare students to understand and use generative artificial intelligence (AI) tools by supporting them to learn:

  • what generative AI is, how it works, and its opportunities and risks
  • how to use generative AI tools in a safe and responsible way, while avoiding potential harms
  • how to critically analyse and evaluate the functioning and outputs of generative AI tools.

Schools are encouraged to closely supervise student use of digital technology in the classroom, including generative AI tools, as detailed in the Digital Technologies – Responsible Use policy.

Setting up generative AI tools in a way that is safe and effective

In addition to the requirements outlined in the Policy tab, schools are strongly encouraged to configure the settings of generative AI tools to protect the privacy of users and restrict harmful content. For example, schools may consider (a minimal configuration sketch follows this list):

  • adjusting the possible prompts that can be inputted into the tool
  • adjusting the possible length of any outputs of the tool
  • reducing the default ‘temperature value’ of the tool, so that it provides less variable, more focused outputs
  • adjusting privacy settings so that student inputs (prompts) are not disclosed to others or used to train the generative AI model
  • setting up the tool with customised standing instructions that are relevant to the local context and that will apply to all responses provided by the tool (schools are encouraged to include in these instructions a requirement that the tool not breach copyright in its responses).
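As a rough illustration of the settings above, the sketch below uses the OpenAI Python SDK. This is one example only, not a recommended or endorsed tool: the model name, instruction text and parameter values are placeholders, and other platforms expose equivalent controls under different names. Note that data-sharing and training opt-outs are usually account-level settings managed outside the code itself.

```python
# Illustrative only: configuring standing instructions, output length and
# temperature via the OpenAI Python SDK. The model name and instruction
# text are placeholders; adapt them to the school's approved tool.
from openai import OpenAI

client = OpenAI()  # assumes an API key is already configured in the environment

SCHOOL_INSTRUCTIONS = (
    "You are a classroom assistant for secondary school students. "
    "Use age-appropriate language, decline requests for harmful content, "
    "and do not reproduce copyrighted material in your responses."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    temperature=0.2,      # lower temperature: less variable, more focused outputs
    max_tokens=300,       # caps the possible length of any output
    messages=[
        # customised standing instructions applied to every response
        {"role": "system", "content": SCHOOL_INSTRUCTIONS},
        {"role": "user", "content": "Summarise the water cycle for Year 7."},
    ],
)
print(response.choices[0].message.content)
```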

Teaching students about prompting generative AI tools

Generative AI systems are typically operated by users entering a ‘prompt’ to guide the underlying model’s output. Prompts are often text-based, but can also include voice, images and other inputs. A user’s prompt may not initially produce the desired output from a generative AI tool, so in many cases the prompt must be refined. This iterative refinement is known as ‘prompt engineering’.

Where a school chooses to use generative AI tools, staff and/or students should be supported to reflect on and create effective prompts (a brief illustration follows the list below). They can do this by:

  • being specific about what kind of content to produce, including length, genre, style, what tone to use, and what audience to create for
  • asking the tool to act as a type of person or thing, such as what professional expertise or age bracket to assume, what species, object or concept to act as, or what historical figure’s voice to use
  • stipulating the format, such as whether to produce an essay, a poem, or a type of computer code
  • setting parameters, such as what to include or not include in the output
  • providing examples to demonstrate a feature of the desired output, such as tone, structure, or subject matter
  • using an iterative process, such as beginning with one prompt, then tweaking it with each subsequent prompt to improve the output, or change certain aspects of the output
  • asking the tool to help create a more effective prompt, by seeking more information from the user to refine desired output.
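
The sketch below illustrates this iterative process in Python. The send_prompt() helper is hypothetical, standing in for whichever approved tool or API a school uses; the prompts themselves simply demonstrate the elements listed above.

```python
# A sketch of iterative prompt refinement. send_prompt() is a hypothetical
# helper standing in for the school's approved generative AI tool or API.

def send_prompt(prompt: str) -> str:
    """Hypothetical: sends a prompt to a generative AI tool, returns its response."""
    raise NotImplementedError("Replace with a call to the school's approved tool.")

# First attempt: vague, so the output is unlikely to suit the task.
first_prompt = "Write about volcanoes."

# Refined prompt: adds a persona, audience, length, tone, format and parameters.
refined_prompt = (
    "Act as a geography teacher. "                          # persona to assume
    "Write a 150-word explanation of how volcanoes form "   # length and topic
    "for Year 8 students in plain, encouraging language. "  # audience and tone
    "Structure it as three short paragraphs. "              # format
    "Do not include statistics about casualties."           # parameter: what to exclude
)

# A further iteration could paste in an example paragraph to demonstrate the
# desired tone, or ask the tool itself to suggest a more effective prompt.
```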

Teaching students to critically analyse and evaluate generative AI outputs

A generative AI tool’s outputs are based on the data it is trained on, which can lead it to reproduce inaccuracies or biases inherent in the training data. Effective prompting cannot always prevent this, so it is essential that generative AI outputs are critically analysed and evaluated.

Teachers can support students to critically analyse and evaluate generative AI outputs by teaching them to compare and verify outputs with reliable alternative sources of information. This can help to identify inconsistencies and biases in the generative AI output, while also providing an opportunity for reflection on why these inconsistencies and biases exist. It can also provide opportunities for students to learn about what makes a source of evidence reliable, and how they can evaluate different sources for their reliability.

Teachers can support students to consider the following questions:

  • Is this output relevant, applicable and fit-for-purpose?
  • Is this output true, accurate and current?
  • Is this output logical, coherent and internally consistent?
  • Is this output free from discriminatory or harmful content?
  • How can this output be improved with human modification?

Suggested actions: Teaching and learning about generative AI

School leaders can:

  • support teachers and staff to engage with professional learning about generative AI tools: how they work, their potential opportunities, and their limitations and biases
  • model the effective, safe and responsible use of generative AI and ensure that privacy, copyright and other legal obligations are consistently met.

Teachers can:

  • base any use of generative AI tools on evidence-based learning and wellbeing practices, as illustrated in the Victorian Teaching and Learning Model
  • engage with professional learning about generative AI tools: how they work, their potential opportunities, and their limitations and biases
  • initiate classroom discussions with students about generative AI tools, how they work, and how they can be used safely and appropriately
  • as student usage of generative AI tools increases, extend student learning about how generative AI works
  • model the safe and responsible use of generative AI.

Supporting safe and respectful use

Some of the inappropriate or harmful uses of generative AI tools include:

  • misrepresentation of information (for example, through the creation and dissemination of information that is wholly or partly inaccurate, that reduces a person or group’s ability to accurately understand something, or that causes harm to others, whether intentionally or not)
  • harassment, including cyberbullying (for example, through the creation and dissemination of ‘deep fakes’)
  • privacy violations (for example, by uploading personal, sensitive or health information about another person into a generative AI tool and/or using that information to generate new content about that person)
  • cultural appropriation (for example, by creating new artefacts influenced by or mimicking a cultural tradition in a way that is disrespectful or offensive)
  • discrimination (for example, using generative AI tools to produce and disseminate biased content that reinforces harmful stereotypes or unfairly excludes individuals, communities or groups)
  • entering information that may breach another person’s intellectual property rights.

More detail: Misrepresentation of information

Generative AI tools can misrepresent information, or make it easier for users to do so, by generating content that is untrue or harmful while seeming true or convincing. Misrepresentation of information can be intentional (disinformation) or unintentional (misinformation). It is important that students are guided not to misrepresent information or to disseminate disinformation or misinformation using generative AI tools. Below are 2 examples of misrepresentation that schools can consider protecting students from and guiding students not to produce:

  • Deep fakes: Generative AI tools can be used to create ‘deep fakes’: fake multimedia content that portrays a person, or aspects of a person, saying or doing something they did not in fact say or do. When using generative AI tools, it is important that students are taught to build empathy and to understand the impacts and consequences of creating such content. Deep fakes can cause serious psychological harm in addition to spreading disinformation, so it is important that this issue is given weight in schools and handled with care.
  • Biased content: The outputs of generative AI tools can be inaccurate or biased, even when expressed in confident language. Generative AI tools can also adapt to the preferences of users, which may amplify dominant ideas and exclude more diverse views, creating ‘echo chambers’ and reinforcing biases. When using generative AI tools, it is important to build compassion and critical thinking in students and to support diversity.

Schools are encouraged to collaborate with students, parents and carers to establish safeguards against the uses above, protecting the safety, wellbeing and inclusion of staff and students. Actions can include setting clear expectations, and consequences for instances where those expectations are not followed.

Suggested actions: Supporting safe and respectful use

School leaders can:

  • work with students, parents and carers to establish school-wide expectations on what appropriate use of generative AI tools looks like – this can be included in the school’s Acceptable Use Agreement (AUA)
  • consider inappropriate or harmful use of generative AI tools when reviewing the school’s Bullying Prevention policy (staff login required).

Teachers can:

Demonstrating transparency and accountability

Human responsibility

Schools are strongly encouraged to ensure that individuals who use generative AI tools retain agency, judgement and responsibility, and that outputs from a generative AI tool are subject to human review before decisions are made. For example, it is important that the professional judgement of teachers remains central when assessing and evaluating student progress, sequencing student learning tasks, or providing feedback on student work.

It is also important that generative AI tools enhance rather than replace teacher and school voice. Authentic communication between teachers and students, and each school and its broader community, is crucial to building and maintaining trust.

Disclosure

Schools can build trust by disclosing the use of generative AI tools, especially when use may have an impact on others. Disclosure can be given to teachers, staff, students, parents and carers.

Example wording for simple disclosures may include:

  • 'This resource has been (wholly, substantially or partly) generated using generative artificial intelligence.'
  • 'This resource has been adapted from an output generated using generative artificial intelligence.'
  • 'This resource has been edited using generative artificial intelligence.'

A more detailed disclosure may follow the suggested format below, which has been developed by Monash University:

I acknowledge the use of [insert AI system(s) and link] to [specific use of generative artificial intelligence]. The prompts used include [list of prompts]. The output from these prompts was used to [explain use].

Monitoring

Where a school chooses to use generative AI tools, it is encouraged to establish processes to monitor how well these tools are implemented and what impact they have on learning and wellbeing outcomes. This can inform decisions about adjusting the tools’ parameters or changing which tools are in use.

Alongside gathering evidence on impacts, schools are encouraged to consider the administrative and financial costs associated with the use of generative AI. Processes for monitoring can include staff, student, parent and carer surveys, as well as student performance data in subjects and year levels where generative AI tools are used.

Suggested actions: Demonstrating transparency and accountability

School leaders can:

  • provide school communities with easy-to-understand information about generative AI tools being used, why they are being used, and any risks associated with their use
  • communicate to all members of the school community their responsibilities and accountabilities when using generative AI tools
  • adopt disclosure approaches for instances where generative AI is used (for example, using the wording provided above)
  • appropriately acknowledge the use of generative AI tools in school administration in line with school disclosure policies
  • establish processes for keeping records of the use of generative AI tools
  • establish processes for monitoring the use of generative AI tools.

Teachers can:

  • exercise their own judgement when using generative AI tools and take responsibility for decision making where generative AI tools are used
  • appropriately acknowledge the use of generative AI tools in teaching practice in line with school disclosure approaches.

Reviewed 12 September 2024
