Generative Artificial Intelligence

Protecting privacy and personal data

Privacy and data risks

Generative artificial intelligence (AI) tools pose unique privacy and data protection risks that are important for school communities to be aware of. For example:

  • generative AI providers may request or require student data, including personal, sensitive or health information, as a condition of providing access to their tools
  • students or other members of the school community may upload personal, sensitive or health information belonging to themselves or others without being aware of the associated privacy and data protection risks
  • some generative AI tools have the capacity to match data inputs with other information about individuals, which increases privacy risk by building individual data profiles
  • it may be unclear how data entered into generative AI tools is stored and secured
  • it may be difficult or impossible for a generative AI model to forget personal, sensitive or health information once it has been uploaded
  • inputs copied from the internet or from untrustworthy sources, when pasted as a prompt into a generative AI tool, may cause the tool to behave in unexpected or unsafe ways. For example, an image copied from the internet and pasted into a multimodal generative AI tool may have invisible text embedded in it that the user is not aware of. This is sometimes referred to as ‘prompt injection’ (a simple illustration follows this list)
  • content generated by generative AI about a person can constitute a new collection of personal information about that individual, which may be unreasonably intrusive and therefore a breach of privacy.
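
The sketch below is illustrative only, not a departmental tool or an endorsed check; it assumes Python, and the helper shown is written for this example rather than taken from any library. It shows one simple way hidden content can travel inside copied text: scanning text pasted from the internet for zero-width and other invisible characters, which can be used to conceal or obfuscate instructions that a generative AI tool will still read.

    import unicodedata

    def find_hidden_characters(text: str) -> list[tuple[int, str]]:
        """Return (position, character name) for characters a reader cannot see."""
        hidden = []
        for i, ch in enumerate(text):
            # Unicode category 'Cf' covers zero-width and other formatting
            # characters that display as nothing but remain part of the text.
            if unicodedata.category(ch) == "Cf":
                hidden.append((i, unicodedata.name(ch, "UNKNOWN")))
        return hidden

    # Example: text that looks ordinary to a reader but contains invisible characters.
    pasted = "Summarise this ar\u200bticle for me.\u2060"
    flags = find_hidden_characters(pasted)
    if flags:
        print(f"{len(flags)} invisible character(s) found - review before using as a prompt.")

A check like this does not cover the image example above; it simply shows that content which looks clean to a reader can still carry characters or instructions that a generative AI tool will process.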

Suggested actions: Protecting privacy and personal data

School leaders can:

  • select tools that limit the unnecessary collection or processing of personal, sensitive and health information (either by default or by allowing schools to adjust the settings of the tool)
  • adjust the settings of generative AI tools used by students and staff to protect their privacy
  • monitor the use of tools to ensure that personal information is not uploaded to tools that share this data with third parties, use it to train a generative AI model, or save or store it for future use by the tool's provider.

Teachers can:

  • only collect or upload data necessary for the task at hand, de-identify data before uploading it, use strong encryption to protect data, and only give data access to authorised personnel (a short de-identification sketch follows this list)
  • direct students to avoid uploading personal, sensitive and health information to generative AI tools
  • guide students to be careful when using prompts based on inputs copied and pasted from the internet or from untrustworthy sources.
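
As one illustration of the de-identification step above, the following sketch masks two common identifier types (email addresses and Australian-format phone numbers) before text is shared with a generative AI tool. It is a minimal example under stated assumptions, not a complete safeguard: the patterns are illustrative, and pattern matching will not catch names, addresses or other context that can identify a person.

    import re

    # Illustrative patterns only; they cover common formats, not every way
    # personal information can appear in text.
    PATTERNS = {
        "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "phone": re.compile(r"\b0[2-478](?:[ -]?\d){8}\b"),  # 10-digit Australian numbers
    }

    def mask_identifiers(text: str) -> str:
        """Replace matched identifiers with a labelled placeholder such as [EMAIL]."""
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label.upper()}]", text)
        return text

    note = "Contact Sam's parent on 0412 345 678 or parent@example.com about the excursion."
    print(mask_identifiers(note))
    # Prints: Contact Sam's parent on [PHONE] or [EMAIL] about the excursion.
    # The student's name is untouched - a reminder that simple pattern matching
    # is not full de-identification.

De-identified text should still be treated carefully: combined details can re-identify a person even when obvious identifiers are removed.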

Reviewed 17 June 2024
