

Artificial Intelligence (AI) Transparency Statement

Purpose

The policy for the responsible use of AI in government provides mandatory requirements for departments and agencies relating to accountable officials and transparency statements. It sets out the Australian Government’s approach to embracing the opportunities of AI while providing for its safe and responsible use. This page outlines NIAA’s commitment to these policy requirements and to the ethical use of AI to enhance public services for Indigenous Australians, ensuring transparency, accountability and inclusivity.


NIAA approach to AI adoption and use

Opportunities and risks

The policy for the responsible use of AI in government describes AI as a family of technologies that brings together computing power, scalability, networking, connected devices and interfaces, and data. Systems built on these technologies can be programmed to perform specific tasks such as reasoning, planning, natural language processing, computer vision, audio processing, interaction and prediction, and can operate with varying levels of autonomy.

While these potential efficiencies present opportunities for improved service delivery outcomes, they also present risks. NIAA continues to review its internal policies to ensure that our:

  1. AI use is appropriately governed
  2. engagement with AI is confident, culturally safe and responsible
  3. stakeholders have trust in our use of AI
  4. risks are identified and addressed
  5. access and usage are monitored.

How NIAA uses AI

At this time, NIAA has not authorised the use of AI in any way that members of the public may directly interact with, or be significantly impacted by, without a human intermediary or intervention.

In September 2024, NIAA personnel gained access to the generative AI service Microsoft 365 Copilot. This service is used for ‘workplace productivity’ within the ‘corporate and enabling’ domain.

All staff are required to complete internal AI fundamentals training.

NIAA is considering trialling the adoption of AI as part of the Australian Government’s commitment to digital innovation, through its technical partnership with the Department of the Prime Minister and Cabinet (PM&C). See the ‘Adopting emerging technologies’ section of the Data and Digital Government Strategy.


AI safety and governance

Operating environment

NIAA sources its ICT network infrastructure and training from PM&C under the terms of a signed Memorandum of Understanding (MoU). Under the MoU, NIAA must abide by PM&C’s ICT policies and procedures. Where a statement in this document conflicts with a PM&C AI or ICT policy, the latter takes precedence.

Governance

The governance of all NIAA official systems leverages PM&C ICT policies, protocols and procedures, including the PM&C AI Policy. That policy has been developed to align with the policy for the responsible use of AI in government and outlines requirements for:

  1. measuring the effectiveness of AI services through governance and other processes
  2. maintaining compliance with applicable legislation, regulation and whole-of-government policies.

Please contact us if you have further enquiries.

This statement was published on 19 February 2025 and will be reviewed annually, or when:

  1. there is a significant change to the agency’s approach to AI, or 
  2. any other new factor impacts this statement.
