
Establish an Australian AI Safety Institute




Author: Dr Renee Wright | Publish date: 07/9/2024


Problem Identification: 

Australia has yet to establish an artificial intelligence (AI) Safety Institute (AISI).


Australians for AI Safety (AAS) noted that Australia has not fulfilled its commitment under the Seoul Declaration to establish an AISI.


This could leave AI innovation unregulated, undermining the country’s ability to ensure that AI development remains safe and beneficial. As a result, Australia may fall behind in managing the rapid advancements and inherent risks associated with AI.


Context: 

In Australia, AI is governed by a patchwork of instruments, including the voluntary AI Ethics Framework, the AI Technology Roadmap, and the ongoing review of the Privacy Act 1988 (Cth). The Australian Human Rights Commission (AHRC) has described the current regulatory environment for AI as ‘patchwork at best’, reporting that this framework allows AI to proliferate without adequate protections against human rights harms.


Section 65 of the Public Service Act 1999 (Cth) (the Act) allows for the establishment of non-corporate Commonwealth entities, also known as statutory agencies or bodies, within the Australian Public Service. The power to establish such a body under the Act rests with the Governor-General (GG), who acts on the advice of Ministers.


As one of 28 signatories to the Bletchley Declaration, Australia affirmed that AI should be used safely and humanely, with risks mitigated through AISIs to bolster international cooperation and understanding. The Seoul Declaration, endorsed by Australia, calls for countries to establish and expand AISIs. AAS noted that despite these endorsements, Australia has not fulfilled its commitment under either declaration. 


AAS’s May 2024 submission to the Select Committee on Adopting AI recommended establishing an institution separate from the National AI Centre: an Australian AISI. They suggested that an Australian AISI could evaluate advanced AI systems, drive foundational safety research on frontier AI systems, and partner with national and international peers on AI safety. AAS stated this could ensure the security of cutting-edge AI models, drive AI safety research, and fulfil Australia’s commitment to international collaboration. Their submission was supported by 42 expert organisations.


There is significant international precedent: the EU, US, UK, Canada, Japan, and the Republic of Korea have all established, or are establishing, AISIs.


There is precedent for using s 65 of the Act, as evidenced by the creation of the Australian Submarine Agency (ASA) in 2023. 


Solution Identification: 

Establish an Australian AI Safety Institute under section 65 of the Public Service Act 1999 (Cth).


This could help ensure the safe development and security of advanced AI models, and fulfil Australia’s international commitments.


Advice: 

The Minister for Industry and Science should ask the Governor-General to establish an Australian AISI under section 65 of the Public Service Act 1999 (Cth) at the next opportunity.




Public Support: 

The Good Ancestors Project: https://www.goodancestors.org.au/


Where to go to learn more: 

  1. Australians for AI Safety. (2023, July 26). Letter to the Minister for Industry, Science, and Technology. https://www.australiansforaisafety.com.au/july-2023-letter

  2. Australians for AI Safety. (2023, July 21). Australian AI leaders join global call for safe AI. https://www.australiansforaisafety.com.au/media

  3. Australian Human Rights Commission. (2021). Human rights and technology: Final report. https://humanrights.gov.au/our-work/rights-and-freedoms/publications/human-rights-and-technology-final-report-2021


Conflict of interest/acknowledgment statement: 

N/A.


Support 

If your organisation would like to add your support to this paper, or suggest amendments, please email Info@foreaustralia.com


Reference List

Australians for AI Safety. (2024, May 10). Letter to the Select Committee on Adopting AI. https://www.australiansforaisafety.com.au/

Australians for AI Safety. (2023, July 21). Australian AI leaders join global call for safe AI. https://www.australiansforaisafety.com.au/media

Australian Government, Department of Industry, Science and Resources. (2023, November 2). The Bletchley Declaration by countries attending the AI Safety Summit, 1–2 November 2023. https://www.industry.gov.au/publications/bletchley-declaration-countries-attending-ai-safety-summit-1-2-november-2023

Australian Human Rights Commission. (2021). Human rights and technology: Final report. https://humanrights.gov.au/our-work/rights-and-freedoms/publications/human-rights-and-technology-final-report-2021

Bell, G., Burgess, J., Thomas, J., & Sadiq, S. (2023, March 24). Rapid response information report: Generative AI - Language models (LLMs) and multimodal foundation models (MFMs). Australian Council of Learned Academies. https://www.chiefscientist.gov.au/sites/default/files/2023-06/Rapid%20Response%20Information%20Report%20-%20Generative%20AI%20v1_1.pdf

Department of Industry, Science and Resources. (2024, May 24). The Seoul Declaration by countries attending the AI Seoul Summit, 21–22 May 2024. https://www.industry.gov.au/publications/seoul-declaration-countries-attending-ai-seoul-summit-21-22-may-2024#seoul-declaration-1

Good Ancestors Project. (2023). Submission Paper: Commonwealth of Australia AI Inquiry. https://www.goodancestors.org.au/commonwealth-senate-inquiry-on-adopting-ai 

Parliament of Australia. (2023, May). Public sector: New entities, and investments in companies. https://www.aph.gov.au/About_Parliament/Parliamentary_departments/Parliamentary_Library/Budget/reviews/2023-24/PublicSectorInvestments




