FORE Australia Reporter: Dr Renee Wright
Publish Date: 07/9/2024

Problem Identification:
Australia has yet to establish an AI Safety Institute (AISI).
Australians for AI Safety (AAS) noted that Australia has not fulfilled its commitment under the Seoul Declaration to establish an AISI.
Without one, AI development could proceed unregulated, undermining the country's ability to ensure that AI remains safe and beneficial, and leaving Australia behind in managing the rapid advancement and inherent risks of AI.
Context:
Solution Identification:
Advice:
Endorsed by:
Australians for AI Safety
Public Support:
The Good Ancestors Project
Where to go to learn more:
Australians for AI Safety. (2023, July 26). Letter to the Minister for Industry, Science, and Technology. https://www.australiansforaisafety.com.au/july-2023-letter
Australians for AI Safety. (2023, July 21). Australian AI leaders join global call for safe AI. https://www.australiansforaisafety.com.au/media
Australian Human Rights Commission. (2021). Human rights and technology: Final report. https://humanrights.gov.au/our-work/rights-and-freedoms/publications/human-rights-and-technology-final-report-2021
Reference list:
Australians for AI Safety. (2024, May 10). Letter to the Select Committee on Adopting AI. https://www.australiansforaisafety.com.au/
Australians for AI Safety. (2023, July 21). Australian AI leaders join global call for safe AI. https://www.australiansforaisafety.com.au/media
Department of Industry, Science and Resources. (2023, November 2). The Bletchley Declaration by countries attending the AI Safety Summit, 1–2 November 2023. https://www.industry.gov.au/publications/bletchley-declaration-countries-attending-ai-safety-summit-1-2-november-2023
Australian Human Rights Commission. (2021). Human rights and technology: Final report. https://humanrights.gov.au/our-work/rights-and-freedoms/publications/human-rights-and-technology-final-report-2021
Bell, G., Burgess, J., Thomas, J., & Sadiq, S. (2023, March 24). Rapid response information report: Generative AI – language models (LLMs) and multimodal foundation models (MFMs). Australian Council of Learned Academies. https://www.chiefscientist.gov.au/sites/default/files/2023-06/Rapid%20Response%20Information%20Report%20-%20Generative%20AI%20v1_1.pdf
Department of Industry, Science and Resources. (2024, May 24). The Seoul Declaration by countries attending the AI Seoul Summit, 21–22 May 2024. https://www.industry.gov.au/publications/seoul-declaration-countries-attending-ai-seoul-summit-21-22-may-2024#seoul-declaration-1
Good Ancestors Project. (2023). Submission Paper: Commonwealth of Australia AI Inquiry. https://www.goodancestors.org.au/commonwealth-senate-inquiry-on-adopting-ai
Parliament of Australia. (2023, May). Public sector: New entities, and investments in companies. https://www.aph.gov.au/About_Parliament/Parliamentary_departments/Parliamentary_Library/Budget/reviews/2023-24/PublicSectorInvestments
Saeri, A. K., Noetel, M., & Graham, J. (2024). Survey assessing risks from artificial intelligence: Technical report. Ready Research, University of Queensland. https://aigovernance.org.au/survey/sara_technical_report
Department of Industry, Science and Resources. (2023, June). Safe and responsible AI in Australia: Discussion paper. https://storage.googleapis.com/converlens-au-industry/industry/p/prj2452c8e24d7a400c72429/public_assets/Safe-and-responsible-AI-in-Australia-discussion-paper.pdf
Conflict of interest / acknowledgment statement:
N/A
