What AI regulation Australia 2026 means for your business: new Privacy Act transparency rules, APRA expectations, and three steps to take before December 2026.
Most Australian business owners assume AI is unregulated. That assumption is now wrong, and continuing to act on it could prove costly.
TL;DR: From 10 December 2026, the Privacy and Other Legislation Amendment Act 2024 requires Australian businesses to disclose how automated decision-making systems use personal data. Separately, APRA issued a formal letter to the financial sector in April 2026 demanding a step-change in AI governance. This article explains what Australia's 2026 AI regulation actually requires, who it applies to, and three concrete steps to prepare.
Australia does not have an AI-specific Act equivalent to the EU’s AI Act. What it does have is a fast-moving combination of privacy law amendments, financial-sector supervisory expectations, and a government proposals process for mandatory guardrails in high-risk settings, each with its own timeline and scope.
Here is how the landscape breaks down.
What the December 2026 Privacy Act Amendment Requires
The most broadly applicable private-sector obligation comes from the Privacy and Other Legislation Amendment Act 2024, which amends the Australian Privacy Principles. From 10 December 2026, Australian Privacy Principle (APP) entities must disclose in their privacy policies:
- The types of personal information used in substantially automated decisions
- The nature of decisions made solely or significantly by computer programs
- Which of those decisions could reasonably be expected to significantly affect the rights or interests of individuals
This applies to private-sector organisations with annual turnover exceeding $3 million, as well as health-service providers, credit reporting bodies, and Consumer Data Right participants regardless of size.
If your business uses AI to screen job applicants, score customer creditworthiness, determine pricing, segment customers for different offers, or route support tickets, and it uses personal data to do so, you are in scope.
The key phrase is “substantially automated.” The system does not need to be fully autonomous: if a computer program makes, or significantly assists in making, a decision about a person, the disclosure obligation applies.
Full details of the amendment framework are available from the Office of the Australian Information Commissioner (OAIC).
What APRA Now Expects from the Financial Sector
For businesses in financial services, insurance, and superannuation, the obligations go further. On 30 April 2026, APRA published a letter to industry on artificial intelligence calling for a “step-change” in AI risk management and governance. The letter followed a targeted supervisory review across all APRA-regulated industries.
APRA found that AI use is moving from experimentation to customer-facing applications faster than governance arrangements are maturing. Boards were found to lack sufficient technical literacy to challenge management on AI risk. Some entities face concentration risk from dependence on a single AI provider across multiple functions.
APRA’s stated expectations for regulated entities:
- A framework for safe AI adoption with clear ownership across the AI lifecycle
- An inventory of AI tools and use cases across the organisation
- Human oversight for high-risk AI decisions
- Staff training on AI use, misuse, and limitations
APRA was explicit: where entities fail to manage AI risks in a manner proportionate to their size and complexity, it will take supervisory action. More detail is available on APRA’s news and publications page.

The Broader Regulatory Direction: Mandatory Guardrails
Beyond the Privacy Act amendments and APRA’s expectations, the Department of Industry, Science and Resources released a proposals paper on mandatory guardrails for AI in high-risk settings. The consultation closed in late 2024, and implementation timelines for private-sector mandatory guardrails remain under development.
The 10 proposed guardrails cover testing, transparency, accountability, and data governance across the AI lifecycle. They are most likely to land as enforceable obligations for businesses operating AI in high-risk contexts: healthcare, financial services, employment, and systems that make consequential decisions about individuals at scale.
Australia’s standards-led approach means mandatory requirements will arrive incrementally rather than through a single Act. But the direction is clear, and businesses that wait for legislation to force the issue will be building governance frameworks under pressure rather than by design.
Three Steps to Take Before December 2026
1. Update your privacy policy to reflect automated decision-making. If your business uses any software that makes or significantly influences decisions about customers or employees using personal data, you need to describe this in your privacy policy by 10 December 2026. The description does not need to be technical, but it does need to be clear about what types of data are used and what kinds of decisions are made (an illustrative wording follows after this list). If your current privacy policy does not mention automated decision-making at all, it almost certainly needs updating.
2. Build an AI inventory. List every system your business uses that touches decisions about people. Include off-the-shelf tools: an AI-powered CRM scoring leads, a chatbot routing support tickets, an automated pricing engine. For each one, note what personal data it uses and what decision it influences (a minimal sketch of one way to structure a record appears after this list). For APRA-regulated entities, this inventory is an explicit expectation, not optional. For all other APP entities, it is the foundation of privacy policy compliance and the practical first step toward any future regulatory obligations.
3. Assign accountability. Someone in your business needs to own AI governance. In a small business this is typically the founder or a senior operations lead. The practical output of that ownership is knowing which systems you run and what they do, and being able to answer a customer's question about why a decision was made in plain language, not technical jargon.
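To make step 1 concrete, here is one way an automated decision-making disclosure could read. This wording is illustrative only, not an OAIC-endorsed template; check it against the OAIC's guidance before adopting it:

> “We use software, including AI-assisted tools, to help make or inform decisions about lead prioritisation, pricing, and service eligibility. These systems use personal information such as contact details, transaction history, and account activity. Some of these decisions could reasonably be expected to significantly affect your rights or interests. You can contact us to ask how a decision about you was made.”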
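For step 2, the inventory itself can live in a spreadsheet. For teams that want something more structured, the Python sketch below shows one possible shape for an inventory record, plus a rough first-pass scope filter. The field names and the filter logic are our assumptions about what is useful to capture, not a format prescribed by the OAIC or APRA:

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One row of an AI inventory. Field names are illustrative, not prescribed."""
    name: str                      # e.g. "CRM lead scoring"
    vendor: str                    # who supplies or hosts the tool
    personal_data_used: list[str]  # e.g. ["contact details", "purchase history"]
    decision_influenced: str       # e.g. "which leads receive follow-up"
    human_reviews_output: bool     # is there human oversight of the decision?
    owner: str                     # the person accountable for this system

def likely_needs_disclosure(record: AISystemRecord) -> bool:
    """Rough first-pass filter: flags systems that use personal data to make
    or assist decisions about people. A lawyer, not this function, should
    make the final scope call."""
    return bool(record.personal_data_used) and bool(record.decision_influenced)

# Example usage with a hypothetical tool
crm = AISystemRecord(
    name="CRM lead scoring",
    vendor="Example CRM Pty Ltd",
    personal_data_used=["contact details", "engagement history"],
    decision_influenced="which customers receive follow-up offers",
    human_reviews_output=True,
    owner="Head of Operations",
)
print(likely_needs_disclosure(crm))  # True
```

Even if you never run the code, the fields are the point: if you can fill in every field for every system you use, you have the inventory APRA expects and the facts your privacy policy disclosure needs.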
What Good Looks Like
The Governance Institute of Australia notes that businesses handling AI governance well share a few traits: executive-level accountability, an active AI inventory, procurement practices that require vendors to supply explainability documentation, and a clear escalation path for complaints.
None of this requires a compliance team. A sole trader with three automated tools can build a workable governance framework in a day. The businesses that will struggle are the ones that have not started thinking about it.
For businesses that want help assessing their current AI exposure and building a governance framework that meets December 2026 requirements, Avatar Studios’ Strategy & Advisory practice works with Australian SMBs on AI readiness reviews and digital transformation strategy that accounts for regulatory risk from the start. If you are in the earlier stages of your AI journey, our AI & Automation services can help you implement tools with the documentation and governance they require.
Frequently Asked Questions
Does the December 2026 Privacy Act amendment apply to my small business?
It applies to businesses with annual turnover exceeding $3 million, and to health-service providers, credit reporting bodies, and CDR participants regardless of size. If your business is below the $3 million threshold and does not fall into those categories, you are outside the scope of this specific Privacy Act amendment, but existing consumer and anti-discrimination law still governs automated decisions that affect your customers.
What exactly do I need to disclose in my privacy policy?
From 10 December 2026, your privacy policy must describe the types of personal information used in substantially automated decisions, the nature of those decisions, and which of them could reasonably be expected to significantly affect individual rights or interests. The OAIC has indicated it will provide guidance templates for businesses to use.
Does this affect businesses that use tools like HubSpot or AI-powered HR platforms?
Yes, if those tools use personal data to make or assist in decisions about individuals. A HubSpot lead-scoring model that determines which customers receive follow-up is within scope. An HR platform that ranks candidates is within scope. The obligation falls on the business using the tool, not just the tool vendor.
What is the difference between the Privacy Act amendment and APRA’s AI governance requirements?
The Privacy Act amendment applies to all APP entities (private businesses above the $3 million turnover threshold, plus the specified smaller categories) and requires disclosure in privacy policies. APRA’s requirements apply specifically to banks, insurers, super funds, and other APRA-regulated entities, and expect broader governance frameworks: not just disclosure, but active risk management, inventories, and human oversight.
Where can I read the actual government documentation?
Key sources: the mandatory guardrails consultation paper from the Department of Industry, the APRA letter to industry on AI governance, the OAIC’s submission guidance, and for government-sector businesses, the Policy for responsible use of AI in government.