AI Governance for Australian Healthcare Providers
Navigate TGA medical device regulations, AHPRA professional obligations, and clinical governance frameworks for artificial intelligence in healthcare settings.
For hospitals, health districts, medical device companies, general practices, and health technology developers operating under Australian healthcare regulation.
The Australian Healthcare AI Regulatory Environment
Australian healthcare organisations implementing artificial intelligence face oversight from multiple regulatory bodies.
The Therapeutic Goods Administration regulates AI-enabled medical devices and software. The Australian Health Practitioner Regulation Agency sets professional obligations for clinicians using AI tools. The Office of the Australian Information Commissioner enforces privacy requirements for health data. The Australian Commission on Safety and Quality in Health Care establishes clinical governance standards.
This regulatory framework continues to evolve. In February 2025, the TGA published outcomes from its AI regulation consultation, with Government approval granted for 14 key regulatory refinements. These include potential reclassification of certain AI clinical prediction tools into higher-risk categories and extended oversight to previously exempt software types. The National Health Privacy Rules commenced in April 2025, introducing stricter requirements for health claims data. Western Australia's Department of Health implemented mandatory AI policy requirements in September 2025.
Healthcare is classified as a high-risk setting for AI. Eighty-eight per cent of stakeholders responding to the proposed mandatory guardrails consultation said there should always be human oversight for healthcare AI decisions. Current adoption rates reflect both opportunity and complexity: ninety-one per cent of Australian hospitals now use AI-powered systems, yet only thirty per cent of Australians trust AI more than they fear it.
Key Regulatory Requirements
Multiple overlapping frameworks govern AI use in Australian healthcare settings.
TGA Medical Device Regulation
SaMD Grace Period Ended: 1 November 2024
Technology-agnostic regulatory requirements apply to software-based medical devices. Software with a medical purpose must be included in the Australian Register of Therapeutic Goods (ARTG). Classification is risk-based, reflecting the seriousness of the condition and the significance of the information to clinical decision-making.
- Class III: software diagnosing critical conditions
- Adaptive AI requiring new change control approaches
- Exemptions under review for digital health tools
AHPRA Professional Obligations
Published: August 2024
Individual health practitioners remain ultimately responsible for any AI used in their practice. GPs remain fully liable for errors in patient health records, regardless of whether an AI scribe generated them. AI should support clinical judgement, not replace it.
- Testing all tools before clinical use
- Understanding training data, biases, and limitations
- Addressing bias impacting Aboriginal and Torres Strait Islander communities
ACSQHC Clinical Governance
Guides Released: August 2025
Three guides for clinicians: AI Clinical Use Guide, AI Safety Scenario for Interpretation of Medical Images, and AI Safety Scenario for Ambient Scribe. Problem-driven implementation building on existing patient safety processes, digital health governance, and research ethics frameworks.
- Before: clinical use case validation, vendor evaluation, bias assessment
- While: human oversight protocols, ongoing monitoring, incident reporting
- After: post-market surveillance, change management for AI updates
Privacy Act and Health Data
National Health Privacy Rules: 1 April 2025
Health information is classified as sensitive information requiring higher protections. Personal information includes AI-generated clinical notes, diagnostic suggestions, and even hallucinations if they relate to an identifiable person. Using AI to generate or infer sensitive information requires consent unless exceptions apply.
- Cross-border data flows for offshore AI providers
- My Health Record mandatory breach notification
- Stricter requirements for MBS and PBS claims data
Common Healthcare AI Applications
AI is being deployed across diagnostic imaging, clinical documentation, and clinical decision support in Australian healthcare.
Diagnostic Imaging
AI-powered imaging detects cancers, strokes, and fractures. South Australia has deployed Annalise.ai across metropolitan and regional sites. Royal Prince Alfred Hospital uses algorithms that detect lung cancer with greater than 90% sensitivity.
Machine learning algorithms analyse over 8.5 million medical images annually.
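Sensitivity figures like the one above come from standard confusion-matrix arithmetic. A minimal sketch of the calculation, using hypothetical validation counts rather than figures from any deployed system:

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity (recall): share of actual positives the model detects."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives: int, false_positives: int) -> float:
    """Specificity: share of actual negatives the model correctly clears."""
    return true_negatives / (true_negatives + false_positives)

# Hypothetical validation counts -- illustrative only.
tp, fn, tn, fp = 184, 16, 920, 80
print(f"sensitivity = {sensitivity(tp, fn):.2%}")   # 92.00%
print(f"specificity = {specificity(tn, fp):.2%}")   # 92.00%
```

Governance teams evaluating a vendor's "greater than 90% sensitivity" claim should ask for both numbers: a highly sensitive model can still generate heavy false-positive workloads if specificity is poor.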
AI Scribes
Use among GPs rose from less than 3% in May 2024 to 8.24% in October 2024. Modern ambient AI scribes have error rates of roughly 1-3%, with failure modes distinct from human documentation errors: hallucinations, critical omissions, and misattribution.
Most currently fall outside TGA oversight, although 2025 compliance activities are targeting scribes with diagnostic features.
Clinical Decision Support
Software making suggestions for diagnosis or treatment is subject to TGA regulation. Hospitals use AI for predicting patient deterioration, streamlining emergency care, and reducing waiting times.
Such software must undergo pre-market approval and ARTG registration unless an exemption applies.
Addressing Algorithmic Bias
AI algorithms can perpetuate and exacerbate healthcare disparities. AHPRA guidance requires practitioners to address biases impacting Aboriginal and Torres Strait Islander communities and other diverse populations.
Sources of Bias
- AI algorithms trained on international datasets may not generalise to Australian populations
- Skin lesion classifiers are often trained on predominantly white patient images, with Black patients comprising only 5-10% of training data
- Insufficient sample sizes for certain patient groups produce models that perform poorly for those groups
Mitigation Requirements
- Review training data for representativeness
- Adversarial debiasing during model development
- Ongoing monitoring tracking performance across patient populations
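The ongoing-monitoring requirement above can be sketched as a per-subgroup performance check. This is an illustrative sketch only, with hypothetical group labels and audit records, not a prescribed ACSQHC or AHPRA method:

```python
from collections import defaultdict

def sensitivity_by_group(records):
    """Per-group sensitivity from (group, actual, predicted) triples.

    A large gap between the best- and worst-performing groups is a
    simple signal that a model may underserve some populations.
    """
    tp = defaultdict(int)
    fn = defaultdict(int)
    for group, actual, predicted in records:
        if actual:  # only actual positives count toward sensitivity
            if predicted:
                tp[group] += 1
            else:
                fn[group] += 1
    return {g: tp[g] / (tp[g] + fn[g]) for g in set(tp) | set(fn)}

# Hypothetical audit data: (group, actual_positive, predicted_positive)
records = [
    ("group_a", True, True), ("group_a", True, True),
    ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False),
    ("group_b", True, False), ("group_b", True, False),
]
rates = sensitivity_by_group(records)
# group_a reaches 3/4 detected; group_b only 1/4 -- a gap that size
# should trigger clinical review before continued deployment.
```

In practice the grouping variables, sample sizes, and acceptable performance gaps would be set during clinical validation and reviewed as part of post-market surveillance.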
Healthcare AI Governance Services
Specialist support for TGA compliance, clinical governance, privacy obligations, and practitioner guidance.
TGA Regulatory Compliance Support
Software as a Medical Device classification assessment, Australian Register of Therapeutic Goods registration support, clinical evidence requirements guidance, post-market surveillance framework development, adaptive AI change control processes.
Medical device regulatory pathway support
AUD $12,000 – $45,000
Depending on classification
Clinical Governance Framework Development
ACSQHC-aligned governance models, National Safety and Quality Health Service Standards integration, before-while-after implementation guides, human oversight protocols, clinical validation processes.
Framework development for single facility or practice
AUD $15,000 – $35,000
Health district or multi-site: $40,000 – $85,000
Privacy and Data Governance
Privacy Act compliance assessment, Australian Privacy Principles gap analysis for AI systems, My Health Record obligations, cross-border data transfer compliance, secondary data use frameworks.
Compliance assessment and gap analysis
AUD $8,000 – $15,000
Practitioner and Workforce Guidance
AHPRA obligations translation into operational policies, AI scribe implementation frameworks, patient consent templates, professional indemnity considerations, bias awareness training programmes.
Ongoing advisory retainer
AUD $3,500 – $8,000/month
Engagement Approach
1. Initial Assessment
Reviews current AI use or planned implementation against TGA requirements, AHPRA obligations, Privacy Act compliance, and ACSQHC clinical governance principles. Produces prioritised risk assessment and compliance roadmap.
2. Framework Development
Translates regulatory requirements into operational policies, procedures, and governance structures appropriate to the organisation's size and clinical context.
3. Implementation Support
Change management, workforce training, vendor evaluation assistance, and establishment of ongoing compliance monitoring processes.
2025-2026 Regulatory Outlook
The TGA received Government approval in January 2025 for further regulatory work based on fourteen findings from its AI consultation. Targeted consultations on regulatory refinements will continue through 2025 and 2026. Health technology companies and healthcare providers should monitor TGA communications for consultation opportunities.
The status of proposed mandatory AI guardrails remains uncertain. The September 2024 proposals paper outlined ten guardrail categories including accountability, risk management, data governance, testing protocols, and human oversight. Healthcare was identified as high-risk. However, the Productivity Commission has expressed concerns about potential chilling effects on innovation, and Australia currently sits at the permissive end of the regulatory spectrum compared to regional peers.
The Privacy and Other Legislation Amendment Act 2024 passed in late 2024, increasing transparency requirements for automated decisions using personal information. Implementation of these amendments will continue through 2025.
State-level initiatives are proceeding independently of Commonwealth frameworks. Western Australia's mandatory AI Policy became effective in September 2025. New South Wales established an Office for Artificial Intelligence to guide responsible adoption. Other states are developing their own governance approaches.
Healthcare organisations should prepare for increased regulatory scrutiny regardless of whether mandatory guardrails are legislated. The TGA's compliance activities are expanding, AHPRA guidance is now enforceable through professional standards, and the OAIC is actively engaging with AI providers in healthcare settings.
Request Healthcare AI Compliance Assessment
The initial assessment includes a TGA classification review, AHPRA obligations analysis, Privacy Act gap assessment, and clinical governance framework evaluation. A fixed fee is quoted based on organisation size and the scope of AI use.