On 25 July 2025, the TGA published the Report: Clarifying and strengthening the regulation of Medical Device Software including Artificial Intelligence (Report). The Report details the TGA's recent consultation efforts and provides an overview of the 53 responses received. Details regarding the consultations are set out in our earlier article, AI and Healthcare: Summary of Commonwealth Consultations.
The Report sets out the TGA's 14 findings arising from its stakeholder consultations. In this article, we highlight relevant findings for sponsors, suppliers, and manufacturers of therapeutic goods incorporating AI, including Software as a Medical Device (SaMD).
Overview of the latest TGA report
The Report makes it clear that the TGA will consider amending several key defined legislated terms to ensure that existing legislation remains fit-for-purpose in the AI-era.
The Report acknowledged that although use of AI products is prevalent in the healthcare sector, there is limited understanding about when AI products are regulated by the TGA. The TGA found that targeted action is needed to improve regulatory compliance by the technology sector, including in connection with existing laws governing SaMD. The TGA proposes to achieve this through increased education, further guidance and targeted compliance action to 'remove unapproved medical devices from the Australian market'.
We can expect to see increased regulatory action in relation to AI-enabled devices that qualify as a 'medical device' under the Therapeutic Goods Act 1989 (Cth) (TG Act).
The definition of a 'medical device' is broad and includes any product or software intended for human use to diagnose, prevent, monitor, predict, prognose, treat or alleviate disease, injury or disability, without primarily relying on chemical or drug-based action, and also includes devices to control or support contraception. For the full definition, see section 41BD of the TG Act.
This will have particular ramifications for AI-enabled tools such as digital scribes, that generate clinical documentation and assist with diagnosis or treatment decisions, as these functions may (already) bring this software under the TGA's regulatory oversight.
Notably, digital mental health tools are currently excluded from the definition of a medical device and are not regulated by the TGA as therapeutic goods. However, the TGA has formed the view that the exclusion of digital mental health tools is not appropriate and recommends an urgent review into how these tools are regulated. This means digital mental health tools may soon be brought under the therapeutic goods regulatory regime, especially if they are AI-enabled, or perform functions beyond low-risk wellness support. This should prompt developers to reassess compliance obligations – noting that it is an offence to promote, manufacture or supply unregistered therapeutic goods in Australia (and such conduct can attract serious criminal and civil penalties, particularly where there is a risk of patient harm).
The TGA also recommended ongoing monitoring and review of health and wellness apps, noting that some may in fact meet the definition of a medical device. We can expect to see regulatory action in relation to wellness apps and devices that are in fact performing as medical devices, including by requiring sponsors to seek inclusion on the Australian Register of Therapeutic Goods, enforcing compliance with safety and performance standards and potentially reclassifying certain apps based on their intended use and risk profile.
Notably:
- The TGA observed that the definition of 'manufacturer' is not easily applicable to the technology sector. The TGA has recommended further review of this and other legislative definitions to ensure they align with common AI concepts, such as 'developer' and 'deployer'.
- The TGA acknowledged that traditional supply controls in the therapeutic goods sector (e.g. limited supply channels through pharmacies or medical practices) are not well suited to software as a service, where software can be hosted offshore (where it may not be regulated in the same way) and can be rapidly updated or adapted, without those changes being considered through a regulatory lens. The consultation found 'general confusion' about how newer supply methods, including for SaMDs, are regulated. The TGA will consider whether the definition of 'supply' should be expanded to include digital access to software products.
- The TGA found that further consultation is needed to clarify how responsibility under the TG Act should be assigned when AI replaces human services, particularly where the individual deploying the AI system is unaware that its outputs may constitute an offence under the Act.
- The TGA found that some digital scribes may (already) meet the definition of a medical device, particularly where they suggest diagnoses or treatments. The TGA recommended an urgent review of digital scribes to assess their compliance with regulatory requirements.
- The TGA found that its current medical device classification rules are 'largely appropriate' for devices incorporating AI, except for those that provide prognostic or predictive outputs. These types of software are often classified as low-risk and do not require third-party assessment prior to deployment. The TGA recommended a future review of classification rules for predictive and prognostic SaMDs but concluded that no broader changes are needed at this time.
- AI can be easily adapted after deployment. This means that although an AI system might meet safety, performance and effectiveness controls at deployment, its performance and functionality can change over time. The TGA noted the current regulatory framework may not be appropriate to manage change control. The TGA recommended that urgent guidance be produced on technical requirements for adaptive and generative AI.
- The TGA noted there is a real risk if AI systems are trained on datasets of unknown provenance, stating 'the development of innovative devices trained on unvalidated open datasets is an emerging issue for regulators'. The TGA recommended urgent guidance for the use of open datasets and software of unknown provenance.
- The TGA emphasised the need to equip consumers and healthcare professionals with resources to understand the limitations of AI-enabled products, enabling more informed decisions about their appropriate use.
- The TGA also recommended developing guidance for developers and deployers on appropriate labelling, warnings, and instructions to help users mitigate known risks associated with AI-enabled products. It proposed that this material be made accessible through channels such as social media.
Interestingly, the TGA concluded that:
- no changes are currently required to the existing essential principles, but further guidance is required;
- a technology-agnostic approach be maintained, avoiding specific definitions for AI subtypes such as general-purpose AI or large language models;
- the majority of existing software exclusions remain appropriate, but additional guidance is needed to clarify their application;
- a review of the TGA website is necessary to improve accessibility and usability of regulatory information;
- ongoing harmonisation with international regulatory approaches is encouraged to reduce burden and avoid disruption to the supply of innovative devices; and
- Australia’s regulatory framework should remain responsive and reactive, rather than proactively prescriptive, in managing AI-related risks.
Key findings most relevant to sponsors, manufacturers and developers of therapeutic goods:
Findings 1 & 2: Definitions under review.
Stakeholder feedback highlighted that current definitions in the TG Act may require further consultation to ensure they remain applicable to software-based medical devices. Terms like 'developer', 'deployer' and 'distributor' are not used in the Act, but are common terminology used to refer to entities involved throughout the AI lifecycle. They align with the legislated definitions of 'manufacturer' and 'sponsor' to an extent, but gaps remain.
A 'manufacturer' of a medical device is defined as the 'person who is responsible for the design, production, packaging and labelling of the device before it is supplied under the person's name' (TG Act, s 41BG). Stakeholders submitted that this definition is not recognised by entities operating in the software sector and may not apply to clinical practitioners involved in manufacturing activities in clinical practice.
A 'sponsor' of a medical device is defined as 'a person who exports or arranges the exportation of goods from Australia' or 'a person who imports, or arranges the importation of, the goods into Australia' (TG Act, s 3). Stakeholders noted confusion may arise when applying this definition to software-based medical devices, such as when a software product is made available by an overseas provider through an online platform hosted by an Australian entity.
In light of these issues, the TGA is considering whether to refine legislated definitions, or if further guidance should be provided to clarify how legislated definitions apply to industry-specific terminology.
Finding 3: Appropriate assignment of responsibility when AI gets it wrong.
The current wording of the TG Act assigns penalties for offences carried out by 'a person'. Stakeholder feedback highlighted this as a particular pain point where an AI system's output constitutes an offence and the AI system replaces human involvement, or where the person or entity deploying the AI system was not aware that its deployment constitutes an offence under legislation.
The Report recommends further consultation to determine whether the language of offence provisions under the TG Act remains fit-for-purpose in the AI era.
Finding 4: AI digital scribes under scrutiny.
Stakeholder feedback noted that clinical use of software products (even with a human in the loop) can lead to patient harms. It was noted users are often unaware that AI or machine learning is operating within the clinical workflow. Accordingly, the Report confirms targeted action is required to promote compliance by the software sector with regulatory obligations. The TGA has highlighted digital scribes as a particular area of focus, including to determine whether some function as medical devices, rather than merely as administrative support tools. In this regard, the TGA is collaborating with other regulators such as the Australian Health Practitioner Regulation Agency (Ahpra) and the Australian Commission on Safety and Quality in Health Care (ACSQHC).
Finding 5: Impending review of medical device classification rules.
Stakeholder feedback indicated the current medical device risk classification rules remain largely appropriate for use with respect to devices incorporating AI. However, feedback did note that medical devices with prognostic or predictive functions (currently classified as low risk) can in fact have significant flow-on impacts on patient care and clinical outcomes. As a result, the Report foreshadows potential future review of existing classification rules and their applicability to such 'low risk' devices.
We discuss medical device classification rules in more detail in our article, Innovation meets regulation: Medical devices and artificial intelligence.
Finding 8: Previously exempt digital mental health tools may be subject to therapeutic goods regulation.
The Report found the current exclusion for digital mental health tools is no longer appropriate, particularly where these tools are supplied to consumers without clinical oversight. Excluded health and wellness applications incorporating more therapeutic functionality will also be subject to monitoring, as they begin to border on the definition of a 'medical device'. The TGA's review will be conducted in collaboration with the ACSQHC.
Finding 9: Potential changes to advertising regulations.
The Report noted stakeholder feedback identified a general desire for more transparency around medical devices incorporating AI systems, including:
- whether the device operates using an AI system;
- information about the AI system's training datasets;
- whether AI systems are static or subject to ongoing updates which mean AI outputs may change as a result; and
- AI-related risks that consumers should be aware of when using the product.
The Report recommends a review of the advertising provisions in the Therapeutic Goods Regulations 1990 (Cth) and the Therapeutic Goods (Therapeutic Goods Advertising Code) Instrument 2021 (Cth), as a means of promoting better transparency in the medical devices sector. Such changes may require manufacturers and sponsors to provide additional device information for inclusion in the Australian Register of Therapeutic Goods.
Finding 11: Change control for Generative AI.
The Report acknowledges that Generative AI (Gen-AI) systems pose greater consumer risks as they have the capacity to change their functionality over time, without direct manufacturer oversight. The TGA's regulatory framework traditionally caters for point-in-time assessments, and the regulator has acknowledged that a requirement for frequent reassessment of such software would be costly and time consuming for sponsors and manufacturers.
The Report recommends expeditious development of guidance in relation to the regulation of medical devices that incorporate adaptive or Gen-AI technologies, including when reassessments should be triggered.
Finding 12: Medical devices trained on open datasets or incorporating software of unknown provenance.
The Report acknowledges the challenges faced by developers whose products are trained on open datasets or incorporate software of unknown provenance. Currently, there is no regulatory guidance on these matters, and the choice of training dataset is ultimately a matter for the developer. Not only does this pose difficulties and risk additional costs for developers hoping to have their products approved by the TGA, but it also increases potential consumer safety risks.
The Report recommends the TGA develop guidance for technical requirements for adaptive or Gen-AI products. In the meantime, developers may have regard to standards such as ISO/IEC 5338:2023, ISO/IEC 8183:2023, and ISO/IEC 42001:2023.
Other key findings:
- Finding 7: The TGA prefers a 'technology-agnostic' approach to regulation of medical devices. Accordingly, it will not be introducing specific definitions or regulations for different subtypes of AI. This means the regulation of AI-augmented or AI-based SaMDs will remain under the current regulatory framework.
- Finding 10: The TGA will undertake general review of its website to make it easier for consumers and health professionals to find information and guidance relevant to them.
- Finding 13: The TGA will continue to engage with the software sector to provide additional guidance and education to promote compliance. Guidance targeting stakeholders outside the medical device industry such as consumers and health professionals is also under consideration by the TGA.
- Finding 14: The TGA reaffirms its commitment to aligning with international regulatory frameworks to reduce burden and support timely access to innovative devices. The Report affirms continued participation in global initiatives such as the International Medical Device Regulators Forum. The Report recommends maintaining a responsive and reactive approach to medical device regulation that adapts to developments in comparable jurisdictions.
What these changes mean for the medical device industry
Software developers and technology suppliers in the healthcare sector should be on notice that the TGA is watching and is ready to step up enforcement action in connection with the unlawful supply of unregistered medical devices. Further, products currently excluded from TGA regulation or classified as 'low risk' may be reclassified, potentially imposing new regulatory obligations on their developers. Businesses operating in these areas should monitor developments closely and prepare for enhanced compliance obligations.
While the legislative and regulatory changes foreshadowed in the Report are yet to crystallise, medical device sponsors and manufacturers who are incorporating AI into their products and services can begin to align their practices with existing regulatory guidance issued by other relevant regulators.
Looking ahead
The TGA's consultation indicates it is devoting regulatory efforts towards safe and responsible AI in the life sciences sector. The release of the Report is just part of ongoing regulatory efforts and a 5-year Federal budget initiative to support safe and responsible AI in Australia.
The TGA will begin further targeted consultations in the latter half of 2025 and into 2026 to address areas needing regulatory reform and greater clarity. Given the content of the Report, we anticipate key focus areas will include refining legislated definitions, clarifying responsibilities across the AI lifecycle, and strengthening compliance pathways for SaMDs.
Impacted stakeholders may wish to engage in these consultations and otherwise stay tuned for further updates regarding proposed changes.
MinterEllison's leading AI Advisory and Life Sciences team are available to support your organisation in aligning with AI and regulatory best practice, and to help ensure your practices, products and decisions remain safe, compliant and future-ready.