Intersection with the Privacy Act and related obligations
Critically, eSafety's guidance emphasises the centrality of privacy to SMMA Obligation compliance under the OS Act. Steps taken under Part 4A of the OS Act will not be considered reasonable unless providers also meet their obligations under the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs). In practical terms, this means that any use of personal information for age-assurance purposes must align with the privacy and data-minimisation requirements set out in the APPs.
Platforms are therefore expected to adopt measures that respect user agency, dignity and safety while ensuring compliance with both the OS Act and the Privacy Act.
The requirements under Part 4A of the OS Act reflect some of the obligations under the Privacy Act. For example, a platform that collects personal information about an individual for SMMA Obligation compliance purposes must not use or disclose the information for any other purpose, except:
- where disclosure is required by law or court order, or for health-related reasons; or
- with the voluntary, informed, current, specific and unambiguous consent of the individual.
This is stricter than APP 6, which permits use or disclosure of personal information for certain secondary purposes, and is otherwise subject to a broader range of exceptions.
Similarly, a platform holding an individual's personal information that was collected for the purpose of (or purposes including) SMMA Obligation compliance must destroy the information after using it for the purpose for which it was collected. This reflects the obligations imposed on APP entities under APP 11.2.
A further liability risk for platforms arises from the introduction earlier this year of Australia's new statutory tort for serious invasions of privacy. Mishandling data during age verification processes could, in some circumstances, expose platforms to civil claims, making robust privacy safeguards not just regulatory requirements but essential risk management.
Our perspective
The SMMA Obligations represent a fundamental shift in how platforms must protect young users online. Compliance is not a tick-box exercise: it requires layered controls, privacy-centred design and ongoing monitoring. What these laws signal is a move from passive age-gating to active, principles-based accountability, where platforms must demonstrate continuous commitment to upholding minimum age requirements and privacy protection. The framework makes clear that these are not competing interests: steps taken under the OS Act will not be considered reasonable unless providers also meet their Privacy Act obligations. Age assurance measures must therefore be implemented transparently, proportionately and with respect for user agency and data minimisation principles.
What you should do now
Platforms that are subject to the laws will need layered controls, privacy-centred design and ongoing monitoring to meet the reasonable steps standard by 10 December 2025.
To achieve compliance, platforms should:
- Complete a self-assessment: Complete an initial self-assessment promptly to confirm whether the minimum age obligation applies before 10 December 2025; repeat the assessment regularly, especially when adding new social features or noticing changes in how users engage with the service; and document the evidence supporting your assessment and consider sharing both the assessment and supporting material with eSafety to assist in its review.
- Implement a layered program: Reasonable steps involve a layered program of systems, technologies, processes, policies and communications across the user journey. This includes identifying accounts held by age-restricted users and deactivating or removing them with clear, respectful messaging; preventing under-16 users from creating new accounts; and implementing measures to mitigate circumvention (e.g. duplicate accounts, re-registration, evasion tools).
- Adopt successive validation: eSafety encourages a layered, successive validation approach, which involves combining two or more age assurance methods or technologies to determine a user's age with greater accuracy.
- Ensure privacy-centred design: Steps taken under Part 4A of the OS Act will not be considered reasonable unless providers also meet their obligations under the Privacy Act and the APPs. Platforms are expected to adopt measures that respect user agency, dignity and safety while ensuring compliance with both the OS Act and privacy laws.
- Provide alternatives to government ID: Social media platforms must not require end users to provide government-issued identification or use an accredited provider as the only method of age assurance. While these options can be offered, they must always be accompanied by a reasonable alternative.
- Address existing accounts: The SMMA Obligations apply not only to new accounts but also to existing accounts created before the law takes effect. Ongoing compliance also requires proactive monitoring to detect and address underage accounts over time.
- Obtain specialist advice: eSafety recommends obtaining independent legal advice and undertaking targeted Privacy Impact Assessments (PIAs) tailored to each service, user base and operating context.
- Commit to continuous improvement: Providers are expected to monitor and improve their practices over time. This is critical because determined young users will inevitably discover methods to bypass verification systems, create duplicate accounts, or exploit vulnerabilities in age-checking processes.
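By way of illustration only, the layered, successive validation approach described above can be sketched in code. This is a minimal sketch of the general pattern of combining two or more age assurance methods; the class and function names, the confidence threshold and the fail-closed behaviour are our own illustrative assumptions, not requirements drawn from eSafety's guidance:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

MINIMUM_AGE = 16  # the Part 4A minimum-age threshold


@dataclass
class AgeSignal:
    """Result of one age-assurance check (hypothetical structure)."""
    estimated_age: Optional[int]  # None if the method was inconclusive
    confidence: float             # method-reported confidence, 0.0 to 1.0


def successive_validation(checks: List[Callable[[], AgeSignal]],
                          confidence_threshold: float = 0.9) -> bool:
    """Run age-assurance methods in order, escalating to the next
    method whenever the current one is inconclusive or insufficiently
    confident. Returns True only if a confident result meets the
    minimum age; otherwise fails closed (assumption, not guidance)."""
    for check in checks:
        signal = check()
        if signal.estimated_age is None:
            continue  # inconclusive: escalate to the next layer
        if signal.confidence >= confidence_threshold:
            return signal.estimated_age >= MINIMUM_AGE
        # low confidence: fall through and try a stronger method
    return False  # no confident result: treat the user as under age


# Example: a low-confidence estimate escalates to a second method.
is_of_age = successive_validation([
    lambda: AgeSignal(estimated_age=25, confidence=0.55),  # e.g. inference signal
    lambda: AgeSignal(estimated_age=25, confidence=0.97),  # e.g. facial estimation
])
```

The key design point this sketch reflects is that no single method is treated as decisive: weaker, less intrusive signals are tried first, with stronger methods reserved for cases where confidence is low, consistent with a proportionate, data-minimising approach.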
Looking ahead
The SMMA Obligations for protecting young social media users raise broader questions about the future of age verification in Australia. The infrastructure developed for social media could set a precedent for wider verification requirements and increased government intervention across the internet, potentially extending to sectors such as financial services (including youth banking products, cryptocurrency platforms, investment platforms, and buy-now-pay-later services), gaming and gambling platforms with strong monetisation features, and e-commerce sites selling age-restricted products such as alcohol, vaping products and supplements.

These developments also bring data sovereignty to the fore, highlighting concerns about where verification data is processed and stored, and the implications of government access or foreign ownership of platforms holding Australian children's data.

The intersection with recent and expected privacy reforms is particularly significant. The introduction of a statutory tort for serious invasions of privacy, enhanced children's privacy protections, and anticipated uplifted requirements around consent and data minimisation will fundamentally reshape how platforms approach age assurance. The privacy reforms' emphasis on privacy by design and the heightened obligations for handling children's personal information will require platforms to balance effective age verification with robust privacy safeguards.

Finally, as enforcement and liability frameworks evolve, the true test will be how platforms, vendors and institutions respond when controls fail, particularly in safeguarding children's sensitive data in the event of cyber incidents or system failures. These issues will likely shape the next phase of age assurance policy.
How MinterEllison can help
Navigating these obligations is complex. Our team combines deep industry and regulatory expertise, privacy and data protection know-how, and practical experience advising leading digital platforms and educational institutions. Please contact us if you would like assistance with your organisation's practices.