The age of reason(able steps): Australia’s impending Social Media Minimum Age obligations

12 minute read | 02.12.2025 | Maria Rychkova, Natalie Adler, Nicole Bradshaw, Paul Kallenbach and Dean Levitan

Effective from 10 December 2025, Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024 introduces a legal requirement for social media platforms to prevent users under 16 from having accounts.


Key takeouts


  • The Act places responsibility on platforms to take reasonable, privacy-preserving, and proportionate steps to verify user age, applying to both new and existing accounts.
  • Principles-based compliance - the eSafety Commissioner will not mandate a single technology. What is 'reasonable' depends on the age-assurance method(s) used and the systems that support them.
  • Governance and privacy - the eSafety Commissioner recommends obtaining independent legal advice and targeted Privacy Impact Assessments tailored to each service, user base and operating context.

At a glance

When the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) (SMMA Act) received Royal Assent on 10 December 2024, many operational details were unsettled. Now, with the minimum age obligations coming into effect through Part 4A of the Online Safety Act 2021 (Cth) (OS Act) on 10 December 2025, the picture is much clearer. The Government has finalised the Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (Rules), the Age Assurance Technology Trial (AATT) has concluded, and the eSafety Commissioner (eSafety) has issued regulatory and privacy guidance.

This update explains who is covered, how the definitions and exclusions work, and the practical implications of eSafety’s principles based approach to 'reasonable steps'. For further background on this topic, see our earlier Insight published in December 2024.

 

Roadmap: how we got here (at a glance)

December 2024

The SMMA Act receives Royal Assent.

June 2025

The preliminary AATT report is released. Trial participants include Meta, Google, Snapchat and TikTok (amongst others). Headline findings include that there are 'no significant technological barriers preventing the deployment of effective age assurance systems in Australia' and that age-assurance technologies can be private, robust and effective when implemented correctly in conjunction with other measures.

July 2025

The Minister for Communications makes the Rules, which exclude certain services from the SMMA Act obligations.

August 2025

The final AATT report findings are released across 10 volumes, including analyses of age verification, estimation, inference, successive validation, and parental consent and controls.

September 2025

eSafety publishes regulatory guidance outlining what constitutes 'reasonable steps' and releases a Statement of Commitment to Children’s Rights.

October 2025

eSafety publishes privacy guidance outlining the privacy considerations and obligations for social media platforms and age-assurance providers. 

Part 1 – Who must comply and who's excluded

Any electronic service that meets the definition of an 'age‑restricted social media platform' must comply with the SMMA obligations in Part 4A of the OS Act by 10 December 2025 (SMMA Obligations), unless excluded by the Rules. Section 63C of the OS Act states that an electronic service is an 'age-restricted social media platform' if all the following apply: 

  • the sole or a significant purpose is to enable online social interaction between two or more end-users; 
  • the service allows end-users to link to, or interact with, some or all of the other end-users; 
  • the service allows end-users to post material on the service; and 
  • any other conditions set out in the Rules are satisfied.

A service is not an age‑restricted social media platform if:

  • none of the material on the service is accessible to, or delivered to, one or more end-users in Australia; or
  • the service is excluded in the Rules.
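
Taken together, the definitional limbs and the availability and exclusion carve-outs operate as a cumulative test. Purely as an aid to structuring a documented self-assessment, the test might be recorded along the following lines. This is an illustrative sketch only: the field and function names are ours, not drawn from the legislation or eSafety guidance, and it is no substitute for a legal analysis of each limb.

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    """Hypothetical record of a service's features for a section 63C self-assessment."""
    social_interaction_is_sole_or_significant_purpose: bool
    users_can_link_to_or_interact_with_other_users: bool
    users_can_post_material: bool
    satisfies_any_additional_rule_conditions: bool = True   # placeholder for any conditions prescribed by the Rules
    accessible_to_end_users_in_australia: bool = True
    excluded_by_the_rules: bool = False                      # e.g. messaging, gaming, education or health services

def is_age_restricted_platform(s: ServiceProfile) -> bool:
    """Indicative only: every definitional limb must be met and neither carve-out can apply."""
    meets_definition = (
        s.social_interaction_is_sole_or_significant_purpose
        and s.users_can_link_to_or_interact_with_other_users
        and s.users_can_post_material
        and s.satisfies_any_additional_rule_conditions
    )
    carved_out = (not s.accessible_to_end_users_in_australia) or s.excluded_by_the_rules
    return meets_definition and not carved_out

# Illustrative use: a photo-sharing service available in Australia with no applicable exclusion.
print(is_age_restricted_platform(ServiceProfile(True, True, True)))  # True
```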

Current exclusions under the Rules

The Rules exclude from the SMMA Obligations social media platforms with the sole or primary purpose of enabling or supporting: 

  • communication by messaging, email, voice or video calling;
  • playing of online games;
  • sharing of information about products or services;
  • professional networking or professional development;
  • education; or
  • health.

When the Online Safety Amendment (Social Media Minimum Age) Bill first passed, YouTube was widely expected to be exempt from age-verification requirements due to the educational uses of its content. However, on 24 June 2025, the eSafety Commissioner formally recommended including YouTube, citing it as the most frequently reported source of harmful content for 10-15 year‑olds. Shortly afterwards, the Federal Government announced that YouTube would be included within the ambit of the laws. YouTube Kids, however, is to be exempt from the restrictions.

While YouTube’s inclusion under Part 4A of the OS Act means Australians under 16 cannot hold an account in their own name, they can still view content without logging in. This distinction provides a way for educational content to be accessed in classrooms and applies to all platforms that allow content access in a logged out state.

Pursuant to the Rules, a service is also excluded where a significant purpose is:

  • facilitating communication between educational institutions and students (and/or their families); or
  • facilitating communication between health providers and people using those services.

Primary purpose test (practical lens)

The primary purpose test requires platforms to assess whether users would engage with the service in the same way without its social interaction features. If user engagement would be largely unchanged in the absence of social interaction features, those features are unlikely to be a primary purpose of the service.

eSafety's self-assessment guidance 

eSafety has also issued guidance to help services determine whether they are age-restricted social media platforms. Key recommendations include:

  • completing an initial self-assessment promptly to confirm whether the minimum age obligation applies before 10 December 2025;
  • repeating the assessment regularly, especially when adding new social features or noticing changes in how users engage with the service; and 
  • documenting the evidence supporting your assessment and considering sharing both the assessment and supporting material with eSafety to assist its review.

Existing accounts are in scope

The SMMA Obligations apply not only to new accounts but also to existing accounts created before the law takes effect. From 10 December 2025, providers must take reasonable steps to identify accounts held by users under 16 and be ready either to deactivate or remove them in accordance with the reasonable steps framework. Ongoing compliance also requires proactive monitoring to detect and address underage accounts over time.

Part 2 – How compliance will be assessed (and achieved)

 

“The rules are not a set and forget, they are a set and support.”
Anika Wells, Minister for Communications

Guiding principles at a glance

Age-verification measures are expected to be:

  • reliable and effective: accurate, robust and fit for purpose;
  • privacy preserving: minimise data collection and protect user privacy;
  • accessible and fair: inclusive and equitable for all users;
  • transparent: clear about processes and impacts;
  • proportionate: balanced against the level of risk; and
  • evidence based: informed by data and responsive to emerging technology.

Government ID and accredited providers - alternatives are mandatory

Social media platforms must not require end‑users to provide government‑issued identification or use an accredited provider as the only method of age assurance. While these options can be offered, they must always be accompanied by a reasonable alternative.

If a provider chooses an age‑assurance method that involves collecting government ID, an alternative pathway (such as an in‑service review or verification process) must be available. Similarly, platforms cannot mandate the use of an accredited provider without offering other choices. The rationale behind this restriction is to preserve privacy and ensure that social media platform users are not forced into a single, high‑friction or privacy‑intrusive option.
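
In practice, this constraint can be checked at the configuration level: whatever mix of methods a platform offers at sign-up or review, at least one must be neither government ID nor an accredited provider. The snippet below is a minimal, hypothetical sketch of such a check; the method labels are ours and purely illustrative.

```python
# Hypothetical method labels; a real catalogue would be defined by the platform.
GOVERNMENT_ID = "government_id"
ACCREDITED_PROVIDER = "accredited_third_party_provider"
RESTRICTED_METHODS = {GOVERNMENT_ID, ACCREDITED_PROVIDER}

def offers_reasonable_alternative(offered_methods: set[str]) -> bool:
    """True if at least one offered method is neither government ID nor an accredited provider."""
    return bool(offered_methods - RESTRICTED_METHODS)

# A configuration offering only government ID fails the check; adding another pathway passes it.
print(offers_reasonable_alternative({GOVERNMENT_ID}))                            # False
print(offers_reasonable_alternative({GOVERNMENT_ID, "facial_age_estimation"}))   # True
```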

Unpacking a principles‑based approach

As noted above, eSafety has developed a 'principles-based' approach to assessing compliance with SMMA Obligations and does not list specific types of age assurance that must be employed. What is considered 'reasonable' will depend on both the age‑assurance method(s) used and the systems and processes supporting them. Drawing upon AATT findings that there is no single 'one-size fits all' solution to age verification, the rationale behind this flexibility is to enable social media platform providers to adopt solutions that are effective, proportionate and privacy‑preserving.

'Reasonable steps' in practice

As mentioned above, there is no one‑size‑fits‑all solution. Reasonableness depends on the platform’s nature, technological feasibility, and the overarching goal of reducing harm to under‑16 users. Broadly, reasonable steps involve a layered program of systems, technologies, processes, policies and communications across the user journey. This includes:

  • identifying accounts held by age‑restricted users and deactivating or removing them with clear, respectful messaging;
  • preventing under‑16 users from creating new accounts; and
  • implementing measures to mitigate circumvention (e.g. duplicate accounts, re-registration, evasion tools).

Successive validation and continuous improvement

eSafety encourages a layered, successive validation approach, which involves combining two or more age-assurance methods or technologies to determine a user's age with greater accuracy. As highlighted in eSafety's February 2025 report, Behind the Screen, when implemented transparently and in line with privacy and security best practice, successive validation can deliver inclusive, scalable and proportionate age assurance. While eSafety will not dictate the steps and standards required of social media platform providers, providers are expected to monitor and improve their practices over time. This is critical given that minimum-age rules are often circumvented: reliance on truthful age self-declaration at the point of sign-up has proved insufficient to date in preventing underage users from creating accounts. For example, the Behind the Screen survey found that 80% of surveyed children aged 8 to 12 used one or more social media services in the survey year. While approximately half (54%) accessed these services via their parent's or carer's account(s), over a third (36%) had their own account, demonstrating the prevalence of circumvention of existing age restrictions.
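
By way of illustration only, successive validation can be thought of as chaining methods so that a low-confidence or inconclusive result from one method triggers the next, rather than relying on a single pass/fail gate. The sketch below is a hypothetical outline; the method names, confidence threshold and interfaces are our assumptions and are not drawn from the AATT findings or eSafety guidance.

```python
from typing import Callable, Optional

# A hypothetical age-assurance method returns an estimated age and a confidence
# score between 0 and 1, or None if it cannot produce a result for this user.
AgeAssuranceMethod = Callable[[str], Optional[tuple[float, float]]]

def successive_validation(
    user_id: str,
    methods: list[AgeAssuranceMethod],
    minimum_age: int = 16,
    confidence_threshold: float = 0.9,
) -> Optional[bool]:
    """Apply methods in order and accept the first sufficiently confident result.

    Returns True if the user is assessed as at or above the minimum age,
    False if assessed as under it, or None if no method produced a confident
    result (in which case a further method or in-service review might follow).
    """
    for method in methods:
        result = method(user_id)
        if result is None:
            continue
        estimated_age, confidence = result
        if confidence >= confidence_threshold:
            return estimated_age >= minimum_age
    return None  # escalate to a further method or manual review

# Hypothetical usage: a lightweight inference signal first, with facial age
# estimation applied only where the first signal is inconclusive.
def behavioural_inference(user_id: str) -> Optional[tuple[float, float]]:
    return (19.0, 0.6)   # plausible but low-confidence placeholder

def facial_age_estimation(user_id: str) -> Optional[tuple[float, float]]:
    return (19.0, 0.95)  # placeholder high-confidence estimate

print(successive_validation("user-123", [behavioural_inference, facial_age_estimation]))  # True
```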

Avoid these pitfalls 

eSafety has flagged the following practices as falling short of SMMA Obligations:

  • relying solely on self-declaration to determine age;
  • allowing under-16 users to retain access for extended periods before detection;
  • failing to prevent immediate reactivation or rapid re-registration after removal; and
  • implementing measures that wrongly block or remove large numbers of legitimate users.

Intersection with the Privacy Act and related obligations

Critically, eSafety's guidance emphasises the centrality of privacy to compliance with the SMMA Obligations. Steps taken under Part 4A of the OS Act will not be considered reasonable unless providers also meet their obligations under the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs). In practical terms, any use of personal information for age-assurance purposes must align with the privacy and data-minimisation requirements set out in the APPs, and platforms are expected to adopt measures that respect user agency, dignity and safety while ensuring compliance with both the OS Act and privacy legislation.

The requirements under Part 4A of the OS Act reflect some of the obligations under the Privacy Act. For example, a platform that collects personal information about an individual for SMMA Obligation compliance purposes, must not use or disclose the information for any other purpose, except:

  • where disclosure is required by law or court order, or for health-related reasons; or
  • with the voluntary, informed, current, specific and unambiguous consent of the individual.

This is stricter than APP 6, which permits use or disclosure of personal information for certain secondary purposes, and is otherwise subject to a broader range of exceptions.

Similarly, a platform holding an individual's personal information that was collected for the purpose of (or purposes including) SMMA Obligation compliance must destroy the information after using it for the purpose for which it was collected. This reflects the obligations imposed on APP entities under APP 11.2.
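
A minimal sketch of how these two constraints, purpose limitation and destruction after use, might be reflected in an age-assurance data store is set out below. It is illustrative only: the class and method names are ours, the exceptions (legal requirements, consent) are simplified away, and any production design would need to be assessed against the APPs and eSafety's privacy guidance.

```python
from typing import Optional

class AgeAssuranceRecord:
    """Hypothetical holder for personal information collected solely to assess age."""

    PERMITTED_PURPOSE = "smma_age_assurance"

    def __init__(self, user_id: str, date_of_birth: str):
        self._user_id = user_id
        self._date_of_birth: Optional[str] = date_of_birth

    def use(self, purpose: str) -> str:
        # Purpose limitation: refuse any use or disclosure beyond the purpose of
        # collection (narrow exceptions such as legal requirements or express,
        # informed consent would need their own controlled pathways).
        if purpose != self.PERMITTED_PURPOSE:
            raise PermissionError(f"use for '{purpose}' is not permitted")
        if self._date_of_birth is None:
            raise ValueError("information has already been destroyed")
        return self._date_of_birth

    def destroy(self) -> None:
        # Destruction after use: once the age assessment is complete, the
        # underlying personal information should not be retained.
        self._date_of_birth = None

record = AgeAssuranceRecord("user-123", "2012-05-01")
dob = record.use(AgeAssuranceRecord.PERMITTED_PURPOSE)  # permitted: age assessment
record.destroy()                                        # then destroy the information
```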

A further liability risk for platforms arises from the introduction earlier this year of Australia's new statutory tort for serious invasions of privacy. Mishandling data during age verification processes could, in some circumstances, expose platforms to civil claims, making robust privacy safeguards not just regulatory requirements but essential risk management.

 

“As enforcement and liability frameworks evolve, the true test will be how platforms, vendors and institutions respond when controls fail, particularly in safeguarding children's sensitive data in the event of cyber incidents or system failures.”

Our perspective

The SMMA Obligations represent a fundamental shift in how platforms must protect young users online. Compliance is not a tick-box exercise - it requires layered controls, privacy-centred design and ongoing monitoring. What these laws signal is a move from passive age-gating to active, principles-based accountability, where platforms must demonstrate continuous commitment to upholding minimum age requirements and privacy protection. The framework makes clear that these are not competing interests: steps taken under the OS Act will not be considered reasonable unless providers also meet their Privacy Act obligations, ensuring age assurance measures are implemented transparently, proportionately and with respect for user agency and data minimisation principles.

What you should do now

Platforms that are subject to the laws will need layered controls, privacy-centred design and ongoing monitoring to meet the reasonable steps standard by 10 December 2025.

To achieve compliance, platforms should:

  • Complete a self-assessment: Complete an initial self-assessment promptly to confirm whether the minimum age obligation applies before 10 December 2025; repeat the assessment regularly, especially when adding new social features or noticing changes in how users engage with the service; and document the evidence supporting your assessment and consider sharing both the assessment and supporting material with eSafety to assist in its review.
  • Implement a layered program: Reasonable steps involve a layered program of systems, technologies, processes, policies and communications across the user journey. This includes identifying accounts held by age-restricted users and deactivating or removing them with clear, respectful messaging; preventing under-16 users from creating new accounts; and implementing measures to mitigate circumvention (e.g. duplicate accounts, re-registration, evasion tools).
  • Adopt successive validation: eSafety encourages a layered, successive validation approach, which involves combining two or more age assurance methods or technologies to determine a user's age with greater accuracy. 
  • Ensure privacy-centred design: Steps taken under Part 4A of the OS Act will not be considered reasonable unless providers also meet their obligations under the Privacy Act and the APPs. Platforms are expected to adopt measures that respect user agency, dignity and safety while ensuring compliance with both the OS Act and privacy laws.
  • Provide alternatives to government ID: Social media platforms must not require end users to provide government-issued identification or use an accredited provider as the only method of age assurance. While these options can be offered, they must always be accompanied by a reasonable alternative.
  • Address existing accounts: The SMMA Obligations apply not only to new accounts but also to existing accounts created before the law takes effect. Ongoing compliance also requires proactive monitoring to detect and address underage accounts over time.
  • Obtain specialist advice: eSafety recommends obtaining independent legal advice and undertaking targeted Privacy Impact Assessments (PIAs) tailored to each service, user base and operating context. Mishandling data during age verification processes could, in some circumstances, expose platforms to civil claims, making robust privacy safeguards not just regulatory requirements but essential risk management.
  • Commit to continuous improvement: Providers are expected to monitor and improve their practices over time. This is critical because determined young users will inevitably discover methods to bypass verification systems, create duplicate accounts, or exploit vulnerabilities in age-checking processes. 

Looking ahead

The SMMA Obligations for protecting young social media users raise broader questions about the future of age verification in Australia. The infrastructure developed for social media could set a precedent for wider verification requirements and increased government intervention across the internet, potentially extending to sectors such as financial services (including youth banking products, cryptocurrency platforms, investment platforms, and buy-now-pay-later services), gaming and gambling platforms with strong monetisation features, and e-commerce sites selling age-restricted products such as alcohol, vaping products and supplements.

These developments also bring data sovereignty to the fore, highlighting concerns about where verification data is processed and stored, and the implications of government access or foreign ownership of platforms holding Australian children's data.

The intersection with recent and expected privacy reforms is particularly significant, as the introduction of a statutory tort for serious invasions of privacy, enhanced children's privacy protections, and anticipated uplifted requirements around consent and data minimisation will fundamentally reshape how platforms approach age assurance. The privacy reforms' emphasis on privacy by design and the heightened obligations for handling children's personal information will require platforms to balance effective age verification with robust privacy safeguards.

Finally, as enforcement and liability frameworks evolve, the true test will be how platforms, vendors and institutions respond when controls fail, particularly in safeguarding children's sensitive data in the event of cyber incidents or system failures. As the regulatory framework evolves, these issues will likely shape the next phase of age assurance policy.


How MinterEllison can help

Navigating these obligations is complex. Our team combines deep industry and regulatory expertise, privacy and data protection know-how, and practical experience advising leading digital platforms and educational institutions. Please contact us if you would like assistance with your organisation's practices.

 

