The enactment of the Digital ECA (Law No. 15,211/2025) brought to light a major topic for companies and families: assessing users’ ages. The Digital ECA and the decree regulating it (Decree No. 12,880/2026) introduced important definitions, such as age assessment, age verification, and age signaling, which are crucial to ensure the effective enforcement of protection measures.
These concepts aim to block minors’ access to content, products, and services that the law prohibits for children and adolescents, such as pornography, alcoholic beverages, and games of chance and betting.
In parallel, the age rating system, governed by Ministry of Justice and Public Security (MJSP) Ordinance No. 1,048/2025, continues to operate on content with a pedagogical and informational focus, coexisting with the mechanisms introduced under the Digital ECA.
In this article, we detail how these instruments differ and may relate in practice.
Age Rating: focus on content
Age rating does not aim to verify, estimate, or infer users’ ages, because its focus is on the content itself. Under Article 7 of MJSP Ordinance No. 1,048/2025, age rating is pedagogical and informational, allowing families to make informed choices about entertainment appropriate to their children’s development.
Article 9 of the Ordinance emphasizes that age rating cannot prohibit the exhibition of works, cut scenes, or require content removal, out of respect for freedom of expression and the prohibition of censorship. In this context, parental authority is exercised through freedom of choice, supported by parental-control and blocking tools.
The Ordinance organizes content analysis into four thematic axes: sex and nudity, violence, drugs, and interactivity. It classifies content into seven age groups, from “all audiences” to “not recommended for under‑18s.”
The interactivity axis, introduced by the Ordinance, not only considers what is shown but also how the product functions, taking into account features such as chat, live streaming, geolocation requests, microtransactions, and mechanisms that encourage prolonged use. This way, the analysis is closer to the real user experience in digital environments, even though it remains primarily content‑focused.
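As a loose illustration only, the interactivity axis can be pictured as a checklist of risk features that can raise a product’s indicated age band above its content-based rating. The Ordinance prescribes no algorithm; the feature names and age thresholds below are entirely hypothetical.

```python
# Hypothetical sketch: the Ordinance sets no formula; the features and
# thresholds below are illustrative, not statutory.
INTERACTIVITY_FEATURES = {
    "open_chat": 12,            # unmoderated contact with other users
    "live_streaming": 14,
    "geolocation_requests": 14,
    "microtransactions": 16,
    "engagement_loops": 16,     # mechanisms that encourage prolonged use
}

def indicated_minimum_age(features_present: set[str], content_rating: int) -> int:
    """Return the stricter of the content-based rating and the highest
    age band triggered by any interactivity feature present."""
    triggered = [INTERACTIVITY_FEATURES[f] for f in features_present
                 if f in INTERACTIVITY_FEATURES]
    return max([content_rating, *triggered])
```

The point of the sketch is only that interactivity can tighten, never loosen, the indicated age band: a work rated 10 for content but offering open chat would signal 12.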
Age assessment, verification, and signals: focus on the user
While age rating labels content, age assessment focuses on users.
Decree No. 12,880/2026 distinguishes two concepts with different levels of strictness:
- Age assessment: the set of methods used to verify, estimate, or infer a user’s age or age group, through technologies and processes such as document analysis, biometrics, and usage-pattern analysis.
- Age verification: a specific assessment procedure that ensures high reliability, under standards to be set by the Brazilian National Data Protection Authority (ANPD), based on confirming the truthfulness of the age attribute.
This distinction has important operational implications. Where the law prohibits minors’ access, the Decree requires age verification and effective blocking.
For content deemed improper or unsuitable, availability depends, where applicable, on the age rating and on the adoption of safety and parental-supervision tools. In such cases, providers must receive age signals and, where proportionate, conduct age assessment to tailor the experience by age group, without requiring high‑confidence verification.
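The two tiers described above can be sketched as a simple decision rule. This is a conceptual illustration, not an implementation of any statutory procedure: the assurance levels and function names are assumptions, and the actual reliability standards await ANPD regulation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Assurance(Enum):
    """Illustrative confidence tiers, not statutory terms."""
    SIGNAL = auto()    # age signal received from an OS, app store, or third party
    ESTIMATE = auto()  # proportionate assessment (e.g., usage-pattern inference)
    VERIFIED = auto()  # high-reliability verification per future ANPD standards

@dataclass
class AgeCheck:
    assurance: Assurance
    is_adult: bool

def may_serve(prohibited_to_minors: bool, check: AgeCheck) -> bool:
    """Offerings legally prohibited to minors demand high-reliability
    verification plus effective blocking; merely improper or unsuitable
    content may rely on signals or proportionate assessment combined
    with safety and parental-supervision tooling."""
    if prohibited_to_minors:
        return check.assurance is Assurance.VERIFIED and check.is_adult
    return check.is_adult  # a signal or estimate suffices for age-group tailoring
```

Under this sketch, an age signal alone never unlocks a legally prohibited offering, which mirrors the Decree’s stricter requirement for that category.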
The Decree defines the age signal as the information or credential that attests to the user’s age or age group to the provider, without disclosing personal data beyond what is necessary.
App stores and operating systems must request age at account creation, carry out assessment using a reliable method, allow challenges, and adopt anti‑fraud measures. These signals are made available to providers, limited to the minimum necessary to confirm the required minimum age.
In addition, for services accessed via browsers, there is a duty to assess age, with the possibility of using signals issued by operating systems, stores, or third parties. If the provider’s own assessment differs from the received signal, the more protective alternative must prevail.
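The “more protective alternative prevails” rule amounts to assuming the younger of the two inferred age groups whenever the provider’s own assessment and the received signal diverge. A minimal sketch, with hypothetical inputs expressed as minimum-age bands:

```python
def resolve_age_group(own_assessment_min_age: int, signal_min_age: int) -> int:
    """When the provider's own assessment diverges from the age signal
    received from an OS, store, or third party, adopt the more protective
    reading: treat the user as belonging to the younger age group, so the
    stricter safeguards apply."""
    return min(own_assessment_min_age, signal_min_age)
```

For example, if the received signal indicates an adult but the provider’s own assessment suggests a 15-year-old, the user would be treated as 15 and the corresponding protections would apply.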
Receiving the signal does not exempt providers from full responsibility for meeting all legal requirements applicable to the offering, including purpose limitation and proportionality in the processing of age‑related data.
Next regulatory steps
Implementation of the age‑assessment model will be developed further through complementary ANPD acts, which are expected to define:
- confidence criteria and accepted methods for age assessment and verification;
- a certification process for solutions and recognition of accrediting entities; and
- implementation stages, with adaptation periods proportionate to risk and service type.
In general terms, the institutional timeline anticipates the publication of guidelines and parameters, followed by a supervised adaptation phase and then by monitoring and inspection. Organizations should prepare by structuring governance to receive age signals, mapping and testing assessment methods, and documenting decisions under strict purpose limitation and data minimization.
Final remarks
The age rating system and the Digital ECA converge in practice. Whereas the former organizes information about age suitability and helps guide choices and configure parental controls depending on the content, the latter sets obligations centered on users, determining how age must be assessed or verified and which protection measures must be applied. This interaction is especially evident for content, products, and services rated as improper or unsuitable under the Digital ECA. In other words, whereas the Digital ECA requires safety by default and effective parental-supervision tools, the age rating system, where applicable, supports signaling of age groups and risks to the public.
Companies must integrate these two frameworks without conflating them. That means following the guidelines of Ordinance No. 1,048/2025, tailoring the experience by age group based on age signals and proportionate assessment mechanisms, and reserving high‑confidence verification for the cases legally prohibited to minors. At the same time, it is important to strengthen governance for parental controls, purpose limitation, and data minimization in age‑related processing, while monitoring ANPD’s complementary acts on methods, reliability criteria, certification, and implementation stages.