The Digital Personal Data Protection (DPDP) Act marks a historic evolution in India's data privacy landscape. With an emphasis on individual consent, the DPDP Act empowers users with rights and unprecedented control over their data. Read our detailed breakdown of the DPDP Act to know more about India’s first data protection law.
The DPDP Act provides specialised protections for children and persons with disabilities. It is crucial for businesses to understand these provisions to manage their systems in a compliant manner. Failure to comply with the DPDP Act’s provisions on children’s data can attract penalties of up to ₹200 crore.
What is the DPDP law on children’s consent?
The DPDP Act says that a child’s personal data cannot be processed without parental consent. The same applies to persons with disabilities and their lawful guardians. To collect valid consent for using a child’s personal data, you must:
- Verify if the user is a child (person below 18 years of age);
- Validate the guardian's identity and age to verify they are not minors themselves;
- Verify legitimacy of the relationship between the parent and child;
- Collect ‘verifiable’ consent from the parent/guardian.
To meet the threshold of verifiable consent, you must maintain detailed records showing that you fulfilled the above prerequisites for children’s consent. Naturally, all of the usual consent obligations, such as clear and specific notice, easy withdrawal of consent and erasure of data, must also be met. Remember that it is legally the Data Fiduciary’s responsibility to ensure that the consenting user is not a child. This inevitably means you must verify the age of all of your users.
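One practical way to meet the record-keeping expectation is to store, alongside each consent, the evidence trail of the checks performed. The sketch below is a minimal illustration; the field names and the `record_consent` helper are our own assumptions, since the DPDP Act does not prescribe any schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ParentalConsentRecord:
    """Illustrative evidence trail behind one 'verifiable' consent.

    NOTE: this schema is hypothetical; the Act and Rules do not mandate one.
    """
    child_user_id: str
    age_check_method: str          # e.g. "digilocker_age_token" (assumed label)
    guardian_verified: bool        # guardian's identity and adulthood checked
    relationship_verified: bool    # parent/guardian-child relationship checked
    notice_version: str            # which specific notice was shown
    consent_given_at: str          # UTC timestamp of consent
    withdrawn_at: Optional[str] = None  # set when consent is withdrawn

def record_consent(child_user_id: str, method: str, notice_version: str) -> ParentalConsentRecord:
    # Assumes the upstream verification steps already succeeded;
    # this function only persists the evidence of those checks.
    return ParentalConsentRecord(
        child_user_id=child_user_id,
        age_check_method=method,
        guardian_verified=True,
        relationship_verified=True,
        notice_version=notice_version,
        consent_given_at=datetime.now(timezone.utc).isoformat(),
    )

rec = record_consent("child-123", "digilocker_age_token", "notice-v2")
print(asdict(rec)["age_check_method"])  # digilocker_age_token
```

A real system would also log how each check was performed and link the record to the withdrawal and erasure workflows the Act requires.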
How do you verify the age of every single user?
There is no perfect solution for age gating as of now. Common methods include self-declarations, quizzes, hard ID checks, and facial or biometric scans. Each has its pitfalls: self-declarations are unreliable, quizzes can be gamed, and hard ID verifications raise privacy concerns. None of the existing solutions is fully compliant with the DPDP standards for children’s consent, and even globally there is no one-size-fits-all solution.
However, lack of a solution will not absolve you from compliance.
Consider the case of the GDPR, where a third of the fines issued to social media platforms have been linked to children’s data protection, with Instagram and TikTok amassing over €765 million in fines. Indian companies could well face the same level of scrutiny and penalties if they fail to comply with the DPDP’s standards for children’s data.
How do you collect verifiable parental/guardian consent?
Let us assume you have successfully age gated your portal and verified that the user is in fact a child. Now, how do you collect parental consent? One way is to verify the parent’s age and identity against records you already hold. If you do not have such records handy, the government is in the early stages of developing technical frameworks to verify the identity and age of parents. One approach that has been discussed so far is the Age Token approach.
Using an Age Token, a business will be able to verify the identity and age provided by the parent/guardian against the details stored in DigiLocker. Once the parent enters their name and age, they authenticate on DigiLocker; DigiLocker in turn creates an age token that acts as a verification record of the parent's identity and age. The age token is encrypted to safeguard the underlying ID's contents, and reveals only a verification of the name and age to the business. This is considered a zero-knowledge proof. Rahul Matthan, a luminary in the privacy space, talks more about this here.
These zero-knowledge proofs can also be created using an Aadhaar-based QR code or a virtual ID generated by the Unique Identification Authority of India (UIDAI), following a similar approach.
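The essence of the token idea above is that an issuer attests a single claim ("this person is an adult named X") without handing the business the underlying ID. The sketch below is a toy simulation under stated assumptions: the actual DigiLocker/UIDAI protocol has not yet been notified, the issuer name and functions are ours, and an HMAC with a shared demo key stands in for what would in practice be an asymmetric signature or a genuine zero-knowledge proof.

```python
import base64
import hashlib
import hmac
import json
from typing import Optional

# Stand-in for the issuer's signing key; a real scheme would use
# public-key signatures so the verifier never holds a secret.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_token(name: str, is_adult: bool) -> str:
    """Issuer side (e.g. DigiLocker): attest only name + adulthood, not the DOB or ID."""
    payload = json.dumps({"name": name, "adult": is_adult}, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def verify_age_token(token: str) -> Optional[dict]:
    """Business side: check the issuer's signature; reject tampered tokens."""
    payload_b64, sig = token.rsplit(".", 1)
    payload = base64.b64decode(payload_b64)
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # token was altered or not issued by the trusted issuer
    return json.loads(payload)

token = issue_age_token("A. Parent", True)
claims = verify_age_token(token)
print(claims["adult"])  # True
```

The point of the design is data minimisation: the business learns the verified claim, never the Aadhaar number, date of birth or document image behind it.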
We expect the framework for executing children’s consent and age gating to be notified in the upcoming DPDP Rules.
Common Technology Approaches Across Jurisdictions
- AI-Based Age Estimation: Facial recognition algorithms estimate a user’s age through a selfie without storing personal data, offering a privacy-friendly and user-friendly age check.
- Digital ID Verification: Requires users to scan an ID matched with biometric images, providing highly accurate age verification suitable for strict compliance standards.
- Multi-Factor Verification: Uses a layered approach by combining biometric scans, ID checks, and security questions, ensuring high accuracy for age-restricted content.
- Privacy-Preserving Federated Models: Verifies age on the user’s device, minimizing data retention by transmitting only a pass/fail result, aligning well with privacy regulations.
- SMS/Phone Verification: Sends a verification code to the user’s phone, providing a simple and accessible age check, although less accurate than biometric options.
- Credit Card Verification: Verifies age through a credit card check, assuming only adults possess them, often used on platforms involving financial transactions.
- Knowledge-Based Authentication (KBA): Uses age-specific questions to verify identity, although minors can sometimes find answers online, making it less reliable.
- Blockchain Verification: Provides secure, decentralized age verification using blockchain, though high costs and technical complexity limit its practical use.
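Of the approaches above, the privacy-preserving on-device pattern maps most directly onto data minimisation: the age estimate never leaves the device, only a pass/fail result does. A toy sketch, with the function name and threshold as our own assumptions:

```python
def on_device_age_gate(estimated_age: int, threshold: int = 18) -> dict:
    """Runs on the user's device after a local age-estimation step.

    Only the boolean result is transmitted; the estimated age and any
    image used to produce it stay on the device.
    """
    return {"passed": estimated_age >= threshold}

# The server receives only this minimal payload:
payload = on_device_age_gate(16)
print(payload)  # {'passed': False}
```

The same pass/fail contract could sit in front of any of the estimation methods listed above, which is why regulators tend to view it favourably.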
What are the other obligations on children’s data?
Parental consent and age gating are undoubtedly onerous compliance hurdles. But that’s not all: the Digital Personal Data Protection Act puts further restrictions on processing children’s data, extending beyond the foundational challenges of age gating.
No processing likely to cause detrimental effect
The Act prohibits processing activities likely to cause any ‘detrimental effect’ on the well-being of a child. The term ‘detrimental effect’ has not been defined in the law, but it likely refers to activities that could compromise a child's privacy, security, and overall mental and emotional health.
For instance, exposure to inappropriate content such as violent or explicit material may have adverse effects on a child's development and psychological well-being. Similarly, engaging in digital behaviours that could lead to harassment, cyberbullying, or identity theft can compromise children's safety and mental health.
No tracking or targeted ads to children
The DPDP Act further prohibits tracking or behavioural monitoring of children, as well as targeted advertising directed at children. These terms are also not defined, making the exact scope of the obligation unclear. The prohibition may require anything from age gating and content filtering to a total bar on monitoring children's online activities and preferences over time.
For example, a gaming app might analyse a child's gameplay patterns, such as level progression or in-app purchases, to tailor ads to their preferences and gaming habits. Google and YouTube were fined $170 million for tracking children’s online activities without consent and targeting them with personalised ads. Such profiling and targeted promotion would also be prohibited under the DPDP Act.
Potential exceptions
The government has reserved the power to notify exceptions to the DPDP law on children’s consent on the basis of:
- Classes of data fiduciary to whom the obligations will not apply. Educational institutions, healthcare providers, NGOs or other types of entities may be exempted to allow ease of operations for the benefit of children.
- Specific purposes of processing that will be exempt: processing for child welfare or academic purposes may be exempted.
- A lower age for the applicability of the rules on parental consent and tracking: the age of 18 may be lowered to 16, 13 or another appropriate age, depending on the notification. This will only happen if the government is satisfied that the fiduciary is processing children’s data in a verifiably safe manner.
The exact scope of the exceptions will depend on the rules and notifications to be released in the future. Please note that these exceptions will not exempt you from the prohibition on activities likely to cause detrimental effects to children.
International Developments
Globally, countries are enacting distinct but rigorous measures for safeguarding children’s online data, with varied approaches to age verification and parental consent. In the United Kingdom, the Age-Appropriate Design Code (AADC) and Online Safety Act require robust age verification for platforms accessible to children, disallowing simple self-declaration and encouraging biometric verification like Yoti's AI-driven age estimation. Enforced by Ofcom and the Information Commissioner's Office (ICO), these laws have led to significant fines, such as TikTok's £12.7 million penalty for inadequate age verification. Meanwhile, in the United States, the Children’s Online Privacy Protection Act (COPPA) mandates parental consent for under-13s, with emerging state laws adding layers of age verification requirements for platforms, particularly in states like Texas. Tech firms in the U.S. are exploring secure verification methods, including ID and biometric checks, to meet compliance demands, though the costs and implications for user experience spark ongoing debates.
In the European Union, GDPR and the Digital Services Act (DSA) set high standards for age verification, with country-specific consent ages between 13 and 16. Initiatives like the euCONSENT Project employ AI facial analysis and digital ID scanning to facilitate verification with minimal data retention. Germany leads with stringent age checks for adult content, while GDPR enforcements pose substantial fines for non-compliance. Australia's Privacy Act and Online Safety Act also push for effective age verification, utilizing methods like real-time video verification and third-party ID checks. With hefty fines under Australia’s Privacy Act, tech companies are collaborating on compliance strategies, balancing privacy-friendly methods with evolving laws. Across these regions, the shared emphasis is on protecting children while maintaining user privacy, with some countries exploring flexible, privacy-respecting verification methods to reduce invasiveness.
The path forward
As the government moves towards finalising the DPDP Rules, businesses must remain vigilant, anticipating potential exceptions and making timely changes. It is imperative to prioritise age verification mechanisms that balance accuracy with accessibility, ensuring that children's rights are upheld while facilitating their online engagement. Collaborative efforts between government agencies and industry stakeholders will be essential in developing comprehensive guidelines that promote children's safety and privacy in the digital realm.
In the near future you can expect further refinement of age verification technologies, greater clarity on regulatory exemptions, and increased awareness of children's digital rights. By embracing a holistic approach to children's data protection, organisations can foster a culture of responsible data stewardship, safeguarding the next generation's digital future while fostering innovation and inclusivity.