Ayush Raj is a student of National Law University, Nagpur 

I. Introduction

The recently passed Digital Personal Data Protection Act, 2023 (“the Act”) defines any individual below 18 years of age as a “child.” The Act prescribes additional obligations for data fiduciaries when processing children’s digital personal data. As outlined by the government, the Act is principle-based legislation premised on the concept of consent in the data processing framework. Curiously, in a digital era where much educational, commercial and entertainment marketing is aimed at engaging the younger generation, the Act creates no distinction between a young child and a teenager on the cusp of majority. However, as a welcome deviation from the Draft Bill, the Act provides exemptions for specific classes of data fiduciaries and for specific purposes. There is also a provision for safety dilution, i.e., lowering the age requirements for obtaining mandatory parental consent where data is processed in a predefined “verifiably safe manner”. In this regard, it becomes extremely significant to determine who is a child and to what extent a child can exercise the right to consent. This article deals with the unaddressed aspects of children’s data under the Act by identifying the concerns and proposing the way forward.

II. Handling Child’s Data: The Parents’ Way

While the Act dilutes the consent requirements to an extent, the key to data processing rests solely with the parents (or legal guardians). This poses a problem for the child’s autonomy where parents and children disagree over data sharing and processing. Parental consent effectively amounts to age-gating but, pertinently, this age-gating rests solely on the discretion of the data principal, without any regulatory guidelines.

Indeed, parental consent is the standard norm globally for processing a child’s data, but substituting the child’s autonomy entirely with parental consent is arguably not the best manner of handling such data. In this respect, useful guidance can be drawn from international norms such as the GDPR and the California Consumer Privacy Act, which provide for a graded consent mechanism rather than adopting a “one size fits all” approach.

III. Every Minor is a Child: A Misplaced Approach

Bringing the entire age group of 0–17 years under the same umbrella creates unique problems, as the digital needs of a 12-year-old differ from those of a 17-year-old. A better approach to handling minors’ data is a tailored regulation that grants minors some agency and access for specified purposes, catering to the unique requirements of each age group. This tailored approach will not only ensure autonomy over data use but also secure the child’s privacy, which the Supreme Court has recognized as a fundamental right under Article 21. Irrespective of the age segment they fall in, “children” as envisaged under the Act make up a lucrative demographic for businesses ranging from digital toys to examination preparation. Therefore, feasibility should find its place of prominence in determining a reasonable age bracket for defining a child. On a global comparison, India’s definition of a child is the most expansive, capped at 18 years, while the UK and the USA cap it at 13 years, China at 14 years, and the EU at 16 years.
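To illustrate what a graded consent mechanism could look like in practice, the minimal sketch below maps hypothetical age bands to different consent requirements. It is purely illustrative: the bands, labels and thresholds are assumptions for discussion, not anything the Act, the GDPR or the CCPA actually prescribes.

```python
from enum import Enum

class ConsentMode(Enum):
    PARENTAL = "verifiable parental consent"
    JOINT = "joint consent of parent and child"
    SELF = "child's own consent"

def consent_mode(age: int) -> ConsentMode:
    """Return the consent requirement for a given age under a hypothetical graded regime.

    The age bands below are illustrative assumptions; the DPDP Act itself applies
    a single parental-consent rule to everyone under 18.
    """
    if age < 13:
        return ConsentMode.PARENTAL   # young children: parent decides alone
    if age < 16:
        return ConsentMode.JOINT      # early teens: parent and child decide together
    return ConsentMode.SELF           # older teens and adults: decisional autonomy

print(consent_mode(12).value)  # verifiable parental consent
print(consent_mode(17).value)  # child's own consent
```

The point of such a scheme is simply that the consent requirement scales with maturity rather than switching off abruptly at 18.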

A.     The consent conundrum

Apparently, the Act in its present form may also widen the digital divide between boys and girls, since the latter are likely to face more resistance in obtaining parental approval. Further, the concept of verifiable parental consent will lead to undesirable volumes of technological surveillance to verify the age of the claimant. Moreover, parental control does not necessarily mean that the child’s data is safely processed, because there will also be instances of consent fatigue, which ensues from the inability to give informed consent owing to a lack of awareness. This appears especially likely in the Indian context, where digital literacy remains a challenge, particularly in rural areas; parents who do not fully understand the impact of giving consent may end up hindering their children’s online presence on many grounds. Another pertinent issue with the Act is that it violates the principle of data minimization, which obligates data fiduciaries to restrict data collection to the extent directly relevant to the specified purpose: there is no rational nexus showing how collecting a parent’s data would secure the child’s online environment.

B.     Countering the concerns

In light of the concerns discussed above, the government must make a sensible distinction between age and maturity, as is already practiced under the Indian Penal Code. It must also include a joint consent mechanism incorporating the consent of both the child and the parent in select cases, based on empirical evidence. Another welcome move would be to provide decisional autonomy to teenagers so as to truly include them under the umbrella of digital nagriks. Additionally, data fiduciaries must ensure that push notifications allow children to opt out of a service where a more privacy-friendly alternative exists, which would give children decisional autonomy.

C.     Navigating through a tailored & purpose-driven approach

The Act mandates that data fiduciaries must not undertake any processing likely to have a detrimental effect on a child’s well-being. A more nuanced way to protect the online well-being of a child is to spell out permissible and impermissible actions for data fiduciaries. In this regard, the UK’s Age Appropriate Design Code sets out standards to be followed by tech platforms, and it is suggested that India adopt such a code with a risk-based approach. A risk-based approach would categorize children’s accessibility according to the use and impact of the platform; for example, a food-ordering website poses less risk to a child than a social media or online trading platform, as the latter harvest large volumes of sensitive personal data. Furthermore, profiling every set of data received raises multiple issues: it will unnecessarily increase the burden of technological scrutiny, hamper user experience, and restrict children’s digital access. It is therefore important to highlight that a risk-based approach coupled with selective profiling of data will help companies curate their content delivery, while profiling data without any guidelines will do little to serve the legislative objective.
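As a rough illustration of how a risk-based approach with selective obligations might be operationalized, the sketch below assigns platform categories to risk tiers and scales the corresponding child-safety obligations. The categories, tiers and obligations are illustrative assumptions, not requirements drawn from the Act or from the UK code.

```python
from enum import Enum

class Risk(Enum):
    LOW = 1      # e.g. food ordering: limited, transactional data
    MEDIUM = 2   # e.g. video streaming: viewing history, recommendations
    HIGH = 3     # e.g. social media, online trading: large-scale harvesting of sensitive data

# Illustrative mapping of platform types to risk tiers (assumed, not statutory)
PLATFORM_RISK = {
    "food_delivery": Risk.LOW,
    "video_streaming": Risk.MEDIUM,
    "social_media": Risk.HIGH,
    "online_trading": Risk.HIGH,
}

def obligations(platform_type: str) -> list[str]:
    """Return illustrative child-safety obligations scaled to the platform's risk tier."""
    risk = PLATFORM_RISK.get(platform_type, Risk.HIGH)  # unknown platforms default to the strictest tier
    duties = ["no targeted advertising to children", "data minimization"]
    if risk in (Risk.MEDIUM, Risk.HIGH):
        duties += ["age-appropriate defaults", "no behavioural profiling"]
    if risk is Risk.HIGH:
        duties += ["verifiable parental consent", "no behavioural monitoring", "periodic risk assessment"]
    return duties

print(obligations("food_delivery"))
print(obligations("social_media"))
```

The design choice here is that obligations accumulate with risk, so low-risk services are not burdened with the full compliance load meant for high-risk platforms.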

Additionally, in their present form, the class exemptions offer very little help to companies. In the digital era, many platforms are used in multiple ways: for example, Telegram is used as a medium of educational exchange, yet by its very nature it may be included in the class of messenger apps. Likewise, YouTube can be classified as an online streaming service despite being a huge repository of knowledge.

D.    Puzzling the platforms on monitoring online activities: Prohibition versus regulation

The Act mandates that the processing of children’s data must not detrimentally affect their well-being, nor can data fiduciaries engage in targeted advertising directed at children. Notably, the Act has removed the detailed definition of “harm”; nevertheless, the obligations of data fiduciaries in this regard remain the same. It also prohibits behavioural monitoring of children by data fiduciaries. There are class exemptions from these restrictions; however, broad application of these exemptions is anticipated and hence clarity on their definition, scope, and ambit is a necessity.

As a signatory to the UN Convention on the Rights of the Child, the government must endeavour to adopt the “best interests of the child” obligations, requiring self-assessments and prescribing action against probable indulgence in illegal activities. There should be a cost-benefit analysis of age assurance models such as KYC, self-assessment, and biometrics, and, according to their efficacy, different models may be prescribed for different classes of platforms and websites.

Platforms should be able to prevent children’s access to online gambling, cyberbullying, pornography, hateful content, etc., and accordingly inform the enforcement agencies so that preventive measures can be advanced. The “end-all” approach under the Act leaves no space for platforms to exercise these options in the best interests of the child. A blanket ban on platforms tracking the online activities of a child is a double-edged sword. The manner of vigilance, howsoever delicate, should not encroach upon the fundamental right to privacy; nevertheless, reasonable measures are necessary to nurture and filter the online activities of the child in the child’s best interests.

IV. Conclusion

The debate around children’s online privacy and data protection is nascent in India, and building a robust mechanism therefore requires extensive consultation and analysis. The Act has to provide adequate safeguards to secure children’s online privacy, and the legislature must strike a fine balance between children’s autonomy, parental control, and the commercial purposes of data fiduciaries, in the best interests of the child, by rolling out a tailored, age-specific privacy regime with graded consent requirements for different age groups.
