Can APAAR ID Be Used to Verify Parent-Child Relationships Under India’s Data Protection Law? #NAMA
Automated Permanent Academic Account Registry (APAAR) IDs could emerge as a solution for verifying parent-child relationships under the draft Digital Personal Data Protection Rules, 2025 (DPDP Rules), participants said at MediaNama’s discussion on the Rules on February 7. MediaNama conducted this discussion under the Chatham House Rule, under which participants are free to use the information received, but the identity of the speakers and of any other participant must not be revealed.
This comment came up during the discussion of the confusion around processing children’s data. The DPDP Act and its Rules state that companies have to obtain verifiable parental consent before processing a child’s data. To do so, companies have to verify the identity of anyone claiming to be a parent and giving consent on behalf of a child.
APAAR ID is an identifier linked to a student’s academic records, which colleges and employers can consult to verify that a student is accurately representing their qualifications. Speakers at the discussion pointed out that these bodies, along with private entities, could ping this ID and receive a yes/no answer about a parent-child relationship, since the ID is also linked to information about who the student’s parent is.
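No public API for such relationship pings exists yet, so the following Python sketch only illustrates the shape of the yes/no check speakers described; the endpoint URL, field names, and response format are all assumptions.

```python
import requests

# Hypothetical illustration of the "ping" model described above. No such
# public APAAR relationship-verification API exists today; the endpoint,
# request fields, and response shape are assumptions for illustration only.
APAAR_VERIFY_URL = "https://apaar.example.gov.in/v1/verify-guardian"  # assumed

def is_registered_parent(apaar_id: str, guardian_id: str) -> bool:
    """Ask the registry one yes/no question: is this person recorded as
    the student's parent or guardian? The caller never sees the record."""
    resp = requests.post(
        APAAR_VERIFY_URL,
        json={"apaar_id": apaar_id, "guardian_id": guardian_id},
        timeout=5,
    )
    resp.raise_for_status()
    # The registry returns only a boolean; no academic or family data leaves it.
    return resp.json().get("match") is True
```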
Key points from the discussion:
Need for answers on how APAAR ID verification would work:
“I think we should seek more answers from what that looks like and what are the potential risks with that, because you’re pinging a database [like academic records] day in and day out to get something which actually has a lot of sensitive personal data,” a participant at the discussion pointed out.
Concern around the privacy of ping-back verification mechanisms:
“Even if it’s a yes or a no ping, who initiated that ping, what prompted it, where is it coming from? Can that be retraced?” a participant said. They expressed concern that even just a simple ‘yes’ or ‘no’ answer could raise privacy concerns.
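To make that concern concrete: even a boolean answer generates request metadata that someone must store. Below is a minimal, purely illustrative sketch of the kind of audit record such a system would accumulate; every field name is hypothetical.

```python
import datetime
import uuid
from dataclasses import dataclass, asdict

# Illustrative only: shows why a yes/no ping is never metadata-free.
# Each field below is information the registry (or an intermediary)
# could retain and later retrace to a specific child and requester.
@dataclass
class PingAuditRecord:
    request_id: str       # unique ID of this verification attempt
    requester: str        # which platform initiated the ping
    subject_id_hash: str  # even a hashed child identifier stays linkable
    timestamp: str        # when the check happened
    result: bool          # the yes/no answer itself

def log_ping(requester: str, subject_id_hash: str, result: bool) -> dict:
    """Build the audit entry a verification service would typically keep."""
    record = PingAuditRecord(
        request_id=str(uuid.uuid4()),
        requester=requester,
        subject_id_hash=subject_id_hash,
        timestamp=datetime.datetime.now(datetime.timezone.utc).isoformat(),
        result=result,
    )
    return asdict(record)
```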
Self-certification would not be enough:
One of the participants argued that if platforms do not carry out hard verification of the identity of all users of their services, children could slip through the cracks. “If a parent goes to the Data Protection Board saying, this platform has given my child access without verifying my consent, effectively the platform doesn’t have a leg to stand on, because under the law, the platform is required to get verifiable parental consent, even if the Rules don’t designate a mechanism for identifying whether someone is a parent or a child,” they argued.
Companies can use age assurance measures to prevent unlawful processing:
Another participant argued that companies would start using age estimation technologies to ensure that children don’t access the platform. In case a parent takes a bigger company to court saying that the company processed their child’s data without parental consent, the company could say that they have various signals based on user data to identify the user’s age. “There are varying levels of accuracy depending on the amount of data you have, the age of the child, whether they are 14 or say 17, for instance, but I think the broader point is not everybody is a large social media company with the ability to have sophisticated age assurance technologies,” they added.
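Real age-assurance systems are proprietary machine-learning pipelines, but a toy sketch can show how weak account signals might be combined into an estimate; every signal, weight, and threshold below is invented for illustration.

```python
# Purely illustrative: a naive way a platform might combine weak account
# signals into an age estimate. All signals, weights, and thresholds here
# are invented; real systems use proprietary ML models, not hand rules.
def estimated_adult_probability(signals: dict) -> float:
    score = 0.0
    if signals.get("self_declared_age", 0) >= 18:
        score += 0.4
    if signals.get("has_payment_method"):        # payment KYC implies adulthood
        score += 0.3
    if signals.get("account_age_years", 0) >= 5:
        score += 0.2
    if signals.get("connected_to_verified_adults"):
        score += 0.1
    return min(score, 1.0)

# Below some confidence threshold, fall back to stronger verification:
user = {"self_declared_age": 17, "account_age_years": 2}
if estimated_adult_probability(user) < 0.7:
    print("trigger stronger age assurance / parental consent flow")
```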
Companies may get some flexibility to implement measures:
The DPDP Rules state that companies must implement “appropriate technical and organisational measures” to ensure that they obtain verifiable parental consent before processing a child’s data. Participants argued that this language suggests the IT Ministry does not expect companies to prevent the processing of children’s data in a completely foolproof manner.
“One end of the interpretation spectrum [for the Rules] is you actively verify, find out if every user you’re dealing with is a child or not. Until there’s public infra built out for the kind of pinging that we were speaking of, that would essentially mean collecting maybe government IDs. That’s one extreme scenario. The other end is in all circumstances, you rely simply on a self-declaration. And I think each platform has to determine where it stands in the middle,” a participant explained. They added that even the Act does not state explicitly that companies have to verify each child.
Higher burden for platforms where parents are not existing users:
“You can see very clearly from the illustrations [mentioned in the Rules] that if you are a platform that already has the parent on your platform as an existing user, you can simply leverage that data to collect parental consent. But if you’re not, you have to rely on DigiLocker or a tokenized third party,” a participant argued. In their view, the Rules unintentionally favour larger platforms that already hold such databases over smaller ones.
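The Rules’ illustrations describe such third-party flows only at a high level. The sketch below shows one assumed shape for a tokenized handoff, where a DigiLocker-like provider verifies the parent out-of-band and gives the platform a one-time token that redeems to a simple yes/no; all names are invented.

```python
import secrets

# Hypothetical tokenized consent flow of the kind the Rules gesture at:
# the platform never receives the parent's documents, only a one-time
# token it can redeem for a yes/no confirmation. All names are assumed.
class TokenizedConsentProvider:
    """Stand-in for a DigiLocker-like third party."""

    def __init__(self):
        self._pending = {}  # token -> whether the parent was verified

    def issue_token(self, parent_verified: bool) -> str:
        # The provider verifies the parent's identity out-of-band
        # (e.g. against government records) and mints a one-time token.
        token = secrets.token_urlsafe(32)
        self._pending[token] = parent_verified
        return token

    def redeem(self, token: str) -> bool:
        # One-time redemption: the platform learns only yes or no.
        return self._pending.pop(token, False)

provider = TokenizedConsentProvider()
token = provider.issue_token(parent_verified=True)  # parent completes checks with provider
assert provider.redeem(token) is True               # platform redeems once
assert provider.redeem(token) is False              # replay attempts fail
```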
Are zero-knowledge proofs a feasible solution?
“When we talk about zero knowledge proof solutions, I think what we need to be clear about before we can integrate our APIs with this mechanism is, is this scalable? Is this computationally feasible? And is it, in fact, not going to have any negative impact on user experience?” a participant questioned. They asked this in the context of identity proofs (like APAAR) that allow companies to ping a database and receive confirmation that a user is who they claim to be, without the platform getting access to any additional information. They argued that relying on a third party (be it DigiLocker or another authorised entity) would be more difficult than simply querying one’s own existing databases.
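For readers unfamiliar with the term: a zero-knowledge proof lets a prover demonstrate knowledge of a secret without revealing it. The sketch below is one round of the textbook Schnorr identification protocol with toy, insecure parameters; it illustrates the concept only and is not how APAAR or DigiLocker actually works.

```python
import secrets

# Textbook Schnorr identification: the prover shows knowledge of a secret
# x satisfying y = g^x mod p without revealing x. The tiny parameters are
# NOT secure; they only demonstrate the zero-knowledge idea raised above.
p, q, g = 23, 11, 4           # demo group: g has order q in Z_p*

x = secrets.randbelow(q)      # prover's secret (think: a credential key)
y = pow(g, x, p)              # public value registered with the verifier

# One round of the interactive protocol:
r = secrets.randbelow(q)      # prover: random commitment nonce
t = pow(g, r, p)              # prover -> verifier: commitment
c = secrets.randbelow(q)      # verifier -> prover: random challenge
s = (r + c * x) % q           # prover -> verifier: response

# Verifier checks g^s == t * y^c (mod p) and learns nothing about x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted without revealing the secret")
```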
Technological Frictions with DigiLocker:
“If you use a virtual private network, DigiLocker stops working. If you turn on developer options on your Android device, DigiLocker fails again. If you have a jailbroken iOS device, DigiLocker is again glitching out,” a speaker argued. Given these failure modes, they said, DigiLocker would not be a robust solution for verifiable consent or identity verification.
Even bigger companies do not have existing databases:
A speaker pointed out that even larger companies, where some parents may have existing accounts, lack the requisite datasets to rely solely on their existing databases. “Now, if we’re talking about social media, and we all know, sibling information, spousal information, birth date, all of that is not a compulsory data field. So, if you’re going to aggregate that data, it’ll be less than 30% who have chosen to declare,” they argued. Even where a platform does hold the required information, it is not always accurate: users often enter wrong birthdates or other details, making the data unreliable.
Issues with obtaining consent synchronously:
Parents are not always present to provide consent when their children wish to access a service, which means children may be unable to obtain consent at the right time, a speaker mentioned. “If they are working on a school project, and they are looking at 20 different sites, one after another, and every newspaper site, for example, requires a login. And every time they login, then they have to do this. And I don’t see any companies also asking for it. How will you ensure that it does not come in the way that children are accessing internet generally?” a participant questioned.
Potential Introduction of a Universal Internet Access System
Pondering possible solutions to the problem of synchronous consent, a speaker suggested, “There will [probably] be something called a universal internet access number, like an Aadhaar for internet access, which will be a new DPI [Digital Public Infrastructure], and that passport [like system] will work across [the internet] because you’ve already been authenticated otherwise.” To this, another participant added that the larger problem then becomes that all these solutions lead to more and more data collection.
Key recommendations:
Age verification is not enough to protect children:
Participants pointed out that if the government’s objective with the DPDP Rules and the Act was to protect children online, there is a “slightly lazy assumption here that verifying the age of the child and collecting parental consent is somehow going to magically resolve all our children’s online safety problems.” Another participant added that there is a need for separate child safety legislation to address these concerns.
Need for an Age-Appropriate Design Code:
“I do think that a recommendation we should all get behind is asking this very fatigued government, which I completely empathize with, to have a slightly more considered, earnest discussion about the need for an age-appropriate design code that talks about 20 different things that platforms can do, including risk assess themselves,” a participant argued.