New Delhi: Age verification, mandatory reporting of child sexual abuse material (CSAM), and parental consent are some of the issues that the National Commission for Protection of Child Rights (NCPCR) has asked social media companies to address to uphold the safety of children using their platforms.
Representatives of Google, YouTube, Meta (Facebook, Threads and Instagram), Reddit, Snapchat, ShareChat and Bumble were present at the meeting.
Some of the key points the NCPCR raised were methods of age verification, safety tools, mechanisms for detecting and reporting CSAM, support to law enforcement agencies, tools to identify deepfakes, and measures to protect children from explicit content.
In the meeting, a consensus was reached on following KYC procedures under Section 9 of the Digital Personal Data Protection Act, 2023, and on mandatory reporting of CSAM under the POCSO Act, 2012.
Apart from that, the NCPCR also asked the platforms to provide data on the total number of cases submitted to the National Center for Missing & Exploited Children (NCMEC) between January 2024 and June 2024. This data should include categories of image or video hashes (unique digital fingerprints), content details (such as child pornography and child exploitation), time-stamped logs, metadata, and any other relevant information.
The two sides also agreed on allowing social media platforms to enter into contracts with minors only after obtaining explicit consent from their parents or legal guardians.
The platforms also agreed to display disclaimers in English, Hindi and local or regional languages before showing any adult content, pursuant to Section 11 of the POCSO Act and Section 75 of the Juvenile Justice Act.
The disclaimers must warn parents that they may be held liable under the aforementioned legal provisions if their child views adult content.
The platforms have been asked to share a compliance report with the NCPCR within the next seven days.