The National Information Technology Development Agency (NITDA) has unveiled a Child Online Protection Strategy within its proposed Online Harms Protection (OHP) Bill, focusing on age verification to create safer digital spaces for minors in Nigeria.
The strategy is set out in a white paper released by NITDA, which contains the proposed OHP Bill developed in partnership with Advocacy for Policy and Innovation (API).
According to NITDA, the white paper was created as part of a collaborative initiative to address the escalating challenge of online harm in Nigeria, and it aims to balance digital rights with safety measures that protect Nigerian citizens while fostering a secure digital ecosystem.
“All online platforms shall implement age assurance and verification mechanisms to ensure that individuals under 18 cannot access services not intended for them,” NITDA says.
“Age-appropriate material,” NITDA says, “should only be accessible after the user’s age is verified as 18 or older, and social media sites shall put measures in place to limit access for individuals below the minimum age requirement, often set at 13 years old. In shaping our strategy, this white paper considers platforms catering to users aged 13-18.”
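As a rough illustration of how such an age gate might work in practice, the sketch below encodes the thresholds quoted above: a minimum platform age, commonly 13, and an 18+ requirement for adult-only material. The class, function, and rating names are hypothetical and are not drawn from NITDA's white paper.

```python
from enum import Enum

class ContentRating(Enum):
    GENERAL = "general"   # suitable for all users
    TEEN = "teen"         # aimed at the 13-18 bracket the white paper considers
    ADULT = "adult"       # restricted to verified 18+ users

MINIMUM_PLATFORM_AGE = 13   # common social media minimum cited in the white paper
ADULT_THRESHOLD = 18        # age at which adult-only material becomes accessible

def can_access(verified_age: int | None, rating: ContentRating) -> bool:
    """Return True if a user with the given verified age may view content of this rating.

    `verified_age` is None when age assurance has not been completed; in that
    case only general content is allowed.
    """
    if verified_age is None:
        return rating == ContentRating.GENERAL
    if verified_age < MINIMUM_PLATFORM_AGE:
        return False                                # below the platform's minimum age
    if rating == ContentRating.ADULT:
        return verified_age >= ADULT_THRESHOLD      # adult material gated at 18+
    return True                                     # general and teen content for verified 13+ users

# Example: a verified 15-year-old can see teen content but not adult content.
assert can_access(15, ContentRating.TEEN) is True
assert can_access(15, ContentRating.ADULT) is False
```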
Key features of the strategy include:
Age Assurance Measures: Platforms must verify users’ ages to enforce content restrictions.
Parental Controls: Tools like time limits, content filters, and privacy settings to help parents manage online activities (an illustrative sketch follows this list).
Transparency and Risk Assessments: Larger platforms will be required to publish regular risk assessments for child safety.
Illegal Content Removal: Platforms must block harmful content like child abuse materials, hate speech, and misinformation, with fast removal processes supported by judicial oversight.
New Offences: The bill criminalises cyber flashing, sharing deepfakes, trolling, and encouraging self-harm.
Access to Data: Bereaved parents can access their deceased child’s data under strict privacy rules.
Reporting Tools: Platforms must offer simple tools for parents and children to report harmful content.
Sanctions: Platforms failing to follow the law will face penalties based on global best practices.
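The parental control tools listed above could be represented on a platform as a simple per-child settings object. The sketch below is purely illustrative; the field names, defaults, and filter levels are assumptions, not requirements taken from the bill.

```python
from dataclasses import dataclass, field

@dataclass
class ParentalControls:
    """Hypothetical per-child settings covering the tool categories named in the bill."""
    daily_time_limit_minutes: int = 120        # screen-time cap enforced by the platform
    content_filter_level: str = "teen"         # e.g. "child", "teen", "off"
    private_profile: bool = True               # privacy setting: profile hidden from strangers
    blocked_keywords: list[str] = field(default_factory=list)

    def is_within_time_limit(self, minutes_used_today: int) -> bool:
        """Check whether the child still has screen time remaining today."""
        return minutes_used_today < self.daily_time_limit_minutes

# Example: a parent caps daily use at 90 minutes and enables strict filtering.
settings = ParentalControls(daily_time_limit_minutes=90, content_filter_level="child")
print(settings.is_within_time_limit(60))   # True: 60 minutes used, 30 remaining
```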
Additionally, NITDA says the strategy promotes public awareness campaigns to educate parents on using these tools effectively, and to ensure that protections evolve with changing digital behaviors.