The National Information Technology Development Agency (NITDA) says the proposed Online Harms Protection Bill provides for Content Moderation Organisations (CMOs) that will take charge of online content moderation and fact-checking to tackle digital misconduct and harmful online content.
NITDA, in collaboration with Advocacy for Policy and Innovation (API), has unveiled a white paper on the bill, saying the proposed law aims to balance the protection of digital rights with safety measures.

“The bill will mandate platforms to fact-check and promptly remove instances of image-based sexual abuse, cyberflashing, and the creation or dissemination of deepfake pornography within a stringent but fair timeframe,” the Federal IT agency says.
NITDA says that the proposed CMOs will play a pivotal role in enforcing these measures. “Their responsibilities would include verifying the accuracy of information, debunking false claims, and providing clarity on disputed content.”
According to the Federal IT agency, the CMOs “would collaborate with social media platforms, news outlets, and other content providers to ensure factual and unbiased information dissemination. Additionally, they would educate the public on media literacy, fostering a culture that values truth and enables individuals to discern credible sources.”
The Online Harms Protection (OHP) Bill states that CMOs will also collaborate with online platforms, news outlets, and other stakeholders to refine content policies, ensuring transparency and the protection of users’ rights. “They would employ a holistic approach, utilising automated systems and human review processes to identify and address content that promotes hate speech, violence, terrorism, and other forms of harm.”
To ensure accountability, NITDA says that CMOs will be required to maintain transparent methodologies for content review and takedowns. “Collaboration with authorities will involve working alongside government agencies to address emergent online threats and supporting law enforcement in investigations relating to online harms while adhering to legal constraints.”