The need for moderating OTT platforms in India: Sahil Chopra

The new rules will play a pivotal role in tackling fake news, hate speech, cybercrime, sexual harassment and explicit content, and in making social media a more secure place.

The inception of the public internet can be traced back to the 1980s, and since then it has permeated every aspect of our lives. It has revamped the conventional marketplace, yet unlike television, print or radio, which follow guidelines framed by the I&B ministry, social media and over-the-top (OTT) platforms had little to no regulation over the content they offered. Anyone, from anywhere, could write anything and publish it instantly, without review or moderation. What little oversight existed took the form of post-review moderation, in which content is flagged after it has been posted and then subjected to review and possible removal. These platforms thus embody post-moderation of content, which in layman's terms means “post first, moderate later”.

After realising that malicious content can spread virally, reaching potentially millions of people before post-moderation can review and remove it, the OTT operators decided to act before it was too late and adopted voluntary codes of self-regulation for the content shown on their platforms. In January 2019, OTT players such as Netflix, Hotstar and Alt Balaji, along with others, signed a code of best practices. The code was framed with the sole objective of empowering audiences to make informed choices about age-appropriate content and protecting their interest in choosing what they want to watch, at their own time, will, and convenience.

With all these steps, it felt like things would fall into place. But the recent controversy over Tandav tells a different story. The controversy and the ensuing Twitter trends prompted the Indian Government to step in and announce new rules to regulate social media platforms, OTT service providers and digital content providers in India: the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

Have a look at the changes the new guidelines introduce:


1. The OTT platforms will now have to self-classify the content into five age-based categories – U (Universal), U/A 7 years, U/A 13 years, U/A 16 years, and A (Adult).
2. The OTT platforms have to provide a parental-lock mechanism for content classified as U/A 13+ or higher, and reliable age-verification mechanisms for content classified as “A”, to make the platforms more secure for children.
3. Social media platforms will need to remove flagged posts within 36 hours of receiving notice, and encrypted messaging apps will need to trace the originator of contentious messages. Platforms must also remove content depicting nudity or sexual acts within 24 hours of receiving a complaint.
4. There will be a three-tier redressal mechanism: the first level is self-regulation by the publishers, the second is self-regulation by the publishers' self-regulating bodies, and the third is an oversight mechanism.
5. Social media intermediaries will have to follow the due diligence prescribed by the IT Rules 2021. If they do not, the ‘safe harbour’ provisions under Section 79 of the IT Act will not apply to them. Safe harbour is the legal protection that exempts intermediaries hosting user-generated content from liability for the actions of users on their platform, provided they adhere to guidelines prescribed by the government.
6. The digital media will have to adhere to the Norms of Journalistic Conduct of the Press Council of India and the Programme Code under the Cable Television Networks Regulation Act.
7. The companies have to appoint a grievance redressal officer based in India, responsible for the redressal of grievances. The officer must decide on every grievance within 15 days of its receipt.
8. There will be one or more self-regulatory bodies of publishers, each headed by a retired judge of the Supreme Court or a High Court, or an independent eminent person, and having no more than six members.
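To illustrate how the self-classification and parental-lock rules in items 1 and 2 might translate into platform logic, here is a minimal sketch. The five category names follow the rules themselves; the function names, data structures and gating logic are hypothetical, not taken from any actual platform implementation:

```python
# Hypothetical sketch of the age-classification and access-control rules.
# Category names follow the IT Rules 2021; everything else is illustrative.

# Content categories in ascending order of restriction.
CATEGORIES = ["U", "U/A 7+", "U/A 13+", "U/A 16+", "A"]


def required_controls(category: str) -> set:
    """Return the controls the rules require for a given category:
    a parental lock for U/A 13+ and above, and age verification for 'A'."""
    rank = CATEGORIES.index(category)
    controls = set()
    if rank >= CATEGORIES.index("U/A 13+"):
        controls.add("parental_lock")
    if category == "A":
        controls.add("age_verification")
    return controls


def may_stream(category: str, parental_lock_passed: bool,
               age_verified: bool) -> bool:
    """Gate playback on whichever controls the category requires."""
    needed = required_controls(category)
    if "parental_lock" in needed and not parental_lock_passed:
        return False
    if "age_verification" in needed and not age_verified:
        return False
    return True
```

Under this sketch, universal content streams freely, content rated U/A 13+ or above requires the parental lock to have been cleared, and adult content additionally requires the viewer's age to be verified.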


The guidelines also pose challenges:

1. The new guidelines make it mandatory for social media platforms to identify the originator of a message that the authorities consider anti-national. This comes directly into conflict with citizens' free speech and privacy. It will thus be a challenge for social media platforms to maintain end-to-end encryption and guarantee user-data privacy.
2. The OTT platforms and digital media companies have to create a three-tier grievance redressal mechanism. The third and top tier will consist of representatives of various ministries and government departments, giving bureaucrats the power to censor and block content and thereby curbing the creative freedom of content creators.
3. The new guidelines also extend to digital news media, raising challenges for freedom of the press. An inter-ministerial committee of bureaucrats will keep a check on what can and cannot be published on a media platform, bringing digital media outlets under intense state scrutiny and affecting free and unhindered news reporting.
4. The ‘Code of Ethics’ for OTT streaming platforms brings the content aired on them under the close watch of the government and creates regulatory oversight that will directly affect the nature and quality of the content being created and watched.
5. The next challenge lies in the kind of tools the new rules expect social media platforms to use to filter out objectionable content. Such technology-based tools and automated censorship can suffer from accuracy problems, which can lead to function creep. Moreover, there is no accountability or transparency in such forms of surveillance, and they will affect users' freedom of speech and expression.

The new guidelines have been welcomed by almost all digital media agencies and companies. It is believed that the new rules will play a pivotal role in tackling fake news, hate speech, cybercrime, sexual harassment and explicit content, and in making social media a more secure place. But there will be challenges in implementing them, given the very thin line between user privacy and user security. Nevertheless, the shift from post-review to pre-review moderation may well bring changes to our digital universe. Let's see how the nation welcomes them.

[This is an authored article by Sahil Chopra, Founder & CEO of iCubesWire. All views, opinions and expressions are personal and limited to the author.]