Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

Contributor
Raghav Mendiratta

In February 2021, the Ministry of Electronics and Information Technology released the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereafter, ‘the Rules’). The Rules drastically altered the intermediary liability regime in India and superseded the Information Technology (Intermediaries Guidelines) Rules, 2011.

The 2021 Rules are significantly broader than the Draft Intermediary Rules, 2018, as they regulate not only online intermediaries but also digital news organisations and OTT video streaming platforms. The Rules have been criticised by civil society, policy organisations, and digital news organisations for, amongst other things, being overbroad and eroding digital freedoms, including the freedom of expression and the right to privacy online. The constitutionality of the Rules is currently being challenged before the Delhi High Court and the Kerala High Court.

Some of the key features of the Rules are: 

  • Overbroad regulation of OTT platforms, digital news organisations and other entities that publish news and current affairs: In addition to regulating intermediaries that receive, store or transmit information, such as social media platforms and search engines, the Rules go a step further and also regulate digital news organisations, OTT platforms and other entities that publish news and current affairs. This is problematic because Section 79(2)(c) of the Information Technology Act, 2000, under which the Rules are framed, only allows the Government to make rules regulating intermediaries. Regulating digital news organisations and OTT platforms, which act not as intermediaries but as publishers of information, arguably exceeds the authority conferred by the parent legislation. Further, because no user-threshold requirement is specified for entities that comment on news and current affairs, the Rules could also affect individual internet users who upload content to platforms such as YouTube.
  • Arbitrary classification of intermediaries: The Rules impose additional, stringent obligations on certain “significant social media intermediaries” that have a user base of over 5 million users. Notably, Rule 6 provides that even if a social media intermediary does not meet this user threshold, the Central Government may still require it to comply with these additional obligations if it believes that the intermediary’s operations create a material risk of harm to the sovereignty and integrity of India or to the security of the State. This discretion may allow the Central Government to arbitrarily impose additional obligations on particular intermediaries.
  • Tracing requirement: Under Rule 4(2), a significant social media intermediary providing messaging services must identify the first originator of a message if a competent court or executive authority orders that this is necessary for the investigation and prosecution of certain offences punishable with imprisonment for a term of not less than five years. Technical experts say that compliance with this requirement is not possible without breaking end-to-end encryption on messaging services such as WhatsApp (a minimal sketch of why appears after this list).
  • Complex grievance redressal mechanism: The Rules require intermediaries to appoint a grievance redressal officer who must acknowledge all complaints within 24 hours of receipt and resolve them within 15 days. This is significantly shorter than the one-month period allowed under the 2011 Rules. The Rules additionally require significant social media intermediaries to create a grievance redressal team that includes a Chief Compliance Officer, who would be personally liable for the intermediary’s non-compliance with the Rules; a nodal point of contact available for 24x7 coordination with law enforcement agencies; and a Resident Grievance Officer. All three officers would be legally required to reside in India.
  • Automated content filtering: Rule 4(4) requires significant social media intermediaries to deploy AI-based content-filtering technologies that proactively identify content depicting rape or child sexual abuse, or content identical to information that has previously been removed. The Rule specifies that such tools must be used in a manner proportionate to the interests of free speech and privacy, and must be subject to safeguards including human oversight and periodic review for bias and discrimination (see the illustrative sketch after this list).
  • Oversight mechanism for digital news organisations and OTT platforms: The Rules lay down a complex three-level structure to ensure that OTT platforms and digital news organisations comply with the Rules.
    • At the first level, any person with a grievance about content published by a publisher in relation to these Rules may complain to the publisher’s Grievance Officer, who must decide the complaint within 15 days. If unsatisfied with the decision, or with the lack of a response, the complainant can appeal to a “Self-Regulating Body” of which the publisher is a member, at the second level.
    • Publishers and their associations may create one or more Self-Regulating Bodies. Such a Body would consist of a maximum of six members, including “eminent persons” from the fields of media, broadcasting and human rights, and must be headed by a retired judge of the Supreme Court or a High Court. Each Self-Regulating Body must be registered with the Ministry of Information and Broadcasting, which must “satisfy itself” that the Body has been constituted “properly”. The Body would be empowered to warn, censure or reprimand the publisher, and even to censor the content as it deems fit. If the publisher fails to comply with the Body’s directions, the Body may refer the matter to the third tier of the oversight mechanism, a complaint to an “Oversight Mechanism”.
    • At the third level, the Ministry of Information and Broadcasting would develop an Oversight Mechanism and establish an “Inter-Departmental Committee” to hear grievances. This Committee would consist of representatives of the Ministry of Information and Broadcasting, the Ministry of Electronics and IT, the Ministry of Defence and other ministries. The Committee would recommend to the Ministry of Information and Broadcasting whether the publisher should be warned, censured or made to apologise, and may also recommend that certain content be deleted or modified to prevent incitement to the commission of an offence. The Ministry would then issue orders to the publisher.
  • 24-hour takedown requirement for non-consensually shared intimate content: Rule 4(1)(p) requires intermediaries to take down non-consensually shared sexually explicit content within 24 hours of receiving a complaint from a person depicted in the content, or from a person acting on their behalf.
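
On the tracing requirement above: the following minimal Python sketch (purely illustrative, not any platform’s actual design; every name in it is a hypothetical of ours) shows why first-originator identification presupposes server-side access to message content. A server can only record the “first sender of this content” if it can compute a stable fingerprint of the plaintext, and end-to-end encryption is designed to deny the server exactly that.

```python
import hashlib
from datetime import datetime, timezone

# Hypothetical server-side index mapping a message's content hash to the
# first account that sent it. This presupposes the server can see plaintext;
# under end-to-end encryption it sees only ciphertext, which differs per
# recipient, so no stable content fingerprint exists to index on.
first_originator_index: dict[str, tuple[str, str]] = {}

def record_message(sender_id: str, plaintext: bytes) -> None:
    """Record the earliest known sender of a given piece of content."""
    digest = hashlib.sha256(plaintext).hexdigest()
    # setdefault keeps only the first sender seen for each unique hash.
    first_originator_index.setdefault(
        digest, (sender_id, datetime.now(timezone.utc).isoformat())
    )

def trace_first_originator(plaintext: bytes) -> tuple[str, str] | None:
    """Answer a tracing order, given the plaintext of the message."""
    return first_originator_index.get(hashlib.sha256(plaintext).hexdigest())
```

Both functions take `plaintext` as input; an end-to-end encrypted service never holds that server-side, so a compliant design would have to weaken the encryption or move originator-tagging onto clients, which is the substance of the technical objection to Rule 4(2).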
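On automated content filtering: the sketch below illustrates, under assumptions of our own (the hash-blocklist approach and all function names are ours, not the Rules’), one simple shape a “previously removed content” filter with human oversight could take.

```python
import hashlib

# SHA-256 digests of content already taken down; in a real system this
# would be populated by the moderation pipeline (empty here for illustration).
previously_removed_hashes: set[str] = set()

# Matches are queued for a human decision rather than removed outright,
# reflecting the human-oversight safeguard mentioned in Rule 4(4).
human_review_queue: list[tuple[str, str]] = []

def screen_upload(upload_id: str, content: bytes) -> str:
    """Flag an upload that matches previously removed content."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in previously_removed_hashes:
        human_review_queue.append((upload_id, digest))
        return "flagged_for_review"
    return "allowed"
```

Exact cryptographic hashing catches only byte-identical copies; deployed systems typically rely on perceptual hashing so that re-encoded or cropped copies still match, and it is that fuzzier matching which makes the periodic reviews for bias and discrimination contemplated by Rule 4(4) important.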
Country
India
Year
2021
Topic, claim, or defense
Child Protection (Includes Child Pornography)
Revenge Porn
Obscenity or Morality
Fake News
Document type
Regulation
Issuing entity
Executive Branch
Type of service provider
Host (Including Social Networks)
Search Engine or Index
Internet Access Provider (Including Mobile)
Cable/Digital Video Recorder/TV
Issues addressed
Trigger for OSP obligations
OSP obligation considered
Block or Remove
Monitor or Filter
Data Retention or Disclosure
Type of liability
Primary
Type of law
Civil
General effect on immunity
Weakens Immunity
General intermediary liability model
Takedown/Act Upon Knowledge (Includes Notice and Takedown)
Takedown/Act Upon Court Order
Takedown/Act Upon Administrative Request