• The draft rules proposed by the government to curb “unlawful content” on social media, which make it mandatory for intermediaries to trace the “originator” of such content, have drawn strong criticism from the Opposition, which contends that the state is expanding the scope for surveillance of citizens.
• However, a close look at the draft Information Technology (Intermediaries Guidelines) Amendment Rules, 2018, shows that the proposed changes are largely in line with developments on this front in cases before the Supreme Court in recent months.
• While the Centre itself has been informing the court since October about its intentions, the court has also voiced its concern over irresponsible content on social media.
• In fact, in a July 17, 2018 judgment in the Tehseen S. Poonawalla case, the court gave the government a virtual carte blanche to stop/curb dissemination of “irresponsible and explosive messages on various social media platforms, which have a tendency to incite mob violence and lynching of any kind.”
• For instance, Rule 3 of the draft speaks about the “due diligence” to be observed by online platforms that have over 50 lakh users.
Norms for access
• Now consider this. On December 6, a Supreme Court Bench, led by Justice Madan B. Lokur, mentioned online giants Google, YouTube, Facebook, Microsoft and WhatsApp and recorded that “everybody is agreed that child pornography, rape and gang-rape videos and objectionable material need to be stamped out.”
• The same order also noted submissions by senior advocate Kapil Sibal, for WhatsApp, that “they have an end-to-end encryption technology, due to which it will not be possible to remove the content”.
• Subsequently, on December 11, the Bench ordered the Centre to frame the necessary guidelines/Standard Operating Procedure (SOP) and implement them within two weeks to “eliminate child pornography, rape and gang rape imagery, videos and sites in content hosting platforms and other applications”. The court then listed the case for February 2019. The draft rules have come within two weeks of the Supreme Court order.
• These two orders came in a suo motu case that the Supreme Court has been hearing since 2015 to curb online sexual abuse.
‘Safer social media’
• Past orders in the case show that since October, the government has been trying hard to convince the court that it really wants to make social media safe.
• Thus, a Supreme Court order of October 22 records that the Centre had already prepared an SOP “for taking action by the security/law enforcement agencies under Section 79(3)(b) of the Information Technology Act”. A November 28 order records the submission of Solicitor-General Tushar Mehta indicating that “certain actions were required to be taken by the intermediaries”.
• These included setting up proactive monitoring tools for auto-deletion of “unlawful content” and a 24x7 mechanism for dealing with requisitions of law enforcement agencies. All these mechanisms can be found in the various clauses of the draft rules.
• The draft rules require the intermediary to trace the “originator of information” for authorised government agencies.
• The intermediary has to produce the information within 72 hours, but only if the request is based on a lawful order, is in writing, and concerns State security or the investigation, prosecution or prevention of an offence, which may include lynching or mob violence.
• Besides, the draft rules put the onus on social media giants to “take all reasonable measures” to protect individual privacy as required under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011.
Mains Paper 3: Internal Security | Role of media & social networking sites in internal security challenges
Prelims level: Various sections mentioned in the amendment bill
Mains level: Menace of unlawful content over social media and measures to curb it