This post has been written by Founding Editor Viraj Ananth.
Introduction
Late in 2018, the Ministry of Electronics and Information Technology released the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules] 2018 (hereinafter ‘the Draft Rules’). The Draft Rules introduced unduly onerous obligations on intermediaries, including incorporating and registering in India, enabling the tracing of originators of information, and proactively identifying and removing or disabling access to unlawful content. These requirements were met with widespread criticism for their disproportionate impact on start-ups, an impact that further centralises market power in favour of incumbents.
Recently, in June 2019, the Madras High Court directed the Central Government to provide a time-frame for rolling out the final version of the Draft Rules, reinvigorating the debate on intermediary liability. This article examines both the appropriateness and the lawfulness of the Draft Rules.
The Draft Rules Impose Financially and Technologically Unsound Requirements
Rule 3(7) of the Draft Rules provides that foreign entities with more than fifty lakh users in India must: (i) be incorporated under the Companies Act; (ii) have a permanent registered office in India with a physical address; and (iii) appoint a nodal point of contact in India. These requirements impose disproportionately high compliance costs on companies (particularly start-ups) and threaten to disrupt the market. Aside from the costs of staffing and opening a physical office in India, there are numerous pre-incorporation and annual compliance requirements under the Companies Act, 2013. For instance, a company is required to appoint an auditor, maintain statutory registers and books of accounts, and file numerous e-forms. Additionally, the Indian company will also be taxed in India.
Given the low threshold of 50 lakh (five million) users, which amounts to a mere 1% of Indian internet users, and the lack of clarity as to whether the term covers only active users or all registered users, this requirement is likely to affect not only well-established incumbents but also newer players with a growing Indian user base. This is concerning because such compliance costs are better absorbed by larger organisations with sufficient resources and staffing. Smaller entities, particularly those whose Indian user bases generate little revenue, may be forced to cease operations in the country. The ultimate burden therefore falls on the consumer, who is left with less choice and less innovation in an increasingly oligopolistic market.
Under Rule 3(5) of the Draft Rules, the government has introduced the requirement of “tracing out of such originator of information”. Compliance with this traceability requirement is financially unviable for numerous intermediaries and would create artificial entry barriers for many others. The requirement also suffers from technological limitations, as asserted by WhatsApp before the Madras High Court in June 2019. In particular, a natural corollary of compliance is that standard end-to-end encryption protocols must be weakened or broken to enable traceability. This, once again, disadvantages the consumer, since encryption operationalises the user’s right to privacy by protecting against external attacks and enabling the intermediary to better fulfil its data security obligations.
There are also numerous technological challenges that limit the appropriateness of “technology based automated tools” for the removal of illegal content, as required under Rule 3(9). Most importantly, it is well-settled that these tools are prone to errors and biases, attributable to biases in the datasets fed to the artificial intelligence (hereinafter ‘AI’) algorithms. These tools may therefore perpetuate power imbalances and historical biases, and disproportionately impact minorities. Accordingly, a necessary pre-condition to such a move is not only generating large volumes of data to train AI algorithms, but also formulating adequate ethical frameworks for the unbiased operation of these tools, as the European Commission is currently doing.
Further, the Draft Rules, by laying down a broad and highly subjective definition of unlawful content under Rule 3(2)(b), fail to provide the objective metrics and specificity needed to enable algorithmic decision-making. Such objective standards are important given the inability of automated tools, at the present stage of technological sophistication, to identify the context surrounding (purportedly) illegal content. For instance, automated tools can readily detect infringing material by reference to a database of copyrighted content, but the determination is far less straightforward when the content is protected under the fair use doctrine.
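To illustrate why context matters, consider a minimal, purely illustrative sketch of database-backed matching. The reference database, function names and use of a plain cryptographic hash are all assumptions made for the example (production systems rely on perceptual fingerprinting rather than exact hashes): such a tool can flag an exact copy of a known work, but the match itself carries no information about whether the use is fair.

```python
import hashlib

# Hypothetical reference database of fingerprints of known copyrighted works.
# Real systems use perceptual fingerprints (robust to re-encoding); a plain
# SHA-256 hash is used here only to keep the illustration simple.
COPYRIGHT_DB = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 fingerprint of the uploaded content."""
    return hashlib.sha256(content).hexdigest()

def is_flagged(content: bytes) -> bool:
    """Flag content whose fingerprint matches the reference database.

    Limitation: a match only shows that the material is identical to a known
    work. It cannot tell whether the upload is criticism, commentary, parody
    or news reporting, i.e. whether a fair use defence applies; that calls
    for a contextual, human (or judicial) assessment.
    """
    return fingerprint(content) in COPYRIGHT_DB

if __name__ == "__main__":
    # The database entry above is the SHA-256 of b"test", so this prints True,
    # regardless of the purpose for which the content was uploaded.
    print(is_flagged(b"test"))
```

The sketch makes the article’s point concrete: the flagging decision is a context-blind set-membership test, and everything that fair use turns on lies outside what the tool can see.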
The Draft Rules Are Ultra Vires the Information Technology Act and the Constitution
The Draft Rules introduce the requirement of “proactively identifying and removing or disabling … unlawful content”. This requirement runs contrary to the interpretation of the term ‘actual knowledge’ under S. 79 of the Information Technology Act, 2000 (hereinafter ‘the IT Act’), as laid down by the Supreme Court in Shreya Singhal v. Union of India. There, the Court read down ‘actual knowledge’ to mean situations where the intermediary receives a court order or a government notification requiring the removal of infringing material. In doing so, the Court reaffirmed India’s alignment with the Manila Principles on Intermediary Liability, which call for ‘safe-harbour’ protection for intermediaries. The determination of the unlawfulness of content, which often involves a subjective assessment, was thus to remain strictly within the bounds of the judiciary. This interpretation has been followed in numerous recent cases, including Kent RO Systems v. Amit Kotak and Myspace Inc. v. Super Cassettes Industries Ltd.
The Draft Rules are also ultra vires when viewed through the lens of delegated legislation. It is a well-settled principle of law that delegated legislation issued by an administrative authority cannot be repugnant to, or travel beyond the scope of, the enabling statute. In particular, delegated power cannot be exercised so as to create substantive rights, obligations or disabilities not contemplated by the enabling statute. Here, the Draft Rules extinguish the statutory immunity accorded to intermediaries under S. 79 of the IT Act and must accordingly be declared void. Additionally, the Draft Rules vest the power to seek information or assistance in “any government agency”, which is not in consonance with either S. 69 or S. 69B of the IT Act. Under these provisions, access requests may be issued only by an ‘authorised agency’ acting with the specific authorisation or approval of the Central Government. For these two reasons, the Draft Rules are ultra vires the parent statute.
The Draft Rules are also ultra vires the Constitution, violating protections enshrined in Arts. 14, 19 and 21. Art. 14 strikes at arbitrariness in State action; accordingly, any law which seeks to pass constitutional muster must incorporate sufficient procedural safeguards against arbitrary action. For instance, the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 require that a decision to block content be subject to the recommendation of a committee comprising a diverse set of government stakeholders and, finally, to the approval of the Secretary, Department of Information Technology. The Draft Rules, on the contrary, disregard any such requirement: they not only vest the judicial/quasi-judicial decision to block content solely in a private corporation but also fail to incorporate sufficient procedural safeguards to check the exercise of such power.
Numerous provisions also lack the degree of specificity required to protect against the arbitrary exercise of power. For example, the broad and vague definition of ‘unlawful content’ in Rule 3(2)(b) paves the way for ‘censorship creep’: intermediaries may be forced to err on the side of caution and resort to over-enforcement. Further, the Government’s power under Rule 3(3)(a) to request “such information or assistance as asked for” for the “detection or prevention of offences, cybersecurity and matters connected with or incidental thereto” is extremely broad. Finally, given the absence of checks and balances on the exercise of Government and intermediary power, by way of reporting requirements for example, and the minimal weight accorded to principles of natural justice, the Draft Rules must be held violative of Art. 14.
Under the Draft Rules, intermediaries are directed to inform their users not to upload or share any information which “threatens public health or safety”. This requirement runs afoul of Art. 19 of the Constitution, since public health or safety is not among the grounds on which Art. 19(2) permits the right to free speech to be restricted. Further, private intermediaries removing content without any form of due process is in itself an unreasonable restriction on Art. 19. Additionally, certain provisions, such as the traceability requirement, are excessive and fail the three-pronged proportionality test laid down by the Supreme Court in Justice K.S. Puttaswamy v. Union of India. The resulting invasion of the right to privacy, guaranteed under Art. 21 of the Constitution, therefore exceeds permissible limits and must be struck down.
Conclusion
The Draft Rules symbolise a bona fide intention on the part of the government to confront the growing threat of fake news and hateful online content. However, by failing to engage in informed deliberation with the concerned stakeholders, the government has grossly mischaracterised both the nature of the problem at stake and the response it requires. If the government chooses to depart from the globally accepted position on intermediary liability and safe-harbour protection, it must, at the very least, establish the commercial and technological viability of its approach. In particular, it must pay due consideration to the potential displacement of start-ups and the consequent centralisation of market power. Finally, it must ensure that its approach falls within statutory and constitutional bounds, lest it be exposed to wasteful and avoidable litigation.