This blog is written by Shivani Sehrawat, a 2nd year law student at Faculty of Law, University of Delhi.
INTRODUCTION
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023 aim to regulate the operation of online gaming platforms and the dissemination of information about the Central Government on online platforms. To secure these objectives, the Amended Rules establish different agencies at various levels.
LEGISLATIVE HISTORY
The Central Government enacted the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023 in exercise of the powers conferred by sub-section (1) and clauses (z) and (zg) of sub-section (2) of Section 87 of the Information Technology Act, 2000. The Rules came into force on April 6, 2023, when the Ministry of Electronics and Information Technology notified them in the Official Gazette.
OBJECT
The primary objective of the 2023 Rules is to ensure observance of security standards, particularly on online platforms that offer gaming facilities or carry news about the affairs of the Central Government, so as to avoid future personal and public loss.
To achieve this purpose, the Rules provide that gaming platforms must exercise complete due diligence by complying with all prescribed security requirements, and that information-sharing platforms must ensure that no fake or misleading information about the Central Government is shared on their portals.
PROVISIONS OF THE AMENDED RULES
| S. No. | PROVISION | BRIEF OF PROVISION |
| --- | --- | --- |
| 1. | Objective | The objective of these rules is to replace and update the Information Technology (Intermediaries Guidelines) Rules, 2011 with new rules governing intermediaries under the Information Technology Act, 2000. This aims to establish current guidelines for intermediaries and ensure compliance with regulatory standards set by the Central Government. |
| 2. | Rule 2 – Definitions | These rules establish definitions and terms for regulatory purposes related to online curated content and digital media. The definitions cover various aspects, such as access control mechanisms for content restriction based on user identity or age verification, access services for improving accessibility for persons with disabilities, and content descriptors for classifying online curated content based on specific concerns like discrimination, nudity, or violence. The rules also outline terms like 'grievance' for complaints related to content or intermediary duties and define roles like the 'Grievance Officer' appointed by intermediaries or publishers. Additionally, the rules specify definitions for 'news and current affairs content', 'newspapers', 'news aggregators', 'online curated content', 'digital media', and related terms used within the framework of the Information Technology Act, 2000. These definitions aim to provide clarity and standardization in the regulation of online content and digital media activities. |
| 3. | Rule 3(1) – Due diligence by an intermediary | Intermediaries, including social media and online gaming platforms, must display rules, privacy policies, and user agreements prominently on their websites or apps. These guidelines inform users not to share certain types of content, such as offensive or infringing material. The intermediary should periodically inform users about these policies and quickly remove or disable illegal content upon notification from authorities or users. They must also preserve removed content for investigation, secure their systems, and assist law enforcement within specific timeframes. Additionally, intermediaries must not alter their systems to circumvent laws and must report cybersecurity incidents to the appropriate authorities. |
| 4. | Rule 3(2) – Grievance redressal mechanism of intermediary | The intermediary must prominently display on its website or mobile application the details of a Grievance Officer for users to report violations. The Grievance Officer must acknowledge complaints within 24 hours and resolve them within 15 days, with specific timeframes for certain types of complaints like removal requests. The intermediary must also promptly act on complaints related to specific objectionable content, taking measures to remove or disable access within 24 hours of receiving the complaint. Additionally, the intermediary should have a system in place to receive and process complaints related to objectionable content effectively. |
| 5. | Rule 4A – Verification of online real money games | According to this rule, the Ministry may designate online gaming self-regulatory bodies through official notifications. Entities eligible for designation must meet specific criteria, including being a registered company under the Companies Act, having a representative membership from the gaming industry, and maintaining responsible online gaming practices. The self-regulatory body can verify online real money games based on set criteria, ensuring compliance with rules and laws. Verified games must display a mark of approval, and the self-regulatory body must maintain updated lists of verified games and members on their platforms. The Ministry has oversight authority and can issue directives or revoke designations if necessary. Grievance redressal frameworks must be prominently displayed, allowing complaints to be addressed within specific timelines. The Ministry can intervene in cases of non-conformity or for user protection. The term "prominently publish" means making information clearly visible on relevant platforms. |
| 6. | Rule 4C – Obligations in relation to online games other than online real money games | The Central Government may direct intermediaries to comply with specific obligations, similar to those for permissible online real money games, for certain online games. This applies if it is deemed necessary for India's sovereignty, security, foreign relations, public order, or preventing user harm. The period of compliance will be specified in the notification. The rules under Rule 4A will then apply to these notified online games. |
| 7. | Rule 10 – Furnishing and processing of grievance | Anyone with a grievance about content under the Code of Ethics can use the publisher's grievance mechanism. The publisher will acknowledge the grievance within 24 hours and resolve it within 15 days. If unresolved, it can be escalated to the publisher's self-regulating body. If still dissatisfied, an appeal can be made to the Oversight Mechanism for resolution within 15 days. |
| 8. | Rule 11 – Self-regulating mechanism at Level I | Publishers must establish a grievance redressal mechanism with an appointed Grievance Officer in India, who resolves grievances within fifteen days. Publishers must display contact details for the Grievance Officer on their website and comply with a self-regulating body's terms. The Grievance Officer serves as the contact point for Code of Ethics-related grievances and interacts with complainants, self-regulating bodies, and the Ministry. Publishers must classify online curated content according to specified categories and display content ratings prominently for users before access. |
| 9. | Rule 12 – Self-regulating body | Self-regulatory bodies of publishers, led by retired judges or experts from the media, oversee adherence to the Code of Ethics. They register with the Ministry within thirty days, ensuring alignment with the rules. Functions include guiding publishers, addressing grievances, and issuing compliance advisories. Advisories may include warnings, apologies, content modifications, or referral of sensitive content to the Ministry. If there is no violation, the body informs complainants and publishers. Non-compliance prompts referral to the Oversight Mechanism within fifteen days. |
| 10. | Rule 13 – Oversight mechanism | The Ministry oversees Code of Ethics compliance, develops a charter for self-regulating bodies, and establishes an Inter-Departmental Committee for grievances. It issues guidance to publishers and directs adherence to the Code of Ethics. An officer of the Ministry not below the rank of a Joint Secretary is appointed as the "Authorised Officer" for issuing directions under Rules 15 or 16. |
| 11. | Rule 16 – Blocking of information in case of emergency | In emergencies, the Authorised Officer examines content for potential blocking under Section 69A of the Act, submitting recommendations to the Secretary, Ministry of Information and Broadcasting. The Secretary may issue interim blocking directions if necessary, recording reasons, without a hearing. The Authorised Officer must bring the matter before the Committee within 48 hours. The Secretary decides on the blocking request based on the Committee's recommendations; if not approved, the interim block is revoked. |
| 12. | Rule 18 – Furnishing of information | Publishers of news, current affairs, and online curated content in India must provide entity details and documents to the Ministry for communication and coordination. This information must be submitted within 30 days of the publication of the rules or the commencement of operations in India. Publishers must issue monthly compliance reports detailing grievances received and actions taken. The Ministry can request additional information as needed for rule implementation. |
LANDMARK JUDGMENTS
- SHREYA SINGHAL V. UNION OF INDIA AIR 2015 SUPREME COURT 1523
In this landmark case concerning Section 66A of the IT Act, 2000, the Supreme Court struck down the provision for its lack of clear definitions of the terms used within the section. Section 66A provided punishment for sending offensive or false messages through communication devices or computer resources, causing annoyance, danger, criminal intimidation, insult, injury, or inconvenience to others. The Court emphasized the importance of defining these terms precisely to avoid misuse and uphold free speech rights. Additionally, the Court addressed the issue of actual knowledge under the Intermediary Guidelines, holding that an intermediary's knowledge is established upon notification by a court or appropriate government authority. The Court ruled that intermediaries must promptly remove unlawful content upon receiving actual knowledge of its illegality, failing which they would lose intermediary status and breach their due diligence obligations. This ruling clarified the responsibilities of intermediaries in managing content on online platforms while safeguarding free expression and legal compliance.
- MYSPACE INC. V. SUPER CASSETTES INDUSTRIES LTD (2017) 236 DLT 478 (DB)
In this landmark judgment, the court addressed the liability of online platforms like MySpace for objectionable content uploaded by third parties. MySpace operated a social media website where users could upload and view content, with no alterations made to user-uploaded content apart from additional advertisements. The court examined the extent of the platform's liability and the adequacy of its due diligence in implementing rules, privacy policies, regulations, and user agreements. Regarding liability, the court focused on the concept of knowledge outlined in Section 79(3)(b) of the IT Act. The court determined that an intermediary's knowledge of infringing content is based on specific information provided by the aggrieved person, typically in the form of specific URLs pointing to the location of the infringing content on the platform. This requirement clarified the conditions under which intermediaries like MySpace could be held liable for user-uploaded content, emphasizing the importance of addressing infringing material promptly upon receiving specific information from the affected party.
NEEDS & SIGNIFICANCE OF THE AMENDED RULES
The amendment to the IT Rules, 2021 was a necessary requisite, as cases of fraud on online gaming platforms were increasing rapidly across the country and, in the absence of any specific law to regulate their activities, individuals were suffering huge monetary losses along with violations of their fundamental rights. The second edition of PwC's survey report also noted that more than 57% of frauds are committed on online platforms, including those involving monetary transactions, and that one of the major reasons is the absence of a regulatory mechanism.
Similarly, there has been a remarkable increase in the spread of fake news about the government over the last couple of years, owing to rapid technological advancement and easy accessibility, especially during the pandemic, which resulted in severe chaos among the masses and widespread aggression against the Union government. Consequently, the government felt it necessary to establish a mechanism regulating the spread of false and fake information on online platforms in particular, since sharing any sort of information on these platforms is very convenient and requires no verification, which in itself is a leading reason to demand a control mechanism.
Thus, a regulatory mechanism to control the affairs and operations of online platforms is necessary to protect individuals against harm, including exposure to wrong information about government activities, which has the capacity to produce hazardous effects across the territory of the nation.
CRITICISM OF THE AMENDMENT RULES 2023
Despite the fact that the amendment to the 2021 Rules was the need of the hour to ensure the security of citizens, the Amended Rules have been severely criticized and condemned by the general public, legal practitioners, academicians, and journalists throughout the country for violating fundamental principles of the Constitution of India and core values of democracy.
- RULES ARE BEYOND THE SCOPE OF THE ACT
The Information Technology Act, 2000 empowers the Central Government to make rules specifying the due diligence requirements to be observed by intermediaries for claiming exemption under the Act. The Amended Rules, 2023, however, create a new category of false and fake information for regulating and restricting the content available on intermediaries' websites. The creation of this new category is beyond the scope of the Central Government's delegated power, as the Act does not provide for the regulation of fake information, still less by an executive body of the government.
- VIOLATION OF ARTICLE 19 OF THE COI
Article 19(1)(a) of the Constitution of India grants every citizen the fundamental right to freedom of speech and expression, while Article 19(2) permits certain reasonable restrictions on the exercise of that right. The Apex Court has repeatedly held in numerous judgments that the right under Article 19(1)(a) can be restricted only on the grounds mentioned in Article 19(2). Therefore, the Amended Rules, which provide for the deletion of fake or false information, are violative of the fundamental right: they create an additional ground for restricting freedom of speech and expression, and such a ground cannot survive legally as it is not a constitutionally recognized restriction.
- CONTRARY TO CONSTITUTIONAL MORALITY
The provision for control and identification of information about the business of the Central Government, whether true or false, by the fact check unit of the Press Information Bureau is contrary to the freedom and rights of journalists and the press, as it hinders their professional privilege of disseminating news and opinion, which is integral to their occupation.
Moreover, the Indian Newspaper Society urged the Ministry of Electronics & Information Technology to withdraw the IT Amendment Rules, 2023, as they give sweeping powers to the Central Government to determine what is fake, false, or misleading information and to take down such content. This is contrary to the democratic values of the nation, as it is the right of the press, not of the government, to determine what sort of information should be shared with the public. Some press media houses have described the Amended Rules as a mode of censorship of the freedom of the media.
- AGAINST THE PRINCIPLES OF NATURAL JUSTICE
The Amended Rules make the Central Government the sole authority to determine whether particular information is truthful, which amounts to a delegation of excessive power; not only is this beyond the scope of the IT Act, 2000, but it is also against the provisions of the Constitution of India, thus violating the Doctrine of Ultra Vires. Similarly, making the Central Government the final arbiter of the authenticity of information about its own business is contrary to the principle of nemo judex in causa sua, as the government is made the judge in its own cause.
- RULES ARE ENACTED BEYOND THE JURISDICTION
The Central Government has the power to make laws on communication only as specified in Entry 31 of the Union List in the Seventh Schedule of the Constitution. Consequently, the Rules are applicable only to the extent of communication on the platforms of online gaming intermediaries, and any application beyond that would not be legally acceptable, as the Central Government does not have the competence to enact law on that matter.
CONCLUDING REMARKS
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023 have a definite objective: providing a regulatory mechanism to control the fraudulent conduct of individual users on the platforms of online intermediaries and ensuring compliance with security standards to reduce the rising rate of cyber offences. However, owing to a lack of accountability and proper safeguarding procedures, the Rules have received severe backlash and strict scrutiny from all concerned interest groups.
Moreover, the Rules grant the Central Government excessive power without demonstrated need, violating basic democratic values and the constitutional rights of stakeholders across the spectrum, while prescribing no safeguard against abuse of that power. Similarly, the Rules ambiguously seek to regulate gaming intermediaries without stipulating any appropriate method or procedure for identifying the concerned intermediaries.
However, despite this relentless criticism, the Amended Rules, 2023 have a legitimate purpose that is necessary to achieve in the larger public interest: the spread of false information about government activities and the affairs of the Central Government might lead to serious security concerns at the national level and even to public riots, which have become a new Indian reality since 2020. Also, if the operations of gaming platforms are left uncontrolled, these intermediaries may cause irreparable harm to their users, as the number of online gaming users is growing steadily throughout India.
Hence, the Amended Rules, 2023 demonstrate the need for their enactment; however, the faults in their drafting do not do justice to their objectives, and certain modifications to their provisions are therefore required.