
I.

THAT THE INTERMEDIARIES CANNOT AVAIL SAFE HARBOR
PROTECTION UNDER SECTION 79 OF THE INFORMATION
TECHNOLOGY ACT, 2000.
The counsel on behalf of the petitioners humbly submits before the Hon'ble Court that
Funbook and ChatOn have failed to observe the due diligence required of any intermediary to
claim immunity from third-party content. In the instant case, the videos of a man being beaten to
death and of Lady Unnamed being raped were circulated over the platforms Funbook and
ChatOn over and over again, which further led to the instigation of riots in certain parts of the
Republic of Sindhia.
The Organization for Economic Co-operation and Development (OECD) in April 2010
proposed that "Internet intermediaries" be defined as follows:1 "Internet intermediaries bring
together or facilitate transactions between third parties on the Internet. They give access to,
host, transmit and index content, products and services originated by third parties on the
Internet or provide Internet-based services to third parties." In India, Internet technology is
regulated by the IT Act. The Act also stands as a facilitator of digital commerce in the country
by providing safeguard tools. In many ways the Indian "safe harbor" provisions for online
intermediaries are similar to the ECD,2 not the DMCA.3 "Intermediary" is a broadly defined
term, catering to all perceivable kinds of service providers on the Internet, and is similar to the
definition of intermediaries found in the ECD. Intermediaries find mention and are defined
under Section 2(1)(w) of the Act, which reads as follows: "(w) 'intermediary', with respect
to any particular electronic records, means any person who on behalf of another person
receives, stores or transmits that record or provides any service with respect to that record and
includes telecom service providers, network service providers, internet service providers, web-
hosting service providers, search engines, online-payment sites, online-auction sites, online-
market places and cyber cafes;"
The true intent of Section 79 is to ensure that, in terms of globally accepted standards of
intermediary liability, and to further digital trade and the economy, an intermediary is granted
certain protections. Section 79 is neither an enforcement provision nor does it list out any penal
consequences for non-compliance. It sets up a scheme where intermediaries have to follow

1 Organisation for Economic Co-operation and Development, The Economic and Social Role of Internet Intermediaries, Definitions, 9, 2010.
2 The Electronic Commerce (EC Directive) Regulations 2002.
3 Digital Millennium Copyright Act (1998).
certain minimum standards to avoid liability; it provides for an affirmative defence and not a
blanket immunity from liability.
Thus, the platforms used for transmitting both videos in this case, Funbook and
ChatOn, lie within the ambit of the definition of intermediary, and hence, upon failure
to fulfil the necessary requirements with respect to third-party content, the intermediaries will
attract liability for the same. Intermediaries are required to specifically follow all the
rules mentioned under the intermediary guidelines4 to claim immunity from third-party content.
 THE INTERMEDIARIES FAILED TO OBSERVE DUE DILIGENCE AND CERTAIN
GUIDELINES ISSUED BY THE CENTRAL GOVERNMENT.
The due diligence to be followed by intermediaries is enacted under Rule 3 of the
Information Technology (Intermediaries Guidelines) Rules, 2011, which reads as follows:
"Due diligence to be observed by intermediary: The intermediary shall observe the following due
diligence while discharging his duties, namely: the intermediary shall inform users not to host,
display, upload, modify, publish, transmit, update or share any information that (a) belongs to
another person and to which the user does not have any right to; (b) is grossly harmful,
harassing, blasphemous, defamatory, obscene, pornographic, paedophilic, libellous, invasive of
another's privacy, hateful, or racially, ethnically objectionable, disparaging, relating or
encouraging money laundering or gambling, or otherwise unlawful in any manner whatever;
(c) harms minors in any way; (d) infringes any patent, trademark, copyright or other proprietary
rights; (e) violates any law for the time being in force; (f) deceives or misleads the addressee
about the origin of such messages or communicates any information which is grossly offensive
or menacing in nature; (g) impersonates another person; (h) contains software viruses or any
other computer code, files or programs designed to interrupt, destroy or limit the functionality
of any computer resource; (i) threatens the unity, integrity, defence, security or sovereignty of
India, friendly relations with foreign states, or public order, or causes incitement to the
commission of any cognisable offence, or prevents investigation of any offence, or is insulting
any other nation."
The term "due diligence" is not exactly defined by the Act, but in Bharat Petroleum
Corporation Ltd. v. Precious Finance Investment Pvt. Ltd.,5 the Court noted that the dictionary
meaning of the expression "due diligence", as given in Black's Law Dictionary,6 is "such a measure
of prudence, activity or assiduity, as is properly to be expected from, and ordinarily exercised
by, a reasonable and prudent man under the particular circumstances; not measured by any

4 Information Technology (Intermediaries Guidelines) Rules (2011).
5 Bharat Petroleum Corporation Ltd. v. Precious Finance Investment Pvt. Ltd., 2006 (6) BomCR 510; Chander Kanta Bansal v. Rajinder Singh Anand, (2008) 5 S.C.C. 117.
6 Black's Law Dictionary (6th ed. 1990).
absolute standard, but depending on the relative facts of the special case." Similarly, the Law
Lexicon7 explains "due diligence" to mean such watchful caution and foresight as the
circumstances of the particular case demand. While examining the explanation offered, or the
cause shown as to why, in spite of due diligence, a party could not have raised the matter before
commencement of trial, the Court may have to see the circumstances in which the party is
seeking amendment. In short, the explanation as to "due diligence" depends upon the particular
circumstances and the relative facts of each case to reach a conclusion one way or the other.
Here, in the present case, the intermediaries lacked a proper vigilance mechanism, as the content
uploaded to their platforms was not only against their terms of use but also
incompatible with the due diligence process to be followed by all intermediaries to claim immunity
from third-party content. The video belonged to another person, the uploader had no right
over it, and it was grossly harmful in nature and could corrupt the minds of young
people and minors in society. Moreover, the video was also in violation of copyright, as it was
recorded without consent, and hence Lady Unnamed has rights over the same.
Further, since the video also contained provocative religious slogans which incited hatred
among the different religious communities of society, it led to riots in certain parts
of the Republic of Sindhia. Thus, going by the definition of due diligence, Funbook and ChatOn
were not cautious about the content uploaded to their platforms and were negligent in fulfilling
their obligations.
 UPON RECEIVING 'ACTUAL KNOWLEDGE' OF THE ILLEGAL CONTENT ON
THEIR PLATFORMS, THE INTERMEDIARIES DID NOT TAKE ANY ACTION.
In view of the abovesaid acts, respondent no. 1 is an "intermediary" within the definition of
Section 2(1)(w) and Section 79 of the Information Technology Act, 2000. Under Section
79(3)(b) of the IT Act, 2000, respondent no. 1 is under an obligation to remove unlawful content
if it receives actual notice from the affected party of any illegal content being
circulated/published through its service. It is bound to comply with the Information Technology
(Intermediaries Guidelines) Rules, 2011. Rule 3(3) of the said rules, read with Rule 3(2), requires
an intermediary to observe due diligence and not knowingly host or publish any information
that is grossly harmful, defamatory, libellous, disparaging or otherwise unlawful. Prima facie,
it appears that respondent no. 1 acted in clear non-compliance with the said due diligence
obligations under the said rules, read with Section 79 of the IT Act, 2000, by not removing illegal

7 P. Ramanatha Aiyer, Law Lexicon (2nd ed. 2001).
information even after it obtained actual knowledge of the same, when the plaintiff served
a cease and desist notice on it.
Further, Rule 3(5) of the said rules requires an intermediary to inform its users that, in case of
non-compliance with the rules of the user agreement, the intermediary has the right to terminate
the access or usage rights of users and remove non-compliant information. Similar
platforms, such as WhatsApp, explicitly state their terms of use,8 which read as
follows:
“You must access and use our Services only for legal, authorized, and acceptable purposes.
You will not use (or assist others in using) our Services in ways that: (a) violate, misappropriate,
or infringe the rights of WhatsApp, our users, or others, including privacy, publicity,
intellectual property, or other proprietary rights; (b) are illegal, obscene, defamatory,
threatening, intimidating, harassing, hateful, racially, or ethnically offensive, or instigate or
encourage conduct that would be illegal, or otherwise inappropriate, including promoting
violent crimes; (c) involve publishing falsehoods, misrepresentations, or misleading
statements; (d) impersonate someone; (e) involve sending illegal or impermissible
communications such as bulk messaging, auto-messaging, auto-dialling, and the like; or (f)
involve any non-personal use of our Services unless otherwise authorized by us.”
In the present case, respondent no. 1, despite being aware of the illegal content posted on its platform
and that the content was in violation of its terms of use, neither removed the illegal content nor
terminated the uploader's rights of access to its platform. Moreover, the illegal content was brought to
its notice by the aggrieved person, i.e., Lady Unnamed, whose video was transmitted through
the platform without her consent, thus violating her fundamental right to privacy.
A similar provision, dealing with actual knowledge of content posted on an
intermediary's platform, is enacted under the Copyright Act. Under Section 51(a)(ii),9 once a
copyright owner establishes that an intermediary has permitted its space to be used for communicating
infringing works to the public, the burden is upon the intermediary to prove its ignorance or lack of
reasonable grounds to believe that infringing content was uploaded on its website. The language of
Section 51(a)(ii) shows that an infringer's lack of knowledge about infringement is irrelevant.

8 Acceptable Use of Our Services, WhatsApp (Aug. 07, 2019, 11:10 PM), https://www.whatsapp.com/legal/#terms-of-service.
9 Copyright Act §51(a)(ii) (1957).
In the case of MySpace,10 the intermediary had knowledge of continuing infringement despite notice, which was
sufficient to establish reasonable grounds of belief about infringement on its website. Reliance
was placed on Sega Enterprises Ltd. v. MAPHIA11 for the proposition that where there is knowledge,
encouragement, direction and provision to enable infringement, the website is liable regardless
of its lack of knowledge of the exact location of the files. The exclusive rights of exploitation,
adaptation, storage (including and especially electronic storage), reproduction and
communication to the public were violated because of MySpace's acts. Its inaction in taking
down the offending content resulted in infringement.
In Oriental Press Group,12 the Hong Kong Court of Final Appeal held that a host (in that case, the operator of a popular online forum)
could only have a defence "if it was established that [he], upon obtaining knowledge of the
content, promptly took all reasonable steps to remove the offending content from circulation
as soon as reasonably practicable".
There is a significant distance between knowing that content exists and knowing that it is illegal;
that is, between knowing about unlawful activity and knowing about the unlawfulness of certain
activity. In establishing that liability flows strictly from knowledge of illegal content, those
decisions fail to account for the great difficulty that, at times, exists in
inquiring into illegality itself.13
In A&M Records, Inc. v. Napster,14 the issues were whether Napster, the defendant, was liable on
account of its knowledge, and what the applicable standard was for deciding an intermediary's
infringement. On the facts, it was held that Napster had actual information and knowledge of
copyrighted works being infringed on its site. The appellate court indicated the relevant test for the
purpose: "We agree that if a computer system operator learns of specific infringing material
available on his system and fails to purge such material from the system, the operator knows
of and contributes to direct infringement. ... Conversely, absent any specific information which
identifies infringing activity, a computer system operator cannot be liable for contributory
infringement merely because the structure of the system allows for the exchange of copyrighted
material."

Even after the issue of a cease and desist notice, the intermediary expressing its inability to remove
or block such illegal content or sexually explicit material only on the ground that it has no

10 MySpace v. Super Cassettes, 236 (2017) D.L.T. 478.
11 Sega Enterprises Ltd. v. MAPHIA, 857 F. Supp. 679, 683 (N.D. Cal. 1994).
12 Oriental Press Group and Another v. Fevaworks Solutions Ltd., (2013) 16 H.K.C.F.A.R. 366, 401; Metropolitan v. Designtechnica, (2009) E.W.H.C. 1765 (Q.B.).
13 Hamling v. United States, 418 U.S. 87, 120 (1974); Rosen v. United States, 161 U.S. 29, 41 (1896).
14 A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
control over it, and the internet service provider directing the parties to approach the Court and
obtain an order for removal of such material, indirectly amounts to encouraging net users to
post such illegal content or sexually explicit material, including child pornography, on the
websites, and the content will continue on the website until a direction is issued by a competent Court
for its removal. In this case as well, the plaintiff flagged the content and asked for its
immediate removal and for the transmission of the video to be stopped,15 as the entire reputation of the
aggrieved lady against whom such immoral content was posted would be greatly affected in
the eyes of society, and her character and image would be seriously harmed.
 THE INTERMEDIARIES WERE INVOLVED IN CONSPIRING, ABETTING, AIDING OR INDUCING THE
COMMISSION OF THE UNLAWFUL ACT.
Until two or three years ago, the view was that an intermediary is not the publisher of
information. Intermediaries were regarded as neutral highways, but this has changed. They now
curate content on the basis of user behaviour and therefore cannot be regarded as 'neutral'.
Governments all across the world are grappling with the issue of moderating illegal content on
the Internet, as illustrated by the Christchurch shooting16 in New Zealand, where the perpetrator
live-streamed the act on Facebook. The intermediaries here had every power to remove the content
but did not remove it, which suggests that they consented to the publication of the
defamatory content. They might have had a defence prior to Lady Unnamed sending them a
notice, but after receiving the notice, they were obligated to remove the content as it affected the
complainant's right to live with dignity and self-respect.
The content in question, hosted on the respondents' platforms, is per se inflammatory and
unacceptable by any set of community standards; it seeks to create enmity, hatred and communal
violence amongst various religious communities; it is demeaning, degrading and obscene; and it
will corrupt minds and adversely affect religious sentiments. It is further submitted that the
videos circulated on the respondents' platforms are repugnant to the secular fabric provided
by the Constitution of India and would be intolerable to any community or religion. It is further
alleged that, on a bare perusal of the contents, it is clear that the same would certainly corrupt
young minds below the age of 18 and even elders; the material is highly provocative and may even
lead to consequences affecting communal harmony.
It is further stated that the said contents, available and transmitted on these platforms, are per se
unacceptable and clearly establish offences punishable under various provisions

15 Moot Proposition, ¶8.
16 Christchurch shootings: New Zealanders hand over guns, BBC News (Aug. 17, 2019, 01:08 AM), https://www.bbc.com/news/world-asia-48973511.
mentioned in the IPC. If such content is allowed to continue on these platforms in this form,
then incalculable and irreparable damage will be caused to the secular fabric of India. It is
alleged that all those who are responsible for allowing this content to be transmitted on the
platforms have conspired with those who are the source of such content, and with those,
including undisclosed persons, who are promoting such material with malice to defame the
country and with intent to spread communal violence and destabilise the country, and they are
liable to be prosecuted and punished for offences under Sections 153A, 153B, 292, 293, 109,
500 and 120-B of the IPC.
It is further averred that the contents shown on the social networking platforms are
clearly instigating enmity between different groups on grounds of religion, race,
place of birth, residence, language, etc., and constitute acts prejudicial to the maintenance of
harmony, as is quite apparent on a bare look at the material available on these social networking
platforms. It is further stated that the content shown on these platforms amounts
to imputations and assertions which are prejudicial to national integration. It is asserted that the
contents available on these platforms are illegal and indecent in nature and may
lead to the creation of obscene books, pamphlets and papers which can easily be downloaded from
these platforms, affecting the minds of children; the content is harmful to social harmony and may
also lead to an increase in crime against women.

Even if it were assumed that Funbook and ChatOn could claim safe harbor under Section 79, they do
not fulfil the criteria enlisted in the provision. They would not fall under Section 79(1), because
that provision deals with information held by an intermediary in its capacity as an intermediary. If an
intermediary actively begins participating in the communication of such content and does not comply
with the requirements of Sections 79(2) and 79(3), the safe harbor under Section 79(1) is
inapplicable.
II. THAT RESPONDENT NO.1 SHALL BE LIABLE TO PAY COMPENSATION
TO LADY UNNAMED
The petitioner humbly submits before this Hon’ble Court that the Respondent no.1 shall be
held liable to pay compensation to Lady Unnamed.
Section 43A of the Act states that where a body corporate, possessing, dealing or handling
any sensitive personal data or information in a computer resource which it owns, controls or
operates, is negligent in implementing and maintaining reasonable security practices and
procedures and thereby causes wrongful loss or wrongful gain to any person, such body
corporate shall be liable to pay damages by way of compensation, not exceeding five crore
rupees, to the person so affected.17 Further, Section 69A of the IT Act provides a procedure
for the government to take down third-party content, failure to comply with which makes the
intermediary liable to a penalty.18 Similarly, in its decision No. 2286 of June 29, 2004, the
Court of Catania, making reference to the E-Commerce Decree, ruled that a hosting and
caching provider can be held liable for negligent behaviour when it is aware of the presence
on the website of potentially infringing material and does not act expeditiously to ascertain
the actual illegality of such material and subsequently remove it.19 Also, Article 14 of the E-
Commerce Directive provides that the service provider is not liable when the provider does not
have actual knowledge of illegal activity or information and the provider, upon obtaining such
knowledge or awareness, acts expeditiously to remove or to disable access to the information.20
Further, in MySpace v. Super Cassettes,21 the Delhi High Court interpreted the requirement of
awareness to mean 'actual knowledge' of specific infringing works, not necessarily by way of
a court order. Drawing a parallel to the case at hand, Funbook and ChatOn received various
complaints from users regarding the concerned videos, which violated the privacy of women.
They were aware of the concerned video, but still they did not take preventive measures to
maintain reasonable security practices. Moreover, the video also caused massive riots and
violence in the State of Broomland, causing a communal fight between the Sindhus and
Jeruslams.22
Further, the Apex Court has also held search engines liable, as intermediaries, for hosting
advertisements and keywords relating to pre-natal sex determination. In Sabu
Mathew George v. Union of India,23 the Court held them liable for displaying advertisements or
searches in violation of the Pre-natal Sex Determination Act, and imposed upon the search
engines obligations to monitor complaints and respond to complaints relating to the Act. In the
recent case of R.T.I. v. YouTube,24 the Court held on appeal that YouTube is
obliged to remove unlawful material as soon as it is notified of the breach by the relevant right
holder, without having to wait for a judicial order requiring the removal.25 In Godfrey v. Demon
Internet Ltd.,26 the Court held that Demon had been informed that it was publishing defamatory matter, and

17 Information Technology Act §43A (2000).
18 Information Technology Act §69A (2000).
19 Pauline C. Reich, Cybercrime and Security (Oxford University Press, 2011).
20 EU E-Commerce Directive Art. 14, 2000/31/E.C.
21 MySpace v. Super Cassettes, 236 (2017) D.L.T. 478.
22 Moot Proposition, ¶7.
23 Sabu Mathew George v. Union of India, (2017) 7 S.C.C. 657.
24 Supra 19.
25 Reti Televisive Italiane S.p.A. (R.T.I.) v. Yahoo!, (2011) Case No. 3821/11.
26 Godfrey v. Demon Internet Ltd., (1999) 4 All E.R. 342.
it neither took reasonable steps to remove the content nor could be said to have had no knowledge of
the same. Hence, it could not escape liability under the UK Defamation Act 1996. Connecting this
with the present facts, Lady Unnamed, whose rape video had gone viral as Video 2, expressed
her anguish and repeatedly pleaded with Funbook and ChatOn to remove the video and requested
that its further circulation be prevented.27 Also, the DIT issued a show cause notice to Funbook
demanding a detailed report of the incidents.28 Here, Funbook and ChatOn had knowledge of
obscene videos being publicly circulated, and still they did not take any step to remove them.
Moreover, Lady Unnamed was not satisfied with the response received from Funbook and
ChatOn after repeatedly complaining to them.29 Therefore, they are liable to pay compensation to
Lady Unnamed for violating her privacy. They are also liable for being
negligent and inefficient in maintaining reasonable security practices.

27 Moot Proposition, ¶8.
28 Moot Proposition, ¶10.
29 Supra 8.
