LAUREN GALLO WHITE, State Bar No. 309075
KELLY M. KNOLL, State Bar No. 305579
WILSON SONSINI GOODRICH & ROSATI
Professional Corporation
650 Page Mill Road
Palo Alto, CA 94304-1050
Telephone: (650) 493-9300
Facsimile: (650) 565-5100
Email: dkramer@wsgr.com
Email: lwhite@wsgr.com
Email: kknoll@wsgr.com

Attorneys for Defendants
GOOGLE LLC and YOUTUBE, LLC
TABLE OF CONTENTS

Page
INTRODUCTION ........................................................................................................................... 1
BACKGROUND ............................................................................................................................. 2
ARGUMENT .................................................................................................................................. 7
TABLE OF AUTHORITIES

Page(s)
CASES

Ramos v. Nielsen,
    336 F. Supp. 3d 1075 (N.D. Cal. 2018) .............................................................................. 7

Rautenberg v. Westland,
    227 Cal. App. 2d 566 (1964) ............................................................................................. 19

Riley v. Nat’l Fed’n of Blind of N.C., Inc.,
    487 U.S. 781 (1988)
INTRODUCTION

Plaintiffs seek an extraordinary emergency TRO that would upend the status quo by compelling YouTube to display videos that it considers harmful and in violation of its content policies. Plaintiffs ask the Court to force YouTube to publish videos that, among other things, attacked former White House Chief of Staff John Podesta as a “world class underage sex slave op cover upper,” accused the late Senator John McCain of aiding and abetting a child sex-trafficking ring, and asserted that President George H.W. Bush was secretly executed for sex trafficking. Plaintiffs’ request for a mandatory injunction overriding YouTube’s editorial judgments has no
Plaintiffs notably do not try to premise their application on their First Amendment claim (which is foreclosed by Ninth Circuit precedent and was recently described by another judge in this Court as “entirely implausible”). See Prager Univ. v. Google LLC, 951 F.3d 991, 997-99 (9th Cir. 2020); Tr. of Hearing at 9:13, Daniels v. Alphabet Inc., No. 5:20-cv-04687-VKD (N.D. Cal. Oct. 6, 2020). Instead, Plaintiffs rely solely on their claim that YouTube breached its Terms of Service by removing Plaintiffs’ YouTube channels. But this argument proceeds only by ignoring (or distorting) the operative language of the contract. The Terms of Service expressly provide that “YouTube is under no obligation to host or serve Content” and that YouTube has discretion to remove content that it reasonably believes violates its content guidelines or “may cause harm to YouTube, our users, or third parties.” That is precisely what happened here. Indeed, Plaintiffs do not argue that their channels complied with YouTube’s content policies, which prohibit harassment, including content that targets individuals in connection with conspiracy theories

Plaintiffs were engaged in conduct that violated its Harassment policy, which is designed to prevent harm to YouTube, its users, and others. YouTube did not breach the contract by doing so.

Plaintiffs’ lack of any chance of success on the merits dooms any request for preliminary relief, but Plaintiffs’ bid for a TRO flounders at every other step of the analysis as well.
consequence of YouTube’s exercise of its contractual rights and where Plaintiffs remain free to use any number of other platforms, including their own websites, to disseminate their views.

Second, both the balance of equities and the public interest tip decisively against the relief that Plaintiffs seek. Plaintiffs have no First Amendment right to force YouTube to host their content. In contrast, an injunction compelling YouTube to publish videos that it has determined violate its content policies would directly violate YouTube’s own rights to exercise “editorial discretion over the speech and speakers in the forum.” Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1930 (2019). Such an order would also be contrary to the public’s interest in avoiding unwanted speech, including content promoting outlandish conspiracy theories that have

Finally, as a matter of law, Plaintiffs are not entitled to any injunction seeking specific performance of YouTube’s Terms of Service. That is so both because Plaintiffs cannot point to any provision of that contract that the Court could order to be specifically performed and because that agreement is a personal-service contract that involves considerable discretion and judgment. For all these reasons, Plaintiffs’ meritless TRO application should be denied.
BACKGROUND

YouTube, a subsidiary of Google, is a popular online service for sharing videos and related content. Compl. ¶ 2. In general, users are able to post content to their own “channels” (or YouTube accounts) at no charge, with YouTube covering the costs of storage. Decl. of YouTube Representative (“YouTube Decl.”) ¶ 2. Almost anyone, even those without a YouTube channel, can view content uploaded to the service by others. Id. That access is also generally provided for free, with YouTube paying for the bandwidth used to transmit the content. Id.

To create a YouTube channel and upload videos and other content, a user must have a Google account—an identity used across various personalized Google services, including Gmail and Drive. YouTube Decl. ¶¶ 13-14. Users must also accept the YouTube Terms of Service Agreement (the “TOS”) and the various policies that are incorporated by reference in the TOS. Id.

YouTube’s Community Guidelines, which establish detailed rules about the kind of content and behavior that are not allowed on YouTube. Id. ¶ 5, Ex. 2. The TOS also makes clear to users that the Community Guidelines “may be updated from time to time.” Id., Ex. 1.
The TOS has specific provisions addressing “Content on the Service” and “Removal of Content by YouTube.” Id. The former provision states explicitly that “YouTube is under no obligation to host or serve Content.” Id. The latter makes clear that YouTube has considerable discretion to remove unwanted material from its service: “If we reasonably believe that any Content is in breach of this Agreement or may cause harm to YouTube, our users, or third parties, we may remove or take down that Content in our discretion.” Id. While YouTube does not promise to provide notice through any specific channel or on any specific time frame, the removal provision further states that “[w]e will notify you with the reason for our action” unless we reasonably believe that certain conditions apply, such as where doing so “would cause harm to
To enforce its Community Guidelines and content policies, YouTube continually monitors its service using a combination of automated systems and human content reviewers. YouTube Decl. ¶ 6. When it identifies content that violates its policies, YouTube removes the offending content and, depending on the severity of the violation, may issue a warning or a strike against the uploading user’s channel, or may terminate the user’s channel altogether. Id. ¶¶ 8-9, Ex. 5. Channels can be terminated due to repeated violations; for a single case of severe abuse; or where the channel is considered to be “dedicated to” a policy violation. Id. ¶ 9, Ex. 5. Whether in response to a single strike or a channel suspension, the user can appeal the decision to YouTube.
Distinct from these content-removal provisions, the TOS has separate provisions addressing “Account Suspension & Termination.” Id., Ex. 1. Those provisions explain that “YouTube may suspend or terminate your access [to YouTube], your Google account, or your Google account’s access to all or part of the [YouTube] Service,” under certain conditions. Id. (emphases added). Those conditions include where “we believe there has been conduct that creates (or could create) liability or harm to any user, other third party, YouTube or our Affiliates.” Id.
The Community Guidelines, which (as noted) are expressly incorporated into YouTube’s Terms, are public-facing policies that identify various categories of content that YouTube expressly does not permit. YouTube Decl. ¶ 5. Among those categories are: harassment; hate speech; and “harmful or dangerous” content. Id., Ex. 2. Since well before Plaintiffs’ channels were terminated, YouTube’s Harassment policy prohibited “[c]ontent that threatens individuals” or that “targets an individual with prolonged or malicious insults based on intrinsic attributes.” Id., Ex. 3. As examples, the policy specifically banned “[c]ontent that incites others to harass or

Separately, YouTube’s hate speech policy prohibits “content promoting violence or hatred against individuals or groups” based on a variety of attributes. Id., Ex. 4. Among other types of content that violates this policy, YouTube lists “[c]onspiracy theories saying individuals or groups are evil, corrupt, or malicious” based on any of the specified attributes, and content that
Consistent with its notice in the TOS that these Community Guidelines policies “may be updated from time to time” (id., Ex. 1), YouTube has been working for years to remove and limit the reach of conspiracy theory content and harmful misinformation. YouTube Decl. ¶¶ 16-20. For example, in January 2019, it announced a change to “begin reducing recommendations of borderline content and content that could misinform users in harmful ways.” Id., Ex. 6. And in June 2019, YouTube announced it would update its policies to explicitly reject content promoting “conspiracy theories saying individuals or groups are evil, corrupt, or malicious” based on traits
YouTube continued that process on October 15 of this year, announcing in a public blog post entitled “Managing harmful conspiracy theories on YouTube” that it was “taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used

“further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence. One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.” Id. The October 15 blog post explained that YouTube would “begin enforcing this updated policy today, and will ramp up in the weeks to come.” Id. In connection with this announcement, YouTube updated its

“[t]argeting an individual and making claims they are involved in human trafficking in the context of a harmful conspiracy theory where the conspiracy is linked to direct threats or violent
While most of the Plaintiffs have refused to disclose their identities to Defendants’ counsel or even to the Court, they claim to be “journalists, videographers, advocates, [and] commentators” who have collectively created 18 individual channels and published those

channels and “extremely controversial,” TRO at 8, 15, but they avoid any actual description of the videos that Plaintiffs posted to them. Tellingly, Plaintiffs do not deny that their videos

criminal conduct supposedly committed by specific individuals. See YouTube Decl. ¶¶ 23-25. For example, videos posted on the channel “JustInformed Talk” suggested that Hillary Clinton “was involved with satanic rituals with children” (including “human ritual sacrifice”) and made claims that the Democratic party had orchestrated mass infection of COVID-19 in order to encourage voter fraud. Id. ¶ 23. Videos posted on the “TRUReporting” channel made similarly vile claims about multiple well-known Americans, including that one “eats babies,” another “killed his wife,” others are “pedophiles or ‘pedowoods,’” and others “breed children in order to sell them.” Id. ¶

that the shooting was a hoax. Id. Because this type of content targets individuals in connection with the QAnon and Pizzagate conspiracy theories (and others), YouTube has concluded that it is not welcome on its service and amounts to harassment that has the potential to cause harm to third parties.
After it learned of these and other equally odious videos in Plaintiffs’ channels, on October 15, 2020, YouTube suspended the channels and removed all the content that was posted there. YouTube Decl. ¶ 22. YouTube took that action because it determined that each of the channels contained multiple, serious violations of the Community Guidelines. Id. The violations were sufficiently pervasive that YouTube found that each of the Plaintiffs’ channels was dedicated to a violation of YouTube’s Harassment policy. Id. In connection with these removal actions, YouTube sent to each of the Plaintiffs an email notification that informed them of what had happened and the reasons for it: “We’d like to inform you that due to repeated or severe

your YouTube account ... has been suspended.” YouTube Decl. Ex. 9; see also id. ¶ 27. This notice also explained that Plaintiffs’ videos contained “targeted harassment,” and invited each of the Plaintiffs to appeal the suspension, using a form that YouTube provides for such appeals. Id.;

While YouTube suspended Plaintiffs’ channels, thus removing all the content from them and barring new content, YouTube did not otherwise suspend or terminate the Plaintiffs’ ability to access YouTube, including to view videos. YouTube Decl. ¶ 28. Nor did YouTube suspend or
In the days that followed, Plaintiffs organized a crowdfunding page (created on October 19), which linked to a YouTube video in which their counsel details their legal strategy. Declaration of David H. Kramer (“Kramer Decl.”) Exs. A-B. Plaintiffs also began using other online platforms to exhort their followers to find their content elsewhere, including on their own websites. See, e.g., Kramer Decl. Ex. C. But Plaintiffs did not actually file this case until more than a week after their channel suspensions. Their Complaint asserts claims for breach of

Amendment. ECF No. 1. The Complaint was followed by a motion to proceed under pseudonyms and the next day by an ex parte application for a temporary restraining order. ECF Nos. 1, 4, 8. The Court directed Defendants to respond to Plaintiffs by noon on October 30, 2020, and set a hearing
ARGUMENT

NRDC, Inc., 555 U.S. 7, 24 (2008). A plaintiff seeking a temporary restraining order or a preliminary injunction “must establish that he is likely to succeed on the merits, that he is likely to suffer irreparable harm in the absence of preliminary relief, that the balance of equities tips in his favor, and that an injunction is in the public interest.” Id. at 20. To grant preliminary injunctive relief, a court must find that “a certain threshold showing [has been] made on each factor.” Leiva-Perez v. Holder, 640 F.3d 962, 966 (9th Cir. 2011) (per curiam).

A temporary restraining order is “not a preliminary adjudication on the merits but rather a device for preserving the status quo and preventing the irreparable loss of rights before judgment.” Ramos v. Nielsen, 336 F. Supp. 3d 1075, 1084 (N.D. Cal. 2018) (quoting Sierra On-Line, Inc. v. Phoenix Software, Inc., 739 F.2d 1415, 1422 (9th Cir. 1984)); see Tanner Motor Livery, Ltd. v. Avis, Inc., 316 F.2d 804, 808 (9th Cir. 1963) (“It is so well settled as not to require citation of authority that the usual function of a preliminary injunction is to preserve the status quo ante litem pending a determination of the action on the merits.”). Here, however, Plaintiffs are not trying to preserve the status quo; they are asking the Court to alter it by forcing YouTube to reinstate videos that it removed from its service more than a week before Plaintiffs filed this lawsuit. Plaintiffs thus seek a “mandatory injunction”—one that “orders a responsible party to ‘take action.’” Marlyn Nutraceuticals, Inc. v. Mucos Pharma GmbH & Co., 571 F.3d 873, 879 (9th Cir. 2009) (quoting Meghrig v. KFC W., Inc., 516 U.S. 479, 484 (1996)). Such injunctions are “particularly disfavored,” Marlyn Nutraceuticals, 571 F.3d at 879, and “are not issued in doubtful cases,” Garcia v. Google, Inc., 786 F.3d 733, 740 (9th Cir. 2015) (en banc). Accord Dahl v.

“subject to heightened scrutiny and should not be issued unless the facts and law clearly favor the moving party.”).
Because Plaintiffs seek a mandatory injunction, they “must establish that the law and facts clearly favor [their] position, not simply that [they are] likely to succeed.” Garcia, 786 F.3d at 740. Plaintiffs come nowhere close to that showing. Their TRO application rests wholly on the breach of contract claim against YouTube, but Plaintiffs have no chance of prevailing on that claim. Not only do Plaintiffs fail to identify the relevant provisions in the Terms of Service that actually govern the actions that YouTube took, those provisions expressly authorize YouTube to do what it did.

Plaintiffs’ central theory is that YouTube breached the TOS by suspending Plaintiffs’ YouTube channels “without cause.” TRO at 7.¹ They base this claim on the provisions of the TOS that govern “Account Suspension & Termination.” TRO at 12. But this is not the relevant provision of the contract. The provisions that Plaintiffs invoke give YouTube the right to

¹ Plaintiffs also suggest that YouTube breached the TOS by “failing to provide a reason” for the channel suspensions, TRO at 7, but they do not appear to rely on that claim in connection with their request for emergency relief, see id. at 12. Nor could they. For one thing, Plaintiffs did receive notice that their channels were suspended for repeated or severe violations of YouTube’s Community Guidelines. YouTube Decl. ¶ 27, Ex. 9. Nothing more is promised by the Terms of Service. In any event, Plaintiffs do not even try to explain how a purported lack of adequate notice of the reason for the channel suspensions caused them any harm—much less irreparable harm—or how that harm could be remedied by a TRO. Nor do Plaintiffs explain how YouTube “failed to provide the appeals process it promised.” See, e.g., Compl. ¶¶ 72, 80, 88. In fact, Plaintiffs all received notices that included a form to submit an appeal of YouTube’s actions.
or part of the Service.” YouTube Decl. Ex. 1. None of that happened here. YouTube did not suspend or terminate Plaintiffs’ access to YouTube or their Google accounts. YouTube Decl. ¶ 28. Instead, YouTube suspended Plaintiffs’ channels and removed the videos and other features

These actions are expressly authorized by the provisions of the TOS related to “Content on the Service” and “Removal of Content by YouTube.” YouTube Decl. Ex. 1. The former, as Plaintiffs acknowledge (TRO at 4), makes plain that “YouTube is under no obligation to host or serve Content.” YouTube Decl. Ex. 1. By itself, that provision is fatal to any claim that YouTube’s removal of channels or other content somehow violated the agreement. But that provision does not stand alone. The Removal of Content provision expressly allows YouTube to remove Content “in its discretion,” if it “reasonably believe[s] that any Content is in breach of this Agreement or may cause harm to YouTube, our users, or third parties.” YouTube Decl. Ex. 1 (emphasis added). And that is what YouTube concluded. In removing Plaintiffs’ content,

YouTube’s Harassment policy—and amounted to material that could cause real-world harm to
language appears only in the separate provision governing suspensions or termination of Google accounts. Because Plaintiffs’ Google accounts were not terminated and because their YouTube access was not cut off, Plaintiffs’ extended (and convoluted) argument that they did not materially breach the TOS (TRO at 12-14) is misplaced. But even if the Google account termination provisions did somehow apply to this case, Plaintiffs’ argument still would fail. By its terms, that provision authorizes YouTube to suspend or terminate accounts or access not merely in the case of a material breach (as happened here, where Plaintiffs’ channels contained multiple Harassment policy violations), but also where “we believe there has been conduct that creates (or could create) liability or harm to any user, other third party, YouTube or our Affiliates.” YouTube Decl. Ex. 1. As explained above, YouTube determined that Plaintiffs’ channels were engaged in just

specific individuals, putting them in the cross hairs of conspiracy theories that have been linked to real-world violence. YouTube Decl. ¶¶ 22-26. That determination would have justified any action that YouTube might have taken under the Account Suspension & Termination provision.

Because YouTube’s actions were authorized by the governing agreement, Plaintiffs have no likelihood of succeeding on their breach of contract claim. See Storek & Storek, Inc. v. Citicorp Real Estate, Inc., 100 Cal. App. 4th 44, 56-57 (2002) (“[I]f defendants were given the right to do what they did by the express provisions of the contract there can be no breach.”); accord Lewis v. Google LLC, 2020 U.S. Dist. LEXIS 150603, at *44 (N.D. Cal. May 20, 2020) (“YouTube’s terms and guidelines explicitly authorize YouTube to remove or demonetize content that violate its policies, including ‘Hateful content.’ Therefore, Defendants’ removal or demonetization of Plaintiff’s videos with ‘Hateful content’ or hate speech was authorized by the parties’ agreements and cannot support a claim for breach of the implied covenant of good faith and fair dealing.”); Ebeid v. Facebook, Inc., 2019 U.S. Dist. LEXIS 78876, at *21-22 (N.D. Cal. May 9, 2019) (Facebook did not breach implied covenant by removing plaintiff’s posts where “Facebook had the contractual right to remove or disapprove any post or ad at Facebook’s sole discretion”); Mishiyev v. Alphabet, Inc., 444 F. Supp. 3d 1154, 1159 (N.D. Cal. 2020), appeal docketed, No. 20-15657 (9th Cir. Apr. 14, 2020) (YouTube did not breach TOS by removing plaintiffs’ videos where the TOS “authorized YouTube to do exactly that”); Prager Univ. v. Google LLC, 2019 Cal. Super. LEXIS 2034, at *31-32 (Cal. Super. Ct. Nov. 19, 2019) (YouTube did not breach implied covenant by demonetizing and limiting access to plaintiff’s
² Plaintiffs appear to suggest that, because YouTube suspended Plaintiffs’ channels before changing the webpage on which its public-facing Harassment policy appears—to expressly include, as an example, content “targeting an individual and making claims they are involved in human trafficking in the context of a harmful conspiracy theory where the conspiracy is linked to direct threats or violent acts”—YouTube’s Terms of Service were unilaterally amended and (continued...)
DEFENDANTS’ OPPOSITION TO APPLICATION -10- CASE NO. 5:20-cv-07502-BLF
FOR TEMPORARY RESTRAINING ORDER
II. PLAINTIFFS FAIL TO SHOW ANY POSSIBILITY OF IRREPARABLE HARM

Plaintiffs’ application equally falls far short with respect to the required showing of irreparable harm: “the single most important prerequisite for the issuance of a TRO or preliminary injunction.” Morrow v. City of Oakland, 2012 U.S. Dist. LEXIS 81314, at *8 (N.D.

YouTube gave Plaintiffs the ability to broadcast their content for free. It did so under an unambiguous TOS that explained to Plaintiffs that YouTube had “no obligation to host” their content, that YouTube could remove content that violated its Community Guidelines, and that YouTube could change those guidelines over time. Having struck this bargain, Plaintiffs cannot now complain that YouTube’s exercise of those contractual rights is causing them irreparable harm. As a matter of law, in fact, harm that “results from the express terms of [the] contract” cannot be irreparable. Epic Games, Inc. v. Apple Inc., 2020 U.S. Dist. LEXIS 154231, at *8 (N.D. Cal. Aug. 24, 2020) (finding no irreparable harm and denying TRO seeking to compel Apple to restore software app where contract gave Apple the right to remove apps that failed to comply with its policies); accord Salt Lake Trib. Publ’g Co. v. AT&T Corp., 320 F.3d 1081, 1106 (10th Cir. 2003) (publisher’s loss of personnel or editorial voice does not constitute irreparable harm where those consequences flowed from its contractual agreement); Med-Care Diabetic & Med. Supplies, Inc. v. Strategic Health Alliance II, Inc., 2014 U.S. Dist. LEXIS 10881, at *15-16 (S.D. Ohio Jan. 29, 2014) (consequences flowing from defendant’s exercise of its rights under the express terms of parties’ agreement do not constitute irreparable harm). Plaintiffs’ cases (see TRO at 16) are not contrary. They merely recognize the possibility that a party’s breach of
rendered invalid. TRO at 8. That is wrong. The update to the policy, which was publicly announced in the October 15 blog post, occurred before the removal of Plaintiffs’ channels. YouTube Decl. ¶¶ 20, 22. And the TOS expressly provides that YouTube’s Community Guidelines “may be updated from time to time.” YouTube Decl. Ex. 1. YouTube did not amend (much less breach or invalidate) the TOS by updating its Harassment policy to make clearer what it prohibited.
suggest that the consequences of a party’s exercise of an express contractual right could ever be irreparable.

Plaintiffs also reference cases recognizing the potential for irreparable harm where First Amendment rights are curtailed. TRO at 13. But Plaintiffs’ First Amendment rights are not at issue. Their bid for injunctive relief rests not on the First Amendment, but on a contract theory. Even more importantly, YouTube is not the government, and the Ninth Circuit has made clear that YouTube’s editorial decisions do not implicate the First Amendment. Prager, 951 F.3d at 996-99.

Plaintiffs’ failure to establish irreparable harm runs beyond YouTube’s rights under the contract. While Plaintiffs claim that they have lost their ability to communicate with their audience through YouTube, that harm is not in itself irreparable. Plaintiffs have not shown, or even attempted to show, that if they somehow won this case, obtained damages, and perhaps a permanent injunction restoring their videos, they would have suffered irreparably in the interim. In fact, Plaintiffs still have access to (and continue to use) other online platforms, including their own websites, where they have every ability to broadcast their views to anyone who wants to listen. Kramer Decl. Exs. A, C. In other ways as well, Plaintiffs’ own actions refute the notion that they are suffering irreparable harm. After their channels were suspended, Plaintiffs spent ten days detailing their litigation strategy on a crowdfunding website and conducting interviews before finally heading to court. Kramer Decl. Exs. A-B. That suggests that this case (and this application) is more of a political stunt than a situation truly warranting emergency relief. But whatever it is, it is not a case where irreparable harm will result from allowing the status quo to
Plaintiffs face further insurmountable obstacles to the order they seek. First, the balance of hardships decisively favors YouTube: while Plaintiffs have no constitutional right to speak on a private platform like YouTube, a mandatory injunction restoring Plaintiffs’ videos to YouTube

publish, and thereby associate itself with, videos that it has determined, in an exercise of editorial judgment, should not appear on its platform. The First Amendment does not allow that result. In addition, the public interest would be significantly harmed by undoing the status quo to force YouTube to republish Plaintiffs’ reckless and potentially dangerous efforts to target specific
In their application, Plaintiffs make no serious effort to grapple with the hardship that their desired injunction would impose on YouTube. Plaintiffs offhandedly assert that “Defendants will suffer no legitimate harm by continuing to accord Plaintiffs the same platform access that they accord to the rest of their user base.” TRO at 17. That is badly mistaken. If forced to republish Plaintiffs’ videos, YouTube would suffer direct harm in the form of a direct assault on its First Amendment rights.

It is a “basic First Amendment principle that freedom of speech prohibits the government from telling people what they must say.” Agency for Int’l Dev. v. Alliance for Open Society Int’l, 570 U.S. 205, 213 (2013) (citations omitted). Here, however, the TRO that Plaintiffs seek would compel YouTube to publish a set of videos promoting a dangerous conspiracy that YouTube does not wish to associate with and that it has determined may be harmful to its users and third parties. This is a direct violation of the First Amendment’s prohibitions on compelled speech and association. Accord Riley v. Nat’l Fed’n of Blind of N.C., Inc., 487 U.S. 781, 782 (1988) (First Amendment protects “the decision of both what to say and what not to say”) (emphasis added); Roberts v. United States Jaycees, 468 U.S. 609, 623 (1984) (“Freedom of association . . . plainly presupposes a freedom not to associate”); cf. Madsen v. Women’s Health Ctr., Inc., 512 U.S. 753, 764-65 (1994) (explaining that because injunctions “carry greater risks of censorship and discriminatory application than do general ordinances,” they require “a somewhat more stringent application of general First Amendment principles”).
“editorial control and judgment” over its private service. Miami Herald Publ’g Co., Div. of Knight Newspapers, Inc. v. Tornillo, 418 U.S. 241, 258 (1974). Under this principle, private platforms and publishers have a First Amendment right to make their own choices about whether to publish or disseminate third-party speech. See, e.g., Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 636 (1994) (cable operators engage in “editorial discretion” protected by the First Amendment); Hurley v. Irish Am. Gay, Lesbian & Bisexual Grp., 515 U.S. 557, 573-74 (1995) (parade organizers engage in protected speech by selecting which marchers may participate in parade); City of Los Angeles v. Preferred Commc’ns, Inc., 476 U.S. 488, 494 (1986) (“by exercising editorial discretion over
As the Supreme Court has explained, governmental efforts to “compel speakers to utter or distribute speech bearing a particular message are subject to the same rigorous scrutiny” as efforts to prohibit them from doing so. Turner, 512 U.S. at 642 (emphasis added); accord Tornillo, 418 U.S. at 258 (First Amendment protected newspapers from statute requiring publication of public officials’ responses to negative coverage); Assocs. & Aldrich Co. v. Times Mirror Co., 440 F.2d 133, 135 (9th Cir. 1971) (“the acceptance or rejection of articles submitted for publication ... necessarily involves the exercise of editorial judgment”). And this First Amendment right to make editorial judgments about speech fully applies to online service providers. See, e.g., Zhang v. Baidu.com, Inc., 10 F. Supp. 3d 433, 439-41 (S.D.N.Y. 2014) (search engine protected by First Amendment for excluding search results on topics it considered politically sensitive); La’Tiejira v. Facebook, Inc., 272 F. Supp. 3d 981, 991-92 (S.D. Tex. 2017) (recognizing “Facebook’s First Amendment right to decide what to publish and what not to publish on its platform”); Langdon v. Google, Inc., 474 F. Supp. 2d 622, 629-30 (D. Del. 2007) (First Amendment barred attempt to
would compel YouTube to publish videos that it determined violated its content rules and would
YouTube Decl. ¶¶ 22-26. But the judgments that YouTube makes about what content to host—including what videos are so dedicated to dangerous harassment that they are not welcome on the service—are exactly what the First Amendment protects. They are akin to decisions about the “material to go into a newspaper, and the decisions made as to limitations on the size and content of the paper, and treatment of public issues and public officials.” Tornillo, 418 U.S. at 258; accord e-ventures Worldwide, LLC v. Google, Inc., 2017 U.S. Dist. LEXIS 88650, at *11-12 (M.D. Fla. Feb. 8, 2017) (“Google’s actions in . . . determining whether certain websites are contrary to Google’s guidelines and thereby subject to removal are the same as decisions by a newspaper editor regarding which content to publish, which article belongs on the front page, and
Just as “the courts ... should [not] dictate the contents of a newspaper,” Aldrich, 440 F.2d at 135, the First Amendment does not allow Plaintiffs to obtain a court order overriding YouTube’s editorial decisions and directing what material it must publish. See, e.g., Zhang, 10 F. Supp. 3d at 440 (ordering search engine to include information in its search results that it had decided to exclude “would plainly ‘violate[] the fundamental rule of protection under the First Amendment, that a speaker has the autonomy to choose the content of his own message’”); e-ventures, 2017 U.S. Dist. LEXIS 88650, at *11-12 (First Amendment barred claims seeking to hold Google liable for excluding certain websites from its search results); Langdon, 474 F. Supp. 2d at 629-30 (First Amendment prohibits order compelling search engines to “‘honestly’ rank Plaintiff’s websites”); accord Denver Area Educ. Telecomms. Consortium, Inc. v. FCC, 518 U.S. 727, 737-38 (1996) (plurality op.) (because “the editorial function itself is an aspect of ‘speech,’ a court’s decision that a private party, say, the station owner, is a ‘censor,’ could itself interfere with that private ‘censor’s’ freedom to speak as an editor”). Such an order would “eviscerate” Defendants’ “rights to exercise editorial control over speech and speakers on their properties or platforms.” Halleck, 139 S. Ct. at 1932; cf. Washington League for Increased Transparency & Ethics v. Fox Corp., No. 20-2-07428-4 SEA (Wash. Super. Ct. May 27, 2020) (First Amendment bars claims attacking cable programmer’s decision to publish alleged misinformation
The public-interest factors also cut decisively against undoing the status quo by forcing

bear the initial burden of showing that the injunction is in the public interest, Stormans, Inc. v. Selecky, 586 F.3d 1109, 1139 (9th Cir. 2009), they do not offer any explanation (much less evidence) of how the injunction they seek would offer any benefit to non-parties, Bernhardt v. L.A. Cty., 339 F.3d 920, 931-32 (9th Cir. 2003). Instead, Plaintiffs say that the “public interest is served when parties perform as promised under their contracts.” TRO at 17. That may be so, but it does not help Plaintiffs here. As discussed, nothing in the TOS requires YouTube to host Plaintiffs’ videos—to the contrary, the agreement expressly provides that YouTube is “under no obligation to host or serve Content.”
The only case Plaintiffs offer on this point is Blizzard Entm’t, Inc. v. Ceiling Fan Software LLC, 28 F. Supp. 3d 1006, 1018-19 (C.D. Cal. 2013), which is entirely irrelevant. That case did not involve preliminary injunctive relief at all. Instead, the court issued a permanent injunction following a motion for summary judgment, which was based on the plaintiff’s showing that defendants were liable for multiple claims, one of which was breach of contract. Here, in contrast, Plaintiffs seek a TRO based solely on a breach of contract claim, and they face the “doubly demanding” burdens of seeking emergency, mandatory injunctive relief. Moreover, in evaluating the public-interest factor, Blizzard simply noted the benefits of parties performing under their contracts “rather than seeking out surreptitious methods to commit difficult-to-detect breaches of
Plaintiffs also assert that “there is a significant public interest in upholding First Amendment principles.” TRO at 17. That is true, but this principle only cuts further against the TRO they seek. Plaintiffs have no relevant First Amendment rights here. The Ninth Circuit could hardly have been clearer in holding that “the state action doctrine precludes constitutional
Guidelines.” Prager, 951 F.3d at 999; accord Halleck, 139 S. Ct. at 1933 (explaining that a private actor “is not subject to First Amendment constraints on how it exercises editorial discretion over the speech and speakers” on its platform). Instead, as discussed above, the relevant First Amendment rights implicated here are YouTube’s, not Plaintiffs’. That only
That is especially so here, given the nature of the content at issue. As discussed above, in cracking down generally on QAnon videos and on Plaintiffs’ channels specifically, YouTube determined that such “conspiracy theory content” is often “used to justify real-world violence.” YouTube Decl. Ex. 8. That problem is even worse where, as here, that material “targets an identifiable individual as part of a harmful conspiracy theory where the conspiracy theory has been linked to direct threats or violent acts.” YouTube Decl. ¶¶ 22-26, Ex. 3. Tellingly, Plaintiffs never dispute that their videos were seeking to promote such conspiracies or that YouTube appropriately classified their channels under these policies. Indeed, nowhere in their application do Plaintiffs identify the content of their channels or the videos that were removed. That matters
Plaintiffs brush past the possibility that much of the public might not want to see their videos and that those videos may be linked to actual violence. But the House Resolution that Plaintiffs cite (Compl. ¶ 10) indicates a genuine public concern about QAnon and related conspiracy theories. The resolution explains that the FBI “has assessed with high confidence that ‘fringe political conspiracy theories’, including QAnon, ‘very likely motivate some domestic extremists, wholly or in part, to engage in criminal or violent activity’, and that these conspiracy theories ‘very likely encourage the targeting of specific people, places and organizations, thereby increasing the likelihood of violence against these targets.’” Compl. Ex. F (finding that “QAnon adherents have been implicated in crimes that they claim their QAnon beliefs inspired”); see also YouTube Decl. ¶ 26. This Court should not enter an order compelling YouTube to publish content that may contribute to such real-world harm and expose members of the public to exactly the kind of material that they might wish to avoid. Cf. Hill v. Colorado, 530 U.S. 703, 716 (2000)
Beyond all of this, Plaintiffs’ bid for injunctive relief on their contract claim fails as a matter of California contract law. For two separate reasons, the contract at issue simply is not one for which an injunction compelling specific performance is legally permissible: (1) Plaintiffs do not (and cannot) point to any provision in the TOS that they seek to enforce through specific performance; and (2) specific performance is not available under California law for the alleged

First, California law makes clear that specific performance is not available where there is no specific and clear contractual term establishing the supposed right to be enforced. See Cal. Civ. Code § 3390 (prohibiting courts from ordering specific performance where “the precise act” to be done is not “clearly ascertainable” from the terms of the agreement). “Where a party seeks specific performance of a contract, the terms of the contract must be certain and definite in all particulars essential to its enforcement. A court must be able to say what is the stipulated performance.” Colo Corp. v. Smith, 121 Cal. App. 2d 374, 376 (1953); accord Blackburn v. Charnley, 117 Cal. App. 4th 758, 766 (2004) (specific performance may only be ordered where there is “substantial similarity of the requested performance to the contractual terms”); Mora v. U.S. Bank N.A., 2012 U.S. Dist. LEXIS 79357, at *18 (N.D. Cal. June 7, 2012) (“Plaintiffs … do
[Footnote 3] Against this backdrop, the possibility that some members of the public may have an interest in Plaintiffs’ YouTube channels (see TRO at 2, 9) is not remotely enough to tip the public interest in Plaintiffs’ favor, particularly since Plaintiffs’ audience remains able to access Plaintiffs’ content through other platforms. See, e.g., Epic Games, Inc., 2020 U.S. Dist. LEXIS 154231, at *11-12 (denying TRO despite “numerous internet postings and comments submitted in the record that Fortnite players are passionate supporters of the game, and eagerly anticipate its return to the iOS platform”).
whether the requested performance is substantially similar to that required under the contract.”).

Here, Plaintiffs do not identify any provision in the TOS that they would have YouTube specifically perform. See TRO at 18. Nor could they. Not only is there no provision in the Terms that Plaintiffs could invoke to force YouTube to host their videos, the agreement expressly provides that “YouTube is under no obligation to host or serve Content.” YouTube Decl. Ex. 1. Because there is nothing in the agreement that obligates YouTube to do the “precise act” that Plaintiffs seek to compel—the publication and continued hosting of their videos on its service—California law does not permit the Court to enter such an order.
Second, Plaintiffs’ requested relief fails because, under California law, “specific performance cannot be decreed to enforce a contract for personal services.” Woolley v. Embassy Suites, Inc., 227 Cal. App. 3d 1520, 1533 (1991) (rejecting specific performance for contracts that required the “exercise of special skill and judgment” and “involve[d] daily discretionary activities”); accord Thayer Plymouth Ctr., Inc. v. Chrysler Motors Corp., 255 Cal. App. 2d 300, 303 (1967) (“A contract which requires a continuing series of acts and demands cooperation between the parties for the successful performance of those acts is not subject to specific performance.”); Cal. Civ. Code § 3423; Cal. Code Civ. Proc. § 526(b)(5). The TOS is just such a contract: it “obligates the individual parties to perform acts requiring an exercise of personal judgment and a degree of cooperation in making the several enterprises successful which cannot be compelled by a court decree of specific performance.” Rautenberg v. Westland, 227 Cal. App. 2d 566, 572 (1964); accord Barndt v. Cty. of L.A., 211 Cal. App. 3d 397, 404 (1989) (rationale for prohibiting specific performance “is particularly applicable where the services to be rendered require mutual confidence among the parties and involve the exercise of discretionary authority”). The TOS requires extensive, discretionary decision-making, especially with regard to the content-moderation and removal issues raised here. YouTube must (and does) frequently review the content on its service to determine whether it complies with its guidelines, and YouTube continually refines and updates its content policies as real-world conditions, threats, and challenges evolve. See YouTube Decl. ¶¶ 5-6, 16-20. The injunction that Plaintiffs seek would
to assess and supervise YouTube’s content moderation decisions on an ongoing basis. California law does not permit that. See City of Thousand Oaks v. Verizon Media Ventures, 2002 U.S. Dist. LEXIS 8704, at *27 (C.D. Cal. May 15, 2002), rev’d on other grounds, 69 F. App’x 826 (9th Cir. 2003) (rejecting specific performance because “it is impractical to require judicial oversight of a contract which calls for special knowledge, skill, or ability”); Poultry Producers of S. Cal. v. Barlow, 189 Cal. 278, 281 (1922) (courts will not enjoin the breach of a contract “requiring the
CONCLUSION

Plaintiffs have no chance of success on the merits of their claim, and equitable considerations cut overwhelmingly against the extraordinary mandatory injunction they seek. For
By: /s/ Lauren Gallo White
Lauren Gallo White
lwhite@wsgr.com
YouTube’s content moderation policies, known as the Community Guidelines. For my protection, I have omitted from this declaration any personally identifiable information given that the subject matter of this litigation involves harmful conspiracy theories known to be connected to instances of real-world violence. I have personal knowledge of the facts described below and, if called as a
2. YouTube is an online service that enables millions of users around the world to upload, view, and share videos and related content. In general, users are able to post content to their own “channels” at no charge, with YouTube covering the costs of storage. Almost anyone, even those without a “channel” (sometimes called a “YouTube account”), can view content uploaded to the service by others. That access is free of charge, with YouTube paying for the
3. YouTube strives to create a space where people all over the world can create and share unique, creative, and original content. But while YouTube believes in giving its users a voice, it has (and, to my knowledge, always has had) policies that govern how people may use the service, including restrictions on the types of content they may post. These guidelines are designed and regularly updated to make YouTube a safer and more enjoyable place for users and creators.
people can use the service. In addition, the Terms of Service expressly incorporate the Community Guidelines, which provide additional detail about the kinds of content and activity that YouTube prohibits. As the Terms of Service make clear, YouTube revises the Community Guidelines from time to time to account for new and different content or behavior that YouTube deems unacceptable or unsafe. The current Community Guidelines are attached as Exhibit 2. They detail,
COVID-19, hate speech, and more. Exhibits 3 and 4, drawn from the Community Guidelines, set
6. In an effort to ensure that content uploaded to YouTube complies with its policies, YouTube uses a combination of automated systems and manual (human) reviewers who review content 24 hours a day, 7 days a week, around the world. YouTube also invites users to “flag” for YouTube’s review content that they believe might violate the company’s policies.
7. YouTube’s Terms of Service state that it may remove content if it believes the content breaches the Terms of Service, including the Community Guidelines, or if it believes the

the severity of the violation, patterns of violative behavior, or channels being focused on (or
YouTube’s enforcement practices. It makes clear that a user’s channel can be terminated for a “single case of severe abuse” or when YouTube determines that the channel is “dedicated to a policy violation.” So, for example, if a user’s channel contains multiple videos that violate the same or related YouTube policies, YouTube may immediately remove the user’s entire channel.

10. It is our standard practice to notify a user when their content or their channel is removed from the YouTube service, and to offer the user the ability to appeal YouTube’s decision.
11. Further, even where a user’s channel is removed, the user typically retains the

12. As I mentioned, virtually anyone can access YouTube to watch videos free of
13. If a user wishes to upload content to YouTube, they must register with YouTube

user to take advantage of interactive aspects of the YouTube service, including commenting and playlists.

14. In order to create a YouTube channel or account, a user must already have a Google account. A Google account is a broader identity across Google’s suite of products that allows users personalized access to Google services like Gmail, Drive, and Google Pay.
policies, the YouTube Terms of Service make clear that YouTube may terminate a user’s access to YouTube altogether and prohibit even the viewing of videos. It may also suspend or terminate the user’s Google account. These enforcement escalations go beyond the mere removal or suspension
17. These kinds of videos can misinform users in harmful ways. In particular, conspiracy theories like QAnon have been linked to multiple instances of real-world violence, including attempted terrorist attacks, kidnapping, murder, and numerous threats of violence. Given the potential for harm associated with this type of content, we’re continually working to update our policies and modify our enforcement efforts to address this content effectively to prevent harm

18. YouTube announced, almost two years ago, that we would begin limiting the reach of videos that could misinform users in harmful ways by limiting recommendations for them. We specifically called out content “such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
19. In June 2019, we announced updates to our hate speech policy under which we would prohibit content promoting conspiracy theories about protected groups (like a religious group, or a group based on sexual orientation). This announcement is attached as Exhibit 7. This

Americans and others through coordinated control of mass media and the banking system. This was reflected in our Community Guidelines help center, where we note that we prohibit content promoting “[c]onspiracy theories saying individuals or groups are evil, corrupt, or malicious based on any of the attributes noted above,” in which “attributes noted above” refers to traits such as race, religion, and sexual orientation. Our current hate speech policy, Exhibit 4, reflects this language.
20. In October 2020, we took another step in our efforts to curb harmful content. We announced that we were expanding our hate and harassment policies and that we would begin removing more conspiracy theory content. Specifically, we stated that we prohibit “[c]ontent that targets an identifiable individual as part of a harmful conspiracy theory where the conspiracy theory has been linked to direct threats or violent acts.” And we provided as an example of this: “Targeting an individual and making claims they are involved in human trafficking in the context of a harmful conspiracy theory where the conspiracy is linked to direct threats or violent acts.” YouTube made clear in a public announcement, attached as Exhibit 8, that we would begin
21. I have a general understanding of Plaintiffs’ claims in this case, and I have reviewed our records regarding YouTube’s removal of Plaintiffs’ channels from the YouTube service.

22. On each of the channels Plaintiffs identified in their application for a temporary restraining order, my team found multiple videos that they concluded violated YouTube’s harassment policy, Exhibit 3, which prohibits, among other things: (i) content that “targets an identifiable individual as part of a harmful conspiracy theory where the conspiracy theory has been linked to direct threats or violent acts”; (ii) content that “incites others to harass or threaten individuals on or off YouTube”; and (iii) content that “targets an individual with prolonged or malicious insults based on intrinsic attributes.” These violations were sufficiently pervasive that my team determined that each of Plaintiffs’ channels was dedicated to violating YouTube’s
Trust and Safety Team terminated (i.e., removed) Plaintiffs’ channels from the YouTube service
23. While I have not yet been able to review all of the videos my team considered in concluding that these channels violated YouTube’s Community Guidelines, I agree with their assessment on each of the videos I have reviewed. The threat of harm to others and the severity of the violations in the videos are obvious. For example, a video posted on the channel JustInformed Talk suggested that Hillary Clinton “was involved with satanic rituals with children,” which included a “set of cult chamber rooms” and a “temple that was built for allegedly abusing children and going all the way up to human ritual sacrifice and worshipping the cult of satanism.” Another video on that same channel claimed that the Democratic party had orchestrated a mass infection of COVID-19 in order to encourage voter fraud. Additionally, a video from the channel dnajlion7 claimed the late Senator John McCain “aided and abetted a child sex trafficking ring” in Arizona that victimized hundreds of thousands of children. And on SGT Report, a video described former White House Chief of Staff John Podesta as a “world class underage sex slave op cover upper.”
24. A video posted on the TRUReporting channel made claims about multiple well-known Americans, alleging that one “eats babies,” that another is “a rapist,” that another “killed his wife,” and that at least eight others are “pedophiles or ‘pedowoods.’” A separate video on the same channel claimed that certain Americans “breed children in order to sell them” and that “when they are sold they come without birth certificates which means it’s easier to kill them.” Another alleged that President George H.W. Bush was secretly executed for child sex trafficking. And yet another, removed even before October 2020, targeted survivors of the Stoneman Douglas High School shooting in Parkland, Florida, and denied the shooting ever happened.
25. Other videos on Plaintiffs’ channels were clear instances of targeting identifiable

celebrities) as being part of a conspiracy theory that has been linked to direct threats or violent acts. At least one of these instances included targeting environmental activist Greta Thunberg,
26. Given the connection between the QAnon conspiracy theory and real-world violence, I agree that these videos may incite others to “take action” and may cause harm to our users or other people. My view is supported by external security and terrorism experts as well. A May 2019 FBI bulletin specifically cited QAnon as among the conspiracy theories that “very likely will emerge, spread, and evolve in the modern information marketplace, occasionally driving both groups and individual extremists to carry out criminal or violent acts.” Jana Winter, Exclusive: FBI Document Warns Conspiracy Theories Are a New Domestic Terrorism Threat,

Center stated: “A survey of cases of individuals who have allegedly or apparently been radicalized to criminal acts with a nexus to violence by QAnon, including one case that saw a guilty plea on a terrorism charge, makes clear that QAnon represents a public security threat with the potential in the future to become a more impactful domestic terror threat.” Amarnath Amarasingam & Marc-André Argentino, The QAnon Conspiracy Theory: A Security Threat in the Making?, 13

multiple instances of threatened violence sparked by Pizzagate conspiracy theories. Cecilia Kang & Adam Goldman, In Washington Pizzeria Attack, Fake News Brought Real Guns, N.Y. TIMES,

shooting-fake-news-consequences.html.
27. YouTube sent all Plaintiffs notice of the termination and the reason for the termination of their channels. Attached as Exhibit 9 is a screenshot depicting the form of the

28. While YouTube removed Plaintiffs’ channels from the service, our enforcement action was limited in important ways. YouTube did not suspend or terminate the Plaintiffs’ access to the YouTube service. Plaintiffs can still freely watch videos that others have posted. Further, YouTube did not suspend or terminate Plaintiffs’ Google accounts, all of which remain active.
I declare under penalty of perjury under the laws of the United States of America that the foregoing is true and correct. Executed October 30, 2020, at San Francisco, California.

/s/ YouTube Representative
YouTube Representative
I, Lauren Gallo White, am the ECF User whose ID and password are being used to file this document. In compliance with N.D. Cal. Civil L.R. 5-1(i)(3), I hereby attest that the concurrence in the filing of this document has been obtained from the signatory.
Welcome to YouTube!
This section outlines our relationship with you. It includes a description of the Service, defines our Agreement, and
names your service provider. Key updates:
Age Requirements. We have stated the specific age requirements for your country, reflecting our Google-wide
policies, and included a notice that, if you are a minor in your country, you must always have your parent or
guardian’s permission before using the Service.
Parental Permission. We’ve added a section to explain your responsibility if you allow your child to use YouTube.
Businesses. Our Terms now make clear that, if you are using the Service on behalf of a company or organisation,
that business accepts this Agreement.
Google Accounts and YouTube Channels. We’ve provided details about which features of the Service can be
accessed without a Google account or YouTube channel, and which features require one.
Your Information. We haven’t made any changes to the way we treat your information. You can read about our
privacy practices by reviewing the Privacy Policy and YouTube Kids Privacy Notice. As a reminder, you can always
review your privacy settings and manage your data and personalisation by visiting your Google Account.
Restrictions. We have updated this section to reflect our requirements around contests, and to include a prohibition
on manipulating metrics.
Service Changes. We have improved our Terms to be more transparent about why we might need to make changes
to the Service, and provided a commitment to give you notice when those changes might affect you.
License. We’ve clarified the content license you grant us to make it easier to understand. We’re not asking for
additional permissions and there’s no difference in how we’re using your content.
Duration. We have removed the right for YouTube to use your comments in perpetuity.
Removals. We have included a link to the tools you will need to remove your content, as well as a clear description
about why we might need to take down content, and how to appeal removals.
Analyzing Content. We may automatically analyze content on YouTube, to help detect abuse and keep the platform
safe.
Terminations. Our Terms now include more details about when we might need to terminate our Agreement with bad
actors. We provide a greater commitment to give notice when we take such action and what you can do to appeal if
you think we’ve got it wrong. We’ve also added instructions for you, if you decide you no longer want to use the
Service.
Our liability. We’ve made changes to the disclaimers and limitations of liability in the Terms.
Modifications. We want to give you the chance to review future material updates to these Terms.
-----
Terms of Service
Dated: December 10, 2019
Welcome to YouTube!
Introduction
Thank you for using the YouTube platform and the products, services and features we make available to you as part of the
platform (collectively, the “Service”).
Our Service
The Service allows you to discover, watch and share videos and other content, provides a forum for people to connect, inform,
and inspire others across the globe, and acts as a distribution platform for original content creators and advertisers large and
small. We provide lots of information about our products and how to use them in our Help Center. Among other things, you
can find out about YouTube Kids, the YouTube Partner Program and YouTube Paid Memberships and Purchases (where available). You can also read all about enjoying content on other devices like your television, your games console, or Google
Home.
The entity providing the Service is Google LLC, a company operating under the laws of Delaware, located at 1600
Amphitheatre Parkway, Mountain View, CA 94043 (referred to as “YouTube”, “we”, “us”, or “our”). References to YouTube’s
“Affiliates” in these terms means the other companies within the Alphabet Inc. corporate group (now or in the future).
Applicable Terms
Your use of the Service is subject to these terms, the YouTube Community Guidelines and the Policy, Safety and Copyright
Policies which may be updated from time to time (together, this "Agreement"). Your Agreement with us will also include the
Advertising on YouTube Policies if you provide advertising or sponsorships to the Service or incorporate paid promotions in
your content. Any other links or references provided in these terms are for informational use only and are not part of the
Agreement.
Please read this Agreement carefully and make sure you understand it. If you do not understand the Agreement, or do not
accept any part of it, then you may not use the Service.
If you are under 18, you represent that you have your parent or guardian’s permission to use the Service. Please have them
read this Agreement with you.
If you are a parent or legal guardian of a user under the age of 18, by allowing your child to use the Service, you are subject to
the terms of this Agreement and responsible for your child’s activity on the Service. You can find tools and resources to help
you manage your family’s experience on YouTube in our Help Center and through Google’s Family Link.
Businesses
If you are using the Service on behalf of a company or organisation, you represent that you have authority to act on behalf of
that entity, and that such entity accepts this Agreement.
https://www.youtube.com/static?template=terms 2/6
10/29/2020 Terms of Service - YouTube
Content is the responsibility of the person or entity that provides it to the Service. YouTube is under no obligation to host or
serve Content. If you see any Content you believe does not comply with this Agreement, including by violating the Community
Guidelines or the law, you can report it to us.
Creating a YouTube channel will give you access to additional features and functions, such as uploading videos, making
comments or creating playlists (where available). Here are some details about how to create your own YouTube channel.
To protect your Google account, keep your password confidential. You should not reuse your Google account password on
third-party applications. Learn more about keeping your Google account secure, including what to do if you learn of any
unauthorised use of your password or Google account.
Your Information
Our Privacy Policy explains how we treat your personal data and protect your privacy when you use the Service. The YouTube
Kids Privacy Notice provides additional information about our privacy practices that are specific to YouTube Kids.
We will process any audio or audiovisual content uploaded by you to the Service in accordance with the YouTube Data
Processing Terms, except in cases where you uploaded such content for personal purposes or household activities. Learn
More.
The following restrictions apply to your use of the Service. You are not allowed to:
1. access, reproduce, download, distribute, transmit, broadcast, display, sell, license, alter, modify or otherwise use any part
of the Service or any Content except: (a) as expressly authorized by the Service; or (b) with prior written permission from
YouTube and, if applicable, the respective rights holders;
2. circumvent, disable, fraudulently engage with, or otherwise interfere with any part of the Service (or attempt to do any of
these things), including security-related features or features that (a) prevent or restrict the copying or other use of
Content or (b) limit the use of the Service or Content;
3. access the Service using any automated means (such as robots, botnets or scrapers) except (a) in the case of public
search engines, in accordance with YouTube’s robots.txt file; or (b) with YouTube’s prior written permission;
4. collect or harvest any information that might identify a person (for example, usernames), unless permitted by that person
or allowed under section (3) above;
5. use the Service to distribute unsolicited promotional or commercial content or other unwanted or mass solicitations;
6. cause or encourage any inaccurate measurements of genuine user engagement with the Service, including by paying
people or providing them with incentives to increase a video’s views, likes, or dislikes, or to increase a channel’s
subscribers, or otherwise manipulate metrics in any manner;
7. misuse any reporting, flagging, complaint, dispute, or appeals process, including by making groundless, vexatious, or
frivolous submissions;
8. run contests on or through the Service that do not comply with YouTube’s contest policies and guidelines;
9. use the Service to view or listen to Content other than for personal, non-commercial use (for example, you may not
publicly screen videos or stream music from the Service); or
10. use the Service to (a) sell any advertising, sponsorships, or promotions placed on, around, or within the Service or
Content, other than those allowed in the Advertising on YouTube policies (such as compliant product placements); or (b)
sell advertising, sponsorships, or promotions on any page of any website or application that only contains Content from
the Service or where Content from the Service is the primary basis for such sales (for example, selling ads on a webpage
where YouTube videos are the main draw for users visiting the webpage).
Reservation
Using the Service does not give you ownership of or rights to any aspect of the Service, including user names or any other
Content posted by others or YouTube.
YouTube is constantly changing and improving the Service. We may also need to alter or discontinue the Service, or any part
of it, in order to make performance or security improvements, change functionality and features, make changes to comply
with law, or prevent illegal activities on or abuse of our systems. These changes may affect all users, some users or even an
individual user. Whenever reasonably possible, we will provide notice when we discontinue or make material changes to our
Service that will have an adverse impact on the use of our Service. However, you understand and agree that there will be
times when we make such changes without notice, such as where we feel we need to take action to improve the security and
operability of our Service, prevent abuse, or comply with legal requirements.
If you have a YouTube channel, you may be able to upload Content to the Service. You may use your Content to promote your
business or artistic enterprise. If you choose to upload Content, you must not submit to the Service any Content that does not
comply with this Agreement (including the YouTube Community Guidelines) or the law. For example, the Content you submit
must not include third-party intellectual property (such as copyrighted material) unless you have permission from that party
or are otherwise legally entitled to do so. You are legally responsible for the Content you submit to the Service. We may use
automated systems that analyze your Content to help detect infringement and abuse, such as spam, malware, and illegal
content.
You retain ownership rights in your Content. However, we do require you to grant certain rights to YouTube and other users of
the Service, as described below.
License to YouTube
By providing Content to the Service, you grant to YouTube a worldwide, non-exclusive, royalty-free, sublicensable and
transferable license to use that Content (including to reproduce, distribute, prepare derivative works, display and perform it) in
connection with the Service and YouTube’s (and its successors' and Affiliates') business, including for the purpose of
promoting and redistributing part or all of the Service.
You also grant each other user of the Service a worldwide, non-exclusive, royalty-free license to access your Content through
the Service, and to use that Content, including to reproduce, distribute, prepare derivative works, display, and perform it, only
as enabled by a feature of the Service (such as video playback or embeds). For clarity, this license does not grant any rights
or permissions for a user to make use of your Content independent of the Service.
Duration of License
The licenses granted by you continue for a commercially reasonable period of time after you remove or delete your Content
from the Service. You understand and agree, however, that YouTube may retain, but not display, distribute, or perform, server
copies of your videos that have been removed or deleted.
You may remove your Content from the Service at any time. You also have the option to make a copy of your Content before
removing it. You must remove your Content if you no longer have the rights required by these terms.
If we reasonably believe that any Content is in breach of this Agreement or may cause harm to YouTube, our users, or third
parties, we may remove or take down that Content in our discretion. We will notify you with the reason for our action unless
we reasonably believe that to do so: (a) would breach the law or the direction of a legal enforcement authority or would
otherwise risk legal liability for YouTube or our Affiliates; (b) would compromise an investigation or the integrity or operation
of the Service; or (c) would cause harm to any user, other third party, YouTube or our Affiliates. You can learn more about
reporting and enforcement, including how to appeal on the Troubleshooting page of our Help Center.
Copyright Protection
We provide information to help copyright holders manage their intellectual property online in our YouTube Copyright Center. If
you believe your copyright has been infringed on the Service, please send us a notice.
We respond to notices of alleged copyright infringement according to the process in our YouTube Copyright Center, where
you can also find information about how to resolve a copyright strike. YouTube's policies provide for the termination, in
appropriate circumstances, of repeat infringers’ access to the Service.
YouTube may suspend or terminate your access, your Google account, or your Google account’s access to all or part of the
Service if (a) you materially or repeatedly breach this Agreement; (b) we are required to do so to comply with a legal
requirement or a court order; or (c) we believe there has been conduct that creates (or could create) liability or harm to any
user, other third party, YouTube or our Affiliates.
YouTube may terminate your access, or your Google account’s access to all or part of the Service if YouTube believes, in its
sole discretion, that provision of the Service to you is no longer commercially viable.
We will notify you with the reason for termination or suspension by YouTube unless we reasonably believe that to do so: (a)
would violate the law or the direction of a legal enforcement authority, or would otherwise risk legal liability for YouTube or
our Affiliates; (b) would compromise an investigation or the integrity or operation of the Service; or (c) would cause harm to
any user, other third party, YouTube or our A liates. Where YouTube is terminating your access for Service changes, where
reasonably possible, you will be provided with sufficient time to export your Content from the Service.
If your Google account is terminated or your Google account’s access to the Service is restricted, you may continue using
certain aspects of the Service (such as viewing only) without an account, and this Agreement will continue to apply to such
use. If you believe your Google account has been terminated in error, you can appeal using this form.
Open Source
Some software used in our Service may be offered under an open source license that we make available to you. There may be
provisions in an open source license that expressly override some of these terms, so please be sure to read those licenses.
Limitation of Liability
EXCEPT AS REQUIRED BY APPLICABLE LAW, YOUTUBE, ITS AFFILIATES, OFFICERS, DIRECTORS, EMPLOYEES AND AGENTS
WILL NOT BE RESPONSIBLE FOR ANY LOSS OF PROFITS, REVENUES, BUSINESS OPPORTUNITIES, GOODWILL, OR
ANTICIPATED SAVINGS; LOSS OR CORRUPTION OF DATA; INDIRECT OR CONSEQUENTIAL LOSS; PUNITIVE DAMAGES
CAUSED BY:
THIS PROVISION APPLIES TO ANY CLAIM, REGARDLESS OF WHETHER THE CLAIM ASSERTED IS BASED ON WARRANTY,
CONTRACT, TORT, OR ANY OTHER LEGAL THEORY.
YOUTUBE AND ITS AFFILIATES’ TOTAL LIABILITY FOR ANY CLAIMS ARISING FROM OR RELATING TO THE SERVICE IS
LIMITED TO THE GREATER OF: (A) THE AMOUNT OF REVENUE THAT YOUTUBE HAS PAID TO YOU FROM YOUR USE OF THE
SERVICE IN THE 12 MONTHS BEFORE THE DATE OF YOUR NOTICE, IN WRITING TO YOUTUBE, OF THE CLAIM; AND (B) USD
$500.
Indemnity
To the extent permitted by applicable law, you agree to defend, indemnify and hold harmless YouTube, its Affiliates, officers,
directors, employees and agents, from and against any and all claims, damages, obligations, losses, liabilities, costs or debt,
and expenses (including but not limited to attorney's fees) arising from: (i) your use of and access to the Service; (ii) your
violation of any term of this Agreement; (iii) your violation of any third party right, including without limitation any copyright,
property, or privacy right; or (iv) any claim that your Content caused damage to a third party. This defense and indemnification
obligation will survive this Agreement and your use of the Service.
Third-Party Links
The Service may contain links to third-party websites and online services that are not owned or controlled by YouTube.
YouTube has no control over, and assumes no responsibility for, such websites and online services. Be aware when you leave
the Service; we suggest you read the terms and privacy policy of each third-party website and online service that you visit.
Severance
If it turns out that a particular term of this Agreement is not enforceable for any reason, this will not affect any other terms.
No Waiver
If you fail to comply with this Agreement and we do not take immediate action, this does not mean that we are giving up any
rights that we may have (such as the right to take action in the future).
Interpretation
In these terms, “include” or “including” means “including but not limited to,” and any examples we give are for illustrative
purposes.
Governing Law
All claims arising out of or relating to these terms or the Service will be governed by California law, except California’s conflict
of laws rules, and will be litigated exclusively in the federal or state courts of Santa Clara County, California, USA. You and
YouTube consent to personal jurisdiction in those courts.
YOU AND YOUTUBE AGREE THAT ANY CAUSE OF ACTION ARISING OUT OF OR RELATED TO THE SERVICES MUST
COMMENCE WITHIN ONE (1) YEAR AFTER THE CAUSE OF ACTION ACCRUES. OTHERWISE, SUCH CAUSE OF ACTION IS
PERMANENTLY BARRED.
EXHIBIT 2
10/29/2020 YouTube's Community Guidelines - YouTube Help
If you see content that you think violates these guidelines, use the flagging feature to submit it for review by our YouTube
staff.
Sensitive content
We hope to protect viewers, creators, and especially minors. That's why we've got
rules around keeping children safe, sex & nudity, and self harm. Learn what's
allowed on YouTube and what to do if you see content that doesn't follow these
policies.
Regulated goods
https://support.google.com/youtube/answer/9288567?hl=en 1/2
Certain goods can't be sold on YouTube. Find out what's allowed—and what isn't.
• Regulated goods
• Content featuring firearms
Please take these rules seriously. If a YouTube creator’s on- and/or off-platform behavior harms our users, community,
employees or ecosystem, we may respond based on a number of factors including, but not limited to, the egregiousness
of their actions and whether a pattern of harmful behavior exists. Our response will range from suspending a creator’s
privileges to account termination.
EXHIBIT 3
10/29/2020 Harassment and cyberbullying policy - YouTube Help
The safety of our creators, viewers, and partners is our highest priority – and we look to each of you to help us
protect this unique and vibrant community. It’s important you understand our Community Guidelines, and the
role they play in our shared responsibility to keep YouTube safe. Please take the time to carefully read the policy
below. You can also check out this page for a full list of our guidelines.
We recently announced some updates on our harassment policy to better protect creators and users. The policy below
has been updated to reflect these changes.
Content that threatens individuals is not allowed on YouTube. We also do not allow content that targets an individual with
prolonged or malicious insults based on intrinsic attributes, including their protected group status or physical traits.
If you find content that violates this policy, please report it. Instructions for reporting violations of our Community
Guidelines are available here. If you've found multiple videos or comments that you would like to report, you can report
the channel. For tips and best practices to stay safe, keep your account secure, and protect your privacy, check out this
Help Center article.
If speci c threats are made against you and you feel unsafe, report it directly to your local law enforcement agency.
• Content that features prolonged name calling or malicious insults (such as racial slurs) based on an individual’s intrinsic
attributes. These attributes include their protected group status, physical attributes, or their status as a survivor of
sexual assault, domestic abuse, child abuse, etc.
• Content uploaded with the intent to shame, deceive or insult a minor. A minor is defined as a person under the legal
age of majority. This usually means anyone younger than 18 years old, but the age of a minor might vary by country.
• Revealing someone’s private information, such as their home address, email addresses, sign-in credentials,
phone numbers, passport number, or bank account information.
• Note: This does not include posting widely available public information, such as a public official’s office
phone number or the phone number of a business.
• Content that incites others to harass or threaten individuals on or off YouTube.
• Content that encourages abusive fan behavior such as doxxing, dogpiling, brigading or off-platform targeting.
https://support.google.com/youtube/answer/2802268?hl=en 1/3
• Content that targets an identifiable individual as part of a harmful conspiracy theory where the conspiracy theory has
been linked to direct threats or violent acts.
• Content making implicit or explicit threats of physical harm or destruction of property against identifiable
individuals.
• Note: “Implicit threats” include threats that don’t express a specific time, place or means, but may feature
weapon brandishing, simulated violence, etc.
• Content posted by vigilantes restraining or assaulting an identifiable individual.
• Content that depicts creators simulating acts of serious violence against others (executions, torture,
maimings, beatings, etc.).
• Content featuring non-consensual sex acts, unwanted sexualization or anything that graphically sexualizes or
degrades an individual.
• Content that displays or shows how to distribute non-consensual sexual imagery.
This policy applies to videos, video descriptions, comments, live streams, and any other YouTube product or
feature. Please note this is not a complete list.
Exceptions
We may allow content that includes harassment if the primary purpose is educational, documentary, scientific, or artistic
in nature. This is not a free pass to harass someone. Some examples include:
• Debates related to high-profile officials or leaders: Content featuring debates or discussions of topical issues
concerning people who have positions of power, like high-profile government officials or CEOs of major multinational
corporations.
• Scripted performances: Insults made in the context of an artistic medium such as scripted satire, stand up comedy, or
music (e.g. a diss track). Note: This is not a free pass to harass someone and claim “I was joking.”
• Harassment education or awareness: Content that features actual or simulated harassment for documentary
purposes or with willing participants (e.g. actors) to combat cyberbullying or raise awareness.
Note: We take a harder line on content that maliciously insults someone based on their protected group status,
regardless of whether or not they are a high-profile person.
Examples
Here are some examples of content that’s not allowed on YouTube:
• Repeatedly showing pictures of someone and then making statements like “Look at this creature’s teeth, they’re so
disgusting!”, with similar commentary targeting intrinsic attributes throughout the video.
• Targeting an individual based on their membership in a protected group, such as by saying: “Look at this filthy [slur
targeting a protected group], I wish they’d just get hit by a truck.”
• Targeting an individual and making claims they are involved in human trafficking in the context of a harmful conspiracy
theory where the conspiracy is linked to direct threats or violent acts.
• Using an extreme insult to dehumanize an individual based on their intrinsic attributes. For example: “Look at this dog
of a woman! She’s not even a human being — she must be some sort of mutant or animal!”
• Depicting an identifiable individual being murdered, seriously injured, or engaged in a graphic sexual act without their
consent.
• Accounts dedicated entirely to focusing on maliciously insulting an identifiable individual.
More Examples
• Targeting an individual based on their intrinsic attributes to wish for their death or serious injury, for example
“I wish someone would just bring a hammer down on that [Member of a Protected Group’s] face.”
• Threatening someone’s physical safety. This includes implied threats like “when I see you next, things will
end badly for you,” explicit threats like “when I see you on Saturday I’m going to punch you in the face,” or
implying violence by saying things such as “You better watch out” while brandishing a weapon.
• Posting an individual’s nonpublic personal identifying information like a phone number, home address, or
email to direct abusive attention or traffic toward them. For example: “I got a hold of their phone number,
keep on calling and leaving messages until they pick up!”
• “Raiding” or directing malicious abuse to identifiable individuals through in-game voice chat or messages
during a stream.
• Directing users toward a YouTuber’s comment section for malicious abuse. For example: “everyone needs to
go over to this person’s channel right now and just go crazy, let them know how much we want them to die.”
• “Swatting” or other prank calls to emergency or crisis response services, or encouraging viewers to engage in
this or any other harassing behavior.
• Stalking or attempting to blackmail users.
• Zooming in on, or giving prolonged, focused emphasis to, the breasts, buttocks, or genital area of an identifiable
individual for the purpose of degrading, objectifying, or sexualizing them.
• Video game content which has been developed or modified (“modded”) to promote violence or hatred
against an individual with the attributes noted above.
Please remember these are just some examples, and don't post content if you think it might violate this policy.
We may also terminate your channel or account for repeated violations of the Community Guidelines or Terms of Service,
as well as due to a single case of severe abuse, or when the channel is dedicated to a policy violation. You can learn more
about channel or account terminations here.
EXHIBIT 4
10/29/2020 Hate speech policy - YouTube Help
The safety of our creators, viewers, and partners is our highest priority – and we look to each of you to help us
protect this unique and vibrant community. It’s important you understand our Community Guidelines, and the
role they play in our shared responsibility to keep YouTube safe. Please take the time to carefully read the policy
below. You can also check out this page for a full list of our guidelines.
While we’ve always had policies that prohibited hate speech on YouTube, on June 5, we announced some changes to
our hate speech policies. You can learn more about those changes here. The below policy has been updated with
those changes.
Hate speech is not allowed on YouTube. We remove content promoting violence or hatred against individuals or groups
based on any of the following attributes:
• Age
• Caste
• Disability
• Ethnicity
• Gender Identity and Expression
• Nationality
• Race
• Immigration Status
• Religion
• Sex/Gender
• Sexual Orientation
• Victims of a major violent event and their kin
• Veteran Status
If you find content that violates this policy, please report it. Instructions for reporting violations of our Community
Guidelines are available here. If you've found multiple videos or comments that you would like to report, you can report
the channel.
https://support.google.com/youtube/answer/2801939?hl=en 1/3
• Encourage violence against individuals or groups based on any of the attributes noted above. We don’t allow threats on
YouTube, and we treat implied calls for violence as real threats. You can learn more about our policies on threats and
harassment.
• Incite hatred against individuals or groups based on any of the attributes noted above.
• Dehumanizing individuals or groups by calling them subhuman, comparing them to animals, insects, pests,
disease, or any other non-human entity.
• Praise or glorify violence against individuals or groups based on the attributes noted above.
• Use of racial, religious or other slurs and stereotypes that incite or promote hatred based on any of the
attributes noted above. This can take the form of speech, text, or imagery promoting these stereotypes or
treating them as factual.
• Claim that individuals or groups are physically or mentally inferior, deficient, or diseased based on any of the
attributes noted above. This includes statements that one group is less than another, calling them less
intelligent, less capable, or damaged.
• Allege the superiority of a group over those with any of the attributes noted above to justify violence,
discrimination, segregation, or exclusion.
• Conspiracy theories saying individuals or groups are evil, corrupt, or malicious based on any of the attributes
noted above.
• Call for the subjugation or domination over individuals or groups based on any of the attributes noted above.
• Deny that a well-documented, violent event took place.
• Attacks on a person’s emotional, romantic and/or sexual attraction to another person.
• Content containing hateful supremacist propaganda including the recruitment of new members or requests
for financial support for their ideology.
• Music videos promoting hateful supremacism in the lyrics, metadata, or imagery.
Educational content
We may allow content that includes hate speech if the primary purpose is educational, documentary, scientific,
or artistic in nature. This is not a free pass to promote hate speech. Examples include:
• A documentary about a hate group: Educational content that isn’t supporting the group or promoting ideas
would be allowed. A documentary promoting violence or hatred wouldn’t be allowed.
• A documentary about the scientific study of humans: A documentary about how theories have changed over
time, even if it includes theories about the inferiority or superiority of specific groups, would be allowed
because it’s educational. We won’t allow a documentary claiming there's scientific evidence today that an
individual or group is inferior or subhuman.
• Historical footage of an event, like WWII, which doesn't promote violence or hatred.
This policy applies to videos, video descriptions, comments, live streams, and any other YouTube product or
feature. For educational content that includes hate speech, this context must appear in the images or audio of
the video itself. Providing it in the title or description is insufficient.
Examples
Here are examples of hate speech not allowed on YouTube.
• “I’m glad this [violent event] happened. They got what they deserved [referring to persons with the attributes noted
above].”
• “[Person with attributes noted above] are dogs” or “[person with attributes noted above] are like animals.”
More examples
• “Get out there and punch a [person with attributes noted above].”
• “Everyone in [groups with attributes noted above] is criminals and thugs.”
• “[Person with attributes noted above] is scum of the earth.”
• “[People with attributes noted above] are a disease.”
• “[People with attributes noted above] are less intelligent than us because their brains are smaller.”
• “[Group with any of the attributes noted above] threaten our existence, so we should drive them out at every
chance we get.”
• “[Group with any of the attributes noted above] has an agenda to run the world and get rid of us.”
• “[Attribute noted above] is just a form of mental illness that needs to be cured.”
• “[Person with any of the attributes noted above] shouldn't be educated in schools because they shouldn't be
educated at all.”
• “All of the so-called victims of this violent event are actors. No one was hurt, and this is just a false flag.”
• “All of the ‘so-called victims’ of this are actors. No one was hurt.”
• Shouting “[people with attributes noted above] are pests!” at someone, regardless of whether the person does
or does not have the alleged attributes.
• Video game content which has been developed or modified (“modded”) to promote violence or hatred
against a group with any of the attributes noted above.
Please remember these are just some examples, and don't post content if you think it might violate this policy.
We may also terminate your channel or account for repeated violations of the Community Guidelines or Terms of Service,
as well as due to a single case of severe abuse, or when the channel is dedicated to a policy violation. You can learn more
about channel or account terminations here.
If we think your content comes close to hate speech, we may limit YouTube features available for that content. You can
learn more about limited features here.
EXHIBIT 5
10/29/2020 Channel or account terminations - YouTube Help
• Repeated violations of the Community Guidelines or Terms of Service across any form of content (like
repeatedly posting abusive, hateful, and/or harassing videos or comments)
• A single case of severe abuse (such as predatory behavior, spam, or pornography)
• Channels or accounts dedicated to a policy violation (like hate speech, harassment, or impersonation)
If you believe that your channel/account was terminated by mistake, you can appeal using this form.
• Don’t submit an appeal request more than once. Multiple requests increase the volume to review and cause delays in
our response.
• Fill out the form as completely as possible including your Channel ID. The more information you give us, the easier it
will be to process your request.
Copyright terminations
If your channel was terminated due to copyright infringement claims and you think the claims are incorrect, you may file a
counter notification. This process is still available for creators with terminated channels, but the counter notification
webform will be inaccessible. You may submit a free-form counter notification.
For more information on the counter-notification process, visit the Copyright Center.
https://support.google.com/youtube/answer/2802168?hl=en 1/1
EXHIBIT 6
10/29/2020 Continuing our work to improve recommendations on YouTube
Jan.25.2019
When recommendations are at their best, they help users find a new song to fall in love with, discover their
next favorite creator, or learn that great paella recipe. That's why we update our recommendations system
all the time—we want to make sure we’re suggesting videos that people actually want to watch.
You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with
misleading titles and descriptions (“You won’t believe what happens next!”). We responded by updating our
system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and
time well spent, all while recommending clickbait videos less often. More recently, people told us they were
getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe
for snickerdoodles. We now pull in recommendations from a wider set of topics—on any given day, more
than 200 million videos are recommended on the homepage alone. In fact, in the last year alone, we’ve
made hundreds of changes to improve the quality of recommendations for users on YouTube.
https://blog.youtube/news-and-events/continuing-our-work-to-improve 1/2
We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content
that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines. To that end,
we’ll begin reducing recommendations of borderline content and content that could misinform users in
harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is
flat, or making blatantly false claims about historic events like 9/11.
While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the
recommendation of these types of videos will mean a better experience for the YouTube community. To be
clear, this will only affect recommendations of what videos to watch, not whether a video is available on
YouTube. As always, people can still access all videos that comply with our Community Guidelines and,
when relevant, these videos may appear in recommendations for channel subscribers and in search results.
We think this change strikes a balance between maintaining a platform for free speech and living up to our
responsibility to users.
This change relies on a combination of machine learning and real people. We work with human evaluators
and experts from all over the United States to help train the machine learning systems that generate
recommendations. These evaluators are trained using public guidelines and provide critical input on the
quality of a video.
This will be a gradual change and initially will only affect recommendations of a very small set of videos in
the United States. Over time, as our systems become more accurate, we'll roll this change out to more
countries. It's just another step in an ongoing process, but it reflects our commitment and sense of
responsibility to improve the recommendations experience on YouTube.
EXHIBIT 7
10/30/2020 Our ongoing work to tackle hate
Jun.05.2019
Over the past few years, we’ve been investing in the policies, resources and products needed to live up to
our responsibility and protect the YouTube community from harmful content. This work has focused on four
pillars: removing violative content, raising up authoritative content, reducing the spread of borderline content
and rewarding trusted creators. Thanks to these investments, videos that violate our policies are removed
faster than ever and users are seeing less borderline content and harmful misinformation. As we do this,
we’re partnering closely with lawmakers and civil society around the globe to limit the spread of violent
extremist content online.
We review our policies on an ongoing basis to make sure we are drawing the line in the right place: In 2018
alone, we made more than 30 policy updates. One of the most complex and constantly evolving areas we
deal with is hate speech. We’ve been taking a close look at our approach towards hateful content in
consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free
speech. Based on those learnings, we are making several updates:
YouTube has always had rules of the road, including a longstanding policy against hate speech. In 2017, we
introduced a tougher stance towards videos with supremacist content, including limiting recommendations
and features like comments and the ability to share the video. This step dramatically reduced views to these
videos (on average 80%). Today, we're taking another step in our hate speech policy by specifically
prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion
based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would
include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally,
we will remove content denying that well-documented violent events, like the Holocaust or the shooting at
Sandy Hook Elementary, took place.
We recognize some of this content has value to researchers and NGOs looking to understand hate in order
to combat it, and we are exploring options to make it available to them in the future. And as always, context
matters, so some videos could remain up because they discuss topics like pending legislation, aim to
condemn or expose hate, or provide analysis of current events. We will begin enforcing this updated policy
today; however, it will take time for our systems to fully ramp up and we’ll be gradually expanding coverage
over the next several months.
In addition to removing videos that violate our policies, we also want to reduce the spread of content that
comes right up to the line. In January, we piloted an update of our systems in the U.S. to limit
recommendations of borderline content and harmful misinformation, such as videos promoting a phony
miracle cure for a serious illness, or claiming the earth is flat. We’re looking to bring this updated system to
more countries by the end of 2019. Thanks to this change, watch time that this type of content gets from
recommendations has dropped by over 50% in the U.S. Our systems are also getting smarter about what
types of videos should get this treatment, and we’ll be able to apply it to even more borderline videos
moving forward. As we do this, we’ll also start raising up more authoritative content in recommendations,
building on the changes we made to news last year. For example, if a user is watching a video that comes
close to violating our policies, our systems may include more videos from authoritative sources (like top
news channels) in the "watch next" panel.
Continuing to reward trusted creators and enforce our monetization policies
Finally, it’s critical that our monetization systems reward trusted creators who add value to YouTube. We
have longstanding advertiser-friendly guidelines that prohibit ads from running on videos that include hateful
content and we enforce these rigorously. And in order to protect our ecosystem of creators, advertisers and
viewers, we tightened our advertising criteria in 2017. In the case of hate speech, we are strengthening
enforcement of our existing YouTube Partner Program policies. Channels that repeatedly brush up against
our hate speech policies will be suspended from the YouTube Partner program, meaning they can’t run ads
on their channel or use other monetization features like Super Chat.
The openness of YouTube’s platform has helped creativity and access to information thrive. It’s our
responsibility to protect that, and prevent our platform from being used to incite hatred, harassment,
discrimination and violence. We are committed to taking the steps needed to live up to this responsibility
today, tomorrow and in the years to come.
https://blog.youtube/news-and-events/our-ongoing-work-to-tackle-hate
EXHIBIT 8
10/29/2020 Managing harmful conspiracy theories on YouTube
Oct.15.2020
Today, we are taking another step in our efforts to curb hate and harassment by removing
more conspiracy theory content used to justify real-world violence.
Managing misinformation and harmful conspiracy theories is challenging because the content is always
shifting and evolving. To address this kind of content effectively, it’s critical that our teams continually
review and update our policies and systems to reflect the frequent changes. Today, we are taking another
step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify
real-world violence. This builds on our work over the last several years to strengthen and evolve our policies
and enforcement — work that has been organized around four pillars: removing violative content, reducing
the spread of harmful misinformation, raising authoritative voices, and rewarding trusted creators.
https://blog.youtube/news-and-events/harmful-conspiracy-theories-youtube 1/3
Nearly two years ago, we took a major step to limit the reach of harmful misinformation by updating our
recommendations system. This resulted in a 70% drop in views coming from our search and discovery
systems. In fact, when we looked at QAnon content, we saw the number of views that come from non-
subscribed recommendations to prominent Q-related channels dropped by over 80% since January 2019.
Additionally, we’ve removed tens of thousands of QAnon videos and terminated hundreds of channels under
our existing policies, particularly those that explicitly threaten violence or deny the existence of major
violent events. All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s
even more we can do to address certain conspiracy theories that are used to justify real-world violence, like
QAnon.
Today we're further expanding both our hate and harassment policies to prohibit content that targets an
individual or group with conspiracy theories that have been used to justify real-world violence. One example
would be content that threatens or harasses someone by suggesting they are complicit in one of these
harmful conspiracies, such as QAnon or Pizzagate. As always, context matters, so news coverage on these
issues or content discussing them without targeting individuals or protected groups may stay up. We will
begin enforcing this updated policy today, and will ramp up in the weeks to come.
Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue
to adapt our policies to stay current and remain committed to taking the steps needed to live up to this
responsibility.
EXHIBIT 9
DAVID H. KRAMER, State Bar No. 168452
LAUREN GALLO WHITE, State Bar No. 309075
KELLY M. KNOLL, State Bar No. 305579
WILSON SONSINI GOODRICH & ROSATI
Professional Corporation
650 Page Mill Road
Palo Alto, CA 94304-1050
Telephone: (650) 493-9300
Facsimile: (650) 565-5100
Email: dkramer@wsgr.com
Email: lwhite@wsgr.com
Email: kknoll@wsgr.com

Attorneys for Defendants
GOOGLE LLC and YOUTUBE, LLC
1. I am a member of the law firm Wilson Sonsini Goodrich & Rosati, P.C., counsel for Defendants Google LLC and YouTube, LLC in the above-captioned action. I have personal knowledge of the facts set forth herein and, if called as a witness, I could and would testify competently thereto.
whistleblowers. This copy of the webpage was retrieved on October 29, 2020.
3. Attached hereto as Exhibit B is a true and correct copy of the YouTube webpage containing the video titled “Cris Armenta Explains the Legal Strategy of the PunchGoogle
4. Attached hereto as Exhibit C is a true and correct copy of a message that the online service Twitter indicates was posted to its website on October 15, 2020 at 9:09 p.m. by a user with
I declare under penalty of perjury under the laws of the United States that the foregoing is true and correct. Executed October 30, 2020 at Palo Alto, California.
I, Lauren Gallo White, am the ECF User whose ID and password are being used to file this document. In compliance with N.D. Cal. Civil L.R. 5-1(i)(3), I hereby attest that the concurrence in the filing of this document has been obtained from the signatory.
Zach Vorhies is organizing this fundraiser on behalf of Ryan Hartwig.
Created October 19, 2020

DONATE NOW to sue Google! STOP THE ELECTION COUP!

All funds go to cover costs of this EMERGENCY INJUNCTION against Google/YouTube.

Updates:
* FULLY FUNDED! THANK YOU TO ALL THE

Recent donations: Anonymous, $50 (12 mins); Levi George, $10 (1 hr); Anonymous, $100 (3 hrs); edward hannegan, $10 (3 hrs); Anonymous, $20 (3 hrs)

https://www.gofundme.com/f/support-big-tech-whistleblowers 1/12
10/29/2020 Fundraiser for Ryan Hartwig by Zach Vorhies : Stop Big Tech Election Interference
Images courtesy of altCensored.com/channel/deleted (link)
Thank you,
~Zach
LEGAL STRATEGY
Q&A
Q: Is a class action status being considered?
A: Yes.
EXHIBIT B
EXHIBIT C
10/29/2020 SGTreport on Twitter: "Please look for ALL future videos & interviews at https://t.co/MJortZQZNF Thanks for asking Hidden Jewell. …
SGTreport
@SGTreport
twitter.com/TheHiddenJewel…
https://twitter.com/SGTreport/status/1316954297695916033 2/2