The Decency of the Communications Decency Act

You own or administer a website or blog that allows visitors to comment. The odds are that, on occasion, a commenter will say something critical about someone or some company. It might even border on outrageous, and it might, if it were not true, be defamatory. Should you remove the statement? And if you are sued for not removing a statement placed on your website by a commenter, will you lose? It turns out that U.S. federal law protects those who publish online (operators of websites, interactive computer services, and others) from liability for the content and statements of third parties. Immunity means that you are not liable for damages caused by those who post content on your site. You are protected by the Communications Decency Act, or, as lawyers call it, 47 U.S.C. § 230.

You qualify for protection under this law if:

1.  You are a “provider or user of an interactive computer service.”  See 47 U.S.C. § 230(c)(2).

An “interactive computer service” is defined in the statute as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.” 47 U.S.C. § 230(f)(2). Although the language is somewhat archaic, the definition of “interactive computer service” is broad enough to cover most forms of social media: websites, forums, blogs, listservs, and other user-generated content (UGC)-heavy sites.

2.  The cause of action must “treat” the defendant (service) “as the publisher or speaker” of the allegedly unlawful content in order to be eligible for immunity under CDA § 230.  See 47 U.S.C. § 230(c)(1).

The CDA § 230 immunity excludes causes of action based on federal criminal law, intellectual property laws, and the Electronic Communications Privacy Act (ECPA). See 47 U.S.C. § 230(e).

3.  The content in dispute must have been “[p]rovided by another information content provider” other than the defendant. See 47 U.S.C. § 230(c)(1).

An “information content provider” means “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” 47 U.S.C. § 230(f)(3).

Congress enacted this law in 1996 to stop, or at least inhibit, the proliferation of pornography and other obscene material online. The word used in the text is “indecent,” but it remains undefined throughout the statute; no legislator was willing to craft a definition that clearly articulates what indecency is without sweeping in art or creative writing. “Indecent” remains a subjective term, meaning each person’s idea of what is indecent varies. Some may find particular art “indecent”; others would find it perfectly fine. (The same goes for humor, by the way.)

While the CDA leaves defamation, whether slander or libel, just as unlawful on the Internet as anywhere else, it specifically exempts Internet Service Providers from liability for defamatory comments posted by others. The courts have defined the term “Internet Service Provider” quite broadly, and this broad interpretation of Section 230 has served to protect Internet Service Providers and third parties from litigation over libel or slander online. Bloggers, for example, can be sued depending on their relation to the defamatory content: if they publish the content personally, they are liable; but if their blog’s comment section merely hosts the defamatory comment, they are acting as an Internet Service Provider and are not liable for the damage to the subject’s reputation. To the extent that they allow users to use their blogs as communicative outlets, they are exempt from responsibility for what those users say.

Caution: If you edit comments, guest editorials, or republished material, you cannot be held liable unless the editing process itself creates defamatory statements. Any forum website, wiki, or chat site will therefore be protected from the defamatory statements of its users, regardless of whether the website has posting guidelines; those guidelines have no bearing on who is liable for which comments. Section 230 of the CDA also makes clear that republishing defamatory content is not actionable, so long as the republisher does not add his or her own two defamatory cents.

Law Regarding the Communications Decency Act

There is a relatively well-developed body of case law that defines your rights and responsibilities regarding the posting of comments by third parties. If you are concerned about your liability, or if you have been sued, here is a discussion of various cases that might help you better understand your exposure. Most of these cases have been reported by New Media Rights, Rip-off Reports, and Wikipedia, although we also reference some other cases that might be applicable as well.

A number of recent cases have reinforced §230 immunity, despite plaintiffs’ repeated attempts to plead creatively around the statute. Doe v. MySpace, Inc. (5th Cir 2008) 528 F3d 413, is one of several cases involving underage girls who were sexually assaulted by men with whom they communicated on social networking websites. Plaintiffs argued that MySpace was negligent in failing to police its postings, but the court dismissed the failure-to-police argument as a thinly disguised claim based on the defendant’s editorial practices, which are precisely what §230 immunizes.

In Barnes v Yahoo!, Inc. (9th Cir 2009) 565 F3d 560, amended by 570 F3d 1096, the Ninth Circuit re-emphasized the point that §230 preempts negligence claims. In Barnes, the defendant failed to remove unauthorized nude photographs of the plaintiff and a fraudulent profile in which the plaintiff’s former boyfriend impersonated the plaintiff after the plaintiff had requested removal. Although the plaintiff’s negligence claim was barred by §230, the court held that the plaintiff could recover under a promissory estoppel theory because the host website had promised to remove the offending content. Thus, if a host service promises to remove offending content but does not do so, §230 will not bar a claim for promissory estoppel.

An interesting question arises when a website that publishes user content is ordered by injunction to take down certain content. Is the site required to comply with the injunction, or may the site ignore it and claim CDA §230 immunity? Although the court in Blockowicz v Williams (ND Ill, Dec 21, 2009, No. 09-C-3955) 2009 US Dist Lexis 118599, did not answer this question directly, it did not require the website (Ripoff Report) to comply with the injunction. In Blockowicz, a user posted allegedly defamatory remarks concerning the plaintiff on a number of websites, the plaintiff obtained an injunction requiring removal, and all sites except Ripoff Report removed the content. The court concluded that Ripoff Report had only a “tenuous” relationship with the poster and could not be termed “in active concert or participation with” the enjoined party under Fed R Civ P 65(d)(2)(C). 2009 US Dist Lexis 118599, *5.

Arguably, if state intellectual property law claims—such as a claim of violation of the right of publicity—are not preempted by §230, conflicts between federal and state law could chill experimentation and the creation of new media services. Yet the language of §230 strongly supports the conclusion that state intellectual property law claims should not be preempted. Section 230(e)(2) reads, “[n]othing in this section shall be construed to limit or expand any law pertaining to . . . intellectual property,” without limiting preemption to “federal” intellectual property only.

Doe v Friendfinder Network, Inc (D NH 2008) 540 F Supp 2d 288 is one of many §230 cases involving an adult dating service’s online publication of a false user-supplied profile. Friendfinder, however, contained two new allegations: (1) although the site removed the false profile, it replaced the removed profile with a message that read, “Sorry, this member has removed his/her profile,” which, plaintiff alleged, meant that at one point she had authorized the profile; and (2) portions of the false profile were reused by the defendant as advertisements on third party websites. 540 F Supp 2d at 292. The court quickly dismissed the plaintiff’s tort claims on the grounds of §230, despite the dating service’s affirmative reposting and despite the dropdown menus that it provided to users to build profiles. However, the court denied preemption of the plaintiff’s right of publicity claims and any other state intellectual property law claims.

NOTE The Friendfinder decision directly conflicts with the Ninth Circuit’s ruling in Perfect 10, Inc. v CCBill LLC (9th Cir 2007) 488 F3d 1102, which held that §230 does not preempt federal intellectual property claims but does preempt state claims, such as trade secret misappropriation. The Friendfinder court makes a strong interpretive argument that the word “federal” should not be read into §230(e)(2), but it is not binding precedent within the Ninth Circuit.

Blatantly Illegal Content

As discussed above, a number of cases have explored the extent to which a service provider retains §230 immunity when it involves itself in the content of user submissions. The Ninth Circuit’s decision in Fair Housing Council of San Fernando Valley v Roommates.com, LLC (9th Cir 2008) 521 F3d 1157, highlights the confusion. In that case, the Ninth Circuit ruled that courts should err in favor of the service provider when faced with a grey area, but the criteria the court used to find Roommates.com liable for its search functionality were not well defined.

On Roommates.com, users were required to provide information about their gender, sexual orientation, and whether they would bring children into a household, as well as their preference for roommates based on the same criteria. The website also provided a fill-in “Additional Comments” section for users. 521 F3d at 1161. The site published the information provided by users and used it to connect users with compatible roommate preferences. The Ninth Circuit held that Roommates.com did not qualify for §230 immunity because of (1) the specific questions it posed to users, (2) the users’ profile answers using Roommates.com’s pull-down menus, and (3) its search and e-mail system, which displayed and suppressed results based on unlawful criteria. Roommates.com did, however, receive §230 immunity for its open-ended “Additional Comments” field, despite any unlawful content submitted by users in that field. 521 F3d at 1174.

The opinion attempted to narrow the defendant’s offending behavior by reinforcing that pull-down menus are not inherently problematic. The court stated that “[a] dating website that requires users to enter their sex, race, religion and marital status through drop-down menus, and that provides means for users to search along the same lines, retains its CDA immunity insofar as it does not contribute to any alleged illegality.” 521 F3d at 1169. The court also stated: “The message to website operators is clear: If you don’t encourage illegal content, or design your website to require users to input illegal content, you will be immune.” 521 F3d at 1175. The problem is that ambiguities exist in the opinion regarding what actions by a website would “encourage illegal content,” and uncertainty can breed caution among websites that wish to provide advanced, structured searches of user-generated content.

Contrary to some expectations, the Roommates.com decision has not spawned more cases narrowing §230 immunity. Only a few of the cases citing Roommates.com are ones in which online hosting services lost, such as the Tenth Circuit’s decision in FTC v Accusearch Inc. (10th Cir 2009) 570 F3d 1187. In Accusearch, the court held that an online seller of illegal telephone records did not qualify for immunity, even though it received the records from a third party rather than obtaining the records itself. Indeed, the Ninth Circuit’s effort in Roommates.com to limit its holding to services that “encourage illegal content” has resulted in more citations of the case in favor of defendants. See, e.g., Nemet Chevrolet, Ltd. v Consumeraffairs.com, Inc. (4th Cir 2009) 591 F3d 250.

Marketing Representations

Mazur v eBay Inc. (ND Cal, Mar. 3, 2008, No. C 07-03967) 2008 US Dist Lexis 16561, explores the boundaries of liability for marketing representations. In Mazur, the court held that eBay was entitled to §230 immunity for (1) information provided by third party vendors, (2) failing to prevent a seller’s illegal conduct, and (3) asserting that certain auction houses were screened. The court found, however, that the following three statements were not immunized: (1) that live bidding was “safe,” (2) that bidding was conducted against “floor bidders,” and (3) that bidding involved international auction houses. Mazur illustrates how a combination of the service provider’s own representations and user-generated content, where the provider’s statements are not merely a regurgitation of the user’s content, may result in liability. The decision has been criticized for its failure to immunize service providers for marketing representations, and it may open up a significant opportunity for plaintiffs to bypass §230.

Other Relevant Federal Cases

Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997).
Immunity was upheld against claims that AOL unreasonably delayed in removing defamatory messages posted by a third party, failed to post retractions, and failed to screen for similar postings.

Blumenthal v. Drudge, 992 F. Supp. 44, 49-53 (D.D.C. 1998).
The court upheld AOL’s immunity from liability for defamation. AOL’s agreement with the contractor allowing AOL to modify or remove such content did not make AOL the “information content provider” because the content was created by an independent contractor. The Court noted that Congress made a policy choice by “providing immunity even where the interactive service provider has an active, even aggressive role in making available content prepared by others.”

Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. 2003).
The court upheld immunity for an Internet dating service provider from liability stemming from third party’s submission of a false profile. The plaintiff, Carafano, claimed the false profile defamed her, but because the content was created by a third party, the website was immune, even though it had provided multiple choice selections to aid profile creation.

Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).
Immunity was upheld for a website operator for distributing an email to a listserv where the plaintiff claimed the email was defamatory. Though there was a question as to whether the information provider intended to send the email to the listserv, the Court decided that for determining the liability of the service provider, “the focus should be not on the information provider’s intentions or knowledge when transmitting content but, instead, on the service provider’s or user’s reasonable perception of those intentions or knowledge.” The Court found immunity proper “under circumstances in which a reasonable person in the position of the service provider or user would conclude that the information was provided for publication on the Internet or other ‘interactive computer service’.”

Green v. AOL, 318 F.3d 465 (3rd Cir. 2003).
The court upheld immunity for AOL against allegations of negligence. Green claimed AOL failed to adequately police its services and allowed third parties to defame him and inflict intentional emotional distress. The court rejected these arguments because holding AOL negligent in promulgating harmful content would be equivalent to holding AOL “liable for decisions relating to the monitoring, screening, and deletion of content from its network — actions quintessentially related to a publisher’s role.”

Barrett v. Rosenthal, 40 Cal. 4th 33 (2006).
Immunity was upheld for an individual internet user from liability for republication of defamatory statement on a listserv. The court found the defendant to be a “user of interactive computer services” and thus immune from liability for posting information passed to her by the author.

MCW, Inc. v. Badbusinessbureau.com (Magedson/XCENTRIC Ventures LLC), 2004 WL 833595, No. Civ.A.3:02-CV-2727-G (N.D. Tex. April 19, 2004).
The court rejected the defendant’s motion to dismiss on the grounds of Section 230 immunity, ruling that the plaintiff’s allegations that the defendants wrote disparaging report titles and headings, and themselves wrote disparaging editorial messages about the plaintiff, rendered them information content providers. The web site, Ripoff Report, allows users to upload “reports” containing complaints about businesses they have dealt with.

Hy Cite Corp. v. Badbusinessbureau.com (Ed Magedson/XCENTRIC Ventures LLC), 418 F. Supp. 2d 1142 (D. Ariz. 2005).
The court rejected immunity and found the defendant was an “information content provider” under Section 230, using much of the same reasoning as the MCW case.

Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830 (2002).
eBay’s immunity was upheld for claims based on forged autograph sports items purchased on the auction site.

Ben Ezra, Weinstein & Co. v. America Online, 206 F.3d 980, 984-985 (10th Cir. 2000), cert. denied, 531 U.S. 824 (2000).
Immunity for AOL was upheld against liability for a user’s posting of incorrect stock information.

Goddard v. Google, Inc., C 08-2738 JF (PVT), 2008 WL 5245490, 2008 U.S. Dist. LEXIS 101890 (N.D. Cal. Dec. 17, 2008).
Immunity upheld against claims of fraud and money laundering. Google was not responsible for misleading advertising created by third parties who bought space on Google’s pages. The court found the creative pleading of money laundering did not cause the case to fall into the crime exception to Section 230 immunity.

Milgram v. Orbitz Worldwide, LLC, ESX-C-142-09 (N.J. Super. Ct. Aug. 26, 2010)
Immunity for Orbitz and CheapTickets was upheld for claims based on fraudulent ticket listings entered by third parties on ticket resale marketplaces.

Doe v. America Online, 783 So. 2d 1010, 1013-1017 (Fla. 2001), cert. denied, 122 S. Ct. 208 (2001).
The court upheld immunity against state claims of negligence based on “chat room marketing” of obscene photographs of a minor by a third party.

Kathleen R. v. City of Livermore, 87 Cal. App. 4th 684, 692 (2001)
The California Court of Appeal upheld the immunity of a city from claims of waste of public funds, nuisance, premises liability, and denial of substantive due process. The plaintiff’s child downloaded pornography from a public library’s computers which did not restrict access to minors. The court found the library was not responsible for the content of the internet and explicitly found that section 230(c)(1) immunity covers governmental entities and taxpayer causes of action.

Doe v. MySpace, 528 F.3d 413 (5th Cir. 2008)
The court upheld immunity for a social networking site from negligence and gross negligence liability for failing to institute safety measures to protect minors and failing to institute policies relating to age verification. The Does’ daughter had lied about her age and communicated over MySpace with a man who later sexually assaulted her. In the court’s view, the Does’ allegations were “merely another way of claiming that MySpace was liable for publishing the communications.”

Dart v. Craigslist, Inc., 665 F. Supp. 2d 961 (N.D. Ill. Oct. 20, 2009)
The court upheld immunity for Craigslist against a county sheriff’s claims that its “erotic services” section constituted a public nuisance because it caused or induced prostitution.

Chicago Lawyers’ Committee For Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666 (7th Cir. 2008).
The court upheld immunity for Craigslist against Fair Housing Act claims based on discriminatory statements in postings on the classifieds website by third party users.

Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008)
The Ninth Circuit Court of Appeals rejected immunity for the roommate matching service for claims brought under the federal Fair Housing Act and California housing discrimination laws. The court concluded that, through the manner in which the service elicited information from users concerning their roommate preferences (dropdowns specifying gender, presence of children, and sexual orientation) and the manner in which it utilized that information in generating roommate matches (eliminating profiles that did not match user specifications), the matching service created or developed the information claimed to violate the FHA, and thus was responsible for it as an “information content provider.” The court upheld immunity for the descriptions posted by users in the “Additional Comments” section because these were entirely created by users.

Delfino v. Agilent Technologies, 145 Cal. App. 4th 790 (2006), cert. denied, 128 S. Ct. 98 (2007).
A California appellate court unanimously upheld immunity from state tort claims arising from an employee’s use of the employer’s e-mail system to send threatening messages. The court concluded that an employer that provides Internet access to its employees qualifies as a “provider . . . of an interactive computer service.”

In short, the CDA provides that when a user writes and posts material on an interactive website, the site itself cannot, in most cases, be held legally responsible for the posted material. Specifically, 47 U.S.C. § 230(c)(1) states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because user-submitted reports are authored by the users themselves, the site’s operators cannot be legally regarded as the “publisher or speaker” of those reports, and hence they are not liable for the reports even if they contain false or inaccurate information (note: operators occasionally create editorial comments and other material, but when they do, that material is clearly marked as such). The same law applies to sites like Facebook, MySpace, and Craigslist: users who post information on these sites are responsible for what they write, but the operators of the sites are not.

The reasons for this law are simple. Websites cannot possibly monitor the accuracy of the huge volume of information their users may choose to post. If an angry plaintiff were permitted to hold a website liable for information that the site did not create, this would stifle free speech, as sites would be unwilling to permit users to post anything at all. See generally Batzel v. Smith, 333 F.3d 1018, 1027-28 (9th Cir. 2003) (recognizing that “[m]aking interactive computer services and their users liable for the speech of third parties would severely restrict the information available on the Internet. Section 230 [of the CDA] therefore sought to prevent lawsuits from shutting down sites and other services on the Internet.”)

Other State Cases

Intellect Art Multimedia, Inc. v. Milewski, 2009 WL 2915273 (N.Y. Sup. Sept. 11, 2009) (claims against Xcentric dismissed for failure to state a claim due to CDA immunity)
GW Equity, LLC v. Xcentric Ventures, LLC, 2009 WL 62173 (N.D. Tex. 2009) (summary judgment entered in favor of Xcentric based on CDA immunity)
Global Royalties, Ltd. v. Xcentric Ventures, LLC, 544 F. Supp. 2d 929 (D. Ariz. 2008) (claims against Xcentric dismissed pursuant to Fed. R. Civ. P. 12(b)(6) without leave to amend based on CDA immunity)
Global Royalties, Ltd. v. Xcentric Ventures, LLC, 2007 WL 2949002 (D. Ariz. Oct. 10, 2007) (claims against Xcentric dismissed pursuant to Fed. R. Civ. P. 12(b)(6) based on CDA immunity)
Whitney Info. Network, Inc. v. Xcentric Ventures, LLC, 2008 WL 450095; 2008 U.S. Dist. LEXIS 11632 (M.D. Fla. Feb. 15, 2008) (summary judgment entered in favor of Xcentric based on CDA immunity)

There was one case in 2003 in which a website was sued in a foreign country and a default judgment was entered in the plaintiff’s favor for more than $27 million in Eastern Caribbean Dollars. When the plaintiff tried to domesticate that judgment in the United States, the website fought it; the case was resolved and the judgment satisfied without any money being paid. A law enacted in August 2010 now generally prohibits U.S. courts from honoring foreign judgments like this. Under the “anti-libel tourism” statute, 28 U.S.C. § 4102, U.S. courts may not honor foreign libel/defamation judgments that conflict with the free speech rights guaranteed by the First Amendment. In addition, the law specifically prohibits U.S. courts from honoring foreign libel/defamation judgments against website operators if the claims at issue would have been barred under U.S. law. See 28 U.S.C. § 4102(c).
