i2Coalition Supports an End to Federal Criminal Publishing Liability
The i2Coalition is proud to join the Center for Democracy and Technology and many others in a joint statement to Congressional leaders expressing our concerns about potential federal criminal liability for entities that host user-generated content. Congress is currently reviewing legislation that would hold web hosts responsible for what others post. The bills under consideration aim to stop content that facilitates child trafficking. The i2Coalition strongly supports efforts to end human trafficking, and through our Best Practices Initiative and our partnerships we have worked tirelessly to make the Internet a safer, more secure place.
The current legislation, as written, is overly broad and threatens privacy rights as well as free speech. There are several steps Congress can take to reach this laudable goal without compromising constitutional guarantees. From our statement:
- Congress has long recognized the importance of protecting content hosts. The Internet is a powerful platform for individuals to access information and exchange opinions. This is due in significant part to legal protections for intermediaries that make up the Internet, including third-party content hosts, user-generated content platforms, and advertising networks. 47 U.S.C. § 230 (commonly known as “Section 230”) provides these intermediaries with crucial certainty that they will not be held legally liable for the content that their users post. Congress recognized that, without such legal protection, the risk of potential liability would strongly discourage content hosts from offering people the ability to share information, opinions, and creative expression online.
- Small businesses will be disproportionately affected. Legislation that creates the potential for federal criminal liability based on content created by a third party would expose all of these intermediaries and publishers to the threat of criminal prosecution. The need to defend against such prosecutions, even as a wholly innocent party, would prove extremely costly to small businesses and would create a high hurdle for many new start-ups. U.S. laws that place responsibility for content with its creators have resulted in the most robust and attractive Internet infrastructure in the world. Imposing liability on the conduits of this information would likely drive users of U.S. infrastructure to providers in other countries.
- Content creators will engage in self-censorship. Faced with the potential for costly and time-consuming legal proceedings, user-generated content sites and other publishers would likely take down any content that is flagged as potentially problematic, rather than risk even the possibility of criminal prosecution. This overbroad approach would lead to the removal of constitutionally protected speech. It would also create a potentially powerful “heckler’s veto” mechanism for those seeking to suppress other users’ speech, as providers of service to content creators would be unlikely to take the risk of ignoring even a spurious flag, choosing simply to disable content flagged in this manner.
- Further, hosts and other Internet infrastructure providers will be discouraged from moderating content. Perversely, the new risk of criminal liability would discourage good-faith screening and content moderation efforts by hosts and other Internet infrastructure providers. Efforts to pre-screen could be used to support allegations that a host had knowledge of illegal content, and one incorrect decision in a pre-screening process that allowed something unlawful to slip through could open the door to prosecution. This is precisely the disincentive Congress sought to eliminate with Section 230’s “Good Samaritan” provision.
- Extensive recordkeeping requirements will place immense burdens on hosts, Internet infrastructure providers, and online speakers. An online identity verification requirement would unquestionably chill adults’ willingness to engage in lawful communications and significantly intrude upon their right to privacy. Similar identification requirements in the Child Online Protection Act led to that law being struck down due, in part, to the burden these requirements place on speakers, listeners, and hosts of protected speech.[1] Hosts and other intermediaries would be unable to independently verify identification information supplied by those posting content and would face the risk of liability despite good-faith efforts to comply with the law. Identification requirements would create thousands of privately held databases of sensitive information, dramatically increasing the likelihood of damaging data breaches that expose individuals’ personal information to malicious actors.
- Courts have found similar state laws unconstitutional. State laws pursuing similar aims have been enjoined for violating the First Amendment, with courts finding that such laws are vague and overbroad, create a chilling effect on lawful speech, and fail the least-restrictive-means test.[2]
Read the full statement and list of signatories on the Center for Democracy & Technology’s website: Coalition Statement in Opposition to Federal Criminal Publishing Liability.