Section 230 and Libraries

ARL and our partners are working to help lawmakers, our community, and the general public understand how the law protects libraries and universities, not just “big tech,” which is often the subject of Congressional hearings and legislation. Section 230 of the Communications Decency Act protects online platforms and their users from liability for the speech of third parties. It also protects the First Amendment right to moderate third-party content.

How do libraries rely on Section 230?

Thanks to Section 230, libraries can continue to provide venues for public discourse and community-engaged research and implement content-moderation policies, without fear of incurring liability for third-party speech or for taking action to remove it. Examples include: 

  • Moderating virtual discussions in real time by invoking conduct policies
  • Removing offensive contributions from crowdsourced projects and community engagement projects
  • Implementing policies around materials deposited in open repositories, such as student dissertations, faculty papers, pre-prints, oral histories, etc.

Is Section 230 blanket immunity?

No. Section 230 includes exceptions for material that is already unlawful, including child sexual abuse material and intellectual property infringement. Additionally, it does not provide protection against federal criminal liability.

Are algorithms covered under Section 230?

Algorithmic amplification is protected by the First Amendment. The question of whether algorithms are covered under Section 230 has not been addressed by courts. ARL’s position is that legislation or a court finding exposing platforms to liability for recommending, promoting, ranking, arranging, or otherwise displaying third-party content would negatively affect library discovery tools, personalized recommendations, and research.

What is our message for Congress?

Reforming Section 230 without consideration for libraries, archives, and universities may result in the loss of rich public discourse and opportunities to inform, educate, and address misinformation. ARL urges Congress to consider the potential unintended consequences of reforming Section 230, and to hold hearings with research libraries and other higher education stakeholders to develop a record of the breadth of the law’s protections.

An overview of Section 230

The first clause holds that it is the speakers, not the platforms, who are liable for content they post:

Section 230(c)(1) No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

The second clause incentivizes content moderation by guaranteeing liability protection for good-faith efforts to restrict objectionable material:

Section 230(c)(2) No provider or user of an interactive computer service shall be held liable on account of—

A. any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

B. any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

Libraries are included in the statutory definition of “interactive computer service”:

Section 230(f)(2) The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
