The FCC is trying to govern content moderation: It doesn’t have the authority

CITRIS Policy Lab Director Brandie Nonnecke published an op-ed in The Hill discussing the FCC’s recent announcement that it will issue rulemaking on Section 230 of the Communications Decency Act to govern platform content moderation. The op-ed explains why this move would be inappropriate, illegitimate, and inconsistent with the FCC’s previous actions.

The Hill: In response to increased content moderation tactics implemented by social media platforms to quell the spread of mis- and disinformation in the 2020 presidential election, the Trump administration continues to pressure the Federal Communications Commission (FCC) to move forward on rulemaking to clarify Section 230 of the Communications Decency Act (CDA). President Trump has put forward Nathan Simington, a senior advisor within the National Telecommunications and Information Administration (NTIA), to serve as an FCC commissioner. In a line of questioning led by Sen. Blumenthal (D-Conn.) during a nomination hearing on Nov. 10, Simington’s role in reviewing and drafting the NTIA petition calling for the FCC to issue rulemaking to clarify Section 230 called his impartiality into question. If confirmed, Simington may be the Trump administration’s last chance to have the FCC reform Section 230.

Justification for the FCC’s involvement in reforming Section 230 has been fiercely debated. On Oct. 15, FCC Chairman Ajit Pai released a statement announcing that the FCC would “move forward with a rulemaking to clarify [Section 230’s] meaning.” In response to widespread criticism that the FCC lacks authority to do so, the FCC’s legal counsel issued a statement claiming it has such authority. However, the FCC does not — and should not — have authority to reform Section 230. The move would be inconsistent with the FCC’s own previous decisions to maintain limited oversight over “information services” and would contradict the intent of the legislation.

Section 230 of the CDA provides platforms a liability shield for user-generated content. In its famous 26-word sentence, Section 230 states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” For better or worse, this protection has enabled platforms to thrive through the largely unencumbered distribution of user-generated content.

Read more >