Editor’s note: Editorials represent the views of the Star Tribune Editorial Board, which operates independently of the newsroom.
Section 230 of the Communications Decency Act of 1996 has to be among the things that fall under the “so good it hurts” category.
It’s the part of the U.S. Code that essentially frees online content providers from liability for what their multitudes of users might post. It also gives platforms legal protection to moderate content to the extent they choose.
When the legislation was passed, one of its stated purposes was "to promote the continued development of the Internet and other interactive computer services and other interactive media."
I guess it worked.
Another was “to preserve the vibrant and competitive free market that now exists for the Internet and other interactive computer services, unimpeded by federal or state regulation.”
Hmm. Internet content is indeed a vibrant tapestry of many colors, but while we’re mixing metaphors, we’ll add that much of it matches the palette of humanity’s uglier output. The experience can be off-putting enough to drive the most constructive participants out of the room, which means there is no truly competitive marketplace of ideas, either.
The primary goal of the Communications Decency Act, itself part of the Telecommunications Act of 1996, was to cleanse the young internet of pornography. Most of the CDA was later struck down in court, but Section 230 survived and brought us the online world we have today. That world is hardly wholesome, but indecency is the least of the concerns now. The bigger worry is how legal immunity allows misinformation and hate to spread, and how hard it is to reach agreement on defining and addressing those problems.
Political leaders from both parties have lined up to take on Section 230.
President Joe Biden would apparently knock over the whole piñata. He told the New York Times editorial board during his campaign that Section 230 “should be revoked, immediately” to counter the spread of lies, although it now sounds like an example of speaking loudly while holding a small stick.
President Donald Trump also wanted the law repealed, for the opposite reason – he believed content providers had too much power over what was circulating and were exercising it for political reasons.
House Republican staffers have crafted a set of principles aimed at addressing the essence of Trump’s allegation. One of their ideas is to remove immunity for large providers while keeping it in place for smaller, newer participants. Another is to review the application of Section 230 to Big Tech companies every five years.
Former President Barack Obama, in a speech last month at Stanford University, said he favored reforming Section 230 but not repealing it altogether. “A lot of the conversation around misinformation centers around what people post,” he said. “The biggest issue is what content these platforms promote.” He said companies should allow regulators to review their algorithms and moderation practices, much as regulators already inspect proprietary practices in other industries.
Minnesota’s senior U.S. senator, Amy Klobuchar, has also entered the fray with a 2021 bill to strip immunity from tech companies if their algorithms amplify health misinformation during public health emergencies. (The Department of Health and Human Services would define misinformation.) Klobuchar also co-introduced the bipartisan NUDGE Act, which attempts to address the role of algorithms without changing Section 230 immunity.
A little legal accountability can go a long way toward reinforcing internal norms, as those of us who work in mainstream media can attest. But dumping Section 230 seems unlikely given the philosophical differences behind the various Republican and Democratic efforts. There does seem to be enough overlap for careful reform, though. Not a bad place to start. Unintended consequences are the reason the law is being revisited; they would also be a risk in the event of repeal.