There has been a lot of digital ink spilled on Section 230 and plenty of threats to eliminate it.
I never liked Section 230, and because I ran an ISP in the 1990s I was in the middle of all the debates that led to it. We had Cubby v. CompuServe as our guidepost on case law, and some cautionary tales on the other side when it came to censorship and such (the Prodigy case being one of them.)
The CDA came about in response to other hot-button issues related to kids and sex content, primarily but not exclusively. There was also a raging dumpster fire over ripped-off content, particularly on Usenet, which was an interesting issue because the ripped-off content was so voluminous that it constituted 80-90% of the total. A large part of that was due to the fact that Usenet was mostly a text discussion system, so to put up binary files (such as images or stolen software) you had to encode them as 7-bit text first (uuencoding, typically), and that made the file sizes expand by more than a third; the rest was simply due to the fact that one scan out of Hustler was larger in size, even before that amplification, than 100 text replies.
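The arithmetic behind that amplification is easy to sketch. Uuencoding turns each 45-byte chunk of binary data into a 62-byte text line (a length byte, 60 encoded characters, and a newline), roughly a 38% size increase. The 200 KB scan and 2 KB reply sizes below are illustrative assumptions, not figures from the original:

```python
def uuencoded_size(n_bytes: int) -> int:
    """Size of a payload after uuencoding: each full 45-byte chunk
    becomes a 62-byte text line (1 length byte + 60 chars + newline)."""
    full, rest = divmod(n_bytes, 45)
    size = full * 62
    if rest:
        # Partial final line: length byte + one 4-char group per
        # 3 input bytes (zero-padded) + newline.
        size += 1 + ((rest + 2) // 3) * 4 + 1
    return size

# Hypothetical sizes: a ~200 KB scanned image vs. a ~2 KB text post.
image = 200 * 1024
reply = 2 * 1024

encoded = uuencoded_size(image)
print(f"encoded size: {encoded} bytes ({encoded / image:.0%} of original)")
print(f"one scan ~ {encoded // reply} text replies, before any re-posting")
```

Even a single modest image, once encoded, outweighs more than a hundred typical text posts, which is why binaries dominated Usenet storage and bandwidth.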
It was thus impossible for an ISP to claim "they didn't know" what was going on. Not only did they know, they had to spend 80% or more of the resources devoted to Usenet, in both storage and bandwidth, specifically to sell access to illegal material, whether that illegality was kiddie porn or ripped-off software and images.
I was very public with my view of these things, and the industry wasn't unaware of either my position or the inevitability that law enforcement and the courts would eventually start looking at it this way -- knowing, intentional participation for the explicit purpose of profit, and from there liability, either civil or criminal, was a near-certainty.
Against this background the CDA was passed, including Section 230, which exempts service providers from liability provided they are not the "source" of the content in question. It also left open the capacity to make editorial decisions without puncturing that shield. The law was poorly written from my perspective and I opposed it at the time for that reason; there was a legitimate argument for codifying protection of some sort, but what they passed was trivially abusable, which has now occurred.
So let's fix it, but without vitiating Section 230 directly -- and yes, we can.
You see, at the time, in the late 1990s, there was no capacity on a storage or analysis basis to do what we do now. Specifically, disk space and processor time were both quite expensive. The disks on our primary RAID arrays were 8GB -- yes, gigabytes -- and they were (by today's standards) very expensive. Today both are trivially cheap; indeed, solid-state drives with 30x that capacity and more than a hundred times the performance are a fraction of their cost, which is why tracking cookies and the like are all over the place and used for targeted advertising.
But this capacity also puts in place the means to neuter the monster.
You see, the argument for content discrimination (e.g. Facebook deciding they do not want certain things on their servers) is that advertisers might, or do, object to their ads being run against something they find offensive, and thus there is serious reputational risk.
But.... isn't the point of all this tracking data to not do that in the first place, since said person is not a viable customer?
Of course it is.
Here's the reality of it folks: Google, Facebook, Twitter and others specifically sell targeted advertising. That is the entirety of their business in that respect: providing advertisers tools so their ads run against content they, and only they, wish them shown against, and only to those persons who meet whatever criteria they select.
Nobody wants to pay for advertising that is worthless, obviously, nor does anyone want to pay for advertising against something that offends their customers or their corporate position on some issue. Where what you have is an undifferentiated display (e.g. a billboard on the side of the highway) there is no way to choose who sees it, because you have no control over who is on the road and every pair of eyeballs can look at it.
But in the modern world, with electronic devices that personally deliver advertising with granularity and selection down to an individual human being (which is what every one of these businesses does, and which forms the very basis upon which they have value as firms), this is no longer true.
Therefore any claim that someone must be "removed" or "prohibited" from espousing a particular view, lest advertisers have their promotional messages attached to and associated with content they disagree with, is a lie and in fact fraud, as no advertiser using these businesses displays ads against content except by their own decision to key said display on whatever tags and other characteristics of the viewer and content that they, in their sole election, choose.
Further, it is a well-established principle of law under 15 USC Chapter 1, standing as law for well over 100 years, that banding together to monopolize -- or attempting to do so -- is a felony. Therefore for advertisers or content providers to undertake a collective action (which is what they do today with "TNI" and similar initiatives) to prohibit certain viewpoints from being monetized or even expressed, when such persons so act with commercial intent, cannot be defended on the basis of "reputational risk," since all such advertising can be and is already targeted so that it shows up only on content, and to people, where the advertiser wishes it seen.
Indeed the entire point of such actions is to prejudice and even prohibit commerce that a group disagrees with; that is, through the acts of two or more people, to monopolize commerce by effectively barring from the marketplace firms that those doing so don't like.
Facts vitiate false claims, especially when you profit from them and in fact structure your entire business around those facts.
So let's make it simple and pass a short two-paragraph federal law as an amendment to the CDA:
No provider of a public service, whether social media, storage, processing power or infrastructure (including DNS, pipes, etc.) may discriminate on viewpoint as to the persons they allow on said service or the viewpoints expressed in any form or fashion, for or against, if they collect, analyze and sell or use tracking information of any sort for the purpose of marketing, advertising or content selection. For the purposes of this law any cross-ownership interest of any sort between firms, including stock or option ownership and common board membership, or membership in or participation with any coordinating entity between firms whether for profit or not shall bring a firm under the umbrella of this requirement.
A provider who violates this section of law takes publisher liability for all content stored or distributed on their systems and networks irrespective of any provisions that would otherwise shield them either civilly or criminally.
Does this stop a site like mine from banning someone on content or viewpoint, when my TOS disclaims the collection, collation and sale of information on the people who read and post here, either retrospectively or prospectively? No. Section 230 still exists.
But does it stop Facebook or Google (e.g. Youtube) from doing so, and does creating something like "TNI" immediately do so for all member organizations? Yes, unless they dismantle their content- and person-targeted analysis and data-sales procedures, which they can't do without destroying their business, since the entire point of their firm, indeed the only reason they have a firm, is said targeting of advertising.
Can one of these "ad-selling firms," after such a law, de-monetize a site? No, because the advertisers on said firm can each individually choose not to display ads on said site. Indeed, for them to do so directly implicates 100+ year-old antitrust law, and this two-paragraph addition to the CDA codifies same as a prohibited action unless the provider chooses to in fact be a publisher, with all the attendant liability thereof, for everything on and distributed through their systems.
That ought to do it.