Cato Institute

Senate Report Highlighting Bias in Online Services Shows How the Market Can Serve More Perspectives and Advance Expression

May 1, 2024 1:22 PM

Senator Ted Cruz's (R-TX) staff on the Senate Commerce Committee recently published a report that investigates several instances of what it calls "Online Service Providers … silencing conservatives" by relying "upon biased left-wing organizations." While these specific examples may be viewed as mere anecdotes, the report dives deeper than many other accusations of political bias and tries to understand how exactly these companies reached their decisions to deplatform various conservatives.

The report concludes that "The ideal solution is for market forces to correct Online Service Providers' discriminatory policies." While the report also embraces some limited legislative fixes that do not live up to this ideal, its emphasis on understanding how content decisions are made and how the market can help address real or perceived bias is constructive. In this same vein, I will be publishing a paper next month that explains how social media content moderation works and how the market and civil society can support greater expression and greater choice online.

It's worth noting that it is and should be the right of such companies to set their own policies and deny their services to those with whom they disagree. But then, the report argues, we should dispense with the pretense that these are neutral companies trying to offer their services to most Americans, so that the market can satisfy those no longer being served by these large tech companies. And unlike many other criticisms of tech company decisions, the report's authors spoke directly with the companies to understand how and why these decisions were made.

The first highlighted decision was by Slack, a workplace communications platform, to suspend the right-wing social media influencer Libs of TikTok because of its various posts and reporting about what myriad LGBT organizations and activists were themselves posting online. The report notes that the deplatforming effectively cost Libs of TikTok employees their prior communications on Slack, significantly disrupting the organization's operations.

The report then looks at Eventbrite's cancellation of various events by groups like Young America's Foundation, College Republicans, and others hosting conservative commentator Matt Walsh or his documentary What Is a Woman? for questioning a progressive view of sex and gender. Similarly, Eventbrite cancelled an event featuring Riley Gaines, an elite collegiate swimmer, for her views regarding transgender athletes participating in women's sports. These cancellations often imposed significant disruption and costs on the event organizers.

The report also notes other instances of deplatforming, including the immigration-critical Federation for American Immigration Reform (FAIR), the Independent Women's Forum, and other organizations that were excluded by these and other online service providers because of mainstream political and social views on important issues of our day.

The enforcement described in this report differs from typical social media content moderation. When dealing with billions of pieces of content, AI-powered enforcement tools are necessary to sift through the sheer volume of material, often paired with large contingents of human moderators who must make relatively quick moderation decisions. This report describes more comprehensive reviews by expert and executive trust and safety teams that weigh a broad range of factors.

Such reviews were among my responsibilities when I was a member of Meta's content policy team, and they were often reserved for the most difficult and socially or politically sensitive content. And while using highly trained, expert moderators for difficult decisions makes sense, it also risks allowing external and internal biases to influence the outcome.

For example, the committee report identified outside organizations that may exert significant influence over not just the writing of various policies but also how those policies are enforced. Eventbrite testified to the committee that it "relies on third-party sources, including Southern Poverty Law Center (SPLC) and the Anti-Defamation League, in determining whether an organization's event violates the Eventbrite Community Guidelines." Similarly, Slack told the committee that it generally relies on "third-party experts" and "industry-recognized resources" in enforcing its policies.

The way this works in practice is that external and generally progressive "experts" and "partners" frequently, persistently, and aggressively lobby for certain users, pieces, or types of content to be removed as hateful or otherwise harmful. It is rare for right-leaning or libertarian groups to be as trusted internally or to report as much content as these progressive "partners."

Often, the content did not violate the clear letter of the policies, but these organizations applied pressure through media campaigns, organized boycotts, or one-sided research designed to make companies appear complicit in harmful content. The result of such external pressure, together with receptive internal teams, is that companies sometimes cave or willingly support the suppression of certain viewpoints. They invoke high-level principles or certain contexts rather than specific policy lines, or may even change the policy to align with activist demands.

This can be seen at work in this committee report, in which the service providers often avoid detailing how a user specifically violated their policies but instead refer to vague principles or contextual information they believe to be important. For example:

  • When asked if they were removing events featuring Riley Gaines because she made an X post about how women lack a Y chromosome, Eventbrite said the post "speaks for itself."

  • "[W]hen asked multiple times if Eventbrite would remove another event concerning women's sports that featured Riley Gaines, Eventbrite dodged the question."

  • Libs of TikTok was "problematic" on Slack because of its "specific audience."

  • Eventbrite cited "the overall tone and message" of the trailer for the What Is a Woman? documentary … "in combination with Matt Walsh's related public statements," rather than anything specifically violating in the film, for removing various organizations' viewing parties of the film or even unrelated events with Matt Walsh.

  • FAIR was deplatformed by Slack for being "affiliated with a known hate group" (likely referencing lists maintained by groups like the SPLC).

As I noted earlier, these organizations can be as vague, imperfect, or biased as they want in creating or enforcing their rules. And as the report concludes, "First Amendment protections generally do not apply to actions by private companies, who have the freedom to associate and do business with the customers they choose." Therefore, "the ideal solution is for market forces to correct Online Service Providers' discriminatory policies. If Online Service Providers continue to cancel conservative organizations, it should create a new demand for Online Service Providers that service the conservative market."

This is exactly right. Nothing is stopping new service providers from entering these fields to provide these important services, except the threat of additional tech regulation from those on the right and the left. Those who want more expression online should make the case to current and prospective service providers that policies limiting significant amounts of socially important speech may not be serving these companies or the broader society well. Those who want more restricted speech have no problem aggressively making their case for less speech, leveraging increasing sympathy in academia, government, and society with their views as various metrics point to increasing conflict over the value of free expression.

Those of us who want to see greater free expression must make the case not only for the First Amendment's legal protection against government censorship but also for a culture of free expression, including to the companies that are limiting vibrant discussions of important social issues.

The research done by the Senate report is part of this effort to promote a broader culture of free expression, but it goes a bit too far in seeking legislation that makes significant demands of private companies. Specifically, the report calls for intrusive transparency from companies regarding their policies and enforcement actions. While many of these recommendations may be best practices that I think companies would benefit from employing, I also know that companies sometimes don't want to provide complete transparency about their rules, often to prevent adversarial actors from gaming their policies.

Similarly, requiring specific transparency around why users are being punished or deplatformed appeals to our desire for due process and fairness, but companies don't need a reason, much less a clear or good one, to remove users from their services. Ultimately, their lack of transparency and due process creates bad experiences with their products that, left unchecked, will likely result in alternative solutions emerging in the marketplace, a reality the Senate report elsewhere recognizes.

Judge Learned Hand famously stated, "Liberty lies in the hearts of men and women; when it dies there, no constitution, no law, no court can save it; no constitution, no law, no court can even do much to help it. While it lies there it needs no constitution, no law, no court to save it." We must remind our fellow Americans why a culture of free expression is so important and how the free market can address bias and discrimination.