Article 19

16 April 2024

Blog: Reframing the TikTok debate – the need for a human rights approach

By Michael Caster, Asia Digital Programme Manager, and Chantal Joris, Senior Legal Officer

In March, the US House of Representatives passed a bill that would mandate that TikTok's owner, the Chinese technology firm ByteDance Ltd, sell the US version of the social media platform within 180 days or face its prohibition in US app stores. The bill's fate in the Senate remains uncertain. The White House has expressed its support. In its current form, however, the bill is certain to end up in the courts.

This is not the first US effort to ban TikTok. In 2020 the Trump Administration issued an overbroad executive order prohibiting business with ByteDance, which was later revoked. At the state level, an effort to ban TikTok across all of Montana is currently on appeal following a November 2023 ruling.

Legislators cite concerns about the Chinese government's potential access to the personal data of TikTok's estimated 170 million US users, and its ability to influence American public opinion by shaping the content that those users see.

The fears regarding TikTok's ownership, as well as concerns around data privacy, censorship, and manipulation of the information space, are well founded. Yet the bill fails to address any of the human rights issues associated with TikTok, creating more problems than it solves.

The TikTok Trojan Horse

Internal content moderation documents leaked in 2019 and reviewed by the Guardian included instructions to censor content mentioning the Tiananmen Square Massacre, Tibetan independence, Uyghur human rights, or other topics sensitive to Beijing. Many of the guidelines for China-related content restrictions fell under a section on 'hate speech and religion'. One overbroad rule called for a ban on 'highly controversial topics'.

At the time of the leak, ByteDance claimed that these instructions had already been abandoned. At a UK parliamentary hearing in late 2020, a TikTok executive, Elizabeth Kanter, admitted to such past censorship policies while claiming they were no longer in force. During his Congressional testimony in March 2023, the company's CEO, Shou Zi Chew, made a contradictory statement, claiming that TikTok has never censored sensitive topics. Meanwhile, a Rutgers University report published in late 2023, after Chew's testimony, found that the same topics targeted by the supposedly abandoned censorship policies were dramatically underrepresented on TikTok compared to other platforms.

A 2021 report by Citizen Lab revealed that TikTok retains much of the same source code as its Chinese counterpart Douyin, including capabilities to perform server-side censorship of platform search results for content labelled as 'hate speech', 'suicide prevention', and 'sensitive', terms noteworthy in their similarity to those previously used to guide censorship.
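To illustrate the general technique Citizen Lab describes, the sketch below shows, in purely hypothetical Python, how server-side, label-based filtering of search results can work: the server drops flagged items before responding, so the client never learns what was removed. This is not TikTok or Douyin code; the label names echo the report, but every data structure and function here is an assumption for illustration only.

```python
# Illustrative sketch only: a generic server-side search filter that drops
# results carrying restricted labels before they are returned to the client.
# The label names echo those cited in the Citizen Lab report; the structures
# and function are hypothetical, not TikTok or Douyin code.

from dataclasses import dataclass, field

RESTRICTED_LABELS = {"hate_speech", "suicide_prevention", "sensitive"}

@dataclass
class SearchResult:
    video_id: str
    title: str
    labels: set[str] = field(default_factory=set)

def filter_search_results(results: list[SearchResult]) -> list[SearchResult]:
    """Return only results whose labels do not intersect the restricted set.

    Because this runs server-side, the client never sees which items were
    removed, which is part of what makes such filtering hard to detect
    from the outside.
    """
    return [r for r in results if not (r.labels & RESTRICTED_LABELS)]

if __name__ == "__main__":
    sample = [
        SearchResult("v1", "cooking tutorial"),
        SearchResult("v2", "news clip", labels={"sensitive"}),
    ]
    print([r.video_id for r in filter_search_results(sample)])  # ['v1']
```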

Although the Citizen Lab report found no overt data transmission from TikTok to China-based servers, it acknowledged that data could be routed through non-China servers on its way to servers in China, increasing the likelihood of the Chinese government gaining access once the data is stored there. Nor does this dispel concerns about China-owned servers outside the country or other potential avenues of access. In at least one documented case in 2022, ByteDance employees accessed the IP addresses of two reporters with BuzzFeed and the Financial Times who were using the TikTok app, in an effort to track down the sources of a possible leak. While the company fired those responsible, the incident disproved ByteDance's claims that it could not monitor US users' data.

As for the user data TikTok collects, the report found that it gathers similar types of data to other popular social media platforms, which is already problematic, and that it makes use of third-party advertising and tracking services. The report's author states that it is unclear how TikTok would respond to data requests from the Chinese government, but Chinese law imposes clear censorship and surveillance obligations on all Chinese companies, including TikTok's parent, ByteDance.

There are thus certainly significant freedom of expression and privacy concerns linked to government-controlled platforms, all the more so when the government has China's track record of abusing human rights. Yet this should be treated as a human rights problem rather than a national security problem, and efforts to address the adverse human rights implications of Chinese technology companies, along with broader social media regulation, should be guided by international human rights norms and internet freedom principles.

TikTok is a human rights problem

Despite the human rights issues associated with TikTok, the bill fails to offer appropriate responses and raises its own significant human rights concerns. Beyond First Amendment free speech protections, the US has also ratified the International Covenant on Civil and Political Rights (ICCPR). It requires that any restrictions on freedom of expression - including restrictions imposed on platforms for exercising the right to free speech - satisfy the so-called three-part test: the principles of legality, legitimacy, and necessity and proportionality. In particular, the least restrictive measure must be applied if it is capable of achieving the same purpose as a more restrictive one. While national security constitutes a legitimate aim for limiting freedom of expression, the Johannesburg principles on national security and freedom of expression provide that a government must demonstrate the threat to national security and show how any restriction complies with the three-part test. The burden of demonstrating the validity of the restriction in question rests with the government.

While the bill does not propose an outright ban on TikTok in the US, there are serious risks that the platform would no longer be available to US users in its current, highly popular form. In practical terms, the sale will be difficult to achieve. There is some precedent: in 2019, the Committee on Foreign Investment in the United States (CFIUS) ordered the Chinese owners of Grindr to sell the LGBTQ+ dating app, based on vague reasoning that was unlikely to comply with international freedom of expression standards, and the owners agreed to sell. Yet with TikTok the situation is more complicated, not least because ByteDance is almost certainly going to refuse to divest. In addition, the sale price would be exorbitant, and a buyer difficult to find within the set timeframe. While there have been initial suggestions that TikTok could be bought without its algorithm and rebuilt 'from scratch', this would result in a wholly unrecognisable service and ultimately a different platform for its users.

This means that the bill could ultimately result in a ban that deprives TikTok users of the platform. The consequences would have been less drastic had regulation of foreign ownership been adopted before TikTok became so immensely popular in the US, which raises further questions as to whether Federal Communications Commission (FCC) rules on foreign ownership should be revised to explicitly address digital media. Today, a ban would deprive millions of TikTok users in the US of their ability to engage in protected expression, share opinions, or access a vast array of content ranging from comedy to politics - unlikely to be a proportionate response to latent national security concerns that the US government has not substantiated with public evidence as unique to TikTok.

A human rights-compliant approach, on the other hand, would seek to protect users' free expression and privacy through structural changes applied universally to all platforms. Yet human rights considerations have not always been central to this debate. For example, there appears to be no requirement that a new buyer have any regard for data privacy or human rights, or put safeguards in place against the underlying model of surveillance capitalism, under which China could quite easily purchase TikTok user data via third-party data brokers, or against the same kind of online threats and harassment China-backed actors already employ on other social media platforms.

It is furthermore unclear how mere US ownership would prevent China-backed censorship or information manipulation from playing out on TikTok. After all, American tech giants from Apple to Microsoft have not been shielded from complicity in globalising censorship at the behest of China, and Chinese propaganda is rife on X, formerly Twitter. What is more, divestment without structural safeguards may actually embolden Chinese actors to weaponise TikTok further once freed from the need to appear independent from Beijing.

A US ban could set a troubling precedent globally, potentially influencing other countries to follow suit, especially those already exploiting national security justifications to impose strict internet controls. This could ultimately affect social media users elsewhere, empowering other governments to impose similar restrictions, including bans, on TikTok or on US-based platforms in favour of national products over which censorship or data access would be even easier. There is also no doubt that a shift towards local ownership of global platforms and data localisation has the potential to grant governments excessive leverage over platforms, complicating efforts to promote universal human rights online.

TikTok is already banned on US government devices, and on government devices in countries from Canada and the UK to India and Taiwan, where, for example, it was systematically used to spread Chinese disinformation ahead of the presidential election this January. International law is more accommodating of such restrictions within private systems, like government or university networks, but restrictions affecting the broader public must meet the narrow parameters set out in the ICCPR. Of course, TikTok and US social media platforms are banned in China, but reciprocity for reciprocity's sake, as satisfying as it may feel, is not a good policy for reversing human rights harms.

The role of Chinese tech companies and China's influence over global tech companies raises serious human rights and internet freedom concerns, but oversimplifying the TikTok problem as one of national security wholly distinct from the broader big tech problem means embracing solutions that may do little to hold either China or big tech accountable.

Ultimately, this bill is unlikely to fundamentally disrupt China's adverse influence over the digital space and, without significant changes, will fail to solve the fundamental problems posed by the biggest tech platforms, such as data privacy and algorithmic manipulation stemming from surveillance capitalism. It does very little to limit the power that tech giants, including the likes of Google or Meta, exert over public discourse; if anything, it weaponises consolidation and digital monopolies for geopolitical reasons, leaving unchecked the structural problems such monopolies raise for free expression and democracy.

Toward a human rights solution

There are ways to address China's adverse influence over global tech companies, regardless of their ownership, but the approach should be through structural, human rights-compliant, and universally applied policies. Greater investment in civic tech and human rights-based people-public-private partnerships is certainly one such approach, especially for its emphasis on transparency and community engagement.

Congress should focus on passing mandatory human rights due diligence legislation, in line with the UN Guiding Principles on Business and Human Rights (UNGPs), for all US companies. Such due diligence should consider the human rights risks associated with companies' global operations and their vulnerability to complicity in censorship as a result of economic or political leverage, a concern that is not unique to China but is especially acute there. This applies to a range of US companies, including companies like Apple, demonstrating that the problem of Chinese influence over the digital space is more complex than TikTok or social media generally.

Rather than seeking greater consolidation of the market through the possible absorption of TikTok by another leading tech company, Congress should be strengthening competition laws focused on tackling the excessive market power of social media giants and promoting user access and exposure to a diversity of voices. For example, requiring platforms to unbundle hosting from content curation activities, and obliging large platforms to allow third parties to offer content curation to users, could reduce the influence of any single actor, including foreign governments, over what users get to see or share online.
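To make the unbundling idea concrete, here is a minimal, purely illustrative sketch in Python of one way a hosting platform could hand ranking decisions to interchangeable third-party curators chosen by the user. All class and function names are hypothetical; this is not a reference to any existing platform's API, only a sketch of the architectural separation the paragraph above proposes.

```python
# Illustrative sketch only: hosting is separated from curation. The platform
# supplies raw hosted posts; independent curators implement a shared ranking
# interface, and the user chooses which curator orders their feed.

from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author: str
    text: str
    likes: int

class Curator(ABC):
    """Interface a third-party curation service would implement."""
    @abstractmethod
    def rank(self, posts: list[Post]) -> list[Post]: ...

class ChronologicalCurator(Curator):
    def rank(self, posts: list[Post]) -> list[Post]:
        return list(posts)  # no reordering: show posts as hosted

class PopularityCurator(Curator):
    def rank(self, posts: list[Post]) -> list[Post]:
        return sorted(posts, key=lambda p: p.likes, reverse=True)

def build_feed(hosted_posts: list[Post], curator: Curator) -> list[Post]:
    # The hosting platform supplies the content; the user-chosen curator
    # decides the ordering, so no single actor controls what is surfaced.
    return curator.rank(hosted_posts)
```

Under such a model, switching curators would change only how content is ordered and surfaced, not what is hosted, which is the structural separation the proposal relies on.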

The Chinese government should not be able to easily access or acquire American data, but it already can, well beyond TikTok. The United States needs a robust personal data protection law, which should include not only transparent and enforceable limitations on the type and volume of user data collected but also explicit prohibitions on transferring data to foreign actors with records of human rights abuse. The recently passed House bill prohibiting data brokers, a serious part of the problem, from transferring sensitive data to 'foreign adversaries' such as China, Iran, and Russia is a step in the right direction. A comprehensive data protection regime would not only prevent China from accessing user data on TikTok but also apply across the tech spectrum in truly meaningful ways.

While pursuing greater privacy protections for users, Congress should likewise be working to enforce greater transparency across the tech sector, from tracking advertising and related online spending to labelling requirements for China-affiliated and other state-affiliated actors engaged in information manipulation online. Particularly given what makes TikTok so popular - its recommender algorithm - transparency regulations must include provisions for algorithmic accountability, to more readily expose when the content users see is being manipulated for influence.

The current bill's reliance on US app stores to enforce restrictions on TikTok calls to mind the rampant censorship of mobile applications on the Apple App Store in China. Noting this parallel, rather than seeking to block certain applications on US app stores, Congress should arguably instead be calling for greater testimony and solutions from US tech executives whose companies have significant business operations in China, as to the degree to which they have conducted human rights due diligence. Within established policymaking limitations over the private sector, Congress should furthermore examine regulatory approaches to hold accountable US companies whose business operations with China clash with their corporate responsibility to respect human rights, as outlined in the UNGPs. This would close a gaping hole through which China exerts influence over the digital space.

At the end of the day, with so much attention on fixing TikTok, why not treat this as a test case for pursuing truly radical solutions? Rather than merely shifting TikTok's ownership to an existing or newly incorporated big tech entity, here is an opportunity to explore a fundamentally new approach to ownership and oversight: a transparent, multi-stakeholder, independent social media council, or a new community-oriented digital public infrastructure, either of which would be far more impactful in protecting user rights.