My Concerns about the TikTok Divestiture Bill as a Software Researcher/Developer

TikTok Divestiture Bill Has Passed US House

House Resolution 7521, dubbed the “Protecting Americans from Foreign Adversary Controlled Applications Act,” passed the U.S. House of Representatives on a bipartisan 352-65 vote today. The current text of the bill as passed can be found here. It’s pretty short and easy to read, so I read it, and I have some concerns.

Let’s break down what the proposed law does, where I think it makes some easy mistakes, and how it can be improved. I’m not going to take a stand on whether TikTok should be sold or banned in this post. I will focus on how best to force the sale of TikTok if forcing that sale is the action Congress decides it wants to take.

I am a software developer with a strong interest in tech policy but no law degree, so you should take this not as a skilled legal analysis, but as an analysis of my concerns about how this legal proposal might interact with technology.

How the TikTok Divestiture Bill Works

First, this bill isn’t actually aiming to ban TikTok. It really wants to force ByteDance to sell TikTok to someone else who is not controlled by China. It does that by threatening to make it illegal to distribute the TikTok app to Americans unless ByteDance sells TikTok within 180 days of the law taking effect.

When I pull out my “Congress-to-English” dictionary and read the bill generously, I think what the bill is trying to say is that if ByteDance doesn’t sell TikTok, U.S. based app stores have to take down TikTok for U.S. users and U.S. web hosts must not host the TikTok web app for U.S. users. Unfortunately, when I read the text, that’s not actually what they have said, because the Internet is complicated and it’s hard to write clear rules about it without causing unintended consequences.

Getting into the nitty gritty, the bill defines a category of applications it calls “foreign adversary controlled applications”. A “foreign adversary controlled application” is defined in Section 2(g)(3) as a:

“website, desktop application, mobile application, or augmented or immersive technology application that is operated, directly or indirectly (including through a parent company, subsidiary, or affiliate), by a covered company that … is controlled by a foreign adversary; and that is determined by the President to present a significant threat to the national security of the United States…”

It also specifically defines ByteDance Ltd’s operation of TikTok as meeting this definition. It uses the definition of foreign adversary from 10 USC § 4872(d)(2), which includes North Korea, China, Russia, and Iran.

It bans any “entity” (person, company, etc) “within the land or maritime borders of the United States” from:

“(a) Providing services to distribute, maintain, or update such foreign adversary controlled application (including any source code of such application) by means of a marketplace (including an online mobile application store) through which users within the land or maritime borders of the United States may access, maintain, or update such application.

(b) Providing internet hosting services to enable the distribution, maintenance, or updating of such foreign adversary controlled application for users within the land or maritime borders of the United States.”

It enforces this by setting a fine of $5,000 per user per day for violating Part A or $500 per user per day for violating Part B. If you violate either of these prohibitions, the U.S. Attorney General can sue you to get the money and to get a court order to make you stop doing whatever it is you are doing. It provides a specific exemption where the government cannot sue “an individual user of a foreign adversary controlled application”, so it won’t make it illegal to use TikTok, just to distribute/host it.
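
To get a feel for the scale of those fines, here is a quick back-of-the-envelope sketch. The 170 million figure is my own assumption, based on TikTok’s commonly reported U.S. user count at the time of the vote; it does not appear in the bill.

```python
# Hypothetical single-day fine exposure under the bill's penalty scheme.
# The 170M US-user count is an assumption, not a number from the bill.
US_USERS = 170_000_000
PART_A_FINE = 5_000  # per user, for marketplace/distribution violations
PART_B_FINE = 500    # per user, for internet hosting violations

part_a = US_USERS * PART_A_FINE
part_b = US_USERS * PART_B_FINE
print(f"Part A exposure: ${part_a:,}")  # Part A exposure: $850,000,000,000
print(f"Part B exposure: ${part_b:,}")  # Part B exposure: $85,000,000,000
```

In other words, even at a fraction of TikTok’s reported user base, the per-user math makes compliance the only realistic option for any large platform.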

It does all this threatening to motivate ByteDance to make what it calls a “qualified divestiture” of TikTok, or in other words a sale of TikTok to a company not controlled by a foreign adversary. TikTok could then continue to be distributed and hosted by U.S. entities for U.S. users.

This Enforcement Mechanism Kind of Sucks?

There have been previous attempts by U.S. states, most notably Montana, to ban TikTok, and courts have struck them down as unconstitutional because they violate the First Amendment. The promoters of this bill hope to draft it in such a way that it does not fall victim to the same fate. I think they have failed.

The main reason I think they have failed is that their enforcement mechanism, the part that defines what would be illegal if TikTok refuses to sell, sucks pretty hard. It basically says that if ByteDance fails to sell, then Americans are not allowed to share, or facilitate the sharing of, software written by ByteDance, including its source code, with other Americans. This is a very broad prohibition, a dangerous precedent, and ultimately I think a bigger hammer than is needed for this particular nail.

Classic Constitutional Concerns

First, this would clearly at least raise a constitutional issue based on the Bernstein v. United States precedent that held that software source code was speech protected by the First Amendment.1 As a hypothetical example to demonstrate the parade of horribles, this would seem to preclude a researcher who discovers some mechanism TikTok uses to censor posts about e.g. Tiananmen Square from sharing the source code or binary artifacts to substantiate their claims. After all, couldn’t those artifacts enable the distribution or updating of TikTok? If they could, and the researcher hosted them on the Internet to share them, then the researcher would be providing internet hosting services to enable the distribution or updating of a foreign adversary controlled application. This bill claims to make that illegal. Even if it’s ultimately determined that this bill does not prohibit that sort of thing, it still might require the researcher to get an expensive (thousands of dollars) legal opinion or to be subject to a preemptive takedown by an overzealous host. That is the classic “chilling effect”.

Over-Broadness Concerns

Second, how the bill defines “hosting services” would seem to reach pretty far into the infrastructure of the Internet to effect its prohibitions. This is especially notable because, for a speech regulation to be constitutional, it must, among other things, use the “least restrictive means” to achieve its goal. Laws that infringe on more speech than strictly necessary to accomplish the government’s “compelling purpose” are not constitutional.

The bill defines hosting services as:

“a service through which storage and computing resources are provided to an individual or organization for the accommodation and maintenance of 1 or more websites or online services, and which may include file hosting, domain name server hosting, cloud hosting, and virtual private server hosting.”

The way I read this, I think it would require Verisign to de-platform the domain “tiktok.com” from the “.com” name servers. After all, the “.com” servers at the very least provide storage resources to maintain the record of which authoritative name servers are responsible for the “tiktok.com” domain, which in turn maintains the TikTok web app. This prohibition also has a very broad reach. Part B above prohibits U.S. entities from providing hosting services to anyone who enables the distribution or maintenance of TikTok, so I think its effect stretches beyond providing DNS services to ByteDance directly and also prohibits providing DNS services to anyone who enables the distribution of TikTok to U.S. users. So, for example, source code distribution websites or file hosting websites would have to take down TikTok source code or binary files, or else risk having their domain names revoked.
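
To make the Verisign point concrete, here is a toy model of the DNS delegation chain. The server hostnames are illustrative placeholders, not live DNS data; the point is simply that Verisign’s “.com” zone stores the delegation record that makes “tiktok.com” resolvable at all.

```python
# Toy model of DNS delegation for "tiktok.com". Hostnames are
# illustrative, not real records.
DELEGATIONS = {
    # The root zone delegates ".com" to Verisign's gTLD servers.
    "com.": ["a.gtld-servers.net."],
    # Verisign's ".com" zone stores the NS record delegating
    # "tiktok.com." to its authoritative servers -- this record is
    # the "storage resource" I am talking about above.
    "tiktok.com.": ["ns1.example-dns.net."],
}

def resolve_chain(domain):
    """Walk the delegation chain from the TLD down to the domain."""
    labels = domain.rstrip(".").split(".")
    chain = []
    # Build suffixes longest-TLD-first: "com.", then "tiktok.com."
    for i in range(len(labels) - 1, -1, -1):
        zone = ".".join(labels[i:]) + "."
        if zone in DELEGATIONS:
            chain.append((zone, DELEGATIONS[zone]))
    return chain

for zone, servers in resolve_chain("tiktok.com"):
    print(zone, "->", servers)
```

Delete the “tiktok.com.” entry from the “.com” zone in this model and resolution fails at the first step, which is exactly what de-platforming the domain at the registry level would mean.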

Finally, a related over-broadness concern I have is whether this bill would affect “peering”, or in other words, whether it would prohibit U.S. Internet Service Providers (ISPs) from routing traffic to a TikTok service hosted abroad. I think it all turns on what “accommodation and maintenance” means. I have not seen this term used before in an Internet law, so I think it is wide open to interpretation. On one hand, I could see a stretch argument that a U.S. ISP allowing access to a TikTok service hosted abroad is providing “computing resources” by which the “accommodation and maintenance” of an “online service” that “enables the distribution of the foreign adversary controlled application” is accomplished. On the other hand, to me as a software guy, peering is pretty different from the examples given of file hosting, virtual private server hosting, and so on, but I don’t want to read too much into that, because I’m not sure a federal judge would see it that way.

There Is A Better Way

I think the drafters of this bill could easily dodge these First Amendment issues, and still achieve their desired effect of threatening to de-platform TikTok from American platforms for American users, by taking inspiration from sanctions law. U.S. law already has the concept of the Specially Designated Nationals list (SDN list), a list of companies and people that U.S. persons and businesses are prohibited from doing business with. There might be some concern that ByteDance does not currently meet the definition to be listed on the SDN list, but if Congress is already passing a law on this issue, it can specifically clarify that makers of “foreign adversary controlled applications” can be listed on the SDN list. I think that would be much cleaner and easier than setting up a new prohibition mechanism just for TikTok.

It would have basically the same desired practical effect: preventing U.S. businesses from doing business with ByteDance would effectively cut it off from the Apple App Store, the Google Play store, U.S. web hosts, cloud providers, CDNs, and any U.S. domain registrar. It would also prevent ByteDance from selling advertisements to U.S. companies and from making deals with U.S. record companies to include their music libraries in the video editor. At the same time, it would not prevent American researchers from publicly sharing ByteDance-produced source code in their work, or American ISPs from routing traffic to foreign ISPs that later serve TikTok. It would still clearly kill TikTok. It just kills it without threatening a bunch of other probably-constitutionally-protected conduct as a side effect.

I think this approach is much, much better. Tech platforms are already very adept at dealing with the SDN list; they know what they have to do when a company is listed on it. There is a lot of clarity there. I also think the existence of this alternative approach is a strong argument that the current bill’s enforcement mechanism is not the least restrictive means of accomplishing the compelling government interest, and thus does not pass constitutional muster.

This Bill Might Go Further Than TikTok?

Another concern I have with this bill is the open-ended ability to designate other apps as “foreign adversary controlled applications”. I think that’s fine in principle, and probably necessary to have a way to react to future threats without another act of Congress. However, the particular conditions laid down by the bill are probably too lax. For one, despite using the word “controlled”, the bill actually only requires a twenty percent foreign stake to trigger eligibility for designation. (See Section 2(g)(1)(B).) It’s not at all uncommon for even American-run social media apps to have large stakes owned by foreign people and companies. For example, the Chinese company Tencent reportedly owned (and may still own) a 12 percent stake in Snapchat. A company would not become automatically designated just because it is 20-percent owned by Chinese investors; designation would still require action by the President. However, I tend to think that any law that hinges on the hope that one person will do the right thing is a bad idea, because hope is not a plan. It’s pretty easy to imagine, for example, a U.S. president threatening to designate a U.S. social media platform as “foreign adversary controlled” if that platform moderates his posts in a way he does not approve of.
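
The two-part structure of the designation test can be sketched in a few lines. This is my reading of the bill’s logic, not an authoritative restatement of it, and the threshold value is the one stated in Section 2(g)(1)(B):

```python
# Sketch of the designation test as I read it: a 20% foreign-adversary
# stake makes a company *eligible*, but an actual Presidential
# determination is still required on top of that.
FOREIGN_ADVERSARY_THRESHOLD = 0.20

def designated(foreign_adversary_stake, presidential_determination):
    """Both conditions must hold for an app to be designated."""
    return (foreign_adversary_stake >= FOREIGN_ADVERSARY_THRESHOLD
            and presidential_determination)

# Tencent's reported ~12% Snap stake alone falls below the threshold:
print(designated(0.12, True))   # False
# But a 20% stake plus a willing President is all it takes:
print(designated(0.20, True))   # True
```

The worry is precisely that second parameter: everything past the 20% gate comes down to the discretion of one office.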

It’s also notable that I think this definition would cover Telegram, the very popular Russian app that is widely used for monitoring ransomware actors and for open-source intelligence on Russian military operations. I’m not going to take a stance here on whether the U.S. should try to force the sale of Telegram, but it is worth thinking about whether that’s the sort of thing we want to empower the government to do, or whether we want to further restrict the kinds of national security harm that can merit this kind of action.

U.S. Senate: Cleanup On Aisle HR 7521

Luckily, there is still plenty of time to fix this. It’s unclear when or if this bill will be taken up by the Senate. If it is, I strongly urge the Senate to fix the issues I laid out here. We can all avoid a lot of heartache with just an ounce of careful legislative drafting.

  1. This case only made it to the Ninth Circuit Court of Appeals, so this precedent only applies in the Ninth Circuit, but it bolsters the point that the free speech concerns here are credible.