Look, if the government’s case is “this platform could be used for influence operations,” then congratulations, we have just discovered... the internet. A forced sale or ban might feel satisfyingly decisive, but it still leaves untouched the broader vulnerability: Americans live inside an attention economy where opaque algorithms, microtargeted persuasion, and shameless data extraction are the default business model. If Washington wants to protect the public from manipulation, it should regulate recommendation systems, require transparency, limit data harvesting, and fund digital resilience—not act like removing one app will purify the republic. That’s not strategy; that’s symbolic detox with bipartisan press-release energy.
There is also a serious free expression issue here that conservatives usually care about until the speech platform in question is annoying. TikTok is not just ByteDance headquarters; it is 170 million American users, creators, small businesses, activists, teachers, and yes, people making pasta at irresponsible hours. Courts have already shown skepticism toward blanket bans when the government cannot demonstrate that prohibition is a tailored, constitutional remedy for a concrete harm. That is why even the 2024 law was framed around divestment rather than a simple prohibition: lawmakers know the First Amendment problem is real. If the state can effectively erase a major communications platform by citing a broad future-risk theory, it had better bring more than "trust us, the vibes are classified."
And the practical question matters too: ban it, then what? Users migrate to Instagram Reels, YouTube Shorts, or the next app with a cleaner lobbyist wardrobe and the exact same surveillance incentives. Meanwhile, America hands China an easy propaganda talking point about hypocrisy on open internet values. A serious response would be: pass a federal privacy law, strengthen CFIUS-style scrutiny of foreign-controlled platforms, mandate algorithmic audits, and build rules that apply to everyone. Otherwise this is less “national security doctrine” and more “we found one villain for the trailer.”
The liberal case keeps trying to turn this into a seminar on the failures of late-stage surveillance capitalism, and fair enough, that is a real problem. But “America also has bad privacy rules” is not an answer to whether a platform tied to a strategic rival should have a direct pipeline into the phones, habits, locations, and media diets of millions of Americans. Two things can be true at once: Congress should pass stronger privacy laws, and TikTok presents a distinct national security problem because its parent company sits under a Chinese legal and political system where state access and pressure are not theoretical. If your house has bad locks, that is not a reason to ignore the guy from a hostile government trying the window.
The influence question is the bigger one, and it is where this debate gets uncomfortably real. TikTok is not merely a storage bin of user data; it is an engine of distribution. What people see, what gets buried, what trends during a war, an election, or a domestic crisis—those are power questions, not just content moderation squabbles. U.S. officials have repeatedly warned that control over the recommendation algorithm could be exploited for subtle propaganda or suppression, and the whole danger of modern influence operations is that they do not arrive wearing a cartoon villain cape. They show up as slight nudges, selective amplification, and a thousand invisible editorial choices at scale. That is precisely why “just trust the app” is not a serious national security framework.
And no, requiring divestment is not some wild book-burning crusade. It is a market-access condition: if you want to operate one of the most influential media platforms in America, do it without ownership tied to an adversarial regime. That is a narrower and more defensible policy than liberals pretend. The 2024 bipartisan push reflected the fact that after years of negotiations, proposed safeguards, and Project Texas-style reassurances, lawmakers concluded the structural conflict remained. At some point, “we should regulate everything eventually” becomes the political version of hitting snooze while the risk keeps blinking red. If ByteDance will not sever the connection, then yes, America should.