Four days ago, we wrote that the TikTok deal was "a masterwork of structured ambiguity, a deal that lets everyone claim victory while changing almost nothing." We were wrong about the second part. Things are changing. They're just changing in ways that make the ambiguity look less like compromise and more like cover.
The Glitches
On Sunday, TikTok US announced it was experiencing a "data center power outage" that had disrupted services. Users reported seeing old videos flooding their feeds—content from months or years ago surfacing in place of the algorithm's usually uncanny recommendations. The company assured everyone the disruption was temporary and technical in nature.
The timing is notable. The outage began days after the new ownership structure took effect, just as American oversight of TikTok's operations was supposedly kicking in. Coincidence, probably. But in the world of algorithmic platforms, "technical difficulties" have a way of being politically convenient.
Today, TikTok confirmed that the issues persist. The algorithm, the company says, is still recovering from the outage. Any changes users have noticed in their feeds are artifacts of the restoration process, not deliberate modifications.
The Word That Shall Not Be Sent
Then there's the Epstein situation.
Users began reporting over the weekend that they couldn't send the word "Epstein" in direct messages. The messages simply wouldn't go through. TikTok responded today that it doesn't "have rules against sharing the name Epstein in direct messages" and is "investigating why some users are experiencing issues."
This is, to put it mildly, a strange technical glitch. Messaging systems don't typically develop allergies to specific surnames. Content moderation systems do. The most charitable interpretation is that someone's overzealous filter caught a false positive. The less charitable interpretation is that someone decided Jeffrey Epstein's name—and the web of powerful people connected to it—shouldn't be easily discussable on the platform.
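To see how a filter could do this, consider a minimal sketch of a DM pipeline with a keyword blocklist. It is entirely hypothetical; we have no visibility into TikTok's moderation stack, and the blocklist contents and function names below are invented. The point is that one overbroad entry is enough to make a surname unsendable, and a silent drop looks identical to a technical failure from the sender's side.

```python
# A minimal, hypothetical sketch of a DM pipeline with a keyword
# blocklist. Nothing here is TikTok's code; the blocklist contents
# and function names are invented for illustration.

BLOCKLIST = {"scam-link.example", "epstein"}  # one overbroad entry

def passes_filter(message: str) -> bool:
    """Reject the message if any blocklisted term appears as a substring."""
    text = message.lower()
    return not any(term in text for term in BLOCKLIST)

def send_dm(message: str) -> str:
    # A silent drop is indistinguishable from a network failure on the
    # sender's side: the message "simply doesn't go through."
    return "delivered" if passes_filter(message) else "dropped"

print(send_dm("Did you read the Epstein story?"))  # -> dropped
print(send_dm("Did you read the story?"))          # -> delivered
```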
TikTok hasn't clarified which interpretation is correct. They're investigating.
The Investigation
California Governor Gavin Newsom isn't waiting for TikTok's internal investigation. He announced today that the state will launch its own review into whether TikTok is violating California laws by censoring content critical of President Trump.
The accusation is explosive: that the newly American-supervised TikTok is suppressing political speech unfavorable to the administration that approved its continued operation. If true, it would mean the platform traded Chinese government influence for American government influence—not exactly the national security improvement that justified six years of regulatory pressure.
TikTok has denied any political censorship. But the denial arrives in the same week as unexplained algorithm disruptions and mysteriously unsendable words. The pattern, whether intentional or not, looks bad.
The Deal's Design
None of this should be surprising. Look back at what the TikTok deal actually created:
ByteDance retained the algorithm—the recommendation engine that determines what 170 million Americans see. Oracle got oversight of data and deployment, but the code itself remains Chinese intellectual property, licensed to the new US entity. The American investors got equity and board seats, but operational decisions still flow through systems designed in Beijing.
The deal was structured to satisfy two governments with contradictory demands. China wouldn't allow algorithm transfer; America wouldn't allow continued Chinese control. The solution was a legal fiction: American ownership wrapped around Chinese technology, with monitoring that could catch data exfiltration but couldn't prevent algorithmic manipulation.
If someone at ByteDance—or someone with influence over ByteDance—wanted to adjust what American users see, the deal's structure provides limited protection. Oracle can audit the data. It cannot audit the intent behind recommendation weights. It cannot know why one video surfaces and another doesn't. The algorithm is a black box that remains, by design, opaque to its American overseers.
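A toy example makes the auditing gap concrete. The scoring function, field names, and weights below are invented for illustration; nothing here is ByteDance's actual system. The point is that two weight settings, one neutral and one with a small penalty on political content, both produce plausible-looking feeds, and an auditor who sees only the ranked output has no way to tell which weights generated it.

```python
# An invented toy ranker, not ByteDance's algorithm. It illustrates why
# auditing outputs cannot reveal the intent behind recommendation weights.

videos = [
    {"id": "cooking", "engagement": 0.90, "political": 0.0},
    {"id": "protest", "engagement": 0.85, "political": 1.0},
    {"id": "dance",   "engagement": 0.80, "political": 0.0},
]

def rank(videos, w_engagement, w_political_penalty):
    """Score each video and return ids in descending score order."""
    def score(v):
        return w_engagement * v["engagement"] - w_political_penalty * v["political"]
    return [v["id"] for v in sorted(videos, key=score, reverse=True)]

# Neutral weights and a lightly tilted set both yield plausible feeds:
print(rank(videos, 1.0, 0.00))  # ['cooking', 'protest', 'dance']
print(rank(videos, 1.0, 0.10))  # ['cooking', 'dance', 'protest']
```

An overseer auditing only the second list sees a normal-looking feed; the hypothetical `w_political_penalty` that demoted the protest video never leaves the black box. That is the gap between checking data flows and checking editorial decisions.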
What We're Watching
It's too early to know whether this week's incidents represent deliberate manipulation, genuine technical problems, or the inevitable friction of a complex corporate transition. TikTok deserves the benefit of the doubt on any individual incident.
But the incidents aren't individual anymore. In the span of four days:
- The algorithm experienced unexplained disruptions
- A politically sensitive word became unsendable
- A state governor launched a censorship investigation
- A new social platform surged on claims of "political impartiality"
This is what "structured ambiguity" looks like in practice. When everyone can claim victory, no one is accountable for failure. When oversight is designed to check data flows rather than editorial decisions, editorial decisions go unchecked. When the algorithm remains proprietary and opaque, we cannot distinguish between bugs and features.
The TikTok deal was celebrated as a solution that preserved American access to a beloved app while addressing national security concerns. What it actually preserved was uncertainty—about who controls the platform, about what content policies apply, about whether the oversight mechanisms can detect the manipulation they were designed to prevent.
Four days in, we're getting our first glimpse of how that uncertainty resolves. The algorithm is adjusting. The question is: adjusting to what?