Well, you know how it is. Head out the door for a long-planned vacation, pray nothing major erupts before you get back, and before you can so much as pack a suitcase the president announces that ByteDance must sell TikTok — or else TikTok will be banned in the United States.

The timing was abrupt. The authoritarian cast of President Trump’s remarks on the subject was disturbing. And yet, for those of us who have followed TikTok’s trajectory this year, nothing that has happened over the past two weeks can truly be said to be surprising. I wrote this here on January 7th:

I imagine that most within ByteDance would still rather avoid any of these scenarios, and just keep running TikTok as it is.

And yet — how likely does it seem to you that ByteDance will ultimately have that option? The company hasn’t caught a break on the regulatory front in recent memory. The trade war with China shows no signs of ending — or even de-escalating — this year. It no longer seems unlikely to me that TikTok could be reborn as an American citizen. And it might happen sooner than we all expect.

And then, sure enough, on Friday evening Trump — citing the results of a review by the Committee on Foreign Investment in the United States, or CFIUS — ordered ByteDance to divest itself of TikTok and any data it had gathered on Americans. This was the second time Trump had ordered a sale of the app in as many weeks; he previously gave the company 45 days, and Microsoft emerged as the most likely prospective buyer. Here’s Nicole Sperling in the New York Times:

President Trump doubled down late Friday on his previous actions against TikTok by formally giving ByteDance, the Chinese owner of the popular video-sharing app, 90 days to divest from its American assets and any data that TikTok had gathered in the United States.

“There is credible evidence that leads me to believe that ByteDance,” which merged TikTok with the American lip-sync app Musical.ly in 2018, “might take action that threatens to impair the national security of the United States,” Mr. Trump wrote in an executive order issued at 7:45 p.m.

How do we feel about this? As a rule of thumb, I only trust people’s opinions on TikTok if they are mixed. Alex Stamos made a nice Venn diagram last month that captured some of the ambiguities involved here. On one hand, with 100 million Americans using the app regularly, TikTok represents a vibrant hub of all kinds of speech — some that’s political, and much that isn’t — and to kill it off would represent a dire suppression of speech. That Trump’s decision appears to have been influenced by a recent stunt in which TikTok users registered en masse for one of his rallies and then no-showed only underscores the dangers of having national tech policy set by a thin-skinned strongman.

At the same time, as I’ve written here, TikTok’s biggest problems are largely outside its control. Despite a recent and noble offer to open up its algorithms for inspection, ByteDance suffers from the strong (and, I believe, justified) belief that the company could not withstand any serious effort from the Chinese government to obtain and misuse American user data — or use the app as a channel to manipulate our public discourse. Imagine the 2016 election, except with Facebook incorporated in Moscow: that is the basic fear that has driven much of the discussion around TikTok’s future, and I can’t say it feels particularly unjustified.

The best outcome here would have been a rigorous investigation, and possibly Congressional hearings, followed by a public report and set of recommendations for how to proceed. But Trump chooses to rule by chaos, crisis, and whim, and so now we have these artificial deadlines. Microsoft has utterly abased itself in its public actions to date, bizarrely promising to make a contribution to the US Treasury in order to secure the deal. (No less than Bill Gates has called the prospective TikTok acquisition “a poisoned chalice.”) But the outcome also seems preferable to TikTok disappearing entirely — putting thousands of employees out of work, roiling its nascent community of influencers, and making the world of consumer social apps even less competitive than it already is.

It’s for that reason that some tech reporters have mused that Facebook must be loving TikTok’s struggles. But as I first reported last month, Mark Zuckerberg told employees that a TikTok ban would set a terrible precedent. Zuckerberg echoed those comments in an internal meeting last week, Ryan Mac and Craig Silverman reported at BuzzFeed:

“I just think it’s a really bad long-term precedent, and that it needs to be handled with the utmost care and gravity whatever the solution is,” Zuckerberg said. “I am really worried…it could very well have long-term consequences in other countries around the world.”

When Facebook was founded, the idea of a truly global social network seemed obvious — inevitable, even. China’s decision to ban American social networks clouded that vision, though, and in the years since, the splintering of the internet has only accelerated. Now a single social network, TikTok, may find itself splintered into different companies all around the world. And given the high level of government outrage at Facebook and other tech platforms, it seems unlikely that TikTok would be the last. The splinternet is coming for every big tech company, and it’s not clear what will save them.

The Ratio

Today in news that could affect public perception of the big tech platforms.

Trending up: Facebook launched a voting information hub that will centralize election resources for US users. The goal is to counter the platform’s ongoing misinformation epidemic. (Taylor Hatmaker / TechCrunch)

Trending up: YouTube updated its policies on deceptive videos to ban videos containing information that was obtained through hacking and could meddle with elections or censuses. This includes hacked campaign emails with details about a candidate. It’s worth saying that this is a more forward-thinking policy than a lot of mainstream news organizations have! (Richard Nieva / CNET)

Trending down: Black employees at the Chan Zuckerberg Initiative (CZI) are speaking out about the organization’s blind spots regarding race and racial justice. Some say Mark Zuckerberg’s philanthropic efforts are stymied by the same desire to appear unbiased that critics of Facebook claim is causing real-world harm to Black communities. (Nitasha Tiku / The Washington Post)

Trending down: Google’s advertising platform algorithm is blocking articles about racism. The company does not allow advertising on content that disparages people on the basis of race, gender, or sexual orientation — but the system doesn’t always take context into consideration. (Aaron Mak / Slate)

Trending down: Pinterest employees staged a virtual walkout on Friday to protest discrimination at the company. The news follows allegations by three former employees about race and gender inequity. (Zoe Schiffer / The Verge)

Governing

Facebook’s algorithm “actively promotes” Holocaust denial content, according to an investigation by the Institute for Strategic Dialogue (ISD). The UK-based group found that typing “holocaust” in the Facebook search bar brought up suggestions for denialist pages, which in turn recommended links to publishers which sell denial literature. This story got a good bit of attention even though Facebook’s decision to allow Holocaust denial on the platform is more than a decade old. Here’s The Guardian’s Mark Townsend:

The ISD also discovered at least 36 Facebook groups with a combined 366,068 followers which are specifically dedicated to Holocaust denial or which host such content. Researchers found that when they followed public Facebook pages containing Holocaust denial content, Facebook recommended further similar content.

Jacob Davey, ISD’s senior research manager, said: “Facebook’s decision to allow Holocaust denial content to remain on its platform is framed under the guise of protecting legitimate historical debate, but this misses the reason why people engage in Holocaust denial in the first place.”

The renewed onslaught of fake news about Kamala Harris highlights how Trump and his campaign are eager to use misinformation to try to win in November. Some media outlets, including the zombie remnants of what used to be Newsweek, have shown that they’re willing to participate. Here’s Rebecca Heilweil at Recode:

There has been a significant spike in misinformation about Harris in the few days since she was made the VP candidate. According to research from the media intelligence firm Zignal Labs, there have been more than 150,000 instances of people sharing, discussing, or promoting misinformation online related to Harris in the past week. Meanwhile, the progressive research group Media Matters found that right-leaning Facebook pages they analyzed posted about Harris twice as much as left-leaning groups in the past week, and the right-leaning posts saw 50 percent more engagement than those on more liberal pages.

With Marjorie Taylor Greene’s primary victory in Georgia, QAnon has officially gone mainstream. An internal Facebook investigation also found millions of members across thousands of QAnon groups and pages. (Charlie Warzel / The New York Times)

QAnon has emerged in recent months as a centralized hub for conspiracy and alternative health communities. Users who started off in wellness communities and religious groups on Facebook, Twitter and Instagram were introduced to extremist groups like QAnon during the pandemic, and helped fuel the anti-mask phenomenon. (Ben Collins / NBC)

Facebook announced a new policy to crack down on political content disguised as local news, but it’s not as strict as it appears. The rule won’t touch sites funded by political actors, only those they own or lead. (Rob Pegoraro / Forbes)

After Facebook determined that T. Raja Singh, a prominent Indian politician, had violated the company’s hate-speech rules and should be banned from the platform, he remained active on both Facebook and Instagram. The company’s top public-policy executive in India opposed the ban, saying it would damage Facebook’s business prospects in the country. (Newley Purnell and Jeff Horwitz / The Wall Street Journal)

Tech platforms should commit to protecting democracy and democratic participation as an expression of their own values, argues this opinion writer. The right to vote is no less fundamental a right than free expression. Platforms should embrace them both. (Eileen Donahoe / The Hill)

Here’s what Big Tech is doing to counter election misinformation on their platforms. They’re going all in on civic engagement efforts ahead of November’s election. (Sara Fischer and Ashley Gold / Axios)

Joe Biden outspent President Trump on Facebook ads last week, for the first time since June 6th. The news follows Biden’s selection of Kamala Harris as his running mate, an announcement that could drive fundraising. (Salvador Rodriguez / CNBC)

Congress still has questions for the Big Tech CEOs in the wake of the antitrust hearing. Among them: Rep. Jamie Raskin (D-MD) asked Facebook whether the company has made any “concessions” to political organizations “to address perceived political or ideological bias.” (Cristiano Lima / Politico)

Canada’s antitrust watchdog launched an investigation into Amazon, to see whether its behavior has hurt consumers and smaller companies. The bureau is examining whether sellers can operate a successful business on the marketplace without using Amazon’s fulfillment or advertising services. (Annie Palmer / CNBC)

Germany’s antitrust authority has launched an investigation into Amazon’s relationship with third-party sellers. It’s looking at how the company influences the prices sellers set on the platform. (Reuters)

Google gave detailed personal information about far-right users to law enforcement agencies working on counterterrorism, according to leaked documents. In some cases, Google didn’t ban the users it reported; some still have accounts on YouTube and Gmail. (Jason Wilson / The Guardian)

Google removed 197 links that previously came up as search results related to a German princess’s rant about killing Muslims. Her family used Europe’s right to be forgotten law to try and make the content disappear. (Jordan Wildon / Vice)

The Global Internet Forum to Counter Terrorism, established in 2017 by Facebook, Microsoft, Twitter, and YouTube, has been called “the most underrated project in the future of free speech.” Today it’s making some of the most consequential decisions in online speech governance without the scrutiny of the public. (Chloe Hadavas / Slate)

Immigration and Customs Enforcement (ICE) signed a contract with the facial recognition company Clearview AI for “mission support.” In May, Clearview said it would stop selling its app to private companies. (Kim Lyons / The Verge)

The Secret Service bought location data from a service called Locate X. The service, from Babel Street, tracks devices anonymously using data harvested by popular apps installed on people’s phones. (Joseph Cox / Vice)

The man who helped rally India against Facebook now says the country has gone too far in the direction of digital nationalism. He’s calling for India and other democratic governments around the world to work together to build a better internet, rather than building walls around their own national versions of it. (Will Oremus / OneZero)

Industry

TikTok is partnering with the music distribution company UnitedMasters to allow artists to share their songs directly from the app to streaming services like Apple Music, Spotify and YouTube. It may be the most significant transaction to date for Kevin Mayer, TikTok’s new CEO. The New York Times reports:

Despite the uncertainty of TikTok’s fate in the U.S., the UnitedMasters deal shows that the company is not standing still, even if the benefits of the new partnership will probably accrue to a new owner.

It’s an effort to deepen relationships with influential artists who use the app. TikTok’s young and engaged audience has helped songs go viral, jump-starting the careers of musicians like Lil Nas X and BMW Kenny. Trying to keep these creators engaged with the app is particularly important as TikTok faces competition from deep-pocketed rivals like Facebook’s Instagram, which recently launched a TikTok clone called Reels.

Facebook is testing short-form videos in the Facebook app in India, its biggest market by users. The “Short Videos” feature has a dedicated section within the news feed. (Manish Singh / TechCrunch)

Telegram launched one-on-one video calls on both its Android and iOS apps. The company said 2020 had “highlighted the need for face-to-face communication.” (Kim Lyons / The Verge)

The Army and Navy have both resumed streaming on Twitch in an effort to recruit Gen Z. Both branches were criticized for banning users who asked about war crimes, a move some characterized as violating users’ First Amendment rights. (Shannon Liao / CNN)

Pinterest appointed Andrea Wishom, President of real estate company Skywalker Holdings, to its board of directors. The appointment makes her Pinterest’s first Black board member and third female board member. (Megan Rose Dickey / TechCrunch)

Google is lobbying Australian consumers against a proposed regulation that would make it pay media outlets for news content. A pop-up on the Google homepage reads “the way Aussies use Google is at risk” and “their search experience will be hurt by new regulation.” Just their opinion! (Jon Porter / The Verge)

Here’s how social media platforms manipulate you into giving away more of your data. The tactics known as “dark patterns” persist even as companies say they’re giving users increased control over their information. (Arielle Pardes / Wired)

And finally…

Talk to us

Send us tips, comments, questions, and what you did on your summer vacation: casey@theverge.com and zoe@theverge.com.