TikTok’s recommendation algorithms have caught the world’s attention and made the app a growing success. Now, the mechanisms underlying the hyper-personalized user feed curation feature are catching the attention of children’s rights and digital privacy advocates. And not in a good way.

On Thursday, a coalition of 20 child advocacy, privacy, and consumer organizations filed a complaint with the Federal Trade Commission requesting an investigation into TikTok’s data collection and privacy practices for minors.

Two participating organizations, the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, said in a press release that TikTok has continued to violate the Children’s Online Privacy Protection Act (COPPA) even after reaching a $5.7 million settlement with the FTC in February 2019 over the conduct of the app’s predecessor, Musical.ly.

Under the federal law, online services must obtain parental consent before collecting personal data from users under the age of 13. As part of the 2019 settlement, TikTok agreed to destroy all such personal information it had collected and to obtain parental permission for similar accounts going forward.

That December, the app introduced “younger user accounts,” which add safety and privacy protections for users under 13. In April, it also added a family pairing mode that lets parents limit the app’s functionality for their children.

The coalition, however, believes TikTok’s efforts fall short of both the settlement terms and federal compliance.

The app not only continued to host videos of minors dating back to before the settlement, but also “incentivized” minors to falsify their age to circumvent limited “young user” accounts, the coalition said in the complaint.

Even when younger users do register for those restricted accounts, TikTok still fails to obtain “verifiable parental consent,” the complaint added, because it has not made “reasonable efforts” to ensure that parents receive direct notice of the app’s data collection practices.

“For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible,” the press release reads. 

Indeed, according to TikTok’s privacy policy for younger users, the app collects location data, app activity, usage data, videos watched, and time spent in the app. For comparison, Instagram deletes an account entirely if it cannot verify that the user is over 13 years old; Snapchat, too, asks parents to reach out with their child’s username for verification.

“We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users,” a TikTok spokesperson told Mashable.

Still, the coalition remains skeptical of TikTok’s commitment to young user privacy.

“For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus, and putting children at risk of sexual predation,” Josh Golin, executive director of Campaign for a Commercial-Free Childhood, said in the release. “Now, even after being caught red-handed by the FTC, TikTok continues to flout the law.”