Similar spot checks of Twitch’s less popular competitors, YouTube Gaming and Facebook Gaming, turned up far fewer instances of apparent children livestreaming. To stream on YouTube via mobile, a user must have over 1,000 subscribers. Facebook Live doesn’t have a comparable restriction, but its live channel discovery sections for Facebook Gaming and Facebook Live appear more curated or moderated than Twitch’s. (Facebook also works with about 15,000 content moderators in the US alone.) That doesn’t mean those platforms are faultless; Facebook Live in particular has struggled publicly with moderating violent or dark livestreams. And issues with child predation and exploitation extend well beyond livestreaming; The New York Times reported earlier this year that instances of media related to online child sexual abuse increased 50 percent in 2019, including 60 million photos and videos flagged by Facebook alone.

The dozens of active accounts WIRED discovered on Twitch sometimes contain harrowing conversations between apparent children and strangers. In some instances, strangers “dare” young streamers to perform for their entertainment, asking young girls to flip their hair or kiss a friend on camera. In others, strangers ask for young streamers’ contact information on other apps such as Facebook-owned Instagram or WhatsApp. (Twitch also has an integrated private chat feature.) Some pretend to donate money, making a chat message appear like a verified donation, or post inappropriate ASCII art in chat. The streamers themselves are by and large unsupervised.

WIRED shared dozens of apparent children’s accounts with Twitch. Some have since been deactivated.

“The safety of our global community is a priority for Twitch and one in which we are perpetually investing,” a Twitch spokesperson told WIRED. “We are continuously working to ensure all members of the community experience Twitch in the way we intend and are investing in technologies to support this effort. In addition, we regularly assess and update policies and practices to ensure we are appropriately addressing emerging and evolving behaviors.” The spokesperson says that Twitch has a dedicated law enforcement response team, and that it works with parent company Amazon’s law enforcement team. When appropriate, the company flags violations to law enforcement, and works with the Technology Coalition and the National Center for Missing and Exploited Children.

Dr. Martha Kirby, the Child Safety Online Policy Manager at the UK’s National Society for the Prevention of Cruelty to Children, says that Covid-19-related lockdowns have exacerbated the risk of online sexual abuse “like never before.”

“Poor design choices on livestreaming sites can be exploited by groomers to abuse children in real time,” she says. “Tech firms have consistently failed to design their sites with child safety in mind, allowing offenders to easily watch livestreams of children and message them directly.”

In one video, archived three days ago, an apparent child describes herself as “boredd” and asks people to talk to her. She sits in her driveway eating ice cream, making small talk with a stranger. “I don’t really have much stuff to do,” she says before asking the stranger where they live. At least a half dozen other videos livestreamed within the last day feature apparent children referencing boredom.

Security expert and Savvy Cyber Kids founder Ben Halpert notes that, in quarantine, children often go unsupervised as they spend increasing amounts of time online. “Kids feel connection with other people when they’re livestreaming and communicating on things like Twitch,” says Halpert. At the same time, live content is notoriously difficult to moderate, especially when there’s a lot of it. According to analytics firm Arsenal, hours watched in Twitch’s Just Chatting section increased from 86 million in January to 167 million in June.
