Inside the TikTok documents: Stripping teens and boosting ‘attractive’ people
Kids as young as 15 were stripping on TikTok’s live feature fueled by adults who were paying for it.
That’s what TikTok learned when it launched an internal investigation following a Forbes report. Officials at TikTok discovered “a high” number of underage streamers receiving a “gift” or “coin” in exchange for stripping: real money converted into a digital currency, often in the form of a plush toy or a flower.
This is one of several disturbing accounts that came to light in a trove of secret documents reviewed last week by NPR and Kentucky Public Radio. Even more troubling was that TikTok executives were acutely aware of the potential harm the app can cause teens, but appeared unconcerned.
The documents surfaced during a more than two-year investigation into TikTok by 14 attorneys general, which led state officials to sue the company on Tuesday.
Here are a few more of the most serious, and previously unreported, allegations against TikTok, the wildly popular app that is used by around 170 million people in the U.S.
You can be “addicted” in under 35 minutes, or 260 videos
TikTok quantified the precise amount of viewing it takes for someone to form a habit: 260 videos.
Kentucky authorities note that while that might seem like a lot, TikTok videos can be just a few seconds long.
“Thus, in under 35 minutes, an average user is likely to become addicted to the platform,” the state investigators concluded.
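That math holds if clips run roughly eight seconds apiece: 260 videos at about 8 seconds each comes to around 2,080 seconds, or just under 35 minutes.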
TikTok’s algorithm prioritizes beautiful people
When TikTok’s main video feed saw “a high volume of … not attractive subjects” filling everyone’s screens, the app rejiggered its algorithm to amplify users the company viewed as beautiful, according to an internal report viewed by Kentucky investigators.
In fact, TikTok’s documents showed it went so far as to tweak its algorithm to reduce the visibility of people it deemed not very attractive and “took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote in the previously redacted documents.
Limits on TikTok use barely had an impact
The app lets parents set time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok even created a tool that set the default time prompt at 60 minutes per day to combat excessive and compulsive use of the social media app.
After tests, TikTok found the tool had little impact – accounting for just about a minute and a half drop in usage, from around 108.5 minutes per day to 107 minutes with the tool. According to the complaint, TikTok did not revisit this issue.
One document shows a TikTok project manager speaking candidly about the time-limit feature’s real goal: “improving public trust in the TikTok platform via media coverage,” the employee said. “Our goal is not to reduce the time spent.”
95% of smartphone users under 17 use TikTok
A presentation shown to top company officials revealed that an estimated 95% of smartphone users under 17 used TikTok at least once a month. This led a company staffer to state that TikTok had “hit a ceiling among young users.”
TikTok viewed itself as being in an “arms race for attention,” according to a 2021 internal presentation.
An internal document about users under 13 instructed moderators not to take action on reports of underage users unless the account’s bio specifically states the user is 13 or younger.
Under federal law, social media companies cannot collect data on children under 13 unless the companies have explicit consent from parents.
Compulsively using TikTok interferes with kids’ normal lives
The documents show that TikTok was aware that it “interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”
One unnamed TikTok executive put it in stark terms, saying the reason kids watch TikTok is because the power of the app’s algorithm keeps them from “sleep, and eating, and moving around the room, and looking at someone in the eyes.”
Response from TikTok, senators and a watchdog group
On Thursday, TikTok spokesman Alex Haurek criticized NPR for reporting on information that is now under a court seal, claiming the material “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
Some advocacy groups, however, welcomed the disclosures.
The Oversight Project, a social media watchdog group, said that TikTok has not been honest about how safe children are on the app. “These unredacted documents prove that TikTok knows exactly what it’s doing to our kids – and the rot goes all the way to the top,” the group wrote on X.
Also on Friday, a bipartisan pair of senators, Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), wrote a letter to TikTok asking the company to turn over “all documents and information” related to disclosures about child safety on the app, citing NPR’s report.
Copyright 2024, NPR