The account was one of dozens of automated accounts, or bots, created by The Wall Street Journal to understand what TikTok shows young users. These bots, registered as users aged 13 to 15, were turned loose to browse TikTok’s For You feed, the highly personalized, never-ending feed curated by the algorithm.

An analysis of the videos served to these accounts found that through its powerful algorithms, TikTok can quickly drive minors—among the biggest users of the app—into endless spools of content about sex and drugs.

TikTok served one account registered as a 13-year-old at least 569 videos about drug use, including references to cocaine and meth addiction and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.

TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.

Still others encouraged eating disorders and glorified alcohol, including depictions of drinking and driving and of drinking games.

The Journal shared with TikTok a sample of 974 videos about drugs, pornography and other adult content that were served to the minor accounts—including hundreds shown to single accounts in quick succession.

Of those, 169 were removed from the platform before the Journal shared them—whether by their creators or TikTok couldn’t be determined. Another 255 were removed after being shared with the company, among them more than a dozen portraying adults as “caregivers” entering relationships with people pretending to be children, called “littles.”

The woman in one of the role-playing videos said she wished TikTok did a better job of keeping adult content out of minors’ feeds.

“I do have in my bio that is 18+ but I have no real way to police this,” she wrote in a message. “I do not agree with TikTok showing my content to someone so young.”

A TikTok spokeswoman declined to address the content of the individual videos, but said the majority didn’t violate guidelines. She said TikTok removed some of the videos after the Journal’s accounts viewed them, and restricted the distribution of other videos to stop the app from recommending them to other users, but declined to say how many.

The spokeswoman said the app doesn’t differentiate between the videos it serves to adults and those it serves to minors, but said the platform is looking to create a tool that filters content for young users.

TikTok’s terms of service say that users must be at least 13 years old, and that users under 18 need consent from their parents.

“Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens,” the spokeswoman said in a statement. She noted that the app allows parents to manage screen time and privacy settings for their children’s accounts.

The addiction machine

An earlier video investigation by the Journal found that TikTok needs only one important piece of information to figure out what you want: the amount of time you linger over a piece of content. Every second you hesitate or re-watch, the app tracks you.

Through that one powerful signal, TikTok can learn your most hidden interests and emotions, and drive users of any age deep into rabbit holes of content—in which feeds are heavily dominated by videos about a specific topic or theme. It’s an experience that other social-media companies like YouTube have struggled to stop.

“All the problems we have seen on YouTube are due to engagement-based algorithms, and on TikTok it’s exactly the same—but it’s worse,” said Guillaume Chaslot, a former YouTube engineer who worked on that site’s algorithm and is now an advocate for transparency in how companies use those tools. “TikTok’s algorithm can learn much faster.”
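To make that critique concrete, here is a minimal sketch in Python of how an engagement-based recommender can turn watch time alone into increasingly narrow recommendations. It is an illustration built on assumptions, not TikTok’s code: the company hasn’t published its algorithm, and the topic labels, weights and scoring rule below are hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of an engagement-based recommender's core loop.
# TikTok has not disclosed its algorithm; the topics, weights and scoring
# rule here are illustrative assumptions only.

interest_scores = defaultdict(float)  # topic -> accumulated interest

def record_view(topics, seconds_watched, video_length, rewatched=False):
    """Update interest scores from a single viewing event."""
    completion = min(seconds_watched / video_length, 1.0)
    signal = completion + (0.5 if rewatched else 0.0)  # lingering and re-watching count heavily
    for topic in topics:
        interest_scores[topic] += signal

def rank_candidates(candidates):
    """Order candidate videos by how strongly their topics match accumulated interests."""
    return sorted(
        candidates,
        key=lambda video: sum(interest_scores[t] for t in video["topics"]),
        reverse=True,
    )

# Hesitating over two sad-themed clips is enough to push similar videos up the ranking.
record_view(["sadness"], seconds_watched=28, video_length=30)
record_view(["sadness", "breakups"], seconds_watched=15, video_length=15, rewatched=True)
feed = rank_candidates([
    {"id": "a", "topics": ["comedy"]},
    {"id": "b", "topics": ["sadness"]},
])
print([v["id"] for v in feed])  # ['b', 'a']
```

Because watch time is the only input in this sketch, the ranking narrows toward whatever a viewer hesitates over, which is the rabbit-hole dynamic the Journal’s accounts encountered.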

The Journal assigned each of its 31 minor accounts a date of birth and an IP address. Most were also programmed with various interests, which were revealed to TikTok only through lingering on videos with related hashtags or images and through scrolling quickly past the others. Most didn’t search for content and instead simply watched videos that appeared in their feed.

Here’s how that can work: among the videos served to one of the Journal’s accounts registered to a 13-year-old was a clip promoting a “420 friendly” website, a reference to marijuana.

The creator promoting the website didn’t respond to questions about the video being shown to an account registered to a 13-year-old.

About a dozen of the Journal’s 31 minor accounts ended up with feeds dominated by a particular theme.

This can be especially problematic for young people, who may lack the capability to stop watching and don’t have supportive adults around them, said David Anderson, a clinical psychologist at The Child Mind Institute, a nonprofit mental-health care provider for children.

He said those teens can experience a “perfect storm” in which social media normalizes and influences the way they view drugs or other topics.

Even when the Journal’s accounts were programmed to express interest in multiple topics, TikTok sometimes zeroed in on a single topic and served the accounts hundreds of videos about it in close succession.

TikTok served one account, which had been programmed with a variety of interests, hundreds of videos of Japanese film and television animation. In one streak of 150 videos, all but four featured Japanese animation, many with sexual themes.

The TikTok spokeswoman said the activity of the Journal’s bots “in no way represents the behavior and viewing experience of a real person,” in part because humans have diverse and changing interests. She added that the platform was “reviewing how to help prevent even highly unusual viewing habits from creating negative cycles, particularly for our younger users.”

The spokeswoman said that when users encounter something they don’t want to see, they can select “not interested” to see less of that content.

Dozens of the videos promoting paid pornography have since been deleted from the app.

In some cases, TikTok creators were clear about not wanting children to see their videos, labeling them (or their accounts) as for adults only. But the app served them anyway.

In one stretch of 200 videos, nearly 40% were labeled as being for adults only.

In all, at least 2,800 such videos were served to the Journal’s minor accounts.

The proliferation of sexually charged content has stirred concerns inside TikTok. Videos directing people to OnlyFans were so abundant that in a meeting in the fall of 2020, the company’s chief operating officer, Vanessa Pappas, asked employees to explain what the site was, according to a person familiar with the meeting.

After the meeting, TikTok at first decided to ban content directing users to OnlyFans, since employees argued much of the content on the site is pornographic, the person familiar with the decision said. The platform then decided to allow users to link to the site after other employees pointed out that not everything on OnlyFans is X-rated, and that other social-media platforms allow links to the content.

The TikTok spokeswoman said the platform prohibits nudity and sexual solicitation and removes accounts that redirect users to sexual content or services, including on OnlyFans.

A spokeswoman for OnlyFans said the site is strictly for people 18 years and older and declined to comment on TikTok accounts directing people to the site.

Policing

TikTok relies on a combination of algorithms and more than 10,000 people to police its huge and growing volume of content, according to former executives of the company.

The company said in a recent report that it removed 89 million videos in the second half of last year.

But it has been hard to keep up with the app’s growth, the former executives said: TikTok now has about 100 million users in the U.S. consuming and producing videos, up from about 25 million in 2019.

The company said that users upload tens of thousands of videos every minute.

To keep pace, moderators focus on the most popular content, leaving videos with lower view counts largely unreviewed, the former executives said.

In July, TikTok said that in the U.S. it would begin relying on its algorithms to both identify and remove certain types of rule-breaking videos, in an effort to enforce its policies more quickly. Previously, TikTok’s algorithms identified rule-breaking videos, but humans reviewed them before removal.

The company made the announcement after the Journal shared hundreds of examples of potentially rule-breaking content that the app had served its bots. TikTok said it has been experimenting with this new system over the past year.

TikTok’s spokeswoman said that no algorithm will ever be completely accurate at policing content because of the amount of context that goes into understanding a video, particularly ones about drugs.

TikTok has also struggled to eradicate video posts promoting eating disorders.

Policing content has been complicated by the company’s decisions in recent years to loosen some restrictions in the U.S., including around skin exposure and bikinis, according to several former executives and content moderators.

The result has been more sexualized videos on the platform, the people said.

The spokeswoman for TikTok said the company’s policies evolve in response to industry norms and changing user behavior. She also said the company expects new and different content as TikTok’s audience grows older and more diverse.

And that bot account registered for a teenage user that fell into the world of role-playing and other sexually oriented content?

Methodology

Over the course of several months, The Wall Street Journal set up more than 100 TikTok accounts that browsed the app with little human intervention, including 31 accounts registered as users between ages 13 and 15. The Journal also created software to guide the accounts and analyze their behavior. These accounts were assigned a date of birth and an IP address.

Most of the accounts were given interests, consisting of keyword terms and machine-learning image classifications. If a video matched one of an account’s interests, the account would dwell on that video; otherwise, the account would quickly move to the next video.
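The dwell-or-skip rule described above can be expressed in a few lines. The following Python sketch is an assumption-based illustration, not the Journal’s actual software: the interest list, the keyword matching and the timing thresholds are hypothetical, and the machine-learning image classification the Journal also used is omitted.

```python
import random

# Illustrative sketch of the described bot behavior: dwell on videos that match
# an account's assigned interests, scroll quickly past everything else.

ASSIGNED_INTERESTS = {"fitness", "cooking"}  # hypothetical example interests

def matches_interest(video):
    """Return True if any assigned interest appears in the hashtags or description."""
    text = (" ".join(video.get("hashtags", [])) + " " + video.get("description", "")).lower()
    return any(keyword in text for keyword in ASSIGNED_INTERESTS)

def watch_time_seconds(video):
    """Decide how long to stay on a video before scrolling to the next one."""
    if matches_interest(video):
        return video["length"]            # linger: watch the clip to the end
    return random.uniform(0.5, 2.0)       # otherwise move on almost immediately

# Example feed entry as a bot might log it
video = {"description": "5-minute ab workout", "hashtags": ["#fitness"], "length": 34.0}
print(watch_time_seconds(video))  # dwells for the full 34 seconds
```

Dwell time is the only signal such an account sends back; consistent with the Journal’s description, it does not search for content and simply watches what appears in its feed.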

The Journal collected videos, thumbnail images, description text and metadata associated with each video, and created internal analysis tools to help sift through the results. In the end, the accounts watched nearly 400,000 videos, including roughly 100,000 served to the accounts for 13- to 15-year-olds. For more on our findings, watch the visual investigation here.