TikTok reportedly served explicit sex and drug videos to 13-year-olds

  • TikTok reportedly recommended explicit drug and sex videos to its younger users.
  • A Wall Street Journal investigation found hundreds of adult videos on the For You pages of accounts registered to users ages 13 to 15.
  • A TikTok spokesperson said the app doesn’t differentiate between the videos it recommends to adults and to minors.

TikTok distributes drug-related and sexually explicit content to minors, a recent Wall Street Journal investigation found.

Using dozens of automated bots registered as users between the ages of 13 and 15, the Journal found that TikTok served the accounts dozens of videos promoting rape, drugs, and sexual fantasies, including some depicting caregivers and children.

Teens make up the largest share of TikTok’s roughly 100 million monthly active users. Last year, minors accounted for more than a quarter of the app’s users, according to company data.

An earlier Journal investigation found that TikTok builds a user’s For You page based on the content the user lingers on in their feed. Using the same methodology, bots that lingered on drug content quickly saw their For You pages flooded with nearly 600 videos featuring drug-related content. The Journal said the pages became a rabbit hole of content advertising how to obtain drugs, along with external links to web pages selling illegal substances.

Likewise, bots that lingered on more sexual content were bombarded with videos about sexual power dynamics and violence, as well as links to external adult-content sites like OnlyFans. One creator told the Journal that 90% of his videos are about sex and bondage. Many sexually explicit videos carried tags indicating they were meant to be “adults only.”

Some of the content the Journal encountered is prohibited under TikTok’s Community Guidelines. The Journal said hundreds of the videos were removed from the platform before it could share them with TikTok, but it did share 974 examples of explicit content with the company.

A spokesperson for TikTok did not respond to Insider’s request for comment in time for publication, but told the Journal that the company declined to comment on individual videos. The spokesperson said the majority of the videos did not violate TikTok’s policies. However, the company told the Journal that it removed some of the videos after the Journal alerted it to them, and restricted the distribution of others.

The spokesperson also said the app does not differentiate between the videos it serves to adults and minors, but that the platform is working on a tool that filters content for younger users.

In July, an Insider investigation found that TikTok’s algorithm automatically suggested content about eating disorders that appeared to violate TikTok’s community guidelines. The Journal’s bots were also served content promoting eating disorders, as well as drunk driving.

Read the full Wall Street Journal investigation here.
