TikTok algorithm directs more violent videos to minority users than to white users: lawsuit


TikTok is facing a claim that its algorithm directs more violent videos at minority users than at white users, in a lawsuit blaming the platform for the death of a 14-year-old African-American girl.

The complaint, which also names Meta Platforms Inc., Snap Inc. and TikTok parent company ByteDance Ltd. as defendants, is part of a series of lawsuits seeking to hold social media companies accountable for teens’ addiction to their platforms.

Parents of Englyn Roberts, who died in September 2020 about two weeks after attempting suicide, say TikTok is aware of biases in its algorithm related to race and socioeconomic status. Roberts would not have seen and become addicted to the harmful content that contributed to her death had it not been for TikTok’s programming, according to the lawsuit filed Wednesday in federal court in San Francisco.


“TikTok’s social media product directed and promoted more harmful and violent content at Englyn Roberts than they promoted and amplified at Caucasian users of similar age, gender and residence status,” the parents alleged.

The complaint was filed by the Social Media Victims Law Center, a Seattle-based advocacy group.

Representatives for TikTok, Meta and Snap did not immediately respond to requests for comment.

The case is Roberts v. Meta Platforms, Inc., 22-cv-04210, US District Court, Northern District of California.

Photograph: The logo of the TikTok app by ByteDance Ltd. is arranged for a photograph on a smartphone in Sydney, New South Wales, Australia, Monday, September 14, 2020. Photo credit: Brent Lewin/Bloomberg.


Copyright 2022 Bloomberg.
