TikTok, the Chinese-born social network for sharing short vertical videos, has become increasingly popular among young people around the world; in the United States alone, two thirds of adolescents use it for an average of 80 minutes a day. The situation turns serious in light of a study finding that, through the content it serves, TikTok promotes eating disorders and self-harm.
The application, owned by the Chinese company ByteDance, delivers a rapid stream of short videos to users and has surpassed Instagram, Facebook and YouTube in the contest for the hearts, minds and screen time of young people.
However, most people understand very little about how TikTok works or the potential dangers of the platform, says the Center for Countering Digital Hate (CCDH), a non-profit organization that notes the app reveals a generation gap in usage and understanding.
"The results are every parent's nightmare: young people's feeds are bombarded with harmful and heartbreaking content that can have a significant cumulative impact on their understanding of the world around them and on their physical and mental health," says the report, "Deadly by Design," which aims to give parents and lawmakers an idea of the content and algorithms that shape the lives of young people today.
TikTok operates via a recommendation algorithm that builds a personalized, endless-scrolling "For You" feed, apparently based on a user's likes, follows, watch time, and interests. CCDH researchers created "standard" and "vulnerable" accounts in each country covered by the study.
Research has indicated that users who search for eating disorder content often choose usernames containing related language; the organization's "vulnerable" accounts therefore included the term "lose weight" in their usernames.
TikTok identifies a user's vulnerability and capitalizes on it. Vulnerable accounts in the study received 12 times more self-harm and suicide video recommendations than standard accounts. "Young people engaging with this content are faced with an astonishing onslaught of more and more recommended videos in their feeds."
For the study, researchers at the Center for Countering Digital Hate (CCDH) created new accounts in the United States, the United Kingdom, Canada, and Australia at TikTok's minimum allowed age of 13.
"What we found was deeply disturbing." Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok showed content related to eating disorders. Every 39 seconds, TikTok recommended videos on body image and mental health to teens, the report said.
CCDH notes that in 2022, for the first time, a coroner's inquest in the United Kingdom ruled that social media platforms contributed to the suicide of 14-year-old Molly Russell. Molly had liked, shared or saved 2,100 posts related to suicide, self-harm or depression on Instagram in the six months before her death.
The inquest into Molly's death showed that Big Tech's negligence has real, life-altering consequences, and that comprehensive regulation is needed to protect children online.
Earlier this year, TikTok COO Vanessa Pappas testified before the Senate Homeland Security and Governmental Affairs Committee, where she said safety was a "priority" for her company and that TikTok's mission was to "inspire creativity and bring joy."
However, "its guarantees of transparency and accountability are empty promises loaded with buzzwords that legislators, governments and the public have heard before," CCDH says.
The researchers also found an eating disorder community on the platform whose content has amassed 13.2 billion views across 56 hashtags, many of them designed to evade moderation.
"Rather than entertainment and safety, our findings reveal a toxic environment for TikTok's youngest users, intensified for the most vulnerable."
The report underscores the urgent need to reform online spaces, arguing that policymakers should require platforms to build in safety by design, to be transparent about their algorithms and financial incentives, and to accept accountability and liability for failing to enforce their terms of service and for the harms their platforms perpetuate.
"Without oversight, TikTok's opaque algorithm will continue to profit by serving its users, children as young as 13, increasingly intense and distressing content without controls, resources or support," the organization says.