
TikTok designed to be an addiction machine, internal documents reveal


TikTok is aware of the risks of addiction

What's the story

TikTok reportedly ignored the adverse mental health impact its features could have on teen users.

TikTok executives and employees knew the app's features promoted compulsive use, according to documents from a lawsuit filed by Kentucky Attorney General Russell Coleman and reviewed by NPR.

The lawsuit alleges that TikTok "falsely claims (to be) safe for young people," and Coleman says the app was "specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control."

Internal research reveals negative effects on mental health

TikTok’s own research found that compulsive use of the app correlates with several negative effects on mental health.

These include a loss of analytical skills, memory formation, contextual thinking, empathy, and depth of conversation, as well as increased anxiety.

The documents also revealed that TikTok executives were aware that compulsive use can interfere with sleep, school and work responsibilities, and even “connecting with loved ones.”

Ineffective time management tool

Internal documents have revealed that TikTok’s time management tool, which limits app use to 60 minutes a day, has been largely ineffective in curbing screen time among teenagers.

Despite the implementation of the tool, teenagers still spent an average of 107 minutes on the app every day.

The company reportedly measured the tool's success by how it "improved public trust in the TikTok platform through media coverage," while privately acknowledging that "minors lack executive function to control their screen time."

The existence of dangerous “filter bubbles” is acknowledged

TikTok is reportedly aware of the existence and potential dangers of “filter bubbles” on its platform.

According to internal studies, users can be drawn into negative filter bubbles, such as those focused on painful ("painhub") and sad ("sad notes") content, within 30 minutes of use in a single sitting.

The company's researchers also noted the promotion of "thinspiration," content associated with disordered eating, due to the way TikTok's algorithm works.

Struggles with content moderation

According to the lawsuit documents, TikTok is struggling with content moderation issues.

An internal investigation revealed that underage girls were given "gifts" and "coins" in exchange for stripping live on camera.

Top company officials allegedly instructed moderators not to remove users suspected of being under 13 unless their accounts explicitly stated their age.

NPR reports that TikTok acknowledged a significant amount of rule-violating content slips through its moderation techniques, including videos that normalize pedophilia and glorify sexual assault of minors.

Lawsuit alleges TikTok prioritizes "pretty people"

The lawsuit also claims that TikTok has favored “beautiful people” on its platform.

The complaint claims the company altered its algorithm after an internal report noted a high "volume of … unattractive subjects" on the app's main "For You" feed.

It also accuses TikTok of publicizing content moderation metrics that are “largely misleading,” indicating a gap between the company’s public image and internal practices.

TikTok defends practices amid lawsuit

In response to the allegations, TikTok spokesperson Alex Haurek defended the company’s commitment to community safety.

He argued that the Kentucky AG’s complaint “pulls misleading quotes and takes outdated documents out of context.”

Haurek also highlighted TikTok’s “robust safeguards,” including proactive removal of suspected underage users.

He noted that the company voluntarily rolled out safety features such as default screen time limits, family pairing, and default privacy settings for users under 16.