TikTok is becoming a hotbed for misinformation about mental health
An investigation by The Guardian (UK) has found that more than half of the most popular videos offering mental health advice on TikTok contain misinformation.
The investigation raises the alarm about the misuse of therapy-speak on social media, especially TikTok. The platform has seen an explosion of videos sharing “mental health tips,” but many of them contain misleading, unscientific, and even dangerous claims.
Some of the advice circulating on TikTok can be shocking in its absurdity. For example, one video suggests that eating an orange in the shower can help reduce anxiety, a claim that has absolutely no scientific basis.

Another video diagnoses normal emotional reactions like sadness or irritability as signs of borderline personality disorder or abuse. Such content is not only misleading, but also blurs the line between ordinary emotional experiences and disorders that require serious medical treatment.
More worryingly, the majority of videos in this trend promote supplements like saffron or basil as “natural remedies” to combat anxiety, despite the lack of strong scientific evidence to support their effectiveness in treating mental health issues.
To test the scale and prevalence of the problem, The Guardian analysed the 100 most popular videos tagged #mentalhealthtips on TikTok, and shared them with mental health professionals.
Results showed that 52 of the 100 videos contained at least one piece of misinformation, and many others offered advice that was vague, unhelpful, or inappropriate for specific psychological situations.
David Okai, a neuropsychiatrist and researcher in psychological medicine at King's College London (UK), said that much of the advice on TikTok is built on limited personal experience and anecdotal evidence rather than rigorous clinical research.
“Short, eye-catching videos tend to oversimplify the complex reality of professional therapy,” he warned.
Health experts and some British MPs are now calling on the government to intervene to protect users, especially young people, from a wave of misinformation that could directly affect their mental health and even their lives.

TikTok, for its part, has denied that it has been lax in its oversight, insisting that it has been working to limit the harm caused by misleading content. In a response to The Guardian, TikTok said it would remove any videos that discourage users from seeking professional medical help or promote dangerous treatments.
The platform also said users in the UK are redirected to trusted resources from the National Health Service (NHS) when searching for information about mental health.
“TikTok is a place where millions of people share authentic mental health journeys and find a supportive community. However, this study has clear methodological limitations and could undermine users’ freedom of expression,” a TikTok spokesperson said in a statement.
However, the UK government appears unwilling to continue placing full trust in tech platforms. With the Online Safety Act in effect, tech companies are now legally responsible for tackling harmful or misleading content, especially content likely to harm children.
The UK’s media regulator, Ofcom, is also tightening its standards. A new set of rules, which comes into effect from 25 July 2025, will require social platforms, search engines and online games to implement measures to protect young users from harmful content, including material about suicide, self-harm and eating disorders, as well as pornography.
As social media increasingly becomes young people's first source of information about mental health, the question is no longer just who is speaking, but how to ensure that what they say does not harm the listener.