Image: A person staring at a TV. Credit: Neuroscience News

YouTube Habits Linked to Increased Loneliness and Anxiety

Summary: Researchers have found a link between frequent YouTube usage and increased levels of loneliness, anxiety, and depression, especially among viewers under 29.

The study raises concerns about ‘parasocial relationships’ between creators and viewers, as well as the algorithmic recommendation of suicide-related content. The team suggests AI-based solutions to guide users towards positive mental health content.

Key Facts:

  1. Regular YouTube users, particularly those under 29, experience higher levels of loneliness, anxiety, and depression according to a study from the Australian Institute for Suicide Research and Prevention.
  2. The development of parasocial relationships, or one-sided emotional bonds between YouTube content creators and viewers, could potentially exacerbate mental health issues.
  3. The study highlights concerns about YouTube’s algorithm recommending suicide-related content and proposes an AI-based solution to direct users towards verified positive mental health content.

Source: Griffith University

Frequent users of YouTube have higher levels of loneliness, anxiety, and depression according to researchers from the Australian Institute for Suicide Research and Prevention (AISRAP).

Dr Luke Balcombe and Emeritus Professor Diego De Leo from Griffith University’s School of Applied Psychology and AISRAP sought to understand both the positive and negative impacts of the world’s most used streaming platform on mental health. 

They found the most negatively affected individuals were those under 29 years of age or those who regularly watched content about other people’s lives.

Lead author Dr Luke Balcombe said the development of parasocial relationships between content creators and followers could be cause for concern; however, some neutral or positive instances of creators developing closer relationships with their followers also occurred.

“These online ‘relationships’ can fill a gap for people who, for example, have social anxiety; however, they can exacerbate their issues when they don’t engage in face-to-face interactions, which are especially important in developmental years,” he said.

“We recommend individuals limit their time on YouTube and seek out other forms of social interaction to combat loneliness and promote positive mental health.”

Dr Balcombe said the amount of time spent on YouTube was often a concern for parents, who struggled to monitor their children’s use of the platform for educational or other purposes.

For the purpose of the study, over two hours per day of YouTube consumption was classed as high frequency use and over five hours a day as saturated use.
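To make those thresholds concrete, here is a minimal sketch in Python; the function name and category labels are ours for illustration, and only the two-hour and five-hour cut-offs come from the study.

    def classify_daily_use(hours_per_day):
        """Classify average daily YouTube watch time using the study's cut-offs.

        Only the >2 h and >5 h thresholds are taken from the study;
        the label strings are illustrative.
        """
        if hours_per_day > 5:
            return "saturated use"
        if hours_per_day > 2:
            return "high frequency use"
        return "lower frequency use"

    print(classify_daily_use(3.5))  # -> "high frequency use"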

The study also determined more needed to be done to prevent suicide-related content being recommended to users based on algorithms for suggested viewing. 

Ideally, people should not be able to search for these topics and be exposed to methods, but the YouTube algorithm does push recommendations or suggestions based on previous searches, which can send users further down a disturbing ‘rabbit hole’.

Users can report this type of content, but sometimes it is not reported, or it may remain online for days or weeks; with the sheer volume of content passing through the platform, it is almost impossible for YouTube’s algorithms to catch all of it.

If a piece of content is flagged as possibly containing suicide or self-harm topics, YouTube then provides a warning and asks the user if they want to play the video.
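As a purely illustrative sketch, and not a description of YouTube’s actual implementation, that warning step amounts to a simple confirmation gate before playback; the function and field names below are assumptions.

    # Illustrative only: a minimal gate mirroring the warning flow described
    # above. Names are assumptions, not YouTube's API.

    def play_video(video, confirm):
        """Play a video, but ask for confirmation first if it has been flagged."""
        if video.get("flagged_self_harm"):
            if not confirm("This video may contain suicide or self-harm "
                           "related content. Play anyway?"):
                return "playback cancelled"
        return "playing " + video["id"]

    # Example: a confirmer that always declines skips flagged content.
    video = {"id": "xyz", "flagged_self_harm": True}
    print(play_video(video, confirm=lambda prompt: False))  # -> playback cancelled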

“With vulnerable children and adolescents who engage in high frequency use, there could be value in monitoring and intervention through artificial intelligence,” Dr Balcombe said.

“We’ve explored human–computer interaction issues and proposed a concept for an independent-of-YouTube algorithmic recommendation system that would steer users toward verified positive mental health content or promotions.

“YouTube is increasingly used for mental health purposes, mainly for information seeking or sharing. Many digital mental health approaches are being tried with varying levels of merit, but with over 10,000 mental health apps currently available, it can be overwhelming to know which ones to use, or even which ones to recommend from a practitioner’s point of view.

“There is a gap for verified mental health or suicide prevention tools based on a mix of AI-based machine learning, risk modeling and suitably qualified human decisions. By getting mental health and suicide experts together to verify information from AI, digital mental health interventions could be a very promising solution to support increasing unmet mental health needs.”
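The study describes this pipeline only at the concept level. A minimal sketch of the idea, in which an ML risk score routes flagged items to qualified human reviewers and only expert-verified positive content is ever suggested, could look like the following; every name, threshold, and data structure here is an assumption for illustration, not something specified by the authors.

    # Hypothetical sketch of the kind of pipeline described above:
    # an ML risk score flags content for human expert review, and only
    # expert-verified, low-risk items are recommended to users.

    from dataclasses import dataclass

    @dataclass
    class VideoItem:
        video_id: str
        risk_score: float        # assumed output of an ML risk model, 0.0-1.0
        expert_verified: bool    # set by mental health / suicide experts

    def needs_human_review(item, threshold=0.7):
        """Route high-risk items to suitably qualified human reviewers."""
        return item.risk_score >= threshold

    def recommend(candidates):
        """Steer users only toward expert-verified, low-risk content."""
        return [v for v in candidates
                if v.expert_verified and not needs_human_review(v)]

    catalogue = [
        VideoItem("a1", risk_score=0.1, expert_verified=True),
        VideoItem("b2", risk_score=0.9, expert_verified=False),
    ]
    print([v.video_id for v in recommend(catalogue)])  # -> ['a1']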

About this psychology research news

Author: Christine Bowley
Source: Griffith University
Contact: Christine Bowley – Griffith University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“The Impact of YouTube on Loneliness and Mental Health” by Luke Balcombe et al. Informatics


Abstract

The Impact of YouTube on Loneliness and Mental Health

There are positives and negatives to using YouTube in terms of loneliness and mental health. YouTube’s streaming content is an amazing resource; however, there may be bias or errors in its recommendation algorithms.

Parasocial relationships can also complicate the impact of YouTube use. Intervention may be necessary when problematic and risky content is associated with unhealthy behaviors and negative impacts on mental health. Children and adolescents are particularly vulnerable.

Although YouTube might assist in connecting with peers, there are privacy, safety, and quality issues to consider.

This paper is an integrative review of the positive and negative impacts of YouTube, with the aim of informing the design and development of a technology-based intervention to improve mental health. The impact of YouTube use on loneliness and mental health was explored by synthesizing a purposive selection (n = 32) of the empirical and theoretical literature.

Next, we explored human–computer interaction issues and proposed a concept whereby an independent-of-YouTube algorithmic recommendation system steers users toward verified positive mental health content or promotions.
