Instagram Is Failing to Prevent Teenagers from Seeing Suicide and Self-Harm Posts

There has been widespread concern about the impact of social media on young people, and attention is now turning to the effects of suicide-related content. A study has found that Instagram is failing to protect young people from harmful material.

What Has Happened?

A test carried out by cyber researchers and child safety groups found that 30 of 47 tools designed to keep teenagers safe on Instagram had either been discontinued or were largely ineffective. Meta has challenged the findings, saying that it has substantial protections in place to limit the amount of harmful content that teens see, and that millions of teens and parents use its tools to stay safe online.

Why Does It Matter?

Mental health training courses in Blackpool, such as those at tidaltraining.co.uk/mental-health-training-courses/blackpool/, cover material relating to social media harm. Researchers know that children and teens are being exposed to harmful content of various kinds.

According to the report's findings, children and teens are being exposed to content that sexualises them or encourages them to post material in pursuit of sexualised likes. The study also found that teens were being shown content glamorising suicide and self-harm.

The government says that its new Online Safety Act will protect children online by fundamentally changing how the internet works and by placing the onus on social media firms to apply sufficient protections. Whether that goal is currently being achieved is debatable, and the topic remains hugely controversial as parents continue to push for stronger safeguards.
