Questions and answers as social media companies face illegal content crackdown

A proposed crackdown on social media companies is expected to be announced, to force them to remove illegal content and protect vulnerable users.

With mounting anger against tech giants, we take a look at the debate.

– Why are social media firms being criticised?

Disturbing images of self-harm can be found on social media, whether broadcast into users’ feeds by accounts they follow or surfaced by searching for certain terms. Many are uploaded by youngsters themselves.

It is feared that these desperate expressions from vulnerable individuals could influence others to harm themselves.

Social media firms are currently left to their own devices as far as regulation is concerned, but they have been urged to act to protect their users and been threatened with legislation.

Suicide prevention minister Jackie Doyle-Price is expected to warn that the normalisation of self-harm and suicide content online poses a danger as grave as child grooming.

Outrage has intensified following the death of Molly Russell.


– Who was Molly Russell?

The 14-year-old girl took her own life in November 2017. Her family later found she had viewed social media content linked to anxiety, depression, self-harm and suicide.

Her father, Ian Russell, said he had “no doubt Instagram helped kill my daughter”, blaming algorithms used by the platform for enabling her to view more harmful content.

– Have the companies acted already?

Instagram boss Adam Mosseri said he was “deeply moved” by Molly’s story and acknowledged his platform was “not yet where we need to be” on the issues of suicide and self-harm.

Images that encourage the acts are banned, but Mr Mosseri admitted that Instagram relies on users to report the content before it is purged.

“The bottom line is we do not yet find enough of these images before they’re seen by other people,” Mr Mosseri added.

But he said the Facebook-owned firm would introduce “sensitivity screens” making it harder for users to see images showing cutting.

The issue is not simple, though.

He argues a key piece of advice from external experts is that “safe spaces” for young people to discuss their mental health issues online are essential, providing therapeutic benefits.

The platform is wary of stigmatising mental health problems by censoring images that reflect users’ real experiences.

A spokesman for Instagram and Facebook vowed to make it harder to find self-harm related content through hashtags and the search feature.

Currently, searching for one hashtag that takes users to images of self-harm first warns that some of the following posts “often encourage behaviour that can cause harm and even lead to death”.

“If you’re going through something difficult, we’d like to help,” the warning message adds.

– Does social media increase the incidence of self-harm?

The evidence is currently patchy.

One UK study found that self-harm among girls aged 13-16 rose 68% from 2011 to 2014, a period that coincided with a boom in social media usage.

But the authors, from the University of Manchester, warned that more research was needed before a cause could be attributed, and suggested the rise could partly reflect girls being more willing to talk about self-harm.

Another study, led by the University of Oxford, found that moderate use could be beneficial while extreme use may have a small negative effect, lending support to limits on adolescents’ screen time.

– What’s next?

Digital minister Margot James was on Tuesday expected to announce proposals to crack down on social media companies by forcing them by law to remove illegal content and sign a code of conduct.

“We will introduce laws that force social media platforms to remove illegal content, and to prioritise the protection of users beyond their commercial interests,” she was to say.

But, as with previous social media scandals, the extent of any regulation may depend on whether the tech giants act first themselves.
