Tech platforms called to support public interest research into mental health impacts

The tech industry has been called on to share data with public sector researchers so that the mental health and psychosocial impacts of its services on vulnerable users can be better understood, and also to contribute to funding the necessary independent research over the next ten years.

The UK’s chief medical officers have made the call in a document setting out advice and guidance for the government about children’s and young people’s screen use. They have also called for the industry to agree a code of conduct around the issue.

Concerns have been growing in the UK about the mental health impacts of digital technologies on minors and vulnerable young people.

Last year the government committed to legislate on social media and safety. It’s due to publish a white paper setting out the detail of its plans before the end of the winter, and there have been calls for platforms to be regulated as publishers by placing a legal duty of care on them to protect non-adult users from harm — though it’s not yet clear whether the government intends to go that far.

“The technology industry must share data they hold in an anonymised form with recognised and registered public sector researchers for ethically agreed research, in order to improve our scientific evidence base and understanding,” the chief medical officers write now.

After reviewing the existing evidence, the CMOs say they were unable to establish a clear link between screen-based activities and mental health problems.

“Scientific research is currently insufficiently conclusive to support UK CMO evidence-based guidelines on optimal amounts of screen use or online activities (such as social media use),” they note, hence calling for platforms to support further academic research into public health issues.

Last week the UK parliament’s Science and Technology Committee made a similar call for high quality anonymised data to be provided to further public interest research into the impacts of social media technologies.

Earlier this week we asked Facebook-owned Instagram whether it will agree to provide data to public sector mental health and wellbeing researchers. At the time of writing we’re still waiting for a response. We’ve also reached out to Facebook for a reaction to the CMOs’ recommendations.

Update: A Facebook spokesperson said:

We want the time young people spend online to be meaningful and, above all, safe. We welcome this valuable piece of work and agree wholeheartedly with the Chief Medical Officers on the need for industry to work closely together with government and wider society to ensure young people are given the right guidance to help them make the most of the internet while staying safe.

Instagram’s boss, Adam Mosseri, is meeting with the UK health secretary today to discuss concerns about underage users being exposed to disturbing content on the social media platform.

The meeting follows public outrage over the suicide of a schoolgirl whose family said she had been exposed to Instagram accounts that shared self-harm imagery, including some accounts they said actively encouraged suicide. Ahead of the meeting Instagram announced some policy tweaks — saying it would no longer recommend self-harm content to users, and would start to screen sensitive imagery, requiring users to click to view it.

In the guidance document the CMOs write that they support the government’s move to legislate “to set clear expectations of the technology industry”. They also urge the technology industry to establish a voluntary code of conduct to address how they safeguard children and young people using their platforms, in consultation with civil society and independent experts.

Areas that the CMOs flag for possible inclusion in such a code include “clear terms of use that children can understand”, as well as active enforcement of their own T&Cs — and “effective age verification” (they suggest working with the government on that).

They also suggest platforms include commitments to “remove addictive capabilities” from the UX design of their services — a criticism of so-called “persuasive” design.

They also suggest platforms commit to ensure “appropriate age specific adverts only”.

The code should ensure that “no normalisation of harmful behaviour (such as bullying and self-harming) occurs”, they suggest, as well as incorporate ongoing work on safety issues such as bullying and grooming.

In advice to parents and carers also included in the document, the CMOs encourage the setting of usage boundaries around devices — saying children should not be allowed to take devices into their bedrooms at bedtime, to prevent disruption to sleep.

Parents are also encouraged to impose screen-free mealtimes to allow families to “enjoy face-to-face conversation”.

The CMOs also suggest parents and guardians talk to children about device use to encourage sensible social sharing — also pointing out adults should never assume children are happy for their photo to be shared. “When in doubt, don’t upload,” they add.
