Prof: Social Media Platforms Should Reach Out to Youth With Some Sources of Help

Instagram’s Boss, Adam Mosseri, has announced that all graphic images of self-harm will be removed from the social media platform following the death of 14-year-old Molly Russell, a British schoolgirl who took her own life in 2017. Sputnik spoke about the decision to Associate Professor Jo Robinson from The National Centre of Excellence in Youth Mental Health.

Sputnik: Instagram's Boss, Adam Mosseri, has announced that all graphic images of self-harm will be removed from the social media platform following the death of 14-year-old Molly Russell, who took her own life in 2017. How significant is this announcement from Instagram?

Jo Robinson: I think it's a really important announcement. It's been a real challenge for social media companies to know how to respond properly and responsibly when young people share images and content about suicide and self-harm. It's been a real challenge for them to know how to respond in a way that keeps young people safe on the platform, but that also doesn't shut down and silence conversations that young people might not be able to have in other forums.

Sputnik: Are big companies like Instagram and Facebook really to blame for the number of young suicides on their services?

Jo Robinson: It's not that simple, no, and as a parent myself, my heart goes out to any parent who's lost a child to suicide, it really does, but suicide and self-harm are terribly complex behaviours. I think it's probably a bit unfair to blame one company or one type of platform. The pathways that lead young people to take their own lives are terribly complex, and I certainly think that social media platforms can be part of the problem for many young people who may already be vulnerable and distressed, but I don't think they're the only part of the problem.

Certainly, some of the young people that we've spoken to who use social media platforms find them part of the solution as well, because on there they can find a sense of community, or they can find a way of expressing themselves that they might not have felt able to otherwise. So I would say that these behaviours are really, really complex. It's terribly tragic when a young person takes their own life. But I think we also need to look at the factors that lead somebody to feel that way in the first place, and at some of the attitudes that we have as a community and a service sector towards suicide and self-harm that make young people feel unable to express themselves or seek help from professionals.

So I think we need to look a little bit beyond that. It's easy to blame social media platforms for this, and as I say, they certainly can be part of the problem for young people, but I think they can also be part of the solution, and I do think we need to look beyond social media at some of the reasons that lead young people to feel this way.


Sputnik: Instagram has very specific filters on its service; for example, it won't allow hardcore violence or nudity, in particular female nudity, which has resulted in condemnation from certain feminist groups. Why do you think it's taken so long for Instagram to ban images of self-harm compared to nude content?

Jo Robinson: First of all, I would say that Facebook and Instagram are owned by the same company, and Facebook has made the same announcement that Instagram has made today, so they've changed their policies in exactly the same way. I think it's a very complex behaviour, and the challenge that these platforms have had over the last few years is that simply removing people's content can actually shut down conversations that young people might not be able to have in any other way.

Simply removing images that are potentially distressing for some people isn't the whole answer. When young people post images of self-harm and content around suicide and self-harm, they don't set out to distress others; they don't set out to cause harm or to embarrass others. The impact that imagery might have on other vulnerable young people is, I think, an inadvertent or unintended consequence. So one of the challenges for these platforms is that by shutting those conversations down or taking the images away, they can further compound the sense of shame, stigma and isolation that those young people might have felt, and that led them to communicate in this way in the first place.

So they've had to be very careful about not simply shutting down conversations, removing content and making people feel worse than they did in the first place. I think they've had to give, and they have given, a lot of thought to the ways in which they'll respond to young people when they try to post this content. So it's not simply a matter of removing it; what the platforms will do is reach out to the young people, provide them with some messages of hope, some sources of help and those sorts of things, and explain why they've removed the images rather than just taking them away.

So I think it's been a complicated process, and I very much welcome the decision that they've made. I think it's the right thing for them to do. We know that graphic images of self-harm can lead to instances of contagion or copycat events and those sorts of things. So I very much welcome the move, but it has been a complicated process, and it's good to see that they've put the thought into it, and that what they plan to do, when they do remove content, is respond to those young people directly, explain to them why the content's been removed, and reach out to them with some sources of help.

The views expressed in this article are those of the speaker and do not necessarily reflect those of Sputnik.
