Image: Flickr / Stacey MacNaught

Three in four girls have received unwanted sexually explicit images on social media apps

According to a new study, tech companies need to do more to clamp down on non-consensual sexual images sent to young girls. 

TW: sexual misconduct

Academics at University College London and the University of Kent found that the majority of the girls who were sent explicit images chose not to report it to their parents, any authorities, or the app developers. 

The report suggests that the practice of sending sexual images is becoming “dangerously normalised”, pressurising girls to “trade” images. The report warned that girls who sent a picture in response were being mocked by classmates after their photo was shared. 

The study showed that only 17% of the recipients of unwanted sexual images decided to report them to the app on which they received them. 

Of the 88 girls aged between 12 and 18 who took part in focus groups, around three-quarters said they had received unwanted images of male genitalia. The focus groups reported instances of girls receiving images from adult men using false identities. The report said that “nearly half of incidents of image-based sexual harassment were from unknown adult men, based on profiles.”

Young people in the UK are facing a crisis of online sexual violence

–Professor Jessica Ringrose, UCL Institute of Education


Professor Jessica Ringrose from the UCL Institute of Education said: “Young people in the UK are facing a crisis of online sexual violence. Despite these young people, in particular girls, saying they felt ‘disgusted, embarrassed and confused’ about the sending and receiving of non-consensual images, they rarely want to talk about their online experiences for fear of victim-blaming, and worry that reporting will make matters worse.

“We hope this report allows all of us to better identify when and how image-sharing becomes digital sexual harassment and abuse, and spread the message that, although the non-consensual sending and sharing of sexual images may be common and feel ‘normal’, it is extremely harmful.” 

A Snapchat spokesperson said: “There will always be people who try to evade our systems, but we provide easy in-app reporting tools and have teams dedicated to building more features, including new parental tools, to keep our community safe.”
