Self-exposed female Twitter users ‘are four times more likely to see nudity’

Twitter has announced that it will begin removing posts from its service that contain unauthorized images of members of the public posted without their permission.

The move comes after a five-month investigation by the company, which began in January and found that fewer than half of the images uploaded without users’ permission were being removed within 24 hours of being reported by concerned parties.

The company’s general counsel, Vijaya Gadde, announced the decision in a blog post on Saturday, saying: “Our goal is to make this safer, easier and better for you. And that requires us to do more.”

The new policy will be rolled out next week and will prohibit images of a person from being shared in a thread without their permission. “It will also proactively seek out content with unauthorized photos before it is posted by the person who uploaded it,” said Gadde.

It’s not all bad news for users: “For the first time, this new update will require members of the public to explicitly consent to seeing an image on Twitter, whether that image is a public image or an image from a private profile,” wrote Gadde.

One person who welcomed the change was former Ghostbusters star Leslie Jones, who has repeatedly had her images taken and shared without her permission. She tweeted about it earlier this month: “Hey @Twitter, thanks for removing the peephole images from my streams. Also here’s a link to a Giffoni Film Festival blog post, written by someone named ‘Johnathan Whitehead’. The man featured there ‘harassed’ me and on the day was arrested for harassing me to death. I’ve had to witness a lot of horrible things online, people posting my old unedited pictures, my old pictures ‘fact-checking’ & the visual abuse of fellow followers. Thanks!”

But others have not been so pleased. Student filmmaker Lauren Charlton told BuzzFeed News that one of her entries for the television program Black-ish would not be seen on the show because “it had a naked woman dancing in a sexy pose next to her boyfriend”.

The network told BuzzFeed in a statement: “Every time a new series begins production, we require a showrunner, writer and supervising producer to submit an initial list of characters, story ideas and scene descriptions for consideration. Only a small percentage of those are ultimately chosen for production.”

Charlton, 19, said her entry was removed because it “depicted an obvious, positive moment in a relationship between two people who went public with their relationship”.

However, Twitter may have a problem: more than 5m self-exposed pictures of women are posted to the social network every month. One figure Instagram has used to argue that it protects its female users’ images better than other social media platforms comes from a study by the Geena Davis Institute, which surveyed 300 Twitter users over 50 hours and found that they were four times more likely to see nudity posted by men than by women.
