
Life after image-based sexual abuse

Digital technologies were meant to be a great equaliser, but what happens to victims whose private sexual images are created and shared without their consent?

October 2021

The present and future of sexual abuse

 

 

 

Nothing is ever deleted from the internet, or so the saying goes. Whether meant as a warning or a joke, the statement is treated as fact. Once an image or a video is uploaded online, it is nearly impossible to remove. The same applies to an image captured by a smartphone: whether or not it is uploaded to cloud storage, it exists on that device. Modern computers may have a trash function, but a record that the image existed always remains. Nothing is truly deleted.

 

This reality haunts many victims of image-based sexual abuse (IBSA), the non-consensual creation or distribution of private sexual images, including threats to do so. Other terms, such as revenge porn or spy cam porn, are more commonly known but are misleading. These acts are a form of sexual abuse because they are done without consent and can have devastating effects on victims, one of which is the complete loss of control over their own images.

 

It is time we recognise these acts for what they truly are: sexual violence. It is time we understand the seriousness of these abuses and their effects on victims. We can no longer pretend that what happens in the digital world is separate from the physical. Both are real life, worlds that we inhabit, and everyone deserves to inhabit them in safety.

 

Follow a victim's journey below. These stories are based on research but are fictional to prevent harassment of victims.

 

Choose a story

 


Navigating the world wide web
of image-based sexual abuse

“Ours is a [cyber] world that is both everywhere and nowhere,
but it is not where bodies live.”

John Perry Barlow, "A Declaration of the Independence of Cyberspace," 1996


In 2010, Hunter Moore created a website that popularised the sharing of non-consensual sexual images and videos online and led Rolling Stone to dub him “the most hated man on the Internet.” Is Anyone Up? encouraged users to post explicit images of people without their consent, often accompanied by the victim's private personal information, such as names, addresses, and links to social media accounts. Some victims said their images had originally been shared privately with a partner who, post-breakup, posted them on Is Anyone Up? for revenge and humiliation, which led to the circulation of the term revenge porn. Others claimed their images were stolen through unlawful hacks into their email accounts before being posted on the website.

 

Moore was brash when confronted or criticised. He refused take-down requests, responded “LOL” to cease-and-desist letters, and blamed victims, saying, “I don’t know how you can point your finger at me; you took the picture.”

 

The infamous site was taken offline in 2012, and the FBI arrested Moore in early 2014 for paying a colleague to hack into women’s email accounts to steal nude photos for the site. He pleaded guilty in 2015 and was sentenced to two and a half years in prison. Before it went dark, Is Anyone Up? reportedly received over 300,000 visitors a day and made Moore as much as $30,000 a month.

 

It seems Moore was right about one thing he said in a 2012 TV interview: “Somebody is gonna monetize this, and I was the person to do it.” Clearly, despite moral outcries and denouncements, there is a demand for non-consensual sexual imagery.
 

The through line
 

While it is unlikely that Is Anyone Up? was the first revenge porn site, it certainly was not the last. In the past decade, reports of revenge porn have increased, along with reports of other scenarios in which sexual images were created or shared without consent.

 

In 2012, a 16-year-old girl was raped in Steubenville, Ohio, by two high school football players while peers recorded and photographed the assault and shared the footage on social media with mocking and degrading comments about the victim.

 

Two years later, over 400 sexual images of female celebrities were stolen in a hack and went viral online. The incident became known as Celebgate or The Fappening and prompted an early outcry that so-called leaked images of famous women are nefarious rather than newsworthy.

 

In 2017, British woman Gina Martin was at a festival when a man photographed her underwear from under her skirt. After the police responded that they could do nothing because the act was not illegal, she campaigned until upskirting was made a criminal offence in the UK.

 

That same year, deepfakes burst into the public's consciousness amid warnings that AI-generated fake videos of politicians could disrupt international relations, such as a faked video of then President of the United States Donald Trump goading the North Korean leader, Kim Jong-un, into nuclear war. While this fear and others less dramatic have yet to be realised in a major way, the same technology is used daily to abuse female celebrities and non-famous women by face-swapping their likenesses into pornographic videos so that the victim convincingly appears to be performing in them.

In South Korea, a series of rallies were held to protest spy cam crimes and voyeurism, referring to non-consensual filming in public or private spaces. Online communities such as Singapore’s Nasi Lemak Telegram group, which had over 44,000 members, were exposed and eventually shut down for sharing sexual images of women captured without consent.

 

There are many more examples, and each shares a through line: more people are being sexually abused through images and videos.
 

It's not about revenge or porn

To understand sexual abuse, it is helpful to think in categories. There are types of physical sexual violence that occur in person, such as rape. In the past decade, we have seen more of what is called technology-facilitated sexual violence, which can be perpetrated in person or online. One example is cyberstalking, a broad set of ways a person uses the internet to harass, intimidate, or stalk another, such as through threatening messages. Image-based sexual abuse (IBSA) is a form of tech-facilitated violence and is defined as “the non-consensual creation or distribution of private sexual images.” It also includes threats to create or share images, which is increasingly common in domestic violence situations.

 

Image-based sexual abuse (IBSA) may be a lumbering name, but it is emerging as the preferred term because it immediately identifies these acts for what they are: sexual abuse. There is an absence of consent and intense harm caused to victims. Perhaps unsurprisingly, it is largely women and members of minority groups who are victimised by IBSA. It should go without saying, but for the record, men are victims too, and their experiences should not be discounted. This is a global problem affecting all ages, genders, identities, and races.

 

Terms matter and are the first step in understanding how and why this abuse is perpetrated. Calling it revenge porn is particularly problematic because it focuses on the motive of the abuser rather than the harm to the victim. Abusers report a variety of motives, including circulating images for financial gain, for ‘a laugh’, to win praise within a friendship group, or to control or harass. Framing revenge as the main motivator inherently blames victims, as if they did something to deserve the abuse. Calling this kind of abuse porn conflates it with consensually produced adult content, which is inaccurate, and incorrectly implies that victims intended their images to be consumed by a wide audience.

Other types of abuse that fall under the IBSA umbrella include recordings of rape, upskirting, voyeurism, sextortion, and deepfakes. This list will likely grow as technologies evolve, just as we can trace the proliferation of this abuse to the mass adoption of the internet and smartphones, which lowered the barriers to perpetration. A violating picture up a skirt on an escalator was not possible with a Polaroid, just as a sexual video could not be shared among thousands of viewers by snail mail without great expense.

The effects on victims

Even as more victims of image-based sexual abuse come forward to share their stories, a pernicious idea persists that so-called leaked images are not a big deal. Victims are often harassed online or even in person after their images go viral, and yet they are told to simply log off, ignore the comments, or delete their social media accounts, as if that could stop their peers or strangers who have viewed the victim’s non-consensual imagery from contributing to the torment. There is no separating the cyberworld from our real lives anymore. No one knows this better than IBSA victims. 

 

Image-based sexual abuse is more than a privacy violation. It is a complete loss of control over one’s identity. If a victim’s images are released online, it is nearly impossible to have them scrubbed from the internet. Even if the images were never posted, traces remain on the devices that captured the footage or were used to view or download it. Because these photos and videos cannot be deleted, and because sharing them leads to harassment, non-consensual imagery makes easy fodder for abusers to threaten or extort. This further controls the lives of victims, who tumble through a never-ending cycle, living as if afflicted with an incurable disease, watching their private sexual images being reposted, shared, commented on, and consumed.

 

As one victim said in a report, “[Image-based sexual abuse] impacts your sense of self on every level.” Many victims report traumatic symptoms similar to those of sexual assault victims, including emotional distress, intense feelings of shame and guilt, self-blame, post-traumatic stress disorder, depression, anxiety, and suicidal ideation or attempts. They may experience a loss of trust in relationships, and for many, the losses extend to career opportunities. It is much harder to change jobs when employers rely on Google background checks. If a victim’s non-consensual imagery has ended up on porn sites, it is easy to mistake an abuse victim for a pornographic actor, and both suffer stigma in the job market.

 

What can victims do? Of course, they can delete their social media accounts, go to the police, sue their abuser, report their images to the hosting websites, or call a helpline. These are options, but currently no single solution has the power to give back control of a victim's images. 

 

Victims can legally change their name, change jobs, move cities, dye their hair, or get a nose job or some other appearance-altering treatment in an attempt to distance themselves from their online profile. For a time, some isolate themselves from both the online and physical worlds, essentially silenced by their harassers and fearful of speaking out. This is the paradox for IBSA victims: those who attempt to shine a light on their plight inadvertently draw attention to the very images they want to disappear.

Hopes and dreams of the cyberworld

Early technology adopters and innovators held high hopes for the internet and smart devices: that these tools would foster connection, drive new commerce, and act as an equaliser among all people. After a few decades in practice, the last of these seems obviously misguided. Where the internet was supposed to create a cyberworld free of privilege, people across the globe instead brought their prejudices to the platforms.

 

Image-based sexual abuse is one example. While these new technologies may have made such abuse easy to perpetrate, the inclination to do so and the demand to consume illicit content already existed. This is why, when it comes to IBSA solutions, there is no single answer. Yes, it involves new laws, better technology to track non-consensual imagery, and conscientious social media policies. It requires law enforcement to work across jurisdictions and internationally when the websites that host such content are distant from their victims. We need a cultural change, a shift in perspective that requires users to think twice before clicking, sharing, or creating images without consent.

 

It is time we accept this issue as a type of sexual abuse and recognise the harms it causes to victims who struggle to regain control over their lives. The internet and our devices are not so separate from our real lives. They exist as a real space, a place where bodies live and exist to be consumed, forever and ever circulating.


About the project

 

[Can't] Delete is a work of journalism presented alongside four victim profiles that are works of fiction. This is an intentional approach to innovating the way journalists cover stories of abuse. Victims of image-based sexual abuse are particularly susceptible to harassment after their case is made public or reported in the news media because, inevitably, readers go looking for the very images deemed abusive. To avoid causing additional harm to victims, these stories are derived from research reports and victim statements and are intended to be read as plausible scenarios.

Created by: Josie Gleave

If you are experiencing image-based sexual abuse, related online abuses, or privacy violations, know that you are not alone and that there are resources that can help. Please reach out to the organisations below for support.

Australia: Office of the eSafety Commissioner

United Kingdom: Revenge Porn Helpline; Victim Support

United States: Cyber Civil Rights Initiative 
