Last July, Miss Universe 2018 Catriona Gray marched to the National Bureau of Investigation (NBI) to seek the authorities' help in tracking down the creator of a fake nude photo of her that was published in an issue of the local tabloid Bulgar.
Last December, celebrity and host Maine Mendoza, accompanied by her mother, did the same thing and asked the authorities to investigate and prosecute the people behind her fake sex video scandal.
This January, celebrities Sue Ramirez and Maris Racal took to social media to express their disgust towards the people who maliciously manipulated their photos and posted them online.
What makes these headlines twice as terrifying is the fact that these all happened in the span of just half a year.
The non-consensual circulation of nude photos, whether real or fake, is a 21st-century problem that all – yes, all – women have to deal with. It’s a highly nuanced issue whose elusive solution stems from our differing generational beliefs towards sexuality and femininity.
But this piece isn’t talking to all generations. It isn’t even talking to any single generation. It is not meant to sway people to side with the victims. If you have to be swayed into believing that non-consensually making and spreading nude photos, whether real or fake, is wrong, then this piece certainly won’t be enough.
This is talking to a very specific group of people: the ones whose devices these pictures end up in.
“This can happen to anyone”
That short statement formed part of Sue Ramirez’s longer tirade towards her culprit. She’s right – it can happen to anyone, and it has.
In other parts of the globe, both celebrities and ordinary women fall prey to photo manipulations and “deepfakes,” a type of synthetic media that uses artificial intelligence (AI) to replace a person in a photo or video with someone else’s likeness.
With high-tech software, women’s photos are maliciously edited to remove their clothing, or their faces are superimposed onto those of adult film actresses in explicit sex scenes.
The applications are endless, and so are the malicious intents behind them. These media can be used to harass or humiliate women, blackmail them in revenge porn ploys, or extort money from them.
Case in point: the number of malicious deepfake photos and videos doubled in just nine months from 2018 to 2019. It also doesn’t help that the software needed for this forgery is essentially free and that the resulting media are virtually untraceable.
This is why anyone – from the A-listers to the most private citizens – can be targeted by these crafty individuals.
Anyone can be involved
But don’t get us wrong: while it only takes one repulsive mind to manipulate the media, it still takes a community to turn it into a devastating scandal.
This is why anyone who knowingly spreads these fake nude photos and videos is also considered a culprit, literally and figuratively.
Literally, because Republic Act No. 9995, or the Anti-Photo and Video Voyeurism Act of 2009, not only punishes those who try to copy, reproduce, publish, or broadcast such photo or video, but also those who share or distribute the same.
And figuratively, because it doesn’t matter whether you made the media or merely shared it: letting the malicious content pass through you without the victim’s consent means you let it happen.
Asking friends for the photo “just to look at it” makes you the culprit.
Forwarding it to friends who “just want to look at it” makes you the culprit.
Saving the photo to your private folders “for your own enjoyment” makes you the culprit.
Sharing the photo in your group chats because “you know your friends aren’t like that” makes you the culprit.
Lastly, even if you have never seen the nude photo, let alone kept a copy of it, but you have friends who have and you choose to do nothing about it, then you might as well be the culprit.
These crimes against women are as much about enabling as they are about enacting. So if you come across nude photos or videos of women, take Sue’s advice: if you have enough delicadeza and respect, do everything you can to keep them from spreading.