Vol. 2: When Is a Woman Just a Woman? Digital Body Theft in the Age of 'Innovation'
Silicon Valley Calls It Deepfake Technology. Let's Call It What It Is: Algorithmic Sexual Violence.
A Facebook message at 2:17 AM.
The cursor blinks in sync with the neon sign outside—a defunct WeWork logo bleeding INNOVATE in sodium-vapor red. Lucas Kowalski. We'd dissected frogs together in 10th grade biology, shared exactly one awkward slow dance at junior prom, his hand too hot against my back. We hadn't spoken since he liked a photo of my dog in 2019. The kind of person who exists in the periphery of your life until suddenly, catastrophically, they don't.
Just a link. Pornhub.
The notification pulses in my message box like a tumor on my screen. The 'h' in the logo looks wrong somehow, twisted into a shape that makes my teeth ache.
I know the protocol. I wrote it. Five hundred words on digital security for WIRED, another thousand for TechCrunch. Don't click suspicious links. Report. Block. Move on. But something in this message feels...wrong.
Like fingernails scraping the back of my neck.
Like static before a storm.
Like the way mercury moves under skin.
"Hey, you got hacked," I type, the kind of casual message you send when you're trying not to let anxiety crawl up your spine. The kind of denial that comes before recognition. Before the fall. Before the unmaking.
Three dots appear immediately. It's 2:17 AM. Still 2:17 AM. The clock hasn't moved. Lucas is waiting.
Time unspools its spine; space forgets its clean design. The wallpaper behind me breathes softly, like a living thing remembering its own death. Somewhere, a girl is being turned into data, into desire, into something that glows in the dark of men's screens.
"Not hacked. Thought you should know. Look what you've become."
The room tilts. My reflection fragments in the darkened window, but the pieces don't quite fit together anymore. Hair still wet from a shower I don't remember taking. Face bare except for yesterday's mascara, which is slowly crawling up my cheeks like black spiders. The woman in the glass blinks when I don't.
The neon sign pulses outside: INNOVATE INNOVATE INNO_ATE IN_OV_TE I_NO_A_E _NOV__E. Each flash reveals a different version of me in the glass, each less mine than the last.
They wear my features like ill-fitting masks, stretched wrong across bone and sinew. Their smiles split their faces like fresh wounds, too wide, too hungry. One bares teeth arranged in an order teeth should never be. Another's eyes roll back to white, then forward again, but what returns isn't iris or pupil—it's code, endlessly scrolling. I look for myself among them, but every reflection knows something I don't, and their grins suggest I'm the last to get the joke.
My fingers hover over the link.
Click.
Buffer wheel. Loading. The bandwidth is always worse after midnight, as if the internet itself is tired of carrying our sins. My hands look almost translucent in the screen's glow, belonging to someone else already. Everything else of mine is about to.
Then I see it.
It's me, but it's not me. The same face I've arranged into professional smiles for LinkedIn headshots and casual grins for Instagram. But this face is doing things my face has never done, in a place my body has never been. Digital vivisection. Algorithmic assault. A violence factory's latest product.
Views: 147,392
Uploaded: 3 days ago
Username: yourealwayswrong_abouttech
The comments writhe like maggots: "Is this really her?" "IS THIS REALLY HER?" "That tech journalist chick?" "Someone send this to her editor lol" "Got more. Check DM."
I close the browser but it's too late. You can't unsee yourself being unmade. Can't unwrite the algorithm that learned to wear your skin.
Moths tap against the window in binary, their wings carrying whispers of all the girls who've disappeared into screens, transformed into light and longing and something that hurts to look at. We are all just data now, our bodies translated into ones and zeros, our consent evaporating like morning dew—
My phone buzzes again. Different number, same threat. A new link glows against the dark like a cursed sigil.
2:19 AM now. My mascara-stained eyes stare at a screen that promises worse to come, and I understand—this is how it starts. Not with a bang, not with a whimper, but with a notification in the dark and the slow realization that your body is no longer your own.
It was never about technology. It's always been about power. Always.
The Sanitization of Violence
What once required sophisticated video editing suites and teams of skilled professionals now runs on a MacBook Air. The democratization of abuse, wrapped in the familiar language of tech progress. Y Combinator's motto "Make Something People Want" takes on a darker meaning when what people want is the power to violate consent at scale.
They call them 'deepfakes' – a coinage lifted from the Reddit username that first popularized the technique, repeated until the violence became palatable, another Silicon Valley portmanteau designed to make violation sound like innovation. The same industry that gave us disruption instead of deregulation, move fast and break things instead of externalize risk onto society, now offers us "deepfake" instead of what it really is: algorithmic sexual assault.
The mechanics work through something called Generative Adversarial Networks (GANs), two AI systems locked in algorithmic combat. One creates increasingly convincing forgeries while the other attempts to detect them, each making the other sharper, more sophisticated, more dangerous. In 2014, they were an academic curiosity. One paper, one possibility. By 2021, 658 papers. Each publication making violation more efficient, more scalable, more profitable.
It's machine learning as an arms race, with women's autonomy as collateral damage.
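Strip away the branding and the adversarial loop is short enough to sketch. What follows is a minimal, illustrative GAN training loop in PyTorch on toy one-dimensional data, not any particular deepfake tool; every name and hyperparameter is my own stand-in. A generator learns to forge samples from a target distribution while a discriminator learns to flag the forgeries, and each turn of the loop sharpens both.

```python
# Minimal GAN sketch (illustrative only): a generator forges samples from a
# target distribution while a discriminator learns to tell real from fake.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" data: samples from a Gaussian the generator never sees directly.
def real_batch(n):
    return torch.randn(n, 1) * 1.5 + 4.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator turn: learn to separate real samples from forgeries.
    real = real_batch(64)
    fake = generator(torch.randn(64, 8)).detach()  # freeze G for this half-step
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator turn: learn to make forgeries the discriminator accepts.
    forged = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(forged), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# Mean of generated samples should drift toward the real mean (~4.0).
print(generator(torch.randn(1000, 8)).mean().item())
```

Swap the toy numbers for images and the target distribution for someone's photographs, and this same loop is what the pitch decks call content creation.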
What the venture capitalists don't put in their pitch decks is how efficiently they've automated misogyny. "We're democratizing content creation." As if democracy ever meant the right to violate. Each funding round builds better tools for turning women's images into raw material, their consent transformed into an outdated protocol, merely a point of friction to be reengineered.
Somewhere in South Park, a 22-year-old in a Patagonia vest is running a demo reel that shows how easily his algorithm can swap faces in videos. "Think of the applications in Hollywood," he says, without irony, without awareness, without understanding that he's built a weapon and called it a tool.
He stands there, explaining to his accelerator cohort how this technology will "change the world." He's not wrong.
The Economics of Exploitation
Every social media post becomes potential training data, all of it feeding algorithms that learn to duplicate and manipulate women's images with increasing precision. The barriers to entry aren't technical anymore—they're moral constraints. And those, it turns out, are much easier to bypass than any security protocol.
A 2023 report found that 98% of all deepfake videos are non-consensual pornography. Of those, 99% target women.
The scale isn't just industrial—it's weaponized:
95,820 deepfake videos online
98% pornographic
99% targeting women
550% increase since 2019
303,640,207 views and counting
But statistics can numb us to reality. Let's be clear: each number represents real women, real violation, real trauma. This isn't a technological inevitability—it's a choice we've made as a society, one view at a time.
The same tech industry that hosts "Women in Tech" panels and posts black squares on Instagram has built and funded the most efficient system of gender-based violence since the patriarchy itself.
This is what happens when you optimize for engagement without ethics. When you move fast and break women.
According to Security Hero:
~15 deepfake creation community websites and forums, boasting a collective membership of over 609,464 individuals
42 user-friendly deepfake tools with roughly 10 million monthly searches
One in every three deepfake tools allows users to create deepfake pornography
It now takes less than 25 minutes and costs $0 to create a 60-second deepfake pornographic video of anyone using just one clear face image
The Case Studies: Taylor vs. Taylor
"I keep thinking if I just explain it clearly enough, someone will help," Taylor tells the camera in Another Body, her voice steady even as her hands shake. The 2023 documentary follows the college student's methodical quest for justice after discovering her face digitally grafted onto pornographic videos. She documents everything—the initial discovery, the rising view counts, the university administrators and police shifting uncomfortably in their chairs as they explain there's nothing they can do.
She approaches her own violation like a research project, as if perfect documentation might somehow lead to justice. Spoiler alert: it doesn't.
Then came Taylor Swift.
In January 2024, artificially generated pornographic images of Taylor Swift flooded X (formerly Twitter), reaching 47 million views in just 19 hours. The platform's response was predictably inadequate – too little, too late, with exhausting regularity.
She had every advantage a target of such an attack could hope for. Cultural capital, legal resources, technical teams, platform connections, and squadrons of Swifties willing to mass-report violations. She had a voice loud enough to reach Congress.
And still, the images spread faster than they could be contained. Millions saw them. They exist somewhere on hard drives and servers, ready to resurface.
This is what violation still looks like even when you have every possible advantage.
Place these stories side by side and a horrifying pattern emerges. One Taylor had global fame, unlimited resources, and maximum attention. The other had a laptop and hope that the system would work. Different circumstances, same outcome. Women discovering their bodies are no longer their own.
What makes these cases important isn't just what happened to two women named Taylor. It's what they reveal about the system: resources don't prevent violation; they only manage aftermath. Perfect evidence doesn't lead to justice. Platforms remain unaccountable. Public attention doesn't change systemic issues.
The Legal Labyrinth
The legal system treats algorithmic violence like a 20th-century crime. Virginia became the first state to criminalize deepfake pornography in 2019, offering victims the illusion of protection. California followed, dangling up to $150,000 in civil penalties, meaningless against anonymous creators using VPNs and crypto wallets. By the time a lawyer files the paperwork, the damage has replicated across a thousand servers in a dozen countries.
The platforms that host this content hide behind Section 230 of the Communications Decency Act – a law written when the internet meant AOL chat rooms, now twisted into a shield for industrial-scale violation. The immunities that were meant to protect free speech now protect something else entirely. Virality drives profit, even when that virality comes from violation. Without liability risk, platforms have little motivation to rapidly remove deepfakes.
When the Taylor Swift deepfakes went viral, Congress introduced the DEFIANCE Act, promising victims the right to "finally defend their reputations and take civil action against individuals who produced, distributed, or received digital forgeries" (Ocasio-Cortez, 2024).
But legislation moves at the speed of bureaucracy while AI moves at the speed of light. By the time a law passes committee, the algorithms have learned new ways to violate. By the time it reaches the floor, they've learned to evade detection. By the time it's signed, they've evolved beyond its reach.
This isn't a failure of enforcement. It's a feature of the system:
Local laws trying to police global networks
Individual victims fighting industrial-scale violation
Analog remedies for digital violence
Until we fundamentally restructure platform liability—making them responsible not just for removal but for prevention—we're asking victims to use restraining orders against algorithms. Until we create international frameworks for digital violence, we're telling women their protection ends at state lines. Until we remove platform immunity for automated assault, we're saying their consent matters less than engagement metrics.
The algorithms don't care about jurisdiction. But the law should.
The Unmaking
The crisis of deepfakes reveals something fundamental about our relationship with technology: we've built tools that amplify our worst impulses while shielding their creators from consequences. We've created systems that turn violation into venture capital, assault into engagement metrics, trauma into quarterly earnings.
Until we address the underlying power structures that make digital violence not just possible but profitable, we'll continue playing an endless game of technological whack-a-mole, always one step behind the next innovation in violation. The platforms propose better filters while their algorithms learn new ways to violate. They promise faster takedowns while their engagement metrics reward rapid spread. They offer better detection while their tools perfect deception.
You can't moderate your way out of a system designed for violation. You can't filter away a technology built for assault. The question isn't how to make this technology safer. The question is why we built it at all.
It's about power, consent, and who bears the cost of "innovation."
In the time it took you to read this:
Someone created another deepfake
Someone discovered their digital twin living a life they never consented to
Someone's platform claimed it was doing its best while doing nothing at all
Someone's algorithm got better at violation
Someone's venture capital funded the next generation of assault
Someone decided profit matters more than consent
These aren't just statistics. They're not just case studies. They're warning signs.
The algorithms don't care about consent. But we built the algorithms. We funded them. We optimized them. We deployed them.
We can unmake them too.
But first, we have to name them for what they are: not content generation tools, not creative AI, not democratized technology.
They're weapons. And they're working exactly as designed.