Everyday Misogyny at Scale
How casual harassment becomes digital abuse
This past weekend, my partner and I went to a crowded bar to watch the Bears' NFL playoff game. Within 20 minutes of walking in, I was harassed twice. First, when my bag slightly grazed a 65+ year-old man and he decided the best response was, “You can rub that bag any way you want.” Then, when I asked a group of guys at a table with an empty chair if I could take it, one 30+ year-old “man” responded, “You can sit on my lap.”
Moments like this are not isolated. They are the everyday ways men show over and over again that women’s bodies are public, commentable, and available for consumption.
So no, it does not fucking surprise me that Grok, the AI chatbot built by xAI, the company of Elon Musk, the most misogynistic (and wealthiest) man in the world, was used by other men to undress women and children.
As a grotesque aside: Elon Musk tweeted at Taylor Swift that he would put a baby in her after she endorsed Kamala Harris in 2024. This is not a joke. This is sexual intimidation with rapist energy. And it tells you exactly who feels entitled to women’s bodies online and offline, and the values he brings into his companies.
This is not new. It is just the latest iteration.
For years, men have used internet technology to harass, exploit, and sexually violate women and children. Deepfake pornography did not appear overnight. As early as 2017 and 2018, non-consensual deepfake pornography of celebrities like Maisie Williams, Taylor Swift, Jennifer Lawrence, and Emma Watson was circulating on mainstream platforms such as Pornhub.
“Revenge porn” as a practice predates generative AI entirely, with women overwhelmingly targeted. In the UK, reports to the Revenge Porn Helpline increased by 106% in 2023 alone, the first full year after OpenAI released ChatGPT to the public in November 2022.
At the same time, the infrastructure enabling this digital abuse has scaled rapidly and unchecked. Cloud platforms like Dropbox were used for years to store child sexual abuse material before meaningful detection and reporting systems were implemented.
So I am not shocked that AI-generated abuse is circulating openly on unregulated internet platforms. It is the inevitable result of technologies and governments built by men, for men, where women’s bodies are treated as objects and our safety is an afterthought.
There is no protection for any of us.
There is no comprehensive law in the United States that protects a person’s likeness online. When we post photos or videos on Instagram, TikTok, or Facebook, those images can be scraped, reused, and repurposed by individuals, corporations, or AI systems, often without our consent. That is how an AI-generated image of Luigi Mangione’s likeness ended up modeling a Shein shirt.
Only last year did Congress pass the Take It Down Act, making it a federal crime to publish or threaten to publish non-consensual intimate images, including AI-generated ones.
And protections for children effectively end at age 13. The main federal law governing children’s online privacy, the Children’s Online Privacy Protection Act (COPPA), was passed in 1998. Think about what the internet looked like then. A proposed bill, the Kids Online Safety Act (KOSA), would extend protections to teens on social media platforms. It has not been passed.
Each of us chooses whether we enable abuse, through action or silence, or whether we intervene.
The man at the bar chose to harass me.
His friends chose to snicker like they were in middle school.
I chose to tell my partner instead of staying silent.
My partner chose to intervene.
That moment made something painfully clear: men listen to other men.
So men, step up.
The most telling part of that interaction was the harasser’s response when my partner called him out. He said, “I didn’t know it was your girl.” As if harassment is acceptable as long as a woman doesn’t “belong” to another man.
But the best part was my partner’s response: “Like that makes a difference for what you did?”
In that moment, I didn’t matter to the harasser. I was an object to comment on. But my partner’s refusal to excuse him did matter, because it came from another man.
This is the world women live in. And this is why I don’t care how many women you’re friends with, who raised you, or whether you call yourself a feminist. If you cannot stand up to other men when they are racist, sexist, or harassing women, you are not showing up. Silence breeds permission.
Women are tired. We are living under systems run by men who abuse power, while we are erased, exploited, and assaulted. We are tired of having to plead for our humanity.
For my girlies, I hope you find safety and support in whatever advocacy looks like for you. Sometimes that means calling it out directly. Sometimes it means telling a partner, a friend, a bartender, or security. Sometimes it means walking away from the situation. For me, protecting my body means refusing to stay silent, and refusing to stay around men who joke about women, degrade us, or won’t intervene.
We deserve online and offline communities of women and men who know how to step in, speak up, and protect one another. That should be our baseline.



Thanks for writing this. It clarifies a lot: this pattern of misogyny isn't just a new bug in AI systems but a deeply ingrained feature of patriarchal society that technology simply scales up.
The connection between everyday harassment and digital abuse is so powerfully drawn here. I've noticed the same pattern in tech spaces, where casual sexism gets dismissed as joking around but directly enables the bigger stuff like deepfake abuse. Your framing of how men need to step up and call out other men really hit home for me.