IWD: Rights, justice, action in the digital world

BY MAJOR STAR CONLIFFE*
Imagine waking up one morning to find that someone has posted all over the internet pictures of you without your clothes on.
This is exactly what happened a few months ago to thousands of women.
They went online to find that their photographs had been used by Elon Musk’s Grok chatbot to generate sexualised images of them which were then posted publicly on the X platform. The New York Times reports that between 31 December 2025 and 8 January 2026 (just nine days), Grok generated and “posted more than 4.4 million images, of which at least 41 percent were sexualised images of women and children”. These were all non-consensual images which Grok had been asked to create and post to the X platform, mostly by men.
The public was rightly outraged and governments around the world opened investigations. Some have announced new laws designed to protect people from artificial intelligence (AI) image abuse. Musk responded to the scandal by stating that Grok would block users from making illegal images; however, journalists report that paid subscribers can still use Grok to create them.
AI deepfakes disproportionately harm women, especially women with public profiles such as politicians, journalists and celebrities. One of the women who was targeted by AI image abuse is Paris Hilton. She is a vocal activist and advocate for better laws that protect everyone from AI image abuse. In a recent press conference, Hilton talked about how this abuse affected her and other victims: “Too many girls are afraid to exist online, or sometimes to exist at all. And I know how that feels because I’ve lived it … This isn’t about just technology, it’s about power. It’s about someone using someone’s likeness to humiliate, silence and strip them of dignity.”
The Grok scandal is just one example of how all forms of online violence towards women are escalating around the world. And it leaves me wondering, how is it possible that this kind of public abuse of women is still so common in 2026? Why are governments still struggling to regulate technology companies and make them accountable for how their platforms facilitate online violence?
The issue is twofold. Not only are women and minorities under-represented in tech companies, but governments and companies simply do not listen to women. Women's warnings that there were not enough safeguards in place to stop AI being used as a tool to abuse them were ignored until it was too late.
The purpose of online abuse is to frighten women into silence. When online abuse fails to shut women up, perpetrators often escalate to offline violence, primarily stalking, to frighten victims out of the public conversation.
The connection between online and offline violence is so serious that some women journalists have increased their physical security measures. Adding to the distress of online violence, women are often blamed for their own abuse and expected to moderate their online presence to prevent it from happening, an impossible and unfair task.
The theme of this year’s International Women’s Day is ‘Rights. Justice. Action.’ It highlights the reality that around the world, women hold only 64 per cent of the legal rights that men have, and that countries are failing to close these legal gaps. I hope for a world in which governments and tech companies listen to women’s concerns and experiences before even more harm is done. And I dream of an online world that is safe for the equal participation of all women and girls.
*Major Star Conliffe is an Australian Salvation Army officer (pastor) currently serving in South Korea
