Six months ago, pilot Hana Khan saw her photo on an app that appeared to be auctioning dozens of Muslim women in India. The app was quickly taken down, no one was charged, and the issue was dropped – until a similar app emerged on New Year's Day.
Khan was not on the new app, called Bulli Bai – an insult to Muslim women – which offered activists, journalists, an actor, politicians and Nobel laureate Malala Yousafzai for sale as maids.
Amid growing outrage, the app was withdrawn and four suspects were arrested this week.
The bogus auctions, which have been shared widely on social media, are just the latest examples of how technology is being used – often easily, quickly and inexpensively – to put women at risk through online abuse, theft of privacy or sexual exploitation.
For Muslim women in India, who are often abused online, it is a daily risk, even as they use social media to speak out against hatred and discrimination toward their minority community.
“When I saw my photo on the app, my world shook. I was upset and angry that someone could do this to me, and I grew angrier as I realized this anonymous person was getting away with it,” said Khan, who filed a police complaint against the first app, Sulli Deals, another derogatory term for Muslim women.
“This time I felt so much terror and despair that it was happening again to my friends, to Muslim women like me. I don’t know how to make it stop,” Khan, a commercial pilot in her thirties, told the Thomson Reuters Foundation.
Mumbai Police said they were investigating whether the Bulli Bai app was “part of a larger plot.”
A spokesperson for GitHub, which hosted the two apps, said it had “long-standing policies against content and behavior involving harassment, discrimination and incitement to violence.”
“We have suspended a user account as a result of investigating reports of such activity, which violate all of our policies.”
Advances in technology have heightened the risks for women around the world, whether through trolling, doxxing that reveals their personal information, surveillance cameras, location tracking, or deepfake porn videos featuring doctored images.
Deepfakes – synthetic media generated by artificial intelligence – are used to create porn, with apps that let users strip clothes off images of women or swap their faces into explicit videos.
Digital abuse of women is pervasive because “everyone has a device and a digital presence,” said Adam Dodge, CEO of EndTAB, a US-based nonprofit that fights technology-enabled abuse.
“Violence has become easier to perpetrate because you can hit someone anywhere in the world. The scale of the harm is also greater because you can upload something and show it to the world in seconds,” he said.
“And there is permanence, because that photo or video exists online forever,” he added.
The emotional and psychological impact of such abuse is “just as excruciating” as physical abuse, with the effects compounded by the virality, public nature and permanence of the content online, said Noelle Martin, an activist from Australia.
At 17, Martin discovered that her image had been edited into pornographic images and distributed. Her campaign against image-based abuse helped change the law in Australia.
But victims are struggling to be heard, she said.
“There is a dangerous misconception that the harms of technology-facilitated abuse are not as real, serious or potentially fatal as abuse with a physical element,” she said.
“For victims, this misconception makes speaking out, seeking support and accessing justice much more difficult.”
Tracking lone creators and rogue coders is difficult, and tech platforms tend to protect anonymous users who can easily create a fake email or social media profile.
Even lawmakers aren’t spared: in November, the US House of Representatives censured Republican Paul Gosar over a photoshopped anime video that showed him killing Democrat Alexandria Ocasio-Cortez. He then retweeted the video.
“With any new technology, we should immediately be thinking about how and when it will be misused and weaponized to harm girls and women online,” Dodge said.
“Technology platforms have created a very lopsided environment for victims of online abuse, and the traditional ways of seeking help when we are harmed in the physical world are not as available when the abuse occurs online,” he said.
Some tech companies are taking action.
Following reports that its AirTags – tracking devices that can be attached to keys and wallets – were being used to stalk women, Apple launched an app to help users protect their privacy.
In India, the women listed on the auction apps remain shaken.
Ismat Ara, a reporter featured on Bulli Bai, called it “nothing less than online harassment”.
It was “violent, threatening and intended to create a sense of fear and shame in my mind, as well as in the minds of women in general and the Muslim community,” Ara said in a police complaint that she posted on social media.
Arfa Khanum Sherwani, also featured for sale, wrote on Twitter: “The auction may be bogus, but the persecution is real.”