March 8, 2025
Why Women Fight the Algorithm
By Anmol Irfan
Hana*, a Karachi-based writer, was sitting with her brother as he scrolled through Instagram reels when she realised that more than a few of the videos centred on misogynistic jokes, and that a number of Andrew Tate clips kept popping up. Each time, her brother scrolled away, and she shared that, as far as her own interactions with him went, he’d never expressed any support for this kind of red-pill content, which is associated with the “men’s rights” movement and often trades in conspiracy theories and accusations against feminism. “If anything I’d say we have very similar interests, and yet when I looked deeper into it, our feeds were completely different,” she says.
It’s no secret that Pakistani society is extremely misogynistic - and that misogyny has spread onto the country’s digital spaces equally fast. But it’s not just the societal mindset that promotes misogyny online, it’s the way social media algorithms promote certain biases. “There was a time when YouTube shorts would only show me right-wing, red-pill content. Given that my views are more left-leaning, I was quite surprised and, often, baited into being angry by some of these audacious clips. I do feel like my Instagram and TikTok feeds are still more curated to my interests,” says journalist and content creator Sajeer Shaikh.
Pakistani women have dealt with extreme cases of misogynistic and sexist content over the last few years, with women journalists and women in public positions being specifically targeted. But it isn’t only about targeting a specific woman; there is also a constant barrage of hate directed at the country’s feminist movements and at anything related to women in general. Just last year, users on X (formerly Twitter) blamed feminism for the fact that there were apparently 10 million Pakistani women over the age of 35 who weren’t married.
Because misogynistic content is so prevalent online, and goes viral so quickly, Hana* points out that she has often seen it recommended to her as well, though it appeared far more often on her brother’s feed. And while the prevalence of misogynistic content is one thing, Hana* says she also struggles to find feminist content on her own feed, and wonders whether it gets less reach or whether it is simply not available in her spaces.
“Pakistani women journalists make very little content on this [serious content],” says Lubna Jerar Naqvi, a journalist and IFJ’s Pakistan Gender Coordinator, while adding, “Because it’s not considered to be serious journalism. We see all kinds of soft content catering to women like fashion & makeup which is a good thing, but hard-hitting issues get overwhelmed.”
With all of this to fight against, and the added cultural nuances that make online spaces that much harder to navigate, Pakistani women, and South Asian women in general, have had a difficult time finding safe spaces online. But they have also learned how to work the algorithm to carve out those spaces for themselves where possible and get their content out there.
Global Algorithmic Bias
Many of the issues Pakistani women have to fight against trickle down from worldwide trends and from algorithmic patterns put in place by social media companies and global data. Hana’s experience isn’t an isolated case. Research by Monash University last year showed that misogynistic content was recommended to accounts identified as male regardless of whether they actually searched for it. Similar studies carried out by UCL showed an alarming increase in misogynistic content being recommended to young boys within as little as five days of the accounts signing up to TikTok.
It’s not just about a specific kind of content either. Gender bias in social media algorithms can also manifest in more subtle ways. A Global Witness report from 2023 showed that Facebook’s algorithm delivered job ads along gender lines even though the employers had not specified a target gender.
And it’s not Facebook alone. The list of topics Twitter suggested to users in 2022 was clearly biased against women, showing how the algorithm curates an anti-women, more misogynistic online experience.
And as Meta’s new policies come into play, that fight may be getting more difficult. Under its latest changes to hate speech and fact-checking policies, Meta now allows the objectification of women. Rules that prevented people from comparing minorities to objects, or from referring to women as household objects or property, have been removed. This comes at a time when Trump’s new government is already making it that much harder for women and gender minorities to feel safe and supported, and “trad wife” content is becoming more and more popular online.
All of this adds up to a very direct impact on the mental well-being and decisions of young women and girls, who continue to be shaped by the content they see online. In a UNESCO report last year, titled Technology on Her Terms, the agency warned that social media algorithms promoted content that left young girls feeling worse about themselves.
“This exposure can have particularly detrimental effects on girls’ self-esteem and body image. In turn, this impacts girls’ mental health and well-being, which are essential for academic success,” the report said.
Facebook’s own research shows girls reporting that when they felt bad about their bodies, social media content made them feel worse. In many cases, this algorithmic bias against women also extends to activists trying to do good. In 2021, the account of Hanna Paranta, a well-known Somali women’s rights activist, was restricted after anti-women’s-rights activists launched a campaign against her. While Paranta’s account was eventually restored, her blue badge was not, and her reach and impact decreased severely as a result of the restriction. Yet on the opposite side, misogynistic hate speech and attacks often fly under the radar, because companies like Meta and X simply do not have the resources to track hate speech sufficiently in non-English languages.
Pakistani Misogyny Gets Amplified Online
With so much Pakistani content online being created in Urdu, Roman Urdu or regional languages, it’s no surprise that hate speech and problematic statements don’t show up on the radar of social media moderators. But surprisingly, the content doesn’t go away even when users do try to flag it. “I personally don't go looking for red-pill content but it pops up itself. When I mark it as spam, there is no real action taken. I still see it,” Shaikh shares.
Pakistani misogynists in particular love to objectify women - comparing them to lollipops, cars, and countless other items in an effort to promote their own distorted views of gender roles. With Meta’s new policies, this already existing hate speech now gets a bigger space to thrive. A 2024 study published in Social Media + Society indicates that moderation decisions tend to favour mainstream perspectives by 35%, sidelining minority and dissenting viewpoints; in Pakistan, that includes any women or gender minorities who dare to talk about gender online.
“My reach has actually been killed due to non-stop Palestine coverage, and possibly even feminist content. My account cannot be seen by people under 16 too. I've received strikes and also a warning for being ineligible for monetization on Instagram,” Shaikh adds. And yet misogynistic content and hate speech targeted towards women are allowed to slide without any checks and balances.
In November last year, the media organisation Uks Research Resource Centre hosted a webinar on ‘SafeWords: Combating Sexist Abuse in Urdu and Punjabi.’ The organisation shared research finding that only 25% of women in Pakistan have access to the internet, and that many continue to avoid being online because of the abuse they face.
[Figure: Mobile gender gap in South Asia]
And as the creation of content like this increases in South Asia, it creates a cycle of misogynistic content, because it feeds the algorithm. The more users create and engage with this content, the more the algorithm learns that it is popular and promotes it to more users. And the more women shut themselves off from digital spaces, feeling that their content has no value or doesn’t make a difference, the more misogyny online will increase.
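To make that feedback loop concrete, here is a minimal, purely illustrative Python sketch of an engagement-driven recommender. The scoring rule, the probabilities and the “outrage boost” are assumptions made for demonstration only; they are not drawn from any platform’s actual ranking system.

```python
# Illustrative sketch only: a toy engagement-driven recommender loop.
# The numbers and scoring rule below are assumptions, not any platform's real system.
import random

def simulate(rounds=10_000, base_rate=0.10, outrage_boost=1.5):
    # Two kinds of posts start with identical ranking scores.
    scores = {"misogynistic_joke": 1.0, "feminist_explainer": 1.0}
    for _ in range(rounds):
        # Recommend in proportion to accumulated engagement.
        total = sum(scores.values())
        shown = random.choices(list(scores), weights=[s / total for s in scores.values()])[0]
        # Assumption: outrage-driven content draws slightly more engagement per view.
        rate = base_rate * (outrage_boost if shown == "misogynistic_joke" else 1.0)
        if random.random() < rate:
            scores[shown] += 1  # engagement feeds straight back into future ranking
    return scores

print(simulate())  # the small per-view edge compounds into a large gap in reach
```

Even a modest per-view engagement edge, compounded over thousands of recommendations, produces the kind of lopsided feeds described above, and disengagement by women only accelerates the cycle.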
“A lot of people in South Asia for instance tend to genuinely believe feminism is Western propaganda or that it’s about hating men, when in reality, it’s just about challenging systemic inequalities that harm everyone. Because outrage can drive engagement, reactionary narratives get pushed to the top of people’s feeds, making it harder to have real conversations about gender justice,” says PhD researcher Shirin Naseer, who also co-authored the article Gender-Based Violence in Pakistan’s Digital Spaces.
Of course, even creating content that pushes back against these dynamics isn’t easy, and it requires many content creators to work around the algorithm.
“They use alternative spellings like “f3minism” or “GBV” instead of writing out “gender-based violence” to avoid shadowbanning. We see people using trending audio or memes to package serious topics in a way that algorithms favour,” Naseer says, adding, “Another thing that happens is intentional engagement strategies—women’s rights communities will like, comment, and share feminist content in a coordinated way to push it up in the algorithm. Some content creators also subtly embed feminist messages into various forms of lifestyle content so that they can fly under the radar and get more eyes. It’s all about being more intentional and understanding the system better to work with it.”
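The alternative-spelling tactic works because simple keyword matching misses variant spellings. The sketch below is a toy illustration of that gap, assuming a hypothetical blocklist-style filter; real moderation pipelines are far more sophisticated than this.

```python
# Toy illustration of why variant spellings slip past naive keyword filters.
# The blocklist and matching logic are hypothetical, for demonstration only.
BLOCKLIST = {"feminism", "gender-based violence"}

def naive_filter(caption: str) -> bool:
    """Return True if the caption would be flagged by exact substring matching."""
    text = caption.lower()
    return any(term in text for term in BLOCKLIST)

print(naive_filter("Let's talk about feminism today"))          # True: flagged
print(naive_filter("Let's talk about f3minism and GBV today"))  # False: slips through
```

The same blind spot cuts both ways: it is part of the reason hate speech in Urdu and Roman Urdu, which such filters were never built for, goes largely undetected.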
But even then it’s not an easy fight. “As someone who had a semi-camera-facing job and vocal, I have had to deal with a lot of abuses, threats, and derogatory remarks. It would upset or enrage me at one point in time. Eventually, I just began to see the humor in it. Don't get me wrong, there's nothing funny about the things being said. But, you have to find a defense mechanism, I suppose,” Shaikh shares.
It’s also why Naqvi doesn’t fault women who create content that works for the algorithm, and don’t go after the so-called “serious stuff”. After all, for many people who depend on social media content for their income, making sure they’re working with the algorithm is crucial. “We have so many content creators but they’re not focusing on these things, they’re looking at trends and I don't blame them,” she says, adding that instead, organisations need to focus on training journalists who do work on these topics to gain the skills needed to create content that can navigate the algorithm.
“Yes, reach for feminist [content] is an issue, but people living outside Pakistan are making a lot of gender-based content and that reaches us, so I think the issue is that we are also unable to make hard-hitting content,” says Naqvi, and she’s not wrong. Many content creators in other parts of South Asia, and even South Asian women abroad, are able to find reach and engagement for content that revolves around gender justice, sexual and reproductive health and other seemingly taboo topics in the region. But they do so by walking the fine line between message and accessibility: getting the point across while still capitalising on trending audio, hooks and other techniques that carry their content to viewers.
“If there’s so many women in South Asia, then why is it [gender-based conversation] not trending, and the basic thing is we don’t have tools and resources,” Naqvi shares, adding that she’s seen young singers and content creators find a big presence on TikTok. She believes that if the work of these content creators could be harnessed into creating the right kind of digital tools for gender awareness, Pakistani creators could create more successful content.
But at the end of the day, it’s important to note that no matter how resourceful women can get online, it should not be their responsibility to find safe spaces for themselves. “I think the idea of a safe space in the digital sphere is quite a whimsical dream. In an era where technology and humans have evolved to the extent where a YouTuber can tell the exact coordinates of one's location by looking at a picture for mere seconds, it's naive to believe that safety is a perk offered in any online space,” Shaikh says, talking about the harsh digital reality we exist in today.
Naseer also puts the onus on tech companies and social media platforms to ensure safe spaces and support for gender justice movements. “Perhaps while individual resistance is important, women shouldn’t have to 'outsmart’ algorithms just to have their voices heard. The responsibility is on platforms, intermediaries and policymakers to build safer, fairer systems. That means transparency in how content is ranked, stronger moderation against digital GBV, and intentional AI models that don’t reinforce sexist biases.”