September 25, 2020 - Comments Off on What Is Emotional Regulation And Why Is It So Important?

What Is Emotional Regulation And Why Is It So Important?

By Kashfa Zafar

Have you ever felt hangry? If you’re human, chances are that you’ve been so hungry at some point that you were extremely irritated by everything and everyone around you, but you were probably too agitated to realize that your bad mood was the result of a fairly common human experience – hunger. Emotionally heightened experiences can be really overwhelming, resulting in cognitive overload; your mind might respond by ‘shutting down,’ suspending your abilities of rational judgment. That is why, to someone observing your behavior, you might seem like a less-than-stable individual. Of course, you know that you’re not an irrational person, but in the grip of ‘hanger,’ even you might be surprised by the things you say or do without realizing the reasons behind your seemingly erratic behavior. If only you knew that you were simply hungry, and that the solution to your troubles was just a refrigerator door away. Wouldn’t that make your life easier?

Well, the good news is that there’s definitely a way. It’s called emotional regulation. Emotional regulation is the ability to exercise control over your emotional state so that you’re in a better position to respond appropriately to the demands of a given situation. Emotional regulation skills obviously extend far beyond the scope of simply experiencing hanger. These skills are positively correlated with your social and emotional intelligence and can provide effective coping tools for those experiencing depression and anxiety.

The key to developing emotional regulation skills is to cultivate and practice mindful awareness. When you find yourself in an emotionally provocative situation, remove yourself physically from that negative space and redirect your attention towards what you’re feeling physically. Notice how your body feels. Does your chest feel tighter? Is your heart racing? Are you experiencing a headache? Whatever the case may be, you can applaud yourself for practicing what is known as cognitive reappraisal. Instead of focusing too much on your negative thoughts and feelings, you have now managed to divert your mind towards how these negative feelings present themselves physically in your body. When you do this, you are regaining control over your judgment and actions and not letting your emotions drive your thoughts and behavior. Cognitive reappraisal is a simple yet highly effective tool used in many different types of psychotherapy. It is the ability to reframe your cognitions, or alter your way of thinking. So, in the case above, you have reframed your experience of the situation: instead of focusing on negative feelings and thoughts that might have distorted your perception of the given circumstances, you’ve concentrated your attention on somewhat neutral bodily sensations.

Now that you’ve rerouted your thoughts from the situation onto yourself, the next step is to explore your feelings. Simply acknowledging that you’re feeling ‘bad’ or ‘mad’ is only a start. Dig a little deeper and notice what kind of negative emotions you’re feeling. If possible, write them down. Ask yourself what emotion might be masking itself in the form of anger. Sadness? Guilt? Shame? Hopelessness? For this, you have to be honest with yourself. Exercise the same mindfulness that you practiced when noticing how your body felt. Without judging what comes up for you, identify both the surface-level and the hidden emotions. By practicing this exercise over time, you’ll not only develop and refine your emotional awareness, but you’ll also be able to tell what kind of emotional experience you’re having simply by noticing how your body feels. Each emotion has a physiological reaction in the body, and because you will have monitored the physical manifestation of the identified emotion, you’ll know how to regulate your behavior without being emotionally flooded.

Once you’ve made emotional regulation a regular practice, you’ll feel greater self-control even in the face of the most pressing and high-pressure situations. Instead of letting your emotions control you, you’ll be able to take charge of your life, be it personal, social or professional.

September 17, 2020 - Comments Off on August 2020: DRF launches the Digital Detox Campaign

August 2020: DRF launches the Digital Detox Campaign

Online Campaigns and Initiatives

#HamaraInternet #MehfoozInternet

Under the Cyber Harassment Helpline advocacy campaigns, a comic strip with impactful graphics was released online, focusing on the victim blaming that surrounds cyber harassment. The comic strip highlighted how victim blaming decreases the likelihood that survivors will share their traumatic experience and seek support. It called on audiences to take action against this attitude and shift the blame from the victim to the perpetrator.

In order to combat cyberbullying, which ruins people’s online experiences, we need to understand why people engage in it so that we can design our strategies accordingly. DRF released an infographic to shed light on some of these reasons.

DRF also launched another campaign called 'A to Z of Cyber Harassment', in which we unpack everything that makes up cyber harassment and what we need to know to protect ourselves online.

DRF’s Digital Detox Campaign

As part of DRF’s ongoing advocacy around raising awareness on cyber harassment, it shared Bingos on various digital rights issues and encouraged its online audience to participate. One of the bingos, ‘How many of these steps do you take for your digital detox?’, had the highest number of interactions and retweets. It was also well-received on Instagram, with many followers self-reflecting and pledging to do a better digital detox for their well-being. Another bingo focused on what cyberbullying behavior entails, as a first step to understanding what people need to do to make online spaces safe for all.




DRF has launched the #ActivismInPandemic campaign highlighting the important work human rights defenders and journalists have been doing during COVID19. The campaign aims to share experiences of journalists and HRDs during the pandemic and also highlight the importance of managing work and stress during these testing times.

Policy initiatives

Cyber Harassment Helpline July Statistics

The Cyber Harassment Helpline received 697 complaints in the month of July. In comparison to the months prior to the lockdown, this is a huge number. It shows a spike in cases of online violence, especially blackmail through the non-consensual use of information and images. Another observed trend is social engineering, through which people are coerced into sharing personal details like their National Identity Card number, WhatsApp code, bank account details and e-wallet details, making them susceptible to hacking and financial fraud.


Media Coverage

Pakistani journalists stand up against online harassment

Ramsha Jahangir, a member of our Network of Women Journalists for Digital Rights, highlighted how online spaces have been weaponized recently. DRF’s study ‘Fostering Open Spaces in Pakistan’ found that women in journalism and activism were subjected to online harassment and abuse.

Read the full article on the harassment that women journalists have been facing here:

Pakistan tells YouTube to block ‘objectionable’ content

DRF’s Nighat Dad shared her thoughts on Pakistan’s statement to YouTube regarding objectionable content on the platform.

Read the full article here:

DRF on Coffee Table with Mina Malik Hussain on domestic violence

29 August 2020: A discussion on domestic violence

DRF’s Dania Mukhtar appeared on Indus News’ show Coffee Table with Mina Malik Hussain, alongside Kanwal Ahmed (Founder of Soul Sisters), to discuss domestic violence. Dania highlighted the cases around domestic violence that the cyber harassment helpline has received and also mentioned Ab Aur Nahin as a resource for victims of domestic violence.

Link to show:

Events and Sessions

DRF at the International Youth Day webinar on Breaking Barriers for Meaningful Youth Engagement

DRF’s Nighat Dad shared her thoughts on International Youth Day on ‘Breaking Barriers for Meaningful Youth Engagement’. Nighat highlighted the digital dividend in engaging youth across the country and globally. She also highlighted the struggle it takes for women to reclaim spaces.

DRF on NHRF’s discussion on protection of HRDs and their work

DRF participated in a discussion hosted by the Norwegian Human Rights Fund (NHRF) around the protection of HRDs and their work. Nighat Dad participated in the discussion and shed light on the online safety trainings the organization has conducted with NHRF’s help and support.

DRF at Right To Information: Need, Use and Status on 14th August

DRF participated in a panel discussion hosted by the Senior Journalists Forum on ‘Right to Information: Need, Use and Status’. The webinar took place on 14th August and highlighted the importance of right to information and its current use and status within the country.

DRF at “Digital Pakistan – The Future of Politics: Lessons and Opportunities” on 21st August

DRF participated in the webinar ‘Digital Pakistan- The Future of Politics: Lessons and Opportunities’ on 21st August. The session was organized by the Senior Journalists Forum and focused on the use of digital technologies for politics and mobilization.

DRF hosted a tweetchat on ‘How Activism has changed post COVID19’

DRF organized a tweetchat on 26th August 2020 around how activism has changed post COVID19. The participants in the tweetchat were Nighat Dad (lawyer, human rights activist and Facebook Oversight Board member), Imaan Mazari (lawyer and human rights activist), Moneeza Ahmed (feminist, humanistic therapist and an activist focusing on community development) and Sabin Muzaffar (founder and executive director of Ananke, a digital platform empowering women through awareness, advocacy and education). The tweetchat was an extension of DRF’s online campaign on #ActivismInPandemic and highlighted the key issues activists have been facing during COVID19 and what social media companies can do in order to protect them.

DRF at ‘Implementing SDG 16.10.1 (Protection of Women Journalists and Media Workers) for ensuring a gender-responsive media environment in Pakistan’ on 28th August

DRF participated in the session on Sustainable Development Goal 16.10.1, which focused on the protection of women journalists and media workers, on the 28th of August. DRF’s Nighat Dad spoke about the threats female journalists are facing in the cyber world and also highlighted the importance of a gender-responsive media environment.

DRF in Provincial Consultation on Gender Based Violence - Strengthening the Response

DRF participated in a ‘Provincial Consultation on Gender Based Violence – Strengthening the Response’ under the Aawaz II program, organized by the Peace & Justice Network (PJN) and held on 26 August 2020. The purpose of the consultation was to identify specific advocacy points to prevent and address gender-based violence in Punjab. In addition to assessing the legislative and institutional frameworks, the consultation also examined how prevalent social norms can be changed, both by improving women’s access to legal processes regarding the registration and prosecution of crimes and by addressing public shame. The deliberations of the consultation will be used to inform the advocacy agenda for the Aawaz Provincial Forum.

Digital Safety Conference by LACAS and DRF

August 7  - August 8

DRF and LACAS collaboratively hosted the "Digital Safety Conference" to create a better understanding of how the digital world works and its impact on students, and to bring students, teachers and parents onto the same page about how to deal with pressing issues related to the digital world. The event was held virtually and included representatives from a social media company, legal specialists, mental health counselors and student speakers. The two panels in the conference were “Cyberbullying, its Impact and Being a Responsible Netizen” and “Parenting in the Age of the Internet”.

Facebook Live Session: Will banning the internet fix the mental health pandemic?

Our mental health expert, Saba Sabir, conducted an informative live session on the link between online spaces and mental health. We explored how online spaces can result in tangible mental health harms while also serving as spaces for connection and expression, particularly at a time when we are socially distancing. The session can be accessed here.

DRF conducted a session with Dastak on Online Safety

DRF conducted a semi-advanced online safety session with the team members of Dastak Charitable Organization on the 7th of August, 2020.

The session covered apps and tools that can be used to increase safety in the digital space.

DRF organized a session with AWAM on Online Safety

DRF organized a second session with the Faisalabad-based organization AWAM as a part of its interaction with civil society organizations on the 25th of August, 2020.

The session covered apps and tools that can be used to increase safety in the digital space.

DRF’s session with WISE on Cyber Harassment and Online Safety

DRF collaborated with WISE on the 7th of August and hosted a session around the internet and online safety. The session was attended by female students, teachers and community activists. It focused on cyber harassment and introduced the cyber harassment helpline as a resource to the participants, along with other initiatives like Ab Aur Nahin (providing pro bono legal services to victims of domestic abuse and harassment) and the IWF portal (for reporting child sexual abuse online).

Privacy Laws in Pakistan

Zainab Durrani from DRF participated in a session of PCL Talks, hosted by Pakistan College of Law and LEAP Pakistan, that covered privacy laws in Pakistan. The discussion centered on privacy as a concept in general, as a right, and now as a digital necessity. Zainab discussed the cultural connotations of privacy and DRF’s position on the current Personal Data Protection Bill.

Gender sensitization session with journalists

DRF conducted a two-day online workshop on 9th-10th August 2020 on gender-sensitive reporting for women journalists and content creators. The workshop was attended by 14 participants and included a dedicated session on covering COVID-19 from a gender-sensitive perspective. The training also had a practical component: after completion, all participants were required to submit a short story covered from a gender-sensitive perspective.

Data Privacy in Asia

Our Project Manager Zainab Durrani participated in a tweet chat organized by Undatify Me that looked into current privacy laws and regulations in Asia. The chat covered issues with these laws, campaigns and movements converging on the overall theme of privacy, the existing resources for people who want to learn more about these themes, and the need for more reliable literature to be developed around them.

DRF with the support of FNF hosted two sessions of the Hamara Internet Online Safety Program

DRF hosted two sessions with young adults in the Hamara Internet Online Safety Program with the support of FNF. The sessions were conducted by DRF’s five youth ambassadors, who focused on the privacy and data protection of young adults in the country. The youth ambassadors were divided into teams: Team A conducted its session with its group on 21st August, whereas Team B conducted its session with its respective group on 22nd August.

Sessions with Anew and FNF

Our partner FNF provided our team members with the wonderful opportunity to attend a 4-part training course with the team at Anew, where they learnt about and discussed meaningful and impactful engagement, especially given the myriad restrictions on conducting sessions and interacting with stakeholders and audiences. These sessions took place in the last two weeks of August and were just what our trainers needed.

COVID19 Updates

Cyber Harassment Helpline

Because of the observed increase in cyber harassment complaints during the COVID-19 lockdown, DRF made its cyber harassment helpline operational 24/7 for three months. The cyber harassment helpline offers its services free of cost to anyone who calls in or reaches out to us via social media or email. These include legal aid, the digital help desk, and mental health counseling. During these three months, our toll-free number is accessible every day of the week from 9 AM to 5 PM, and our mental health counselors are working from 10 AM to 9 PM each day. Our mental health counselors are trained professionals providing free-of-cost counseling to women and marginalized communities. During all other hours of the day, our team attends to complaints and queries through online platforms.

Contact the helpline on 080039393 or via email. You can also reach out to us on our social media channels.

Ab Aur Nahin

In times of COVID19, domestic abuse is at an all-time high, and victims often have nowhere to go. Ab Aur Nahin is a confidential legal and counseling support service specifically designed for victims of harassment and abuse.

IWF Portal

DRF, in collaboration with the Internet Watch Foundation (IWF) and the Global Fund to End Violence Against Children, launched a portal to improve children’s online safety in Pakistan. The new portal allows internet users in Pakistan to anonymously report child sexual abuse material in three different languages: English, Urdu and Pashto. The reports are then assessed by trained IWF analysts in the UK.

September 14, 2020 - Comments Off on Women Journalists and Allies Express Outrage at the Murder of Shaheena Shaheen and Demand Concrete Measures to Ensure the Safety of Journalists

Women Journalists and Allies Express Outrage at the Murder of Shaheena Shaheen and Demand Concrete Measures to Ensure the Safety of Journalists

The news of Shaheena Shaheen’s brutal murder has greatly disturbed the community of media practitioners across the country and lays bare the structural insecurity women face in this country. Shaheena was an accomplished journalist based in Balochistan who was shot dead inside her home in Turbat on September 5, 2020.[1] Shaheena was a host on PTV and the editor of a local magazine. She was outspoken on issues facing women, in her profession and her community. Her murder is a grim reminder that women journalists face innumerable barriers and threats on the basis of their gender.

Pakistan is one of the most dangerous countries to be a journalist in, ranking 145 out of 180 countries in the 2020 World Press Freedom Index by Reporters Without Borders.[2] The challenges that women journalists face cannot be neatly captured by the discourse of journalist security and media freedoms. Women journalists are subjected to a ‘double threat’ that is both professional and personal in nature. The overall lack of media freedoms and violence against journalists impacts women journalists; however, because of their gender, they also face a personal threat to their bodies and well-being. Shaheena’s murder, reportedly by her husband, is being characterised as a ‘domestic matter’. We strongly believe that the personal is political, and for women journalists the challenges they face in their personal lives--the double shift due to inequitable distribution of care and domestic work, violence within the home, harassment in workplaces and public spaces, online vitriol directed at them--impact their work and can often put their lives in danger. Women journalists do not shed their gender when entering professional engagements; rather, their gender often predominantly defines their professional life.

We also remember the brutal murder of Urooj Iqbal, who was shot by her husband outside her workplace in November 2019, allegedly for refusing to leave her job.[3] Despite the fact that the murder was condemned by journalists across the world,[4] her family eventually settled the matter outside of court and did not pursue a case against her husband.[5] This case shows that when the perpetrator of violence is a family member, the likelihood of settling the matter outside of court, often due to pressure exerted on the family, is high. Since the passage of the Criminal Law (Amendment) (Offences in the name or pretext of Honour) Act, 2016, cases of honour killings can be pursued by the state under section 299 of the Pakistan Penal Code regardless of whether the family forgives the perpetrator or not, but the implementation of the law is inconsistent. The cold-blooded murders of Urooj and Shaheena are crimes against society as a whole; they should be pursued by the state, particularly in a country where crimes against women are vastly underreported. Pakistan is ranked 151 out of 153 countries on the Global Gender Gap Index Report 2020.[1]

On September 8th, the Spokesperson for the UN High Commissioner for Human Rights, Rupert Colville, urged the Pakistani Government to take “immediate, concrete steps to ensure the protection of journalists and human rights defenders who have been subjected to threats, [...] the need for prompt, effective, thorough and impartial investigations with a view to ensuring accountability in cases of violence and killings.”[2]

These crimes take place against the backdrop of the daily challenges that women journalists face. Recently, 150 women journalists issued a letter calling out the online harassment they are subjected to and the ways in which political parties weaponise digital spaces and gendered attacks to silence critical women journalists.[3] The concerns that women journalists raise should be taken seriously and acted upon, by media outlets as well as by the government. State inaction sends a message to women in the journalist community that they are on their own, and in the long term it discourages young women from joining the profession.

We, the undersigned, demand that:

  1. While we are encouraged that the Ministry of Human Rights has taken notice of Shaheena’s case, we demand that there should be adequate follow-up by the state to ensure that the accused is prosecuted and a possible settlement does not impact the prosecution;
  2. The state prosecution challenges the pardon by the family in Urooj Iqbal’s case in the respective court and pursues the case with the state as a party; and
  3. The government takes immediate and urgent steps to pass the Journalist Protection Bill, with added provisions which recognise the gendered threats that women journalists face and institute accountability mechanisms to mitigate and address them.


  1. Xari Jalil, Dawn
  2. Umaima Ahmed - The News on Sunday, Network of Women Journalists for Digital Rights,
  3. Ghareeda Farooqi - News One
  4. Afia Salam - Freelancer
  5. Reema Omer - Lawyer
  6. Maryam Saeed - e Feminist Magazine 50-50
  7. Reem Khurshid - Dawn
  8. Amina Usman - Urdupoint
  9. Fahmidah Yousfi  -
  10. Rabia Noor - ARY News
  11. Najia Ashar - GNMI
  12. Nighat Dad - Network of Women Journalists for Digital Rights
  13. Shmyla Khan - Network of Women Journalists for Digital Rights
  14. Sahar Habib Ghazi, Freelance Investigative Reporters and Editors
  15. Ailia Zehra - Naya Daur
  16. Alia Chughtai -
  17. Rabbia Arshad , freelance documentary and filmmaker
  18. Lubna Jerar Naqvi Journalist
  19. Sabah Malik, Arab News
  20. Nida Mujahid Hussain, Network of Women Journalists for Digital Rights,
  21. Sabahat Khan - Freelancer - DW
  22. Maleeha Mengal - Social Media Strategist (Shirkat Gah Women’s Resource Centre)
  23. Moniba iftikhar  - Associated Press of Pakistan
  24. Naheed Akhtar - APP
  25. Tooba Masood - Freelance journalist
  26. Laiba Zainab - Sujag
  27. Sadaf Khan, Media Matters for Democracy
  28. Kiran Nazish, journalist and founder CFWIJ
  29. Katarzyna Mierzejewska, The Coalition for Women in Journalism (CFWIJ)
  30. Rabia Bugti - Dialogue Pakistan
  31. Jalila haider -  Independent Urdu
  32. Tanzila Mazhar - GTV
  33. Tehreem Azeem - Freelance journalist
  34. Marian Sharaf Joseph - Freelance Journalist
  35. Luavut Zahid - Freelance journalist
  36. Mahim Maher - SAMAA TV
  37. Maham Javaid
  38. Neelum Nawab - DIN News
  39. Zeenat Bibi - Freelance Journalist from KP
  40. Ambreen Khan - content editor Khabarwalay news
  41. Annam Lodhi, Freelancer
  42. Maryam Nawaz- Geo news
  43. Ayesha Saghir - Producer Express News
  44. Asma Sherazi - TV show Aaj News
  45. Afifa Nasar Ullah - Reporter, City News
  46. Haya Fatima Iqbal - Documentary Filmmaker
  47. Wajiha Naz Soharwardi - CPNE
  48. Sahar Saeed - Neo TV Network
  49. Kiran Rubab khan -  Reporter, 7 news
  50. Imrana Komal - Senior Multimedia Journalist, Free lines
  51. Manal Khan - Independent Writer
  52. Zoya Anwer - Independent Multimedia Journalist
  53. Shaista Hakim - Reporter khyber News Swat
  54. Hina durrani, APP
  55. Sabrina Toppa, Freelance
  56. Shafaq Saba - Freelance Journalist from KP
  57. Mehak Mudasir -  Freelance Journalist from KP
  58. Zivile Diminskyte - Engagement coordinator at CFWIJ

Supporting Bodies:

  1. Network of Women Journalists for Digital Rights (NWJDR)
  2. Women In Media Alliance Pakistan (WIMA)
  3. The Coalition for Women in Journalism (CFWIJ)
[1] Mohammad Zafar, ‘Journalist Shaheena Shaheen shot dead in Turbat’, The Express Tribune, September 5, 2020,


[1] ‘Woman journalist shot dead’, Dawn, November 26, 2019,

[1] ‘Pakistan: Woman journalist killed for not quitting job’, November 26, 2019, The Coalition for Women in Journalism,

[1] ‘Urooj Iqbal’s murder: case of “the first woman journalist killed in Pakistan because of her work” comes to an end’ (in Urdu), BBC Urdu, August 12, 2020,

[1] ‘Mind the 100 Year Gap’, 2019, World Economic Forum,

[1] ‘Press briefing notes on Pakistan’, Spokesperson for the UN High Commissioner for Human Rights, 8 September 2020,


September 2, 2020 - Comments Off on Cyber Bullying And Its Effects On Teenagers/Adolescents

Cyber Bullying And Its Effects On Teenagers/Adolescents

By Sara Israa 

Cyberbullying and cyber harassment are not new terms. They describe something now commonly experienced by people who are active on social media and use online spaces. Cyberbullying can be defined in many ways, but basically it is when someone intentionally sends hurtful messages and pictures, spreads false information, threatens or blackmails you, hacks your social media, or impersonates you. It is persistent behavior, usually meant to intimidate the victim. The perpetrator might be known or unknown.

With the influx of technology, social media, and unlimited access to internet services, cyberbullying is on the rise. It won’t be wrong to say that online spaces are becoming more unsafe by the day, since we are not aware of the predators behind the screens. This surge in cyberbullying is taking a toll on mental health. Teenagers are its most common victims, since they belong to a vulnerable part of society and also use online spaces excessively. Cyberbullying is especially disturbing because of its public and uncontrollable nature.

Teenagers who are cyberbullied experience a range of adverse effects, such as increased anxiety, low and sad mood, school absenteeism, decreased self-esteem, difficulty focusing, and, in extreme situations, even suicide. Cyberbullying and adolescent mental health are strongly related. A vast body of research validates that harassment on the internet induces feelings of guilt, worry, and depression. This is at times aggravated because many teenagers have a hard time communicating about their experience. The resulting self-blame might be one reason some of them resort to suicide.

Cyber victimization at times also leads to teenagers isolating themselves and spending their time worrying over the consequences of being shamed online. Similarly, children who experience cyber harassment may have anger outbursts and may face relationship problems later in life. Cyber victims are more likely to experience somatic problems, including difficulty sleeping, headaches, and stomachaches, compared to their unaffected peers. Many children, in order to cope with the shame that follows cyber harassment, may also indulge in substance abuse.

Unfortunately, most teenagers are unaware of digital safety, and hence they fall prey to cyberbullies. A vast majority of research also shows that in the past decade cyber harassment has become so prevalent that it is now considered a public health concern.

With cyber harassment showing a strong correlation with adverse mental health effects, it is high time that we make the youth more aware of cyber safety. At both the individual and collective levels, we should try to make online safety more accessible. There is a dire need for mental health counselors to address the concerns of cyber victims and provide them with platforms where they can vent and voice their perspectives and thoughts without being judged.

There are some ways adults and parents can protect their children from being cyberbullied. Firstly, be empathetic and listen to your child so that they can confide in you without fear. As a parent or adult, you can make sure that your child’s profile is private rather than public, limit the number of friends your child adds on social media and allow only those they know in real life, ensure that passwords are kept safe, and make sure your child knows how to report, block or delete someone who is harassing them. Get them engaged in offline activities. Remember, the less time they spend on their devices, the less likely it is that they will be cyberbullied.


Nixon, C. L. (2014). Current perspectives: the impact of cyberbullying on adolescent health. Adolescent Health, Medicine and Therapeutics, 5, 143.

Vaillancourt, T., Faris, R., & Mishna, F. (2017). Cyberbullying in children and youth: Implications for health and clinical practice. The Canadian Journal of Psychiatry, 62(6), 368-373.

Munawar, R., Inam-Ul-Haq, M. A., Ali, S., & Maqsood, H. (2014). Incidence, nature and impacts of cyber bullying on the social life of university students. World Applied Sciences Journal, 30(7), 827-830.

August 12, 2020 - Comments Off on July 2020: Digital 50.50, second edition released

July 2020: Digital 50.50, second edition released

Online Campaigns and Initiatives

Digital 50.50 Second Edition

Digital Rights Foundation released the second edition of its e-magazine, Digital 50.50. This edition focused on mental health impacts and coping strategies during the coronavirus pandemic. Some of the topics covered included a feature on how children are making sense of the coronavirus lockdown, a narrative on digital healing and connectedness during the lockdown, and a listicle of strategies for an effective digital detox for mental health and well-being. The reception the magazine has received so far has been amazing, and we hope to continue supporting writers and content creators in creating visually strong and intersectional-feminist stories. Do give it a read here or skim through; we have filled it with love for our readers.

Hamara Internet and #MahfoozInternet Campaign

DRF has been taking forward the Hamara Internet campaign on #MahfoozInternet on Twitter, highlighting different aspects of digital rights in Pakistan. The campaign has focused on repealing section 20 of PECA 2016, on regularly changing passwords to keep devices safe and secure, and on how social media companies track the browser history of individuals.

DRF’s 5 Youth Ambassadors for the Hamara Internet Online Safety Program

DRF launched the Hamara Internet Youth Ambassador Program online and picked 5 youth ambassadors from across Pakistan through a rigorous online application process. The five youth ambassadors will be implementing the Hamara Internet Online Safety curriculum with young adults aged 14 to 18 years old.

Policy initiatives

DRF comments and Objections on the Citizens Protection (Against Online Harm) Rules 2020

DRF released its comments and objections on the Citizens Protection (Against Online Harm) Rules 2020. DRF reiterated its position that the Rules are an attack on the fundamental rights of citizens in the country and shed light on why advocacy groups are boycotting consultation on the matter.

Read the full comments below:

Statement against the ban of Bigo live and warning issued to TikTok:

DRF released a statement regarding the ban on the app Bigo live and the warning issued to TikTok recently by the Pakistan Telecommunication Authority (PTA). In the statement it was highlighted how banning both these apps is an attack on the constitutional right of freedom of expression of citizens in the country.

Read the full statement here:

Joint Statement on The Digital Gap during COVID 19 focusing on online classes

Students across Pakistan have been protesting against the shift to online classrooms, rightly pointing out that as students from less urban centers move back home, they either lack access to high-speed internet or have no internet access at all. DRF released a statement on the digital gap during COVID-19 highlighting the problems people are facing with this transition to online spaces. #Internet4GilgitBaltistan #OnlineClasses

Read the full statement here:

Cyber Harassment Helpline June Statistics

The Cyber Harassment Helpline received 413 complaints in June alone. Compared to previous months this is a huge number, showing a spike in cases of online violence, especially social engineering, through which people are coerced into sharing personal details such as their National Identity Card number, WhatsApp code, bank account details, and e-wallet details, making them susceptible to hacking and financial fraud.


Read about our findings in the following articles:

Media Coverage

DRF on GEO News focusing on how to prevent trolling on social media

DRF's ED Nighat Dad spoke on Geo News about trolling and how to prevent it in online spaces. She highlighted the measures people can take to avoid trolling and stressed that cyberbullying in online spaces is a problem that deserves more attention.

Nighat Dad spoke on VOA about #unbanPUBG

DRF’s ED Nighat Dad spoke to Voice of America (VOA) about the PUBG ban and how this will have repercussions on our digital economy. She focused on why it is important to #unbanPUBG.

Watch the full interview here:

Nighat Dad in The News: 'Any unwarranted restriction on digital media poses a threat to the nascent industry'

DRF's Executive Director Nighat Dad spoke to The News about restrictions being imposed on vloggers and bloggers online. She also highlighted how restricting the speech of influencers and people online is a direct threat to their freedom of expression and a violation of the Constitution.

Read the full article here:

Digital Exclusion Cripples life in the pandemic

DRF’s ED Nighat Dad shared how digital exclusion silences democratic voices in the country. She talked about how internet connectivity is a serious issue in Pakistan and how it has suffered over the years.

Read the full article here:

How serious a problem is sexual harassment in Pakistani educational institutions?

DRF’s ED gave comments about how harassment in educational institutions is a growing problem and how institutions need to do more to address these problems.

Read the full article here:

Dawn editorial highlights the banning of online apps

Dawn's editorial discussed the banning of apps online and reiterated DRF's point that justifying such bans as a way to 'protect' children is akin to banning highways to prevent road accidents.

Read the full editorial here:

Events and Sessions

DRF at the School of Tomorrow discussing ‘Freedom & privacy in information societies’ #SOT2020

DRF's Nighat Dad spoke at Beaconhouse's School of Tomorrow conference on 'Freedom and Privacy in Information Societies'. She shed light on how every individual is vulnerable online, how to deal with issues of hacking and privacy breaches, and why it is important to be aware of your digital footprint.

Watch the full conversation here:

Digital Rights, Digital Divides and the New Normal? (a talk by Nighat Dad)

Nighat Dad gave a talk on 'Digital Rights, Digital Divides and the New Normal' on 21st July. She discussed online spaces after the pandemic, particularly highlighting how the online landscape in Pakistan has changed and evolved.

DRF at the webinar ‘The critical challenges of tackling hate speech, xenophobia, rhetoric, and incitement to hatred against minorities’

DRF participated in a webinar series on debating challenges for minority protection and spoke on the 'Critical challenges of tackling hate speech, xenophobia, rhetoric, and incitement to hatred against minorities'. Nighat Dad spoke at the event on July 2nd and 9th.

DRF conducted a virtual session with Lincoln Corner ‘Digital Rights & Reporting Cyber Bullying’

DRF conducted a virtual session with Lincoln Corner on 'Digital Rights & Reporting Cyber Bullying'. Nighat Dad spoke in detail on the importance of protecting one's identity online and on how to respond to cyber harassment. She also highlighted the Cyber Harassment Helpline and other initiatives of DRF to tackle cyber harassment and cyberbullying online.

Consultation on VPN registration

DRF held a virtual consultation on the issue of VPN registration by the Pakistan Telecommunication Authority (PTA) with civil society actors, lawyers and industry stakeholders. The consultation was an opportunity to understand the impact of regulating VPNs on not only digital rights indicators, but also the viability of the digital economy in Pakistan.

Meeting with FIA

DRF’s cyber harassment helpline held a meeting with the Cyber Crime Wing of the FIA on July 6, 2020 to discuss the ways in which the work of our law enforcement agencies can be improved to make the internet safer for women and gender minorities.

RightsCon Panels

Digital rights solidarities in South Asia
July 31, 2020

DRF hosted its community lab session on the state of digital rights in South Asia with participants from India, Bangladesh and Nepal to understand the common trends in digital rights and state policies in the region. The aim of this session was also to forge solidarities and partnerships in the region based on our shared histories and challenges.

Between regulation and rights: empowering the citizen through media and information literacy in the context of misinformation and offline violence
July 30, 2020

DRF took part in Digital Empowerment Foundation (DEF)’s session at RightsCon 2020. The session (available here) brought a diversity of perspectives of media and information literacy practitioners from the Global North and South to tease out the nuances of the operation of the phenomenon in different socio-political contexts, the government responses, and the role of civil society in mainstreaming media and information literacy within the communities they work with.

Technology-facilitated hate speech and digital activism
July 31, 2020

DRF also took part in BoloBhi's panel on online hate speech with Arzu Geybullayeva from Azerbaijan and Alex Warofka from Facebook. The session delved into an in-depth discussion of technology-facilitated hate speech on social media platforms, with special focus on the hate directed at digital activism. It analyzed the impact this hate speech has on movements and their actors, and explored ways to deal with it.

DRF Attends Uproar Training on 3rd July

Members of DRF attended an online training on using the Uproar tools, which are useful for improving research and advocacy skills around the Universal Periodic Review (UPR).

Curbing Cyber Harassment in rapidly digitalizing world

On 23rd July, DRF took part in a Facebook Live session on cyber harassment in a digitalized world, hosted by the Peace and Justice Network Pakistan. Muhammad Usman represented DRF and spoke on the offences under PECA and on the reporting mechanisms and procedures for cybercrimes in Pakistan.

DRF at UNESCO webinar on 'Advanced ICTs & Artificial Intelligence: Human Rights & Development Intersection'

Nighat Dad spoke at UNESCO's webinar on 'Advanced ICTs & Artificial Intelligence: Human Rights & Development Intersection'. She discussed our growing reliance on ICTs and technology, and stressed that digital rights are important and need to be upheld in online spaces.

DRF at webinar on ‘Safer and Equal Online Space for Women’

DRF spoke at the webinar 'Safer and Equal Online Space for Women', highlighting that the internet and technology must be accessible to women. DRF also emphasized that these spaces aren't safe for women and gender minorities, and that social media platforms have a responsibility to make them safe for all.

A talk on domestic violence in Pakistan

DRF's Dania Mukhtar spoke at a talk on domestic abuse in Pakistan. She highlighted how domestic abuse has been on the rise during COVID-19 and what the law says about domestic abuse cases in Pakistan.

Watch the full video here:

Meeting with Women Action Forum (Lahore) dated 14 July 2020

DRF presented its comments on the amendments proposed by UN Women to 'The Punjab Protection against Harassment of Women at the Workplace Act, 2010'. After the meeting, the teams of DRF and WAF decided to form a committee to propose amendments to this law as well as the Punjab Protection of Women against Violence Act, 2016.

Session with AWAM

DRF conducted an online safety session with team members of the Association of Women for Awareness and Motivation (AWAM) of Faisalabad on the 29th of July, 2020. The session covered the basics of keeping oneself secure and protected from risks while working and operating digitally, and was well received by participants.

COVID19 Updates

Cyber Harassment Helpline

In the wake of the COVID-19 lockdown, as cyber harassment cases have increased, DRF has made its Cyber Harassment Helpline operational 24/7 for three months. The helpline offers its services free of cost to anyone who calls in or reaches out via social media or email; these include legal aid, the digital help desk, and mental health counseling. During these three months, our toll-free number is accessible every day of the week from 9 AM to 5 PM, and our mental health counselors are available from 10 AM to 9 PM each day. Our counselors are trained professionals providing free counseling to women and marginalized communities. During all other hours, our team attends to complaints and queries through online platforms.

Contact the helpline at 080039393 or via email; you can also reach out to us on our social media channels.

Ab Aur Nahin

During COVID-19, domestic abuse is at an all-time high, and many victims have nowhere to go. Ab Aur Nahin is a confidential legal and counselor support service designed specifically for victims of harassment.

IWF portal

DRF, in collaboration with the Internet Watch Foundation (IWF) and the Global Fund to End Violence Against Children, launched a portal to promote children's online safety in Pakistan. The new portal allows internet users in Pakistan to anonymously report child sexual abuse material in three languages – English, Urdu, and Pashto. Reports are then assessed by trained IWF analysts in the UK.

July 21, 2020 - Comments Off on Digital Rights Foundation expresses concern regarding banning of popular social media applications TikTok and Bigo Live

Digital Rights Foundation expresses concern regarding banning of popular social media applications TikTok and Bigo Live

The Pakistan Telecommunications Authority (PTA) banned the popular social media application Bigo Live and issued a final warning to TikTok via press release on July 20, 2020, purportedly acting on a ‘number of complaints’ against the alleged ‘immoral, obscene and vulgar content’ on these applications. Additional reasons for the ban and warning included ‘their extremely negative effects on society in general and youth in particular.’ As an organisation working in the field of digital rights and freedoms for nearly a decade, Digital Rights Foundation (DRF) sees this as a blatant violation of the Constitutional right to freedom of expression online and urges the PTA to reconsider its approach for the safety of minors. These measures and warnings call for a fundamental reflection on the laws of censorship in place in Pakistan.

Pakistan has a long history of bans on social media platforms. In 2010, the Lahore High Court directed the government to block access to Facebook; a ban that lasted for a few days. Similarly, YouTube was blocked in Pakistan for three years. The Islamabad High Court issued orders in 2018 directing the PTA to swiftly deal with illegal content online, threatening otherwise that the courts would be compelled to block social media websites such as Twitter and Facebook. Last year the PTA reported to the National Assembly Standing Committee on Information Technology and Telecom that a total of 900,000 URLs had been blocked in the country. More recently, the PTA, acting upon directions of the Lahore High Court, 'temporarily banned' the popular mobile-based game PUBG. Earlier this month, a petition was filed at the Lahore High Court calling for a ban on TikTok.

The internet is a medium for expression, ranging from political to artistic content, and for access to information that includes vital health information, news and entertainment. Applications such as TikTok and Bigo Live are video-based platforms used by a diverse set of users exercising their right to speech and accessing content, both rights guaranteed under Articles 19 and 19A of the Constitution of Pakistan. Any curtailment of these rights needs to be proportionate in nature, necessary to the specific harm being caused and established by law. Blanket bans on social media platforms are neither proportionate, necessary to the harm stated by the PTA, nor justified by law.

Firstly, the criteria of 'obscenity', 'vulgarity' and 'immorality' used by the PTA are extremely vague, and no objective legal standard was employed by the Authority in reaching its decision. This was also reflected in the petition against TikTok submitted before the Lahore High Court, which contended that users on the app are "spreading nudity and pornography for the sake of fame and rating". It is not lost on us that the criterion of obscenity is often articulated in gendered terms and was the same justification used in petitions calling for a ban on the Aurat March earlier this year. Platforms such as TikTok and Bigo give young people space to express themselves freely, in ways that they cannot in other spaces of society. It is also no coincidence that the user base of TikTok and Bigo is extremely diverse, consisting of users from different classes and genders. Unlike Facebook and Twitter, apps such as TikTok are not text-heavy and thus lend themselves to a more diverse user base, as lack of literacy is no longer a barrier. Justifications based on 'vulgarity' and 'obscenity' are often stand-ins for society's discomfort with expression that deviates from gendered norms, and carry classist assumptions about what constitutes 'respectable', 'acceptable' content.

The popularity of TikTok and Bigo among Pakistanis should be celebrated: they provide an avenue for artistic expression and a space for connecting with one another. The content on these applications, no matter how frivolous or 'silly', is protected speech and is important for any society that values culture. It is not for the courts, the government or the PTA to judge whether it is valuable or beneficial for society. Additionally, these platforms provide content creators with an opportunity to earn revenue from their live streams and creative videos. Many Pakistanis are among the top professional online gamers on platforms such as PUBG, and there is a burgeoning local e-sports culture. In a post-COVID-19 economy, as prospects for youth employment rapidly shrink, taking away these economic opportunities could devastate the livelihoods of many.

While the PTA does not allude to it, many have cited the horrific incident of gang rape in Lahore as further justification for the ban. The victim/survivor was a TikTok user who allegedly met her rapist through the app and agreed to meet in person to record a video, which is when the incident took place. Violence against women and rape is a systemic issue that predates any social media application and will, unfortunately, continue even if the internet were banned in Pakistan. This rhetoric of 'protecting women' is part of an old playbook: women's safety concerns are hijacked to enact paternalistic, heavy-handed measures that do very little to tackle systemic violence against gendered bodies or dismantle patriarchal structures, but rather seek to restrict freedoms. As a digital and women's rights organisation, we have witnessed the justifications used by the government to pass draconian legislation such as the Prevention of Electronic Crimes Act in 2016, which has done little to protect women and gender minorities in online spaces. The desire to police women's bodies and expression is again apparent when the criteria of obscenity and vulgarity are invoked to limit internet freedoms.

Secondly, it is disingenuous to argue that these platforms are being used to negatively influence the youth as even a cursory look at the content on TikTok and Bigo reveals that this is patently not the case. The community guidelines of these platforms prohibit pornography and harmful content for minors. Any content violating these policies is either auto-deleted by the algorithm or can be reported in-app. When a proportionate remedy exists for the alleged harm caused, there is no point in banning entire applications which contain diverse content. If any content on the application is deemed to contain hate speech or cause harm to minors, then it should be reported as a piece of individual content to either the social media company for removal or taken up with the relevant law enforcement agencies. Individual pieces of content cannot be used to justify the banning of an entire platform; a move that would be grossly disproportionate.

Additionally, social media companies cannot be held liable for individual pieces of content on their platforms. While each company should be required to have mechanisms for removal and monitoring of harmful content in place, the principle of intermediary liability holds that platforms cannot be held liable for every piece of content that is posted. Users are well within their rights to demand that social media companies have adequate mechanisms and systems in place to protect the most vulnerable groups using their platforms; however, platforms cannot be expected to guarantee that no harmful content will ever be posted. As long as there are systems in place to detect, report and remove such content when it does appear, the companies are acting within the scope of their limited liability.

Thirdly, the justification of these bans as a way to 'protect' children and the youth is akin to banning highways to prevent road accidents. The mental health, wellbeing and safety of children and young adults should be a concern for us all; however, banning applications is a paternalistic solution to a problem whose root causes lie beyond individual applications or even the internet. Poor mental health is an epidemic worldwide, and Pakistan in particular lacks the infrastructure to provide quality mental healthcare to its citizens, including the youth. In a country with a massive youth bulge, it is concerning that avenues for expression and entertainment, which are vital to intellectual and emotional growth, are extremely limited. A 2018 UNDP study reported that a majority of Pakistani youth do not have access to recreational facilities or events: 78.6% do not have access to parks; 97.2% lack access to live music; 93.9% lack access to sporting events; 97% cannot access cinemas; and 93% are denied access to sports facilities. By banning applications primarily used by the youth, the state would deny them a platform for self-expression in a society that already lacks alternatives.

Fourthly, individual pieces of content can be reported to the PTA under section 37 of the Prevention of Electronic Crimes Act 2016 (PECA). If individual accounts or content violates the criteria for removal, the PTA has the power to “remove or block or issue directions for removal or blocking of access to an information”. This needs to be done through an order passed by the PTA and such an order needs to be communicated to the aggrieved party who has the right under section 37(4) to ask for a review of the order for blocking or removal. Furthermore, an appeal to the relevant high court against the decision of the PTA also lies under section 37(5). As a digital rights organisation, we believe that these powers are already too broad and need to be reviewed; but it is concerning that in banning entire platforms the PTA is exceeding even these excessive powers in an arbitrary manner.

This month TikTok, along with 58 other Chinese-owned apps, was banned by the Modi-led government in India as a result of its strained relations with China. Statements by the US government have expressed similar possible plans. While there are legitimate concerns to be raised regarding the content regulation and privacy policies of the application, these decisions do not seem to be driven by genuine concern for users' rights but are part of a larger geostrategic move to isolate China. As a country that has repeatedly raised alarm about the fascist tendencies of the Modi government, it is surprising that the government of Pakistan is taking a similarly heavy-handed approach to internet censorship.

We demand an overhaul of the internet regulation regime in place as it is extremely arbitrary and violates the principles of freedom of expression and access to information, enshrined in not only international conventions that Pakistan has ratified but also guaranteed under its own Constitution. These individual cases point towards a wider trend of shrinking online freedoms. As concerned citizens, we demand:

  1. Bans on PUBG and Bigo Live be immediately lifted, and the warnings issued to TikTok be reconsidered;
  2. Section 37 of the Prevention of Electronic Crimes Act 2016 be repealed;
  3. The government move towards a model of self-regulation that is compliant with international human rights standards;
  4. Transparency from the PTA regarding the content that is reported to the Authority and publicly available orders which delineate the reasons for removing/blocking specific content;
  5. A comprehensive and welfare-based plan for the protection of children and adolescents which includes investment in digital literacy, access to mental health counselling and programs for the performing arts; and
  6. Immediate de-notification of the Citizens Protection (Against Online Harm) Rules and abandonment of the approach taken in the Rules as a viable mechanism for regulating the internet.
 Read about the impact of the TikTok ban in India here:

July 13, 2020 - Comments Off on June 2020: DRF launches Digital 50.50 e-magazine

June 2020: DRF launches Digital 50.50 e-magazine

1: Online campaigns and initiatives

Digital 50.50

Digital Rights Foundation launched its newest initiative, Digital 50.50, a monthly feminist e-magazine that aims to create an open space for ideas, opinions, art, and discourse on a wide array of topics in the digital rights arena from an intersectional feminist lens. The idea behind Digital 50.50 is rooted in our organization's goal to make online spaces safer and more equal for womxn journalists and to support them in providing reliable information to the public. To ensure gender-sensitive reporting and amplify women's voices, Digital 50.50 believes that the inclusion of women is a must in all roles – from content creators and editors to experts, sources, and subjects of stories and art. The theme of the first issue was 'Impact of Covid-19 on women and girls in online and offline spaces'; it covered a diverse range of features on how the coronavirus pandemic has transformed the work that journalists do, the rise of incels online, increasing cases of domestic abuse, and the economic issues faced by women workers and laborers.

Statement by DRF to save OTF

The US government has recently announced that it will shift funding toward closed-source tools, a move with considerable adverse effects on the Open Technology Fund (OTF). OTF's work is under threat; it has supported countless journalists, human rights defenders, and organizations working on human rights. DRF condemns this move against open-source technology and OTF.

Read our full statement here:

Campaign on Digital Laws Asia

DRF, in partnership with APC, did a day-long campaign highlighting the legal landscape of Pakistan with regards to its Digital Laws. This campaign was held across South Asia on the same day and was meant to show a cross-comparison of countries in this region. DRF particularly highlighted the Citizen’s Protection (Against Online Harm) Rules 2020 which restrict freedom of expression and privacy online of citizens. DRF and other organizations have also given out a statement on no consultation on the rules without withdrawal.

Click to read our statement on the rules:

No Consultation Without Withdrawal of Rules:

COVID19 contact tracing

DRF analyzed the COVID-19 contact tracing app and raised concerns regarding possible surveillance through the app and the safety of users' data.

Read our full analysis here:

'Together for Reliable Information' campaign

In collaboration with Free Press Unlimited, Digital Rights Foundation participated in a campaign highlighting the work done by DRF and its Network of Women Journalists for Digital Rights to bring accurate and reliable information to the public amid the coronavirus pandemic. The campaign included a series of videos from DRF's team on our role in bringing reliable information on digital rights to our beneficiaries and the general public, and the launch of a toolkit for journalists and content creators on keeping themselves safe given the new risks and threats in online spaces. It also brought to the public a series of visual stories from the journalistic frontline of Covid-19.

DRF internship program

DRF is proud to launch this year's internship program and to introduce the brilliant interns of past batches who worked tirelessly to help us in what we do. To learn more about our work and the different projects our interns worked on, click the link below:

2: Policy Initiatives

Cyber Harassment Helpline Report 2019

DRF released the Cyber Harassment Helpline's annual report for 2019 on 24th June. The report highlights complaint trends observed in the previous year as well as policy recommendations for all stakeholders. Concerning trends included an increase in attacks on mobile wallets/e-wallets like EasyPaisa, and phishing attacks targeting people through WhatsApp or text messaging. The Helpline saw a total of 2,023 cases reported, with an average of 146 calls per month in 2019. When compared to the overall complaints the Helpline had received over three years, the calls from 2019 accounted for 45% of all complaints. This shows an alarming increase in the number of cases over time and a disturbing upward trend in cyber harassment.

Link to report:

Forum on Information and Democracy working group to combat infodemic and information chaos

The Forum on Information and Democracy launched its first working group to combat the infodemic and information chaos, and to develop a policy framework to tackle this crisis. Our ED Nighat Dad is part of the forum, whose mission is to assist with the regulation and self-regulation of the online information and communication domain and to implement the goals of the International Partnership for Information and Democracy, which was launched during the UN General Assembly in September 2019 and has now been signed by 37 countries.

Read about the working group here:

Blog on Internet shutdown in Quetta

DRF observed with great concern the internet shutdown in the provincial capital of Quetta earlier last month, which affected the lives of Baloch residents. The effects of cutting off this essential resource extend not only to journalists covering the news in the midst of a pandemic, but also to the safety of all those who must venture out to work while having limited contact with their families, and to all citizens' access to information in the affected area.

Link to the blog post:

Blog on countries wanting to regulate and control VPNs and their access

DRF's intern Areeba Jibril writes about countries' efforts to regulate and control VPNs and how that might directly threaten individuals' privacy, free speech, and elections. She tweets at @AreebaJibril.

Link to blog:

3: Media Engagement

189% increase in cyber-harassment cases during COVID19 (Policy brief)

The Cyber Harassment Helpline released a policy brief on 3rd June delineating the number of cases and complaint trends observed during the lockdown, showing an increase in complaints over this period. The helpline received a combined 136 complaints in March and April, during the lockdown, compared to 47 complaints in January and February before the lockdown, an increase of 189 percent. The majority of cases received during the lockdown months pertained to blackmail through the non-consensual sharing of information, intimate pictures, and videos. Complaints of hate speech, phishing, fake profiles, and defamation were also reported. These complaints were received only through online mediums (email, social media platforms), as the helpline's toll-free number was inactive. The policy brief also suggested measures for the government to tackle this increase in cyber harassment.
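For reference, the 189 percent figure follows directly from the complaint counts reported in the brief (47 before the lockdown, 136 during it); a minimal sketch of the calculation:

```python
def percent_increase(before: int, after: int) -> float:
    """Return the percentage increase from `before` to `after`."""
    return (after - before) / before * 100

# Figures from the policy brief:
# 47 complaints in Jan-Feb (pre-lockdown), 136 in Mar-Apr (lockdown).
print(round(percent_increase(47, 136)))  # prints 189
```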

Link to policy brief:

Press coverage of brief:

DRF on morning show ‘Aaj Pakistan with Sidra Iqbal’

Nighat Dad spoke on the morning show 'Aaj Pakistan with Sidra Iqbal', highlighting how social media and our personal lives are interlinked. She also noted that everyone is vulnerable on social media and that it is important to be aware of what you share online.

Link to show:

Inside Pakistan’s COVID19 Contact Tracing app

DRF spoke to The Diplomat about Pakistan's COVID-19 contact tracing app and how boundaries need to be established by the state when introducing such an app. It was highlighted that the privacy of users should also be considered when the app is introduced.

Read the full article here:

4: Events and Sessions

Human Rights in the Digital era: Business respect for civil and political rights in times of emergency

Nighat Dad participated in the United Nations Virtual Forum on Responsible Business and Human Rights 2020. She spoke on a panel on 'Human Rights in the Digital Era: Business Respect for Civil and Political Rights in Times of Emergency', particularly highlighting the human rights implications of business responses to COVID-19.

Link to details of the session:

APAC Insight 2- True Lies: Misinformation, Censorship and the Open Internet

DRF's ED Nighat Dad spoke in the webcast 'True Lies: Misinformation, Censorship and the Open Internet' hosted by the Internet Society Asia Pacific (ISOC) on 30th June. Nighat emphasized the recent disinformation campaigns in Pakistan and how they are a direct threat to activists in the country.

Link to webcast:

DRF in webinar on How to reduce the digital gender divide in post-pandemic Pakistan

On 16th June, DRF in collaboration with the Swedish embassy held a webinar on 'How to reduce the digital gender divide in post-pandemic Pakistan'. The webinar started with a keynote address from the Swedish ambassador, Ingrid Johansson, followed by presentations from Camila Wagner and Nighat Dad. Camila Wagner is an award-winning journalist specializing in societal challenges who presently runs Klara K, a consultancy in public affairs, crisis management and leadership. Camila explained the digital gender divide across the globe and in Sweden, while DRF's Nighat Dad drew stark comparisons between Pakistan and Sweden and highlighted how much still needs to be done in the country.

Online Violence against women during COVID19 at #WOWGlobal24 - British Council

Nighat Dad spoke about online violence against women during COVID19 at #WOWGlobal24 by the British Council. She shared statistics from the helpline, which saw a 180% increase in online violence complaints in March, when the lockdown was imposed, as opposed to January and February of 2020.


Click here to read about what she said:

Roundtable on cybercrime and harassment

Nighat Dad spoke on the roundtable on 'Cyber Harassment Under the Shadow of Corona: Incidence, Control and Punishment' hosted by the National Initiative against Organized Crime Pakistan. She highlighted how cyber harassment has increased during these times and how the Cyber Harassment Helpline is a resource for people facing harassment online.




Emerging stories: Journalism in times of isolation FPU

DRF's ED participated in Free Press Unlimited's 'Emerging Stories', focusing on journalism in times of isolation. She highlighted the work DRF has done with journalists and the challenges journalists in Pakistan are facing on the ground due to the pandemic.


Link to the panel:


Courting the Law’s discussion on ‘Rights in the Digital Age’

DRF's ED participated in a live discussion on Candid with Pasha on the 'Personal Data Protection Bill 2020: How will it impact individuals and corporates?' on 8th June. She highlighted the discrepancies in the bill and how more work needs to be done on it.

Link to discussion:


Increased cyber harassment during COVID19 on FM101

DRF spoke on FM 101 about how harassment has increased during COVID19 and also shared startling figures from the helpline on the issue.

DRF at the public hearing at the Bundestag highlighting 'Human Rights and Freedom of Association in the Digital Age'

DRF's ED spoke at a Bundestag public hearing focusing on human rights and freedom of association in the digital age. She highlighted how lawmakers need to be more thoughtful and aware of the interventions they propose and how those interventions can have international repercussions for all. She also highlighted the responsibilities of the Facebook Oversight Board and how women and other marginalized groups are vulnerable on the internet.

Aahung panel: Impact of Online Harassment on Pakistani Youth

DRF participated in the webinar on the ways in which the youth are using online spaces and the steps we can take to make them safer for young people.

A recording of the panel can be found here:

UN Women panel: Cyber Security & data privacy during COVID19

This webinar focused on the challenges women were facing regarding online violence and the remedies available to them in Pakistan. It was a vibrant discussion about the shortcomings of current reporting mechanisms and the steps that can be taken to improve them.

Session on online safety with Dastak Organization

DRF conducted an online session with the team at Dastak Charitable Organization, who work as a refuge for women who have faced physical violence and also operate a crisis helpline for them. The session, which was held on the 29th of June, 2020, proved to be informative and interactive and generated positive feedback.

5: Covid19 Updates

24/7 Cyber Harassment Helpline

On 13th June, given the rise of cyber harassment cases following the COVID 19 lockdown, DRF made its Cyber Harassment Helpline operational 24/7 for three months. The helpline is offering its services free of cost to anyone who calls in or reaches out to us via social media or email. These services include legal aid, the digital help desk, and mental health counseling. During these three months, our toll-free number will be accessible every day of the week from 9 AM to 5 PM, and our mental health counselors will work from 10 AM to 9 PM each day. Our mental health counselors are trained professionals and are providing free of cost counseling to women and those from marginalized communities. During all other hours of the day, our team is attending to complaints and queries through online platforms.

Contact the helpline on 080039393 or email us on You can also reach out to us on our social media channels.

Ab Aur Nahin

During COVID19, domestic abuse is at an all-time high, and many victims have nowhere to go. Ab Aur Nahin is a confidential legal and counselor support service specifically designed for victims of harassment.

IWF portal

DRF in collaboration with the Internet Watch Foundation (IWF) and the Global Fund to End Violence Against Children launched a portal to promote children's online safety in Pakistan. The new portal allows internet users in Pakistan to anonymously report child sexual abuse material in three different languages: English, Urdu, and Pashto. The reports will then be assessed by trained IWF analysts in the UK.


July 1, 2020 - Comments Off on Comments on the Consultation & Objections to the Rules

Comments on the Consultation & Objections to the Rules

Soon after the Citizens Protection (Against Online Harm) Rules, 2020 (the ‘Rules’) were notified in January 2020, Digital Rights Foundation (DRF) issued a statement in which concerns regarding the Rules were expressed. It was submitted, and it is reiterated, that the Rules restrict free speech and threaten the privacy of Pakistani citizens. Subsequently, we formulated our Legal Analysis highlighting therein the jurisdictional and substantive issues with the Rules in light of constitutional principles and precedent as well as larger policy questions.

When the Ministry of Information Technology & Telecommunication (MoITT) announced the formation of a committee to consult on the Rules, DRF, along with other advocacy groups, decided to boycott the consultation until and unless the Rules were de-notified. In a statement signed by over 100 organisations and individuals in Pakistan and released on Feb 29, 2020, the consultation process was termed a ‘smokescreen’. Maintaining our position on the consultation, through these Comments, we wish to elaborate on why we endorse(d) this stance, what we believe is wrong with the consultation and how these wrongs can be fixed. This will be followed by our objections to the Rules themselves.

Comments on the Consultations:

At the outset, we at DRF, would like to affirm our conviction to strive for a free and safe internet for all, especially women and minorities. We understand that in light of a fast-growing digital economy, rapidly expanding social media and continuous increase in the number of internet users in Pakistan, ensuring a safe online environment seems to be of much interest. While online spaces should be made safe, internet regulations should not come at the expense of fundamental rights guaranteed by the Constitution of Pakistan, 1973 and Pakistan’s international human rights commitments. A balance, therefore, has to be struck between fundamental rights and limitations imposed in the exercise of those rights. The only way, we believe, to achieve this balance is through meaningful consultations done with the stakeholders and the civil society in good faith.

The drafting process of the Rules, however, has been exclusionary and secretive from the start. It began with a complete lack of public engagement when the Rules were notified, so much so that the Rules only became public knowledge after their notification. Given the serious and wide-ranging implications of the Rules, caution on the Government's part and sustained collaboration with civil society was needed. Instead, the unexpected and sudden notification of the Rules caused alarm to nearly all stakeholders, including industry actors who issued a sharp rebuke of the Rules. It is respectfully submitted that such practices do not resonate with the principles of 'good faith.'

Almost immediately after the notification, the Rules drew sharp criticism locally and internationally. As a result, the Ministry of Information Technology & Telecommunication (MoITT) announced the formation of a committee to begin consultation on the Citizens Protection (Against Online Harm) Rules 2020. However, a consultation at the tail-end of the drafting process will do little good. Public participation before and during the official law-making process is far more significant than any end-phase activity, which would be an eye-wash rather than a meaningful process.

We are concerned with not only the content of the regulations but also with how that content is to be agreed upon. The consultations, which are a reaction to the public outcry, fail to address either of the two. Experience shows that when people perceive a consultation to be insincere, they lose trust not only in the process but in the resulting regulations as well. Therefore, we urge the Federal Government to withdraw the existing Rules (a mere suspension of implementation of Rules is insufficient) through the same process by which they were notified. Any future Rules need to be co-created with meaningful participation of civil society from the start. Without a withdrawal of the Rules, the ‘reactionary’ consultations would be seen as a manipulation of the process to deflect criticism and not a genuine exercise to seek input. Without a withdrawal, it is unlikely the Rules would gain sufficient popularity or legitimacy. Once the necessary steps for withdrawal of notification have been taken, we request the government to issue an official statement mentioning therein that the legal status of the Rules is nothing more than a ‘proposed draft.’ This would mean that anything and everything in the Rules is open for discussion. This would not only demonstrate ‘good faith’ on the government’s part but also show its respect for freedom and democracy.

Even otherwise, it should be noted that the present consultation falls short of its desired purpose inasmuch as it seeks input with respect to the Rules only. The Preamble of the 'Consultation Framework,' posted on PTA's website, lays down the purpose of the consultation as follows: "in order to protect the citizens of Pakistan from the adverse effects of online unlawful content…" It is submitted that to make the internet 'safe' and to 'protect' the citizens would require more than regulations alone. The government should initiate a series of studies to ascertain other methods to effectively tackle online harms. Self-regulatory mechanisms for social media companies, educating users on safety and protective tools with respect to the internet, and capacity-building of law-enforcement agencies to deal with cyber-crimes are some of the options that must be explored if the objective is to protect citizens from online unlawful content. These steps become all the more significant because online threats and harmful content continue to evolve. Additionally, such measures will reduce the burden on the regulators and provide a low-cost remedy to the users. It is reiterated that to effectively address this daunting task, a joint effort between the Government, civil society, law enforcement agencies as well as all social media companies is required.

To that end, a participatory, transparent and inclusive consultative process is needed. While this will help secure greater legitimacy and support for the Rules, at the same time, it can transform the relationship between citizens and their government. First and foremost, the presence of all stakeholders should be ensured. The Government should, in particular, identify those groups and communities that are most at risk and secure their representation in the consultation(s). If transparency and inclusiveness require the presence of all key stakeholders at the table, consensus demands that all voices be heard and considered. Therefore, we request that there should be mechanisms to ensure that the views and concerns of stakeholders are being seriously considered and that compromises are not necessarily based on majority positions.

Given the wide-ranging and serious implications of the Rules, it is necessary that the citizens’ groups and the society at large be kept informed at all stages of drafting these regulations. The drafters should also explain to the people why they have produced the text they have: what was the need for it and what factors they considered, what compromises they struck with the negotiators and why, and how they reached agreements on the final text.

Finally, input from the civil society and stakeholders should be sought to define words and certain phrases of the Rules that create penal offences. Many of the definitions and terms in the Rules, (which will be discussed shortly), are either loosely defined or lack clarity. It is suggested that discussions be held to determine clear meanings of such terms.

Objections to the Rules:
1: The Rules exceed the scope of their Parent Acts:

The Rules have been notified under provisions of the Pakistan Telecommunication (Re-organisation) Act, 1996 (PTRA) and the Prevention of Electronic Crimes Act (PECA) 2016 (hereinafter collectively referred to as the ‘Parent Acts’). The feedback form on the PTA website notes that the Rules are formulated “exercising statutory powers under PECA section 37 sub-section 2.” Under the Rules, the Pakistan Telecommunication Authority (PTA) is the designated Authority.

It is submitted that the scope and scale of action defined in the Rules go beyond the mandate given under the Parent Acts. The government is reminded that rules cannot impose or create new restrictions that are beyond the scope of the Parent Act(s).

It is observed that Rule 3 establishes the office of a National Coordinator and consolidates (through delegation) the powers granted to the Authority under PECA and PTRA to control online regulation. While we understand that Section 9 of PTRA allows delegation of powers, no concept of delegation of powers exists under PECA. Therefore, to pass on the powers contained in PECA to any third body is a violation of the said Act. Even under PTRA, powers can only be delegated to the chairman/chairperson, member or any other officer of PTA (re: Section 9), in which case, the autonomy and independence of the National Coordinator would remain questionable.

Without conceding that powers under PECA can be delegated to the National Coordinator, it is still submitted that Rule 7 goes beyond the scope of PECA (and also violates Article 19 of the Constitution which will be discussed later). Section 37(1) of PECA grants power to the Authority to “remove or block or issue directions for removal or blocking of access to an information through any information system....” While Section 37(1) confers limited powers to only remove or block access to an information, the powers to remove or block the whole information system or social media platform (conferred upon the National Coordinator by virtue of Rule 7) is a clear case of excessive delegation.

Further, the Rules require social media companies to deploy proactive mechanisms to ensure prevention of live streaming of fake news (Rule 4) and to remove, suspend or disable accounts or online content that spread fake news (Rule 5). Rule 5(f) also obligates a social media company that “if communicated by the Authority that online content is false, put a note to that effect along with the online content”. It is submitted that the powers to legislate on topics like fake news and misinformation are not granted under the Parent Acts. It is also unknown where the wide powers granted to the National Coordinator under Rule 3(2) to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies and to summon official representatives of social media companies stem from. These aforementioned Rules are, therefore, ultra vires the provisions of the Parent Acts.

  • Remove the body of the National Coordinator and the powers it wrongly assumed through PECA (re: the power to block online systems (Rule 7)).
  • If any such power is to be conferred, then limit that power to removal or suspension of an information on any information system as opposed to the power to block the entire online system.
  • Establish criteria for the selection and accountability of the National Coordinator.
  • Ensure autonomy and independence of the National Coordinator.
  • Introduce mechanisms, through introduction of a public database or directory, to ensure transparency from any authority tasked with regulation and removal of content on the content removed and the reasons for such removal.
  • Omit provisions regulating ‘fake news’ as they go beyond the scope of Parent Acts.
  • Omit Rule 5(f) i.e. obligation to issue fake news correction, as it goes beyond the scope of the Parent Acts. Alternatively, social media companies should be urged to present ‘fact checks’ to any false information.
  • Exclude from the powers of the National Coordinator the ability to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies and to summon official representatives of social media companies, contained in Rule 3(2), as they go beyond the scope of the Parent Acts.
2: Arbitrary Powers:

It is observed that the Rules have granted arbitrary and discretionary powers to the National Coordinator and, at the same time, have failed to provide any mechanisms against the misuse of these powers.

Rule 4 obligates a social media company to remove, suspend or disable access to any online content within twenty-four hours, and in emergency situations within six hours, after being intimated by the Authority that any particular online content is in contravention of any provision of the Act, or any other law, rule, regulation or instruction of the National Coordinator. An ‘emergency situation’ is also to be exclusively determined by this office. On interpretation or permissibility of any online content, as per Rule 4 (2), the opinion of the National Coordinator is to take precedence over any community standards and rules or guidelines devised by the social media company.

It is submitted that this grants unprecedented censorship powers to a newly appointed National Coordinator, who has the sole discretion to determine what constitutes 'objectionable' content. These are extremely vague and arbitrary powers, and the Rules fail to provide any checks and balances to ensure that such requests will be used in a just manner. It is trite law that a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. However, Rule 4 encourages arbitrary and random acts and bestows upon the National Coordinator unfettered discretion to regulate online content instead of even remotely attempting to provide any safeguards against abuse of power. Moreover, the power granted under Rule 5 to the National Coordinator to monitor the falsehood of any online content adds to the unfettered powers of the National Coordinator. It is concerning that while the National Coordinator has been granted extensive powers, including quasi-judicial and legislative powers to determine what constitutes online harm, the qualifications, accountability, and selection procedure of the National Coordinator remain unclear. This will have a chilling effect on the content removal process as social media companies will rush content regulation decisions to comply with the restrictive time limit, even in particularly complicated cases of free speech that require deliberation and legal opinions. Furthermore, smaller social media companies, which do not have the resources and automated regulation capacities that big tech companies such as Facebook or Google possess, will be disproportionately burdened with urgent content removal instructions.

Further, Rule 6 requires a social media company to provide to the Investigation Agency any information, data, content or sub-content contained in any information system owned, managed or run by the respective social media company. It is unclear whether the Investigation Agency is required to go through any legal or judicial procedure to make such a request, or whether it is required to notify or report to a court on seizure of any such information. Under the current PECA regulations, there is still a legal process through which information or data of private users can be requested. This Rule, however, totally negates that process and gives the National Coordinator sweeping powers to monitor online traffic. The power under Rule 6 exceeds the ambit of Section 37 and runs parallel to data request procedures established with social media companies.

  • Re-consider the 24-hour time limit for content removal. It would be unreasonable to impose such a strict timeline, especially for content that relates to private wrongs/disputes such as defamation and for complicated cases of free speech.
  • Insert a “Stop the Clock” provision by listing out a set of criteria (such as seeking clarifications, technical infeasibility, etc.) under which the time limit would cease to apply to allow for due process and fair play in enforcing such requests.
  • Formulate clear and predetermined rules and procedures for investigations, seizures, collection and sharing of data.
  • Rule 4 should be amended and the Authority tasked with removal requests be required to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarifications.
  • National Coordinator should not be the sole authority to determine what constitutes ‘objectionable’ online content; neither can this be left open for the National Coordinator to decide from time to time through its ‘instructions’.
  • Remove the powers to request, obtain and provide data to Investigating Agencies.
3: Vague Definitions:

It is an established law that “the language of the statute, and, in particular, a statute creating an offence, must be precise, definite and sufficiently objective so as to guard against an arbitrary and capricious action on part of the state functionaries.” Precise definitions are also important so that social media companies may regulate their conduct accordingly.

A fundamental flaw within these Rules is their vague, overly broad and extremely subjective definitions. For example, extremism (Rule 2(d)) is defined as 'violent, vocal or active opposition to fundamental values of the state of Pakistan including...' It does not, however, define what constitutes or can be referred to as fundamental values of the state of Pakistan. Given the massive volume of content shared online, platforms may feel obliged to take a 'better safe than sorry' approach, which in this case would mean 'take down first, ask questions later (or never).' This threatens not only to impede legitimate operation of (and innovation in) services, but also to incentivize the removal of legitimate content. Moreover, an honest criticism or a fair comment made regarding the Federal Government, or any other state institution, runs the risk of being seen as 'opposition,' as this word also lacks clarity.

Similarly, while social media companies are required to ‘take due cognizance of the religious, cultural, ethnic and national security sensitivities of Pakistan’ (Rule 4(3)), the Rules fail to elaborate on these terms. Further, ‘fake news’ (Rule 4(4) & Rule 5(e)) has not been defined which adds to the ambiguity of the Rules. It is submitted that vague laws weaken the rule of law because they enable selective prosecution and interpretation, and arbitrary decision-making.

Rule 4(4) obligates a social media company to deploy proactive mechanisms to ensure prevention of live streaming of any content with regards to, amongst other things, 'hate speech' and 'defamation.' It should be noted that 'hate speech' and 'defamation' are both defined and considered as offences under PECA as well as the Pakistan Penal Code, 1860 ('PPC'). It is also posited that determination of both these offences requires a thorough investigation and a trial under both of these laws. It is submitted that if a trial and investigation are necessary to determine these offences, then it would be nearly impossible for social media companies to 'proactively' prevent their live streaming. Additionally, social media companies already take down such material based on their community guidelines, which cover areas such as public safety, hate speech and terrorist content. For instance, during the Christchurch terrorist attack, while Facebook was unable to take down the livestream as it was happening, AI technology and community guidelines were used to remove all instances of the video from the platform within hours of the incident. The Rules, however, impose an unnecessary burden on social media companies, and if any content is hastily removed as being hateful or defamatory, without a proper determination or investigation, such removal would not only implicate the person who produced or transmitted the content (given these are penal offences under PECA & PPC) but also condemn them unheard. Even otherwise, hate speech and defamation are entirely contextual determinations, where the illegality of material is dependent on its impact. Impact on viewers is impossible for an automated system to assess, particularly before or while the material is being shared.

It is also noted that Rule 4(4) is in conflict with Section 38(5) of PECA, which expressly rejects imposition of any obligation on intermediaries or service providers to proactively monitor or filter material or content hosted, transmitted or made available on their platforms.

  • It is suggested that discussions be held amongst all stakeholders to determine clear and precise meanings of the following terms:
    • Extremism
    • Fundamental Values of the State of Pakistan
    • Religious, cultural and ethical sensitivities of Pakistan
    • National Security
    • Fake News
  • Use alternate methods of countering hateful and extremist speech through investment in independent fact-checking bodies and funds for organisations developing counter-speech, particularly those tackling online speech against women, gender, religious and ethnic minorities.
  • Formulate clear components of ‘active or vocal opposition’ to ensure it cannot be used to silence dissenting opinions.
  • Omit Rule 4(4) as it violates Section 38 (5) of PECA.
  • Content constituting ‘hate speech’ and ‘defamation’ should not be removed without a proper investigation.
4: Violates and Unreasonably Restricts Fundamental Rights:

The Rules, as they stand, pose a serious danger to fundamental rights in the country. In particular, the breadth of the Rules' restrictions, and the intrusive requirements that they place on social media platforms, would severely threaten online freedom of expression and the rights to privacy and information.

It is submitted that Rule 4 is a blatant violation of Article 19 (freedom of speech, etc.) of the Constitution. It exceeds the boundaries of permissible restrictions within the meaning of Article 19, lacks the necessary attributes of reasonableness and is extremely vague in nature. Article 19 states that restrictions on freedom of expression must be "reasonable" under the circumstances, and must be in aid of one of the legitimate state interests stated therein ("in the interests of the glory of Islam, integrity, security, or defence of Pakistan…"). The Rules, however, require all social media companies to remove or block online content if it is, among other things, in "contravention of instructions of the National Coordinator." It is to be noted that deletion of data on the instructions of the National Coordinator does not fall under the permissible restrictions of Article 19, as it is an arbitrary criterion for the restriction of fundamental rights. Furthermore, a restriction on freedom of speech may only be placed in accordance with 'law', and an instruction passed by the National Coordinator does not qualify as law within the meaning of Article 19.

It must also be noted that Rule 7 (Blocking of Online System) is a violation of Article 19 of the Constitution, which only provides the power to impose reasonable 'restrictions' on free speech in accordance with law. It is submitted that in today's digital world, online systems allow individuals to obtain information and to form, express and exchange ideas, and are mediums through which people express their speech. Hence, entirely blocking an online system would be tantamount to blocking speech itself. The power to 'block' cannot be read under, inferred from, or assumed to be a part of the power to 'restrict' free speech. It was held in the Civil Aviation Authority case that "the predominant meanings of the said words (restrict and restriction) do not admit total prohibition. They connote the imposition of limitations of the bounds within which one can act..." Therefore, while Article 19 allows imposition of 'restrictions' on free speech, the power to 'block' an information system entirely exceeds the boundaries of permissible limitations under it and is a disproportionate method of achieving the goal of removing harmful content on the internet, rendering Rule 7 inconsistent with the Constitution as well (as discussed previously, Rule 7 also goes beyond the scope of Section 37(1) of PECA).

As has already been discussed above, a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. Rule 4 violates this principle by encouraging arbitrary and random acts and bestowing upon the National Coordinator unfettered discretion to regulate online content without providing any safeguards against abuse of power. The Rules do not formulate sufficient safeguards to ensure that the power extended to the National Coordinator would be exercised in a fair, just, and transparent manner. The power to declare any online content 'harmful' and to search and seize data, without any mechanism for questioning the authority, threatens the privacy and free speech of both companies and the people.

The fact that the government has asked social media companies to provide any and all user information or data in a 'decrypted, readable and comprehensible format', including private data shared through messaging applications like WhatsApp (Rule 6), and that too without defining any mechanisms for gaining access to the data of anyone being investigated, shows that it is concerned with neither the due procedure of the law nor the potential violations of citizens' right to privacy.

Finally, Rule 5 obligates social media companies to put a note along with any online content that is considered or interpreted to be 'false' by the National Coordinator. Not only does this provision add to the unfettered powers of the National Coordinator, to be exercised arbitrarily, but it also makes the Coordinator in charge of policing truth. This violates the principle of freely forming an 'opinion' (a right read under Article 19), as the National Coordinator now decides, or dictates, what is true and what is false.

  • Amend Rule 4 and exclude from it the words “the instructions of the National Coordinator” as the same violates Article 19 of the Constitution.
  • Omit Rule 7 as it violates Article 19 and does not fall under the ‘reasonable restrictions’ allowed under the Constitution.
  • Formulate rules and procedures for investigations, seizures and collection of data which are in line with due process safeguards.
  • Rule 4 should be amended to require the regulatory body to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarifications.
  • The authority tasked with content removal should not be the sole authority to determine what constitutes ‘objectionable’ online content; nor should it be left open to the authority to decide this from time to time through its ‘instructions’.
5: Data Localisation:

Rule 5 obligates social media companies to register with the PTA within three months of these Rules coming into force. It requires a social media company to establish a permanent registered office in Pakistan with a physical address located in Islamabad and to appoint a focal person based in Pakistan for coordination with the National Coordinator.

It is submitted that the requirement for registering with the PTA and establishing a permanent registered office in Pakistan, before these companies can be granted permission to be viewed and/or provide services in Pakistan, is a move towards “data localisation” and challenges the borderless nature of the internet - a feature that is intrinsic to the internet itself. Forcing businesses to create a local presence is outside the norms of global business practice and can potentially force international social media companies to exit the country rather than invest further in Pakistan. It is unreasonable to expect social media companies to set up infrastructure in the country when the nature of the internet allows for it to be easily administered remotely. With the increase in compliance costs that comes with incorporation of a company in Pakistan, companies across the globe including start-ups may have to reconsider serving users in Pakistan. Consequently, users in Pakistan including the local private sector may not be able to avail a variety of services required for carrying out day-to-day communication, online transactions, and trade/business related tasks. Many businesses and organisations across Pakistan rely on the services provided by social media companies, particularly during the Covid-19 pandemic when reliance on the internet has increased substantially; the Rules will thus have an indirect impact on the economy as well. The proposed requirements of local incorporation and physical offices will also have significant repercussions for taxation, foreign direct investment and other legal matters, while negatively impacting economic growth.

To effectively defend against cybercrimes and threats, companies protect user data and other critical information via a very small network of highly secure regional and global data centers staffed with uniquely skilled experts who are in scarce supply globally. These centers are equipped with advanced IT infrastructure that provides reliable and secure round-the-clock service. The clustering of highly-qualified staff and advanced equipment is a critical factor in the ability of institutions to safeguard data from increasingly sophisticated cyber-attacks.

Mandating the creation of a local data center will harm cybersecurity in Pakistan by:

  • Creating additional entry points into IT systems for cyber criminals.
  • Reducing the quality of cybersecurity in all facilities around the world by spreading cybersecurity resources (both people and systems) too thin.
  • Forcing companies to disconnect systems and/or reduce services.
  • Fragmenting the internet and impeding global coordination of cyber defense activities, which can only be achieved efficiently and at scale when and where the free flow of data is guaranteed.

Preventing the free flow of data:

  • Creates artificial barriers to information-sharing and hinders global communication;
  • Makes connectivity less affordable for people and businesses at a time when reducing connectivity costs is essential to expanding economic opportunity in Pakistan, boosting the digital economy and creating additional wealth;
  • Undermines the viability and dependability of cloud-based services in a range of business sectors that are essential for a modern digital economy; and
  • Slows GDP growth, stifles innovation, and lowers the quality of services available to domestic consumers and businesses.

The global nature of the Internet has democratised information, which is now available to anyone, anywhere around the world, in an infinite variety of forms. The economies of scale achieved through globally located infrastructure have contributed to the affordability of services on the Internet, where several prominent services are available for free. Companies are able to provide these services to users even in markets that may not be financially sustainable because they do not have to incur the additional cost of setting up and running local offices and legal entities in each country where they offer services. Therefore, these Rules will harm the consumer experience on the open internet, increasing costs to the extent that offering services and technologies to consumers in Pakistan becomes financially unviable.

  1. Scrap Rule 5 and abandon the model of data localisation as it discourages business and weakens data security of servers;
  2. Develop transparent and legally-compliant data request and content removal mechanisms with social media companies as an alternative to the model proposed in Rule 5.
Concluding Remarks:

We have discussed that the current consultations lack the essentials of ‘good faith’, which demands a reexamination of the entire framework. We have also discussed that the Rules exceed the scope of their Parent Acts, accord arbitrary powers to the National Coordinator, use vague definitions and unreasonably restrict fundamental rights, which makes them liable to be struck down. In light of the above, we call upon the government to immediately withdraw the Rules and initiate the consultation process from scratch. The renewed consultation should be premised on tackling ‘online harm’ instead of a discussion on the Rules alone. Consensus should be reached on the best ways to tackle online harms. This would require comprehensive planning, transparent and meaningful consultations with stakeholders, and participation of civil society. Until this is done, Digital Rights Foundation will disassociate itself from any government initiatives that are seen as disingenuous efforts to deflect criticism.


June 22, 2020 - Comments Off on DRF Condemns Move Against Open Source Technology and OTF

DRF Condemns Move Against Open Source Technology and OTF

In a world where online freedoms are increasingly under threat from all sides, organisations that work on supporting a free and safe internet are more important than ever. This is why Digital Rights Foundation (DRF) is extremely worried by developments within the US government that might undermine the work Open Tech Fund (OTF) does.

Serious concerns over the future of OTF were raised last week when it emerged that the new head of the United States Agency for Global Media (USAGM) was planning to push money and funding towards closed-source tools. OTF is an independent non-profit grantee of the USAGM and has been supporting organisations, journalists, human rights defenders and users by funding innovative and open-source projects which uphold internet freedoms across the world. This move prompted Libby Liu, the inaugural OTF CEO, to step down from her position, citing concerns that the new head of the USAGM would interfere in “the current FY2020 OTF funding stream and redirect some of our resources to a few closed-source circumvention tools."

OTF was one of the first supporters of DRF’s cyber harassment helpline, which has provided assistance to over 4000 individuals across Pakistan and continues to support journalists, activists, HRDs, women, children and vulnerable groups during the Covid-19 pandemic. The planned move by the USAGM threatens this support and similar work that OTF does with organisations globally. In the last eight years, OTF’s projects have “enabled more than 2 billion people in over 60 countries to safely access the internet.”

As beneficiaries, both directly and indirectly, of the tools that OTF supports, we urge the US Congress to take concrete and immediate steps to ensure that OTF continues to support open-source and digital rights projects all over the world. We echo the demands made by the ‘Save Internet Freedom Tech’ coalition, including the call for “all US-Government internet freedom funds to be awarded via an open, fair, competitive, and evidence-based decision process.”

The internet has enabled us to innovate, connect and thrive, particularly during the Covid-19 pandemic. We believe that internet freedom is the bedrock of what makes all these things possible on the internet, and organisations such as OTF which support the work of internet freedom are central to this foundation.

June 19, 2020 - Comments Off on Virtual ‘Private’ Networks no Longer Private as PTA Requires Registration

Virtual ‘Private’ Networks no Longer Private as PTA Requires Registration

Areeba Jibril is a DRF intern focusing on issues related to privacy, free speech, and elections. She tweets at @AreebaJibril

Finding a Virtual Private Network (VPN) provider in Pakistan is easy. A quick Google search will pull up multiple free services. Casual internet users may register for these services to circumvent paywalls and access online content that has been blocked in Pakistan. They can do this without even really knowing what they’re signing up for. More sophisticated users may use VPNs to ensure that their IP addresses, and therefore their geographical location and identity, remain hidden from the websites they visit.

What casual users likely don’t know is that the Pakistan Telecommunication Authority (PTA) has announced a registration requirement for all Virtual Private Networks (VPNs) by 30th June 2020. This is twenty-two days after it first posted a public service announcement on its website. The PTA regulations do not ban the use of VPNs entirely, but they do require users to register their VPN use with their Internet Service Providers (ISPs). To do this they must share their CNIC number, the purpose for which they would like to use a VPN, and which IP address they will be using their VPN with. The privacy intrusion is not limited to this information. The notification is vague, so it is difficult to say with authority the extent of the privacy intrusions that may come about. There is online speculation about the extent of information that the government can freely request from non-VPN users and whether the same practices will apply to VPN users as well.

The Pakistani government claims they’ve added this requirement to support the Information and Communications Technology (ICT) industry and promote the “safety of telecom users.” But requiring registration of VPNs defeats the purpose for which VPNs were created. VPNs cannot be private if they must be registered with ISPs, who are then required to share the information with the government. The information flow doesn’t stop there – the government has contracted with Sandvine Corporation, a US-based company, to monitor ‘grey’ internet traffic.

The 10th June announcement downplays its own significance by claiming that the requirement is “not new”. It is true that users have been reporting their VPNs suddenly stopping working since 2011. However, this new announcement includes the threat of legal consequences, without much clarity on what those consequences will be. The drastic consequences for privacy do not need to be new to be concerning. The PTA claims to be using its authority under clause 4(6) of the Monitoring and Reconciliation of International Telephone Traffic Regulations (MRITT), 2010.

VPNs can be helpful for the average internet user when they want to access content such as television shows that aren’t otherwise available in Pakistan. But they serve a much more important purpose in promoting freedoms of opinion and expression by protecting the privacy of users. By using a VPN, users can ensure that the websites they visit and the content they post cannot be traced back to them. For many, anonymity is an important part of what makes the internet a safe place.

David Kaye, the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, noted, “Encryption and anonymity provide individuals and groups with a zone of privacy online to hold opinions and exercise freedom of expression without arbitrary and unlawful interference or attacks… A VPN connection, or use of Tor or a proxy server, combined with encryption, may be the only way in which an individual is able to access or share information in [environments with prevalent censorship].”

As the list of registered VPN users will be shared with ISPs, the risk of private information being accessed by those with malicious intent will increase dramatically. Without the ability to hide their physical location, users will be in greater danger if they use the internet to communicate discontent with the government and seek help anonymously. 

Some users may decide they cannot risk this intrusion into their privacy and refuse to register their VPNs. It is unclear how these users will be treated. The government can request that non-registered users have their VPNs blocked. However, it has also said that users who fail to register their VPNs can face legal consequences if they cause “loss to the national exchequer.” It maintains that it is adding this requirement to terminate “illegal traffic.” These vague terms should be a great cause for concern. What is illegal traffic? What will be considered a “loss to the national exchequer”? When will users be held legally accountable for failing to register their VPNs? The lack of guidance increases the risk that these laws will be used to target political dissidents and unpopular speech.

The notification concerning VPNs, coupled with the news from a few months back regarding ‘Deep Packet Inspection’ (DPI), poses a serious threat to the online privacy and security of ordinary Pakistani citizens. DPI allows unprecedented access to a private individual’s activity online. The added issue with DPI technology is that the government has been remarkably silent on how it plans to use the technology and what its purpose is. This silence and general vagueness is similar to what we are now witnessing with this notification regarding VPNs in the country.

Pakistan is not alone in regulating the use of VPNs. Belarus, China, Iran, Turkey, Iraq, Syria, Oman, Russia, Uganda, the UAE, and Venezuela have either introduced measures to restrict the use of VPNs or banned their use outright. Iran allows the use of VPNs, but only if providers are Iranian, while Russia bans VPN usage for sites that have previously been blocked by Russia’s governing body for telecommunications and mass media communications. Consequences for using VPNs are also wide-ranging. In China, the government has gone so far as to arrest a VPN provider. In Oman, private users face a 500 rial fine (approximately $1,300 USD).

Given the human and digital rights track record of these countries, this is not a list of countries that Pakistan should want to be on.  
