Blog Archives

Archives for July 2020

July 21, 2020

Digital Rights Foundation expresses concern regarding banning of popular social media applications TikTok and Bigo Live

The Pakistan Telecommunication Authority (PTA) banned the popular social media application Bigo Live and issued a final warning to TikTok via press release on July 20, 2020, purportedly acting on a ‘number of complaints’ against the alleged ‘immoral, obscene and vulgar content’ on these applications. Additional reasons for the ban and warning included ‘their extremely negative effects on society in general and youth in particular.’ As an organisation working in the field of digital rights and freedoms for nearly a decade, Digital Rights Foundation (DRF) sees this as a blatant violation of the Constitutional right to freedom of expression online and urges the PTA to reconsider its approach to the safety of minors. These measures and warnings call for a fundamental reflection on the censorship laws in place in Pakistan.

Pakistan has a long history of bans on social media platforms. In 2010, the Lahore High Court directed the government to block access to Facebook, a ban that lasted a few days. Similarly, YouTube was blocked in Pakistan for three years. The Islamabad High Court issued orders in 2018 directing the PTA to swiftly deal with illegal content online, threatening that otherwise the courts would be compelled to block social media websites such as Twitter and Facebook. Last year the PTA reported to the National Assembly Standing Committee on Information Technology and Telecom that a total of 900,000 URLs had been blocked in the country. More recently, the PTA, acting upon directions of the Lahore High Court, ‘temporarily banned’ the popular mobile-based game PUBG. Earlier this month, a petition was filed at the Lahore High Court calling for a ban on TikTok.

The internet is a medium for expression, ranging from political to artistic content, and for access to information, including vital health information, news and entertainment. Applications such as TikTok and Bigo Live are video-based platforms used by a diverse set of users for expressing their right to speech and accessing content, both rights guaranteed under Articles 19 and 19A of the Constitution of Pakistan. Any curtailment of these rights must be proportionate in nature, necessary to address the specific harm being caused and established by law. Blanket bans on social media platforms are neither proportionate, nor necessary to the harm stated by the PTA, nor justified by law.

Firstly, the criteria of ‘obscenity’, ‘vulgarity’ and ‘immorality’ used by the PTA are extremely vague, and no objective legal standard has been employed by the Authority in taking its decision. This was also reflected in the petition against TikTok submitted before the Lahore High Court, which contended that users on the app are “spreading nudity and pornography for the sake of fame and rating”. It is not lost on us that the criteria of obscenity are often articulated in gendered terms and were the same justification used by petitions calling for a ban on the Aurat March earlier this year. Platforms such as TikTok and Bigo provide young people space to express themselves freely, in ways that they cannot in other spaces of society. It is also no coincidence that the user base of TikTok and Bigo is extremely diverse, consisting of users from different classes and genders. Unlike Facebook and Twitter, apps such as TikTok are not text-heavy and thus lend themselves to a more diverse user base, as lack of literacy is no longer a barrier. Justifications based on ‘vulgarity’ and ‘obscenity’ are often stand-ins for society’s discomfort with expression that deviates from gendered norms, and they carry classist assumptions about what constitutes ‘respectable’, ‘acceptable’ content.

The popularity of TikTok and Bigo among Pakistanis should be celebrated: these platforms provide an avenue for artistic expression and a space for connection with one another. The content on these applications, no matter how frivolous or ‘silly’, is protected speech, and is important for any society that values culture. It is not for the courts, the government or the PTA to judge whether it is valuable or beneficial for society. Additionally, these platforms provide content creators with an opportunity to earn revenue from their live streams and creative videos. Many Pakistanis are among the top professional online gamers, using platforms such as PUBG, and there is a burgeoning local e-sports culture. In a post-COVID-19 economy, as prospects for employment for the youth are rapidly shrinking, taking away these economic opportunities could devastate the livelihoods of many.

While the PTA does not allude to it, many have cited the horrific incident of gang rape in Lahore as further justification for the ban. The victim/survivor was a user of TikTok who allegedly met her rapist over the app and agreed to meet in person to record a video, which is when the incident took place. Violence against women, including rape, is a systemic issue that predates any social media application and will, unfortunately, continue even if the internet were banned in Pakistan. This rhetoric of ‘protecting women’ is part of an old playbook: women’s safety issues are hijacked to enact paternalistic, heavy-handed measures which do very little to tackle systemic violence against gendered bodies or dismantle patriarchal structures, but rather seek to restrict freedoms. As a digital and women’s rights organisation, we have witnessed such justifications being used by the government to pass draconian legislation such as the Prevention of Electronic Crimes Act in 2016, which has done little to protect women and gender minorities in online spaces. The desire to police women’s bodies and expression is again apparent when the criteria of obscenity and vulgarity are invoked to limit internet freedoms.

Secondly, it is disingenuous to argue that these platforms are being used to negatively influence the youth, as even a cursory look at the content on TikTok and Bigo reveals that this is patently not the case. The community guidelines of these platforms prohibit pornography and content harmful to minors. Any content violating these policies is either removed automatically by moderation algorithms or can be reported in-app. When a proportionate remedy exists for the alleged harm caused, there is no justification for banning entire applications that host diverse content. If any content on an application is deemed to contain hate speech or cause harm to minors, it should be reported as an individual piece of content, either to the social media company for removal or to the relevant law enforcement agencies. Individual pieces of content cannot be used to justify the banning of an entire platform; such a move would be grossly disproportionate.

Additionally, social media companies cannot be held liable for individual pieces of content on their platforms. While each company should be required to have mechanisms for the removal and monitoring of harmful content in place, the principle of intermediary liability holds that these platforms cannot be held liable for every piece of content that is posted. Users are well within their rights to demand that social media companies have adequate mechanisms and systems in place to protect the most vulnerable groups using their platforms; however, platforms cannot be expected to guarantee that no harmful content will ever be posted. As long as there are systems in place to detect, report and remove such content when it does appear, the companies are acting within the scope of their limited liability.

Thirdly, the justification of these bans as ‘protecting’ children and the youth is akin to banning highways to prevent road accidents. The mental health, wellbeing and safety of children and young adults should be a concern for us all; however, banning applications is a paternalistic solution to a problem whose root causes lie beyond individual applications or even the internet. Mental health problems are an epidemic worldwide, and Pakistan in particular lacks the infrastructure to provide quality mental healthcare to its citizens, including the youth. In a country with a massive youth bulge, it is concerning that avenues for expression and entertainment, which are vital parts of intellectual and emotional growth, are extremely limited. A 2018 UNDP study reported that a majority of Pakistani youth do not have access to recreational facilities or events: 78.6% do not have access to parks; 97.2% lack access to live music; 93.9% lack access to sporting events; 97% cannot access cinemas and 93% are denied access to sports facilities. By banning applications primarily used by the youth, the state will be denying them a platform for self-expression in a society that already lacks alternatives.

Fourthly, individual pieces of content can be reported to the PTA under section 37 of the Prevention of Electronic Crimes Act 2016 (PECA). If individual accounts or pieces of content violate the criteria for removal, the PTA has the power to “remove or block or issue directions for removal or blocking of access to an information”. This must be done through an order passed by the PTA, and such an order must be communicated to the aggrieved party, who has the right under section 37(4) to ask for a review of the order for blocking or removal. Furthermore, an appeal to the relevant high court against the decision of the PTA also lies under section 37(5). As a digital rights organisation, we believe that these powers are already too broad and need to be reviewed; it is therefore all the more concerning that in banning entire platforms the PTA is arbitrarily exceeding even these excessive powers.

This month TikTok, along with 58 other Chinese-owned apps, was banned by the Modi-led government in India as a result of its strained relations with China. Statements by the US government suggest similar plans may be under consideration. While there are legitimate concerns to be raised regarding the content regulation and privacy policies of the application, these decisions do not seem to be driven by genuine concern for users’ rights but rather form part of a larger geostrategic move to isolate China. For a country that has repeatedly raised the alarm regarding the fascist tendencies of the Modi government, it is surprising that the government in Pakistan is taking a similarly heavy-handed approach to internet censorship.

We demand an overhaul of the internet regulation regime in place, as it is extremely arbitrary and violates the principles of freedom of expression and access to information, enshrined not only in international conventions that Pakistan has ratified but also in its own Constitution. These individual cases point towards a wider trend of shrinking online freedoms. As concerned citizens, we demand:

  1. The bans on PUBG and Bigo Live be lifted immediately, and the warnings issued to TikTok be reconsidered;
  2. Section 37 of the Prevention of Electronic Crimes Act 2016 be repealed;
  3. The government move towards a model of self-regulation that is compliant with international human rights standards;
  4. Transparency from the PTA regarding the content that is reported to the Authority and publicly available orders which delineate the reasons for removing/blocking specific content;
  5. A comprehensive and welfare-based plan for the protection of children and adolescents which includes investment in digital literacy, access to mental health counselling and programs for the performing arts; and
  6. Immediate de-notification of the Citizens Protection (Against Online Harm) Rules and abandonment of the approach taken in the Rules as a viable mechanism for regulating the internet.
References:
https://globalfreedomofexpression.columbia.edu/cases/ali-v-pakistan-the-case-of-the-facebook-ban/
https://www.reuters.com/article/us-pakistan-youtube/pakistan-lifts-ban-on-youtube-after-launch-of-local-version-idUSKCN0UW1ER
https://www.dawn.com/news/1507590
https://www.dawn.com/news/1566352
https://www.dawn.com/news/1568996/court-moved-for-ban-on-tiktok
https://www.dawn.com/news/1536977/freedom-of-expression-cant-be-banned-lhc-seeks-police-response-on-petition-against-aurat-march
https://www.dawn.com/news/1404423
On the impact of the TikTok ban in India: https://www.npr.org/2020/07/16/890382893/tiktok-changed-my-life-india-s-ban-on-chinese-app-leaves-video-makers-stunned
https://edition.cnn.com/2020/07/07/tech/us-tiktok-ban/index.html

July 13, 2020

June 2020: DRF launches Digital 50.50 e-magazine

1: Online campaigns and initiatives

Digital 50.50

Digital Rights Foundation launched its newest initiative, Digital 50.50, a monthly feminist e-magazine that aims to create an open space for ideas, opinions, art, and discourse on a wide array of topics in the digital rights arena from an intersectional feminist lens. The idea behind launching Digital 50.50 is rooted in our organization’s goal to make online spaces safer and more equal for womxn journalists and to support them in providing reliable information to the public. To ensure gender-sensitive reporting and amplify women’s voices, Digital 50.50 believes that the inclusion of women is a must in all roles - from content creators and editors to experts, sources, and subjects of stories and art. The theme for the first issue was ‘Impact of Covid-19 on women and girls in online and offline spaces’, and it covered a diverse range of features on how the coronavirus pandemic has transformed the work that journalists do, the rise of incels online, increasing cases of domestic abuse, and the economic issues faced by women workers and laborers.

Statement by DRF to save OTF

The US government recently announced that it would shift money and funding towards closed-source tools, a move with considerable adverse effects on the Open Technology Fund (OTF). OTF’s work is under threat; it has been supporting countless journalists, human rights defenders, and organizations working on human rights. DRF condemns this move against open source technology and OTF.

Read our full statement here: https://digitalrightsfoundation.pk/drf-condemns-move-against-open-source-technology-and-otf/

Campaign on Digital Laws Asia

DRF, in partnership with APC, ran a day-long campaign highlighting the legal landscape of Pakistan with regards to its digital laws. The campaign was held across South Asia on the same day and was meant to offer a cross-country comparison of the region. DRF particularly highlighted the Citizens Protection (Against Online Harm) Rules 2020, which restrict citizens’ freedom of expression and privacy online. DRF and other organizations have also issued a statement demanding that no consultation be held on the Rules without their withdrawal.

Click to read our statement on the rules: https://digitalrightsfoundation.pk/drf-condemns-citizens-protection-against-online-harm-rules-2020-as-an-affront-on-online-freedoms/

No Consultation Without Withdrawal of Rules:

https://drive.google.com/file/d/1pvTqMRF3_fWaH5gQg6DNa9G2ldpAdY_3/view

COVID-19 contact tracing

DRF analyzed the COVID-19 contact tracing app and raised concerns regarding possible surveillance through the app and the safety of users’ data.

Read our full analysis here: https://digitalrightsfoundation.pk/covid-19-gov-pk-the-tech-to-battle-coronavirus/

Campaign on ‘Together for Reliable Information’

In collaboration with Free Press Unlimited, Digital Rights Foundation participated in a campaign to highlight the work done by DRF and its Network of Women Journalists for Digital Rights to bring accurate and reliable information to the public amid the coronavirus pandemic. The campaign included a series of videos from DRF’s team on our role in bringing reliable information on digital rights to our beneficiaries and the general public, and the launch of a toolkit for journalists and content creators on how to keep themselves safe considering the new set of risks and threats posed in online spaces. It also brought to the public a series of visual stories from the journalistic front line of Covid-19.

DRF internship program

DRF is proud to launch our internship program for this year and to introduce the brilliant interns from previous batches who worked tirelessly to help us in what we do. To know more about our work and the different projects our interns worked on, click the link below:

https://digitalrightsfoundation.pk/internship-program/

2: Policy Initiatives

Cyber Harassment Helpline Report 2019

DRF released the Cyber Harassment Helpline’s annual report for 2019 on 24th June. The report highlights complaint trends observed in the previous year as well as policy recommendations for all stakeholders. Concerning trends included an increase in attacks on mobile wallets/e-wallets like EasyPaisa and phishing attacks in which people were targeted through WhatsApp or text messaging. The Helpline saw a total of 2,023 cases reported in 2019, an average of 146 calls per month. When compared to the overall complaints the Helpline has received over three years of operation, the complaints from 2019 alone account for 45% of the total, an alarming increase over time and a disturbing upward trend in cyber-harassment cases.
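
As a rough back-of-the-envelope check of these figures (a sketch assuming the reported 45% share is exact), the implied three-year complaint total would be:

\[
\text{three-year total} \approx \frac{2023}{0.45} \approx 4496 \text{ complaints}
\]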

Link to report:

https://digitalrightsfoundation.pk/wp-content/uploads/2020/06/DRF-Helpline-Report-2019-3.pdf

Forum on Information and Democracy working group to combat infodemic and information chaos

The Forum on Information and Democracy launched its first working group to combat the infodemic and information chaos and to develop a policy framework to tackle this crisis. Our ED Nighat Dad is part of this forum, whose mission is to assist with the regulation and self-regulation of the online information and communication domain and to implement the goals of the International Partnership for Information and Democracy, which was launched during the UN General Assembly in September 2019 and has now been signed by 37 countries.

Read about the working group here:

https://informationdemocracy.org/2020/06/24/forum-launches-working-group-to-combat-infodemic-information-chaos/

Blog on Internet shutdown in Quetta

DRF observed with great concern the internet shutdown in the provincial capital of Quetta last month, which affected the lives of Baloch residents. The effects of cutting off an essential resource extend not only to the work of journalists covering the news in the midst of a pandemic, but also to the safety of all those who have to venture out to work with limited contact with their families, and to the access to information of all citizens in the affected area.

Link to the blog post:

https://digitalrightsfoundation.pk/quetta-internet-shutdown/

Blog on countries wanting to regulate and control VPNs and their access

DRF’s intern Areeba Jibril pens her thoughts on countries regulating and controlling VPNs and how that can be a direct threat to individuals’ privacy, free speech, and elections. She tweets at @AreebaJibril.

Link to blog:

https://digitalrightsfoundation.pk/virtual-private-networks-no-longer-private-as-pta-requires-registration/

3: Media Engagement

189% increase in cyber-harassment cases during COVID-19 (Policy brief)

The Cyber Harassment Helpline released a policy brief on 3rd June delineating the number of cases and complaint trends observed during the lockdown, which saw an increase in the number of complaints. The helpline received a combined 136 complaints in March and April, during the lockdown, compared to 47 complaints in January and February before it: an increase of 189 percent. The majority of cases received at the helpline during the lockdown months pertained to blackmail through the non-consensual sharing of information, intimate pictures, and videos. Complaints of hate speech, phishing, fake profiles, and defamation were also reported. These complaints were received only through online mediums (email and social media platforms) as the helpline’s toll-free number was inactive. The policy brief also suggested measures for the government to tackle the increase in cyber harassment.
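
For reference, the 189 percent figure follows from the standard percentage-increase calculation applied to the reported complaint counts:

\[
\frac{136 - 47}{47} \times 100 = \frac{89}{47} \times 100 \approx 189\%
\]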

Link to policy brief:

https://digitalrightsfoundation.pk/wp-content/uploads/2020/06/Covid-19.pdf

Press coverage of brief:

https://www.dawn.com/news/1561065

https://ibcurdu.com/news/111316/

DRF on morning show ‘Aaj Pakistan with Sidra Iqbal’

Nighat Dad spoke on the morning show ‘Aaj Pakistan with Sidra Iqbal’, highlighting how social media and our personal lives are interlinked. She also highlighted how everyone is vulnerable on social media and how important it is to be aware of what you are sharing online.

Link to show:

https://www.youtube.com/watch?v=NtKHNL7HkVU

Inside Pakistan’s COVID-19 Contact Tracing App

DRF spoke to The Diplomat about Pakistan’s COVID-19 contact tracing app and the boundaries the state needs to establish when introducing such an app. It was highlighted that the privacy of users should also be considered when the app is introduced.

Read the full article here:

https://thediplomat.com/2020/07/inside-pakistans-covid-19-contact-tracing/

4: Events and Sessions

Human Rights in the Digital era: Business respect for civil and political rights in times of emergency

Nighat Dad participated in the United Nations Virtual Forum on Responsible Business and Human Rights 2020. She spoke on a panel on ‘Human Rights in the Digital Era: Business Respect for Civil and Political Rights in Times of Emergency’, where she particularly highlighted the human rights implications of how businesses have responded during COVID-19.

Link to details of the session:

https://rbhr2020.heysummit.com/talks/human-rights-in-the-digital-era-business-respect-for-civil-and-political-rights-in-times-of-emergency/#

APAC Insight 2- True Lies: Misinformation, Censorship and the Open Internet

DRF’s ED Nighat Dad spoke in the webcast ‘True Lies: Misinformation, Censorship and the Open Internet’ hosted by Internet Society Asia Pacific (ISOC) on 30th June. Nighat emphasized the recent disinformation campaigns around Pakistan and how they pose a direct threat to activists in the country.

Link to webcast:

https://isoc.live/12355/

DRF in webinar on How to reduce the digital gender divide in post-pandemic Pakistan

On 16th June, DRF in collaboration with the Swedish embassy held a webinar on ‘How to reduce the digital gender divide in post-pandemic Pakistan’. The webinar started with a keynote address from the Swedish ambassador, Ingrid Johansson, followed by presentations from Camila Wagner and Nighat Dad. Camila Wagner is an award-winning journalist specializing in societal challenges who presently runs Klara K, a consultancy in public affairs, crisis management and leadership. Camila explained the digital gender divide across the globe and in Sweden, while DRF’s Nighat Dad drew stark comparisons with Pakistan, highlighting how much still needs to be done in the country.

Online violence against women during COVID-19 at #WOWGlobal24 - British Council

Nighat Dad spoke about online violence against women during COVID-19 at #WOWGlobal24 by the British Council. She shared statistics from the helpline, which saw a 180% increase in online violence complaints in March, when the lockdown was imposed, as opposed to January and February of 2020.

Click here to read about what she said:

https://www.dawn.com/news/1565792/wow-festival-begins-online

Roundtable on cybercrime and harassment

Nighat Dad spoke at the roundtable on ‘Cyber Harassment Under the Shadow of Corona: Incidence, Control and Punishment’ hosted by the National Initiative against Organized Crime Pakistan. She highlighted how cyber harassment has increased during these times and how the Cyber Harassment Helpline is a resource for people facing harassment online.

Emerging Stories: Journalism in times of isolation (FPU)

DRF’s ED participated in Free Press Unlimited’s Emerging Stories series focusing on journalism in times of isolation. She highlighted the work DRF has done with journalists and the challenges journalists on the ground are facing due to the pandemic.

Link to the panel:

https://dezwijger.nl/programma/journalism-in-times-of-isolation-10

P@SHA’s live discussion on the Personal Data Protection Bill 2020

DRF participated in Candid with P@SHA, a live discussion on ‘The Personal Data Protection Bill 2020 - How will it impact individuals and corporates?’ held on 8th June. DRF’s ED highlighted the discrepancies in the bill and the further work that needs to be done around it.

Link to discussion:

Candid with P@SHA - Ep-4 - The Personal Data Protection Bill: https://www.pasha.org.pk/events/candid-with-pasha-ep-4/ (Facebook event: https://www.facebook.com/events/899226477246360/)

Increased cyber harassment during COVID-19 on FM 101

DRF spoke on FM 101 about how harassment has increased during COVID-19 and shared startling figures from the helpline on the subject.

DRF at the public hearing at the Bundestag highlighting ‘Human Rights and Freedom of Association in the Digital Age’

DRF’s ED spoke at a public hearing of the German Bundestag focusing on human rights and freedom of association in the digital age. She highlighted how lawmakers need to be more thoughtful and aware of the interventions they propose and of how these can have international repercussions for all. She also highlighted the responsibilities of the Facebook Oversight Board and how women and other marginalized groups are vulnerable on the internet.

Aahung panel: Impact of Online Harassment on Pakistani Youth

DRF participated in the webinar on the ways in which young people use online spaces and the steps we can take to make them safer.

A recording of the panel can be found here: https://www.facebook.com/watch/live/?v=321965452134965&ref=watch_permalink

UN Women panel: Cyber Security & Data Privacy during COVID-19

This webinar focused on the challenges women face regarding online violence and the remedies available to them in Pakistan. It was a vibrant discussion about the shortcomings of current reporting mechanisms and the steps that can be taken to improve them.

Session on online safety with Dastak Organization

DRF conducted an online session with the team at Dastak Charitable Organization, which runs a refuge for women who have faced physical violence and also operates a crisis helpline for them. The session, held on the 29th of June, 2020, proved to be informative and interactive and generated positive feedback.

5: COVID-19 Updates

24/7 Cyber Harassment Helpline

On 13th June, given the rise in cyber harassment cases following the COVID-19 lockdown, DRF made its Cyber Harassment Helpline operational 24/7 for three months. The helpline is offering its services free of cost to anyone who calls in or reaches out to us via social media or email; these include legal aid, the digital help desk, and mental health counseling. During these three months, our toll-free number will be accessible every day of the week from 9 AM to 5 PM, and our mental health counselors will work from 10 AM to 9 PM each day. Our mental health counselors are trained professionals and are providing free of cost counseling to women and those from marginalized communities. During all other hours of the day, our team attends to complaints and queries through online platforms.

Contact the helpline on 0800-39393 or email us at [email protected]. You can also reach out to us on our social media channels.

Ab Aur Nahin

In times of COVID-19, domestic abuse is at an all-time high, and victims often have nowhere to go. Ab Aur Nahin is a confidential legal and counselor support service specifically designed for victims of harassment.

www.abaurnahin.pk

IWF portal

DRF, in collaboration with the Internet Watch Foundation (IWF) and the Global Fund to End Violence Against Children, launched a portal to promote children’s online safety in Pakistan. The new portal allows internet users in Pakistan to anonymously report child sexual abuse material in three different languages: English, Urdu, and Pashto. The reports are then assessed by trained IWF analysts in the UK.

https://report.iwf.org.uk/pk

July 1, 2020

Comments on the Consultation & Objections to the Rules

Soon after the Citizens Protection (Against Online Harm) Rules, 2020 (the ‘Rules’) were notified in January 2020, Digital Rights Foundation (DRF) issued a statement expressing its concerns regarding the Rules. It was submitted, and it is reiterated, that the Rules restrict free speech and threaten the privacy of Pakistani citizens. Subsequently, we formulated our Legal Analysis, highlighting therein the jurisdictional and substantive issues with the Rules in light of constitutional principles and precedent as well as larger policy questions.

When the Ministry of Information Technology & Telecommunication (MoITT) announced the formation of a committee to consult on the Rules, DRF, along with other advocacy groups, decided to boycott the consultation until and unless the Rules were de-notified. In a statement signed by over 100 organisations and individuals in Pakistan and released on Feb 29, 2020, the consultation process was termed a ‘smokescreen’. Maintaining our position on the consultation, through these Comments, we wish to elaborate on why we endorse(d) this stance, what we believe is wrong with the consultation and how these wrongs can be fixed. This will be followed by our objections to the Rules themselves.

Comments on the Consultations:

At the outset, we at DRF would like to affirm our conviction to strive for a free and safe internet for all, especially women and minorities. We understand that in light of a fast-growing digital economy, rapidly expanding social media and a continuous increase in the number of internet users in Pakistan, ensuring a safe online environment is of much interest. While online spaces should be made safe, internet regulation should not come at the expense of fundamental rights guaranteed by the Constitution of Pakistan, 1973 and Pakistan’s international human rights commitments. A balance, therefore, has to be struck between fundamental rights and the limitations imposed on the exercise of those rights. The only way, we believe, to achieve this balance is through meaningful consultations conducted with stakeholders and civil society in good faith.

The drafting process of the Rules, however, has been exclusionary and secretive from the start. It began with a complete lack of public engagement: the Rules only became public knowledge after their notification. Given the serious and wide-ranging implications of the Rules, caution on the Government’s part and sustained collaboration with civil society were needed. Instead, the unexpected and sudden notification of the Rules caused alarm to nearly all stakeholders, including industry actors who issued a sharp rebuke of the Rules. It is respectfully submitted that such practices do not resonate with the principles of ‘good faith.’

Almost immediately after the notification, the Rules drew sharp criticism locally and internationally. As a result, the Ministry of Information Technology & Telecommunication (MoITT) announced the formation of a committee to begin consultation on the Citizens Protection (Against Online Harm) Rules 2020. However, a consultation at the tail-end of the drafting process will do little good. Public participation before and during the official law-making process is far more significant; any end-phase activity will be an eye-wash rather than a meaningful process.

We are concerned with not only the content of the regulations but also with how that content is to be agreed upon. The consultations, which are a reaction to the public outcry, fail to address either of the two. Experience shows that when people perceive a consultation to be insincere, they lose trust not only in the process but in the resulting regulations as well. Therefore, we urge the Federal Government to withdraw the existing Rules (a mere suspension of implementation of Rules is insufficient) through the same process by which they were notified. Any future Rules need to be co-created with meaningful participation of civil society from the start. Without a withdrawal of the Rules, the ‘reactionary’ consultations would be seen as a manipulation of the process to deflect criticism and not a genuine exercise to seek input. Without a withdrawal, it is unlikely the Rules would gain sufficient popularity or legitimacy. Once the necessary steps for withdrawal of notification have been taken, we request the government to issue an official statement mentioning therein that the legal status of the Rules is nothing more than a ‘proposed draft.’ This would mean that anything and everything in the Rules is open for discussion. This would not only demonstrate ‘good faith’ on the government’s part but also show its respect for freedom and democracy.

Even otherwise, it should be noted that the present consultation falls short of its desired purpose inasmuch as it seeks input with respect to the Rules only. The Preamble of the ‘Consultation Framework’ posted on PTA’s website lays down the purpose of the consultation as follows: “in order to protect the citizens of Pakistan from the adverse effects of online unlawful content…” It is submitted that making the internet ‘safe’ and ‘protecting’ citizens would require more than regulations alone. The government should initiate a series of studies to ascertain other methods of effectively tackling online harms. Self-regulatory mechanisms for social media companies, educating users on internet safety and protective tools, and capacity-building of law-enforcement agencies to deal with cyber-crimes are some of the options that must be explored if the objective is to protect citizens from unlawful online content. These steps become all the more significant because online threats and harmful content continue to evolve. Additionally, such measures will reduce the burden on regulators and provide a low-cost remedy to users. It is reiterated that effectively addressing this daunting task requires a joint effort between the Government, civil society, law enforcement agencies and all social media companies.

To that end, a participatory, transparent and inclusive consultative process is needed. While this will help secure greater legitimacy and support for the Rules, at the same time, it can transform the relationship between citizens and their government. First and foremost, the presence of all stakeholders should be ensured. The Government should, in particular, identify those groups and communities that are most at risk and secure their representation in the consultation(s). If transparency and inclusiveness require the presence of all key stakeholders at the table, consensus demands that all voices be heard and considered. Therefore, we request that there should be mechanisms to ensure that the views and concerns of stakeholders are being seriously considered and that compromises are not necessarily based on majority positions.

Given the wide-ranging and serious implications of the Rules, it is necessary that the citizens’ groups and the society at large be kept informed at all stages of drafting these regulations. The drafters should also explain to the people why they have produced the text they have: what was the need for it and what factors they considered, what compromises they struck with the negotiators and why, and how they reached agreements on the final text.

Finally, input from civil society and stakeholders should be sought to define the words and phrases of the Rules that create penal offences. Many of the definitions and terms in the Rules (which will be discussed shortly) are either loosely defined or lack clarity. It is suggested that discussions be held to determine clear meanings of such terms.

Objections to the Rules:
1: The Rules exceed the scope of their Parent Acts:

The Rules have been notified under provisions of the Pakistan Telecommunication (Re-organisation) Act, 1996 (PTRA) and the Prevention of Electronic Crimes Act (PECA) 2016 (hereinafter collectively referred to as the ‘Parent Acts’). The feedback form on the PTA website notes that the Rules are formulated “exercising statutory powers under PECA section 37 sub-section 2.” Under the Rules, the Pakistan Telecommunication Authority (PTA) is the designated Authority.

It is submitted that the scope and scale of action defined in the Rules go beyond the mandate given under the Parent Acts. The government is reminded that rules cannot impose or create new restrictions that are beyond the scope of the Parent Act(s).

It is observed that Rule 3 establishes the office of a National Coordinator and consolidates (through delegation) the powers granted to the Authority under PECA and PTRA to control online regulation. While we understand that Section 9 of PTRA allows delegation of powers, no concept of delegation of powers exists under PECA. Therefore, to pass on the powers contained in PECA to any third body is a violation of the said Act. Even under PTRA, powers can only be delegated to the chairman/chairperson, member or any other officer of PTA (re: Section 9), in which case, the autonomy and independence of the National Coordinator would remain questionable.

Without conceding that powers under PECA can be delegated to the National Coordinator, it is still submitted that Rule 7 goes beyond the scope of PECA (and also violates Article 19 of the Constitution, which will be discussed later). Section 37(1) of PECA grants power to the Authority to “remove or block or issue directions for removal or blocking of access to an information through any information system....” While Section 37(1) confers limited powers to only remove or block access to an information, the power to remove or block an entire information system or social media platform (conferred upon the National Coordinator by virtue of Rule 7) is a clear case of excessive delegation.

Further, the Rules require social media companies to deploy proactive mechanisms to ensure the prevention of live streaming of fake news (Rule 4) and to remove, suspend or disable accounts or online content that spread fake news (Rule 5). Rule 5(f) also obligates a social media company, “if communicated by the Authority that online content is false, [to] put a note to that effect along with the online content”. It is submitted that the powers to legislate on topics like fake news and misinformation are not granted under the Parent Acts. It is also unknown where the wide powers granted to the National Coordinator under Rule 3(2), to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies, and summon official representatives of social media companies, stem from. These Rules are, therefore, ultra vires the provisions of the Parent Acts.

Recommendations:
  • Remove the body of the National Coordinator and the powers it wrongly assumes through PECA (re: the power to block online systems (Rule 7)).
  • If any such power is to be conferred, then limit that power to the removal or suspension of an information on any information system, as opposed to the power to block the entire online system.
  • Establish criteria for the selection and accountability of the National Coordinator.
  • Ensure the autonomy and independence of the National Coordinator.
  • Introduce mechanisms, such as a public database or directory, to ensure transparency from any authority tasked with the regulation and removal of content, covering the content removed and the reasons for such removal.
  • Omit provisions regulating ‘fake news’ as they go beyond the scope of the Parent Acts.
  • Omit Rule 5(f), i.e. the obligation to issue fake news corrections, as it goes beyond the scope of the Parent Acts. Alternatively, social media companies should be urged to present ‘fact checks’ of any false information.
  • Exclude from the powers of the National Coordinator the ability to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies, and summon official representatives of social media companies, contained in Rule 3(2), as these go beyond the scope of the Parent Acts.
2: Arbitrary Powers:

It is observed that the Rules have granted arbitrary and discretionary powers to the National Coordinator and, at the same time, have failed to provide any mechanisms against the misuse of these powers.

Rule 4 obligates a social media company to remove, suspend or disable access to any online content within twenty-four hours, and in emergency situations within six hours, of being intimated by the Authority that the particular online content is in contravention of any provision of the Act, or any other law, rule, regulation or instruction of the National Coordinator. An ‘emergency situation’ is also to be exclusively determined by this office. On the interpretation or permissibility of any online content, as per Rule 4(2), the opinion of the National Coordinator is to take precedence over any community standards and rules or guidelines devised by the social media company.

It is submitted that this grants unprecedented censorship powers to a newly appointed National Coordinator, which has the sole discretion to determine what constitutes ‘objectionable’ content. These are extremely vague and arbitrary powers, and the Rules fail to provide any checks and balances to ensure that such requests will be used in a just manner. It is trite law that a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. However, Rule 4 encourages arbitrary and random acts and bestows upon the National Coordinator unfettered discretion to regulate online content instead of even remotely attempting to provide any safeguards against abuse of power. Moreover, the power granted under Rule 5 to the National Coordinator to monitor the falsehood of any online content adds to the National Coordinator’s unfettered powers. It is concerning that while the National Coordinator has been granted extensive powers, including quasi-judicial and legislative powers to determine what constitutes online harm, the qualifications, accountability, and selection procedure of the National Coordinator remain unclear. This will have a chilling effect on the content removal process, as social media companies will rush content regulation decisions to comply with the restrictive time limit, including in particularly complicated cases of free speech that require deliberation and legal opinion. Furthermore, smaller social media companies, which do not have the resources and automated regulation capacities that big tech companies such as Facebook or Google possess, will be disproportionately burdened by urgent content removal instructions.

Further, Rule 6 requires a social media company to provide to the Investigation Agency any information, data, content or sub-content contained in any information system owned, managed or run by the respective social media company. It is unclear whether the Investigation Agency is required to go through any legal or judicial procedure to make such a request, and whether it is required to notify or report to a court on the seizure of any such information. Under the current PECA regulations, there is still a legal process through which the information or data of private users can be requested. This Rule, however, totally negates the current process and gives the National Coordinator sweeping powers to monitor online traffic. The power under Rule 6 exceeds the ambit of section 37 and runs parallel to the data request procedures established with social media companies.

Recommendations:
  • Re-consider the 24-hour time limit for content removal. It would be unreasonable to impose such a strict timeline, especially for content that relates to private wrongs/disputes such as defamation and complicated cases of free speech.
  • Insert a “Stop the Clock” provision by listing out a set of criteria (such as seeking clarifications, technical infeasibility, etc.) under which the time limit would cease to apply, to allow for due process and fair play in enforcing such requests.
  • Formulate clear and predetermined rules and procedures for investigations, seizures, and the collection and sharing of data.
  • Rule 4 should be amended so that the Authority tasked with removal requests is required to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarification.
  • The National Coordinator should not be the sole authority to determine what constitutes ‘objectionable’ online content; neither can this be left open for the National Coordinator to decide from time to time through its ‘instructions’.
  • Remove the powers to request, obtain and provide data to the Investigation Agency.
3: Vague Definitions:

It is an established law that “the language of the statute, and, in particular, a statute creating an offence, must be precise, definite and sufficiently objective so as to guard against an arbitrary and capricious action on part of the state functionaries.” Precise definitions are also important so that social media companies may regulate their conduct accordingly.

A fundamental flaw of these Rules is their vague, overly broad and extremely subjective definitions. For example, extremism (Rule 2(d)) is defined as ‘violent, vocal or active opposition to fundamental values of the state of Pakistan including...’ It does not, however, define what constitutes or can be referred to as the fundamental values of the state of Pakistan. Given the massive volume of content shared online, platforms may feel obliged to take a ‘better safe than sorry’ approach, which in this case would mean ‘take down first, ask questions later (or never).’ This threatens not only to impede the legitimate operation of (and innovation in) services, but also to incentivize the removal of legitimate content. Moreover, honest criticism or fair comment regarding the Federal Government, or any other state institution, runs the risk of being seen as ‘opposition,’ as this word also lacks clarity.

Similarly, while social media companies are required to ‘take due cognizance of the religious, cultural, ethnic and national security sensitivities of Pakistan’ (Rule 4(3)), the Rules fail to elaborate on these terms. Further, ‘fake news’ (Rule 4(4) & Rule 5(e)) has not been defined, which adds to the ambiguity of the Rules. It is submitted that vague laws weaken the rule of law because they enable selective prosecution and interpretation, and arbitrary decision-making.

Rule 4(4) obligates a social media company to deploy proactive mechanisms to ensure the prevention of live streaming of any content relating to, amongst other things, ‘hate speech’ and ‘defamation.’ It should be noted that ‘hate speech’ and ‘defamation’ are both defined and treated as offences under PECA as well as the Pakistan Penal Code, 1860 (‘PPC’). It is also posited that the determination of both these offences requires a thorough investigation and a trial under both of these laws. It is submitted that if a trial and investigation are necessary to determine these offences, then it would be nearly impossible for social media companies to ‘proactively’ prevent their live streaming. Additionally, social media companies already take down such material based on their community guidelines, which cover areas such as public safety, hate speech and terrorist content. For instance, during the Christchurch terrorist attack, while Facebook was unable to take down the livestream as it was happening, AI technology and community guidelines were used to remove all instances of the video from the platform within hours of the incident. However, the Rules impose an unnecessary burden on social media companies, and if any content is hastily removed as being hateful or defamatory, without a proper determination or investigation, then such removal would not only implicate the person who produced or transmitted the content (given these are penal offences under PECA and the PPC) but also condemn them unheard. Even otherwise, hate speech and defamation are entirely contextual determinations, where the illegality of material is dependent on its impact. Impact on viewers is impossible for an automated system to assess, particularly before or while the material is being shared.

It is also noted that Rule 4(4) is in conflict with Section 38(5) of PECA, which expressly rejects imposition of any obligation on intermediaries or service providers to proactively monitor or filter material or content hosted, transmitted or made available on their platforms.

Recommendations:
  • It is suggested that discussions be held amongst all stakeholders to determine clear and precise meanings of the following terms:
    • Extremism
    • Fundamental Values of the State of Pakistan
    • Religious, cultural and ethnic sensitivities of Pakistan
    • National Security
    • Fake News
  • Use alternative methods of countering hateful and extremist speech, such as investment in independent fact-checking bodies and funds for organisations developing counter-speech to online attacks against women and gender, religious and ethnic minorities.
  • Formulate clear components of ‘active or vocal opposition’ to ensure it cannot be used to silence dissenting opinions.
  • Omit Rule 4(4) as it violates Section 38(5) of PECA.
  • Content constituting ‘hate speech’ and ‘defamation’ should not be removed without a proper investigation.
4: Violation and Unreasonable Restriction of Fundamental Rights:

The Rules, as they stand, pose a serious danger to fundamental rights in the country. In particular, the breadth of the Rules’ restrictions, and the intrusive requirements that they place on social media platforms, would severely threaten online freedom of expression, the right to privacy and the right to information.

It is submitted that Rule 4 is a blatant violation of Article 19 (freedom of speech, etc.) of the Constitution. It exceeds the boundaries of permissible restrictions within the meaning of Article 19, lacks the necessary attributes of reasonableness and is extremely vague in nature. Article 19 states that restrictions on freedom of expression must be “reasonable” under the circumstances, and must be in aid of one of the legitimate state interests stated therein (“in the interests of the glory of Islam, integrity, security, or defence of Pakistan…”). The Rules, however, require all social media companies to remove or block online content if it is, among other things, in “contravention of instructions of the National Coordinator.” It is to be noted that deletion of data on the instructions of the National Coordinator does not fall under the permissible restrictions of Article 19, as it is an arbitrary criterion for the restriction of fundamental rights. Furthermore, a restriction on freedom of speech may only be placed in accordance with ‘law’, and an instruction passed by the National Coordinator does not qualify as law within the meaning of Article 19.

It must also be noted that Rule 7 (Blocking of Online System) is a violation of Article 19 of the Constitution, which only provides the power to impose reasonable ‘restrictions’ on free speech in accordance with law. It is submitted that in today’s digital world, online systems allow individuals to obtain information, form, express and exchange ideas, and are mediums through which people express their speech. Hence, entirely blocking an online system would be tantamount to blocking speech itself. The power to ‘block’ cannot be read under, inferred from, or assumed to be a part of the power to ‘restrict’ free speech. It was held in the Civil Aviation Authority Case that “the predominant meanings of the said words (restrict and restriction) do not admit total prohibition. They connote the imposition of limitations of the bounds within which one can act...” Therefore, while Article 19 allows the imposition of ‘restrictions’ on free speech, the power to ‘block’ an information system entirely exceeds the boundaries of permissible limitations under it and is a disproportionate method of achieving the goal of removing harmful content from the internet, rendering Rule 7 inconsistent with the Constitution as well (as discussed previously, Rule 7 also goes beyond the scope of Section 37(1) of PECA).

As has already been discussed above, a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. Rule 4 violates this principle by encouraging arbitrary and random acts and bestowing upon the National Coordinator unfettered discretion to regulate online content without providing any safeguards against abuse of power. The Rules do not formulate sufficient safeguards to ensure that the power extended to the National Coordinator would be exercised in a fair, just, and transparent manner. The power to declare any online content ‘harmful’ and to search and seize data, without any mechanism for questioning the authority, threatens the privacy and free speech of both companies and the people.

The fact that the government has asked social media companies to provide any and all kinds of user information or data in a ‘decrypted, readable and comprehensible format’, including private data shared through messaging applications like WhatsApp (Rule 6), and that too without defining any mechanisms for gaining access to the data of anyone being investigated, shows that it is concerned neither with due procedure of law nor with the potential violations of citizens’ right to privacy.

Finally, Rule 5 obligates social media companies to put a note along with any online content that is considered or interpreted to be ‘false’ by the National Coordinator. Not only does this provision add to the unfettered powers of the National Coordinator, to be exercised arbitrarily, but it also makes the Coordinator in charge of policing truth. This violates the principle of freely forming an ‘opinion’ (a right read under Article 19), as the National Coordinator now decides, or dictates, what is true and what is false.

Recommendations:
  • Amend Rule 4 and exclude from it the words “the instructions of the National Coordinator” as the same violates Article 19 of the Constitution.
  • Omit Rule 7 as it violates Article 19 and does not fall under the ‘reasonable restrictions’ allowed under the Constitution.
  • Formulate rules and procedures for investigations, seizures and collection of data which are in line with due process safeguards.
  • Rule 4 should be amended to require the regulatory body to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarifications.
  • The authority tasked with content removal should not be the sole authority to determine what constitutes ‘objectionable’ online content; neither should it be left open for the authority to decide from time to time through its ‘instructions’.
5: Data Localisation:

Rule 5 obligates social media companies to register with the PTA within three months of these Rules coming into force. It requires a social media company to establish a permanent registered office in Pakistan, with a physical address located in Islamabad, and to appoint a focal person based in Pakistan for coordination with the National Coordinator.

It is submitted that the requirement of registering with the PTA and establishing a permanent registered office in Pakistan, before these companies can be granted permission to be viewed and/or provide services in Pakistan, is a move towards “data localisation” and challenges the borderless nature of the internet, a feature that is intrinsic to the internet itself. Forcing businesses to create a local presence is outside the norms of global business practice and can potentially force international social media companies to exit the country rather than invest further in Pakistan. It is unreasonable to expect social media companies to set up infrastructure in the country when the nature of the internet allows for services to be easily administered remotely. With the increase in compliance costs that comes with the incorporation of a company in Pakistan, companies across the globe, including start-ups, may have to reconsider serving users in Pakistan. Consequently, users in Pakistan, including the local private sector, may not be able to avail a variety of services required for carrying out day-to-day communication, online transactions, and trade/business-related tasks. Many businesses and organisations across Pakistan rely on the services provided by social media companies, particularly during the Covid-19 pandemic when reliance on the internet has increased substantially; restricting these services will thus have an indirect impact on the economy as well. The proposed Rules requiring local incorporation and physical offices will also have huge repercussions for taxation, foreign direct investment and other legal matters, along with negatively impacting economic growth.

To effectively defend against cybercrimes and threats, companies protect user data and other critical information via a very small network of highly secure regional and global data centers staffed with uniquely skilled experts who are in scarce supply globally. These centers are equipped with advanced IT infrastructure that provides reliable and secure round-the-clock service. The clustering of highly-qualified staff and advanced equipment is a critical factor in the ability of institutions to safeguard data from increasingly sophisticated cyber-attacks.

Mandating the creation of a local data center will harm cybersecurity in Pakistan by:

  • Creating additional entry points into IT systems for cyber criminals.
  • Reducing the quality of cybersecurity in all facilities around the world by spreading cybersecurity resources (both people and systems) too thin.
  • Forcing companies to disconnect systems and/or reduce services.
  • Fragmenting the internet and impeding global coordination of cyber defense activities, which can only be achieved efficiently and at scale when and where the free flow of data is guaranteed.

Preventing the free flow of data:

  • Creates artificial barriers to information-sharing and hinders global communication;
  • Makes connectivity less affordable for people and businesses at a time when reducing connectivity costs is essential to expanding economic opportunity in Pakistan, boosting the digital economy and creating additional wealth;
  • Undermines the viability and dependability of cloud-based services in a range of business sectors that are essential for a modern digital economy; and
  • Slows GDP growth, stifles innovation, and lowers the quality of services available to domestic consumers and businesses.

The global nature of the Internet has democratized information, which is available to anyone, anywhere around the world, in an infinite variety of forms. The economies of scale achieved through globally located infrastructure have contributed to the affordability of services on the Internet, where several prominent services are available for free. Companies are able to provide these services to users even in markets that may not be financially sustainable because they do not have to incur the additional cost of setting up and running local offices and legal entities in each country where they offer services. These Rules will therefore harm the consumer experience on the open internet, increasing costs to the extent that offering services/technologies to consumers in Pakistan becomes financially unviable.

Recommendations:
  1. Scrap Rule 5 and abandon the model of data localisation as it discourages business and weakens data security of servers;
  2. Develop transparent and legally-compliant data request and content removal mechanisms with social media companies as an alternative to the model proposed in Rule 5.
Concluding Remarks:

We have discussed that the current consultations lack the essentials of ‘good faith’, which demands a reexamination of the entire framework. We have also discussed that the Rules exceed the scope of the Parent Acts, accord arbitrary powers to the National Coordinator, use vague definitions and unreasonably restrict fundamental rights, which makes them liable to be struck down. In light of the above, we call upon the government to immediately withdraw the Rules and initiate the consultation process from scratch. The renewed consultation should be premised on tackling ‘online harm’ rather than on a discussion of the Rules alone. Consensus should be reached on the best ways to tackle online harms. This would require comprehensive planning, transparent and meaningful consultations with stakeholders and the participation of civil society. Until this is done, Digital Rights Foundation will disassociate itself from any government initiatives that are seen as disingenuous efforts to deflect criticism.