July 1, 2020

Comments on the Consultation & Objections to the Rules

Soon after the Citizens Protection (Against Online Harm) Rules, 2020 (the ‘Rules’) were notified in January 2020, Digital Rights Foundation (DRF) issued a statement expressing concerns regarding the Rules. It was submitted, and it is reiterated, that the Rules restrict free speech and threaten the privacy of Pakistani citizens. Subsequently, we formulated our Legal Analysis highlighting the jurisdictional and substantive issues with the Rules in light of constitutional principles and precedent, as well as larger policy questions.

When the Ministry of Information Technology & Telecommunication (MoITT) announced the formation of a committee to consult on the Rules, DRF, along with other advocacy groups, decided to boycott the consultation until and unless the Rules were de-notified. In a statement signed by over 100 organisations and individuals in Pakistan and released on Feb 29, 2020, the consultation process was termed a ‘smokescreen’. Maintaining our position on the consultation, through these Comments, we wish to elaborate on why we endorse(d) this stance, what we believe is wrong with the consultation and how these wrongs can be fixed. This will be followed by our objections to the Rules themselves.

Comments on the Consultations:

At the outset, we at DRF would like to affirm our conviction to strive for a free and safe internet for all, especially women and minorities. We understand that in light of a fast-growing digital economy, rapidly expanding social media and a continuous increase in the number of internet users in Pakistan, ensuring a safe online environment is a matter of considerable interest. While online spaces should be made safe, internet regulations should not come at the expense of fundamental rights guaranteed by the Constitution of Pakistan, 1973 and Pakistan’s international human rights commitments. A balance, therefore, has to be struck between fundamental rights and the limitations imposed on the exercise of those rights. The only way, we believe, to achieve this balance is through meaningful consultations conducted with stakeholders and civil society in good faith.

The drafting process of the Rules, however, has been exclusionary and secretive from the start. It began with a complete lack of public engagement when the Rules were notified in February 2020, so much so that the Rules only became public knowledge after their notification. Given the serious and wide-ranging implications of the Rules, caution on the Government’s part and sustained collaboration with civil society were needed. Instead, the unexpected and sudden notification of the Rules alarmed nearly all stakeholders, including industry actors who issued a sharp rebuke of the Rules. It is respectfully submitted that such practices do not resonate with the principles of ‘good faith.’

Almost immediately after the notification, the Rules drew sharp criticism locally and internationally. As a result, the MoITT announced the formation of a committee to begin consultation on the Citizens Protection (Against Online Harm) Rules, 2020. However, a consultation at the tail-end of the drafting process will do little good. Public participation before and during the official law-making process is far more significant than any end-phase activity, which risks being an eye-wash rather than a meaningful process.

We are concerned not only with the content of the regulations but also with how that content is to be agreed upon. The consultations, which are a reaction to the public outcry, fail to address either. Experience shows that when people perceive a consultation to be insincere, they lose trust not only in the process but in the resulting regulations as well. Therefore, we urge the Federal Government to withdraw the existing Rules (a mere suspension of their implementation is insufficient) through the same process by which they were notified. Any future Rules need to be co-created with meaningful participation of civil society from the start. Without a withdrawal of the Rules, the ‘reactionary’ consultations would be seen as a manipulation of the process to deflect criticism and not a genuine exercise to seek input. Without a withdrawal, it is unlikely the Rules would gain sufficient public acceptance or legitimacy. Once the necessary steps for withdrawal of the notification have been taken, we request the government to issue an official statement clarifying that the legal status of the Rules is nothing more than a ‘proposed draft.’ This would mean that anything and everything in the Rules is open for discussion. This would not only demonstrate ‘good faith’ on the government’s part but also show its respect for freedom and democracy.

Even otherwise, it should be noted that the present consultation falls short of its desired purpose inasmuch as it seeks input with respect to the Rules only. The Preamble of the ‘Consultation Framework’ posted on PTA’s website lays down the purpose of the consultation as follows: “in order to protect the citizens of Pakistan from the adverse effects of online unlawful content…” It is submitted that making the internet ‘safe’ and ‘protecting’ citizens would require more than regulations alone. The government should initiate a series of studies to ascertain other methods to effectively tackle online harms. Self-regulatory mechanisms for social media companies, educating users on internet safety and protective tools, and capacity-building of law-enforcement agencies to deal with cyber-crimes are some of the options that must be explored if the objective is to protect citizens from online unlawful content. These steps become all the more significant because online threats and harmful content continue to evolve. Additionally, such measures will reduce the burden on the regulators and provide a low-cost remedy to users. It is reiterated that to effectively address this daunting task, a joint effort between the Government, civil society, law enforcement agencies and all social media companies is required.

To that end, a participatory, transparent and inclusive consultative process is needed. While this will help secure greater legitimacy and support for the Rules, at the same time, it can transform the relationship between citizens and their government. First and foremost, the presence of all stakeholders should be ensured. The Government should, in particular, identify those groups and communities that are most at risk and secure their representation in the consultation(s). If transparency and inclusiveness require the presence of all key stakeholders at the table, consensus demands that all voices be heard and considered. Therefore, we request that there should be mechanisms to ensure that the views and concerns of stakeholders are being seriously considered and that compromises are not necessarily based on majority positions.

Given the wide-ranging and serious implications of the Rules, it is necessary that citizens’ groups and society at large be kept informed at all stages of drafting these regulations. The drafters should also explain to the people why they have produced the text they have: what the need for it was, what factors they considered, what compromises they struck with negotiators and why, and how they reached agreement on the final text.

Finally, input from civil society and stakeholders should be sought to define the words and phrases in the Rules that create penal offences. Many of the definitions and terms in the Rules (which will be discussed shortly) are either loosely defined or lack clarity. It is suggested that discussions be held to determine clear meanings of such terms.

Objections to the Rules:
1: The Rules exceed the scope of their Parent Acts:

The Rules have been notified under provisions of the Pakistan Telecommunication (Re-organisation) Act, 1996 (PTRA) and the Prevention of Electronic Crimes Act (PECA) 2016 (hereinafter collectively referred to as the ‘Parent Acts’). The feedback form on the PTA website notes that the Rules are formulated “exercising statutory powers under PECA section 37 sub-section 2.” Under the Rules, the Pakistan Telecommunication Authority (PTA) is the designated Authority.

It is submitted that the scope and scale of action defined in the Rules go beyond the mandate given under the Parent Acts. The government is reminded that rules cannot impose or create new restrictions that are beyond the scope of the Parent Act(s).

It is observed that Rule 3 establishes the office of a National Coordinator and consolidates (through delegation) the powers granted to the Authority under PECA and PTRA to regulate online content. While we understand that Section 9 of PTRA allows delegation of powers, no concept of delegation of powers exists under PECA. Therefore, to pass on the powers contained in PECA to any third body is a violation of the said Act. Even under PTRA, powers can only be delegated to the chairman/chairperson, a member or any other officer of PTA (re: Section 9), in which case the autonomy and independence of the National Coordinator would remain questionable.

Without conceding that powers under PECA can be delegated to the National Coordinator, it is still submitted that Rule 7 goes beyond the scope of PECA (and also violates Article 19 of the Constitution, which will be discussed later). Section 37(1) of PECA grants power to the Authority to “remove or block or issue directions for removal or blocking of access to an information through any information system....” While Section 37(1) confers limited powers to only remove or block access to particular information, the power to remove or block an entire information system or social media platform (conferred upon the National Coordinator by virtue of Rule 7) is a clear case of excessive delegation.

Further, the Rules require social media companies to deploy proactive mechanisms to ensure prevention of live streaming of fake news (Rule 4) and to remove, suspend or disable accounts or online content that spread fake news (Rule 5). Rule 5(f) also obligates a social media company, if communicated by the Authority that online content is false, to “put a note to that effect along with the online content”. It is submitted that the powers to legislate on topics like fake news and misinformation are not granted under the Parent Acts. It is also unclear where the wide powers granted to the National Coordinator under Rule 3(2) to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies, and summon official representatives of social media companies stem from. These Rules are, therefore, ultra vires the provisions of the Parent Acts.

Recommendations:
  • Remove the office of the National Coordinator and the powers it wrongly assumes under PECA (re: the power to block online systems (Rule 7)).
  • If any such power is to be conferred, then limit that power to the removal or suspension of information on an information system, as opposed to the power to block the entire online system.
  • Establish criteria for the selection and accountability of the National Coordinator.
  • Ensure autonomy and independence of the National Coordinator.
  • Introduce mechanisms, such as a public database or directory, to ensure transparency from any authority tasked with regulation and removal of content regarding the content removed and the reasons for such removal.
  • Omit provisions regulating ‘fake news’ as they go beyond the scope of Parent Acts.
  • Omit Rule 5(f), i.e. the obligation to issue fake news corrections, as it goes beyond the scope of the Parent Acts. Alternatively, social media companies should be urged to present ‘fact checks’ alongside any false information.
  • Exclude from the powers of the National Coordinator the ability to advise the Provincial and Federal Governments, issue directions to departments, authorities and agencies and to summon official representatives of social media companies, contained in Rule 3(2), as they go beyond the scope of the Parent Acts.
2: Arbitrary Powers:

It is observed that the Rules have granted arbitrary and discretionary powers to the National Coordinator and, at the same time, have failed to provide any mechanisms against the misuse of these powers.

Rule 4 obligates a social media company to remove, suspend or disable access to any online content within twenty-four hours, and in emergency situations within six hours, after being intimated by the Authority that any particular online content is in contravention of any provision of the Act, or any other law, rule, regulation or instruction of the National Coordinator. An ‘emergency situation’ is also to be exclusively determined by this office. On interpretation or permissibility of any online content, as per Rule 4 (2), the opinion of the National Coordinator is to take precedence over any community standards and rules or guidelines devised by the social media company.

It is submitted that this grants unprecedented censorship powers to a newly created National Coordinator, who has the sole discretion to determine what constitutes ‘objectionable’ content. These are extremely vague and arbitrary powers, and the Rules fail to provide any checks and balances to ensure that such requests will be used in a just manner. It is trite law that a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. However, Rule 4 encourages arbitrary and random acts and bestows upon the National Coordinator unfettered discretion to regulate online content, instead of even remotely attempting to provide any safeguards against abuse of power. Moreover, the power granted under Rule 5 to the National Coordinator to monitor the falsehood of any online content adds to these unfettered powers. It is concerning that while the National Coordinator has been granted extensive powers, including quasi-judicial and legislative powers to determine what constitutes online harm, the qualifications, accountability and selection procedure of the National Coordinator remain unclear. This will have a chilling effect on the content removal process, as social media companies will rush content regulation decisions to comply with the restrictive time limit, including in particularly complicated cases of free speech that require deliberation and legal opinion. Furthermore, smaller social media companies, which do not have the resources and automated moderation capacities that big tech companies such as Facebook or Google possess, will be disproportionately burdened with urgent content removal instructions.

Further, Rule 6 requires a social media company to provide to the Investigation Agency any information, data, content or sub-content contained in any information system owned, managed or run by the respective social media company. It is unclear whether the Investigation Agency is required to go through any legal or judicial procedure to make such a request, and whether it is required to notify or report to a court on seizure of any such information. Under the current PECA framework, there is still a legal process through which information or data of private users can be requested. This Rule, however, totally negates that process and gives the National Coordinator sweeping powers to monitor online traffic. The power under Rule 6 exceeds the ambit of Section 37 and runs parallel to data request procedures already established with social media companies.

Recommendations:
  • Re-consider the 24-hour time limit for content removal. It would be unreasonable to impose such a strict timeline, especially for content that relates to private wrongs/disputes such as defamation and complicated cases of free speech.
  • Insert a “Stop the Clock” provision by listing out a set of criteria (such as seeking clarifications, technical infeasibility, etc.) under which the time limit would cease to apply to allow for due process and fair play in enforcing such requests.
  • Formulate clear and predetermined rules and procedures for investigations, seizures, collection and sharing of data.
  • Rule 4 should be amended and the Authority tasked with removal requests be required to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarifications.
  • National Coordinator should not be the sole authority to determine what constitutes ‘objectionable’ online content; neither can this be left open for the National Coordinator to decide from time to time through its ‘instructions’.
  • Remove the powers to request, obtain and provide data to Investigating Agencies.
3: Vague Definitions:

It is an established law that “the language of the statute, and, in particular, a statute creating an offence, must be precise, definite and sufficiently objective so as to guard against an arbitrary and capricious action on part of the state functionaries.” Precise definitions are also important so that social media companies may regulate their conduct accordingly.

A fundamental flaw within these Rules is their vague, overly broad and extremely subjective definitions. For example, extremism (Rule 2(d)) is defined as ‘violent, vocal or active opposition to fundamental values of the state of Pakistan including...’ The Rules do not, however, define what constitutes or can be referred to as fundamental values of the state of Pakistan. Given the massive volume of content shared online, platforms may feel obliged to take a ‘better safe than sorry’ approach, which in this case would mean ‘take down first, ask questions later (or never).’ This threatens not only to impede legitimate operation of (and innovation in) services, but also to incentivize the removal of legitimate content. Moreover, honest criticism or a fair comment made regarding the Federal Government, or any other state institution, runs the risk of being seen as ‘opposition,’ as this word also lacks clarity.

Similarly, while social media companies are required to ‘take due cognizance of the religious, cultural, ethnic and national security sensitivities of Pakistan’ (Rule 4(3)), the Rules fail to elaborate on these terms. Further, ‘fake news’ (Rule 4(4) & Rule 5(e)) has not been defined, which adds to the ambiguity of the Rules. It is submitted that vague laws weaken the rule of law because they enable selective prosecution and interpretation, and arbitrary decision-making.

Rule 4(4) obligates a social media company to deploy proactive mechanisms to ensure prevention of live streaming of any content with regard to, amongst other things, ‘hate speech’ and ‘defamation.’ It should be noted that ‘hate speech’ and ‘defamation’ are both defined and treated as offences under PECA as well as the Pakistan Penal Code, 1860 (‘PPC’). It is also posited that determination of both these offences requires a thorough investigation and a trial under both of these laws. It is submitted that if a trial and investigation are necessary to determine these offences, then it would be nearly impossible for social media companies to ‘proactively’ prevent their live streaming. Additionally, social media companies already take down such material based on their community guidelines, which cover areas such as public safety, hate speech and terrorist content. For instance, during the Christchurch terrorist attack, while Facebook was unable to take down the livestream as it was happening, AI technology and community guidelines were used to remove all instances of the video from the platform within hours of the incident. The Rules, however, propose an unnecessary burden on social media companies, and if any content is hastily removed as being hateful or defamatory, without a proper determination or investigation, then such removal would not only implicate the person who produced or transmitted the content (given these are penal offences under PECA & PPC) but also condemn them unheard. Even otherwise, hate speech and defamation are entirely contextual determinations, where the illegality of material depends on its impact. Impact on viewers is impossible for an automated system to assess, particularly before or while the material is being shared.

It is also noted that Rule 4(4) is in conflict with Section 38(5) of PECA, which expressly rejects imposition of any obligation on intermediaries or service providers to proactively monitor or filter material or content hosted, transmitted or made available on their platforms.

Recommendations:
  • It is suggested that discussions be held amongst all stakeholders to determine clear and precise meanings of the following terms:
    • Extremism
    • Fundamental Values of the State of Pakistan
    • Religious, cultural and ethnic sensitivities of Pakistan
    • National Security
    • Fake News
  • Use alternate methods of countering hateful and extremist speech, such as investment in independent fact-checking bodies and funding for organisations developing counter-speech and tackling online speech against women and against gender, religious and ethnic minorities.
  • Formulate clear components of ‘active or vocal opposition’ to ensure it cannot be used to silence dissenting opinions.
  • Omit Rule 4(4) as it violates Section 38 (5) of PECA.
  • Content constituting ‘hate speech’ and ‘defamation’ should not be removed without a proper investigation.
4: Violates and Unreasonably Restricts Fundamental Rights:

The Rules, as they stand, pose a serious danger to fundamental rights in the country. In particular, the breadth of the Rules’ restrictions, and the intrusive requirements that they place on social media platforms, would severely threaten online freedom of expression, the right to privacy and the right to information.

It is submitted that Rule 4 is a blatant violation of Article 19 (freedom of speech, etc.) of the Constitution. It exceeds the boundaries of permissible restrictions within the meaning of Article 19, lacks the necessary attributes of reasonableness and is extremely vague in nature. Article 19 states that restrictions on freedom of expression must be “reasonable” under the circumstances, and must be in aid of one of the legitimate state interests stated therein (“in the interests of the glory of Islam, integrity, security, or defence of Pakistan…”). The Rules, however, require all social media companies to remove or block online content if it is, among other things, in “contravention of instructions of the National Coordinator.” It is to be noted that deletion of data on the instructions of the National Coordinator does not fall under the permissible restrictions of Article 19, as it is an arbitrary criterion for the restriction of fundamental rights. Furthermore, a restriction on freedom of speech may only be placed in accordance with ‘law’, and an instruction passed by the National Coordinator does not qualify as law within the meaning of Article 19.

It must also be noted that Rule 7 (Blocking of Online System) violates Article 19 of the Constitution, which only provides the power to impose reasonable ‘restrictions’ on free speech in accordance with law. It is submitted that in today’s digital world, online systems allow individuals to obtain information, form, express and exchange ideas, and are mediums through which people express their speech. Hence, entirely blocking an online system would be tantamount to blocking speech itself. The power to ‘block’ cannot be read under, inferred from, or assumed to be a part of the power to ‘restrict’ free speech. It was held in the Civil Aviation Authority case that “the predominant meanings of the said words (restrict and restriction) do not admit total prohibition. They connote the imposition of limitations of the bounds within which one can act...” Therefore, while Article 19 allows imposition of ‘restrictions’ on free speech, the power to ‘block’ an information system entirely exceeds the boundaries of permissible limitations under it and is a disproportionate method of achieving the goal of removing harmful content on the internet, rendering Rule 7 inconsistent with the Constitution as well (as discussed previously, Rule 7 also goes beyond the scope of Section 37(1) of PECA).

As has already been discussed above, a restriction on freedom of speech will be unreasonable if the law imposing the restriction has not provided any safeguards against arbitrary exercise of power. Rule 4 violates this principle by encouraging arbitrary and random acts and bestowing upon the National Coordinator unfettered discretion to regulate online content without providing any safeguards against abuse of power. The Rules do not formulate sufficient safeguards to ensure that the power extended to the National Coordinator would be exercised in a fair, just and transparent manner. The power to declare any online content ‘harmful’ and to search and seize data, without any mechanism for questioning the authority, threatens the privacy and free speech of both companies and the people.

The fact that the government has asked social media companies to provide any and all kinds of user information or data in a ‘decrypted, readable and comprehensible format’, including private data shared through messaging applications like WhatsApp (Rule 6), and that too without defining any mechanisms for gaining access to the data of anyone being investigated, shows that it is concerned neither with due process of law nor with the potential violations of citizens’ right to privacy.

Finally, Rule 5 obligates social media companies to put a note along with any online content that is considered or interpreted to be ‘false’ by the National Coordinator. Not only does this provision add to the unfettered powers of the National Coordinator, to be exercised arbitrarily, but it also makes the Coordinator in charge of policing truth. This violates the principle of freely forming an ‘opinion’ (a right read under Article 19), as the National Coordinator now decides, or dictates, what is true and what is false.

Recommendations:
  • Amend Rule 4 and exclude from it the words “the instructions of the National Coordinator” as the same violates Article 19 of the Constitution.
  • Omit Rule 7 as it violates Article 19 and does not fall under the ‘reasonable restrictions’ allowed under the Constitution.
  • Formulate rules and procedures for investigations, seizures and collection of data which are in line with due process safeguards.
  • Rule 4 should be amended to require the regulatory body to give ‘cogent reasons for removal’ along with every content removal request. If those reasons are not satisfactory, the social media company should have the right to seek further clarifications.
  • The authority tasked with content removal should not be the sole authority to determine what constitutes ‘objectionable’ online content; neither should it be left open for the authority to decide from time to time through its ‘instructions’.
5: Data Localisation:

Rule 5 obligates social media companies to register with the PTA within three months of coming into force of these Rules. It requires a social media company to establish a permanent registered office in Pakistan with a physical address located in Islamabad and to appoint a focal person based in Pakistan for coordination with the National Coordinator.

It is submitted that the requirement of registering with PTA and establishing a permanent registered office in Pakistan, before these companies can be granted permission to be viewed and/or provide services in Pakistan, is a move towards “data localisation” and challenges the borderless nature of the internet, a feature that is intrinsic to the internet itself. Forcing businesses to create a local presence is outside the norms of global business practice and can potentially force international social media companies to exit the country rather than invest further in Pakistan. It is unreasonable to expect social media companies to set up infrastructure in the country when the nature of the internet allows for services to be easily administered remotely. With the increase in compliance costs that comes with incorporation of a company in Pakistan, companies across the globe, including start-ups, may have to reconsider serving users in Pakistan. Consequently, users in Pakistan, including the local private sector, may not be able to avail a variety of services required for carrying out day-to-day communication, online transactions and trade/business-related tasks. Many businesses and organisations across Pakistan rely on the services provided by social media companies, particularly during the Covid-19 pandemic when reliance on the internet has increased substantially; the Rules will thus have an indirect impact on the economy as well. The requirements of local incorporation and physical offices will also have significant repercussions for taxation, foreign direct investment and other legal matters, along with negatively impacting economic growth.

To effectively defend against cybercrimes and threats, companies protect user data and other critical information via a very small network of highly secure regional and global data centers staffed with uniquely skilled experts who are in scarce supply globally. These centers are equipped with advanced IT infrastructure that provides reliable and secure round-the-clock service. The clustering of highly-qualified staff and advanced equipment is a critical factor in the ability of institutions to safeguard data from increasingly sophisticated cyber-attacks.

Mandating the creation of a local data center will harm cybersecurity in Pakistan by:

  • Creating additional entry points into IT systems for cyber criminals.
  • Reducing the quality of cybersecurity in all facilities around the world by spreading cybersecurity resources (both people and systems) too thin.
  • Forcing companies to disconnect systems and/or reduce services.
  • Fragmenting the internet and impeding global coordination of cyber defense activities, which can only be achieved efficiently and at scale when and where the free flow of data is guaranteed.

Preventing the free flow of data:

  • Creates artificial barriers to information-sharing and hinders global communication;
  • Makes connectivity less affordable for people and businesses at a time when reducing connectivity costs is essential to expanding economic opportunity in Pakistan, boosting the digital economy and creating additional wealth;
  • Undermines the viability and dependability of cloud-based services in a range of business sectors that are essential for a modern digital economy; and
  • Slows GDP growth, stifles innovation, and lowers the quality of services available to domestic consumers and businesses.

The global nature of the Internet has democratized information, which is available to anyone, anywhere in the world in an infinite variety of forms. The economies of scale achieved through globally located infrastructure have contributed to the affordability of services on the Internet, where several prominent services are available for free. Companies are able to provide these services to users even in markets that may not be financially sustainable because they do not have to incur the additional cost of setting up and running local offices and legal entities in each country where they offer services. These Rules will therefore harm the consumer experience of the open internet, increasing costs to the extent that offering services and technologies to consumers in Pakistan becomes financially unviable.

Recommendations:
  1. Scrap Rule 5 and abandon the model of data localisation as it discourages business and weakens data security of servers;
  2. Develop transparent and legally-compliant data request and content removal mechanisms with social media companies as an alternative to the model proposed in Rule 5.
Concluding Remarks:

We have discussed that the current consultations lack the essentials of ‘good faith’, which demands reexamination of the entire framework. We have also discussed that the Rules exceed the scope of the Parent Acts, accord arbitrary powers to the National Coordinator, use vague definitions and unreasonably restrict fundamental rights, which makes them liable to be struck down. In light of the above, we call upon the government to immediately withdraw the Rules and initiate the consultation process from scratch. The renewed consultation should be premised on tackling ‘online harm’ rather than on a discussion of the Rules alone. Consensus should be reached on the best ways to tackle online harms. This would require comprehensive planning, transparent and meaningful consultations with stakeholders and participation of civil society. Until this is done, Digital Rights Foundation will disassociate itself from any government initiatives that are seen as disingenuous efforts to deflect criticism.

 

June 22, 2020

DRF Condemns Move Against Open Source Technology and OTF

In a world where online freedoms are increasingly under threat from all sides, organisations that work on supporting a free and safe internet are more important than ever. This is why Digital Rights Foundation (DRF) is extremely worried by recent developments within the US government that might undermine the work the Open Technology Fund (OTF) does.

Serious concerns over the future of OTF were raised last week when it emerged that the new head of the United States Agency for Global Media (USAGM) was planning to push money and funding towards closed-source tools. OTF is an independent non-profit grantee of the USAGM and has been supporting organisations, journalists, human rights defenders and users by funding innovative and open-source projects which uphold internet freedoms across the world. This move prompted Libby Liu, the inaugural OTF CEO, to step down from her position, citing concerns that the new head of the USAGM intended to interfere with “the current FY2020 OTF funding stream and redirect some of our resources to a few closed-source circumvention tools."

OTF was one of the first supporters of DRF’s cyber harassment helpline, which has provided assistance to over 4000 individuals across Pakistan and continues to support journalists, activists, HRDs, women, children and vulnerable groups during the Covid-19 pandemic. The planned move by the USAGM threatens this support and similar work that OTF does with organisations globally. In the last eight years, OTF’s projects have “enabled more than 2 billion people in over 60 countries to safely access the internet.”

As beneficiaries, both direct and indirect, of the tools that OTF supports, we urge the US Congress to take concrete and immediate steps to ensure that OTF continues to support open-source and digital rights projects all over the world. We echo the demands made by the ‘Save Internet Freedom Tech’ coalition, including the demand for “all US-Government internet freedom funds to be awarded via an open, fair, competitive, and evidence-based decision process.”

The internet has enabled us to innovate, connect and thrive, particularly during the Covid-19 pandemic. We believe that internet freedom is the bedrock of what makes all these things possible on the internet, and organisations such as OTF which support the work of internet freedom are central to this foundation.

Sources:

https://www.vice.com/en_us/article/935k5p/open-technology-fund-ceo-resigns

https://digitalrightsfoundation.pk/wp-content/uploads/2020/06/Covid-19.pdf

https://saveinternetfreedom.tech/

June 19, 2020

Virtual ‘Private’ Networks no Longer Private as PTA Requires Registration

Areeba Jibril is a DRF intern focusing on issues related to privacy, free speech, and elections. She tweets at @AreebaJibril

Finding a Virtual Private Network (VPN) provider in Pakistan is easy. A quick Google search will pull up multiple free services. Casual internet users may register for these services to circumvent paywalls and access online content that has been blocked in Pakistan, without even really knowing what they’re signing up for. More sophisticated users may use VPNs to ensure that their IP addresses, and therefore their geographical location and identity, remain hidden from the websites they visit.
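To make the point about IP addresses concrete, here is a minimal sketch (not part of the PTA notification or of any DRF tooling) of how a website only ever learns the public IP address that a request arrives from; run it once on an ordinary connection and once with a VPN active, and the reported address changes from the user's own IP to the VPN server's. The use of api.ipify.org here is only an example of a public IP-echo service.

    # Minimal illustration: a remote server only sees the public IP of the
    # network egress a request arrives from. With a VPN tunnel active, that
    # is the VPN exit server's address rather than the user's own.
    # api.ipify.org is used here purely as an example IP-echo service.
    import urllib.request

    def visible_ip() -> str:
        # The responding server returns the address it received the request from.
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode("utf-8").strip()

    if __name__ == "__main__":
        print("IP address visible to websites:", visible_ip())

Running the same script over a normal connection and again over a VPN yields two different addresses, which is precisely the layer of anonymity that mandatory registration erodes.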

What casual users likely don’t know is that the Pakistan Telecommunication Authority (PTA) has announced a registration requirement for all Virtual Private Networks (VPNs) by 30th June 2020, twenty-two days after it first posted a public service announcement on its website. The PTA regulations do not ban the use of VPNs entirely, but they do require users to register their VPN use with their Internet Service Providers (ISPs). To do this, they must share their CNIC number, the purpose for which they would like to use a VPN, and the IP address they will be using their VPN with. The privacy intrusion is not limited to this information: the notification is vague, so it is difficult to say with authority how far the privacy intrusions may extend. There is online speculation about the extent of information that the government can freely request from non-VPN users and whether the same practices will apply to VPN users as well.

The Pakistani government claims they’ve added this requirement to support the Information and Communications Technology (ICT) industry and promote the “safety of telecom users.” But requiring registration of VPNs defeats the purpose for which VPNs were created. VPNs cannot be private if they must be registered with ISPs, who are then required to share the information with the government. The information flow doesn’t stop there – the government has contracted with Sandvine Corporation, a US-based company, to monitor ‘grey’ internet traffic.

The 10th June announcement downplays its own significance by claiming that the requirement is “not new”. It is true that users have reported their VPNs suddenly stopping working since 2011. However, this new announcement includes the threat of legal consequences, without much clarity on what those consequences will be. The drastic consequences for privacy do not need to be new to be concerning. The PTA claims to be using its authority under clause 4(6) of the Monitoring and Reconciliation of Telephony Traffic Regulations (MRITT), 2010.

VPNs can be helpful for the average internet user when they want to access content such as television shows that aren’t otherwise available in Pakistan. But they serve a much more important purpose in promoting freedoms of opinion and expression by protecting the privacy of users. By using a VPN, users can ensure that the websites they visit and the content they post cannot be traced back to them. For many, anonymity is an important part of what makes the internet a safe place.

David Kaye, the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, noted, “Encryption and anonymity provide individuals and groups with a zone of privacy online to hold opinions and exercise freedom of expression without arbitrary and unlawful interference or attacks… A VPN connection, or use of Tor or a proxy server, combined with encryption, may be the only way in which an individual is able to access or share information in [environments with prevalent censorship].”

As the list of registered VPN users will be shared with ISPs, the risk of private information being accessed by those with malicious intent will increase dramatically. Without the ability to hide their physical location, users will be in greater danger if they use the internet to communicate discontent with the government and seek help anonymously. 

Some users may decide they cannot risk this intrusion to their privacy and refuse to register their VPNs. It is unclear how these users will be treated. The government can request that non-registered users have their VPNs blocked. However, they have also said that users who fail to register their VPNs can face legal consequences if they cause “loss to the national exchequer.” They maintain that they are adding this requirement to terminate “illegal traffic.” These vague terms should be a great cause of concern. What is illegal traffic? What will be considered a “loss to the national exchequer”? When will users be held legally accountable for failing to register their VPNs? The lack of guidance increases the risk that these laws will be used to target political dissidents and unpopular speech.

The notification concerning VPNs, coupled with the news from a few months back regarding ‘Deep Packet Inspection’ (DPI), poses a serious threat to online privacy and security for ordinary Pakistani citizens. DPI allows unprecedented access to a private individual’s activity online. The added issue with DPI technology is that the government has been incredibly silent on how it plans to use the technology and what its purpose is. This silence and general vagueness is similar to what we are witnessing now with the notification regarding VPNs in the country.

Pakistan is not alone in regulating the use of VPNs. Belarus, China, Iran, Turkey, Iraq, Syria, Oman, Russia, Uganda, the UAE and Venezuela have either introduced measures to restrict the use of VPNs or banned their use outright. Iran allows the use of VPNs, but only if providers are Iranian, while Russia bans VPN usage for sites that have previously been blocked by Russia’s governing body for telecommunications and mass media communications. Consequences for using VPNs are also wide-ranging. In China, the government has gone so far as to arrest a VPN provider. In Oman, private users face a 500 rial fine (about $1,300 USD).

Given the human and digital rights track record of these countries, this is not a list of countries that Pakistan should want to be on.  

Sources:
https://www.pta.gov.pk/en/media-center/single-media/public-notice---get-your-vpn-registered-080620

http://tickets.nexlinx.net.pk/index.php?/News/NewsItem/View/45

https://www.dawn.com/news/1512784

The coming Pakistan VPN ban: PTA sets deadline for VPN users to register by June 30th
https://www.pta.gov.pk/media/monitoring_telephony_traffic_reg_070510.pdf

https://www.amnesty.org/en/latest/news/2017/08/vpns-are-a-vital-defence-against-censorship-but-theyre-under-attack/
Where are VPNs legal and where are they banned?
https://www.bbc.com/news/technology-41160383

June 11, 2020

Quetta Internet Shutdown

This article has been authored by Zainab Durrani who is a Project Manager at DRF

The recent internet shutdown in Balochistan’s provincial capital, Quetta, has been noted by the Digital Rights Foundation with grave concern.

Two days without the Internet

For over 48 hours, between 30 May and 2 June 2020, Quetta remained without access to mobile internet services. This occurrence, especially in the middle of the global COVID-19 pandemic that is currently at its peak in Pakistan, is an egregious infringement of residents’ right of access to information, effectively cutting them off from the rest of the world and depriving them of potentially essential and lifesaving information.

Internet shutdowns are a deliberate effort to cut off particular communities from access to the internet, which includes information, social media platforms and services accessible online. Internet shutdowns come in myriad forms: the relevant authority can throttle access for a specific section of the population by cutting off bandwidth; institute broadband or mobile internet shutdowns (“internet blackouts”); impose blanket internet shutdowns or mobile phone call and text message network shutdowns; or carry out service-specific (platform) shutdowns, e.g. cutting off access to platforms like Twitter or Facebook.

This particular shutdown purportedly came after escalating tensions between members of the Hazara and Pashtun communities in Quetta. The deaths of three young men from the Pashtun community led to unrest in the city, and the eventual blockage of internet services was purportedly intended to quell this unrest. However, as per sources, the reasons for the shutdown were unknown to the provincial government at the time, which was unsure as to why the Pakistan Telecommunication Authority (PTA) had suspended services in the city. Additionally, no official public notice was given by the PTA to communicate the shutdown and its expected duration.

Concerns for residents

Internet shutdowns are an ineffective way of dealing with unrest in a particular locality: they are a disproportionate measure, and human rights groups around the world have pointed out that they have the potential to engender more panic in the absence of access to information.

The shutdown impacted the work of many throughout the city. Journalist Hafizullah Sherani of Voice of America described his difficulty in filing reports, going to extraordinary lengths that involved seeking connectivity on the roadside, in front of a friend’s office, at 2 AM, ironically in order to file his story on the shutdown itself. Saadullah Akhter from Balochistan Express echoed this experience, noting that ‘it was an abrupt suspension when the city was in grip of tension following Hazara Town lynching hence we faced immense difficulty in getting accurate information over the incident and sharing it with other colleagues and newsrooms.’

This shutdown impacted the lives of residents from all walks of life who, in the process of getting through an unprecedented pandemic, are relying heavily on connectivity not only to remain in touch with friends and family, but to coordinate efforts to arrange resources such as plasma from recovered patients to help those suffering from COVID-19.

Journalist Rani Wahidi notes that the shutdown was a problem not only for her work as a field reporter, but also as a citizen who could not communicate with her family through secure channels like WhatsApp to keep in touch throughout the day or share her location with them for safety purposes. The shutdown compounded the difficulties of those stepping outside their homes to work during a global pandemic.

Are shutdowns effective?

Shutdowns are a common tactic used by the state to ensure elusive aims such as “security” and “safety”. This is particularly so in Balochistan which faces frequent internet shutdowns and connectivity issues. For instance, in 2018 parts of Balochistan witnessed three shutdowns over the course of a week, one of which occurred during the Pashtun Long March.

As per Access Now, there were at least 213 recorded instances of internet shutdowns across 33 countries in 2019 alone. Not only do these shutdowns generate a social cost that impedes human rights, there is also an economic cost, and a hefty one at that.

Researchers Samuel Woodhams and Simon Migliano report that:

"In economic terms, disruptions not only affect the formal economy but also the informal, especially in less well-developed nations. There can also be lasting damage with the loss of investor confidence and faltering development, all of which makes our estimates conservative.”.

"On the human rights side, these shutdowns clearly impact citizens' freedom of expression and the right to information and may even result in an increase in violence."

Internet shutdowns often have a severe impact on freedom of assembly and association as well as mobility. Sadia Baloch, activist and member of the Baloch Students Organization (BSO), said that the shutdown impacted the protest they were organizing for four-year-old Bramsh Baloch, who lost her mother to violence and was herself injured in Turbat, Balochistan.

‘... it specifically affected our protest which was on the next day, our mobilization was affected as very few people got the news and the rest of Balochistan has no internet facility which is a problem itself.’

While time-bound and location-specific internet shutdowns are very common, there have been long-term shutdowns in the country as well. The former Federally Administered Tribal Areas (FATA) of Pakistan have been facing an internet shutdown for four years now: 1,460 days, give or take. ‘In early June 2016, at Torkham, the border forces of Pakistan and Afghanistan clashed over the construction of a gate by the Pakistani authorities on the border. This clash led to the suspension of 3G/4G services in bordering towns and tribal areas.’

The suspension of services is legally condoned under Section 54 of the PTA Act, which covers national security. Section 54(3) in particular reads: ‘Upon proclamation of emergency by the President, the Federal Government may suspend or modify all or any order or licences made or issued under this Act or cause suspension of operation, functions or services of any licensee for such time as it may deem necessary.’

This is despite the Islamabad High Court (IHC) ruling that mobile network shutdowns, including mobile-based internet suspensions, were illegal. The judgment, from February 2018, indicated that access to telecommunication services is a fundamental right of the citizens of Pakistan, and any attempt to suspend said services is a violation of their constitutional rights. The case is currently pending on appeal.

Digital rights activist Usama Khilji of Bolo Bhi expressed his concerns by noting: 

‘The long standing internet shutdown in ex-FATA is a gross violation of the fundamental rights to information and freedom of expression and increasingly the right to education as guaranteed by the Constitution. Millions of Pakistani citizens cannot be left out of internet access as it impacts their ability to communicate, access information, and access education especially since the pandemic started. The Universal Service Fund set up by the government & contributed to by telecom companies must immediately be utilised to enable internet access in ex-FATA.’

Over the last few years, the situation has taken a turn for the worse in terms of the cost paid by those cut off from the internet. Currently, as students hailing from outside metropolitan areas have had to return home due to the spread of the coronavirus and more people are working from home, blanket and arbitrary shutdowns will have a disproportionate effect, depriving them of access to information, work and an education.

As a member of the #KeepItOn campaign, which consists of 158 organizations from 65 countries devoted to fighting internet shutdowns, DRF is committed to reporting on and continuing its advocacy for constant and safe access to the internet for all.

Sources:
https://slate.com/technology/2020/04/pandemic-internet-shutdown-danger.html

https://voicepk.net/quetta-tensions-lead-to-internet-shutdown/

https://www.poynter.org/fact-checking/2019/sri-lanka-blocked-social-media-to-stop-misinformation-about-the-easter-terror-attacks-but-it-didnt-work/

https://www.accessnow.org/pakistan-shuts-down-the-internet-three-times-in-one-week/

https://www.accessnow.org/cms/assets/uploads/2020/02/KeepItOn-2019-report-1.pdf

https://edition.cnn.com/2020/01/09/tech/internet-shutdowns-cost-intl-hnk/index.html

https://thediplomat.com/2020/06/balochistan-erupts-in-protests-over-a-murdered-mother-and-her-injured-4-year-old/

https://thediplomat.com/2019/03/balochistans-great-internet-shutdown/

https://digitalrightsfoundation.pk/islamabad-high-court-ruled-mobile-network-shutdowns-illegal/

 

June 3, 2020

May 2020: DRF celebrates World Press Freedom Day

Online campaigns and initiatives

World Press Freedom Day

Digital Rights Foundation conducted a three-day campaign to raise awareness on World Press Freedom Day, 3rd May 2020, on how the COVID-19 pandemic and the subsequent surge in misinformation are impacting press freedom. The campaign included three Facebook Live sessions hosted by our Executive Director Nighat Dad and other team members with amazing journalists including Amber Rahim Shamsi, Maham Javaid and Najia Asher. The themes of discussion revolved around coronavirus and journalism: the disinfodemic and its impact on independent journalism; the journalistic frontline, transforming newsrooms and increased stress; and surveillance structures and press freedom. DRF also shared resources and tools by other organisations that can help journalists in fact-checking information around COVID-19. Five infographics on the theme were also developed and shared on DRF’s socials for awareness raising.

Together for Reliable Information Campaign

DRF participated in Free Press Unlimited’s campaign to highlight the work that our organisation and the members of its Network of Women Journalists are doing to provide people with reliable information. The campaign included five short video interviews with our team and women journalists working at the frontline to report COVID-19-related news from the field. DRF developed a digital hygiene toolkit for journalists and the wider community of content creators and bloggers, which will be launched in June 2020. Ten picture stories covering the journalistic frontline of COVID-19 were also included in the campaign.

DRF Annual Report

DRF launched its Annual Report for 2019, highlighting all the important work that we have done so far. Through this report we would like to celebrate our team and its efforts over the last year. We aim to keep striving towards a safe and equal digital experience for all.

See our annual report here:

https://digitalrightsfoundation.pk/wp-content/uploads/2020/05/Annual-Reort-2019..pdf

Policy Initiatives

Personal Data Protection Bill Recommendations

DRF submitted a thorough legal analysis of the ‘Personal Data Protection Bill 2020’. Within the analysis, recommendations were given to the government on how to further strengthen the bill. DRF also developed a video series around the bill to explain key terms and concerns within it.

Read the entire analysis here:

https://digitalrightsfoundation.pk/wp-content/uploads/2020/05/PDPB-2020_-Final-Analysis_05.05.2020-1.pdf

Statement on accountability in Waziristan honour killings

Two women were killed in the name of honour in Waziristan after a man leaked a short video of the two on social media. The video was leaked without the girls’ consent and contained private imagery. DRF released a statement expressing its outrage over this and demanding that justice be served in this case.

Read the full statement here:

https://digitalrightsfoundation.pk/digital-rights-foundation-urges-for-accountability-in-waziristan-honour-killings/

Statement on Violations of Privacy & Condemnation of Moral Policing in the Uzma Khan Case:

DRF released a statement expressing its concerns around the privacy violations and moral policing of actress Uzma Khan. A video of Uzma Khan being bullied in her own home was leaked without her consent, in clear violation of her privacy. The video led to the character assassination and slut-shaming that is common in cases where women assert their bodily autonomy outside the bounds of marriage.

Read the full statement here:

https://digitalrightsfoundation.pk/digital-rights-foundation-is-gravely-concerned-with-the-violations-of-privacy-condemns-moral-policing-in-uzma-khan-case/

Media Engagement

Digital rights activist Nighat Dad part of Facebook's 'supreme court' for content

DRF is proud to announce that our ED Nighat Dad is now a part of Facebook’s newly announced Oversight Board, which will oversee decisions regarding content published on the social media network and Instagram. Nighat is one of 20 board members from across the globe promoting women’s rights, human rights and freedom of expression on the board.

Read the full article here:

https://www.dawn.com/news/1555236/digital-rights-activist-nighat-dad-part-of-facebooks-supreme-court-for-content

Pakistan's 'honor killings' show women need digital skills, says Facebook oversight board member

Nighat Dad spoke to Thomson Reuters Foundation news about the honour killings of two girls in Waziristan over a video leaked online. She highlighted how the internet can be used against women, and how this is especially true in a patriarchal society like Pakistan.

Read the full article here:

https://news.trust.org/item/20200521151630-xewbx

Article on privacy concerns with tech to tackle Covid-19

DRF's Shmyla Khan and Zainab Khan Durrani wrote about the framework of the data protection bill and why it may be a cause of concern for citizens. They also discussed the emergence of invasive technology to tackle the virus and why it is important for citizens to be vigilant.

Read the full article here:

https://www.thenews.com.pk/tns/detail/659111-pandemic-tech-and-privacy-concerns

Events and Sessions

Coronavirus and Children Series

DRF’s Executive Director Nighat Dad spoke on UNICEF’s webinar on ‘Coronavirus & Children Series’ highlighting how access to ICTs is a fundamental right for everyone. She also spoke about the prevalent inequalities in online spaces in Pakistan and how girls in Pakistan are under more scrutiny online as compared to boys.

Drawing lessons from COVID 19

DRF spoke at a Reporters Without Borders (RSF) webinar on drawing lessons from COVID-19, held to mark World Press Freedom Day. DRF's ED Nighat highlighted the unequal access to information between urban and rural areas and how this in itself has created a hierarchy amongst citizens. She also highlighted the rise of misinformation in Pakistan and the problems it poses during the pandemic.

Fireside Chat with Nighat Dad

The NEST I/O's Startup Pulse hosted a fireside chat between DRF's Nighat Dad and Jehan Ara. The two discussed the workings of the data protection bill and cybercrime laws in Pakistan. There was also a discussion around online hate speech and how to curb it in times of COVID-19.

Inequalities: Bridging the Divide

Nighat Dad spoke at the #UN75 dialogue on 'Inequalities: Bridging the Divide' with 200 students, faculty and participants of NUST University from across the globe. Nighat spoke about inequalities across the internet along with legal and gender-based inequalities in Pakistan. She also shed light on inequalities in freedom of expression in the country.

Read more about the session here:

https://nation.com.pk/29-May-2020/nust-hosts-un75-dialogue

Pandemic Contact Tracing, privacy and data protection

DRF participated in the virtual #ImpactTalk that focused on issues with contact tracing applications from the perspective of data protection and civil rights. It was pointed out by the speakers that while it is important to address the public health emergency caused by the COVID-19 pandemic, a balance needs to be struck with civil liberties both in the short and long term. This is particularly important in the South Asian context where there is a dearth of data protection regulations and privacy rights.

TWEETCHAT: Feminist Responses to the “Bois Locker Room”

DRF participated in a Tweetchat on May 27, 2020 organised by “End Cyber Abuse” which focused on the issue of masculinities and their manifestation in online spaces, particularly in light of the “Bois Locker Room” incident in India.

Giving Tuesday ChaynHQ

Chayn raised funds for its friends and partners through an initiative called 'Giving Tuesday'. The money raised for DRF's Cyber Harassment Helpline supports the three basic services it provides to callers: digital security assistance, legal aid and psychological counselling.

DRF Collaborated With Accountability Labs 

We worked with Accountability Labs on an issue of their 'Pakistan Coronavirus CivAct Campaign' newsletter. This particular issue was about digital safety and security during the pandemic. The newsletter covered the types of attacks people were facing through the lockdown and the preventative and defensive measures they can take.

COVID Updates

Cyber Harassment Helpline

The Cyber Harassment Helpline is working virtually and is taking calls Monday to Friday, 9 am to 5 pm. The helpline number is 0800-39393 and it provides three basic services: legal help, psychological assistance and digital security assistance.

Ab Aur Nahin

During COVID-19, domestic abuse is at an all-time high and many victims have nowhere to go. Ab Aur Nahin is a confidential legal and counselling support service specifically designed for survivors of harassment.

Contact us now:

https://abaurnahin.pk/

IWF Portal:

Digital Rights Foundation (DRF) in collaboration with the Internet Watch Foundation (IWF) and the Global Fund to End Violence Against Children launched a portal to combat children’s online safety in Pakistan. The new portal allows internet users in Pakistan to anonymously report child sexual abuse material in three different languages – English, Urdu, and Pashto. The reports will then be assessed by trained IWF analysts in the UK.

The portal can be accessed at:

https://report.iwf.org.uk/pk

 

 

June 3, 2020 - Comments Off on COVID 19 and Cyber Harassment: DRF Releases Lockdown Numbers

COVID 19 and Cyber Harassment: DRF Releases Lockdown Numbers

DRF established the Cyber Harassment Helpline in December 2016. The services we have offered since then include legal support for victims of online harassment, digital security assistance and psychological counselling.

As Pakistan entered its lockdown in response to the COVID-19 outbreak, we feared there would be an increase in cyber harassment cases as well as cyber attacks in general. To explore this possibility, we analyzed the data from our Cyber Harassment Helpline for March and April 2020 and compared it to the data from January and February 2020, to see how cases have grown during the lockdown. Given that the pandemic became a public health emergency in Pakistan in March 2020, we feel that the comparison can reflect the changing patterns of online harassment and violation in relation to the social ramifications of COVID-19. This analysis is being released in the form of a policy brief and includes a list of recommendations for concerned stakeholders.

As compared to January and February, March and April saw an increase of 189% in complaints registered with our Cyber Harassment Helpline. 74% of the cases in March and April were reported by women, 19% by men, and 5% by gender non-binary persons. When the lockdown was enforced in March, for the safety of our employees, we had to close our office as well as shut down our Helpline’s toll-free number. This massive bump in recorded complaints came through email and our social media. 

We have found that “the forms of gendered violence that are largely directed at women in the digital sphere usually include sexual harassment, surveillance, unauthorized use and dissemination of personal data, and manipulation of personal information including images and videos. This form of violence acts as a significant barrier to women’s expression of themselves as well as meaningful engagement with the internet. A majority of the cases that the Digital Rights Foundation’s cyber harassment helpline received digitally during lockdown (April and May) pertained to blackmailing through non-consensual sharing of information, intimate pictures and videos.” 

Alongside this data, we are also releasing a list of 14 recommendations for relevant stakeholders. These cover the FIA's accessibility, especially during the pandemic, and how technology can be used hand in hand with existing processes when dealing with harassment cases, such as allowing video testimonies.

During the pandemic, the Cyber Harassment Helpline has been working hard to provide uninterrupted services to complainants of online harassment, while ensuring the safety and well-being of our staff. Early in the lockdown period, we switched exclusively to online platforms; however, we have since restored the toll-free number with the cooperation of the Pakistan Telecommunication Authority and PTCL.

Our full policy brief is attached to this email. For more information on this policy brief and on the work of our Cyber Harassment team, you can get in touch with them using this email: 

helpdesk@digitalrightsfoundation.pk 

May 30, 2020 - Comments Off on Digital Rights Foundation is Gravely Concerned with the Violations of Privacy & Condemns Moral Policing in Uzma Khan case

Digital Rights Foundation is Gravely Concerned with the Violations of Privacy & Condemns Moral Policing in Uzma Khan case


It is no secret that the internet is not a safe place for women, much like most spaces in society. Tools and technologies are repeatedly weaponised to harass, shame and silence women, recreating oppressions and patriarchal power structures that have enacted violence on women's bodies and freedoms for centuries.

Earlier this week, a video of Uzma Khan, terrified and being bullied in her own home, was leaked without her consent and in clear violation of her privacy. It set off the character assassination and slut-shaming that is common in cases where women assert their bodily autonomy outside the bounds of marriage. Women's sexuality is heavily controlled through penal laws and moral policing that seek to negate their consent and autonomy. Women stepping outside traditional gender roles or the respectability of the family unit are shamed for their choices, and the video was an example of technology-enabled moral policing. Subsequently, as videos of the attack emerged on social media, they prompted outrage from some at the blatant use of power to punish a woman for moral transgressions, but also voyeuristic viewings from those baying for entertainment. The manner in which women's presence and bodies are objectified and consumed online often obscures the larger structural issues and power dynamics at play in such cases, an exercise that even well-wishers often wilfully participate in.

Privacy has traditionally been used as a concept to confine women to their homes and insulate violence within the family from accountability: the concept of "chaar devari", the privacy of the women of the family, has been weaponised to keep women within the domestic sphere and to invisibilise violence within the home. Feminist interventions on the right to privacy, however, centre it as a means of safety and of preserving individual human dignity, a shield to protect the vulnerable against powerful institutions and individuals. Uzma's right to privacy within her home, and over her videos and personal information, is crucial, particularly in a case where the power dynamics are stacked against her. The fact that, after the filing of the FIR, Uzma's personal details, such as her home address, were put on the internet and widely disseminated reminds us of the dangers of doxxing, which played a part in the horrific murder of Qandeel Baloch. The disregard for Uzma's privacy, opening up her personal life for public consumption, is extremely troubling and dangerous.

We call on the law enforcement bodies to demonstrate their independence and fairness by following through on the registered FIR and taking steps to ensure that the inquiry and subsequent case are fair and transparent. Furthermore, we believe that protection should be provided to Uzma and her family with due regard to their privacy. At the same time, we also recognise the limitations of the law and the justice system in providing restorative justice for the loss suffered. Additionally, the law is often instrumentalised to serve the interests of the capitalist-patriarchal order, reproducing the status quo through coerced compromises and police malpractice.

May 30, 2020 - Comments Off on COVID-19 GOV PK: The Tech to Battle Coronavirus

COVID-19 GOV PK: The Tech to Battle Coronavirus

As COVID-19 has spread across Pakistan, questions have been raised about how the Government will tackle the spread of the virus. Across the globe we have seen different approaches to this, varying from comparatively relaxed to extremely stringent.

A popular global approach to health surveillance has been contact tracing[1], followed by surveillance and testing. Contact tracing is an old public health technique which tracks an infected person by tracing the places they visited and the people they met. In order to stem the spread of the virus, all those who came into contact with the infected person are then tracked down, informed of their contact and told to self-isolate, or are immediately tested for the virus. This process is repeated for each new case and is supposed to help 'map' the virus as it spreads. In some countries, mobile applications have been launched to track the virus and help people see 'where' the virus is.
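
To make the mechanics concrete, this iterative process can be pictured as a traversal over a 'contact graph'. The sketch below is purely illustrative, written in Python with invented names and data, and assumes a simple in-memory record of who met whom; it is not any health authority's actual system.

    # Illustrative sketch: contact tracing as a breadth-first traversal over a
    # hypothetical contact graph (all names and data here are invented).
    from collections import deque

    contacts = {
        "patient_0": ["A", "B"],   # people the index case met
        "A": ["C"],
        "B": [],
        "C": ["D"],
    }

    def trace(index_case, contacts, max_depth=2):
        """Return everyone within max_depth contact links of the index case."""
        seen, queue, to_notify = {index_case}, deque([(index_case, 0)]), []
        while queue:
            person, depth = queue.popleft()
            if depth == max_depth:
                continue
            for contact in contacts.get(person, []):
                if contact not in seen:
                    seen.add(contact)
                    to_notify.append(contact)       # ask to self-isolate or get tested
                    queue.append((contact, depth + 1))
        return to_notify

    print(trace("patient_0", contacts))  # ['A', 'B', 'C']

In practice, each newly confirmed case restarts this process, which is part of why manual tracing becomes difficult to sustain as case numbers grow and why app-based approaches have been proposed.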

These apps act as a way for governments to warn the public about cases nearby, and also allow people to report themselves as patients, so as to keep the cycle of contact tracing going. While such extensive mapping may be helpful for tracking the disease at the macro level, these apps present, on the flip side, major privacy concerns.

Take for example this detailed account of South Korea’s Patient #10422:

Before being diagnosed, patient #10422 visited the Hanaro supermarket in Yangjae township on March 23 from 11:32 p.m. to 12:30 a.m. The patient was accompanied by their spouse, both wearing masks and using their own car for transportation. On March 27, the pair visited the Yangjae flower market from 4:52 p.m. to 5:18 p.m., again wearing masks. They then had dinner at the Brooklyn The Burger Joint at Shinsegae Centum Mall from 6:42 p.m. to 7:10 p.m. This detailed record can be found, publicly available, on many government websites, and is a testament to the extensive contact tracing carried out by Korean authorities.[2]

The minutiae of this account goes to show the extent to which data is being collected and observed.

In many instances, the state response has been immediate and comprehensive, which hints at such tech and mechanisms having been in place before the pandemic swept the globe, as is apparent from Pakistani PM Imran Khan's statement: "It (system for tracking and tracing) was originally used against terrorism, but now it has come in useful against coronavirus."[3] This necessitates the inclusion of a detailed data protection and destruction policy to accompany the launch of such apps, mandating the destruction of the data once their health-related utility is over.

[1]https://www.brookings.edu/techstream/how-surveillance-technology-powered-south-koreas-covid-19-response/

[2]https://www.brookings.edu/techstream/how-surveillance-technology-powered-south-koreas-covid-19-response/

At home, our concerns begin from the knowledge that the government of Pakistan is implementing a policy of mapping that involves tracking citizens and their movements. Internationally, there has been debate about the efficacy of contact tracing; at the same time, some countries have seen success with this policy. In the context of Pakistan, unfortunately, these measures are accompanied by a lack of trust between the State and citizens. Multiple instances[4] of citizens' data being leaked from one of the biggest national biometric databases in the world, i.e. the NADRA database, have created a faith deficit. Instances of CNIC and family registration certificate (FRC) information being sold online for as little as $1-2 a piece, due to data leaks at the provincial and possibly national level, cement this belief.

The “COVID-19 Gov PK” app, released by the National Information Technology Board (NITB) and the Ministry of National Health Services, has been available for use since early April and has been downloaded with an unsurprising frequency given the alarm among the masses, with a rough estimate of more than 500,000 installations at the time of writing.

The app's very limited privacy policy states that it is 'adhering to social, moral, ethical values, and privacy' while providing no details of these values and referring to no framework under which they, or the element of privacy, are defined.

Given that the app seeks permission for geolocation data of the device it is being used on, and personal medical and geographical data of the user, the policy included within the app is not sufficient or clear on exactly how this data is being processed and who has access to it.

[3]https://www.aljazeera.com/news/2020/04/pakistan-intelligence-services-track-coronavirus-cases-200424073528205.html

[4]https://digitalrightsfoundation.pk/drf-condemns-yet-another-breach-of-nadra-database-and-demands-strong-data-protection-legislation/

A rapid evidence review published by the Ada Lovelace Institute in the UK sets out, amongst other measures, the proposal for the formation 'of a new Group of Advisors on Technology in Emergencies (GATE) to oversee the development and testing of any proposed digital tracing application'.[5]

We at DRF submit the same and ask that a GATE-style advisory body be created to oversee the development, rollout and implementation of fair, rights-protective technologies to combat the pandemic in Pakistan, and that clear limitations, especially in terms of time-frame, be set out and notified from the outset for every new tech measure that the Federal and provincial governments take to combat the pandemic.

As more and more of offline life has moved online, the increased activity has led to more complaints of online harassment and crimes. Despite this, there is no reference to heightened concerns regarding the 'security' of the app and the personal data being saved. In a White Paper titled 'Decentralized Privacy-Preserving Proximity Tracing' (DP-3T), experts in the field highlighted that centralised databases of patient data are at a higher risk of being attacked and leaked than decentralised ones. The White Paper makes the case for a decentralised design since it offers a more stringent security posture and a quicker response to any attempted data breach. A centralised system requires a phone to upload all of its contact information to a central database, similar to what the UK is doing currently. In contrast, decentralised systems cross-reference a device's contact information on the device itself, without uploading it to a central database; this is similar to how contact tracing has been implemented in the European Union. If intelligent decisions are not made about how this data is stored, attackers can access personal information, malicious actors can target patients, and in some cases this can lead to discriminatory practices being adopted. We have already seen this happen in Balochistan, where the medical data of COVID-19 positive patients was leaked[6] to reveal their identities, which is not only a massive privacy breach on its own but is made more complicated by the social stigma attached to corona patients.
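
To illustrate the distinction, the minimal Python sketch below shows the decentralised idea in a highly simplified form, loosely inspired by (but not identical to) the DP-3T proposal: each phone derives rotating ephemeral identifiers from a secret daily key, broadcasts only those identifiers over Bluetooth, and, once the health authority publishes the daily keys of users who test positive, checks for exposure locally without ever uploading its own contact log. The key sizes, slot count and derivation scheme here are illustrative assumptions, not the published specification.

    # Illustrative sketch of a decentralised exposure check (not the DP-3T spec).
    import hmac, hashlib, os

    def ephemeral_ids(daily_key, slots=96):
        # Derive one short, rotating identifier per 15-minute slot from the daily key.
        return [hmac.new(daily_key, f"slot-{i}".encode(), hashlib.sha256).digest()[:16]
                for i in range(slots)]

    # Each phone keeps its own secret key and a local log of identifiers it has
    # overheard via Bluetooth; neither ever leaves the device.
    my_daily_key = os.urandom(32)
    observed_ids = set()                        # would be filled by real Bluetooth scans

    # When someone tests positive, only their daily keys are published by the
    # health authority (placeholder data here).
    published_infected_keys = [os.urandom(32)]

    def exposed(observed_ids, infected_keys):
        # Re-derive the infected users' identifiers locally and look for any overlap
        # with what this phone has observed.
        return any(observed_ids.intersection(ephemeral_ids(k)) for k in infected_keys)

    print(exposed(observed_ids, published_infected_keys))  # False for this toy data

The point of such a design is that the central server only ever learns the keys of people who voluntarily report a positive test; everyone else's movements and contact logs stay on their own devices.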

The White Paper also discusses how the transmission of data works in such apps. Most COVID-19 tracking apps have a feature called a 'Radius Map' that tells the user whether their immediate surroundings have had a reported case of the novel coronavirus. It does this by using Bluetooth signals exchanged with other users of similar apps. Because of this, the specific locations of patients can be pinpointed by the average user, which the White Paper highlights as a privacy concern. Additionally, it highlights the fact that these signals can be manipulated by hackers to create false alerts of nearby COVID-19 patients, spreading panic in an already volatile situation.

More worryingly, the government app does not rely solely on Bluetooth technology but also makes use of location data, which makes it significantly more invasive. These concerns are not helped by the fact that the app does not even meet the standards set by tech giants like Apple and Google, who have collaborated to develop APIs for coronavirus app development and have released a detailed set of documentation on exposure notification, its framework and cryptography, to promote 'privacy-preserving contact tracing'.

We submit that the Government of Pakistan share detailed SOPs regarding the COVID-19 app it has launched. These should detail the privacy policy in full, addressing data retention and destruction through a clear and unambiguous sunset clause. We also maintain that the Government should share with the public exactly who has access to this database, along with strict guidelines regarding data sharing. While we appreciate that this is an unprecedented situation, the Government still must act in a manner that best protects its citizens' data and their right to privacy, a right enshrined in the Constitution of Pakistan. This, to us, includes maintaining the right to opt in to app usage for everyone, including government employees and essential and frontline workers.

The requirement of immunity certificates must also not be made a condition on which citizens’ mobility and access to benefits rests. These immunity certificates are a focus of debate at the moment with several European nations considering issuing ‘passports’ which allow the holder (a recovered COVID-19 patient) access to a social life but also to civil liberties like the freedom of association and movement. These measures have the potential for unprecedented surveillance and control over public life and cannot be made a prerequisite for exercising fundamental and inalienable constitutional rights.

While we understand the imperatives of the public health emergency, it is important that the State establish boundaries and limitations to its policy, to ensure that citizens have tangible reasons to place their trust and data with it. The current privacy policy contained within the app is inadequate to address these queries and cannot be supplemented given the absence of any data protection legislation in Pakistan. We also demand that the apps developed to aid the healthcare emergency be open source[7]. This would not only promote transparency but also give a tangible boost to the faith placed in the government's initiatives for its citizens.

The principle of proportionality is required here, in terms of the strength and effect of the measures being employed. Technology is an asset in these times; however, we demand that its growing centrality be managed in a safe, transparent and just manner.

[5]https://www.adalovelaceinstitute.org/exit-through-the-app-store-how-the-uk-government-should-use-technology-to-transition-from-the-covid-19-global-public-health-crisis/

[6]https://balochistanvoices.com/2020/03/private-data-of-coronavirus-patients-leaked-in-balochistan/

[7] Open source refers to software whose source code is readily available online and can be audited by digital security experts for security standards.

May 20, 2020 - Comments Off on Evidence of Twitter, Periscope and Zoom restrictions in Pakistan

Evidence of Twitter, Periscope and Zoom restrictions in Pakistan

Network data from the NetBlocks internet observatory confirm that Twitter, Periscope and Zoom were restricted on multiple internet providers in Pakistan on the evening of Sunday 17 May 2020, commencing at approximately 18:30 UTC and lasting over an hour. This report, produced in partnership with the Digital Rights Foundation, presents findings on these events.

It is shown that the Zoom restrictions appear technically unrelated to international issues that affected call quality earlier in the day. Further, it is shown that Twitter, Twitter's image and video servers, Twitter's streaming platform Periscope and the Zoom videoconferencing website share the same timeline of disruption, consistent with previously documented social media platform disruptions in Pakistan.

Sunday's incident matches the characteristics of previously documented restrictions applied on grounds of national security or to prevent unrest, such as Pakistan's November 2017 social media blackout.

What happened on Sunday?

Late on Sunday 17 May 2020, users across Pakistan started reporting an inability to access the Twitter social media platform and the Zoom videoconferencing service.

Users were able to regain access using VPN tools which circumvent national censorship or filtering mechanisms. During this period the #TwitterDown hashtag trended in Pakistan.

A real-time incident alert was issued by NetBlocks presenting initial findings, which are developed and examined further in the present report.

The bulk of reports from Pakistan describe a loss of access to affected services. Other reports from Pakistan describe the “throttling” or slowing of Twitter. NetBlocks data indicate that backend image and video servers were specifically unavailable during the disruption period, corroborating these reports.

How does this relate to international outages?

Zoom experienced technical issues earlier on Sunday affecting certain types of meetings on the service for a limited subset of users. The company issued an update at 15:43 UTC confirming that the problem was resolved, hours prior to the onset of social media disruptions in Pakistan.

No widespread user reports of outages are evident in other countries at the time of Pakistan’s social media blackout. NetBlocks performance metrics from around the world show that Sunday’s disruption was localized to Pakistan:

International reachability metrics show impact by country over two days, with nation-scale disruption evident solely in Pakistan during the reported period

A closer examination of the specific time interval for Sunday’s disruption in Pakistan also shows no restrictions or disruptions in effect outside of Pakistan:


Additionally, timings show that the services were disrupted in the same time window in Pakistan, and restored at the same moment:

Findings are drawn from a core sample of 300 network performance measurements observed from 30 network/location pairings across Pakistan supplemented by a wider dataset of international metrics for comparative use.

Why were Twitter, Periscope and Zoom disrupted in Pakistan?

No explanation or legal order has been presented by authorities or network operators at the time of writing.

Pakistan has previously implemented similar restrictions during mass protests and limits internet access each year during Ashura. However, no protests were held on Sunday and public gatherings are unlikely as Pakistan remains under partial lockdown in response to the COVID-19 pandemic.

Researchers note that the timing of restrictions as well as the set of platforms affected coincide with a “virtual conference” critical of Pakistani policy held via Zoom, shared on Twitter and reportedly streamed via Periscope on Sunday evening.

News reports suggest the virtual event generated controversy in Pakistan, stoking tensions between Indian and Pakistani political activists. Nevertheless, a nation-scale social media blackout in response to a virtual event would be a notable development for Pakistan.

NetBlocks encourages network operators and governments to report disruptions and their legal basis, where available, in a transparent manner in keeping with international standards.

This investigation is conducted by NetBlocks and the Digital Rights Foundation.

Methodology

Internet performance and service reachability are determined via NetBlocks web probe privacy-preserving analytics. Each measurement consists of latency round-trip time, outage type and autonomous system number, aggregated in real time to assess service availability and latency in a given country. Network providers and locations are enumerated as vantage point pairs. The root cause of a service outage may additionally be corroborated by means of traffic analysis and manual testing, as detailed in the report.
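
For readers unfamiliar with this kind of measurement, the short Python sketch below approximates the basic idea of a reachability probe: time a connection attempt to each service from a given vantage point and record either the round-trip latency or an outage. It is purely illustrative and is not NetBlocks' actual tooling; the endpoints and timeout are assumptions made for the sketch.

    # Illustrative sketch: a crude reachability and latency check per endpoint.
    import socket, time

    ENDPOINTS = [("twitter.com", 443), ("zoom.us", 443)]  # example services

    def probe(host, port, timeout=5.0):
        """Return connect latency in milliseconds, or None if unreachable."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return (time.monotonic() - start) * 1000.0
        except OSError:
            return None  # recorded as an outage for this vantage point

    for host, port in ENDPOINTS:
        latency = probe(host, port)
        status = f"{latency:.0f} ms" if latency is not None else "unreachable"
        print(f"{host}:{port} -> {status}")

A real observatory aggregates many such measurements from multiple networks and locations over time, which is what allows a localised, simultaneous disruption across several platforms to be distinguished from an isolated technical fault.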

originally published on @NETBLOCKS

May 18, 2020 - Comments Off on Digital Rights Foundation urges for accountability in Waziristan honour killings

Digital Rights Foundation urges for accountability in Waziristan honour killings

May 18, 2020

Digital Rights Foundation expresses its outrage regarding the cold-blooded murder of two teenage girls at the hands of a family member, killed in the name of misplaced and patriarchal notions of "honour". The killings were prompted by a short mobile video of the girls that surfaced on social media. The video was leaked without the girls' consent and contained private imagery.

Regrettably, killings in the name of so-called honour are not a new phenomenon in Pakistan and several parts of the world, and technology-enabled violence is emerging as a tool for shaming women and controlling their autonomy. Videos and images of women are often weaponised to blackmail, exercise control and inflict violence on women, employing technology as another tool in service of the patriarchy. In Pakistan, the digital gender divide is among the largest in the world, as women are 37 per cent less likely than men to own a mobile phone of their own. Furthermore, women's access is often surveilled and controlled by patriarchal figures in their lives. This gap is particularly stark in areas such as Waziristan, where mobile internet access has been denied due to a prolonged internet shutdown, resulting in women being deprived of access to resources and crucial information that can potentially save lives.

This is not the first time honour killings have resulted from the leaking of women's private information and images. In a society where women's consent and bodily autonomy are regularly violated and dismissed, technology often serves as a handmaiden of these patriarchal structures. Women accessing online spaces or using technology to express themselves or exercise pleasure have, heartbreakingly, been met with violence and censure. Qandeel Baloch subverted online spaces to express herself and her sexuality, only to be met with online violence and privacy violations which culminated in her murder at the hands of her brother. The 2011 Kohistan case, which saw the murder of three men and five young women due to a video in which they were dancing in a private home, took multiple investigations, intervention by the Supreme Court of Pakistan and nearly eight years to see justice.

While a First Information Report (FIR) of the incident has been registered at Razmak police station in North Waziristan, we would urge the authorities to closely monitor the investigation and prosecution of the case given the heinous nature of the crime. Honour killings should not only be condemned across the board, but the action taken by the police and courts should reflect this. Too often, societal pressure, familial collusion and uneven application of the law have marred cases in the past. Since the Criminal Law (Amendment) (Offences in the name or pretext of Honour) Act, 2016, the law is clear regarding the limited ability of the family to pardon the perpetrator in cases of honour killings and the state must ensure that section 311 of the Pakistan Penal Code is implemented in its true spirit. In addition to ensuring justice against the honour killing, an investigation should also be launched into the leaking of the private and intimate video. These videos put women’s lives at risk and contribute to a culture where women’s bodies are consumed as objects for male pleasure. Women, through exploitative imagery, are dehumanised, blackmailed and often re-traumatised.

We also urge the state to take immediate and pre-emptive measures to ensure the safety of the other two individuals in the leaked clips. Particularly the security and privacy of the young woman must be ensured and should serve as a precedent for all future investigations dealing with leaked images and videos of women.

Unfortunately, honour killings are not a relic of outdated or fringe ideas; they are grounded in current notions of viewing women as familial and societal property, bearing the impossible burden of carrying the honour of the family, community and nation. In the last month alone, six cases of honour killings were reported in Swat. Furthermore, it is important to state that digital rights such as privacy and protection from online hate speech should be universally enjoyed; however, they are particularly important for ensuring the safety of women and gender minorities in online spaces. For women and gender minorities, effective mechanisms ensuring the enforcement of these rights can be the difference between life and death.