April 11, 2018 - Statement: DRF condemns Google’s alliance with Pentagon
Digital Rights Foundation (DRF) strongly condemns the involvement of technology giant Google in the US Department of Defense’s (DoD) Project Maven, an initiative that intends to deploy machine learning for military purposes, particularly the use of artificial intelligence to interpret video imagery, which could potentially be used to improve the targeting of drone strikes.
This recent development in the highest echelons of the technology industry is unsettling for us as a digital rights organization situated in a region that has been at the epicenter of military operations by the United States, particularly drone strikes. DRF would like to register its concern and alarm regarding the far-reaching ramifications of the project.
Here is what we know so far:
- More than 3,000 Google employees have drafted and signed a letter protesting their employer’s collaboration with the DoD on Project Maven, which aims to improve the efficacy of existing technology for interpreting video imagery and targeting drone strikes. “We believe that Google should not be involved in the business of war”, the employees’ letter stated.
- The outcry is motivated by the employees’ resistance to Google allocating resources to the DoD for military surveillance, and by the potential ethical implications of such involvement. The news, broken by Gizmodo’s article of the 3rd of March, 2018, notes that this pilot project, which had not previously been reported, was the subject of much debate after being shared on an internal mailing list.
- The letter, addressed to CEO Sundar Pichai, demands that the company extricate itself from this alliance with the Pentagon - the headquarters of the DoD - and implement a policy promising that it will not “ever build warfare technology”.
This state of affairs is alarming for a multitude of reasons. The most crucial is the trend it could set: organisations that deal in mass data collection operating and streamlining their products in collaboration with state apparatus. The prime concern here is that a behemoth such as Google is used and trusted by billions of people every single day, for business and leisure alike. Given its influence and role in daily lives all around the world, and the fact that the fate of the data we hand over to it remains unknown, serious doubts arise about how that data is used. In the aftermath of the Cambridge Analytica scandal, this is a worrisome development, especially since no official word has come from Google denouncing data leaks or reassuring users about the privacy of their data.
Secondly, it should be noted that such projects carry the potential to cause physical harm to humans and to give rise to geopolitical instability, so Google and the individuals working at the company should be extremely cautious about working with any military agency, especially given the US armed forces’ notorious history of conquest. The consequences of such projects are difficult not only to mitigate but even to predict. Nor can Google assume that the DoD has fully assessed the risks involved in the project before going ahead with it. It is important to highlight that drone strikes have in the past been inaccurate and have resulted in the loss of innocent lives, creating a sense of fear within the general population of the targeted areas. Indeed, sharpening the military’s ‘lethality’ has been stated as a goal by the US defense secretary, Jim Mattis, a worrying indicator of the mindset in place. The onus is therefore on Google as well to fully analyze the consequences: if this new technology is used by the US armed forces, Google bears ethical responsibility for the casualties.
Thirdly, since many of the details of Project Maven have not been made public, it is uncertain whether Google has asked an independently constituted ethics board to veto or raise concerns regarding any aspect of the program. Any project review process should be independent, transparent, and made public; without independent oversight, such a project runs a real risk of causing harm.
Lastly, as a country on the receiving end of drone surveillance and attacks, Pakistan has much to fear from this development. These strikes have targeted the most vulnerable areas of Pakistan, particularly the politically marginalized FATA. As per a report published by the Bureau of Investigative Journalism, a UK-based not-for-profit organization, the strikes killed between 424 and 966 civilians between 2004 and 2016. For a country not actively at war, and for citizens who never had the ability or even the chance to defend themselves before being killed by orders issued from thousands of miles away, this is a cruel mockery of the sovereignty of our borders. Google’s alliance with what is essentially a perpetration of ‘war crimes’ within the bounds of our nation runs counter to DRF’s belief in democratic participation. Drone strikes have repeatedly undermined democratic processes and denied decision-making power to Pakistani citizens. The very concept of foreign surveillance within the territory of Pakistan and its airspace is unsettling.
US government officials claim that drone strikes are accurate and rarely harm innocent lives, but the reported number of civilian deaths from these attacks suggests otherwise. It has also been reported that in areas of Pakistan where drone strikes take place, parents have taken their children out of school to protect them from possible strikes. Such are the lives of civilians in these affected areas, who cannot enjoy something as basic as walking the streets without fearing for their lives.
Despite the high number of civilian casualties and criticism that the program lacks transparency, the US government has repeatedly defended the strikes. While officials claim that drone strikes are accurate and rarely harm civilians, a strike can kill or injure anyone in the area, even when it is meant for a single targeted individual. Many victims have come forward and shared harrowing accounts of how a drone strike changed their lives. One victim reported that 11 of his family members were killed, despite having no links with the Taliban. A member of a local pro-government peace committee was also killed, along with his three sons and a nephew, when a strike wrongly targeted their house instead of where the militants resided. These are just two of the many examples of civilians killed in the name of collateral damage. Unfortunately, there is no accountability, at least in Pakistan: death tolls are never confirmed, and the strikes, successful or not, are never publicly acknowledged by the US government. The psychological impact of drone surveillance, combined with the civilian casualties during strikes, carries significant negative strategic costs that must be incorporated into the assessment of the project, not only by the US government but by all the stakeholders aiding it, including Google.
While it is commendable that Google employees are debating the project internally and voicing their dissent, there are other stakeholders involved as well: the citizens of countries on the receiving end of US surveillance and drone strikes. We strongly urge Google to reconsider its decision to collaborate with the DoD, considering the human cost, hefty ethical stakes, and safety risks involved.