
Privacy Impact Assessment in the Age of the New Normal

Jay-r Ipac
Joan Barata
May 21, 2020

One of the main reasons attributed to the relative success of most East Asian democracies’ strategies in handling the COVID-19 pandemic is the widespread adoption of government-backed COVID-19 apps. In the Philippines, several COVID-19 tracker apps have likewise surfaced. Recently, the Philippines' health department partnered with a local tech company and launched staysafe.ph as part of its efforts to contain the spread of the virus. These COVID-19 apps may have different features, technical or otherwise, but one thing common among them is that downloading the app is not legally mandatory. This is due largely to privacy concerns in these jurisdictions, which have fairly robust legal privacy frameworks. As noted in 2013 by the now-defunct Article 29 Working Party created under the former EU privacy directive, “[a]pp developers unaware of the data protection requirements may create significant risks to the private life and reputation of users of smart devices.”

This is why India’s COVID-19 tracker app, Aarogya Setu, recently made headlines when the government effectively made it mandatory to download the app; otherwise one might lose one’s job, get fined, or go to jail. Although India is a democracy, the absence of a data privacy law there made the situation more complicated.[1] India's move to make the use of the app mandatory finds a parallel in the Philippines, where a municipality recently launched its own COVID-19 tracker app. The municipal mayor announced that all the municipality’s residents, workers, and business establishments are required to download and register on the app; otherwise, residents may not go out of their homes, go to work, or resume business operations.[2] Unsurprisingly, the move did not gain much media attention, and the local government quietly began implementation.

The app’s brief “data privacy and data sharing” statement (hereinafter, the privacy policy) simply provides that the municipality warrants that (i) “the [processing] of personal data shall be made in accordance with [the Data Privacy Act and pertinent rules and issuances]”; (ii) “they shall comply with the provision of [Data Privacy Act and pertinent rules and issuances]”; (iii) “the personal data of the data subjects shall be safeguarded at all times”; [and] (iv) it “shall enforce the rights of the data subjects.” Despite its mandatory nature, the privacy policy provides that the “[r]egistrant once registering in this IT Project System agrees and gives consent to the collection and processing of personal data.” At the outset, the municipality apparently relies on “consent” in processing personal information. If consent is the basis of the processing of personal information, is there valid consent when failure to download and register on the app would mean the attachment of certain disabilities? The striking contradiction between the municipal mayor’s official pronouncement and what the privacy policy states seems ominous.

As the community quarantine is relaxed in most parts of the country, apprehensions of a “second wave” of COVID-19 mount. The municipality’s requirement to download and register on its app appears to be a response to these apprehensions. The laudable purpose, however, should not mean that privacy rights casually take a back seat. With more than two months under lockdown, it is important to ask whether a Privacy Impact Assessment (PIA) was conducted before the app was launched.

According to the National Privacy Commission (NPC), a PIA should generally be undertaken for every processing system of a controller or processor that involves personal data. The municipality, as controller, can choose to forego the conduct of a PIA only if it determines that the processing involves minimal risks to the rights and freedoms of individuals, taking into account recommendations from its data protection officer. In making this determination, the controller should consider the size and sensitivity of the personal data being processed, the duration and extent of the processing, the likely impact of the processing on the life of the data subject, and the possible harm in case of a personal data breach. This is similar to the EU’s General Data Protection Regulation, which requires a controller to carry out a data protection impact assessment “[w]here a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons…”[3]

In this case, there is no indication whether a PIA was previously conducted. Judging from the very broad statements in the privacy policy, however, the absence of a prior PIA would not be surprising. In the case of the municipality’s app, the personal information to be processed may even involve sensitive personal information, and yet the privacy policy clearly lacks the transparency required under the Data Privacy Act. Transparency means that the “data subject must be aware of the nature, purpose, and extent of the processing of his or her personal data, including the risks and safeguards involved, the identity of personal information controller, his or her rights as a data subject, and how these can be exercised.” Here, data subjects could be downloading the app and divulging personal information without even knowing what their specific rights are and what the controller’s specific obligations are in protecting that personal information under the principles laid down in the law.

How the app complies with the principle of proportionality is likewise unclear. Under this principle, personal data shall be processed only if the purpose of the processing could not reasonably be fulfilled by other means. Even granting that the processing is for a public health purpose, since the national government itself did not make it mandatory to download and register on a government-backed COVID-19 tracker app, it is doubtful that the purpose of the processing under the municipality’s app could not be fulfilled by other means.

Given the range of other possible uses, both legal (outside of the Data Privacy Act) and illegal, of the personal information that may be obtained through the app, and the lack of details on (i) the duration and extent of processing and (ii) the limitations on the use of the personal and sensitive personal information that may be collected, among others, the broadness of the privacy policy is indeed worrisome. The number of possible data subjects and the nature of the personal information that may be collected from them (health information) only magnify the potential privacy risks that a PIA is meant to identify and mitigate.

Because of its importance, the NPC has emphasized that the results of a PIA must be properly documented in a report that includes information on stakeholder involvement, proposed measures for privacy risk management, and the process through which the results of the PIA will be communicated to internal and external stakeholders. The prior conduct of a PIA would also make it easier for the controller to craft its privacy policy once it decides to proceed with the processing that is the subject of the PIA.

Traditionally, an individual can easily detect actual or potential invasions or violations of his privacy rights. It is not so in the world of the mobile internet, smartphones, and apps, where the different processing activities happen behind the screen and whose unique environment can lure individuals into a false sense of privacy and security. Those activities could be as opaque to data subjects as the mining and subsequent use of personal information by Cambridge Analytica were to the Facebook users who unwarily disclosed their personal information. As Paul Chadwick observes, privacy is most appreciated by its absence, not its presence. But in a digital environment, one gets to know the absence of privacy only when the obscurity of the processing is unmasked. And this is where the importance of a legally compliant data privacy policy and an appropriate PIA comes in.

As the NPC puts it, “[Our] war [against COVID-19] is testing our humanity and values. It should be emphasized that protecting privacy rights is tantamount to protecting lives. The Data Privacy Act of 2012 (DPA) is not a hindrance to the COVID-19 response.” This should starkly remind us to remain watchful and vigilant of our rights, most especially during this time of great challenge, when the temptation to shortcut legal processes and procedures could be stronger. As COVID-19 accelerates the adoption of technological systems and products tied in one way or another to the “new normal” spawned by the pandemic, the importance of the PIA as an instrument for assessing the potential privacy impacts of an initiative that processes personal information cannot be overemphasized, if we are to avoid the small, gradual, and incremental erosions of our privacy rights in a rapidly evolving data-driven world.

 

[1] https://www.technologyreview.com/2020/05/07/1001360/india-aarogya-setu-covid-app-mandatory/

[2] https://www.facebook.com/AttyRoyMLoyola/videos/722048171952432/UzpfSTExNDMyMDY1NTI1MzA0OTozNTE2NzYzOTk1MDA4Njgx/

[3] Article 35 par. 1, General Data Protection Regulation.

Country: Philippines
Topic, claim, or defense: Privacy or Data Protection