The NHS COVID 19 Contact Tracing App Part 2: The Data Protection Impact Assessment

Yesterday the Prime Minister said England will have a “world-beating” Covid-19 contact tracing system from June. Part of this system is the introduction of the NHS contact tracing app (“the Covid App”), which is currently being trialled on the Isle of Wight.
The app was initially meant to be launched across England in mid-May. Yesterday No.10 suggested this will be done “at a later date.” Why the delay? Well, if you look at the recently published Data Protection Impact Assessment (DPIA) for the trial, it is obvious that much more work needs to be done. Here is our analysis of some of the issues raised. (If you are new to this subject, we suggest you read the first blog in our series, which discussed how such apps work and the different models which can be used.)

Background to the DPIA

The start of the Covid App project has not been auspicious; nor does it instil confidence in the people running it. How can the public, let alone privacy professionals, trust the government when it says that the app will respect their privacy?

The trial of the app started on the Isle of Wight before the Information Commissioner’s Office (ICO) had been given sight of the DPIA. Although they have now seen a copy, the ICO is yet to give a formal opinion. Should the trial have gone ahead in this situation?

As demands grew to see the DPIA, NHSX published it as a .pdf document! However, the embedded documents, including the all-important risk register, could not be accessed.
So much for transparency! A few days later the Word version of the DPIA was published, revealing all the embedded documents, but it contained typos and some names had not been redacted. More importantly, those scrutinising it raised concerns that “high risks” in the original documentation had been listed as only “medium risks” in the risk register. NHSX quickly removed the Word document and only the .pdf version is now available (here). That the trial went ahead before all of the promised, finalised and accurate documentation had been released again does not engender faith in the app’s ability to protect users’ privacy.

Oversight

An Ethics Advisory Board has been set up to oversee the Covid App project. In a letter to the Secretary of State for Health and Social Care, the Board spelt out the six principles it expected to be followed: value, impact, security and privacy, accountability, transparency and control.

Some members of the Board have since raised concerns with the press over how the Board’s advice has been received. They were also unhappy not to have seen the final DPIA before being asked to comment.

Parliament’s Joint Committee on Human Rights has also been scrutinising the Covid App. It has said that it is not reassured that the app protects privacy, and believes that the app could be unlawful if the large amount of data gathered proves ineffective. The Committee has even taken the unusual step of drafting a bill which would require all of the collected data to be deleted after the pandemic is over. (We will look at what data the NHS wants to keep for research purposes, and why, in the fourth and final blog in this series.)

These serious concerns, raised by experts and parliamentarians, will have a big impact on public uptake of the app.

Privacy by Design

In line with Article 25 of the GDPR, the app’s DPIA states that it was designed, and will continue to evolve, with the Privacy by Design principles embedded. They include:

  • collecting the minimum amount of data necessary;
  • data not leaving the device without the permission of the user;
  • users’ identities being obscured;
  • no third-party trackers;
  • proximity data being deleted from users’ phones when no longer required;
  • users being able to delete the app and its data at any time;
  • personal data not being kept for longer than is necessary in the central database;
  • data in the central database not being available to those developing the app, apart from in exceptional circumstances; and
  • provision of any data from the central database being subject to a data protection impact assessment and the establishment of a legal basis for the disclosure.
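To make commitments such as “identities obscured” and “data does not leave the device without permission” concrete, here is a minimal Python sketch of how a rotating pseudonymous identifier scheme can work. The real app’s Sonar ID derivation has not been published in full, so every function name, key size and construction below is an assumption for illustration only, not the actual NHSX design.

```python
import hashlib
import hmac
import secrets
from datetime import date

# Illustrative sketch only. The real app's "Sonar ID" scheme is not fully
# public; the key size, HMAC construction and function names below are
# assumptions used to show the general idea.

def make_installation_key() -> bytes:
    """A random pseudonymous key created at registration; no name,
    phone number or location is involved."""
    return secrets.token_bytes(32)

def daily_broadcast_id(installation_key: bytes, day: date) -> bytes:
    """Derive a day-specific Bluetooth identifier so that a passive
    observer cannot trivially link one device's broadcasts across days."""
    mac = hmac.new(installation_key, day.isoformat().encode(), hashlib.sha256)
    return mac.digest()[:16]

# Proximity events are stored device-side; nothing is transmitted yet.
contact_log = []  # entries of (broadcast_id_seen, rssi_dbm, timestamp)

def report_symptoms(user_consents: bool) -> None:
    """Data leaves the device only with the user's permission, matching
    the DPIA's stated Privacy by Design commitment."""
    if user_consents:
        print(f"Uploading {len(contact_log)} contact events to the backend")
        # (hypothetical upload step would go here)

if __name__ == "__main__":
    key = make_installation_key()
    print("Today's broadcast ID:", daily_broadcast_id(key, date.today()).hex())
```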

Interestingly, the ICO and the European Data Protection Board have both favoured the decentralised model, as it is more consistent with Article 25 and the principle of data minimisation under Article 5 of the GDPR.

Identified Risks

The key part of any DPIA is the risks identified and the mitigation that can be put in place to reduce them where possible. The documented risks in the Covid App DPIA include:

  • Transferring of data outside of the EEA
  • Misuse of information by those with access
  • Lack of adequate data processing agreements with relevant data processors
  • Lack of technical or organisational measures implemented to ensure appropriate security of the personal data
  • Personal data not being encrypted both/either in transit or at rest
  • Lack of testing which would assess and improve the effectiveness of such technical and organisational measures
  • Inadequate or misleading transparency information
  • Misuse of reference code issued by app for test requests and results management
  • Malicious access to the sonar backend by cyber-attack; extraction and re-identification of sonar backend data by combination with other data
  • Identification of an infected individual due to minimal contact – e.g. an isolated person whose carer is their only contact
  • Malicious or hypochondriac incorrect self-diagnosis on app
  • Absence of controls over access to app by children
  • Lower than expected public trust at launch
  • Uncertainty about whether users will be able to exercise subject rights requests (SRRs) in relation to data held in the sonar backend
  • Uncertainty over retention of individual data items

It is surprising that the Covid App DPIA identifies only 15 risks in such a major project involving the sharing of Special Category Data. That all those risks are assessed as low to medium also casts doubt on the robustness of the risk assessments. Recently we heard that wide-ranging security flaws have been flagged by security researchers involved in the Isle of Wight pilot.

There also seems to be a lack of clarity about the data being processed by the app.
In response to the concerns raised, NHSX itself tweeted that the Covid App “does not track location or store any personal information.”

This was quickly challenged by many in the data protection community, who disagreed with both assertions and argued that the app uses pseudonymised data and trackers.
The ICO itself states on its website:

“Recital 26 (of the GDPR) makes it clear that pseudonymised personal data remains personal data and within the scope of the GDPR”.

The DPIA itself, however, does state that pseudonymised data will be used and that such data is personal data. The mixed messages coming from NHSX will only continue to cause confusion and, once again, erode trust.
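As a minimal illustration of the Recital 26 point, the Python sketch below (all names and values invented) shows why a salted hash of an identifier is pseudonymisation rather than anonymisation: anyone holding the mapping, or the derivation secret, can re-identify the record.

```python
import hashlib

# Minimal illustration: a pseudonym derived from a real identifier can be
# reversed by anyone holding the mapping (or the derivation secret), so
# the data set was never anonymous. All values here are invented.

SECRET_SALT = b"held-by-the-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a salted hash (a pseudonym)."""
    return hashlib.sha256(SECRET_SALT + identifier.encode()).hexdigest()[:12]

# The controller retains a mapping, so re-identification is trivial for it.
phone_numbers = ["07700 900123", "07700 900456"]  # fictional numbers
mapping = {pseudonymise(n): n for n in phone_numbers}

record = {"id": pseudonymise("07700 900123"), "symptom_report": "2020-05-05"}
print(mapping[record["id"]])  # prints the phone number: still personal data
```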

Unidentified Risks

What is even more worrying is that there are some risks that have not been identified in the original DPIA:

  • There is a risk of function creep and of identification over time, as more personal data is added or different research projects add in other identifiable data sets, along with the risk of interpreting smartphone users’ movements and interactions.
  • Users’ rights for their data to be erased under Article 17 of the GDPR have been completely removed once the data is used for research purposes. We’ll explore this more in a later blog around research.
  • There is no mention of any risk associated with the Privacy and Electronic Communications (EC Directive) Regulations 2003/2426.
  • No decision has yet been made on retention periods for research. The data could be kept for too long and breach Principle 5 of the GDPR (storage limitation).
  • The collection of personal data could be unlawful as it may breach the Human Rights Act 1998. If the app does not prove effective, it is arguable that it is not necessary and proportionate for the purpose it was created. More on this in the third blog in this series.
  • It is also unclear how the NHS risk scoring algorithm works, as details have not been published. The Privacy Notice makes no mention of automated processing and is therefore not compliant with Article 13(2)(f) of the GDPR. (A purely illustrative sketch of the kind of inputs such an algorithm might combine follows this list.)
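
Since NHSX has not published the algorithm, the Python sketch below is hypothetical: the attenuation model, weights and decay constant are all invented, purely to show the kind of Bluetooth-derived inputs (signal strength, contact duration, days since symptom onset) a risk score could combine, and hence what scrutiny of the real algorithm would involve.

```python
import math

# Hypothetical only: NHSX has not published its risk-scoring algorithm.
# The attenuation model, weights and decay constant below are invented
# to show the kind of inputs such a score could combine.

def estimated_distance_m(rssi_dbm: float, tx_power_dbm: float = -59.0) -> float:
    """Crude free-space path-loss estimate of distance from Bluetooth RSSI
    (tx_power_dbm is the assumed signal strength at one metre)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / 20.0)

def contact_risk_score(rssi_dbm: float, duration_min: float,
                       days_since_onset: int) -> float:
    """Combine proximity, duration and an assumed infectiousness decay
    into one number; a notification would fire above some threshold."""
    distance = estimated_distance_m(rssi_dbm)
    proximity_weight = 1.0 / max(distance, 0.5)      # closer contact = riskier
    infectiousness = math.exp(-0.3 * max(days_since_onset, 0))
    return proximity_weight * duration_min * infectiousness

if __name__ == "__main__":
    # 15 minutes at roughly 2 metres, index case 2 days after symptom onset
    print(round(contact_risk_score(rssi_dbm=-65.0, duration_min=15,
                                   days_since_onset=2), 2))
```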

Conclusion

There are still far too many questions and unaddressed concerns relating to the Covid App to reassure the public that they can download it in good faith, knowing exactly what will happen to their data.

Feedback from the privacy community should result in a revised DPIA and further scrutiny. Only after all the questions and concerns have been addressed should the app be launched. Yesterday outsourcing firm Serco apologised after accidentally sharing the email addresses of almost 300 contact tracers. The company is training staff to trace cases of Covid-19 for the UK government!

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have 1 place left on the course starting on 11th June.

Author: actnowtraining

Act Now Training is Europe's leading provider of information governance training, serving government agencies, multinational corporations, financial institutions, and corporate law firms. Our associates have decades of information governance experience. We pride ourselves on delivering high quality training that is practical and makes the complex simple. Our extensive programme ranges from short webinars and one day workshops through to higher level practitioner certificate courses delivered online or in the classroom.

3 thoughts on “The NHS COVID 19 Contact Tracing App Part 2: The Data Protection Impact Assessment”

  1. Hi, “Users’ rights for their data to be erased under Article 17 of the GDPR have been completely removed once the data is used for research purposes.” is true, but it is broader than that. The right to erasure is removed for all data that is stored centrally by the NHS. In particular, when people uninstall the app, only data on their phone will be deleted, not data on the server – despite the voluntary installation of the app being interpreted as “consent”.

  2. Hi Erke,

    Thanks for your comment. IMO the DPIA doesn’t completely rule out the loss of Art 17 rights before anonymisation, because it states in the section on ‘Subject access and data subjects’ rights’, for both subject access requests and the right to erasure: ‘This will require users to have access to their Sonar ID. With this they may be able to make a request which will be processed via the DHSC SRR process. The technical practicality of this needs to be assessed. If users do not have access to the Sonar ID, it may be exempt under Article 11’. I guess the question is: can someone access their Sonar ID in order to identify which data in the big pot is theirs?

    Matthew Gould (NHSX Chief Executive) stated in his evidence to the JCHR that “The data can be deleted as long as it is on your own device. Once it is uploaded, it becomes enmeshed in wider data, and the technicalities of deleting it at that point become more tricky”.

    This again implies to me that NHSX thinks that, in theory, Art 17 can be exercised for data before it is anonymised, and the question in the DPIA that needs answering for me, as a risk, is whether the data CAN be erased if a request comes in and the data is identified. (Remember the objections from people who didn’t want data taken from their GPs/hospital records to the Spine for Care.Data? Those objections were not initially applied because they didn’t have the tech to do it!)

    If the legal basis is Article 6(1)(e) – public task – Article 17 can be set aside using the public health argument under Art 17(3)(c). But the DPIA doesn’t actually say this, which may have been a mistake in retrospect. My reading of it implies you can try to exercise your right if you can identify the data NHSX has. NHSX would, of course, still have to justify why the processing is necessary in the first place for Art 17(3)(c) to apply, and that takes us to blog 3!
