To Share or Not to Share; That is the Question! 


On 5th October 2021 the Data Sharing Code of Practice from the Information Commissioner’s Office came into effect for UK-based Data Controllers.

The code is not law, nor does it ‘enforce’ data sharing, but it does provide some useful steps to consider when sharing personal data, either as a one-off or as part of an ongoing arrangement. Data Protection professionals, and the staff in the organisations they serve, will still need to navigate a way through various pressures, frameworks and expectations on the sharing of personal data; case by case, framework by framework. A more detailed post on the contents of the code can be read here.

Act Now Training is pleased to announce a new full day ‘hands on’ workshop for Data Protection professionals on Data Sharing. Our expert trainer, Scott Sammons, will look at the practical steps to take, sharing frameworks and protocols, and the risks to consider. Scott will also explore how, as part of your wider IG framework, you can establish a proactive support framework; making it easier for staff to understand their data sharing obligations and expectations, and driving down the temptation to use a ‘Data Protection Duck Out’ to explain why something was shared, or not shared, inappropriately.

Delegates will also be encouraged to bring a data sharing scenario to discuss with fellow delegates and the tutor. This workshop can also be customised and delivered to your organisation at your premises or virtually. Get in touch to learn more.


GDPR News Roundup

So much has happened in the world of data protection recently. Where to start?

International Transfers

In April, the European Data Protection Board’s (EDPB) opinions (GDPR and Law Enforcement Directive (LED)) on UK adequacy were adopted. The EDPB has looked at the draft EU adequacy decisions. It acknowledged that there is alignment between the EU and UK laws but also expressed some concerns. It has nevertheless issued a non-binding opinion recommending their acceptance. If accepted, the two adequacy decisions will run for an initial period of four years. More here.

Last month saw the ICO’s annual data protection conference go online due to the pandemic. Whilst not the same as a face to face conference, it was still a good event with lots of nuggets for data protection professionals including the news that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers. Deputy Commissioner Steve Wood said: 

“I think we recognise that standard contractual clauses are one of the most heavily used transfer tools in the UK GDPR. We’ve always sought to help organisations use them effectively with our guidance. The ICO is working on bespoke UK standard clauses for international transfers, and we intend to go out for consultation on those in the summer. We’re also considering the value to the UK for us to recognise transfer tools from other countries, so standard data transfer agreements, so that would include the EU’s standard contractual clauses as well.”

Lloyd v Google 

The much-anticipated Supreme Court hearing in the case of Lloyd v Google LLC took place at the end of April. The case concerns the legality of Google’s collection and use of browser generated data from more than 4 million iPhone users during 2011-12 without their consent. Following the two-day hearing, the Supreme Court will now decide, amongst other things, whether, under the DPA 1998, damages are recoverable for ‘loss of control’ of data without needing to identify any specific financial loss, and whether a claimant can bring a representative action on behalf of a group on the basis that the group have the ‘same interest’ in the claim and are identifiable. The decision is likely to have wide ranging implications for representative actions, what damages can be awarded for, and the level of damages in data protection cases. Watch this space!

Ticketmaster Appeal

In November 2020, the ICO fined Ticketmaster £1.25m for a breach of Articles 5(1)(f) and 32 GDPR (security). Ticketmaster appealed the penalty notice on the basis that there had been no breach of the GDPR; alternatively, that it was inappropriate to impose a penalty, and that in any event the sum was excessive. The appeal has now been stayed by the First-Tier Tribunal until 28 days after the pending judgment in a damages claim brought against Ticketmaster by 795 customers: Collins & Others v Ticketmaster UK Ltd (BL-2019-LIV-000007).

Age Appropriate Design Code

This code came into force on 2 September 2020, with a 12 month transition period. The Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. It applies to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.

With less than four months to go (2 September 2021) the ICO is urging organisations and businesses to make the necessary changes to their online services and products. We are planning a webinar on the code. Get in touch if interested.

AI and Automated Decision Making

Article 22 of GDPR provides protection for individuals against purely automated decisions with a legal or significant impact. In February, the Court of Amsterdam ordered Uber, the ride-hailing app, to reinstate six drivers who it was claimed were unfairly dismissed “by algorithmic means.” The court also ordered Uber to pay compensation to the sacked drivers.

In April the EU Commission published a proposal for a harmonised framework on AI. The framework seeks to impose obligations on both providers and users of AI. Like the GDPR, the proposal includes fine levels and extra-territorial effect. (Readers may be interested in our new webinar on AI and Machine Learning.)

Publicly Available Information

Just because information is publicly available it does not provide a free pass for companies to use it without consequences. Data protection laws have to be complied with. In November 2020, the ICO ordered the credit reference agency Experian Limited to make fundamental changes to how it handles personal data within its direct marketing services. The ICO found that significant ‘invisible’ processing took place, likely affecting millions of adults in the UK. It is ‘invisible’ because the individual is not aware that the organisation is collecting and using their personal data. Experian has lodged an appeal against the Enforcement Notice.

Interestingly, the Spanish regulator recently fined another credit reference agency, Equifax, €1m for several failures under the GDPR. Individuals complained about Equifax’s use of their personal data, which was publicly available. Equifax had also failed to provide the individuals with a privacy notice.

Data Protection by Design

The Irish data protection regulator issued its largest domestic fine recently. Irish Credit Bureau (ICB) was fined €90,000 after a change to the ICB’s computer code in 2018 resulted in 15,000 accounts having incorrect details recorded about their loans before the mistake was noticed. Amongst other things, the decision found that the ICB infringed Article 25(1) of the GDPR by failing to implement appropriate technical and organisational measures designed to implement the principle of accuracy in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects (aka DP by design and by default).

Data Sharing 

The ICO’s Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. Building on the code, the ICO recently outlined its plans to update its guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.

UK GDPR Handbook

The UK GDPR Handbook is proving very popular among data protection professionals.

It sets out the full text of the UK GDPR laid out in a clear and easy to read format. It cross references the EU GDPR recitals, which also now form part of the UK GDPR, allowing for a more logical reading. The handbook uses a unique colour coding system that allows users to easily identify amendments, insertions and deletions from the EU GDPR. Relevant provisions of the amended DPA 2018 have been included where they supplement the UK GDPR. To assist users in interpreting the legislation, guidance from the Information Commissioner’s Office, Article 29 Working Party and the European Data Protection Board is also signposted. Read what others have said:

“A very useful, timely, and professional handbook. Highly recommended.”

“What I’m liking so far is that this is “just” the text (beautifully collated together and cross-referenced Articles / Recitals etc.), rather than a pundit’s interpretation of it (useful as those interpretations are on many occasions in other books).”

“Great resource, love the tabs. Logical and easy to follow.”

Order your copy here.

These and other GDPR developments will also be discussed in detail in our online GDPR update workshop next week.

Viva Las Vegas


Act Now is pleased to announce that Ibrahim Hasan has accepted an invitation to address the 21st Annual NAPCP Commercial Card and Payment Conference in Las Vegas, April 6-9 2020.


The NAPCP is a membership-based professional association committed to advancing Commercial Card and Payment professionals and industry practices globally, with timely research and resources, peer networking and events serving a community of almost 20,000 individuals worldwide. The NAPCP is a respected voice in the industry and an impartial resource for members at all experience levels in the public and private sectors.

In a session entitled “Complying with the GDPR and United States Privacy Legislation” Ibrahim will examine the impact of GDPR and the California Consumer Privacy Act (CCPA) on the Payment Card industry. He will also be presenting webinars pre and post conference on these subjects to the NAPCP community.

The NAPCP Annual Conference is the can’t-miss event for the industry, bringing together 600 professionals from around the world to share perspectives on all Commercial Card and Payment vehicles, including Purchasing Card, Travel Card, Fleet Card, Ghost Card, Declining Balance Card, ePayables and other electronic payment options. Experts and practitioners share case studies, successes and thought-provoking ideas in almost 80 breakout sessions, all with an eye for trends and innovation across sectors.

Diane McGuire, CPCP, MBA, Managing Director of the NAPCP, said:

“I am really pleased that Ibrahim has accepted our invitation to join us in Las Vegas. As legislators and governments globally are starting to wake up to the implications of the digital revolution on individuals’ rights, our conference delegates will benefit from his GDPR and privacy expertise in what is sure to be a thought-provoking session.”

This is one of a number of international projects that Act Now has worked on in recent years. In June 2018 we delivered a GDPR workshop in Dubai for Middle East businesses and their advisers. In 2015 Ibrahim went to Brunei to conduct data protection audit training for government staff.

Ibrahim Hasan said:

“I am really pleased to address the NAPCP conference in Las Vegas. Our GDPR expertise is now being recognised abroad. The United States is the latest addition to our increasing international portfolio. We hope to use the conference as a platform to showcase our expertise to US Data Controllers.”

Regular registration is now open for the event. Head over to this link to confirm registration.


Act Now’s forthcoming live and interactive CCPA webinar will cover the main obligations and rights in CCPA and practical steps to compliance. This webinar is ideal for data protection officers and advisers in UK and US businesses.

A New (GDPR) Data Sharing Code


The law on data sharing is a minefield clouded with myths and misunderstandings.
The Information Commissioner’s Office (ICO) recently launched a consultation on an updated draft code of practice on this subject. Before drafting the new code, the ICO launched a call for views in August 2018, seeking input from various organisations such as trade associations and those representing the interests of individuals. (Read a summary of the responses here). The revised code will eventually replace the version made under the Data Protection Act 1998, first published in 2011.

The new code does not impose any additional barriers to data sharing, but aims to help organisations comply with their legal obligations under the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA 2018).
Launching the consultation, which will close on 9th September 2019, the ICO said the code will:

“… address many aspects of the new legislation including transparency, lawful bases for processing, the new accountability principle and the requirement to record processing activities”.

Once finalised, the code will be a statutory code of practice under section 121 of the DPA 2018. Under section 127, the ICO must take account of it when considering whether a Data Controller has complied with its data protection obligations in relation to data sharing. The code can also be used in evidence in court proceedings and the courts must take its provisions into account wherever relevant.

Following the code, along with other ICO guidance, will help Data Controllers to manage risks; meet high standards; clarify any misconceptions about data sharing; and give confidence to share data appropriately and correctly. In addition to the statutory guidance, the code contains some optional good practice recommendations, which aim to help Data Controllers adopt an effective approach to data protection compliance.
It also covers some special cases, such as databases and lists, sharing information about children, data sharing in an emergency, and the ethics of data sharing. Reference is also made to the provisions of the Digital Economy Act 2017, which seeks to promote data sharing across the public sector.

There is also a section on sharing data for the purposes of law enforcement processing under Part 3 of the DPA 2018. This is an important area which organisations have not fully understood, as demonstrated by the recent High Court ruling that Sussex Police unlawfully shared personal data about a vulnerable teenager, putting her “at greater risk.”

Steve Wood, the Deputy Information Commissioner for Policy, said:

“Data sharing brings many benefits to organisations and individuals, but it needs to be done in compliance with data protection law.”

“Our draft data sharing code gives practical advice and guidance on how to share data safely and fairly, and we are encouraging organisations to send us their comments before we launch the final code in the Autumn.”

You can respond to the consultation via the ICO’s online survey, or email datasharingcode@ico.org.uk until Monday 9 September 2019.

More on these and other developments in our GDPR update workshop presented by Ibrahim Hasan. Looking for a GDPR qualification? Our practitioner certificate is the best option.

The Facebook Data Breach Fine Explained


 

On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal, the fine might seem small beer for an organisation estimated to be worth hundreds of billions of US dollars. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the app in conjunction with FB from November 2013 to May 2015. The app was designed to, and was able to, obtain a significant amount of personal information from any FB user who used the app, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The app was also designed to, and was able to, obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App could access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. The information, and the data derived from it, were shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).


In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make for sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply, or ensure compliance, with their own FB Platform Policy, and were not aware of this until the Guardian newspaper exposed it in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App, or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles, and they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they didn’t check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. The use of the App was also found to breach the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App required permission from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, and who to share it with.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops and the forthcoming FOI: Contracts and Commercial Confidentiality workshop which is taking place on the 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.

LGL Advert

 

Lloyd v Google: Representative action for damages fails under the DPA 1998

 

As more individuals become aware of the way in which organisations such as Facebook and Google have used their personal data unlawfully, the prospect of litigation, and class actions, seems increasingly likely. However, the recent case of Lloyd v Google [2018] EWHC 2599 (QB) demonstrates that it doesn’t necessarily follow that a clear breach of data protection legislation will result in a successful claim for damages. The case shows that even if claimants can prove that there has been a breach of data protection legislation (now the GDPR and DPA 2018), they need to identify what harm the breach has caused and how the damage has been caused by the breach. This will inevitably be a fact-specific exercise.

The background: the Safari Workaround and the DoubleClick Ad cookie

The case concerned the use, by Google, of a cookie known as the “DoubleClick Ad cookie” between 2011 and 2012. Google allegedly used the cookie to secretly track the internet activity of iPhone users in the US and the UK. Ordinarily the Safari browser (developed by Apple) had a default setting that blocked the use of third-party cookies, such as the DoubleClick Ad cookie. However, Google was able to exploit certain exceptions to this default blockage and implement the so-called “Safari Workaround”, which enabled Google to set the cookie on an iPhone when the user used the Safari browser. This gave Google access to a huge amount of browser generated personal information, including the address or URL of the website which the browser is displaying to the user. It was claimed that this information enabled Google to obtain or deduce other sensitive information about individuals, such as their interests and habits, race or ethnicity, class, political or religious views, health, age, sexuality and financial position. Google was also alleged to have aggregated this information to create lists of different types of people, such as “football lovers”, and offered these lists to subscribing advertisers.

Regulatory action was taken against Google in the USA, with Google agreeing to pay a US$22.5 million civil penalty to settle charges brought by the US Federal Trade Commission, and a further US$17 million to settle state consumer-based actions. No such regulatory action was taken by the Information Commissioner, even though the breach clearly affected UK iPhone users.

The representative claim

The action against Google was brought by Mr Lloyd, who was the only named claimant. However, he brought the action as a representative of a much larger class of people. This is a novel type of litigation that allows a representative to sue in a representative capacity on behalf of a class of people who have “the same interest” in the claim. It was not entirely clear how big the class was, but estimates ranged between 4.4 and 5.4 million people. Google, not surprisingly, was keen that permission be denied, bearing in mind it estimated its potential liability (if the case succeeded) at between £1 and £3 billion.

Mr Lloyd argued that he and each member of the group/class he represented had a right to be compensated “for the infringement of their data protection rights”. Specifically, it was alleged that Google had carried out the secret tracking and collation of personal data without the data subject’s consent or knowledge; that this was a breach of Google’s duty under s 4(4) of the DPA 1998 and that the data subjects were entitled to compensation under s 13 DPA 1998.

In other words, the fact of the contravention gave them a right to be compensated. Neither Mr Lloyd nor any member of the group alleged, or gave evidence about, any financial loss, distress or anxiety. There were no individual allegations of harm. In fact, Mr Lloyd asserted that the claim was generic and claimed an equal, standard “tariff” award for each member of the class (£750 per person). This turned out to be fatal to the claim.

Litigation against a US based company

Any litigant, or group of litigants, considering an action against Apple, Google or any other such company based outside the UK first needs the permission of the High Court in order to serve a claim on a defendant outside the jurisdiction of the domestic courts. Before the court will grant permission, the claimant must prove three things: first, that the case falls within one of the listed “jurisdictional gateways”; second, that the case has a reasonable prospect of success; and finally, that England is the appropriate place to deal with the case. The High Court had no difficulty deciding that England would be the natural jurisdiction for the case, since the claimants were all in the UK and the alleged damage had been incurred in the UK. However, the High Court Judge found that Mr Lloyd’s case failed on the remaining two issues and denied permission for the case to proceed.

The Court identified that the relevant gateway in this case was that the claimant had to prove they had a good arguable claim in tort and the damage was sustained in England & Wales.  The Judge was clear that a claim for damages under the DPA 1998 is a claim in tort. He was also satisfied that each member of the class was (for at least some of the relevant period) within the jurisdiction when they connected to the internet using the Safari browser.

However, the real and substantial issue in this case was whether the Safari Workaround had caused “damage” within the meaning of the DPA 1998. The Court engaged in a lengthy analysis of the case law on DPA damages and concluded that the claimants had not sustained damage in this case. On this basis the court decided that Mr Lloyd did not have a good arguable case or a reasonable prospect of success.

Damages under the DPA 1998

Section 13 of the DPA 1998 provided that an individual who suffers damage by reason of any contravention by a data controller of any of the requirements of the DPA 1998 is entitled to compensation from the data controller for that damage.

The High Court decided that giving the words their natural meaning, this statutory right to compensation arises where

(a) there has been a breach of the DPA; and

(b) as a result, the claimant suffers damage.

These are two separate events connected by a causal link.  In short, the breach must cause the damage. Based on this logic, it necessarily follows that some breaches will not give rise to damages.  The High Court judge suggested some examples where a data controller processes personal data in breach of the DPA, but where the breach may not warrant an award of compensation, such as:

  • Recording inaccurate data, but not using or disclosing it
  • Holding, but not disclosing, using or consulting, personal data that are irrelevant
  • Holding data for too long
  • Failing, without consequences, to take adequate security measures.

Of course, this is not to say that these types of breaches could never give rise to a successful claim for damages, as much will depend on the context and facts of the case. However, the Court did suggest that data subjects had alternative remedies such as rectification, erasure and objection.

One of the key arguments presented by Lloyd was that the claimants had incurred damage because they lost control of their data. According to the Court, there will be circumstances where the loss of control may have significantly harmful consequences, such as in Vidal-Hall (Google Inc v Vidal-Hall and others & The Information Commissioner [2015] EWCA Civ 311). The focus in that case was on the significant distress caused to the claimants by the delivery to their screens of unwanted advertising material. However, the decision was very fact specific; the type of information that had secretly been tracked and used to send targeted advertising was of a particularly private and sensitive nature, such that it would have caused harm to the claimants had anyone else seen their computer screens.

The High Court in Lloyd v Google also accepted that delivery of unwanted commercial advertising can be upsetting in other ways, for example where repeated or bulk unwanted communications:

  • are so distressing that they constitute harassment, even if the content is inherently innocuous
  • infringe a person’s right to respect for their autonomy
  • represent a material interference with their freedom of choice over how they lead their life.

However, on the facts of the case the Court concluded that the claimants had not provided any particulars of any damage suffered. Rather, they appeared to be relying on the breach alone as entitling them to compensation. The judge rejected this as a possibility.

A Court cannot award compensation just because the data protection rules have been breached. The Court also rejected the idea that the claimants should be compensated in order to “censure” the defendant’s behaviour. Finally, the Court rejected any argument that damages under the DPA should be awarded on a sort of “restitutionary” basis, that is, ‘calculated by reference to the market value of the data which has been refused’.

Representative action cases – what lessons can be learnt?

This was novel litigation: it involved one named claimant bringing an action on behalf of a large group. The action faced difficulties right from the start, not least in trying to identify the group. The Judge identified three real difficulties with this type of action:

  1. The representative (Lloyd) and the members of the class do not all have the “same interest” which is an essential requirement for any representative action. Some people may have suffered no damage and others different types of damage. Consequently, they did not all have the same interest in the action.
  2. Even if it was possible to define the class of people represented, it would be practically impossible to identify all members of the class.
  3. The court would not exercise its discretion to allow this case to go forward, particularly given the costs of the litigation, the fact that the damages payable to each individual (were the case to succeed) would be modest, and that none of the class had shown any interest in, or appeared to care about, the claim.

Anyone contemplating pursuing this type of claim in future would be well advised to carefully consider and take on board the judge’s criticisms, and seek to address them before pursuing an action.


Susan Wolf will be delivering the forthcoming GDPR workshop in Birmingham on the 19th November. Book your place now! 

 

The Subject Access Right Under GDPR


When the General Data Protection Regulation (GDPR) comes into force on 25th May 2018, it will introduce a number of new obligations on Data Controllers which will require them, amongst other things, to review their approach to personal data breaches, privacy notices and overall GDPR compliance responsibility. Some new Data Subject rights, including the right to erasure and the right to data portability, will also be introduced.

So there is a lot to learn and do within a short space of time. However, the good news is that, whilst GDPR will replace the UK’s Data Protection Act 1998 (DPA), it still includes familiar concepts such as the right of the Data Subject to request a copy of his/her data, known as a Subject Access Request (SAR) in DPA parlance.

In brief, Article 15 of GDPR gives an individual the right to obtain:

  • confirmation that their data is being processed;
  • access to their personal data; and
  • other supplementary information

The supplementary information mentioned above is the same as under section 7 of the DPA (e.g. information about the source and recipients of the data) but now also includes, amongst other things, details of international transfers, other Data Subject rights, the right to lodge a complaint with the ICO and the envisaged retention period for the data.

Fees

Under the DPA, Data Controllers can charge £10 for a SAR (£50 for a health record). GDPR allows most requests to be made free of charge. This is a significant change and will hit the budgets of those who receive voluminous or complex requests e.g. local authority social services departments.  However, a “reasonable fee” can be charged for further copies of the same information and when a request is manifestly unfounded or excessive, particularly if it is repetitive. The fee must be based on the administrative cost of providing the information.

Time Limit

The DPA allows Data Controllers 40 calendar days to respond to a SAR. Under GDPR the requested information must be provided without delay and at the latest within one month of receipt. This can be extended by a further two months where the request is complex or where there are numerous requests. If this is the case, the Data Controller must contact the Data Subject within one month of receipt of the request and explain why the extension is necessary.

All refusals must be in writing setting out the reasons and the right of the Data Subject to complain to the ICO and to seek a judicial remedy.

Format of Responses

Where the Data Subject makes a SAR by electronic means, and unless otherwise requested by the Data Subject, the information should be provided in a commonly used electronic format. Before providing the information, the Data Controller must verify the identity of the person making the request using “reasonable means”.

The GDPR (Recital 63) introduces a new best practice recommendation that, where possible, organisations should be able to provide remote access to a secure self-service system which would provide the individual with direct access to his or her information. This will not be appropriate for all organisations, but there are some sectors where this may work well e.g. local authorities may look at providing secure online access to social work records.

Article 15 makes it clear that the right to obtain a copy of information or to access personal data through a remotely accessed secure system should not adversely affect the rights and freedoms of others. Therefore, as is the case under section 7(4) of the DPA, careful thought will need to be given to whether third party personal data needs to be redacted before disclosing information.

Exemptions

Data Protection Officers will be familiar with the exemptions in the DPA, set out in Part 4 and Schedule 7, some of which allow a Data Controller to refuse a SAR. There is currently no such list in the GDPR. However Article 23 allows national governments to introduce exemptions to various provisions in GDPR, including SARs, by way of national legislation based on a list set out in that article. This contains the same categories as in the DPA e.g. national security, crime prevention, regulatory functions etc. My guess is that the UK Government will enact the same exemptions as currently exist in the DPA.

Recital 63 states the purpose of the SAR is to make Data Subjects aware of and allow them to verify the lawfulness of the processing of their personal data. This seems to suggest that requests for other purposes e.g. to assist in litigation may be rejected. Compare this to the recent case of Dawson-Damer v Taylor Wessing LLP [2017] EWCA Civ 74 in which the Court of Appeal said that there was nothing in the EU Data Protection Directive (which the DPA implements into UK law) which “limits the purpose for which a data subject may request his data, or provides data controllers with the option of not providing data based solely on the requestor’s purpose.” (More on this case here.)

The GDPR does not introduce an exemption for requests that relate to large amounts of data, but a Data Controller may be able to consider whether the request is manifestly unfounded or excessive. Recital 63 also permits asking the individual to specify the information the request relates to.

Subject Access and Data Portability

How different is the Subject Access Right to the Right to Data Portability set out in Article 20? The latter also allows for Data Subjects to receive their personal data in a structured, commonly used and machine-readable format. In addition it allows them to request it to be transmitted to another Data Controller.

Unlike the subject access right, the Data Portability right does not apply to all personal data held by the Data Controller concerning the Data Subject.  Firstly it has to be automated data. Paper files are not included. Secondly the personal data has to be knowingly and actively provided by the Data Subject. By contrast personal data that are derived or inferred from the data provided by the Data Subject, such as a user profile created by analysis of raw smart metering data or a website search history, are excluded from the scope of the right to Data Portability, since they are not provided by the Data Subject, but created by the Data Controller. Thirdly the personal data has to be processed by the Data Controller with the Data Subject’s consent or pursuant to a contract with him/her.

In contrast, the subject access right applies to all personal data about a Data Subject processed by the Data Controller, regardless of the format it is held in, the justification for processing or its origin.

It is important to note that neither right requires Data Controllers to keep personal data for longer than specified in their retention schedules or privacy policies. Nor is there a requirement to start storing data just to comply with a request if one is received.

To discuss this and other GDPR issues, come and say hello to us on stand 15 at the ICO Conference on Monday 6th March in Manchester. 

Make 2017 the year you achieve a GDPR qualification! See our full day workshops and new GDPR Practitioner Certificate.

New Data Sharing Powers in the Digital Economy Bill


Much has been written about the complexities of the current legal regime relating to public sector data sharing. Over the years this blog has covered many stops and starts by the government when attempting to make the law clearer.

The Digital Economy Bill is currently making its way through Parliament. It contains provisions, which will give public authorities (including councils) more power to share personal data with each other as well as in some cases the private sector.

The Bill has been a long time coming and is an attempt by the Government to restore some confidence in data sharing after the Care.Data fiasco. It follows a consultation which ended in April with the publication of the responses.

The Bill will give public authorities a legal power to share personal data for four purposes:

  1. To support the well-being of individuals and households. The specific objectives for which information can be disclosed under this power will be set out in Regulations (which can be added to from time to time). The objectives in draft regulations so far include identifying and supporting troubled families, identifying vulnerable people who may need help retuning their televisions after changes to broadcasting bands, and providing direct discounts on energy bills for people living in fuel poverty.
  2. For the purpose of debt collection and fraud prevention. Public authorities will be able to set up regular data sharing arrangements for public sector debt collection and fraud prevention but only after such arrangements have been through a business case and government approval process.
  3. Enabling public authorities to access civil registration data (births, deaths and marriages) (e.g. to prevent the sending of letters to people who have died).
  4. Giving the Office for National Statistics access to detailed administrative government data to improve their statistics.

The new measures are supported by statutory Codes of Practice (currently in draft) which provide detail on auditing and enforcement processes and the limitations on how data may be used, as well as best practice in handling data received or used under the provisions relating to public service delivery, civil registration, debt, fraud, sharing for research purposes and statistics. Security and transparency are key themes in all the codes. Adherence to the 7th Data Protection Principle (under Data Protection Act 1998 (DPA)) and the ICO’s Privacy Notices Code (recently revised) will be essential.

A new criminal offence for unlawful disclosure of personal data is introduced by the Bill. Those found guilty of an offence will face imprisonment for a term up to two years, a fine or both. The prison element will be welcomed by the ICO which has for a while been calling for tougher sentences for people convicted of stealing personal data under the DPA.

The Information Commissioner was consulted over the codes so (hopefully!) there should be no conflict with the ICO Data Sharing Code. The Bill is not without its critics (including Big Brother Watch), many of whom argue that it is too vague and does not properly safeguard individuals’ privacy.

It is also an oversight on the part of the drafters that it does not mention the new General Data Protection Regulation (GDPR) which will come into force on 25th May 2018. This is much more prescriptive in terms of Data Controllers’ obligations especially on transparency and privacy notices.

These and other Information Sharing developments will be examined in our data protection workshops and forthcoming webinar.

Illustration provided by the Office of the Privacy Commissioner of Canada (www.priv.gc.ca)

The revised ICO Privacy Notices Code and GDPR


Earlier this month the Information Commissioner’s Office (ICO) published its revised Privacy Notices Code of Practice.

Under the Data Protection Act 1998 (DPA), a Data Controller should issue a privacy notice to Data Subjects whenever personal data is gathered from them. This should be done at the point of collection or as soon as reasonably practicable after that. The notice should (at the very least) include:

  • The identity of the Data Controller
  • The purpose, or purposes, for which the information will be processed
  • Any further information necessary, in the specific circumstances, to enable the processing in respect of the individual to be ‘fair’ (in accordance with the 1st DP Principle).

The ICO says that organisations need to do more to explain to service users what they are doing with personal data and why. The code includes examples of compliant notices as well as suggested formats for online notices, in apps and even a sample video privacy notice.

As we know the General Data Protection Regulation (GDPR) will be in force in May 2018 (and still relevant despite the Brexit vote). The GDPR specifies further detail to be included in privacy notices. It also requires notices to be issued even where personal data is received from a third party. The code briefly explains these new requirements including a useful table. The ICO says that by following the good practice recommendations in the code, organisations will be well placed to comply with the GDPR regime. Read Scott’s blog post on the new requirements here.

This code has been issued under section 51 of the DPA. The basic legal requirement is to comply with the DPA itself. Organisations may use alternative methods to meet the DPA’s requirements, but if they do nothing then they risk breaking the law. When considering whether or not the DPA has been breached the Information Commissioner can have due regard to the code.

The code includes a helpful checklist, covering key points and tips on how to write a notice.

Privacy Notices need to be regularly reviewed and updated to reflect any changes. The ICO is considering other practical ways of supporting organisations in achieving greater transparency such as the feasibility of a privacy notice generator!

Want to know more about privacy notices under GDPR?  Attend our full day GDPR workshop

GDPR Practitioner Certificate (GDPR.Cert) – A 4 day certificated course aimed at those undertaking the role of Data Protection Officer under GDPR whether in the public or the private sector.

New Data Sharing Consultation


In February the Government launched a consultation on introducing laws to allow more citizens’ data to be used for ancillary purposes by the public sector. It says:

“Proportionate, secure and well-governed information sharing between public authorities can improve the lives of citizens. It can also support decisions on the economy which allow businesses to flourish, and improve the efficiency and effectiveness of the public sector. The government aims to do more to unlock the power of data.”

The consultation runs until 22nd April 2016. It looks at enabling information sharing between public authorities to improve the lives of citizens and support decisions on the economy and society.

The proposals fall into 3 categories:

Improving public services

  • allowing public authorities to share personal data in specific contexts to improve the welfare of a specific person (e.g. automatically providing direct discounts on energy bills of people living in fuel poverty)
  • enabling public authorities to access civil registration data (births, deaths and marriages) (e.g. to prevent the sending of letters to people who have died)

Addressing fraud and debt

  • helping citizens manage their debt more effectively and reduce the overdue debt that they owe to government (i.e. allowing sharing of information for public sector debt collection)
  • helping detect and prevent the losses government currently experiences due to fraudulent activity

Allowing use of data for research and official statistics

  • giving the Office for National Statistics access to detailed administrative government data to improve their statistics
  • using de-identified data in secure facilities to carry out research for public benefit

Cynics may say that the proposals are really about allaying public sector fears that Government initiatives such as the Troubled Families Programme, require them to share personal data which may well breach the Data Protection Act 1998 (DPA).

A new criminal offence for unlawful disclosure of personal data is proposed to be introduced. Those found guilty of an offence will face imprisonment for a term up to two years, a fine or both. Certainly the prison element will be welcomed by the Information Commissioner who has recently reiterated his call for stronger sentencing powers for people convicted of stealing personal data under the DPA.

It is proposed that the new measures will be supported by a statutory Code of Practice, which will set out if, how and when data can be disclosed under each power. Primary legislation will set out the requirement to consult the Information Commissioner, and where appropriate Ministers in the Devolved Administrations and other relevant experts, before issuing or revising these Codes. Compliance with these Codes would be a requirement for any public authority seeking to participate under the proposals; failure to abide by the Codes may result in a public authority being removed from the relevant schedule and losing the ability to disclose or receive data under the power.

The whole law on information sharing needs examining. To echo the words of the Government:

“We need to go further and update the legal regime to provide simple and flexible legal gateways to improve public sector access to information in key areas which impact the whole public sector in a systematic and consistent way so that citizens can have confidence that their data is being used for the right purposes and remains securely held.”

In 2014 the Law Commission reported on the outcome of a consultation on the law around sharing of personal information between public sector organisations. It set out its recommendations, which included a full law reform project to be carried out in order to create a principled and clear legal structure for data sharing, which will meet the needs of society. I have not come across the Government’s response to the recommendations. Maybe this latest consultation is it!

Of course any new laws will have to be consistent with the new EU General Data Protection Regulation (GDPR), expected to come into force in 2018 and, which will replace the DPA.

These and other Information Sharing developments will be examined in our forthcoming full day workshops and webinars. 

Illustration provided by the Office of the Privacy Commissioner of Canada (www.priv.gc.ca)
