Lessons from the Google GDPR Fine


On 21st January 2019, the French National Data Protection Commission (CNIL) fined Google 50 million euros for breaches of the General Data Protection Regulation (GDPR). This is the biggest financial penalty issued so far by any European regulator under the new law. But the decision goes far beyond Google or even the tech sector.

In May 2018 CNIL received complaints from two privacy groups: None Of Your Business and La Quadrature du Net. They argued, amongst other things, that Google did not have a valid legal basis to process the personal data of the users of its services, particularly for ads personalisation purposes, as it was in effect forcing users to consent.

CNIL agreed citing a “lack of transparency, inadequate information and lack of valid consent” regarding ad personalisation for users. It said users were “not sufficiently informed” about what they were agreeing to. Google made it too difficult for users to find essential information, “such as the data-processing purposes, the data storage periods or the categories of personal data used for the ads personalisation”, by splitting them across multiple documents, help pages and settings screens. That lack of clarity meant that users were effectively unable to exercise their right to opt out of data-processing for personalisation of ads.

Under GDPR (Article 4), consent must be, amongst other things, “specific” and “unambiguous”. Google’s consent failed because users were not asked specifically to opt in to ad targeting but were simply asked to agree to Google’s terms and privacy policy bundled together.

Google is appealing the decision. Meanwhile, the Swedish Data Protection Authority (Datainspektionen) has also announced an investigation into Google’s slurping of location and web histories.

This decision requires all Data Controllers to think carefully about how they go about obtaining consent for personal data processing. Articles 7 and 8 of GDPR must be considered, as well as the Article 29 Working Party guidance.

Articles 13 and 14 set out what information should be given to data subjects when processing their personal data. This is a stand-alone right, but it also helps to ensure that the processing is fair and transparent as per Article 5(1)(a). Our blog on what to include in a privacy notice (including examples) will help those revising their notices in the light of this decision.

BREXIT UPDATE: Draft regulations have been laid before Parliament to amend GDPR and the Data Protection Act 2018, which will change as a result of Brexit. If you want to know more, Ibrahim Hasan is presenting a webinar on 12th and 21st February 2019 at 10am.

Make 2019 the year you achieve a GDPR qualification. Our GDPR Practitioner Certificate courses are filling up fast.


Making GDPR British: New Regulations set out the UK’s post-Brexit DP landscape


On 19th December 2018, just when you thought that you had finally made sense of the UK’s data protection regime, the government published new regulations with the catchy title, “The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019.” There are sixty-one pages of regulations to navigate, before 29th March 2019, with only one page of explanatory notes. And you thought Theresa May had problems!

Before you start reaching for the highlighters, marker pens and sticky notes (and maybe even smelling salts) it is important to bear in mind that the primary aim of the new regulations is “to make GDPR British” (my phrase). Yes dear readers, we will soon have our own (red, white and blue) version of GDPR. All the pain and cost of Brexit will have been worth it!

To understand the new regulations, we have to go “back to basics” (not my phrase). The General Data Protection Regulation (GDPR) came into force on 25th May 2018. Despite the UK leaving the EU on 29th March (or later – you never know! – or never, in which case ignore everything and wait for more blog posts!!!!), all EU laws, including GDPR, will automatically become part of UK domestic law due to the provisions of the European Union (Withdrawal) Act 2018.

The EU version of GDPR, which the UK is bound by until exit day, contains many references to EU laws, institutions, currency and powers, amongst other things, which will cease to be relevant in the UK after Brexit. The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 amend GDPR to remove these references and replace them with British equivalents where applicable. From exit day this new amended version of GDPR will be imaginatively titled the “UK GDPR”.

The new regulations also amend the Data Protection Act 2018 (DPA 2018) which must be read alongside GDPR. (Read our summary and blog post busting some of the myths).

Chapter 3 of Part 2 of the DPA 2018 currently applies a broadly equivalent data protection regime to certain types of data processing to which the GDPR does not apply (“the applied GDPR”). For example, where personal data processing is related to immigration and to manual unstructured data held by a public authority covered by the Freedom of Information Act 2000 (FOI). The DPA 2018 applies GDPR standards to such data whilst adjusting those that would not work in the national context. Amongst other things, the new regulations merge this part into the UK GDPR.

Other provisions to note include:

  • Regulation 5 makes provision concerning interpretation in relation to processing that prior to exit day was subject to the applied GDPR.
  • Regulation 6 introduces Schedule 3, which makes consequential amendments to other legislation.
  • Regulation 8 makes amendments to the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) in light of provision made by the GDPR relating to the meaning of “consent”.

Part 3 of the DPA 2018 regulates the processing of personal data for law enforcement purposes, implementing the Law Enforcement Directive (EU) 2016/680. This part will continue to apply, even after exit day, to competent authorities, i.e. those that process personal data for the purposes of preventing, investigating, detecting or prosecuting criminal offences or safeguarding against threats to public security, e.g. the police, trading standards departments etc. Some minor amendments will be made to reflect the UK GDPR. Similarly, Part 4 of the Act (processing of personal data by the Intelligence Services) and Parts 5 and 6 (Information Commissioner Powers and Enforcement) will remain in force.

The new regulations also deal with post-Brexit international data transfers from the UK by amending the GDPR and adding additional provisions to the DPA 2018. However, for personal data to be transferred lawfully from the EU into the UK without additional safeguards being required, the UK will need to apply to the EU for adequacy status and join a list of 12 countries. These regulations attempt to make the UK version of GDPR as robust as the EU version. We will have to wait and see if the EU agrees.

The new regulations are currently in draft (you can follow their progress here). If approved they come into force on exit day, which is currently scheduled to be 29th March 2019, although it could be later. With all the uncertainties over the Brexit deal, I would not get the markers out just yet nor tear up your Act Now GDPR handbook!

STOP PRESS – The Regulations were made on 28th February 2019 and will come into force as set out in Regulation 1.

If you want to know more about the new regulations, Ibrahim Hasan is presenting a webinar soon.

Make 2019 the year you achieve a GDPR qualification. Our next few GDPR Practitioner Certificate courses are almost fully booked!


Season’s Greetings


Act Now Training would like to wish everyone season’s greetings and a happy and prosperous 2019.

The office will be shut from Friday 21st December until the 2nd January 2019.

We look forward to seeing you all in the new year!


Act Now Launches New FOI Practitioner Certificate

 


Act Now is pleased to announce the launch of its brand new FOI Practitioner Certificate.

This course is one of the first of its kind, delivered in the way that only Act Now delivers – practical, on-the-ground skills to help you fulfil your role as an FOI Officer.

This new certificate course is ideal for those wishing to acquire detailed knowledge of FOI and related information access legislation (including EIR) in a practical context. It has been designed by leading FOI experts including Ibrahim Hasan and Susan Wolf – formerly a senior lecturer on the University of Northumbria’s LLM in Information Rights Law.

The course uses the same format as our very successful GDPR Practitioner Certificate. It takes place over four days (one day per week) and involves lectures, discussion and practical drafting exercises. This format has been extremely well received by over 1000 delegates who have completed the course. Time will also be spent at the end of each day discussing what issues delegates may face when implementing/advising on the FOI topics of the day.

The four teaching days are followed by an online assessment and a practical project to be completed within 30 days.

Why is this course different?

  • An emphasis on practical application of FOI rather than rote learning
  • Lots of real life case studies and exercises
  • An emphasis on drafting Refusal Notices
  • An online Resource Lab with links, guidance and over 5 hours of videos
  • Modern assessment methods rather than a closed book exam

 Who should attend?

This course is suitable for anyone working within the public sector who needs to learn about FOI and related legislation in a practical context, as well as those with the requisite knowledge wishing to have it recognised through a formal qualification. It is most suitable for:

  • FOI Officers
  • Data Protection Officers
  • Compliance Officers
  • Auditors
  • Legal Advisers

Susan says:

“FOI and EIR are almost 14 years old. Since the Act and Regulations came into force there have been many legal developments and court decisions that have given practitioners a much greater understanding of the legal provisions and how they should be applied in practice. With this in mind, we have written this course to ensure that it equips public sector officers with all the necessary knowledge and skills they need to respond to freedom of information requests accurately and efficiently. This course, with its emphasis on the law in practice, will enable trainees to become more accomplished and confident FOI practitioners.”

Susan will share her vast experience gained through years of helping organisations comply with their information rights legislation obligations. This, together with a comprehensive set of course materials and guidance notes, will mean that delegates will not only be in a position to pass the course assessment but to learn valuable skills which they will be able to apply in their workplaces for years to come.

This new course builds on Act Now’s reputation for delivering practical training at an affordable price.

This new course widens the choice of qualifications for IG practitioners and advisers. Ibrahim Hasan (Director of Act Now Training) commented:

“We are pleased to be able to launch this new qualification. Because of its emphasis on practical skills, we are confident that it will become the qualification of choice for current and future FOI Officers and advisers.”

To learn more please visit our website.

All our courses can be delivered at your premises at a substantially reduced cost.
Contact us for more information.


The Facebook Data Breach Fine Explained


 

On 24th October 2018, the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal, the fine might seem small beer for an organisation that is estimated to be worth over 5 billion US Dollars. Without doubt, had the same facts played out after 25th May 2018, the fine would have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to, and was able to, obtain a significant amount of personal information from any FB user who used it, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The App was also designed to, and was able to, obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them in order to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).


In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles and that they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated an FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they didn’t check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. It was also found that the use of the App breached the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App required permission from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects resident outside Canada and the USA whose personal data was processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies was likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, or who to share it with.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops and the forthcoming FOI: Contracts and Commercial Confidentiality workshop which is taking place on the 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.


 


Lloyd v Google: Representative action for damages fails under the DPA 1998

 

As more individuals become aware of the way in which organisations such as Facebook and Google have used their personal data unlawfully, the prospect of litigation, and of class actions, seems increasingly likely. However, the recent case of Lloyd v Google [2018] EWHC 2599 (QB) demonstrates that it doesn’t necessarily follow that a clear breach of data protection legislation will result in a successful claim for damages. The case shows that even if claimants can prove that there has been a breach of data protection legislation (now the GDPR and DPA 2018), they need to identify what harm the breach has caused and how the damage was caused by the breach. This will inevitably be a fact-specific exercise.

The background: the Safari Workaround and the DoubleClick Ad cookie

The case concerned Google’s use of a cookie known as the “DoubleClick Ad cookie” between 2011 and 2012. Google allegedly used the cookie to secretly track the internet activity of iPhone users in the US and the UK. Ordinarily the Safari browser (developed by Apple) had a default setting that blocked the use of third-party cookies, such as the DoubleClick Ad cookie. However, Google was able to exploit certain exceptions to this default blocking and implement the so-called “Safari Workaround”, which enabled Google to set the cookie on an iPhone when the user used the Safari browser. This gave Google access to a huge amount of browser-generated personal information, including the address or URL of the website which the browser was displaying to the user. It was claimed that this information enabled Google to obtain or deduce other sensitive information about individuals, such as their interests and habits, race, ethnicity, class, political or religious views, health, age, sexuality and financial position. Google was also alleged to have aggregated this information to create lists of different types of people, such as “football lovers”, and offered these lists to subscribing advertisers.
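
The mechanics are easier to picture with a toy model. The sketch below is purely illustrative (TypeScript; the class, method and domain names are invented, and this is not Apple’s or Google’s actual code). Under those assumptions, it simulates a cookie policy that blocks third-party cookies by default but makes an exception for domains the browser treats as ones the user has interacted with – reportedly the kind of exception, triggered for example by an invisible form submission, that the Safari Workaround exploited.

```typescript
// Toy model only: an invented ToyCookieJar, not Safari's real cookie engine.
type Domain = string;

class ToyCookieJar {
  private cookies = new Map<Domain, Map<string, string>>();
  private interactedDomains = new Set<Domain>();

  // Record that the browser believes the user interacted with a domain,
  // e.g. because a form was submitted to it.
  recordInteraction(domain: Domain): void {
    this.interactedDomains.add(domain);
  }

  // First-party cookies are always allowed; third-party cookies are only
  // allowed for domains the user has (apparently) interacted with.
  setCookie(topLevelSite: Domain, cookieDomain: Domain, name: string, value: string): boolean {
    const isThirdParty = topLevelSite !== cookieDomain;
    if (isThirdParty && !this.interactedDomains.has(cookieDomain)) {
      return false; // blocked by the default third-party cookie policy
    }
    if (!this.cookies.has(cookieDomain)) {
      this.cookies.set(cookieDomain, new Map());
    }
    this.cookies.get(cookieDomain)!.set(name, value);
    return true;
  }
}

// Usage: the tracking cookie is blocked until an (invisible) interaction makes
// the browser treat the ad domain as one the user has dealt with directly.
const jar = new ToyCookieJar();
console.log(jar.setCookie("news.example", "ads.example", "id", "abc123")); // false: blocked
jar.recordInteraction("ads.example");                                      // the "workaround" step
console.log(jar.setCookie("news.example", "ads.example", "id", "abc123")); // true: cookie set
```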

Regulatory action was taken against Google in the USA, with Google agreeing to pay a US$22.5 million civil penalty to settle charges brought by the US Federal Trade Commission, and a further US$17 million to settle state consumer-based actions. No such regulatory action was taken by the Information Commissioner even though the breach clearly affected UK iPhone users.

 The representative claim

The action against Google was brought by Mr Lloyd, who was the only named claimant. However, he brought this action as a representative of a much larger class of people. This is a novel type of litigation that allows a representative to sue in a representative capacity on behalf of a class of people who have “the same interest” in the claim. It was not entirely clear how big the class was, but estimates ranged between 4.4 and 5.4 million people. Google, not surprisingly, was keen that permission be denied, bearing in mind it estimated its potential liability (if the case succeeded) at between £1 and £3 billion.

Mr Lloyd argued that he and each member of the group/class he represented had a right to be compensated “for the infringement of their data protection rights”. Specifically, it was alleged that Google had carried out the secret tracking and collation of personal data without the data subjects’ consent or knowledge; that this was a breach of Google’s duty under s 4(4) of the DPA 1998; and that the data subjects were entitled to compensation under s 13 DPA 1998.

In other words, the fact of the contravention gave them a right to be compensated. Neither Mr Lloyd nor any member of the group alleged or gave evidence about any financial loss, distress or anxiety. There were no individual allegations of harm. In fact, Mr Lloyd asserted that the claim was generic and claimed an equal, standard “tariff” award for each member of the class (the claim was for £750 per person). This turned out to be fatal to the claim.

Litigation against a US based company

Any litigant, or group of litigants, considering an action against Apple or Google or any other such company based outside the UK first needs the permission of the High Court in order to serve a claim on a defendant outside the jurisdiction of the domestic courts. Before the court will grant permission, the claimant must prove three things: first, that the case falls within one of the listed “jurisdictional gateways”; second, that the case has a reasonable prospect of success; and finally, that England is the appropriate place to deal with the case. The High Court had no difficulty deciding that England would be the natural jurisdiction for the case, since the claimants were all in the UK and the alleged damage had been incurred in the UK. However, the High Court Judge found that Mr Lloyd’s case failed on the remaining two issues and denied permission for the case to proceed.

The Court identified that the relevant gateway in this case was that the claimant had to prove they had a good arguable claim in tort and the damage was sustained in England & Wales.  The Judge was clear that a claim for damages under the DPA 1998 is a claim in tort. He was also satisfied that each member of the class was (for at least some of the relevant period) within the jurisdiction when they connected to the internet using the Safari browser.

However, the real and substantial issue in this case was whether the Safari Workaround had caused “damage” within the meaning of the DPA 1998.  The Court engaged in a lengthy analysis of the case law on DPA damages and concluded that the claimants had not sustained damages in this case. On this basis the court decided that Mr Lloyd did not have a good arguable case or a reasonable prospect of success.

Damages under the DPA 1998

Section 13 of the DPA 1998 provided that an individual who suffers damage by reason of any contravention by a data controller of any of the requirements of the DPA 1998 is entitled to compensation from the data controller for that damage.

The High Court decided that, giving the words their natural meaning, this statutory right to compensation arises where

(a) there has been a breach of the DPA; and

(b) as a result, the claimant suffers damage.

These are two separate events connected by a causal link.  In short, the breach must cause the damage. Based on this logic, it necessarily follows that some breaches will not give rise to damages.  The High Court judge suggested some examples where a data controller processes personal data in breach of the DPA, but where the breach may not warrant an award of compensation, such as:

  • Recording inaccurate data, but not using or disclosing it
  • Holding, but not disclosing, using or consulting personal data that are irrelevant
  • Holding data for too long
  • Failing, without consequences, to take adequate security measures.

Of course, this is not to say that these types of breaches could never give rise to a successful claim for damages, as much will depend on the context and facts of the case. However, the Court did suggest that data subjects had alternative remedies such as rectification, erasure and objection.

One of the key arguments presented by Lloyd was that the claimants had incurred damage because they lost control of their data. According to the Court, there will be circumstances where the loss of control may have significantly harmful consequences, such as in Vidal-Hall (Google Inc v Vidal-Hall and others & The Information Commissioner [2015] EWCA Civ 311). The focus in that case was on the significant distress caused to the claimants by the delivery to their screens of unwanted advertising material. However, that decision was very fact-specific; the type of information that had secretly been tracked and used to send targeted advertising was of a particularly private and sensitive nature, such that it would have caused harm to the claimants had anyone else seen their computer screens.

The High Court in Lloyd v Google also accepted that delivery of unwanted commercial advertising can be upsetting in other ways, for example where repeated or bulk unwanted communications:

  • Are so distressing that they constitute harassment, even if the content is inherently innocuous
  • Infringe a person’s right to respect for their autonomy
  • Represent a material interference with their freedom of choice over how they lead their life.

However, on the facts of the case the Court concluded that the claimants had not provided any particulars of any damage suffered. Rather, the claimants seemed to be relying on the contention that they were entitled to be compensated because of the breach alone. The judge rejected this as a possibility.

A Court cannot award compensation just because the data protection rules have been breached. The Court also rejected the idea that the claimants should be compensated in order to “censure” the defendant’s behaviour, as well as any argument that damages under the DPA should be awarded on a sort of “restitutionary” basis, that is ‘calculated by reference to the market value of the data which has been refused’.

Representative action cases: what lessons can be learnt?

This was novel litigation: it involved one named claimant bringing an action on behalf of a large group. The action faced difficulties right from the start, not least in trying to identify the group. The Judge identified three real difficulties with this type of action:

  1. The representative (Lloyd) and the members of the class do not all have the “same interest” which is an essential requirement for any representative action. Some people may have suffered no damage and others different types of damage. Consequently, they did not all have the same interest in the action.
  2. Even if it was possible to define the class of people represented, it would be practically impossible to identify all members of the class.
  3. The court would not exercise its discretion to allow the case to go forward, particularly given the costs of the litigation, the fact that the damages payable to each individual (were the case to succeed) would be modest, and the fact that none of the class had shown any interest in, or appeared to care about, the claim.

Anyone contemplating pursuing this type of claim in future would be well advised to consider carefully the judge’s criticisms and seek to address them before pursuing an action.


Susan Wolf will be delivering the forthcoming GDPR workshop in Birmingham on the 19th November. Book your place now! 

 


Act Now launches GDPR Policy Pack


The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.

Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.

Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.

We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:

  • User guide
  • Policies
    • Data Protection Policy
    • Special Category Data Processing (DPA 2018)
    • CCTV
    • Information Security
  • Procedures
    • Data breach reporting
    • Data Protection Impact Assessment template
    • Data Subject rights request templates
  • Privacy Notices
    • Business clients and contacts
    • Customers
    • Employees and volunteers
    • Public authority services users
    • Website users
    • Members
  • Records and Tracking logs
    • Information Asset Register
    • Record of Processing Activity (Article 30)
    • Record of Special Category Data processing
    • Data Subject Rights request tracker
    • Information security incident log
    • Personal data breach log
    • Data protection advice log

The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file numbering and naming make locating each document very easy.

Click here to read sample documents.

The policy pack gives a useful starting point for organisations of all sizes both in the public and private sector. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit our website to find out more.

Act Now provides a full GDPR Course programme including one day workshops, e learning, healthchecks and our GDPR Practitioner Certificate. 
