Act Now Launches Online FOI Practitioner Certificate


Act Now is pleased to announce the launch of its new online FOI Practitioner Certificate.

This new course has been designed to mirror our classroom-based course, which was running successfully throughout the country before the Coronavirus lockdown.
Delegates will benefit from the same fantastic features in a live online learning environment.

Class sizes are 50% smaller to ensure that delegates receive all the attention and support they need to get the best out of the course. They will also have plenty of opportunities to ask questions, test their skills and engage with FOI practitioners from the comfort of their home office.

Each of the four training days is split into three online sessions. Using our online training platform, delegates will be able to see and hear the course tutors as well as the slides, exercises and case studies. We have also built in one-to-one tutor time at the end of each day to provide individual support.

A very comprehensive set of materials, including all legislation, will be posted to delegates in advance of the online sessions. In addition they will have access to our online Resource Lab, which now includes updated videos on key aspects of the syllabus.

This new course builds upon our wealth of experience of designing and delivering online training as well as the delegate feedback from our online GDPR Practitioner Certificate.

Susan Wolf, who designed this new course, says:

“This is a very exciting opportunity. Despite the current difficult times and uncertainties, this online course gives FOI practitioners access to high-quality training that is cost-effective and safe.”

The first course starts on 20th August with a special introductory price of £1,995 plus VAT. Places are filling up so book early to avoid disappointment. We can also deliver this course on an in-house basis customised to the needs of your staff. Please get in touch for a quote.

 


Customer Contact Details for Track and Trace: GDPR Considerations


“A pint of beer and a packet of crisps, Sir? That’ll be £3.90 and your personal data please.”

For some businesses, such as restaurants and pubs, the government intends to impose an additional obligation. The guidance document states:

“The opening up of the economy following the COVID-19 outbreak is being supported by NHS Test and Trace. You should assist this service by keeping a temporary record of your customers and visitors for 21 days, in a way that is manageable for your business, and assist NHS Test and Trace with requests for that data if needed. This could help contain clusters or outbreaks.”

This new requirement to collect and store personal data, and to encourage or compel customers to hand it over, clearly raises data protection and privacy issues.
In a statement to the House of Commons on 23rd June 2020, Boris Johnson said, “We will work with the sector to make this manageable.” Speaking to the Guardian newspaper the next day, the Information Commissioner’s Office (ICO) said it was “assessing the potential data protection implications of this proposed scheme and is monitoring developments”. With a week to go before the new rules come into force, both need to get a wriggle on!

Reaction to the Prime Minister’s statement on social media was entirely predictable. People immediately started discussing which fake name they would use rather than hand over their personal data. Dominic Cummings and Matthew Hancock seem to be popular choices.

Lawful Basis

As we publish this blog, there have been no changes in legislation and no further emergency COVID-19 regulations. Nor have any changes to licensing laws been proposed in order to enforce the collection of this data.

So how can a restaurant manager or pub landlord justify collecting personal data in these circumstances? Let’s consider the lawfulness conditions under Article 6 of GDPR for processing data. If a business will not allow someone to dine or drink in its premises unless a name and address are recorded, it cannot use consent as its condition for processing. The customer is not freely giving their data, as they have no real choice if they want to use the premises. There is no contract between the parties at the stage of entering the premises. There is no statutory requirement in law to demand the data, nor any official authority for businesses to require it. And no one is going to die immediately if the data is not handed over, so vital interests cannot be used.

Unless emergency legislation is passed in the next week it appears businesses will have to rely on the “legitimate interests” condition under Article 6 to collect and process the personal data of customers.

Privacy Notices

If businesses decide it is in their legitimate interests to collect customer contact data, they also need to demonstrate fairness and transparency to meet the requirements of the first data protection principle. This brings us to Privacy Notices. A quick sampling of my local pubs showed that only 3 out of 10 currently have Article 13 compliant Privacy Notices on their websites. All three were part of national chains. The more local independent pubs do not appear to have a Privacy Notice on their websites. How will these pubs explain to customers why they want their data and what they are going to do with it? Perhaps there will be signs to be read upon entering.

Security of the Data

One of the biggest risks for businesses is failing to keep this newly collected personal data secure, which could result in a personal data breach under GDPR. Under Article 32 the business needs to take appropriate organisational and technical measures to keep the data secure. Devices will need to be password-protected, if not encrypted. Access will have to be controlled. New security policies and procedures will need to be put in place by next week.
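
To make Article 32 a little more concrete, here is a minimal illustrative sketch (not a recommendation of any particular product) of one such technical measure: encrypting each contact-log entry at rest, so that a lost or stolen device does not expose the raw data. It uses the widely used Python cryptography library; the file name and record format are invented for the example.

```python
from cryptography.fernet import Fernet

# Generate the key once and keep it separate from the data
# (e.g. in a key vault), never on the same device as the log.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a single contact-log entry before writing it to disk.
record = "2020-07-04 13:05 | J. Smith | 07700 900123 | table 4"
token = cipher.encrypt(record.encode("utf-8"))

with open("contact_log.enc", "ab") as f:
    f.write(token + b"\n")

# Only staff with access to the key can read entries back,
# e.g. to answer a request from NHS Test and Trace.
print(cipher.decrypt(token).decode("utf-8"))
```

Encryption alone is not a complete answer, of course; it addresses confidentiality, but the access controls, policies and training described here are still needed.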

In addition, all staff will need to be trained, quickly, in handling this newly collected data. Stories have already surfaced in New Zealand, after this system was introduced there, of female customers being harassed by staff who had taken their details from the contact list.

Retention

The government has said that businesses need to keep customer contact data for 21 days. This raises more questions for businesses to consider. How will this be implemented?
Do systems allow this retention period? How will paper records be disposed of securely? There’ll be a run on shredders soon!
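
For businesses keeping the log electronically, a 21-day purge is simple to automate. Here is a minimal sketch under assumed conventions (records stored as timestamped tuples; the storage format is invented for the example), which could be run daily:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=21)  # the period stated in the guidance

def purge_expired(records, now=None):
    """Keep only contact-log entries collected within the last 21 days."""
    now = now or datetime.now()
    return [(ts, details) for ts, details in records if now - ts <= RETENTION]

log = [
    (datetime(2020, 6, 1, 19, 30), "A. Patron, 07700 900123"),
    (datetime(2020, 6, 28, 12, 15), "B. Diner, 07700 900456"),
]
log = purge_expired(log, now=datetime(2020, 7, 4))
print(log)  # only the 28 June entry survives
```

Paper records have no equivalent of a scheduled job, hence the shredders.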

Social Exclusion

The government is also asking pubs and restaurants to use apps to enable customers to order at their tables thus limiting contact with others. The Wetherspoons chain has had such an app for ‘table service’ for some time. We know the government likes apps but they too need to be GDPR-compliant.

Furthermore, customers who are unwilling or unable to comply with the new requirements (whether because they object to the collection of data, do not have ID documents, or are economically excluded because they do not have smartphones and/or bank accounts) face discrimination, as they will be unable to access the social spaces that pubs and restaurants provide. There could be challenges against such measures on this basis.

Trust and Burden

Ultimately it will be down to individuals as to whether they care about their data enough or would prefer a pint or a pie after 3 long months. It may be that they trust their local restaurant and landlord with their data. Some individuals will decide it’s just not worth the hassle and risk for the sake of a socially distanced Sunday lunch.

Some small businesses may decide that the requirement to process customers’ personal data in a GDPR-compliant way is too much of a burden, considering they have 8 days to prepare on top of re-opening, getting staff back and trained, and making their premises COVID-secure.

Our GDPR Essentials e-learning course is designed to teach frontline staff essential GDPR knowledge in an engaging, fun and interactive way. In just over 30 minutes staff will learn about the key provisions of GDPR and how to keep personal data safe.



The CCPA Becomes Enforceable on 1st July 2020 (and there is more to come!)


The California Consumer Privacy Act (CCPA) becomes fully enforceable on 1st July 2020. The Act regulates the processing of California consumers’ personal data, regardless of where a company is located. CCPA provides broader rights to consumers and stricter compliance requirements for businesses than any other state or federal privacy law.

Like the EU General Data Protection Regulation (GDPR), CCPA is about giving individuals control over how their personal data is used by organisations. It requires transparency about how such data is collected, used and shared. It gives Californian consumers various rights including the right to:

  • Know and access the personal data being collected about them
  • Know whether their personal data is being sold, and to whom
  • Opt out of having their personal data sold
  • Have their personal data deleted upon request
  • Avoid discrimination for exercising their rights

CCPA also requires that a security breach involving personal data must be notified to each individual it affects. It does not matter whether the data is maintained in or outside of California.

CCPA is often called the US equivalent of the EU General Data Protection Regulation (GDPR). Both laws give individuals rights to access and delete their personal information. They require organisations to be transparent about information use and necessitate contracts between businesses and their service providers. In some respects, however, the CCPA does not go as far as GDPR. For example, it only applies to for-profit entities, it does not require a legal basis for processing personal data (like Article 6 of GDPR), there are no restrictions on international transfers and there is no requirement to appoint a data protection officer.

Enforcement

Unlike GDPR, CCPA does not have a regulator like the Information Commissioner in the UK. It is primarily enforced by the California Attorney General (AG) through the courts, although there is a private right of action for a security breach. The courts can impose fines for breaches of CCPA depending on the nature of the breach:

  • $2,500 for an unintentional and $7,500 for an intentional breach
  • $100-$750 per incident per consumer, or actual damages, if higher – for damage caused by a security breach

A business shall only be in breach of the CCPA if it fails to cure any alleged breach within 30 days after being notified of the same.

The AG has now published the final proposed CCPA Regulations. These have to be read alongside the Act. The accompanying Final Statement of Reasons provides some interesting insights into the AG’s views and potential positions on certain issues.

While the CCPA fines and damages may appear relatively low, it is important to note that they are per breach. A privacy incident can affect thousands or tens of thousands of consumers, in which case it could cost a company hundreds of thousands or even millions of dollars.
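
The arithmetic behind that warning is straightforward. A quick sketch, using a hypothetical incident affecting 10,000 consumers (the per-consumer figures are the statutory ranges quoted above):

```python
# CCPA statutory damages for a security breach: $100-$750 per consumer
# per incident (or actual damages, if higher).
consumers = 10_000  # hypothetical incident size

low, high = 100 * consumers, 750 * consumers
print(f"${low:,} to ${high:,}")  # $1,000,000 to $7,500,000
```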

Two big US companies, Hanna Andersson and Salesforce, are already facing a class action lawsuit alleging CCPA violations. Both suffered a data breach that compromised the names, addresses, and credit card information of over 10,000 California residents, which were then sold on the dark web. The lawsuit claims the companies failed to protect consumer data, provide adequate security measures, safeguard their systems from attackers, and delayed notification of the breach.

During the coronavirus pandemic there has been increased use of video chat and conferencing apps to stay connected. Both Zoom and Houseparty face class actions claiming that they failed to obtain consent from customers for the disclosure of their personal information to third parties such as Facebook.

CCPA 2.0

There is more to come! The California State Assembly held a hearing on 12th June 2020 on the California Privacy Rights Act (CPRA) ballot initiative. Californians for Consumer Privacy, an advocacy group and the proponent of the 2018 ballot initiative that led to the enactment of the CCPA, has gathered more than 900,000 signatures to place the CPRA on the ballot in November 2020. This now looks very likely after Friday’s California Superior Court ruling, although a deal could be struck to amend the CCPA in exchange for withdrawing the ballot initiative.

The CPRA (or an amendment to the CCPA) will further expand privacy rights of California consumers as well as compliance obligations of businesses, their service providers and contractors. It will, among other things, permit consumers to (1) prevent businesses from sharing (in addition to selling) their personal data; (2) correct inaccurate personal data about them; and (3) limit businesses’ use of “sensitive personal information,” known as Special Category Data under GDPR. This includes information about their race, ethnicity, religion, union membership and biometric data. The proposed law will prohibit businesses from collecting and using personal information for purposes incompatible with the disclosed purposes, and from retaining personal information longer than reasonably necessary. Readers with knowledge of GDPR will agree that this new law is even more like GDPR than the CCPA.

The CPRA will also establish a new California Privacy Protection Agency which will be tasked with enforcing and implementing consumer privacy laws and imposing administrative fines. If enacted, the CPRA will become operative on 1st January 2023, although its obligations would only apply to personal data collected after 1st January 2022.

A Federal Privacy Law?

CCPA represents the first real, comprehensive privacy legislation in the U.S. It will, no doubt, form the foundation for other state privacy regulations in the future, and quite possibly a U.S. federal privacy regulation. Nevada residents also now have more control over how their personal information is used. Senate Bill 220 recently became law, giving consumers more ability to keep websites from selling their information to third-party firms. Proactive businesses are already treating CCPA as a de facto US privacy law. Recently Microsoft announced that it will apply the main CCPA rights to all its customers in the U.S.

CCPA’s impact will not just be felt by California-based businesses. Any business which processes personal data about Californian consumers needs to re-evaluate its privacy practices. With 40 million Californian residents making up 12 percent of the US population, it is likely that most big businesses, wherever they are based, will have to comply with the CCPA. With substantial fines and penalties for breaches and a 6 month ‘look back’ period, now is the time to implement CCPA compliance measures.

Act Now has launched a US privacy programme covering everything US and international businesses need to know about CCPA and GDPR.

 




Disclosure of Staff Names in FOI Requests


One of the most popular search terms on our blog is “disclosure of names under FOI.”
A further question that we were recently asked on a course is whether FOI practitioners should provide their names when they respond to requests.

There have been some important developments since 2013 and our last two blogs on this topic. The provisions of S.40(2) of the Freedom of Information Act 2000 (FOI) have been amended to take into account the provisions of the General Data Protection Regulation 2016 (GDPR) and the Data Protection Act 2018 (which replaced the Data Protection Act 1998).
In addition, we now have the benefit of two rulings from the Upper Tribunal, namely Information Commissioner v Halpin (GIA) [2019] UKUT 29 (AAC) and Cox v Information Commissioner and Home Office [2018] UKUT 119 (AAC). The Information Commissioner’s Office has also issued revised guidance on requests for personal data about public authority employees, which takes these developments into account.

The FOI Section 40(2) Exemption

The names of staff working in public authorities are personal data as defined by Article 4(1) of GDPR and S.3 Data Protection Act (DPA) 2018. In addition, organisational charts and internal directories that contain staff names are also personal data if they identify individual members of staff. FOI requests may not necessarily be couched as requests for staff names. For instance, a requestor may wish to see “all communications” about a certain subject, but these communications may include the names of those sending and receiving emails. Or they may wish to find out the names of staff present in specific meetings (which is what happened in the Cox case). The Cox case was the first occasion on which the Upper Tribunal was tasked with considering the principles governing the disclosure of the names of civil servants, but the ruling clearly has wider application to all other public authorities.

When a public authority receives an information request that includes a request for the names of staff, it needs to consider the third-party personal data exemption in S. 40(2) FOI. This is an absolute exemption if:

  • Disclosure of the third-party personal data (in this case staff) would contravene any of the data protection principles; or

  • Disclosure would contravene an objection to processing under GDPR Article 21; or

  • The personal data would be exempt if the data subject (member of staff concerned) had made a subject access request.

An almost identical exception operates in EIR regulation 13.

The data protection principles are listed in GDPR Article 5. The first principle is the most relevant in this context. This requires that the processing of personal data must be lawful, fair, and transparent. Disclosing under the FOI constitutes processing.

Before disclosing any staff names, the first question is whether the disclosure is lawful. There are six lawful bases for processing in GDPR Article 6, but only consent or legitimate interests are relevant to disclosure under FOI or the EIR. It may be possible to ask staff for their consent to disclose their names. However, given the particularly high threshold for consent to be valid (see GDPR Article 7) and the imbalance of power in an employer/employee relationship, any consent given will not necessarily be valid.

Legitimate Interests

The alternative lawful basis is that disclosure is “necessary for the purposes of the legitimate interests pursued by the controller or by a third party except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data…” (GDPR Article 6(1)(f)). Some readers may be concerned because the GDPR specifically states that public authorities cannot rely on the legitimate interests lawful basis when processing in the performance of their tasks. However, this restriction is lifted in relation to disclosure under FOI or the EIR by S.40(8) of FOI and Reg. 13(6) of EIR respectively.

The ICO guidance suggests that public authorities answer three key questions when considering this issue, namely:

Question 1: What is the legitimate interest in disclosure (or what is the purpose)?

This includes the legitimate interest of the public authority or a third party, which is likely to be the requestor. A wide range of interests may be legitimate interests. The requestor may have a personal and private reason for wanting to know staff names, but this makes it no less relevant. In the Halpin case, the Upper Tribunal confirmed that a purely private interest was capable of amounting to a legitimate interest. In this case Mr Halpin wanted details of the training undertaken by two social workers because their capacity and skills were relevant in any appeal against a Care Act assessment.

Question 2: Is it necessary to disclose staff names for that purpose?

This requires a public authority to ask whether it is “necessary” to disclose staff names in order to serve the legitimate interests of the requestor. It may be possible to provide the applicant with alternative information, such as the number of staff involved in a meeting and information about their roles and levels of seniority, without providing names. For example, in McFerran v Information Commissioner EA/2012/0030, the requestor wanted to know the names of the council staff who were present during a police search of a council property. The Tribunal acknowledged that there was a legitimate interest in knowing that the search had been conducted properly, but it was not necessary for the requestor to know the names of the council staff involved.

Question 3: Does the legitimate interest outweigh the interests and rights of the staff concerned?

This involves a balancing exercise. Public authorities need to consider the likely impacts or consequences that disclosure of staff names will have on the staff themselves.
Names should not be disclosed if disclosure will cause unjustified adverse effects on the staff concerned. It is important to remember when making this assessment that disclosure of names under the FOI is to “the world at large”. Again, the Upper Tribunal in Halpin was at pains to emphasise that even if the requestor indicates they have no intention of publicising the information, the public authority loses control of the information once it is disclosed. Disclosure under the FOI is not subject to any duty of confidence. This becomes a relevant factor in deciding whether the disclosure will cause unwarranted harm to the named individuals.

The key question when it comes to disclosing names is: what is the harm that will arise from disclosure? There must be a connection between the disclosure and the harm.
Even if disclosure may cause distress to a member of staff, this doesn’t automatically trump the legitimate interests of the requestor; the public authority must undertake a balancing exercise. When a public authority carries out this balancing exercise it should take the reasonable expectations of the staff concerned into account. For example, asking whether the member of staff concerned would have a reasonable expectation that their name would be disclosed to the world at large provides a useful starting point.
This also enables the public authority to address the question of fairness.
In deciding whether to disclose staff names it is important to think about the public-facing nature of the role filled by the individual member of staff; their seniority in the organisation; and whether the public authority has a policy on the disclosure of staff names that informs their expectations. The staff privacy notice should also give staff some understanding of when their names may be disclosed in response to an FOI request.

Clearly a Chief Executive of an organisation should expect their name to be released into the public domain. As the ICO guidance advises:

The more senior an employee is and the more responsibility they have for decision making and expenditure of public money, the greater their expectation should be that you disclose their names.

On the other hand, somebody with responsibility for cleaning offices will have a real expectation that their name remains confidential. FOI practitioners are familiar with this assessment, which is based on ICO guidance and an earlier case, Home Office v Information Commissioner EA/2011/0203. This said that the names of junior civil servants are generally protected from disclosure unless they occupy a public-facing role. However, the decision in the Cox case makes it clear that each case will depend on its own facts and context. There is no blanket presumption in favour of disclosure of the names of senior officials; each case must be considered carefully and with regard to the legitimate interests of the requestor.

Disclosing the Names of FOI Practitioners

The question of whether a public authority should disclose the name of the person handling an FOI request raises all of the above considerations. First, what is the legitimate interest in an FOI requestor knowing the name of the person who handled their request?
Second, is it necessary to know that person’s name to serve that legitimate interest? Finally, does the legitimate interest of the requestor outweigh any harm that may be caused to the member of staff handling the request? There is no legal obligation to disclose staff names, and a public authority could refuse under S.40(2) FOI if the conditions discussed above are satisfied.

In the interests of transparency, many public authorities disclose the name of the person who has handled a request. Given the public-facing role and the work that FOI practitioners do, it is arguable that their expectation is that their names may be disclosed. However, in some organisations FOI requests are dealt with by many different staff at various levels rather than via a single FOI point of contact. In these circumstances more junior staff who have handled requests may have a greater expectation of privacy.

This and other developments will be discussed in our FOI and EIR workshops which are now available as an online option. If you are looking for a qualification in freedom of information, our FOI Practitioner Certificate is ideal. It will soon be available as an online option. Please get in touch to learn more.


 


The EasyJet Data Breach: GDPR Fine Arriving?


On 19th May 2020 it was reported that in January 2020 EasyJet was subject to what it described as a “highly sophisticated” cyber-attack, resulting in the personal data of over 9 million customers being “hacked”. Detailed information about the attack is sparse, with most media sources repeating the same bare facts. Some of the information below is based on the media reports and emails sent to EasyJet customers. At the time of writing there was no information about this on the Information Commissioner’s Office web site.
What little information is available points to a number of breaches of the General Data Protection Regulation (GDPR) which could result in the Information Commissioner’s Office (ICO) imposing a monetary penalty.

However, in view of the ICO’s reassessment of its regulatory approach during the current Coronavirus pandemic, and reports that it has further delayed the imposition of its £183 million fine against British Airways, readers may be forgiven for thinking that EasyJet will not be on the receiving end of a fine any time soon. In any event, it seems likely that the ICO will be forced to consider the fact that EasyJet, along with the whole airline industry, has been very severely affected by the Coronavirus and faces huge financial pressures.
The consequences for EasyJet in respect of this breach will remain unclear for many months, which may disappoint customers whose personal information has been stolen.

Breach of Security

All Data Controllers must comply with the data protection principles set out in Article 5 of GDPR. In particular, Article 5(1)(f) (the security principle) requires Data Controllers to process personal data in a manner that “ensures appropriate security” of the personal data that they process. That includes protecting against “unauthorised or unlawful processing and against accidental loss, destruction or damage.” This obligation to process personal data securely is further developed in GDPR Article 32, which requires Data Controllers to implement “appropriate technical and organisational measures to ensure a level of security appropriate to the risk”. The steps that a Data Controller has to take will vary, based upon “the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons”. In other words, Data Controllers must implement security measures that are “appropriate to the risks” presented by their processing, which reflects the GDPR’s risk-based approach. So, for example, a village hairdresser will not be expected to take the same security precautions as an international airline handling personal data (and often Special Category Data) about millions of people.

We do not know what cyber-security precautions EasyJet had in place to prevent this attack. However, it is arguable that it should have reviewed its security arrangements (which it may well have done) in the wake of the British Airways attack that was widely reported in September 2018.

There is no doubt that the incident amounts to a “personal data breach” under GDPR Article 4 (12) since it involves a breach of security leading to the unauthorised access of the personal data of about 9 million people. Of the 9 million people affected, 2,208 had their credit card details stolen.

Breach Notification

When a Data Controller becomes aware of a “personal data breach” it must notify the ICO “without undue delay, and where feasible not later than 72 hours after becoming aware of it” (GDPR Article 33). The controller is relieved from this duty where the breach is “unlikely to result in a risk to the rights and freedoms of natural persons”. That does not appear to be the case here given both the scale of the attack and the fact that the hackers gained access to customers’ credit card details and travel plans. The media reports indicate that the ICO was informed about the attacks that took place in January 2020, but there is no indication exactly when it was informed. If EasyJet did not notify the ICO within the time frames of Article 33, then this constitutes a further breach of the GDPR.
Phased notification is allowed, though, where a Data Controller does not have the full details of the data breach within the 72 hours. This is likely to apply here: EasyJet instructed an immediate forensic investigation to establish the nature and extent of the breach. But the initial notification should still have been made within the 72 hour period as per Article 33.

Notifying EasyJet Customers

GDPR Article 34 requires a Data Controller to notify any Data Subjects when the personal data breach is “likely to result in a high risk to the[ir] rights and freedoms”. The threshold for communicating a data breach to Data Subjects is higher than for notifying the ICO, and therefore it will not always be necessary to communicate with affected Data Subjects.
Data Controllers must assess the risk on a case by case basis. However, the Article 29 Working Party Guidelines on Breach Notification suggest that a high risk exists when the breach may lead to identity theft, fraud or financial loss. This would appear to be the case in the EasyJet breach. The GDPR does not state any specific deadline for notification but it does say that it should be “without undue delay”.

Media reports suggest that EasyJet customers were notified in two separate tranches.
The first notification, to customers whose credit card details were stolen, was sent by email in early April. The second tranche, to all other customers, was sent by 26th May.
Customers who received emails at the end of May were advised that their name, email address and travel details were accessed (but not their credit card or passport details).
The purpose of notifying customers is to enable them to take steps to protect themselves against any negative consequences of the breach. The email suggested that customers take extra care to avoid falling victim to phishing attacks.

It remains to be seen whether EasyJet customers were notified “without undue delay”, given that the airline became aware of the breach in January but the first notification, to customers whose credit card details were stolen, was not sent until April. It is plausible that this may have been too late for some customers. If this is the case then not only would this be a further breach of the GDPR, but it could expose EasyJet to claims for compensation under GDPR Article 82. Indeed, according to SC Magazine, a law firm has already issued a class action claim in the High Court. Note that according to Lloyd v Google (and now under GDPR) claimants do not have to show direct material damage to claim compensation.

Will EasyJet Be Fined?

The details available to date certainly suggest a breach of Article 5 (1) (f) and possibly Article 32. In addition, it may be the case that EasyJet failed to notify their customers without undue delay and have breached Article 34. Breaches of these provisions could theoretically result in the ICO imposing a monetary penalty of up to 4% of EasyJet’s total worldwide annual turnover in respect of a breach of Article 5 and up to 2% of its total worldwide annual turnover for breaches of Articles 32 and 34.

It is too early to compare the circumstances of the EasyJet breach with the British Airways breach. The number of Data Subjects involved in the BA attack was reported to be half a million (compared to 9 million in the EasyJet attack). However, the number of people whose credit card details were stolen in the BA attack was much greater (about 380,000 booking transactions, against 2,208 in the EasyJet attack), and British Airways notified its customers immediately. Therefore the scale and gravity of the two breaches are not identical. The ICO will need to take these factors into account in deciding on the level of any fine. The maximum that the Commissioner could fine is (as stated above) up to 4% of EasyJet’s annual turnover. It is not clear what this figure is, but the EasyJet Annual Report for 2019 states that the company’s total revenue in 2019 was £6,385 million. In contrast BA’s total revenue was £12.2 billion. The fine will almost certainly be smaller than that imposed on British Airways, but it remains to be seen how the ICO will react to the financial pressure that EasyJet is clearly under as a result of the Coronavirus pandemic. All we can do is watch this space.
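
As a rough back-of-the-envelope calculation, using the revenue figure quoted above as a proxy for “total worldwide annual turnover” (an assumption; the two measures are not identical, and any actual fine would be far lower than the ceiling):

```python
# Theoretical GDPR ceilings based on the 2019 revenue figure above.
easyjet_revenue_m = 6_385  # £ millions (EasyJet Annual Report 2019)

art5_ceiling = 0.04 * easyjet_revenue_m      # breaches of Article 5 (4%)
art32_34_ceiling = 0.02 * easyjet_revenue_m  # breaches of Articles 32/34 (2%)

print(f"Article 5 ceiling: £{art5_ceiling:,.1f}m")        # £255.4m
print(f"Articles 32/34 ceiling: £{art32_34_ceiling:,.1f}m")  # £127.7m
```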

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.


 


Covid-19, GDPR and Temperature Checks


Emma Garland writes…

Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course, if the temperature is recorded, or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.

Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.

The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff have less risk of close contact with people who may have COVID 19. However, temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Article 6(1)(d) of GDPR allows processing which:

“…is necessary in order to protect the vital interests of the data subject or of another natural person”

Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?

All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings. Thermometer guns take skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease, during which people may be asymptomatic.

ICO Guidance

The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:

“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”

The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.

More generally, the ICO says that technology which could be considered privacy-intrusive should have a high justification for usage. It should be part of a well thought out plan which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:

“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”

A Data Protection Impact Assessment should map the flow of the data, including collection, usage, retention and deletion, as well as the associated risks to individuals. Some companies are even using thermal cameras as part of COVID 19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.

As shops begin to open and the world establishes post-COVID 19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach that evaluates all the regulatory and privacy risks.

Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.


The NHS COVID 19 Contact Tracing App: Part 4 Questions about Data Retention and Function Creep


The first three blog posts in this series have raised many issues about the proposed NHS COVID19 Contact Tracing App (COVID App) including the incomplete DPIA and lack of human rights compliance. In this final post we discuss concerns about how long the data collected by the app will be held and what it will be used for.

From the DPIA and NHSX communications it appears that the purpose of the COVID App is not just to be part of a contact tracing alert system. The app’s Privacy Notice states:

“The information you provide, (and which will not identify you), may also be used for different purposes that are not directly related to your health and care. These include:

  • Research into coronavirus 
  • Planning of services/actions in response to coronavirus
  • Monitoring the progress and development of coronavirus

Any information provided by you and collected about you will not be used for any purpose that is not highlighted above.”

“Research”

Article 89 of the GDPR allows Data Controllers to process personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, subject to appropriate safeguards set out in Section 19 of the Data Protection Act 2018.

NHSX has said that one of the “appropriate safeguards” to be put in place is anonymisation or de-identification of users’ data, but only if the research purposes can be achieved without the use of personal data. However, even anonymised data can be pieced back together to identify individuals, especially where other datasets are matched.
The Open Rights Group says:

“Claims such as ‘The App is designed to preserve the anonymity of those who use it’ are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities…”

There are also legitimate concerns about “function creep”. What exactly does “research into coronavirus” mean? Matthew Gould, the chief executive of NHSX, told MPs the app will evolve over time:

“We need to level with the public about the fact that when we launch it, it will not be perfect and that, as our understanding of the virus develops, so will the app. We will add features and develop the way it works.”

Whilst speaking to the Science and Technology Committee, Gould stated that “We’ve been clear the data will only ever be used for the NHS.” This does not rule out the possibility of private companies getting this data as NHS Data Processors.

Data Retention

Privacy campaigners are also concerned about the length of time the personal data collected by the app will be held, both for contacts and for people who have coronavirus. The DPIA and Privacy Notice do not specify a data retention period:

“In accordance with the law, personal data will not be kept for longer than is necessary. The exact retention period for data that may be processed relating to COVID-19 for public health reasons has yet to be set (owing to the uncertain nature of COVID-19 and the impact that it may have on the public).

In light of this, we will ensure that the necessity to retain the data will be routinely reviewed by an independent authority (at least every 6 months).”

So, at the time of writing, COVID App users have no idea how long their data will be kept for, nor exactly what for, nor which authority will review it “every six months.” Interestingly the information collected by the wider NHS Test and Trace programme is going to be kept by Public Health England for 20 years. Who is to say this will not be the case for COVID App users’ data?

Interestingly, none of the 15 risks listed in the original DPIA relating to the COVID App trial (see the second blog in this series) covers keeping data for longer than necessary, the lawful basis for retaining it beyond the pandemic, or what the data could be used for in future if more personal data is collected in updated versions of the app. As discussed in the third blog in this series, the Joint Human Rights Committee drafted a Bill which required defined purposes and the deletion of all of the data at the end of the pandemic. The Secretary of State for Health and Social Care, Matt Hancock, quickly rejected this Bill.

The woolly phrase “personal data will not be kept for longer than is necessary”, and the fact that NHSX admits the COVID App will evolve and may collect more data, give the Government wriggle room to retain COVID App users’ data indefinitely and use it for other purposes. Could it be used as part of a government surveillance programme? Both India and China have made downloading their contact tracing apps a legal requirement, raising concerns of high-tech social control.

To use the App or not?

Would we download the COVID App in its current form? All four blogs in this series show that we are not convinced that it is privacy or data protection compliant. Furthermore, there are worries about the wider NHS coronavirus test-and-trace programme. The speed at which it has been set up, concerns raised by people working in it and the fact that no DPIA has been done further undermine confidence in the whole set up. Yesterday we learnt that the Open Rights Group is to challenge the government over the amount of data collected and retained by the programme.

Having said all that, we leave it up to readers to decide whether to use the app.
Some privacy experts have been more forthcoming with their views. Phil Booth of @medConfidential calls the Test and Trace programme a “mass data grab”, and Paul Bernal, Associate Professor in Law at the University of East Anglia, writes that the Government’s approach – based on secrecy, exceptionalism and deception – means our civic duty may well be to resist the programme actively. Finally, if you need a third opinion, Jennifer Arcuri, CEO of Hacker House, has said she would not download the app because “there is no guarantee it’s 100 percent secure or the data is going to be kept secure.” Over to you dear readers!

Will you be downloading the app? Let us know in the comments section below.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.



The NHS COVID 19 Contact Tracing App: Part 3 The Human Rights Angle


Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to “save lives” (except if your name is Dominic Cummings – Ed). However, there is much less consensus about what it should do, as can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.

On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. Two weeks later, the Secretary of State for Health, Matt Hancock, replied to the Committee, rejecting its proposals as “unnecessary”! Let us examine those proposals in detail.

The Human Rights Considerations

Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (and that includes the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that “Everyone has the right to respect for his private and family life, his home and his correspondence.” This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:

“is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”

However, the government also has an obligation to protect the “right to life” enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.

The PJCHR’s report provides a very detailed assessment of some of the human rights implications of the “centralised” approach that the NHS has proposed. Its overall conclusion is that, if the app is effective, it could help pave the way out of the current lockdown restrictions and help to prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”

How will the COVID App interfere with the right to privacy?

At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, each app user will be given a unique ID made up of a set of random numbers and the first half of the person’s postcode. The NHS web site suggests that this “anonymises” the information. However, as the Parliamentary Report notes, there are parts of England where fewer than 10,000 people live in a postcode area, and as little as 3 or 4 “bits” of other information could be enough to identify individuals. The report also notes that relying upon self-reporting alone (without requiring confirmation that a person has tested positive for COVID 19) carries the risk of false alerts, thereby impacting on other people’s rights if they have to self-isolate unnecessarily.
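
A toy model illustrates the Report’s point. Assume (purely for illustration; the attribute counts are invented, and independence and uniformity are idealisations) that a half-postcode narrows you to 10,000 residents and each extra data point is evenly spread across them:

```python
# How quickly an "anonymous" record narrows down within a half-postcode
# area of 10,000 residents. All attribute counts are invented.
population = 10_000
attributes = [("age band (10 bands)", 10), ("gender", 2), ("household size", 6)]

candidates = population
for name, distinct_values in attributes:
    candidates /= distinct_values
    print(f"after {name}: ~{candidates:.0f} possible matches")
# after age band: ~1000; after gender: ~500; after household size: ~83
```

A few more data points, such as an unusual travel pattern, can take that remaining handful down to one.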

Necessary interference?

An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).

To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). However, as noted below, the PJCHR believes that the “current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system”. The Committee’s recommendations in relation to this are considered below.

The remaining human rights consideration is whether the interference with peoples’ private lives is “necessary”. The answer to this depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives. This in turn depends on whether the app works and on the uptake of the app.

Although it was reported that uptake of the app on the Isle of Wight has exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether it necessarily follows that the uptake will be the same on the mainland. If the App is not capable of achieving its objective of preventing the spread of the virus, then the interference with peoples’ privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).

Although many people will probably download the app without thinking about privacy issues (how often do any of us download apps without checking Privacy Notices?), many others may have real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) has accidentally shared the email addresses of 300 contact tracers. Or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in the way in which data is being processed and why, and in the light of the above they may have concerns about data security.

Consequently, the PJCHR’s report includes a series of recommendations aimed at ensuring that “robust privacy protections” are put in place, as these are key to ensuring the effectiveness of the app.

Central to its recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals’ data protection rights are protected by the GDPR and DPA 2018, the Committee believes that it is “nearly impossible” for the public to understand what will happen to their data, and also that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of its draft Bill to the Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected that proposal on the basis that the existing law provides “the necessary powers, duties and protections” and that participation in contact tracing and use of the app is voluntary.
In contrast, the Australian government has passed additional privacy protection legislation specifically aimed at the collection, use and disclosure of its COVIDSafe app data.

The Committee’s other recommendations are:

  1. The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
  2. Particular safeguards for children under 18 to monitor children’s use, ensure against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
  3. The app’s contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. The Secretary of State for Health must therefore review the operation of the app and report to Parliament every three weeks.
  4. Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
  5. Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However these terms may be open to some interpretation.

Matt Hancock has written that he will respond to these other issues “in due course”.
It is unclear what this means, but it does not suggest any immediate response.

The Draft Bill

The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.

The Bill specifically limited the purpose of the COVID App to:

  1. Protecting the health of individuals who are or may become infected with Coronavirus; and
  2. Preventing or controlling the spread of Coronavirus.

Additionally, it contained provisions that prohibited the use of centrally held data without specific statutory authorisation, and limited the amount of time that data could be held on a smartphone to 28 days, followed by automatic deletion, unless a person has notified that they have COVID 19 or suspected COVID 19. It also prohibited “data reconstruction” in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the Unique IDs are not truly anonymous.

The ‘status’ of the NHS COVID App keeps changing and it remains to be seen when (and if) it will be rolled out. Meanwhile the Northern Ireland Assembly has already announced it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have 1 place left on the course starting on 11th June.



The NHS COVID 19 Contact Tracing App Part 2: The Data Protection Impact Assessment


Yesterday the Prime Minister said England will have a “world-beating” Covid-19 contact tracing system from June. Part of this system is the NHS contact tracing app (“the Covid App”), which is currently being trialled on the Isle of Wight.
The app was initially meant to be launched across England in mid-May. Yesterday No.10 suggested this will now happen “at a later date.” Why the delay? A look at the recently published Data Protection Impact Assessment (DPIA) for the trial makes it obvious that much more work needs to be done. Here is our analysis of some of the issues raised. (If you are new to this subject, we suggest you read the first blog in our series, which discussed how such apps work and the different models that can be used.)

Background to the DPIA

The start of the App project has not been auspicious; nor does it instil confidence in the people running it. How can the public, let alone privacy professionals, trust the government when it says that the app will respect their privacy?

The trial of the app started on the Isle of Wight before the Information Commissioner's Office (ICO) had been given sight of the DPIA. Although it has now seen a copy, the ICO has yet to give a formal opinion. Should the trial have gone ahead in these circumstances?

As demands grew to see the DPIA, NHSX published it as a .pdf document. However, embedded documents, including the all-important risk register, could not be accessed. So much for transparency! A few days later the Word version of the DPIA was published, revealing all the documents, but it contained typos and some names were not redacted. More importantly, those scrutinising it raised concerns that “high risks” in the original documentation had been listed as only “medium risks” in the risk register. NHSX quickly removed the Word document and only the .pdf version is now available (here). That the trial went ahead before all of the promised, finalised and accurate documentation had been released again does not engender faith in the app's ability to protect users' privacy.

Oversight

An Ethics Advisory Board has been set up to oversee the Covid App project. In a letter to the Secretary of State for Health and Social Care, the Board spelt out the six principles it expected to be followed: value, impact, security and privacy, accountability, transparency and control.

Some members of the Board have since raised concerns with the press over how the Board's advice has been handled. They were also unhappy not to have seen the final DPIA before being asked to comment.

Parliament’s Joint Committee on Human Rights has also been scrutinising the Covid App. It has said that it is not reassured that the app protects privacy and believes that it could be unlawful if the large amount of data gathered proved ineffectual. The Committee has even taken the unusual step of drafting a bill which would require all of the collected data to be deleted after the pandemic is over. (We will look at what data the NHS wants to keep for research purposes and why in our fourth and final blog in this series.)

These serious concerns, raised by experts and parliamentarians, will have a big impact on public uptake of the app.

Privacy by Design

In line with Article 25 of the GDPR, the app's DPIA states that it was designed, and will continue to evolve, with Privacy by Design principles embedded. These include:

  • collecting the minimum amount of data necessary;
  • data not leaving the device without the permission of the user;
  • users' identities being obscured to protect their privacy;
  • no third-party trackers;
  • proximity data being deleted from users' phones when no longer required;
  • users being able to delete the app and its data at any time;
  • personal data not being kept in the central database for longer than necessary;
  • data in the central database not being available to those developing the app, other than in exceptional circumstances; and
  • provision of any data from the central database being subject to a data protection impact assessment and the establishment of a legal basis for the disclosure.
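To make the first two of these principles concrete, here is a minimal, hypothetical Python sketch of data minimisation and of data not leaving the device without the user's permission. The types and field names are our own illustration, not the NHSX implementation:

```python
from dataclasses import dataclass

@dataclass
class ProximityEvent:
    other_id: str     # rotating pseudonymous ID of the other device
    rssi: int         # Bluetooth signal strength, used to estimate distance
    timestamp: float  # when the contact occurred
    # Deliberately no name, phone number or GPS location: data minimisation.

def prepare_upload(events, user_consented):
    """Package events for upload only if the user has explicitly agreed."""
    if not user_consented:
        return []  # nothing leaves the device
    return [{"id": e.other_id, "rssi": e.rssi, "t": e.timestamp} for e in events]

# Example: without consent, the upload payload is empty.
events = [ProximityEvent("3f9c01ab", -60, 1590000000.0)]
print(prepare_upload(events, user_consented=False))  # []
```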

Interestingly, the ICO and the European Data Protection Board have both favoured the decentralised model as it is more consistent with Article 25 and the principle of data minimisation under Article 5 of the GDPR.

Identified Risks

The key part of any DPIA is the set of risks identified and the mitigations that can be put in place to reduce them where possible. The documented risks in the Covid App DPIA include:

  • Transferring of data outside of the EEA
  • Misuse of information by those with access
  • Inadequate data processing agreements with relevant data processors
  • Lack of technical or organisational measures implemented to ensure appropriate security of the personal data
  • Personal data not being encrypted in transit or at rest
  • Lack of testing which would assess and improve the effectiveness of such technical and organisational measures
  • Inadequate or misleading transparency information
  • Misuse of reference code issued by app for test requests and results management
  • Malicious access to the sonar backend by cyber-attack; extraction and re-identification of sonar backend data by combination with other data
  • Identification of infected individual due to minimal contact – e.g. isolated person with carer who is only contact
  • Malicious or hypochondriac incorrect self-diagnosis on app
  • Absence of controls over access to app by children
  • Lower than expected public trust at launch
  • Uncertainty about whether users will be able to exercise their data subject rights (SRRs) in relation to data held in the sonar backend
  • Uncertainty over retention of individual data items

It is surprising that the Covid App DPIA identifies only 15 risks in such a major project involving the sharing of Special Category Data. To assess all of those risks as low to medium also casts doubt on the robustness of the risk assessments. Recently we heard that wide-ranging security flaws have been flagged by security researchers involved in the Isle of Wight pilot.

There also seems to be a lack of clarity about the data being processed by the app. In response to the concerns raised, NHSX itself tweeted that the Covid App “does not track location or store any personal information.”

This was quickly challenged by many in the data protection community, who disagreed with both assertions and argued that the app uses pseudonymised data and trackers. The ICO itself states on its website:

“Recital 26 (of the GDPR) makes it clear that pseudonymised personal data remains personal data and within the scope of the GDPR”.

The DPIA itself, however, does state that pseudonymised data will be used and that it is personal data. The mixed messages coming from NHSX will only continue to cause confusion and once again erode trust.
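The distinction matters. Below is a minimal, hypothetical Python sketch (the salt, phone numbers and records are all invented) of why a pseudonymised identifier remains personal data: anyone who holds the key material, or enough auxiliary data, can link it back to an individual.

```python
import hashlib

def pseudonymise(identifier, secret_salt):
    """Derive a stable pseudonym from a real-world identifier."""
    return hashlib.sha256((secret_salt + identifier).encode()).hexdigest()[:16]

# The central database sees only pseudonyms and aggregate details (invented data).
central_db = {
    pseudonymise("07700 900123", "salt"): {"contacts": 12, "area": "PO30"},
}

# Anyone holding the salt can rebuild the mapping and re-identify the record,
# which is why Recital 26 treats pseudonymised data as personal data.
known_numbers = ["07700 900123", "07700 900456"]
reverse_lookup = {pseudonymise(n, "salt"): n for n in known_numbers}

for pseudonym, record in central_db.items():
    print(reverse_lookup.get(pseudonym), record)  # prints the phone number
```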

Unidentified Risks

What is even more worrying is that there are some risks that have not been identified in the original DPIA:

  • There is a risk of function creep and of identification over time, as more personal data is added or different research projects link in other identifiable data sets, along with the risk of inferring smartphone users' movements and interactions.
  • Users' right to have their data erased under Article 17 of the GDPR is completely removed once the data is used for research purposes. We will explore this further in a later blog on research.
  • There is no mention of any risk associated with the Privacy and Electronic Communications (EC Directive) Regulations 2003/2426.
  • No decision has yet been made on retention periods for research. The data could be kept for too long, in breach of the storage limitation principle in Article 5 of the GDPR.
  • The collection of personal data could be unlawful as it may breach the Human Rights Act 1998. If the app does not prove effective, it is arguable that it is not necessary and proportionate for the purpose for which it was created. More on this in the third blog in this series.
  • It is also unclear how the NHS risk-scoring algorithm works, as details have not been published. The Privacy Notice makes no mention of automated processing and is therefore not compliant with Article 13(2)(f) of the GDPR.

Conclusion

At this moment in time, there are still far too many questions and unaddressed concerns relating to the Covid App to reassure the public that they can download it in good faith, knowing exactly what will happen to their data.

Feedback from the privacy community should result in a revised DPIA and further scrutiny. Only after all the questions and concerns have been addressed should the app be launched. Yesterday outsourcing firm Serco apologised after accidentally sharing the email addresses of almost 300 contact tracers. The company is training staff to trace cases of Covid-19 for the UK government!

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have one place left on the course starting on 11th June.



Act Now Expands its Cyber Security Team


Cyber security is one of the Information Commissioner's regulatory priorities; not surprising when you consider the Notices of Intent (to fine) issued by the ICO to British Airways and Marriott International. Recently we learnt that two companies involved in building emergency coronavirus hospitals have been hit by cyber attacks. Cyber security is an important subject that Data Protection Officers need to understand in order to fulfil their role effectively.

Act Now Training is pleased to announce that leading cyber security expert Olu Odeniyi has joined its team of associates. Olu is a cyber security, information security and digital transformation trusted advisor with 30 years' experience. During this time he has held several key senior leadership, strategic and operational positions in the public and private sectors. As a former trustee of three charities, Olu held the roles of Technical Lead, Treasurer and Chair, where he was responsible for regulatory compliance and for operational and project risk management.

Recent projects delivered by Olu include the investigation of cyber-related breaches, analysis of organisations' cyber security postures and in-depth risk assessments. He has advised companies on the requirements for attaining the government-backed Cyber Essentials certification and the coveted ISO 27001 Information Security Management standard. He has also given workshops, presentations and lectures at the University of West London on topics such as information security and digital transformation.

At the University's Enterprise Hub, Olu guided start-up companies on cyber security issues ranging from processes to technical considerations, and he continues to support and mentor such companies. He also analysed academic cyber security research on novel ways to secure IoT (Internet of Things) devices using artificial intelligence, reporting his findings to the University.

Olu speaks at various conferences and information sessions on information governance and cyber security. In February this year, he spoke at the PrivSec Conference on ‘Deepfakes’ (hyper-realistic synthetic video/audio generated by deep neural networks) to a packed theatre at the QEII conference centre in London. The session was hosted in the Threat Intelligence theatre alongside other speakers such as Mike Hulett, Head of Operations at the National Crime Agency (NCA).

Olu is a professional member of the BCS (British Computer Society – The Chartered Institute for IT) and a Microsoft Certified Professional (MCP). Within the BCS, Olu is a member of the Information Risk Management and Assurance (IRMA), Information Security, Artificial Intelligence and the Cybercrime Forensics specialist interest groups. Olu said:

“I am delighted to be joining the Act Now team and look forward to using my cyber security and digital transformation expertise to help Data Protection Officers understand and overcome the cyber challenges their organisations face. Over the coming months I will be developing practical online training courses that delegates can take from the comfort of their office.”

Ibrahim Hasan, solicitor and director of Act Now Training, said:

“Olu’s reputation precedes him. His expert knowledge, coupled with experience of working for a range of organisations, will help us expand our cyber security services. Together with our other cyber expert, Steven Cockroft, we are confident that we will be able to service the increasingly complex cyber needs of our clients.”

In addition to training, Olu can help your organisation with personal data breaches, penetration (pen) testing, incident management, breach reporting and incident response. Olu can also act as an outsourced or interim Chief Information Security Officer (CISO) or Chief Information Officer (CIO).

Olu will be delivering a free webinar, “Introduction to Cyber Security for DPOs”, on 26th May 2020 (11am). Places are limited so please book early. Our GDPR Update workshop is ideal for those looking to keep abreast of the latest GDPR developments. Finally, the GDPR Practitioner Certificate is now available as an online option and filling fast.

