GDPR Fine for Charity Email Blunder

A Scottish charity has been issued with a £10,000 monetary penalty notice following the inadvertent disclosure of personal data by email. 

On 18th October, HIV Scotland was found to have breached the security provisions of the UK GDPR, namely Articles 5(1)(f) and 32, when it sent an email to 105 people, including patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.

The Information Commissioner’s Office (ICO) is urging organisations to revisit their bulk email practices after its investigation found shortcomings in HIV Scotland’s email procedures. These included inadequate staff training, incorrect methods of sending bulk emails by blind carbon copy (bcc) and an inadequate data protection policy. It also found that despite HIV Scotland’s own recognition of the risks in its email distribution and the procurement of a system which enables bulk messages to be sent more securely, it was continuing to use the less secure bcc method seven months after the incident.
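
The ICO’s notice does not prescribe a particular technical fix, but the point about bulk emailing is easy to illustrate. Below is a minimal sketch (not drawn from the notice) of one way to send a bulk mailing as a series of individual messages, so that no recipient ever sees another recipient’s address; the SMTP server, sender and recipient addresses are hypothetical placeholders.

    import smtplib
    from email.message import EmailMessage

    # Hypothetical values for illustration only.
    SMTP_HOST, SMTP_PORT = "smtp.example.org", 587
    SENDER = "newsletter@example.org"
    recipients = ["alice@example.org", "bob@example.org"]

    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
        server.starttls()                   # encrypt the connection
        # server.login(username, password)  # authentication omitted in this sketch
        for address in recipients:
            msg = EmailMessage()
            msg["From"] = SENDER
            msg["To"] = address             # one recipient per message: no shared address list to expose
            msg["Subject"] = "Service update"
            msg.set_content("Body text goes here.")
            server.send_message(msg)

A dedicated bulk-messaging system, such as the one HIV Scotland had already procured, achieves the same separation without relying on staff remembering to use the bcc field correctly.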

On the point of training, HIV Scotland confirmed to the ICO that employees are expected to complete the “EU GDPR Awareness for All” course on an annual basis. The ICO recommended that staff should receive induction training “prior to accessing personal data and within one month of their start date.” Act Now’s e-learning course, GDPR Essentials, is designed to teach employees about the key provisions of GDPR and how to keep personal data safe. The course is interactive with a quiz at the end and can be completed in just over 30 minutes. Click here to watch a preview.

HIV Scotland was also criticised for not having a specific policy on the secure handling of personal data within the organisation. It relied on its privacy policy, which was a public-facing statement covering points such as cookie use and data subject access rights; this provided no guidance to staff on the handling of personal data and what they must do to ensure that it is kept secure. The Commissioner expects an organisation handling personal data to maintain policies regarding, amongst other things, confidentiality (see our GDPR policy pack).

This is an interesting case, and one which will give little reassurance to the Labour Relations Agency in Northern Ireland, which had to apologise last week for sharing the email addresses and, in some cases, the names of more than 200 service users. The agency deals confidentially with sensitive labour disputes between employees and employers. It said it had issued an apology to recipients and was currently taking advice from the ICO.

Interestingly, the ICO also referenced in its ruling the fact that HIV Scotland had made a point of commenting on a similar error by another organisation eight months prior. In June 2019, NHS Highland disclosed the email addresses of 37 people who were HIV positive. It is understood the patients in the Highlands were able to see their own and other people’s addresses in an email from NHS Highland inviting them to a support group run by a sexual health clinic. At the time HIV Scotland described the breach as “unacceptable”.

The HIV Scotland fine is the second one the ICO has issued to a charity in the space of 4 months. On 8th July 2021, the transgender charity Mermaids was fined £25,000 for failing to keep the personal data of its users secure. The ICO found that Mermaids failed to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results.

Charities need to consider these ICO fines very carefully and ensure that they have policies, procedures and training in place to avoid enforcement action by the ICO.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in January.

Labour Relations Agency Data Breach: Ibrahim Hasan’s BBC Interview


The Labour Relations Agency in Northern Ireland has apologised for sharing the email addresses and, in some cases the names, of more than 200 service users.

https://www.bbc.co.uk/news/uk-northern-ireland-58988092

Here is Ibrahim Hasan’s interview with BBC Radio Ulster:

More media interviews by Ibrahim here.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said: 

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case, schools must provide a reasonable alternative means of accessing the service, i.e. paying for school meals in the present case.

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019 the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately €20,000) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that the authority had breached Article 5, by processing students’ personal data in a manner more intrusive as regards personal integrity, and encompassing more personal data, than was necessary for the specified purpose (monitoring of attendance); Article 9; and Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and to carry out prior consultation with the Swedish DPA.

The French regulator (CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, and which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy-related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Footballers’ Personal Data: Ibrahim Hasan’s BBC Interview


On Tuesday there was an interesting story in the media about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. 

The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.

Do footballers have rights in relation to this data? Can they use the GDPR to seek compensation for the use of their data?

On Tuesday, Ibrahim Hasan gave an interview to BBC Radio 4’s PM programme about this story. You can listen below:

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ring Doorbells, Domestic CCTV and GDPR

The Daily Mail reports today that, “A female doctor is set to be paid more than £100,000 after a judge ruled that her neighbour’s Ring smart doorbell cameras breached her privacy in a landmark legal battle which could pave the way for thousands of lawsuits over the Amazon-owned device.”

Dr Mary Fairhurst, the Claimant, alleged that she was forced to move out of her home because the internet-connected cameras are so “intrusive”. She also said that the Defendant, Mr Woodard, had harassed her by becoming “aggressive” when she complained to him.

A judge at Oxford County Court ruled yesterday that Jon Woodard’s use of his Ring cameras amounted to harassment, nuisance and a breach of data protection laws. The Daily Mail goes on to say:

“Yesterday’s ruling is thought to be the first of its kind in the UK and could set precedent for more than 100,000 owners of the Ring doorbell nationally.”

Before Ring doorbell owners rush out to dismantle their devices, let’s pause and reflect on this story. This was not about one person using a camera to watch their house or protect their motorbike. The Defendant had set up a network of cameras around his property which could also be used to watch his neighbour’s comings and goings. 

Careful reading of the judgement leads one to conclude that the legal action brought by the Claimant was really about the use of domestic cameras in such a way as to make a neighbour feel harassed and distressed. She was primarily arguing for protection and relief under the Protection from Harassment Act 1997 and the civil tort of nuisance. Despite the Daily Mail’s sensational headline, the judgement does not put domestic CCTV camera or Ring doorbell owners at risk of paying out thousands of pounds in compensation (as long as they don’t use the cameras to harass their neighbours!). However, it does require owners to think about the legal implications of their systems. Let’s examine the data protection angle.

Firstly, the UK GDPR can apply to domestic CCTV and door camera systems. After all, the owners of such systems are processing personal data (images and even voice recordings) about visitors to their property as well as passers-by and others caught in the systems’ peripheral vision.  However, on the face of it, a domestic system should be covered by Article 2(2)(a) of the UK GDPR which says the law does not apply to “processing of personal data by an individual in the course of purely personal or household activity.” Recital 18 explains further:

“This Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity and thus with no connection to a professional or commercial activity. Personal or household activities could include correspondence and the holding of addresses, or social networking and online activity undertaken within the context of such activities.”

The judge in this case concluded that the camera system, set up by the Defendant, had collected data outside the boundaries of his property and, in the case of one specific camera, “it had a very wide field of view and captured the Claimant’s personal data as she drove in and out of the car park.” This would take the system outside of the personal and household exemption quoted above, as confirmed by the Information Commissioner’s CCTV guidance:

“If you set up your system so it captures only images within the boundary of your private domestic property (including your garden), then the data protection laws will not apply to you.

But what if your system captures images of people outside the boundary of your private domestic property – for example, in neighbours’ homes or gardens, shared spaces, or on a public footpath or a street?

Then the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA18) will apply to you, and you will need to ensure your use of CCTV complies with these laws.”

Once a residential camera system comes under the provisions of the UK GDPR then of course the owner has to comply with all the Data Protection Principles including the obligation to be transparent (through privacy notices) and to ensure that the data processing is adequate, relevant and not excessive. Data Subjects also have rights in relation to their data including to see a copy of it and ask for it to be deleted (subject to some exemptions).

Judge Clarke said the Defendant had “sought to actively mislead the Claimant about how and whether the cameras operated and what they captured.” This suggests a breach of the First Principle (lawfulness and transparency). There were also concerns about the amount of data some of the cameras captured (the data minimisation principle).

Let’s now turn to the level of compensation which could be awarded to the Claimant. Article 82 of the UK GDPR does contain a free-standing right for a Data Subject to sue for compensation where they have suffered material or non-material damage, including distress, as a result of a breach of the legislation. However, the figure mentioned by the Daily Mail headline of £100,000 seems far-fetched even for a breach of harassment and nuisance laws, let alone GDPR on its own. The court will have to consider evidence of the duration of the breach and the level of damage and distress caused to the Claimant.

This judgement does not mean that Ring door camera owners should rush out to dismantle them before passing dog walkers make compensation claims. It does though require owners to think carefully about the siting of cameras, the adequacy of notices and the impact of their system on their neighbour’s privacy.

The Daily Mail story follows yesterday’s BBC website feature about footballers attempting to use GDPR to control use of their performance data (see yesterday’s blog and Ibrahim Hasan’s BBC interview). Early Christmas gifts for data protection professionals to help them highlight the importance and topicality of what they do!

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs, e.g. height, weight and performance at training sessions, then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that, as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also include reputational damage, loss of control and financial loss, all of which, it could be argued, result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels its passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising or attempting to exercise these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim is successful, it could have far-reaching effects beyond football. Whatever happens, it will get data protection talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.
