Data Protection update – October 2022 – Stephenson Harwood




02 Nov 2022

Welcome to our data protection bulletin, covering the key developments in data protection law from October 2022.
On 7 October 2022, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (the “Executive Order”) which outlines the steps to be taken to implement the United States’ commitments under the EU-US Data Privacy Framework (the “Framework”) (previously reported on in our March 2022 bulletin). The Framework is intended to provide a safeguard for transatlantic data flows following the landmark Schrems II ruling in July 2020, which declared the EU-US Privacy Shield invalid as a transfer tool.
The Executive Order has the force of law in the US and sets out the US Government’s commitment to restore a legal basis for overseas transfers of personal data to the US, and ensure protection for EU personal data that is essentially equivalent to the protection it receives in Europe.
Under the Executive Order, the US Government makes a series of binding commitments on the safeguards and oversight applicable to US signals intelligence activities.
The EU and UK must now independently choose whether to grant an adequacy decision in relation to the US for the Framework to be approved, which is not expected before Spring 2023.
Data Protection Partner, Katie Hewson, joined the TechUK panel discussion on ‘The Future of Transatlantic Data Flows: The UK Perspective’ earlier this month to give her initial thoughts. We have outlined our five key takeaways here. You can also read our blog post here for a detailed summary of the Executive Order and Framework, including commentary on the reactions and challenges it has received to date.
Following the conclusion of a public consultation on employment practices last year (reported on in our September 2021 alert), the Information Commissioner’s Office (the “ICO”) has published two updated pieces of draft guidance, covering monitoring at work and information about workers’ health (together the “Draft Guidance”).
These updates form part of the ICO’s commitment to develop a new user-friendly online resource with topic-specific guidance to address the substantial changes to data protection legislation since the ICO published its employment practices code, the supplementary guidance and the quick guide (together the “Previous Guidance”). The Draft Guidance aims to help employers understand the data protection obligations they are required to comply with when monitoring workers and processing worker health data.
It is important for organisations to note that these obligations arise in the context of any employment relationship regardless of the contract type (i.e. they apply to part-time employees, workers, contractors and volunteers).
We are expecting the ICO to release further guidance in relation to employment practices, most likely in relation to recruitment and selection and employment records.
Technological advances in recent years have drastically increased the methods available for monitoring workers. For example, with the mass move to home working, employers have been able to monitor workers through the increasingly common use of applications and software such as Zoom and Teams. The draft guidance on monitoring at work (the “Draft Monitoring Guidance”) seeks to respond to a number of monitoring activities not previously available to employers and to provide guidance on how data protection legislation should be interpreted and applied to such activities.
The Draft Monitoring Guidance sets out a number of key takeaways for employers seeking to monitor workers lawfully.
The Draft Monitoring Guidance is subject to public consultation until 11 January 2023.
The draft guidance relating to health information that is processed in the context of employer-worker relationships (the “Draft Workers’ Health Information Guidance”) seeks to account for modern employment practices and technological advances which have increased the type and amount of data concerning health that is available to, and processed by, employers.
The UK’s General Data Protection Regulation (“UK GDPR”) provides a specific definition of “data concerning health” which covers personal data related to the physical or mental health of an individual. Any such health data is special category data and its collection will always be intrusive. Employers must therefore be transparent about the processing and provide clear information about what information is being collected and why. The Draft Workers’ Health Information Guidance specifies that employers should ensure that appropriate security measures are in place to protect worker health data, and that access to worker health data should be restricted on a “need to know” basis.
The guidance also emphasises the difficulties for an employer seeking to rely on consent, both as a lawful basis for processing data and to satisfy the specific condition for processing special category data, owing to the relationship dynamic between employer and worker that makes it difficult to prove that consent is freely given.
The updated guidance also includes sections on a number of specific topics.
The Draft Workers’ Health Information Guidance is subject to public consultation until 26 January 2023.
The ICO recently published guidance on direct marketing, specifically relating to the use of electronic mail (the “Electronic Mail Guidance”). The Electronic Mail Guidance seeks to provide more detail than the ICO’s general guide to the Privacy and Electronic Communications Regulations 2003 (as amended) (“PECR”).
The Electronic Mail Guidance provides practical steps to ensure compliance with PECR and clarifies a number of key points.
The Electronic Mail Guidance can be read here.
The European Data Protection Board (“EDPB”) recently issued minor updates to its guidelines on identifying a controller or processor’s lead supervisory authority (“Guidelines 8/2022”) and its guidelines on personal data breach notifications (“Guidelines 9/2022”), with the aim of clarifying their application.
The updates to Guidelines 8/2022 clarify the position in relation to the designation of a lead supervisory authority where there are joint controllers in the EEA. This issue is not directly dealt with in the EU General Data Protection Regulation 2016/679 (“EU GDPR”); however, Guidelines 8/2022 emphasise that joint controllers shall transparently “determine their respective responsibilities for compliance with their obligations under the GDPR”. The updates further provide that joint controllers should delegate responsibilities between themselves in order to ensure compliance with the EU GDPR, including specifically with respect to the organisation of contact with data subjects and supervisory authorities. However, this does not displace the standard determination of a competent supervisory authority under the EU GDPR, and joint controllers cannot designate a common main establishment whose supervisory authority would then act as the lead supervisory authority for both of them. Further, supervisory authorities themselves are not bound by any agreements between joint controllers, and the exercise of their powers cannot be limited by any such arrangements.
Guidelines 8/2022 now set out the test for identifying the lead supervisory authority for joint controllers. In addition to checking that the joint controllers are established in the EEA, the test is to identify, for each joint controller, the EEA country in which its place of central administration is located; the supervisory authority of that country will then act as the lead supervisory authority for that joint controller.
The updates to Guidelines 9/2022 seek to impose more onerous personal data breach notification obligations on controllers who are not established in the EU but who are still subject to the extra-territorial provisions of the EU GDPR. Under the terms of the proposed updates, such controllers (who do not benefit from the one-stop-shop system) shall be required to notify “every single authority for which affected data subjects reside in their Member State”. This differs from current guidance, which specifies that non-EU controllers are only required to notify the EU supervisory authority in the Member State where the controller’s representative is established.
If confirmed, these updates have the potential to generate an extremely heavy workload for controllers based outside the EU (including those based in the UK). Such controllers would be tasked with notifying a breach to numerous data protection authorities within the 72-hour time limit.
The EDPB are inviting comments on Guidelines 8/2022 until 2 December 2022 and on Guidelines 9/2022 until 29 November 2022.
On 25 October 2022, the Retained EU Law (Revocation and Reform) Bill (the “REUL Bill”, previously promoted as the “Brexit Freedoms Bill”) had its second reading in Parliament. The REUL Bill will next be reviewed by the House of Commons Public Bill Committee (the “Public Bill Committee”), which must conclude its consideration by 22 November 2022. We discussed the introduction of the REUL Bill and its key provisions in our September bulletin.
Parliament has issued a call for written evidence, through which views on the REUL Bill can be submitted to the Public Bill Committee. The call is addressed to those with expertise relevant to, or a special interest in, the REUL Bill. Written evidence will be considered by the Public Bill Committee up until the deadline of 5pm on 22 November 2022.
The UK GDPR and PECR currently fall within the scope of the REUL Bill’s sunset provisions, and the Data Protection Act 2018 (“DPA 2018”) will likely become assimilated law. However, the current draft of the REUL Bill makes various references to new sections of the DPA 2018 which will be introduced by the Data Protection and Digital Information Bill (the “Bill”), indicating that the UK Government intends for the UK GDPR to continue to apply, despite the overarching removal of the principle of the supremacy of EU law. As previously reported in our September bulletin, the second reading of the Bill was scheduled for 5 September 2022; however, it was delayed following the appointment of Liz Truss as Prime Minister and no new date has been set. The future of data protection legislation in the UK is therefore far from clear, and organisations should pay close attention to developments in this area, noting the administrative resources required to ensure continued compliance.
The call for written evidence can be found here.
On 12 October 2022, the UK Secretary of State issued a designated vendor direction to thirty-five UK telecoms network operators, requiring the removal of Huawei technology from all UK 5G public networks by the end of 2027. On the same day, Huawei received a public designation notice which categorised the company as a high risk vendor of 5G network equipment and services.
The UK Government’s decision to issue these legal notices follows the National Cyber Security Centre’s conclusion that the impact of US sanctions on Huawei’s supply chain (which, among other things, prevented access to US semiconductor technology) meant the security of Huawei products could no longer be managed. 
The designated vendor direction prohibits the installation of new Huawei technology on 5G networks with immediate effect. A number of other interim deadlines were also introduced to ensure that operators meet the ultimate deadline at the end of 2027 (such as removing Huawei equipment from the network core by 31 December 2023 and removing Huawei equipment from significant national security sites by 28 January 2023). It is anticipated that the performance of such interim controls may result in disruption and network outages – exacerbated by global supply issues.
This decision comes at a time of significant importance for telecoms providers, who now face stricter security obligations resulting from the introduction of the Electronic Communications (Security Measures) Regulations (the “Regulations”), which will be laid before Parliament under the Telecommunications (Security) Act 2021 – as reported in our September 2022 bulletin and our March 2022 bulletin.
Under the new Regulations, companies who do not meet their obligations can be fined up to 10 per cent of turnover or, in the case of a continuing contravention, £100,000 per day by the regulator.
The ICO has imposed a fine of £4,400,000 on Interserve Group Limited (“Interserve”) following an investigation into a cyber-attack on Interserve’s systems (as reported on in our May 2020 bulletin, which can be read here).
In its monetary penalty notice, the ICO identified a number of deficiencies in the security of Interserve’s IT infrastructure, which ultimately led to the 2020 cyber-attack affecting the personal data of up to 113,000 of Interserve’s employees. The ICO determined that these security deficiencies amounted to a failure to process personal data in a manner ensuring the appropriate security of the personal data, and therefore that Interserve had breached Article 5(1)(f) EU GDPR.
As a result of Interserve’s use of outdated operating systems and protocols, ineffective endpoint security and lack of employee training on phishing, the ICO further determined that Interserve had “failed to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk” as required by Article 32 EU GDPR.
Although Interserve had taken steps to improve the security of its infrastructure and notified the ICO and the National Crime Agency promptly following the cyber-attack, the ICO considered a fine of £4,400,000 was effective, dissuasive and proportionate in relation to the breaches of Articles 5(1)(f) and 32 EU GDPR.
The monetary penalty notice can be read in full here.
The ICO has issued two fines to Easylife Limited (“Easylife”) for breaches of the EU GDPR and PECR.
The ICO found that, between August 2019 and August 2020, the purchase of certain items from the Easylife catalogue would trigger third party marketing calls to customers in relation to health conditions linked to the specific catalogue item purchased. As Information Commissioner John Edwards noted in his statement, “Easylife was making assumptions about people’s medical condition based on their purchase history without their knowledge, and then peddled them a health product”. This amounted to a failure to process customer data lawfully, fairly and in a transparent manner in accordance with Article 5(1)(a) EU GDPR.
Additionally, the ICO noted that the data inferred by Easylife should be treated as special category data as it was personal data relating to health, despite the fact that Easylife did not definitively know whether the individuals had the inferred health condition or not. This was the case, in particular, as Easylife was using the inferred data to make its processing decisions and target its marketing to individuals.
Easylife was also found to be in breach of Regulation 21 PECR after making over one million unsolicited direct marketing calls, which amounted to predatory nuisance calls.
The fine issued to Easylife totalled £1,480,000, made up of £1,350,000 in relation to the breach of the EU GDPR and £130,000 for the breach of PECR.
The monetary penalty notices can be read in full here and the Information Commissioner’s statement can be read here.
The ICO has completed an investigation into a data breach by the Home Office, which involved anti-terrorism related documents containing special category data (the “Documents”) being left at a London venue. Following this investigation, the ICO has decided to issue a reprimand to the Secretary of State for the Home Department for infringements of Articles 5(1)(f) and 33(1) UK GDPR. In particular, the ICO identified that the Secretary of State had failed to ensure the security of the personal data and had failed to report the data breach to the ICO within 72 hours of becoming aware of the loss of the Documents.
In the reprimand, the ICO noted that the handling instructions for the Documents were not followed and that there was no documented sign out process for the removal of the Documents from Home Office property.
Amongst the further action recommended by the ICO, the Home Office was advised to ensure that “prominent and sufficient practical guidance” is given to staff handling documentation similar to the Documents, as well as to monitor whether staff are complying with any policies and procedures already in place.
The full reprimand can be read here.
The ICO has issued a warning to all companies contemplating the use of biometric technology that attempts to analyse data subjects’ emotions (the “Warning”). In the Warning, the Deputy Commissioner, Stephen Bonner, stated that “developments in the biometrics and emotion AI market are immature” and that the risks of exploiting any opportunities created by these technologies are greater as a result.
Emotional analysis technologies attempt to detect emotional cues using a range of biometric identifiers, such as gaze tracking, facial expressions and heartbeats. However, the Warning has highlighted the risk of bias, inaccuracy and discrimination when using insufficiently developed emotional analysis technologies.
Whilst the Warning could signal the ICO’s willingness to issue fines to companies who use experimental emotional analysis technologies irresponsibly or without assessing the public risks of use, the ICO has also set out its intention to publish broader guidance on the use of biometric technologies in Spring 2023. This follows the ICO’s insight and foresight reports, published at the same time as the Warning, which aim to provide businesses with some guidance on how to use biometric technology and data in a UK GDPR compliant manner.
The ICO’s insight and foresight reports can be read here, whilst the Warning can be read in full here.
The French data protection supervisory authority (“CNIL”) has become the latest supervisory authority to fine US facial recognition software developer Clearview AI Inc (“Clearview”) for breaches of the EU GDPR. Clearview were found to have collected and used biometric data without a legal basis, constituting an unlawful processing of personal data in breach of Article 6 EU GDPR, and were also found to have failed to effectively and satisfactorily take into account the rights of individuals, in breach of Articles 12, 15 and 17 EU GDPR.
The monetary penalty of €20,000,000, the maximum available to CNIL under Article 83 EU GDPR, is the latest fine against Clearview for its scraping of images from publicly available websites for use in facial recognition software. As we have reported on in previous bulletins, Clearview have already received fines this year from the Italian Data Protection Authority, the ICO, and the Greek DPA.
The full decision of CNIL can be read here.
Andrea Jelinek, the Chairperson of the EDPB, has written to the European Commissioner for Justice, Didier Reynders, to propose changes to how EU GDPR is enforced to tackle issues caused by administration of data protection cases in individual EU Member States.
Amongst the proposals, the EDPB set out a list of procedures that it seeks to align at an EU level, in particular in relation to rules determining how long domestic data protection authorities have to handle cases.
Additionally, the proposals aim to improve cooperation between data protection authorities in cross-border cases to improve the efficiency and speed with which cases are handled, as well as improving the response in high profile cases such as the Irish Data Protection Commission’s investigation into the data protection breaches committed by Instagram, as reported on in our September bulletin (here).
The EDPB’s letter to the European Commissioner for Justice can be read here.
Following a referral to the CJEU by the Austrian Supreme Court, Campos Sánchez-Bordona, an Advocate General within the CJEU, has issued an opinion confirming that ‘mere upset’ arising from breaches of EU GDPR does not entitle an affected data subject to compensation.
In the underlying case referred to the CJEU, UI v Österreichische Post AG, the Claimant is seeking compensation of €1,000 for the ‘inner discomfort’ caused by the political affinity attributed to him following the Austrian Post Service’s extrapolation and processing of personal data.
In his opinion, the Advocate General noted that infringement of the EU GDPR does not automatically give rise to a right for the data subject to receive compensation, as there are other remedies laid down in the EU GDPR to protect data subjects’ fundamental rights.
Although the Claimant’s data in Österreichische Post was processed without his consent under Article 6(1)(a) EU GDPR, and the EU GDPR provides for compensation for non-material damage such as that claimed, the Advocate General held that non-material damage does not include the mere upset that someone might feel as a result of breaches of the EU GDPR. The opinion did state, however, that it is for the national courts of EU Member States to decide precisely what constitutes non-material damage in the context of the EU GDPR.
Whilst the Advocate General’s opinion is not binding on the CJEU, if the opinion is followed it will set an important precedent on data subjects’ access to compensation under the EU GDPR and is likely to lead to a significant reduction in litigation arising therefrom, in particular, claims for collective redress for breaches of EU GDPR.
The Advocate General’s opinion can be read in full here.
In a similar case before the English Court, in Driver v Crown Prosecution Service [2022] EWHC 2500 (KB), Knowles J ordered the Defendant to pay the Claimant damages of £250 for breaches of the DPA 2018. In Knowles J’s judgment, he held that the breaches in question were “at the lowest end of the spectrum” and resulted only in a very modest degree of distress to the data subject. Therefore, although there was a breach of the DPA 2018, the damage caused was not sufficient to warrant more than the most minimal damages award that the Court could make.
Knowles J’s judgment can be read in full here.
In another Advocate General’s opinion for the CJEU, Advocate General Tamara Ćapeta has stated that the domestic legislation of EU Member States cannot prevent the interests of data subjects being taken into consideration pursuant to the EU GDPR. The Advocate General’s opinion comes following a request for a preliminary ruling by the CJEU from the Swedish Supreme Court in the case of Norra Stockholm Bygg AB v Per Nycander AB. The CJEU was asked to clarify whether EU GDPR imposes requirements on domestic procedural legislation, in particular when considering disclosure in legal proceedings.
In Norra Stockholm, the Defendant contested the quantum claimed by the Claimant in respect of the purported outstanding balance due for construction work, arguing that the Claimant spent less time than claimed on the project. The Defendant requested disclosure of staff attendance records in order to evidence the actual amount of time spent on the project, however this was rejected by the Claimant on the basis that the individual data subjects’ interests, which would be infringed by the disclosure, outweighed the value of allowing the Defendant access to the records.
In her opinion, the Advocate General held that where data is collected due to provisions of EU GDPR or domestic law, the starting point should always be that data subjects who have not given their consent to the processing of their personal data will always have an interest in restricting such processing. The Court must therefore be able to “justify why this interest should be interfered with” when allowing disclosure. The Advocate General further noted that the principles of Article 5 EU GDPR, in particular the data minimisation principle of Article 5(1)(c) EU GDPR, should always be considered.
In accordance with Article 5(1)(c) EU GDPR, the Advocate General stated that the proportionality of the disclosure must be considered, including whether the personal data are adequate, relevant and limited to what is necessary in relation to the purposes for which they will be processed. The Advocate General further noted that if the national courts of EU Member States follow the principles laid down in the EU GDPR and conduct a proportionality analysis when deciding on disclosure of evidence in civil litigation, the interests of data subjects will be protected irrespective of any national legislation.
The Advocate General’s opinion can be read in full here.
The CJEU has ruled that EU-based companies must use reasonable efforts to alert third parties when consumers revoke their consent to the publication of their personal data, such as contact details, in public directories.
The preliminary ruling stemmed from Proximus NV (Public electronic directories) v Gegevensbeschermingsautoriteit, a Belgian Court of Appeal case involving a data subject who had revoked his consent to having his personal data published in a public telephone directory. Pursuant to Directive 2002/58/EC (the “Directive”), where consent has been given by a data subject in a single instance to their contact details appearing in directories, other third party directories may rely on that consent to publish those contact details in their directories. Despite the data subject informing the original directory provider (Proximus NV) of his desire for his details to be removed from the directory, his personal data was still published in a number of third party directories.
The CJEU held that, following a request by a data subject to remove personal data from a directory (pursuant to the data subject’s right of erasure contained in Article 17 EU GDPR), the operator of the directory should make reasonable efforts to inform any third party that received personal data from the operator of the data subject’s withdrawal of consent. Additionally, the CJEU noted, the operator of the directory should inform search engine providers of the withdrawal of the data subject’s consent to the publication of their personal data.
The CJEU judgment has not yet been published in full in English; however, once published, it will be available to read in full here.
Katie Hewson
Partner
T: +44 20 7809 2374 | London

Ben Sigler
Partner
T: +44 20 7809 2919 | M: +44 7584 237 401 | London

Alison Llewellyn
Managing associate
T: +44 20 7809 2278 | London
 
