31 Aug 2022
Welcome to our data protection bulletin, covering the key developments in data protection law from August 2022.
The Information Commissioner’s Office (“ICO”) has published updated guidance on the use of binding corporate rules (“BCRs”) as a data transfer mechanism for controllers and processors under the UK GDPR.
The ICO recognises that BCR applicants may seek both EU and UK BCRs and that the requirements for both jurisdictions may currently overlap. In response to this, the updated guidance is designed to simplify the UK BCR approval process. Supporting documents and commitments will only be requested once during the UK approval process and the referential table that organisations must complete has been revised.
Organisations already granted approval of their UK BCRs by the ICO will not need to take any further action, although organisations still awaiting UK BCR approval should expect engagement from the ICO based on the new guidance. To read the guidance in full, click here.
The ICO has published a short guidance note to help small organisations deal with complaints about how they’ve used people’s information. Although the guidance is primarily directed towards small charities, clubs and organisations, the principles it sets out for good complaints handling are a useful steer for all businesses. The guidance recommends a series of practical steps for handling complaints effectively.
To read the guidance in full, click here.
Scotland is nearing approval of the world’s first statutory Code of Practice on the use of biometric data for policing and criminal justice (the “Code”). On 7 September, the Code will be brought before Scottish ministers and, if unopposed, could come into effect as early as 16 November. The Code aims to address current gaps in legislation and to guide organisations working in policing and criminal justice in their decision-making around the adoption of new and emerging biometric applications and technologies.
The Department for Digital, Culture, Media & Sport (“DCMS”) has published an in-depth qualitative study exploring organisational experiences of cyber security breaches. The study aimed to understand the level of cyber security in place before a breach; the types of cyber-attacks affecting organisations; how businesses act in the aftermath of a breach; and the impact of such breaches. The DCMS hopes that the findings will help businesses and organisations understand the nature and significance of the cyber security threats they face and what others are doing to stay secure.
To read the study in full, click here.
The supervisory authority of Lower Saxony has fined a bank €900,000 for creating customer profiles, enriched with third-party data, for advertising purposes without consent.
The DPA held that such processing of large amounts of data could not be based on legitimate interest (Article 6(1)(f) GDPR). Processing on this basis requires a balancing exercise between the interests of the controller and the fundamental rights and freedoms of the data subject, which must take into account the reasonable expectations of the data subjects. In this instance, the DPA held that data subjects could not reasonably expect large amounts of their personal data to be analysed by the controller for the purpose of tailoring its advertising.
The DPA further held that enriching customer profiles with data from a third-party source could likewise not be based on legitimate interest: such enrichment could link data from all areas of a person’s life into a detailed customer profile, which a customer could also not reasonably expect.
The publicly traded adtech company, Criteo, has disclosed in its financial filings published on 5 August 2022 that it has been the subject of a proposed fine of approximately $65.4 million for alleged breaches of the GDPR.
While specific details of the investigation and reasons behind the proposed fine are unknown, Criteo’s chief legal officer Ryan Damon issued a statement saying that the firm “strongly disagrees” with the report’s findings, “both on the merits relating to the investigator’s assertions of non-compliance with GDPR and the quantum of the proposed sanction.”
This news comes two years after France’s supervisory authority, CNIL, launched an investigation into the company’s data practices. The investigation arose from a complaint made by Privacy International, which raised concerns that Criteo was processing internet users’ personal data – including special category data – without the appropriate user consent frameworks in place, and that Criteo was not complying with high-level GDPR principles including fairness, transparency, accuracy and integrity. A final decision on the case and any associated fine is unlikely to be finalised until sometime next year, according to Criteo.
The Danish data protection authority, Datatilsynet, has upheld its ban of 14 July 2022 against the Municipality of Helsingør’s use of Google Workspace. The original decision, covered in our July bulletin, found that use of Google’s Workspace productivity suite was incompatible with the GDPR as a result of Google’s non-compliant international data transfers. In upholding the ban, Datatilsynet specified that it applies until the Municipality brings its processing activities in line with the GDPR and carries out a data protection impact assessment that meets the content and implementation requirements of Articles 35 and 36 of the GDPR.
Austrian advocacy group noyb.eu has lodged a complaint with CNIL alleging that Google breached a 2021 ruling of the Court of Justice of the European Union by sending direct marketing emails to customers without requesting permission.
Google and CNIL did not immediately respond to requests seeking comment.
A decision of the Court of Justice of the European Union (the “CJEU”) handed down on 1 August 2022 could have major implications for online platforms that use background tracking and profiling to target users with behavioural ads, or to feed recommender engines designed to produce ‘personalised’ content.
This decision arose from a referral from the Lithuanian courts and relates to national anti-corruption legislation that required publication of the names of officials’ spouses or partners. The CJEU was asked to consider whether the publication of the name of a spouse or partner amounted to the processing of sensitive data because it could potentially reveal sexual orientation. The Court decided that it did. By implication, the same rule would apply to inferences connected to other types of special category data too: the mere possibility that an inference of special category data could be drawn can be sufficient to require the data to be treated as special category data. In contrast, the UK position as set out in ICO guidance is that such an inference must actually be drawn, or otherwise acted on as if it were true, for the data to be treated as special category. This signals a possible divergence between the UK and EU approaches to what data is special category, as the CJEU ruling will not apply in the UK.
The judgment has several implications. Large online platforms have traditionally been able to circumvent a narrower interpretation of ‘special category’ personal data (such as health information, sexual orientation or political affiliation, which is strictly controlled under the GDPR) by triangulating and connecting large amounts of personal information through behavioural tracking to enable sensitive inferences to be drawn about individuals. The CJEU ruling indicates that such tracking is likely to intersect with protected interests and therefore entails the processing of sensitive data. Importantly, the CJEU has said that even incorrect inferences fall under the GDPR’s special category processing requirements.
A tighter interpretation of existing EU privacy laws, therefore, poses a clear strategic threat to adtech companies and will necessitate significant changes in the use of targeted advertising. It may also have complex knock-on effects in other areas, as it may require the application of an Article 9 exemption to many more types of processing where there is potential for special category data to be inferred: for example, if an inference of religious belief could be drawn from CCTV footage of those entering a church, the judgment may make it harder to justify using CCTV around such sensitive locations.
Katie | T: +44 20 7809 2374
Naomi | T: +44 20 7809 2960 | M: +44 7769 143 367 | Office: London
Ben | T: +44 20 7809 2919 | M: +44 7584 237 401 | Office: London