Abstract
Even though the Metaverse of science fiction is not yet a reality, it is possible to glimpse how it might look. However, the current vision of the Metaverse does not only encompass software. Many companies are complementing their Metaverse projects with Virtual and Augmented Reality devices such as headsets and glasses. Along these lines, one of the latest technological advancements in virtual and augmented reality devices is the introduction of eye-tracking technology. However, when new and additional kinds of data are processed, emerging risks for data protection might arise. This paper will, therefore, discuss the compatibility of eye-tracking devices for virtual and augmented reality environments with the European Union General Data Protection Regulation (GDPR). Since the GDPR is considered a worldwide role model in terms of fundamental rights protection, the compatibility of such devices with one of the most stringent data protection regimes will be put to the hardest test. The paper will do so by analyzing the state of the art of the technology, its use in headsets and glasses for virtual and augmented reality Metaverse environments, and the potential risks that such use might entail for data protection. After that, such risks will be confronted with the relevant applicable provisions of the GDPR. Finally, the paper will issue policy recommendations.
1 Introduction
The Metaverse was born in science fiction and has ended up becoming a reality. The term was first used by Neal Stephenson in his 1992 novel Snow Crash, and it represented ‘the vision of escaping to a place where digital displaced the physical’ (Levy, 2022). This is exactly the idea behind the current development of the Metaverse: to be able to do all the things humans can do within the physical world but in virtual environments, and many more besides, some of them perhaps unimaginable today.
Even though the Metaverse is not yet a mature technology, it has reportedly been widely used in fields with high social relevance such as healthcare (Marr, 2022) and education (Cortés, 2022). However, the Metaverse experience requires an entrance door. It is not possible to fully immerse oneself in its wonders with a computer, a tablet, or a mobile phone, for instance. New tools are needed to make the most out of new worlds. To this end, different hardware solutions have been developed, such as Augmented Reality (AR) glasses or Virtual Reality (VR) headsets. In essence, AR glasses and VR headsets work differently, especially in terms of how consumed content is provided to users and how users perceive augmented and virtual environments. In AR, virtual models and holograms are overlaid on real-world content, whereas in VR, users observe fully immersive and abstracted virtual worlds. Since both AR glasses and VR headsets are worn on users’ heads in direct proximity to them, it is possible to provide tailored and smooth experiences by observing users’ states through the sensors integrated into these devices.
To make the experience even smoother, state-of-the-art VR and AR devices have increasingly started to incorporate eye-tracking sensors and technology. Such technology allows a person’s facial expressions to be mirrored by their virtual avatar and provides better image quality and performance. In addition, it helps users interact with visualizations and spaces hands-free (Lystbæk et al., 2022) and enables content to be personalized based on users’ behaviors (Plopski et al., 2022).
With increasingly complex and modern technologies, the challenges surrounding them also grow more complex. From a legal perspective, the addition of eye-tracking technology, besides a host of new functionalities, also entails a new set of risks. For instance, when used for emotion recognition purposes, eye-tracking technology might compromise the right to freedom of thought.Footnote 1 Further, such potential intrusion could jeopardize the right to respect for mental integrity.Footnote 2 Indeed, the scholarship has very recently discussed the use of eye-tracking technologies in the neuromarketing sector and the associated risks of consumer manipulation (Sposini, 2024). More specifically, from a privacy and data protection point of view, the use of eye-tracking technology entails an additional amount of data gathered and processed, eye-tracking data, which, in some cases, might be personal data. Moreover, eye tracking differs from many other sensor technologies in AR and VR: with state-of-the-art eye trackers, it is possible to obtain fine-grained details on the visual attention and cognition of users. Hence, this requires a particular focus on legal and privacy aspects, especially in the Metaverse, considering that users might spend a significant part of their daily lives there.
This study is the first to focus on the specific legal implications of the use of eye-tracking technology in VR and AR devices for Metaverse environments. Further, the empirical research conducted on the available privacy policies represents a completely original contribution. Previous studies (Agencia Española de Protección de Datos, 2022; Bolognini & Carpenelli, 2022; Cerrina Feroni, 2023) have identified various ethical and legal risks of the Metaverse, but their engagement with eye tracking is limited or nonexistent. The scarce literature examining the (mostly) privacy implications of the use of eye tracking in VR and AR has a strongly technical approach (Bozkir et al., 2023; Kröger et al., 2020). While this approach is necessary for data protection by design and by default purposes,Footnote 3 it is not enough. Therefore, this paper will look at the impact of eye-tracking technology within VR headsets and AR glasses for Metaverse environments. It will also discuss the compatibility of eye-tracking devices in AR/VR for Metaverse experiences with European Union (EU) data protection regulation. More specifically, the paper will analyze their compatibility with the EU General Data Protection Regulation (hereinafter, GDPR).Footnote 4 Since the GDPR is considered a worldwide role model (Bradford, 2019; Cervi, 2022) in terms of fundamental rights protection, the data protection compliance of such technologies will be put to the hardest test.
This paper is structured in five parts. Part one has introduced the context and justification of the research. Part two will outline the privacy and data protection risks of eye-tracking technologies. Part three will dig deeper into the GDPR regime applied to such risks and whether they are tackled by the privacy policies of several VR and AR devices. Part four will point out some policy recommendations regarding the challenges discussed within parts two and three, and part five will summarize the conclusions of the paper.
2 Privacy and Data Protection Risks of Eye-Tracking Technology
Eye-tracking hardware and software obtain and process eye-tracking data, which consists of ‘eye movements and eye positions of an individual’ (Lim et al., 2022). To process eye-tracking data, AR and VR devices first capture an image of the eye. This image constitutes the raw data which, as will be further explained, eye-tracking providers often refrain from processing any further, for privacy reasons, once the gaze direction has been estimated through a mathematical algorithm.
According to Article 4(1) GDPR,
‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;
While applying a more ‘US-centric’ criterion defining personal data as personally identifiable information (Schwartz & Solove, 2011) might exclude the categorization of eye-tracking data as personal data, the Court of Justice of the European Union (CJEU) and EU data protection scholarship contemplate a broader notion of personal data, which might include eye-tracking data. In the Nowak v Data Protection Commissioner judgement,Footnote 5 the CJEU stated that,
[t]he use of the expression ‘any information’ in the definition of the concept of ‘personal data’, within Article 2(a) of Directive 95/46, reflects the aim of the EU legislature to assign a wide scope to that concept, which is not restricted to information that is sensitive or private, but potentially encompasses all kinds of information, not only objective but also subjectiveFootnote 6
Even though such a statement refers not to the GDPR but to its predecessor, known as the Data Protection Directive or DPD,Footnote 7 the same rationale applies. Eye-tracking data, on its own, might not allow the identification of a person. However, if the data processor/controller has ‘the legal means which enable it to identify the data subject with additional data […] about that person’,Footnote 8 such as the raw data, then we are moving within the realm of personal data. Following the same reasoning and broad interpretation, the CJEU has ruled that information such as IP addressesFootnote 9 or the written answers submitted by a candidate at a professional examination, and the examiner’s comments with respect to those answers, constitute personal data.Footnote 10
The EU’s stance towards a broad notion of personal data can also be observed in the Article 29 Working Party’s Opinion 4/2007 on the concept of personal data (de Hert & Papakonstantinou, 2016). The Opinion echoes the Commission’s words that ‘the amended proposal meets Parliament’s wish that the definition of “personal data” should be as general as possible, so as to include all information concerning an identifiable individual.’Footnote 11 Regarding the means to identify the data subject, the Opinion states that ‘[t]he intended purpose, the way the processing is structured, the advantage expected by the controller, the interests at stake for the individuals, as well as the risk of organisational dysfunctions (e.g. breaches of confidentiality duties) and technical failures should all be taken into account.’Footnote 12 The substance for the analysis of these factors should be included within the privacy policies of eye-tracking devices. However, as will be discussed later in this paper, the privacy policies are mostly disappointing in this respect.
According to the Article 29 Working Party, ‘[o]ne relevant factor […] for assessing “all the means likely reasonably to be used” to identify the persons will in fact be the purpose pursued by the data controller in the data processing.’Footnote 13 Regarding eye-tracking data, in some cases, the goal of tracking users’ eyes is precisely to collect personal information about individuals. In others, the data is not gathered with that goal, but it can nonetheless be reused to make inferences about individuals. In these situations, in the Working Party’s words, it can be presumed that the controller and any other relevant parties possess or intend to possess the methods ‘likely reasonably to be used’ for the purpose of identifying the data subject. As a result, the data should be regarded as pertaining to specific individuals, and data protection regulations should apply to the processing.Footnote 14 Finally, technical measures to prevent identification play a critical role in cases where the processing aim does not include the identification of the data subject. Considering all the means that the controller or any other person could reasonably use to identify the individuals, putting in place the necessary technical and organizational measures to protect the data against identification may make the difference in determining that the individuals are not identifiable. In this instance, the fulfillment of those criteria will act as a requirement for the information to be specifically excluded from the definition of personal data.Footnote 15
If we follow Article 4(14) GDPR, eye-tracking data might also qualify as biometric data as long as they ‘allow or confirm the unique identification of that natural person’. Biometric data, according to Article 9(1), belong to what the GDPR calls ‘special categories of personal data’. In line with its position on the notion of personal data, the CJEU has ruled that the notion of special categories of data should also be broadly interpreted. According to the Court,
a wide interpretation of the terms “special categories of personal data” and “sensitive data” is confirmed by the objective of Directive 95/46 and the GDPR, noted in paragraph 61 of the present judgment, which is to ensure a high level of protection of the fundamental rights and freedoms of natural persons, in particular of their private life, with respect to the processing of personal data concerning themFootnote 16
Considering such a broad interpretation of the notion of special categories of data, we could argue that, every time eye-tracking data allows the unique identification of a natural person to be confirmed, the processing of special categories of data should be considered to be taking place. Additionally, eye-tracking data could potentially be considered a special category of personal data as data revealing racial or ethnic origin.Footnote 17 According to Kröger and others, ‘video-based eye trackers can directly record […] skin color of a user’ (Kröger et al., 2020). Skin color is covered, according to Georgieva, under the notion of racial or ethnic origin in the GDPR (Georgieva & Kuner, 2020). Therefore, if they record the skin color of a user, eye-tracking devices will be processing special categories of data. Following Kröger and others, eye-tracking data can also reveal information regarding physical health, mental health, and substance use disorders (Kröger et al., 2020). Thus, if the eye-tracking data reveals health information, eye-tracking devices will be processing special categories of personal data according to the definition of Article 9(1) GDPR, which includes health data.Footnote 18
Such data processing is, in principle, prohibited within Europe, according to Article 9(1) GDPR. However, the regulation establishes some exceptions that allow this processing, such as consent. According to Recital 51 GDPR, when processing data under Article 9 GDPR, not only the criteria outlined in that article apply, but also the GDPR’s basic principles. This implies that the principles outlined in Article 5 GDPR apply, as well as the criteria for lawful processing. As a result, the processing of special categories of personal data must not only be based on one of the exceptions and conditions of Articles 9(2) and 9(3) GDPR, but also on a concurrent legal basis stated in Article 6(1) GDPR. These can be consent, performance of a contract, compliance with a legal obligation, protecting the vital interests of the data subject or other natural persons, performance of a public task, or legitimate interest.
When personal data and special categories of data are considered through the lens of eye tracking, one of the most straightforward ways of obtaining them is through iris texture, which is like a visual fingerprint. Using this type of data, it is therefore possible to carry out authentication tasks accurately (Kumar & Passi, 2010). This is one of the main reasons most companies, for privacy reasons, do not share raw eye images and video data.Footnote 19 However, even if the raw data is not shared or not further processed beyond the gaze estimation task, prior research has pointed out the likelihood of inferring various user attributes, such as age, gender, race, body mass index, sexual preference, or health status, from eye-tracking data (Liebling & Preibusch, 2014; Kröger et al., 2020). These inferences could potentially be carried out in the Metaverse as well (Bozkir et al., 2023).
The main difference between person or user attribute identification through raw data (e.g., iris textures) and through processed eye movements (e.g., gaze directions over time) is that, with the raw data, identification is independent of the stimulus users encounter, since processing is carried out at the level of raw eye images and video. By contrast, user attribute identification, such as the inference of health status or body mass index, depends on the stimulus being viewed, as people with different characteristics explore visual stimuli in different ways, which is reflected in their gazing behaviors. For instance, there is a low likelihood that eye-tracking data from a driving task in VR reveals the sexual preferences of the users (Bozkir et al., 2023; Rieger & Savin-Williams, 2012). Therefore, when studying the privacy aspects of eye-tracking data in AR/VR, it is essential to consider the stimulus users see and the virtual space surrounding them.
Considering these privacy risks, a few works have focused on protecting individuals’ privacy when eye trackers are utilized. Differential privacy (Dwork, 2006) has been one of the approaches applied. It uses a privacy metric to quantify the risk of an individual participating in a database, and privacy protection is achieved by adding randomly generated statistical noise to the data and queries. An adversary is therefore never sure whether an individual participates in a database or not. Since privacy protection is achieved by adding a significant amount of noise to the data, it comes at the cost of decreased utility, meaning that the usefulness of the data for further data mining-related tasks decreases. It is therefore essential to arrive at a privacy-utility trade-off, where privacy is protected yet the utility task achieves good performance.
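The mechanics of differential privacy can be illustrated with the classic Laplace mechanism. The sketch below is a minimal, hypothetical example, not the method of any particular eye-tracking vendor or of the works cited above: it assumes gaze coordinates normalized to [0, 1] and releases only a noisy mean, with noise scaled to the query’s sensitivity.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a zero-mean Laplace distribution
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_mean_gaze(samples, epsilon, lo=0.0, hi=1.0):
    """Differentially private mean of 1-D normalized gaze coordinates.

    Each sample is clamped to [lo, hi]; the sensitivity of the mean
    query is then (hi - lo) / n, and adding Laplace noise with scale
    sensitivity / epsilon yields epsilon-differential privacy.
    """
    clamped = [min(max(s, lo), hi) for s in samples]
    true_mean = sum(clamped) / len(clamped)
    sensitivity = (hi - lo) / len(clamped)
    return true_mean + laplace_noise(sensitivity / epsilon)
```

The privacy-utility trade-off discussed above appears directly in the `epsilon` parameter: a small epsilon injects more noise (stronger privacy, lower utility), while a large epsilon returns a value close to the true mean.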
Previous research has shown that it is possible to achieve such a privacy-utility trade-off when eye-tracking data collected from AR/VR are anonymized. However, researchers have also indicated that finding an optimal trade-off is not trivial (Steil et al., 2019; Bozkir et al., 2021). While differential privacy approaches preserve privacy through mathematical proofs at the cost of decreased utility, more practical approaches in the same domain have focused on either data downsampling or probabilistic machine learning techniques to mitigate privacy issues, particularly to hinder person re-identification (David-John et al., 2021; Fuhl et al., 2021). As an alternative to the privacy protection methods discussed, one can process eye-tracking data in encrypted domains to protect sensitive information from adversaries. However, encryption and the related mathematical operations often come at the cost of computational complexity, and such complexity is another challenge to address in the pervasive eye-tracking world, especially in AR/VR spaces (Bozkir et al., 2020).
The next section of this paper discusses the regulatory regime that the GDPR puts in place to address the above-mentioned questions, and whether and how these are reflected within the privacy policies of several AR/VR devices that embed eye trackers for accessing the Metaverse.
3 Legal Analysis
3.1 Basis for Data Processing
As stated within the previous sections of the paper, the processing of special categories of data is in principle prohibited by Article 9(1) GDPR. However, the second paragraph of that article allows the processing in certain specific scenarios. Such exceptions must be additional to the existence of a lawful basis for data processing under Article 6(1); thus, both must be met simultaneously. For this piece, we will discuss the three bases that are most likely to apply to the processing of eye-tracking data by AR and VR devices within Metaverse environments, namely (explicit) consent, performance of a contract, and legitimate interest.
3.1.1 Consent
The legal basis most commonly applied to the processing of eye-tracking data by VR/AR devices in Metaverse environments is consent. However, it cannot be completely dismissed that other bases, such as substantial public interest, might apply, if not now then in the future (Europol Innovation Lab, 2022).
According to Article 4(11) GDPR, ‘“consent” of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’. Further, Articles 7 and 8 GDPR establish the conditions for consent and the conditions applicable to a child’s consent concerning information society services (Kosta, 2020a, b). As we will discuss later, privacy policies of AR/VR devices including eye tracking mainly contemplate consent as their lawful basis for data processing.
In general, devices follow an opt-in/opt-out model in which the user should enable the eye-tracking functionality. However, regarding the processing of special categories of personal data, Article 9 requires ‘explicit’ consent. According to the Article 29 Working Party, ‘[t]he term explicit refers to the way consent is expressed by the data subject. It means that the data subject must give an express statement of consent.’Footnote 20 This could mean a written statement signed by the data subject, but also, ‘in the digital or online context, a data subject may be able to issue the required statement by filling in an electronic form, by sending an email, by uploading a scanned document carrying the signature of the data subject, or by using an electronic signature.’Footnote 21 The abovementioned opt-in/opt-out model seems more aligned with the US concept of consent, which is looser than the GDPR’s, particularly regarding the processing of special categories of personal data.
On top of this, two aspects should be discussed to make such consent completely compliant with the GDPR, namely the presence of dark patterns and information duties.
3.1.1.1 Dark Patterns
According to the Spanish Data Protection Authority,
The term dark patterns refers to user interfaces and user experience implementations intended to influence people’s behaviour and decisions when interacting with websites, apps and social networks, so that they make decisions that are potentially detrimental to the protection of their personal data. (Agencia Española de Protección de Datos, 2022)
If the opt-in option for eye tracking is enabled by default, it may be excessively easy for personal data to be collected, and/or very difficult for the data subject to reject the enabling due to the design and format of the consent form. As this would require no effort, or only minimal reading, from the data subject, the conditions of Article 4(11) or 9(2)(a) GDPR might not be met in practice. The same might happen if the opt-out option is difficult to reach or burdensome for the data subject to exercise. Accordingly, privacy policies for eye tracking by VR/AR devices should be carefully monitored in search of potential dark pattern practices.
3.1.1.2 Information Duties
Whenever data processing is lawful, data controllers are expected to comply with a series of information duties towards the data subject. These duties should be carefully implemented within the privacy policies of VR/AR devices including eye tracking. Article 13 GDPR establishes the kind of information to be provided where personal data are collected from the data subject and, according to Article 12 GDPR, such information shall be provided ‘in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child.’ Children are a particularly relevant audience in this respect, since gaming in the Metaverse is highly targeted at them.
3.1.2 Performance of a Contract
In order for this lawful basis to apply, the conditions for establishing a contractual relationship between the data controllers or processors and the data subjects should be clearly established. Further, according to Sposini, ‘this provision does not cover those situations where the processing is not genuinely necessary for the performance of a contract, but rather unilaterally imposed on the data subject by the controller.’ (Sposini, 2024). Finally, any interference with the right to personal data must be balanced against the benefit such interference entails.
In the case of eye-tracking data processing by AR/VR headsets, the benefit is, in principle, related to improving and enhancing the customer experience. Eye tracking allows for smoother interaction, and its inclusion was aimed at enabling virtual avatars to mimic facial expressions. However, such features should be balanced against the risks eye tracking might entail for data protection. Similarly, regarding data minimisation, according to the Norwegian Data Protection Authority (DPA), data minimisation ‘stipulates proportionality’ in interfering with the data subject’s privacy (Datatilsynet, 2018). Following the less restrictive means test of the proportionality principle, Recital 39 GDPR specifies that ‘[p]ersonal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means.’ Further, according to Sposini, ‘the fact that some data processing is covered by a contract does not automatically mean that processing is necessary for its performance’ (Sposini, 2024). Whether there are other less restrictive and equally effective technical solutions that achieve the same level of user experience without compromising fundamental rights in the field of eye tracking is an open question. In any case, providers should ensure that future developments of the state of the art always advance in the most privacy-friendly and least intrusive direction.
According to several Data Protection Authorities, performance of a contract as a legal basis provides consumers with a degree of confidence when they agree to enter into a contract to use social media platforms that are intrinsically reliant on advertising, such as Facebook and Instagram.Footnote 22 Furthermore, limiting the possibility of relying on contractual necessity may cause data controllers to turn to alternative legal bases, such as legitimate interest, necessitating additional investments in designing and implementing appropriate safeguards that balance data controllers’ interests with data subjects’ fundamental rights (Pollicino et al., 2023). Finally, even though performance of a contract is contemplated as a lawful basis within Article 6(1), it is not one of the exceptions provided by Article 9(2); it will therefore not be allowed as a lawful basis for the processing of special categories of personal data.
3.1.3 Legitimate Interest
The legitimate interest lawful basis has also been argued as a possibility to allow the processing of data within Metaverse environments, for instance, within the education field. By way of illustration, VR headsets are used by medical students to simulate performing surgery in an operating room in the Metaverse. In this case, it should be considered whether the data processing by eye tracking passes the balancing test comparing the legitimate interest of the controller or third party with the rights, freedoms and interests of the individuals. Further, the principle of purpose limitation should also be considered, since all the data processed should strictly serve the legitimate interest and not further purposes such as advertising.Footnote 23
However, it is important to consider that even though legitimate interest is contemplated within Article 6(1) as a lawful basis for personal data processing, it is not one of the exceptions provided by Article 9(2) for the processing of special categories of personal data. Considering that, for special categories of data to be processed, an Article 9(2) exception and an Article 6(1) lawful ground must be met simultaneously, legitimate interest will not be allowed as a lawful basis for the processing of special categories of personal data. Therefore, this basis will only apply where no special categories of data and/or biometric data are processed by eye-tracking devices. If that is the case, the different purposes of the processing operations should be broken down and the (potential) legitimate interest in each of them should be considered. Granularity within the data policies will help in this respect. For instance, there could be a legitimate interest in the headset/glasses calibration process, if failure of that process would render the technology inoperable. However, if less intrusive ways of performing the same operations are available, the legitimate interest lawful basis should not prevail.
The use of the legitimate interest lawful basis has been quite controversial regarding social networks (in a certain sense, one of the many Metaverse purposes could be to act as a social network), since different models have been adopted by data controllers, such as performance of a contract, legitimate interest, and consent. Data protection authorities have been upfront regarding the use of the legitimate interest basis by social network sites, particularly in relation to targeted advertising (Pollicino et al., 2023).Footnote 24 Therefore, the use of the legitimate interest lawful basis for eye-tracking data processing in Metaverse environments is an open question beyond the scope of this paper. First, because the relevant stakeholders do not seem able to reach an agreement as to whether the activities performed within the Metaverse and the consequent data processing (in general and by eye-tracking devices) could fit within the legitimate interest lawful basis. Second, because the scholarship in this direction is almost nonexistent.
3.2 Data Security, Storage Limitation, Purpose Limitation and Data Minimisation
Regarding data security requirements, Article 32 GDPR establishes obligations on both data controllers and processors. Apart from that, the principle of storage limitation in Article 5(1)(e) GDPR states that ‘[p]ersonal data shall be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed’. In this case, the importance of the purpose of the processing should also be assessed. Privacy policies should clearly state the purposes of data processing. Some policies claim that the data will be immediately deleted. However, they also claim that, in case of a crash, access logs might be sent to the platform providers. Further, it should be considered whether developers of AR/VR applications also have access to eye-tracking data for development purposes. Data security requirements are especially important in the case of biometric data since, in case of a data breach, unlike other data that can be replaced, such as passwords or bank account numbers, they are irreplaceable. Therefore, full anonymization should be the goal to achieve. Anonymized data are not personal data and thus do not fall under the scope of application of the GDPR.Footnote 25 However, it should be noted that, according to a growing strand of literature, as data processing technologies evolve and more data becomes accessible for processing, full anonymisation is no longer achievable (Purtova, 2018).
Regarding data protection by design and by default, incorporating privacy-enhancing solutions such as differential privacy (see Sect. 2) will help eye-tracking devices comply with these criteria. This will be further explored within our third policy recommendation in Sect. 4.
According to the data minimisation principle (Article 5(1)(c) GDPR), processing is always connected and rigorously confined to a specific purpose; hence, it is impossible to give general permission for the processing of personal data. In other words, in the case of eye-tracking data, specific consent is required for each purpose. Together with the principle of purpose limitation, data minimisation limits the collection and processing of personal data to what is necessary and compatible with the original purpose. Technical solutions such as decreasing the granularity of eye-tracking data by applying downsampling (David-John et al., 2021) can also help to this end, as long as downsampled data representations are coherent with purpose limitations.
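Purely as an illustration of what such granularity reduction could look like in practice (the function, sampling rates, and block-averaging strategy below are our own simplified assumptions, not the method of David-John et al.), a gaze signal recorded at a high frequency can be downsampled before any further processing:

```python
import numpy as np

def downsample_gaze(gaze_xy: np.ndarray, src_hz: int, dst_hz: int) -> np.ndarray:
    """Reduce the temporal granularity of a gaze signal (N x 2 array of
    x/y gaze coordinates) by averaging consecutive samples in blocks.
    A hypothetical sketch of data minimisation via downsampling."""
    factor = src_hz // dst_hz
    # Drop the trailing samples that do not fill a complete block
    n = (len(gaze_xy) // factor) * factor
    blocks = gaze_xy[:n].reshape(-1, factor, gaze_xy.shape[1])
    return blocks.mean(axis=1)

# A 120 Hz signal of 480 samples reduced to 30 Hz yields 120 samples
signal = np.random.rand(480, 2)
coarse = downsample_gaze(signal, src_hz=120, dst_hz=30)
```

The coarser representation retains enough information for purposes such as interaction or calibration while discarding fine-grained dynamics (e.g., micro-saccades) that could otherwise support user identification, in line with processing only what is necessary for the stated purpose.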
As explained throughout this paper, the collection and processing of eye-tracking data by VR/AR devices for Metaverse environments can serve different uses, such as calibration, improving avatar representation, or smoothing the Metaverse experience. In this line, privacy policies should establish whether eye-tracking data will be used for one or more of these uses (or different ones) and collect separate consents for each of them. Application developers should also keep in mind that the use of eye movement data beyond the initial purpose requires a separate legal grounding and consent (Gressel et al., 2023). Further, different Metaverse experiences (such as different applications) should also comply with the data minimisation principle as far as the collection and processing of eye-tracking data are concerned. It could be the case that a user consents to the processing of their eye-tracking data for a specific game/application but is not interested in providing consent for a different one. Privacy policies that lack accurate information and therefore lead the user to provide a wider consent than initially intended will not comply with the GDPR, and the transparency and accountabilityFootnote 26 of data controllers/processors will also be jeopardized.
3.3 Data Transfers
Finally, the implications of joint controllership and cross-border data transfers should be briefly considered. Although not exclusive to eye-tracking data, such questions are crucial aspects of data protection in the context of VR and AR applications that involve multiple actors and jurisdictions. According to Article 3 GDPR, non-European organizations must comply with the GDPR if they provide virtual goods and services to users in the EU or monitor the behavior of EU-based users. Although the literature has criticised the current state of affairs relating to cross-border transfers of personal data to third countries as disproportionate, ineffective, and unpredictable (Björn, 2023), the rules established under Chapter V of the GDPR are fairly precise in formulating the tools that allow for the transfer of personal data outside the EEA. Furthermore, the EDPB and national regulatory bodies have been quite active in working on their interpretation and on potential gaps that must be remedied.Footnote 27 Cross-border data transfers to countries outside the European Economic Area (EEA) or to international organisations are critical components of the Metaverse infrastructure, since most of the companies offering Metaverse experiences are located outside the European Union, particularly in the US. Further, the “no-boundaries” character of the Metaverse requires further clarification of the rules governing data transfers and processing beyond the EU. In this fashion, some authors have even argued that ‘[t]he metaverse should be an exception to data transfer rules in order to facilitate functionality and interoperability in the programmed platform’ (Martin, 2022). However, in the opinion of these authors, the Metaverse case is not so different, for instance, from that of the Internet.
The export of personal data from the EEA to third countries must comply with a set of criteria and requirements outlined in Chapter V of the GDPR. Aside from adhering to the rules of Chapter V, transferring personal data to a non-EEA country or international organisation requires adhering to the GDPR’s basic processing principles, which include having an appropriate legal basis for processing, implementing the necessary security measures, and only processing the personal data required for the specific processing activity. Even if the recipient of the personal data serves as a data processor, a contract must be established. According to the GDPR, there are two basic ways to transfer data outside the EEA. The first involves transfers based on an adequacy decision (Article 45 GDPR): following an evaluation by the European Commission, the third country or international organisation in question must offer an “equivalent level of data protection” to that prevailing in the EEA.Footnote 28 The second involves transfers subject to appropriate safeguards (Article 46 GDPR). The “appropriate safeguards” that may be used to transfer personal data to non-EEA countries in the absence of adequacy decisions can be provided by the various transfer tools listed in Article 46(2) GDPR: standard contractual clauses (SCCs), binding corporate rules (BCRs), codes of conduct, certification mechanisms, and ad hoc contractual clauses. However, because these instruments rely on effective protection, they are frequently reviewed. Not surprisingly, according to Article 46(5) GDPR, authorizations granted by national Data Protection Authorities (DPAs) under the aegis of the European Union’s first data processing legislation, Directive 95/46/EC, “remain valid until such time as the same supervisory authority amends, replaces, or revokes them, if necessary”.
One of the most notable instances in this regard is the Schrems II case,Footnote 29 where the CJEU emphasized that standard contractual clauses and the other transfer tools specified in Article 46 GDPR do not function in a vacuum. Controllers or processors acting as exporters are therefore responsible for determining, on a case-by-case basisFootnote 30 and, where appropriate, in cooperation with the importer in the third country, whether the law or practice of the third country affects the effectiveness of the appropriate safeguards included in Article 46 GDPR. In some situations, the Court expects exporters to take additional steps to close these protection gaps and raise the level of protection to that mandated by EU law. The CJEU highlights that exporters will need to identify these steps on a case-by-case basis, even if it does not specify which ones these could be.
Additionally, the GDPR allows for data transfers based on derogations (Article 49 GDPR). These transfers are seen as exceptional and may be used in the following situations: when made with the express consent of the individual; when required for the performance of a contract between the individual and the organization, or for pre-contractual measures taken at the individual’s request; when required for the performance of a contract concluded in the individual’s interest between the data controller and another party; when required for important reasons of public interest; when required for the establishment, exercise, or defence of legal claims; when required to safeguard the vital interests of the individual concerned or of other persons, where the individual concerned is physically or legally incapable of giving consent; or when made from a register that is intended to provide information to the public under EU law or the national law of an EEA country (and which is open to consultation by anyone with a legitimate interest in viewing the register). In addition, a form of necessity test needs to be applied to determine whether the transfer is necessary to fulfil the precise goal of the derogation in question.Footnote 31
3.4 Policy Analysis
This section has a practical scope. Therefore, it focuses on concrete cases to discuss the compatibility of the use of AR/VR devices with eye-tracking functionality with the GDPR. To this end, the English versions of the privacy policies of several devices were selected. First, a distinction was made between AR and VR devices, including multi-purpose Extended Reality (XR) devices within the VR category. For AR devices, the focus is on optical see-through displays. For VR and multi-purpose XR devices, the focus is on devices where users are either fully immersed in a virtual space or view real-world content through video see-through displays, respectively.
In summary, when optical see-through displays are used, users primarily observe real-world content, and digital information is overlaid in front of their real view. In contrast, video see-through displays capture real-world content through cameras, and digital content is overlaid onto the streamed camera feed, which might lead to slight latency issues in visualizations. Beyond the technical capabilities, in terms of privacy, AR devices can be considered riskier than VR devices because they directly perceive real-world surroundings and bystanders.
Subsequently, a further distinction was made between those devices which had an ad-hoc privacy policy and those which forwarded users to a general privacy policy. Analyses of devices in the former category can be more granular and specific, while those in the latter category only allow us to consider more general aspects, as far as they apply to eye-tracking data. The analysis has focused on the categories of data processed, particularly eye-tracking data, and the way such data are processed. The distinction between general and ad-hoc privacy policies is merely practical. In principle, the existence of an ad-hoc privacy policy allows us, from a research perspective, to grasp more detailed and granular information for the purposes of this paper. However, in practice, as we will discuss, most devices lack such a policy. This should not imply that eye-tracking data processing may remain obscure. Therefore, the most logical choice seems to be to dig into the general privacy policies. It should be noted, though, that this distinction does not have any legal implications. In this respect, Article 24(2) GDPR establishes that ‘[w]here proportionate in relation to processing activities, the measures referred to in paragraph 1 shall include the implementation of appropriate data protection policies by the controller’, the referred measures being the appropriate technical and organisational measures that the controller should implement to demonstrate compliance with the GDPR.
Along these lines, the existence of ad-hoc policies should, from a consumer/data subject perspective, make transparency and access to information easier. If a user is looking to acquire or try an AR/VR device, consulting its particular privacy policy should make it easier to find information about how the collected data will be processed. On the other hand, when such policies are nonexistent, the general privacy policy should offer clear and transparent information so that data subjects do not feel they are relinquishing their right to information (Articles 12 and 13 GDPR) merely because of the lack of an ad-hoc privacy policy. However, in practice, as we will discuss in the following sections, this translates, in most of the policies, into scattered and insufficient information, thus affecting the GDPR’s transparency principle. Finally, this also entails consequences for data controllers. According to the principle of accountability (Article 5(2) GDPR), the controller should be able to demonstrate compliance with transparent processing of personal information. However, it is not clear that compliance with the transparency principle can be ensured considering the level and quality of information that the privacy policies provide. This problem is further exacerbated for those devices with no ad-hoc privacy policies.
To decide which eye-tracking devices for Metaverse environments should be the object of this analysis, two criteria were employed. First, VR devices were selected from among the bestselling ones, according to market research.Footnote 32 Second, since no comparable data were available for AR, the devices were selected following the framework of the paper ‘Speculative Privacy Concerns About AR Glasses Data Collection’ (Gallardo et al., 2023) and widely used optical see-through displays. Finally, the latest commercial versions of the devices available at the time of writing this article were chosen. Therefore, based on the above, the following devices were selected: Magic Leap 2, HoloLens 2, Meta Quest Pro, PlayStation VR2, HTC Vive Pro Eye, and Varjo XR-3. Table 1 shows a summary of the selection criteria.
3.4.1 AR Devices
Eye trackers have been integrated into AR glasses, as eye-tracking data can provide various benefits such as hands-free interaction (Lystbæk et al., 2022) and human behavior understanding (Chadalavada et al., 2020). Especially in the earlier days of such devices, practitioners came up with customized, plug-in eye-tracking solutions for AR displays, such as the Microsoft HoloLens (Kassner et al., 2014; Pupil Labs, 2022). While such approaches allow for low-cost and flexible solutions, their disadvantage is that they yield unstandardized sensor data, as the specs of the eye-tracking sensors likely differ from one setup to another. This complicates not only the development of solutions, due to the variety of sensor sampling frequencies, but also compliance with privacy policies.
3.4.1.1 Ad-hoc Privacy Policies
More recently, several AR device vendors have integrated eye trackers into their devices by default, including Microsoft HoloLens 2 (Microsoft, 2023), Magic Leap 2 (Magic Leap, 2023), and the newly announced Apple Vision Pro (Haeney, 2023), solving the issue of unstandardized eye-tracking data collection to a great extent. In terms of data access, while HoloLens 2 and Apple Vision Pro do not provide direct access to raw eye-image data, it is possible to get this data with Magic Leap 2 with relevant permissions.
According to their privacy policy,
Magic Leap 2 devices may collect eye tracking data including photos and videos of the eyes, pupil size and positions, gaze, vergence, center or rotation of eye, blink events, confidence of gaze and vergence, and other eye behaviors (e.g., fixation, smooth pursuit, saccades).Footnote 33
Such data is processed ‘locally, on-device’ and the device ‘does not collect, store, transfer or otherwise use eye tracking data in any manner’.Footnote 34 Furthermore, third parties, such as an enterprise offering the use of a Magic Leap 2 or the provider of an application installed on-device, might access or decide to collect and store the eye-tracking data. In such a case, the data is transferred to them directly, without being collected, stored, transferred, or otherwise used by Magic Leap.
Finally, regarding the access of third parties to eye-tracking data, AR device providers should guarantee that only the strictly necessary information is transferred (data minimisation).Footnote 35 Third parties should also comply with the mandates of the GDPR as far as the processing of eye-tracking data is concerned. Eye-tracking applications might be particularly prone to scenarios in which there is shared controllership or a controller-processor relationship between numerous entities processing the data. These concerns may include situations in which processors make crucial choices regarding data processing and, as a result, de facto act as controllers while “hiding” behind their role in order to avoid accountability; or situations in which processors repurpose data (for instance, data can be collected and used for visual interaction with AR environments but not for estimating sensitive user attributes, e.g., race or sexual preference), turning into controllers for the secondary processing. The legal regime applicable to joint controllers is stated in Article 26 GDPR. According to this article, joint controllers shall, unless and to the extent that their respective responsibilities are determined by Union or Member State law to which the controllers are subject, determine in a transparent manner their respective responsibilities for compliance with the obligations under the GDPR, in particular as regards the exercising of the rights of the data subject and their respective duties to provide information to the data subject.
Further, according to Cimina (2020), the CJEU jurisprudence provides further clarifications on the legal regime applicable to joint controllers: joint controllership may be established if an entity specifically influences the processing operation for its own goals, thereby taking part in the decision-making regarding the purposes and (crucial components of the) means; an entity can be regarded as a joint controller only insofar as it influences the purposes and (crucial components of the) means of the particular processing operations in which it engages; joint controllership does not automatically mean equal responsibility; even in the absence of formal guidelines or instructions, an entity can nevertheless have a significant impact by helping to determine the purposes and (crucial components of the) means of the processing activity being carried out; and being unable to access personal data does not rule out joint controllership.
Finally, from a data controller perspective,Footnote 36 the purposes of processing and the involvement of the different third parties and their access to the data should be clearly established throughout the whole development lifecycle.
3.4.1.2 General Privacy Policies
Moving on to the AR devices with no specific privacy policy, we will discuss HoloLens 2. A disclaimer should, though, be made before proceeding. The resort to general privacy policies was adopted because no device-specific privacy policies were found. This does not automatically imply that such privacy policies do not (always) exist; it only implies that, to the knowledge of the authors of this piece, they are not publicly available. This lack of public access already raises some transparency challenges, since consumers will not be able to make a properly informed purchase, at least from a privacy and data protection point of view, if this information is not publicly available. According to McDonald and Cranor, ‘[p]rivacy policies should help reduce information asymmetries because companies share information with their customers’ (McDonald & Cranor, 2008). However, as extensive research has shown, privacy policies have become, especially in the last few years after the introduction of privacy and data protection regulations, longer and harder to read (Wagner, 2022).Footnote 37 They seem to be designed to discourage the reader rather than to inform and educate. As McDonald and Cranor pointed out, ‘[i]f people feel less benefit reading policies than they perceive cost of reading them, it stands to reason people will choose not to read privacy policies.’ (McDonald & Cranor, 2008). This is also important considering that eye tracking, as shown in this paper, is a complex technology.
Further, the authors of this piece requested the abovementioned specific privacy policies, either receiving no answer, being told that there were no specific policies, or being directed to the advertisement pages of the product, which provided no further clarification in this respect.
As far as HoloLens 2 is concerned, the privacy-related information regarding eye tracking cannot be found in its general privacy policy,Footnote 38 but in the developers’ documentation on the introduction of eye tracking in HoloLens 2.Footnote 39 This might entail some problems from a privacy perspective, since (potential) consumers interested in the privacy features of HoloLens 2 have to make an extra effort. Instead of going to the privacy policy (the first reasonable place to look for information about the privacy features of a product), they need to go to the developers’ documentation (a source laypeople are unlikely to consult). Even if they do so, the information on that page is scarce:
[t]he Eye Tracking API has been designed with a user’s privacy in mind; it avoids the passing of any identifiable information, particularly any biometrics. For eye-tracking capable applications, the user needs to grant the app permission to use eye-tracking information.
First, the term passing is ambiguous from a data protection perspective, and it is unclear whether data transfer or data processing is meant. Second, regarding eye-tracking capable applications, consent needs to be given by the data subject for the information to be used. However, if such consent is given, what data exactly are collected, processed, and used, and for which purposes? The text does not go any further, leaving all these questions unresolved, with the consequent uncertainty for the data subject, who is nonetheless expected to consent to the use of eye-tracking information by capable applications. Such consent does not seem to comply with the GDPR requirements as discussed in this paper.
3.4.2 VR Devices
The evolution of eye-tracking devices followed a similar trend in VR as in AR, starting with customized solutions that could be plugged into existing VR devices in the earlier days, as in the case of the HTC Vive (Kassner et al., 2014; Pupil Labs, 2022). Later, different device vendors integrated eye trackers into their devices by default, such as HTC, Meta, HP, Pico, Varjo, and Fove. For instance, HTC has integrated Tobii eye trackers into its HTC Vive Pro Eye headset, where it is possible to track users’ eyes with a frequency of up to 120 Hz and an accuracy between 0.5° and 1.1° (HTC Corporation, 2023). Due to privacy issues, practitioners can only access gaze directions and pupil sizes. Like HTC, HP and Pico have also integrated Tobii eye trackers into their HP Reverb G2 Omnicept (HP Development Company, L.P., 2023) and Neo 2 Eye and Neo 3 Pro Eye headsets (Tobii, 2022a; Tobii, 2022b), with frequencies up to 120 Hz and 90 Hz, respectively. For privacy reasons, raw eye images are not provided by these devices either. Meta Quest Pro, a VR device from Meta, also integrates eye trackers (Meta, 2022a). Meta states that images of the eyes are only used for estimating the gaze direction and are deleted after processing. In addition, it is explicitly stated that eye images never leave the device, meaning that neither Meta nor third-party applications can access raw data that could directly identify users (Meta, 2022b).
Despite these attempts to protect privacy by limiting access to raw data, there is a trade-off between privacy and utility in terms of the quality of the gaze estimation task. Many state-of-the-art gaze estimation methods work in an end-to-end, data-driven fashion, meaning that they rely on eye images and videos rather than handcrafted features or intermediate feature representations. Therefore, raw eye images and videos are often preferred and required. Considering this, several device manufacturers followed a different path, such as Varjo and Fove. For example, the Varjo VR-3 and XR-3, Varjo’s high-end VR and XR devices, utilize eye trackers at up to 200 Hz while providing raw image and video data for each eye (Varjo, 2023a; Varjo, 2023b) through their software (Varjo Developer, 2023). Similarly, the Fove 0 also provides users with the opportunity to record raw eye-image data (Fove Inc., 2023) along with other eye-related information such as pupillometry and gaze vectors. Considering the discrepancies in the specs of the VR headsets and the expectation that the VR space will continue to grow with the ongoing Metaverse discussions, privacy risks should be very carefully considered from both a technical and a legal point of view.
3.4.2.1 Ad-hoc Privacy Policies
Among the VR devices incorporating eye-tracking technology for Metaverse environments, only the Meta Quest Pro has a dedicated privacy policy. According to META’s ‘Eye Tracking Privacy Note’,
the raw image data is processed in real time on your headset and deleted once processing has been completed. We do not collect or store raw image data from the eye tracking feature on Meta servers. The abstracted gaze data is generated in real time on your headset and processed on device or Meta servers […] If you choose to calibrate eye tracking, the calibration data is stored on your device until you’ve chosen to delete this data in your device settings or deleted your account.Footnote 40
Further, according to the same document, ‘[r]aw image data and calibration data are not shared with apps, even if you choose to allow access to eye tracking data.’ Therefore, such data will only be processed momentarily on the device. As previously mentioned, raw image data qualifies as biometric data, and thus as a special category of data, as long as it allows the unique identification of a person, and also as a special category of data as long as it reveals racial or ethnic origin or concerns health.Footnote 41 Further, a processing operation occurs irrespective of whether the information is subsequently deleted, although deletion is certainly the more desirable option.
Finally, the policy does not provide for the automatic deletion of the calibration data, but places this task on the user who has to choose whether to delete them. This provision might collide with the data protection by design and default requirements from Article 25 GDPR. According to the second paragraph of such article,
The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.
Placing the task of deleting the data in the user’s hands runs contrary to the obligation to ensure by default that only the personal data which are necessary are processed, particularly regarding the period of their storage. In this regard, automatic deletion of the data once the calibration process has been finalized would be more in compliance with Article 25 GDPR.
3.4.2.2 General Privacy Policies
Moving on to VR devices with a general privacy policy, we will now examine PlayStation VR2, HTC Vive Pro Eye, and Varjo XR-3. Regarding PlayStation VR2,Footnote 42 there is no mention of the processing of eye-tracking data. As previously pointed out, this makes the privacy-aware consumer’s choice extremely difficult. As far as HTC Vive Pro Eye is concerned, the most relevant part of its general privacy policy is the one referring to ‘Usage Data and Error Reporting’. According to this section, under the heading ‘Information we automatically collect’,
[i]n addition to the information we automatically collect, […] we may collect more detailed de-identified data about your device usage and error report data about your device […] HTC may re-identify this data when appropriate; for example, when you request technical support, choose to use a specific HTC service or app, or create an HTC Account. […] Usage and error data settings will not affect HTC’s collection of de-identified activation data or data about specific HTC apps and services you choose to use, including HTC Account, and are otherwise subject to limitations described in this Policy.
According to this section, HTC may re-identify the collected data when appropriate. This sentence, again, is vague. Do they mean they can re-identify an individual from the collected data? Or is it the data itself they claim they will re-identify? Further, even if they give some examples of what is considered appropriate, they do not establish a definition that allows the reader to discern other possible re-identification scenarios. Also, since the privacy policy is deliberately vague, we cannot know whether eye-tracking data are included among this potentially re-identified data. The section continues,
[o]thers, including third party analytics service providers and third party ad service providers, […] may also automatically collect the same or similar information about you and your device when you use the Services, including personally identifiable information about your online activities over time and across third party website and applications.
The risk of third-party access to potentially sensitive information arises again. In this case, the policy explicitly mentions that we can be dealing with personally identifiable information. However, again, we cannot truly discern if they are referring to eye-tracking data or other kinds of data.
Finally, as far as the Varjo XR-3 is concerned, the information about the data collected included in the general privacy policy was ‘name and contact information, other personal data necessary for maintaining the Business Partner relationship, information related to account registration, other information about subscriptions, electronic identification and behavior data’.Footnote 43 The only piece of information that can be more or less related to the Varjo XR-3 is the fact that the headset serial number will also be collected. Further, the privacy policy states that ‘[w]e do not collect or process sensitive personal data (personal data of special categories).’Footnote 44 The blatant lack of information in the policy is an obstacle to any further reasoning on the topic. However, considering the functionalities of the device, the abovementioned broad interpretation of both personal data and special categories of data, and the fact that the policy declares the processing of behavior data, among others, there is room to question whether no special categories of data are processed at all. In this respect, further research and a more open stance on the company’s part will be needed.
4 Policy Recommendations
This paper has reviewed the main privacy and data protection issues arising from the use of eye tracking in VR/AR devices for Metaverse environments, stemming from their privacy policies. Further, these issues have been confronted with some of the applicable GDPR provisions. From such an analysis, a series of recommendations is proposed in this section. These recommendations aim at the adoption of more privacy-enhancing approaches, in line with the data protection by design and by default requirements of Article 25 GDPR.
First, a new tendency towards differential privacy approaches for eye tracking has been spotted. Technical research is moving towards more privacy-aware attitudes and, consequently, a new stream of literature proposing differential privacy is currently developing (Bozkir et al., 2021; Steil et al., 2019). However, as stated in Part 2 of this paper, the privacy-utility trade-off is one of the weak spots of differential privacy. In reaction, the technical scholarship has proposed other solutions such as data downsampling or machine-learning-based methods. Therefore, more research should be encouraged on privacy-enhancing techniques that solve the privacy-utility conundrum without sacrificing the former. In addition to privacy-enhancing methods that add noise to preserve privacy, like differential privacy, private data representations that do not reveal user identities yet perform well in other tasks could be researched to preserve privacy in a similar way to cryptography, but in a computationally less expensive way, since real-time operation is essential for a good user experience in VR/AR.
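To make the noise-adding idea concrete, the core of a differentially private release can be sketched with the classic Laplace mechanism. The feature and sensitivity value below are hypothetical illustrations of ours, not taken from the cited works: an aggregate gaze feature (e.g., a mean fixation duration) is perturbed with noise calibrated to the privacy budget ε before being shared:

```python
import numpy as np

def laplace_mechanism(value: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy version of an aggregate gaze feature under
    epsilon-differential privacy. The noise scale grows with the
    sensitivity of the statistic and shrinks as epsilon (the privacy
    budget) grows, which is the privacy-utility trade-off in miniature."""
    scale = sensitivity / epsilon
    return float(value + np.random.laplace(loc=0.0, scale=scale))

# Hypothetical example: mean fixation duration of 240 ms, assuming one
# user's data can change the statistic by at most 50 ms (sensitivity)
noisy_duration = laplace_mechanism(240.0, sensitivity=50.0, epsilon=1.0)
```

A small ε gives strong privacy but noisier (less useful) outputs, while a large ε preserves utility at the cost of privacy, which illustrates why the literature keeps searching for alternatives to pure noise addition.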
Second, along the same lines, more interdisciplinary approaches to the use of eye tracking in AR/VR devices to access Metaverse environments should be encouraged. Technical research does not happen in a societal vacuum. Privacy and data protection regulations such as the GDPR, but also forthcoming regulations such as the EU Artificial Intelligence Act (AI Act),Footnote 45 will limit technical possibilities, going, in the most extreme cases, as far as banning certain technologies or certain uses of them. In fact, Article 3 of the AI Act contemplates within the definition of AI systems ‘a machine-based system […] that, […] infers, […] how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments [emphasis added]’.Footnote 46 Along the same lines, Article 5 AI Act prohibits AI systems employing manipulative or exploitative techniques.Footnote 47 Such practices, as established within Recital 29 of the regulation, ‘could be facilitated, for example, by […] virtual reality as they allow for a higher degree of control of what stimuli are presented to persons, insofar as they may materially distort their behaviour in a significantly harmful manner.’ Furthermore, Article 5(1)(f) prohibits the ‘use of AI systems to infer emotions of a natural person in the areas of workplace and education institutions, except where the use of the AI system is intended to be put in place or into the market for medical or safety reasons.’ As previously stated, eye-tracking devices could be used for accessing Metaverse workplace and educational environments. In those cases, when eye-tracking data is used to infer emotions, such use will be contrary to Article 5(1)(f). Finally, the use of biometric categorisation systems is either prohibited or considered high-risk. If eye-tracking devices act in this capacity (see Sect. 3), they should comply with the corresponding AI Act regime (either the prohibition or Article 16 and following AI Act).
Eye-tracking providers and developers should work hand in hand with regulators and policymakers to foster the development of more compliant solutions. Conversely, legal operators should be more in contact with the industry to understand the nature of these technologies, the technological advantage they might provide, the challenges they face and the new lines of research they are working on. Inviting all relevant stakeholders to the table will help ensure that regulatory decisions consider the complexities of real-world uses of these technologies.
Third, access to data by third parties should be carefully monitored as to what kind of data is shared with such parties and under which legal basis. This is especially important in the case of special categories of data since, as previously stated, such data merit further protection. Additionally, in the case of biometric data, the consequences of a potential data breach could be irreversible. Because of this, data protection by design and default criteria, including data security, data minimisation and storage/purpose limitation measures,Footnote 48 should be scrupulously implemented. Further, the third parties in question should also ensure compliance with the GDPR and not leave responsibility solely in the hands of eye-tracking providers.
Fourth, since there is no clear pronouncement from the supranational Data Protection Authorities (namely, the European Data Protection Board and/or the European Data Protection Supervisor) regarding whether eye-tracking data and eye images used by eye-tracking headsets and glasses constitute personal data, special categories of data and/or biometric data, guidance from these bodies would help ease the uncertainty for data subjects and eye-tracking providers. Considering the increasing attention that Metaverse environments and AR/VR headsets are gaining, this should be placed high on the priority list of Data Protection Authorities.
Fifth, privacy policies for eye tracking within VR/AR devices to access Metaverse environments should be transparent, readable and not overwhelming. Information should be provided clearly while acknowledging the complexity of eye-tracking technology and the risks involved in processing biometric and/or special categories of data. Special attention should be paid to children. As the gaming industry is among those benefiting most from the opportunities offered by VR/AR and Metaverse environments, children are expected to be one of the target customers of VR/AR devices with eye-tracking functionality. Even though their parents are expected to be involved and to manage the privacy and data protection requirements of the products acquired for their children, children should also be able to read and clearly understand the content of privacy policies.
Regarding privacy policies, one of the main aspects that this piece has highlighted is the blatant insufficiency of general privacy policies. Such policies are extremely wide and vague, and they do not allow the data subject to make an informed decision from a consumer perspective. Along the same lines, the lack of ad hoc privacy policies, or the insufficiency of general ones, in some devices may affect the validity of consent as a legal basis, due to a lack of information and transparency when providing certain information. Thus, this might affect the accountability of the data controllers and processors for a lack of compliance with the requirements of the GDPR.
Along the same lines, no dark pattern practices should be allowed, and legal operators should carefully monitor privacy policies and consent forms to detect improper practices. Otherwise, consent obtained through eye-tracking technology would not meet the conditions imposed by the GDPR, and the legal basis for data processing would be invalid. Accordingly, data processing by eye-tracking devices would not comply with the GDPR and would, consequently, be unlawful.
Finally, the use of the legitimate interest legal basis for data processing should be further explored by the relevant stakeholders, including scholars. Such a lawful basis might allow eye-tracking data processing in certain Metaverse environments, for instance those used for educational purposes.
5 Conclusions
The links to the consulted privacy policies can be found within the footnotes and references of this paper.
This paper has discussed the use of VR/AR eye-tracking devices for Metaverse environments in light of the EU GDPR. First, it has analyzed the main privacy and data protection implications that the use of eye-tracking technologies entails from both a technical and a legal perspective. Second, it has dug deeper into the nature of such technologies and the technological advantage that the addition of eye tracking brings to VR/AR devices. After that, the privacy policies of six AR/VR devices have been discussed. Such privacy policies were ad hoc for eye-tracking devices in some cases and general in the vast majority. Consequently, some privacy policies lack transparency concerning the kind of data that is being collected and processed. In general, it can be argued that eye-tracking data can qualify as special categories of data if they reveal information about ethnic origin, are biometric data used to uniquely identify a natural person, or are data concerning health. If that is the case, their processing is in principle prohibited unless a condition from Article 9(2) and a legal basis from Article 6(1) GDPR apply. Such bases could mainly be consent on the one hand and, for non-special categories of data, legitimate interest and performance of a contract on the other. For consent to apply, specific attention should be paid to dark patterns, information duties and privacy policies in general. Finally, the paper proposes six policy recommendations, including the need for more interdisciplinary research on privacy-enhancing techniques to solve the privacy-utility conundrum; careful monitoring of access to data by third parties, including data security and minimisation requirements; more guidance from supranational Data Protection Authorities on the processing of eye-tracking data; and more attention when designing privacy policies, especially for children.
Data availability
Data is provided within the manuscript or supplementary information files.
Notes
See Article 10 Charter of Fundamental Rights of the European Union.
Article 3(1) Charter of Fundamental Rights of the European Union.
See Article 25 GDPR.
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
C-434/16, Peter Nowak v. Data Protection Commissioner ECLI:EU:C:2017:994.
Paragraph 34.
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.
C‑582/14 Patrick Breyer v. Bundesrepublik Deutschland [2016] ECLI:EU:C:2016:779, Paragraph 49.
ibid.
(n 5).
COM (92) 422 final, 28.10.1992, p. 10 (commentary on Article 2).
Page 15.
Page 16.
ibid.
Page 17.
C‑184/20, Vyriausioji tarnybinės etikos komisija v. Fondas ‘Nevyriausybinių organizacijų informacijos ir paramos centras’ ECLI:EU:C:2022:601, Paragraph 125. See also C‑101/01, Bodil Lindqvist ECLI:EU:C:2003:596, paragraph 50.
Article 9(1) GDPR.
For a more extensive review on the nature of eye-tracking data as personal and biometric data, see (Sposini, 2024).
See, for instance, ‘Eye Tracking Privacy Notice | Meta Store’ https://www.meta.com/en-gb/help/quest/articles/accounts/privacy-information-and-settings/eye-tracking-privacy-notice/ accessed 27 October 2023.
Article 29 Working Party Guidelines on consent under Regulation 2016/679 Adopted on 28 November 2017 As last Revised and Adopted on 10 April 2018 p. 18.
ibid.
See In the matter of the General Data Protection Regulation DPC Inquiry Reference: IN-18-5-5 In the matter of LB, a complainant, concerning a complaint directed against Meta Platforms Ireland Limited (formerly Facebook Ireland Limited) in respect of the Facebook Service Decision of the Data Protection Commission made pursuant to Section 113 of the Data Protection Act, 2018 and Articles 60 and 65 of the General Data Protection Regulation Further to a complaint-based inquiry commenced pursuant to Section 110 of the Data Protection Act, 2018; In the matter of the General Data Protection Regulation.
DPC Inquiry Reference: IN-18-5-7 In the matter of TSA, a complainant, concerning a complaint directed against Meta Platforms Ireland Limited (formerly Facebook Ireland Limited) in respect of the Instagram Service Decision of the Data Protection Commission made pursuant to Section 113 of the Data Protection Act, 2018 and Articles 60 and 65 of the General Data Protection Regulation Further to a complaint-based inquiry commenced pursuant to Section 110 of the Data Protection Act 2018; Garante per la protezione dei dati personali, Provvedimento del 7 luglio 2022 [9788429] and European Data Protection Board, Binding Decision 1/2023 on the dispute submitted by the Irish SA on data transfers by Meta Platforms Ireland Limited for its Facebook service (Art. 65 GDPR) Adopted on 13 April 2023.
For an overview on the use of eye-tracking for neuromarketing purposes, see (Sposini, 2024).
ibid.
See Recital 26 GDPR.
See Article 5(2) GDPR.
See, for instance European Data Protection Board (2023) Guidelines 5/2021 on the interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR.
So far, the EC has adopted adequacy decisions for: Andorra, Argentina, Canada (commercial organisations), Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Republic of Korea, Switzerland, United Kingdom, United States (commercial organisations participating in the EU-US Data Privacy Framework), and Uruguay. See https://www.edpb.europa.eu/sme-data-protection-guide/international-data-transfers_en.
C-311/18 Data Protection Commissioner v. Facebook Ireland Ltd, Maximillian Schrems [2020] ECLI:EU:C:2020:559.
European Data Protection Board (2020) Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.
Jonah Trenker, AR & VR: market data & analysis, Market Insights by Statista (September 2023).
‘Magic Leap 2 Supplemental Terms and Conditions’ https://www.magicleap.com/eye-tracking accessed 25 October 2023.
ibid.
Article 5(1)(c) GDPR.
Article 4(7) GDPR.
See also ‘I Tried to Read All My App Privacy Policies. It Was 1 Million Words.’ (Washington Post, 31 May 2022) https://www.washingtonpost.com/technology/2022/05/31/abolish-privacy-policies/ accessed 24 July 2023; Marcus Moretti and Michael Naughton, ‘Why Privacy Policies Are So Inscrutable’ (The Atlantic, 5 September 2014) https://www.theatlantic.com/technology/archive/2014/09/why-privacy-policies-are-so-inscrutable/379615/ accessed 24 July 2023; and Kevin Litman-Navarro, ‘Opinion | We Read 150 Privacy Policies. They Were an Incomprehensible Disaster.’ The New York Times (12 June 2019) https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html accessed 24 July 2023.
‘Microsoft Privacy Statement – Microsoft Privacy’ https://privacy.microsoft.com/en-us/privacystatement accessed 4 November 2023.
‘Eye Tracking Overview - Mixed Reality’ (3 March 2023) https://learn.microsoft.com/en-us/windows/mixed-reality/design/eye-tracking accessed 4 November 2023.
‘Eye Tracking Privacy Notice | Meta Store’ https://www.meta.com/en-gb/help/quest/articles/accounts/privacy-information-and-settings/eye-tracking-privacy-notice/ accessed 27 October 2023.
See Recital 10 GDPR.
To complete the full picture regarding the use of eye tracking within PlayStation VR2, the privacy policy of Tobii Eye Tracker 5 was also analyzed. Again, its privacy policy was general (not ad hoc) and contained no reference to the processing of eye-tracking data. A request for a more specific privacy policy for Tobii Eye Tracker 5 was sent, but no answer was obtained.
‘Privacy Policy’ (Varjo.com) https://varjo.com/privacy-policy/ accessed 26 October 2023.
ibid.
Regulation (EU) of the European Parliament and of the Council laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) (last published version from 6 March 2024).
Paragraph 1.
Paragraph 1(a) and (b).
See Articles 5(1)(b) and (d) GDPR.
References
AEPD. (2022, September). Metaverse and privacy. Retrieved July 17, 2023, from https://www.aepd.es/en/prensa-ycomunicacion/blog/metaverse-and-privacy
Article 29 Working Party Guidelines on consent under Regulation 2016/679 Adopted on 28 November 2017 As last Revised and Adopted on 10 April 2018.
Björn, L. (2023). Regulating access and transfer of data. Cambridge University Press.
Bolognini, L., & Carpenelli, M. E. (2022). The future of personal data in the Metaverse. Zenodo.
Bozkir, E., Ünal, A. B., Akgün, M., Kasneci, E., & Pfeifer, N. (2020). Privacy preserving gaze estimation using synthetic images via a randomized encoding based framework. In ACM Symposium on Eye Tracking Research and Applications, Stuttgart, Germany, 2–5 June 2020.
Bozkir, E., Günlü, O., Fuhl, W., Schaefer, R. F., & Kasneci, E. (2021). Differential privacy for eye tracking with temporal correlations. Plos One, 16(8), e0255979. https://doi.org/10.1371/journal.pone.0255979
Bozkir, E., Özdel, S., Wang, M., David-John, B., Gao, H., Butler, K., Jain, E., & Kasneci, E. (2023). Eye-tracked virtual reality: A comprehensive survey on methods and privacy challenges. Preprint at https://doi.org/10.48550/arXiv.2305.14080
Bradford, A. (2019). The Brussels effect: How the European Union rules the world. Oxford University Press.
Cervi, G. V. (2022). Why and how does the EU Rule Global Digital Policy: An empirical analysis of EU Regulatory Influence in Data Protection laws. DISO, 1(2), 18. https://doi.org/10.1007/s44206-022-00005-3
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R., & Lilienthal, A. J. (2020). Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction. Robotics and Computer-Integrated Manufacturing, 61, 101830. https://doi.org/10.1016/j.rcim.2019.101830
Cimina, V. (2020). The data protection concepts of ‘controller’, ‘processor’ and ‘joint controllership’ under Regulation (EU) 2018/1725. ERA Forum, 21, 639–654. https://doi.org/10.1007/s12027-020-00632-8
Cortés, M. (2022). Analyses and insights on the potential impact of the metaverse on the education sector. Universitat Oberta de Catalunya.
Datatilsynet. (2018). Artificial intelligence and privacy.
David-John, B., Hosfelt, D., Butler, K., & Jain, E. (2021). A privacy-preserving approach to streaming eye-tracking data. IEEE Transactions on Visualization and Computer Graphics, 27(5), 2555–2565. https://doi.org/10.1109/TVCG.2021.3067787
de Hert, P., & Papakonstantinou, V. (2016). The new General Data Protection Regulation: Still a sound system for the protection of individuals? Computer Law & Security Review 32(2):179–194. https://doi.org/10.1016/j.clsr.2016.02.006
Dwork, C. (2006). Differential privacy. In M. Bugliesi, B. Preneel, V. Sassone, & I. Wegener (Eds.), Automata, languages and Programming (Vol. 4052, pp. 1–12). Springer.
European Data Protection Supervisor. (2019, January). Technology report No 1 smart glasses and data protection. Brussels.
European Data Protection Board. (2023, April). ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ version 2.0.
Europol. (2022). Policing in the Metaverse: What law enforcement needs to know. Retrieved July 17, 2023 from https://www.europol.europa.eu/publications-events/publications/policing-in-metaverse-what-law-enforcement-needsto-know.
Feroni, G. C. (2023). Il Metaverso Tra Problemi Epistemologici, Etici e Giuridici– MediaLaws. Retrieved July 17, 2023, from https://www.medialaws.eu/rivista/il-metaverso-tra-problemi-epistemologici-etici-e-giuridici/
Fove Inc. (2023). FOVE VR platform powering next generation VR & eye tracking applications. Retrieved July 26, 2023, from https://fove-inc.com/product/fove-vr-platform/
Fuhl, W., Bozkir, E., & Kasneci, E. (2021). Reinforcement learning for the privacy preservation and manipulation of eye tracking data. In I. Farkaš, P. Masulli, S. Otte, & S. Wermter (Eds.), Artificial neural networks and machine learning– ICANN 2021 (Vol. 12894, pp. 595–607). Springer.
Gallardo, A., Choy, C., Juneja, J., Bozkir, E., Cobb, C., Bauer, L., & Cranor, L. (2023). Speculative privacy concerns about AR glasses data collection. In Proceedings on Privacy Enhancing Technologies, Lausanne, Switzerland, 10–15 July 2023.
General Privacy Policy for the Tobii Group. Retrieved November 9, 2023, from https://www.tobii.com/company/privacy-policy
Georgieva, L., & Kuner, C. (2020). Article 9 Processing of special categories of personal data. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A commentary. Oxford University Press.
Gressel, C., Overdorf, R., Hagenstedt, I., Karaboga, M., Lurtz, H., Raschke, M., & Bulling, A. (2023). Privacy-aware eye tracking: Challenges and future directions. IEEE Pervasive Computing, 22(1), 95–102. https://doi.org/10.1109/MPRV.2022.3228660
Haeney, D. (2023). Apple vision pro apps aren’t allowed raw access to the cameras. In App Development. Available via UploadVR. Retrieved July 26, 2023, from https://www.uploadvr.com/apple-vision-pro-apps-dont-get-access-to-the-cameras
HP Development Company, L.P. (2023). HP Omnicept & HP Reverb G2 Omnicept Edition. Retrieved July 26, 2023, from https://www.hp.com/us-en/vr/reverb-g2-vr-headset-omnicept-edition.html
HTC Corporation. (2023). HTC vive pro eye specs. Retrieved July 26, 2023, from https://www.vive.com/sea/product/vive-proeye/specs/
Kassner, M., Patera, W., & Bulling, A. (2014). Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction. In Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: Adjunct publication, Seattle, WA, USA, 13–17 September 2014.
Kosta, E. (2020a). Article 7 conditions for consent. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A commentary. Oxford University Press.
Kosta, E. (2020b). Article 8 conditions applicable to child’s consent in relation to information society services. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A commentary (p. 0). Oxford University Press.
Kröger, J. L., Lutz, O. H. M., & Müller, F. (2020). What does your gaze reveal about you? On the privacy implications of eye tracking. In M. Friedewald, M. Önen, E. Lievens, S. Krenn, & S. Fricker (Eds.), Privacy and identity management. Data for better living: AI and privacy: 14th IFIP WG 9.2, 9.6/11.7, 11.6/SIG 9.2.2 International Summer School, Windisch, Switzerland, August 19–23, 2019, Revised Selected Papers (pp. 226–241). Cham: Springer International Publishing.
Kumar, A., & Passi, A. (2010). Comparison and combination of iris matchers for reliable personal authentication. Pattern Recognition, 43(3), 1016–1026. https://doi.org/10.1016/j.patcog.2009.08.016
Levy, S. (2022). Neal Stephenson Named the Metaverse. Now, He’s Building It. Wired.
Liebling, D. J., & Preibusch, S. (2014). Privacy considerations for a pervasive eye tracking world. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, Seattle, WA, USA, 13–17 September 2014.
Lim, J. Z., Mountstephens, J., & Teo, J. (2022). Eye-Tracking feature extraction for biometric machine learning. Frontiers in Neurorobotics, 15, 796895. https://doi.org/10.3389/fnbot.2021.796895
Lystbæk, M. N., Rosenberg, P., Pfeuffer, K., Grønbæk, J. E., & Gellersen, H. (2022). Gaze-hand alignment: Combining eye gaze and mid-air pointing for interacting with menus in augmented reality. Proceedings of the ACM on Human-Computer Interaction, 6(ETRA), 1–18. https://doi.org/10.1145/3530886
Magic Leap (2023). Magic Leap 2 devices. Retrieved November 08, 2023, from https://www.magicleap.com/ml2-devices
Marr, B. (2022). The amazing possibilities of healthcare in the metaverse. In Forbes. Retrieved July 17, 2023, from https://www.forbes.com/sites/bernardmarr/2022/02/23/the-amazing-possibilities-of-healthcare-in-the-metaverse/
Martin, B. (2022). Privacy in a programmed platform: How the General Data Protection Regulation Applies to the Metaverse. Harvard Journal of Law & Technology, 36(1).
McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies.
Meta. (2022a). Eye tracking on Meta Quest Pro. Retrieved July 26, 2023, from https://www.meta.com/en-gb/help/quest/articles/getting-started/getting-started-with-quest-pro/eye-tracking/
Meta. (2022b). Eye tracking privacy notice | Meta Store. Retrieved July 17, 2023, from https://www.meta.com/help/quest/articles/accounts/privacy-information-and-settings/eye-tracking-privacy-notice/
Microsoft. (2023). Eye tracking on HoloLens 2. Retrieved July 26, 2023, from https://learn.microsoft.com/en-us/windows/mixed-reality/design/eye-tracking
Plopski, A., Hirzle, T., Norouzi, N., Qian, L., Bruder, G., & Langlotz, T. (2022). The eye in extended reality: A survey on gaze interaction and eye tracking in head-worn extended reality. ACM Computing Surveys, 55(3), 1–39. https://doi.org/10.1145/3491207
Pollicino, O., & De Gregorio, G. (2023, February 6). European Data Protection and Social Media: The quest for consistency in the internal market. MediaLaws.
Pupil Labs. (2022). VR/AR, Introduction. Retrieved July 18, 2023, from https://docs.pupil-labs.com/vr-ar/
Purtova, N. (2018). The law of everything. Broad concept of personal data and future of EU data protection law. Law Innovation and Technology, 10(1), 40–81. https://doi.org/10.1080/17579961.2018.1452176
Rieger, G., & Savin-Williams, R. C. (2012). The eyes have it: Sex and sexual orientation differences in pupil dilation patterns. Plos One, 7(8), e40256. https://doi.org/10.1371/journal.pone.0040256
Schwartz, P. M., & Solove, D. J. (2011). The PII problem: Privacy and a new concept of personally identifiable information.
Sposini, L. (2024). Neuromarketing and eye-tracking technologies under the European Framework: Towards the GDPR and Beyond. Journal of Consumer Policy. https://doi.org/10.1007/s10603-023-09559-2
Steil, J., Hagestedt, I., Huang, M. X., & Bulling, A. (2019). Privacy-aware eye tracking using differential privacy. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, Denver, CO, USA, 25–28 June 2019.
Tobii (2022a). Pico Neo 2 Eye. Retrieved July 26, 2023, from https://www.tobii.com/products/integration/xr-headsets/deviceintegrations/pico-neo-2-eye
Tobii (2022b). Pico Neo 3 Pro Eye. Retrieved July 26, 2023, from https://www.tobii.com/products/integration/xr-headsets/deviceintegrations/pico-neo-3-pro-eye
Varjo Developer. (2023). Varjo Native SDK: Eye tracking. Retrieved July 26, 2023, from https://developer.varjo.com/docs/native/eye-tracking
Varjo.com. (2023). Privacy policy. Retrieved October 26, 2023, from https://varjo.com/privacy-policy/
Varjo. (2023a). Technical specifications of Varjo VR-3. Retrieved July 26, 2023, from https://varjo.com/products/vr-3/
Varjo. (2023b). Technical Specifications of Varjo XR-3. Retrieved July 26, 2023, from https://varjo.com/products/xr-3/
Wagner, I. (2022). Privacy policies across the ages: Content and readability of privacy policies 1996–2021.
Cases
C-101/01, Bodil Lindqvist ECLI:EU:C:2003:596.
C-582/14 Patrick Breyer v. Bundesrepublik Deutschland [2016] ECLI:EU:C:2016:779.
C-434/16, Peter Nowak v. Data Protection Commissioner ECLI:EU:C:2017:994.
C-184/20, Vyriausioji tarnybinės etikos komisija v. Fondas ‘Nevyriausybinių organizacijų informacijos ir paramos centras’ ECLI:EU:C:2022:601.
Legislation
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
Regulation (EU) of the European Parliament and of the Council laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) (last published version from 6 March 2024).
Funding
Open access funding provided by European University Institute - Fiesole within the CRUI-CARE Agreement.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of Interest
This research was partially funded by META.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Menéndez González, N., Bozkir, E. Eye-Tracking Devices for Virtual and Augmented Reality Metaverse Environments and Their Compatibility with the European Union General Data Protection Regulation. DISO 3, 39 (2024). https://doi.org/10.1007/s44206-024-00128-9