‘Reading your mind’: How eyes, pupils and heart rate could be used to target ads in the metaverse

Some experts are raising the alarm about what targeted advertising could look like in the Meta metaverse.
By Aisling Ní Chúláin

Brands are already staking out the commercial opportunities within the metaverse but some experts are raising the alarm about the implications immersive advertising will have for user privacy, safety and consent.


If we’ve learned anything about new means of communication over the last century, it’s that wherever technology attracts people’s eyes and ears, advertisers soon follow.

It’s been the case with radio, cinema, TV, the Internet and social media, so it seems almost inevitable that it will also be the case in the so-called metaverse - the fully realised, shared virtual universe that companies like Meta are proposing to build.

In what is perhaps a sign of things to come, a host of brands have already dipped their toes into gaming metaverses, hosting virtual fashion shows and dropping exclusive collections in-game.

Luxury fashion houses like Louis Vuitton, Valentino and Marc Jacobs have all designed digital items for the social simulation game Animal Crossing - and Balenciaga has collaborated with Fortnite on an exclusive drop of wearable skins for in-game characters, to name but a few.

‘Think about it as placement in the product instead of product placement’

But now that Meta, a targeted advertising powerhouse, has staked its claim to the metaverse, some experts are raising the alarm about the specific implications immersive advertising will have for user privacy, safety and consent.

“When you think about advertising in XR, you should think about it as placement in the product instead of product placement,” Brittan Heller, counsel with American law firm Foley Hoag and an expert in privacy and safety in immersive environments, told Euronews Next.

“The way that advertising works in these contexts is a little different because you seek out the experiences. You like the experiences,” she explained.


“An ad in virtual reality may look like buying a designer jacket for your digital avatar [but] that's an ad for a clothing company that you are wearing on your body”.

“It may look like buying a game that puts you into Jurassic Park - [but] what better way to advertise the movie franchise than to actually put you in the experience of being in Jurassic Park?”

What is biometric psychography?

The problem, according to Heller, is that the metaverse’s capability for harvesting biometric data, and for using that sensitive data to target ads tailored to you, goes far beyond the considerable amount of data Facebook already uses to build our consumer profiles.

If the technology that Meta is promising comes to fruition, a form of targeted advertising that tracks involuntary biological responses could proliferate.


For VR headsets to work in this environment, Heller says, they will have to be able to track your pupils and your eyes.

This means advertisements could be tailored according to what attracts or holds your visual attention and how you physically respond to it.

Heller has coined a term for this combination of one’s biometric information with targeted advertising: biometric psychography.

If an entity had access to biometric data such as pupil dilation, skin moisture, EKG readings or heart rate - bodily responses that occur involuntarily in reaction to stimuli - and combined it with existing targeted advertising datasets, it would be “akin to reading your mind,” Heller said.

“The type of information you can get from somebody's pupil dilation, for example - that can tell you whether or not somebody is telling the truth. It can tell you whether or not somebody is sexually attracted to the person that they're seeing,” she explained.

“We're rapidly moving into a space where your intentions and your thoughts are substantial data sets that have technological importance in a way that they didn't before”.

“The risk that I think we've learnt from Cambridge Analytica is that privacy risks come into play when you have the combination of unanticipated data sets, especially when you're looking at emerging technology”.


Regulating the metaverse

Heller believes that biometric laws in the United States are insufficient to protect users from the use or misuse of this kind of data because “biometrics laws in the States are defined by protecting your identity, not protecting your thoughts or your impulses”.

With the metaverse, the risk remains that the pace of technological development will outstrip institutions’ ability to regulate it effectively, as has arguably been the case with social media platforms.

Given that the companies hoping to build the metaverse are multinational and operate across borders, Heller believes the most effective way to address these issues of user protection is a “human rights based approach”.

“There are many stakeholders in this, there's civil society, there are public groups, there are governments and then there are intergovernmental organisations as well,” she explained.

“A human rights approach has been the way that we've been able to bring all of these players and their concerns together and make sure that everybody is heard”.


But what can companies do to protect people in the metaverse?

If tech organisations are serious about guaranteeing users’ digital rights in immersive environments, much will depend on their being open about the technology they are developing.

“I would want companies to be more transparent with the functionality of their technologies, not just their intentions and their business plans, but how this will work,” Heller said.

“That will help lawmakers ask the questions that they need to protect the public and to cooperate with each other for transborder technology”.

