Chelsea Manning and Data Privacy take centre stage at Geneva human rights film festival

Can you construct a life from Google search data? Copyright Jens Meyer/AP
By Jez Fielder

"We turned your life into a film, and your data was the script."


Hans Block's unnerving documentary 'Made to Measure' was screened on Saturday 5 March at the 20th edition of Geneva's human rights film festival, FIFDH.

The fascinating piece poses the question: is it possible to reconstruct a person's life based solely on their Google searches?

'Behavioural residue' is the central element here: the little data traces we leave as we go about everyday life. The film shows us how we leave them, and what the rest of the world can deduce about us from them.

The premise is an experiment that uses machine learning to analyse our digital footprint and then, crucially, to create a psychological profile from that data.

The filmmakers used a volunteer as a case study, profiling her over the course of five years and building to a theatrical finale in which her digital life is, as it were, inhabited and performed back to her by an actress.

We are all aware of how valuable algorithms are to online retail. But can a profile built from our search history reveal much deeper truths about who we really are? Can it answer questions like: what am I afraid of? What would I be willing to fight for?

What we search for is big business. Take health, for example. Our searches are monitored, and the results are compiled and sold to pharma and insurance companies by data brokers. It's happening to all of us.

The mapping out of this woman's life turned into an extremely emotive piece of filmmaking when the search profile revealed a miscarriage. Moved to tears, the subject had to take a break.

"That's what it was like. Not at all in this context but still. That's insane," she said, shocked at how her reality had been pieced together and given life due to a dataset.

"That's what it was like," she repeated to the actress opposite her. "You really nailed that moment."

Later on she said she felt she had "been wiretapped."

Post-experiment, the re-enactment took on a strange duality with the subject's own memories of her past. "I can't go back to the real experience of my life. It's like reality has blended with the story. I can't get to that place anymore. I'm mixing it all up and I don't even know what's true."

The actress felt a similar disturbance. "It felt creepy to be the doppelganger."

Chelsea Manning at FIFDH

After the screening of 'Made to Measure', former US Army intelligence analyst Chelsea Manning, who is now a hardware security optimisation expert, was at FIFDH to take part in a Q&A session on the topic of data privacy.

Manning in 2019. Cliff Owen/AP

When asked if the film was accurate in terms of how social media platforms soak up your digital traces, Manning said yes.

"Especially with machine learning, you're able to make connections with and between things that a human being wouldn't normally associate with being connected."

"This is the dystopia we signed up for. We click 'agree'," said Manning.

Block's documentary is interspersed with expert opinion on the dangers at work within these algorithmic systems.


"The search and explore function is being exploited by these online algorithms to keep us swiping, tapping, scrolling, to find some kind of novel stimuli," says Anna Lambke, an addiction researcher at Stanford University."

Poster for the FIFDH 20th edition. FIFDH

Surveillance Capitalism

"There are good methods of surveillance, this idea that if we give a little bit of info that's extremely private but we do it for a good purpose it'll be okay, but that's normally a gateway for abuse and use down the line," said Manning. "It's creating a more palatable means of allowing surveillance into your life."

"We're going to surveil you so you become the product," she adds.

So, do algorithms know us better than we know ourselves?

Why shouldn't they? It's a multi-million euro industry. Data brokers are more interested in us than any of our friends are.


Many lines in 'Made to Measure' will strike you, but the one that stayed with me was the idea that 'Instagram knew I was pregnant before I did.' Our subconscious informs our deepest fears, inclinations and thoughts, and it shapes our search behaviour.

How can we stop it?

"The authorities who have been given the power to do something about it... have done nothing," says Data Privacy expert Johnny Ryan.

Manning takes up the baton. "Are we expecting someone else to come along and solve this problem?" she asks before stating that the onus is on "the people that designed this stuff."

"Doctors have obligations, lawyers have ethical obligations. They have consequences if they don't abide by certain ethical standards and protocols... Why is it that the people that design and develop this technology don't have these standards?"

Manning's current work with Swiss company Nym Technologies is helping to create a more secure network for data sharing. "It creates an envelope or a tent around this traffic," she explains.


Data back at you

What I found most masterful about the 'Made to Measure' project was that, as I watched, the data I was giving away (my answers to the on-screen questions, how often and where I paused the film, which parts I replayed, the speed at which I clicked) was being recorded and analysed. Chillingly, I was then treated to a read-out of my profile, a lecture on what the data had concluded about me. I won't share it with you, but some of the observations were striking (I flippantly thought about how horoscopes can dupe people, which was oddly reassuring), and I was then given the option to download the profile and upload it to a data broker. One part I will give away: "We think you are only reaching 37% of your potential. We know that is not what you want." A neat psychological confidence trick? Maybe. Maybe not.

FIFDH at 20

Over 10 days, diplomats, NGOs, victims, artists, philanthropists, activists, journalists, decision-makers, and the general public are invited to FIFDH to debate their views on human rights violations across the globe.

Prominent personalities who have participated in the debates include Nobel Prize laureates Shirin Ebadi, Joseph Stiglitz, Tawakkol Karman and Dr Denis Mukwege; High Commissioners Michelle Bachelet, Louise Arbour and Zeid Ra’ad al Hussein; Human Rights Watch Director Kenneth Roth; former NSA analyst Edward Snowden; writers Chimamanda Ngozi Adichie, Arundhati Roy, Nancy Huston and Joe Sacco; activist Angela Davis; artists Ai Weiwei and JR; and diplomats and politicians Svetlana Tsikhanouskaya, Leila Shahid and Samantha Power, alongside many victims and actors on the ground.
