Euroviews. Big tech achieving “quantum supremacy” for the first time is bad news for our privacy ǀ View

Copyright Google/Handout via REUTERS
By Jamal Ahmed
The opinions expressed in this article are those of the author and do not represent in any way the editorial position of Euronews.

Google announced last week that it had reached “quantum supremacy,” a technological achievement with huge repercussions, not only for the company and its role in the world but for all of us individuals who want to maintain a semblance of the right to privacy.



Google researchers have developed a processor called Sycamore, which is exponentially faster than a “standard” supercomputer at certain tasks. The workings behind Sycamore are what make it such a breakthrough: it completed in 200 seconds a computation that Google estimates would take a classical supercomputer 10,000 years.

We should all be very concerned that an industry with a questionable track record on data protection, privacy and political neutrality now has access to the world’s most powerful computer.

There is still time for our governments to play catch up and protect consumers. Although Google’s Sycamore is advanced, it is still not capable of fulfilling every data scientist’s deepest desires.
Jamal Ahmed
Founder of Kazient Privacy Experts

The charge sheet against Facebook, rather than Google, is the longest – and is still growing. There have been long-standing concerns about the amount of data Facebook is harvesting from its users and what it would (or could) be used for. However, these issues are systemic and industry-wide, and in my opinion, the scandals involving Facebook in recent months and years could just as easily have affected Google, or perhaps even Microsoft.

These issues came to a head around the Cambridge Analytica scandal, in which Facebook was implicated in allowing the political consulting firm Cambridge Analytica to harvest a huge amount of personal data, including political preferences, knowledge that was then used to meddle in the 2016 US presidential election. Now that the processing power available to manipulate and exploit large amounts of data has increased, the stakes around what big data can be used for are higher than ever.

The industry, however, doesn’t seem to accept these dangers. The implicit aim of tech companies is to acquire more users, more data, and ultimately more advertisers. The symbiotic relationship between these three factors underpins most tech companies’ business models, including the current wave of startups in Silicon Valley and elsewhere.

This will not change. But what must change is the regulation around data security and its implementation and enforcement. Regulation is, by and large, already present: in almost every developed country, it is illegal for someone to hold data without a range of rigorous checks and balances on how it is sourced, held and transferred between parties.

International frameworks, such as the European Union’s General Data Protection Regulation (GDPR) and the EU-US Privacy Shield, mean that data can only achieve “freedom of movement” by fulfilling strict criteria.

The largest data owners like Facebook and Google tend to follow these rules closely, meaning that the main concern is not control of data, but the data’s actual power. Big data can already predict an individual’s consumer habits and personal desires to a somewhat eerie extent. As processing power grows exponentially, will we have Facebook ads that can penetrate deeper and deeper into our lives and consciousness? What will be the effect on our mental health? Our family relationships? And at the macro level, our economies?

Neither the industry nor the regulators appear to be asking these deeper questions. Yet they will inevitably become relevant as processing power increases; it is a matter of when, not if.

There is still time for our governments to play catch up and protect consumers. Although Google’s Sycamore is advanced, it is still not capable of fulfilling every data scientist’s deepest desires. The Sycamore chip is a 54-qubit processor. That is relatively limited, and is one of the many reasons that the discovery is not practically useful. Researchers want a 100-qubit - or even 200-qubit - system before they are really able to put it to the test and see whether the dreams of quantum computing are realised.

Rather than just controlling data transfer, it is time for a wider conversation about data usage. Which uses of data - regardless of who owns it and how it has been sourced - are ethical and safe? And which are unethical and dangerous?

While lawmakers like US congresswoman Alexandria Ocasio-Cortez seem to relish grilling tech executives like Mark Zuckerberg on the minutiae of data usage, I hope we do not lose sight of the bigger picture. The stakes are too high, and the processing power now too great, for us to be complacent.

____________

Are you a recognised expert in your field? At Euronews, we believe all views matter. Contact us at view@euronews.com to send pitches or submissions and be part of the conversation.

