With Watergate, Carl Bernstein and Bob Woodward of The Washington Post taught us that when political power uses technology to spy on political rivals, “democracy dies in darkness”. With Catalangate, the Citizen Lab researchers have taught us that in today’s world “democracy dies in the cloud”. And that is precisely the title of the book by Josep Maria Ganyet, a computer engineer, businessman and technology populariser from Osona who was one of the victims of espionage in the Catalangate affair. The book introduces us to the surveillance capitalism practised by big tech companies, shows how security agencies have taken advantage of it to build a permanent record of citizens’ activity, and examines the consequences this has for democracy. The book (La Magrana, 2023) also warns about the dangers of espionage through new technologies and the need for regulation to protect citizens’ individual rights.
Did you write the book, or ChatGPT?
[Laughs] No, I wrote it myself, but it’s true that when I was finishing it, around November, ChatGPT hit the media and, I don’t deny it, I tried it. As an experiment, I fed it the subject of the last chapters and the conclusions I hoped to draw from them. The result was that the texts were very flat, without any really deep reflection. They lacked ethics, irony and the double meaning I like to use. Not interesting at all. Also, editing, revising and fixing the output took far longer than if I’d written it myself. It’s true the texts are above average, but they lack soul.
And who is to say that I, your interviewer, am not AI (artificial intelligence)?
Well, why not? You could be. I can’t know. We don’t know each other and we’re doing the interview over the phone. There are platforms that synthesise the voice and someone could have recorded their voice and there could be a ChatGPT behind it asking the questions. In the technological times we are currently living in, I don’t believe I could detect whether I’m talking to a person or not.
We always talk about the cloud..., but what is it?
A very easy way to understand it is this metaphor of the cloud, which is open, ethereal, above our heads; it’s available, it saves our data and makes it accessible from anywhere in the world. All of this is very good, but in reality the cloud comprises concrete buildings, large data centres of reinforced concrete, steel and glass, which consume a lot of energy and are here on the ground. They weigh a lot and, unlike clouds, which don’t belong to anyone, they do have an owner. Summed up in one sentence: the cloud is someone else’s computer. When you keep things there, they cease to be one hundred percent your property; you waive a number of rights. We’re making a copy on someone else’s computer, be it Google’s, Apple’s, Amazon’s, Netflix’s... And that data will no longer be just yours. You have shared custody of it, and you have to trust that the someone else who owns that computer makes good use of it. The data generated about that data, the metadata, is not yours either.
Would it be an equivalent to what Orwell called Big Brother?
Not necessarily. Honestly, we hand over the data under certain conditions. Making these computers work has very large costs; saving this data involves a lot of economic effort, and in one way or another these services have to be paid for. Let’s think about what the services of Google Maps, YouTube, Netflix, Amazon must cost... Some are free, but our data – the records of the activities we generate, the metadata – is traded, sold, packaged and classified. It has a lot of value, and brands pay a lot of money to know what we are like. In an anonymised way, maybe, but they pay a lot of money to have it. It’s interesting to weigh what we give against what they give us in return.
So we can’t talk about companies ruling from the shadows.
Exactly, it’s a tacit agreement: we don’t look at the conditions, nor can we understand them. I give the data in exchange for an excellent service. I’m thinking of Google Maps or Google Earth... But, going back to Big Brother, we can refer to it when it comes to monopolistic policies or manipulation of public opinion through this great knowledge they have of us. Then we could talk about a kind of Big Brother. In the end, we’re all watching over each other, we’re a family of Big Brothers. In the book, I talk about it in a chapter about the power we have on social media and what happens if we misuse it.
A handful of data that, in the hands of others, allows them to influence how we think because they know what we think?
There are two ways to do it, which I explain in the book. One is when Google realises it can predict the future, or where we will click next – through our searches it already senses our tastes: “These people are sure to buy a car before they buy sports shoes”. The other is when Facebook realises that not only can it predict the future, it can also influence it. In 2014, an experiment was carried out with 6,400 users, divided into two groups. One was shown positive news on their Facebook wall, the other negative. Those with negative news posted comments like “It’s Monday, back to f****** work.” And the others, “It’s Monday, great, I get to see my coworkers again!”
We’re being manipulated, then.
It’s the contagion effect in groups and societies, which occurs so much on social networks. That’s how you can understand what’s going on in the world: how a character like Trump can become president, the rise of extremism, and phenomena like flat-earthism. Understanding this, they will serve us the content that keeps us glued to the network longer. Note that I say “the content that makes us spend more time on the network”, not “the content we like most”. The most logical thing would be that if people like kittens, you offer them kittens. Well, that’s not what happens! Social media algorithms have learned that it’s much more efficient to generate clicks by giving you content that polarises you to one side or the other. Because when you’re very biased towards a political party, for or against, or you’re very progressive or very conservative, it’s more likely you’ll click on what’s closest to you but also on the opposing pole, whether to criticise it or troll it. Therefore, the more extreme the visions promoted, the more money made, and social networks have opened a rift in the social fabric with our collaboration.
Who is it that encourages us to polarise?
There are three major power groups. Silicon Valley sets the pace, to the point that if you want to campaign against breast cancer, you can’t, because it doesn’t suit them. Then there is the great power that states have and use to spy on us, attacking or even exterminating dissent, as is the case with China or Spain. And, thirdly, we ourselves also have great power: we can destroy democracy with the cloud, as we said, or, if we manage it well, it can become a fantastic, open tool for participation. The paradox is that these tools, the most participatory and open we have ever had, are capable of destroying democracy.
Another dilemma is security versus freedom. You’ve suffered due to this, haven’t you?
Yes, but Catalangate is only the tip of the iceberg. As a privileged witness on the front line, I can say that this can happen to anyone in any corner of the world, subject as we are to big tech companies’ changing interests and to this imbalance of power at the whim of states. As far as I knew, there had been attempts to get into my mobile, but I didn’t know whether they had succeeded. When Citizen Lab analysed my mobile, they confirmed my suspicions.
Is it lawful for states, corporations, law enforcement agencies to use this data against our freedom?
I am happy to offer it to them, just as I donate blood for the common good. But they can’t tell us that they use our data to serve us better and then actually use it solely to benefit themselves. It’s not the same for Facebook to use facial recognition to establish ties between people we know as it is to offer that data for military use.