On Privacy in Our Technical World


“It began back in the early 90s, when there was this wonderful thing called the Internet,” says Alexander Hanff.

He was smitten from the get-go.

A technology guru and one of the world’s most vocal privacy advocates, Alexander became enamored with the power of technology before going out on his own at the age of 17, far beyond his modest English roots. But soon enough, he would also see that there was a dark side to that power, and one he would go on to challenge with the same passion with which he had embraced technology in the first place.

“Where I grew up in the UK, nobody really had any opportunity to do anything or go anywhere,” says Alexander. He couldn’t afford university, but he found a way to get there anyway. “I used to sneak into my friend’s computer science lectures,” he says, “and use his access card to get into the labs, where I played around with computers and discovered such technologies as Archie and FTP. … I spent literally days on end without sleep, trawling through massive amounts of information, talking with people all over the world, and just realized that the world was a much smaller place than it was where I’d grown up.”

Over the next 15 years, Alexander became immersed in tech, traveling the world and working for various startups and institutions. He went back to university to study sociology and the impact of technology on society. But in 2006, something happened that changed his focus.

Just two months before Alexander was to deliver his thesis, he became aware of a company called Phorm, which was using behavioral advertising technologies within the ISP network. Using an advanced interception technology called Deep Packet Inspection, Phorm would intercept all of the web pages that BT, TalkTalk, and Virgin Media customers visited and then scan them for keywords. It would even scan what people searched for on Google. The goal was to target these customers with advertising – without their knowledge.

Alexander had become aware that technology was changing in ways that were potentially damaging to society. “Something which to me had been very empowering and gave me access to knowledge and resources and people I would never have dreamed I would have access to with the background I came from – was really just monetizing people, and controlling people,” Alexander said. “So it’s the commercial Web as we now know it vs. the original spirit of the Internet that launched the Web back in the early 90s.”

Alexander launched a grassroots campaign against Phorm that would become one of the most disruptive campaigns ever seen in the privacy space, and one that would plant the seed for rising awareness and global concern for privacy. Tens of thousands of complaint letters were sent to the European Commission. The campaign eventually shut Phorm down in Europe, but perhaps more importantly, it led to sweeping changes in EU law. For a start, it changed the Privacy and Electronic Communications Directive – what people now refer to as the “cookie law.” It also laid the foundations upon which GDPR was born.

That campaign thrust Alexander into the spotlight.

“When you get an opportunity to have an impact on things like that, you have to take it,” he says. Alexander worked for three years at Privacy International, heading up their e-privacy portfolio, where he was involved in some very controversial cases, such as the Google Wi-Fi scandal, and later served as a special advisor to the European Parliament. … Long story short, says Alexander, “there hasn’t been anything in the realm of privacy that I haven’t been involved in over the last ten years. It’s been a crazy ride.”

Today, Alexander is still heavily involved in advocacy work as co-founder of Think Privacy, which focuses not just on GDPR, but also on ePrivacy and the principles of data protection and privacy. Think Privacy looks at taking companies beyond basic compliance, helping them benefit from strong data ethics and gain a competitive advantage as a result. He lobbies frequently in Brussels on ePrivacy and GDPR. Recently, he filed his first formal complaint under GDPR against the European Data Protection Board – because its website was not compliant with GDPR and ePrivacy. After discussing the complaint in a plenary session in early July, the Board decided to comply with the GDPR and ePrivacy Directive, despite not being required to as a European body. Alexander filed the complaint because he felt that the European regulator should lead by example and not hide behind a legal technicality.

Alexander shares some of his insights on the evolving relationship between Privacy and Tech.

Q: What do you think about technology and mainstream Social Media – The implications and dangers?

A: We live in a curated world. Our news is being curated in the traditional channels (Fox News is a prime example), in the information delivered to us by Google and various search engines, and of course in the stuff we’re seeing on Facebook, Twitter, and other social networks. They’ve moved from chronological news to things they think you’re going to find relevant. That to me is a very dangerous path to go down, because it narrows our vision into a focal point that we haven’t chosen. It has been determined for us by an algorithm. And that to me is incredibly dangerous and very troublesome and very worrying, because some of the greatest ideas have come out of nowhere. They’ve come from the periphery. And if that periphery no longer exists, what does that say for us as we move forward as a species that likes to create and invent and evolve? How is this very focused, narrow view going to affect us? We have no periphery any more.

Q: How has the outlook on digital privacy changed over the years?

A: I’ve been campaigning on e-privacy and the ethical use of technologies for more than a decade. Every day I go to a new website and see cookie options that are set off by default instead of turned on by default. That to me is a real indication of how things have changed compared to before May 25, which I see as a really positive thing – the message is starting to get out. News just came from Apple that they’re going to start doing some very proactive blocking technologies in the Safari browser. … Now we hear stories of people like Edward Snowden and the PRISM scandal and all of the revelations that he came up with, and then Cambridge Analytica, and now that Facebook shared their data with hardware vendors – the stories are just piling up, which is raising awareness, causing a change in attitude and a change in behavior. So it’s a very positive time to be involved.

Q: How do you find the difference in the attitude or response of people abroad vs here in Europe?

A: I don’t see any difference. People are people all over the world. People have the same concerns all over the world.

Q: Do you think Americans are as concerned about privacy?

A: As individuals, yes, but as companies, not so much. Once people understand how their data is being used, they become increasingly concerned. So the biggest problem that we had until very recently, until about 2015, was awareness. People were not aware of what was happening. There was some interesting research done by Pennsylvania State University and Carnegie Mellon back in 2009 about whether or not people trusted advertising, and … I think it was like 86% of citizens were concerned about their data being used for behavioral advertising.

But there was this incredible naiveté: people believed that the brands we used in our everyday lives would never do these things, and they also believed that there were laws in place to protect them from anything bad happening. And they were wrong on both counts; some of the biggest brands that they trusted and used on a day-to-day basis were doing exactly these things, and there were no laws in place to prevent them from doing it.

There’s a cycle now, which is very positive … but we still have issues where enforcement is not sufficient to encourage companies to behave a certain way. That’s changed a bit with GDPR … but even then the really big corporations will probably continue regardless. Companies like Facebook, who have been utterly blatant in their refusal to adhere to GDPR. Their recent changes to privacy notices, where they gave you no choice as to whether or not you would allow them to profile you or track you for advertising purposes … the way they’re getting people to consent is fundamentally against the principles of GDPR. The fact that they would do that so soon after the Cambridge Analytica scandal was, to me, a very significant sign that Facebook, to be frank, doesn’t give a S#@T.

I mean, they’re already in breach of their 2011 consent agreement with the FTC. There’s a lot of pressure at the moment. There’s more investigation going on, with the New York Times story about Facebook giving device manufacturers access to user data. That’s going to add more pressure on the FTC with regard to that consent agreement. That could potentially cost Facebook a lot of money. Obviously there will be more investigations in Europe. We’ve got Max Schrems, who has already filed a case against them under GDPR. There will be many other people filing complaints against them. So you could see a situation where their share value suffers significantly and they’re forced to change their ways in order to fulfill their responsibilities to their shareholders.

Q: It seems that people don’t believe that bad things can happen, even when they hear the stories. They say, “wow that’s terrible,” and then continue to use Facebook as they always have.

A: It’s a social trap, because bad things have happened to people on Facebook. There were situations when Facebook first introduced location sharing: people were being stalked, and people’s houses were being broken into when it was known they weren’t going to be at home. There have been situations with bullying on Facebook that have led to people committing suicide – tragic ends to people’s lives – as a result of too much information being shared with people who shouldn’t have had access to it. These stories are continuing to increase. And the harm that comes from them continues to increase as well. But there’s this false dependency. People have invested so much into these giant social networks that to leave them would be a significant detriment to their lives.

That doesn’t mean that a move away from the large platforms won’t eventually happen … and we would hope that regulators and legislators would come down on Facebook, and certainly attempt to break them up in some way. But until there are competing alternatives like Idka – and a critical mass – we have this situation where effectively people’s lives have been captured, their social networks have been captured, by this one company.

Q: Many online group moderators are keen to provide a “public” platform so that people from all over are aware that there is a place to go for support. I think that people are addicted to the notion that they can get their messages out to the world.

A: It’s a false ideology, because they’re NOT getting their message out to the world. There’s so much noise on Facebook that very little of it actually gets out to the world. Most of it stays in this “walled garden” and goes no further than maybe 1,000 users, if you’re lucky. If you get incredibly lucky and manage to create something that goes viral, that’s another thing, but that is such a low percentage of content – and not a result of being on Facebook at all. You would have just as good a chance of getting something to go viral on any other medium, whether it’s a forum, YouTube, Twitter, Reddit, whatever. It’s got nothing to do with being on Facebook.

Q: So there are alternatives out there. And there’s concern. But, it’s still tough for people to make the switch?

A: It’s a problem I’ve faced for a long, long time, but it is possible; it can be done. You just have to really persevere with the message.

With all the events around Facebook, there’s significant concern in the public sector about the use of Facebook for organizational pages or resources for the general public. They should never have been there in the first place! It’s ludicrous … One of the things that drove me insane: I was at an event in Brussels, and the interview with the president of the European Parliament was streamed on Facebook Live. This was during the time when the Cambridge Analytica story was unfolding. They talked about the problems with Facebook in the hearings, and then went and streamed an interview on Facebook Live. It was ridiculous. To hear what that politician was saying, you had to subscribe to Facebook just to access that media.

We also have case law from the Court of Justice of the European Union now, which places joint liability on organizations that use third-party platforms as part of their commercial infrastructure.

Q: And yet, people are still saying, “I don’t have anything to hide.”

A: Everybody has something to hide. I’m a faculty member at Singularity University, where I speak on data ethics, and one of the things I say to people is: in the summer months, when it gets warm, you don’t walk around naked, do you? No, you put the air conditioning on or something. We all value our privacy. So to say you’ve got nothing to hide is just ludicrous. Having something to hide is not the same as having done something wrong.

Q: Some people think that privacy is against freedom of speech.

A: Back in 2010, I was speaking about freedom of speech and privacy in Beijing. And, as you can imagine, this was a somewhat difficult situation to be in. And I told them then that you cannot have freedom of speech without privacy. In psychology this is a phenomenon that’s been talked about for decades: if people know they’re being observed, they change their behavior. If they know they’re being surveilled, they change their behavior – the Hawthorne effect. The idea that freedom of speech exists without privacy is just ridiculous.

Q: You’re very much involved in European tech. What about our relationship to Silicon Valley and the rest of the world?

A: One of the other things I’ve been doing over the past ten years is raising the issue with the European Commission and various other institutions regarding this umbilical cord we have with Silicon Valley, and the fact that we need to invest in local infrastructure and local goods and services so that we can preserve our identity as a region, as individuals, as Europeans, instead of being manipulated by the four or five companies coming out of Silicon Valley that have complete control over everything we do and everything we see.

It’s part of a project I’m working on. … You want to look at how you can develop technology even further, to try to release us from this dependency and from the way we access and have information presented to us. We need to try to maintain a more neutral, more objective perspective of the world.

Q: What is your mission?

A: I’m straightforward. I’m here to help people have more privacy. That’s my motivation. It’s critical to democracy. Without privacy we don’t have freedom of speech or freedom of thought. We don’t have freedom of choice. We don’t have democracy. We don’t have a society that I want to be part of, and that I want my grandchildren to be part of. We are going down a very dark path, where we no longer make choices, where we are nudged and manipulated toward a specific course of action or decision. And one of the big things that is supposed to make us unique as a species is this freedom of choice and thought. If we lose that, we’ve really lost everything.

Q: You’re passionate about what you do, and it comes through.

A: Yeah, I eat, sleep and breathe it.