A society where economically driven entities own your secrets is a dangerous society. It sounds like a dystopia but it is the reality we now live in. Fortunately there are choices.
Most of us have felt it: that creepy sense that somebody is watching. It can happen anywhere, walking through a park, sitting at a café, and, these days, online. We browse our Instagram feed and a random ad pops up. A pair of crisp sneakers in bright colors, the exact same sneakers we clicked on two days ago on our work computer. Random ad? Not at all. Creepy? Indeed.
But that’s just the beginning of it.
We are tracked, followed and watched. Mapped, analyzed and remembered. Constantly, at every step we take online. And it goes well beyond cookies remembering which ads we click on.
Take the dating app Tinder. Most people never bother to find out what rights Tinder has. Who wants to read through a sleeping pill of a document longer than a Friday night, listing terms and conditions in complicated, dry language, when there are people to swipe through, chat with and date?
We are physical beings filled with pulsating emotions. Data is neither emotional nor physical. Data is just… data. But what if you could see it in real life? Hold it in your hands?
With the help of a lawyer and a privacy activist, a Guardian journalist demanded access to her personal information from Tinder. And got it: 800 pages of information. Her entire conversations were there; what she had liked on Facebook and when; where she had been at different times; whom she had dated, when she had interacted with them and where she was while doing so; links to her Instagram photos; the age range of men she had dated; etcetera, etcetera. She writes that reading through the material was like taking "a trip into my hopes, fears, sexual preferences and deepest secrets".
This happens all the time, on many fronts. You are surrounded by companies collecting information about you, then selling that data to third parties, using it to target ads at you, sell you products, and send you the articles you are likely to click on while sparing you the ones you aren't.
"Be less curious about people and more curious about ideas," Marie Curie famously said. But what if you are never exposed to new ideas simply because you aren't likely to click on them? Or what if somebody is deliberately putting ideas in your head because they have a purpose they want you to serve?
It is not established exactly how big a role data, fake news and targeted articles played in the 2016 US election, and other factors of course also shaped the outcome. But that they played a role is beyond doubt. The Trump campaign used big data to track millions of potential voters for its candidate and then used Facebook to send ads, for example heavily exaggerated articles or outright lies about Hillary Clinton, straight to them. These ads are called "dark posts": they can be seen only by the recipient, a formidable way to keep other people and the media from reviewing, fact-checking and analyzing them. The campaign used the same method to track down potential Clinton voters and feed propaganda to them as well.
The mathematician Cathy O'Neil, a former teacher at MIT and Columbia University who later worked on Wall Street, became disenchanted and joined the Occupy movement, and has since put a lot of work into analyzing algorithms, the code that today runs so much of our world: which posts you see in your feed, who gets invited to a job interview, and so on. People tend to see algorithms as science. "They are opinions embedded in code," O'Neil says, and calls them "weapons of math destruction".
An example: companies write code to sort through job applications. They define success based on who has historically "made it" in the company; if those people were overwhelmingly men, the algorithm will, as a result, overlook women. "A lot can go wrong when we put faith in big data," she says.
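To make the mechanism concrete, here is a minimal sketch of the idea, using invented data and a deliberately naive scoring rule (not any real company's system): when "success" is defined by who was promoted in the past, the model quietly learns the old bias and applies it to new applicants.

```python
# Hypothetical historical records: (years_of_experience, gender, was_promoted).
# In this toy history, only men were ever promoted.
history = [
    (5, "M", True), (7, "M", True), (6, "M", True),
    (5, "F", False), (8, "F", False), (6, "M", True),
]

def train(history):
    # "Success" is defined by who made it in the past: the model
    # learns a promotion rate per gender from the historical data.
    stats = {}
    for _, gender, promoted in history:
        total, wins = stats.get(gender, (0, 0))
        stats[gender] = (total + 1, wins + promoted)
    return {g: wins / total for g, (total, wins) in stats.items()}

def score(model, years, gender):
    # Experience counts, but it is weighted by the learned prior,
    # so the old bias is baked straight into every new score.
    return years * model.get(gender, 0.0)

model = train(history)
# Two applicants with identical experience get very different scores:
print(score(model, 6, "M"))  # 6.0 (historical promotion rate 1.0)
print(score(model, 6, "F"))  # 0.0 (historical promotion rate 0.0)
```

The point of the sketch is that nobody wrote "prefer men" anywhere; the preference emerges entirely from how success was defined, which is exactly O'Neil's "opinions embedded in code".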
Companies are now also using data not only to locate possible customers but to alter customers' behavior more effectively than ever, with the goal of turning them into even better customers. The clothing company Under Armour, for example, has said that its goal is to use data to transform people into better athletes who then become bigger consumers of its goods.
The academic Shoshana Zuboff at Harvard Business School calls it "surveillance capitalism", and she writes that the core of the problem comes down to the right to decide whether to participate in this extravagant information sharing. We all choose to participate; we all accept trading data about ourselves for free services (for example, when we Google for information). But is it really a choice? In today's internet-based society, do we really have free will? Can we take part in this world, socially as well as economically, without selling deeply private information about ourselves? "Surveillance capitalism does not erode these decision rights," Zuboff writes, "but rather it redistributes them," creating a new dimension of social inequality. "The full implications of this development have preoccupied me for many years now, and with each day my sense of danger intensifies," she writes.
Idka as a company has grown out of the same sense of urgency and from a desire to do something about it. It is built entirely on the idea that integrity is of high value, that information about ourselves is not a commodity, and that a society where market forces whose primary purpose is to make money and expand own our deepest secrets is a dangerous society. Idka is a completely private social medium where you own all your information: no data is collected, and none of your information is sold or shared with other parties. There are no ads. You can connect with friends and businesses, share and store material, knowing that no one is listening except exactly whom you actively choose.
Idka is an online, connected existence without the creepy feeling of somebody watching you. Idka is a choice. Visit https://idka.com/