… Or is it?

News over the past several months has indicated a greater global concern for digital privacy. Many, including former Facebook and Google employees and supporters, have come forward to challenge the way the tech giants collect, use, and sell our data – especially personal and sensitive information. Concerns over the lack of privacy have grown with the uncovering of the Cambridge Analytica scandal and the Congressional hearing with Facebook CEO Mark Zuckerberg. In May, GDPR took effect, protecting the privacy of millions of European social media users.

Still, our likes-addicted culture continues to thrive.

Privacy in the News

In early January, Roger McNamee – one of Facebook’s first supporters and investors – stepped into the spotlight on a kind of apology tour, speaking out against Facebook’s use of personal data, saying that the social media giant was “ignoring bad actors” that were manipulating its platform. “There’s been an increasing understanding that when you’re using Facebook, a lot of bad things are going to happen to you, as a user,” McNamee said. “That is not a 100 percent guarantee, but the risk is really, really high.”

McNamee wasn’t the only ex-Facebooker to come forward about the use of personal data, especially its sale to third parties. Former Facebook manager Sandy Parakilas was interviewed by New York Magazine on privacy, addiction, and why Facebook must “dramatically” change its business model. Former product manager Antonio Garcia Martinez talked about CEO Mark Zuckerberg’s “disingenuous and strange” reaction to the election. (This came after the discovery of Russian meddling on the platform during the 2016 US presidential election.)

Steve Wozniak took a stand, joined by a slew of other celebrities, and shut down his Facebook account in protest of its policies. Later, Jan Koum, CEO of Facebook-owned WhatsApp, stepped down, citing a difference of opinion over the company’s desire to make it easier for businesses to use its tools – a change that would require weakening its encryption policies.

The world watched – some in awe, others with visible disinterest or confusion – as Zuckerberg answered questions from congressional legislators about Cambridge Analytica’s misuse of personal Facebook profiles.

A few days ago, at Facebook’s annual shareholder meeting, Zuckerberg was met with anger, frustration, and doubts about his leadership. One investor even told him, “Emulate George Washington, not Vladimir Putin!” (With Zuckerberg holding majority voting control, most proposals were voted down, and it seems it’s business as usual for Facebook.)

Facebook is not the only platform facing criticism of its practices. In May, Gizmodo reported that about a dozen Google employees had quit over the company’s involvement in Project Maven – an artificial intelligence drone program for the Pentagon that uses AI to detect people and objects for human review. Nearly 4,000 workers signed a petition demanding not only that Google end its involvement in Maven, but also that it avoid any future military work. The involvement runs contrary to Google’s ethos, some said, recalling the mantra “don’t be evil,” long thought to be at the heart of Google’s principles. One employee put it like this: “Humans, not algorithms, should be responsible for this sensitive and potentially lethal work.”

Later in May, within hours of GDPR taking effect in Europe, complaints were filed against Facebook, Google, Instagram, and WhatsApp. The companies were accused of forcing users to consent to targeted advertising in order to use their services. Privacy group noyb.eu, led by activist Max Schrems, said people were not being given a “free choice.”

So, it would seem, the world is beginning to care about digital privacy and ethics in technology. Yet, the industry remains largely unchecked. For the most part, we continue to use social media platforms as we always have.

“So, who cares? I have nothing to hide!”

Despite all the negative news, many of us, particularly younger users, don’t seem to care. It’s almost a matter of habit; we like reading about what other people are doing, and we like talking about ourselves. And Facebook makes it so easy for us.

As technology players, the Idka team is tempted to join the game. Data, after all, gives us valuable information about the market we are selling to; it has become the bread and butter of big business. There are many ways to use an individual’s personal data. It’s not just what we like on Facebook, what we share on Instagram, or the pictures we post – not just what we knowingly put online. Once we join, these media companies know where we go, who our friends are, what medical conditions we have, the groups we join, our political and sexual orientation, what we buy, what we talk about on our phones, and what we look like on camera. If we have “Alexa,” it’s even easier for them to access our conversations, even when we think we are offline. As has been widely quoted, “Data is the new oil” – and in that case, Facebook is Standard Oil and Google is Getty. It may make some rich, but there will be long-term consequences of our insatiable desire for data (and oil).

But we’ve decided this isn’t the way.

Here at Idka we’ve decided that an individual’s right to privacy, transparency in policies, and control over data is more important than building an algorithm and giving it control over what we see. This is the foundation of our platform: connect, communicate, store, and share – while your data remains protected. You, in other words, control the sharing.

Apple CEO Tim Cook pointed this out when the FBI asked Apple to build a backdoor into its products. “The demand would undermine the very freedoms and liberty our government is meant to protect,” he said. We agree with Mr. Cook. We can’t be frivolous with technical creations; we must take responsibility for their impact on society and on our freedoms. Customers are transitioning to Idka based on the integrity of the product, and on our core belief that privacy is a basic human right.