What Really Happens to the Information You Put Online

Let’s take, for example, a single photo of you. You’re on vacation and you want to share your travels with family and friends, so you take a selfie with the Eiffel Tower in the background and post it to Facebook, or upload it to Google Photos. Google or Facebook suggests that you tag yourself and anyone else in the photo, so you attach your name to your face.

In the background, Facebook and Google are using facial recognition algorithms to assess your biometric information. How far apart are your eyes? What are the length and width of your nose? How high is your forehead? What color are your eyes? All this information is stored in their databases and matched with other identifying information, including your address, travel locations, buying and financial profiles, and much more.
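To make the matching step concrete, here is a deliberately simplified sketch. Real systems use neural networks that turn a face into a long vector of numbers (an "embedding") rather than a handful of named measurements, but the final comparison still comes down to measuring the distance between two feature vectors. All the measurements, names, and the threshold below are invented for illustration.

```python
import math

def face_distance(a, b):
    """Euclidean distance between two simplified feature vectors
    (eye spacing, nose length, nose width, forehead height; arbitrary units)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical numbers: measurements extracted from your tagged vacation photo...
stored_profile = [6.2, 4.8, 3.1, 5.9]
# ...from a new photo of you taken by a store camera...
new_photo = [6.3, 4.7, 3.1, 6.0]
# ...and from a stranger's photo.
stranger = [5.1, 5.6, 2.4, 6.8]

THRESHOLD = 0.5  # arbitrary cutoff: below it, the system declares a match

print(face_distance(stored_profile, new_photo) < THRESHOLD)   # True  – same person
print(face_distance(stored_profile, stranger) < THRESHOLD)    # False – different person
```

The point of the sketch is that "recognizing" you is just arithmetic on stored numbers: once a company holds your feature vector, any new camera image can be checked against it, anywhere, without your involvement.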

You return from your holiday in Paris and walk into your favorite store near home. Cameras located throughout the store zoom in on your face and share that image with their Google Cloud service. Google already knows who you are, as their facial recognition algorithms have already taken that information from the photos of you online. Google tells the store algorithms that you’re a traveler, so maybe a store assistant comes and suggests new luggage. If you think that’s creepy, there’s more. Google also knows your search history, your buying history and your criminal history. If you’re a criminal, they can flag that for the store for theft prevention. If not, they may track your buying habits in the store, like how long you linger in certain aisles, and blend it with your online buying habits for a more in-depth profile of you. Imagine spending too long in the birth control section of the store and finding condom ads online when you get home.

Google has now collected deep information about you, as has Facebook. They sell that information to advertisers, employers or other organizations, or they allow a firm like Cambridge Analytica or Crimson Hexagon to harvest the data directly. Those data collection and analytics companies then give your profile information to their clients – clients like the US government, the Kremlin and the UK government.

In the meantime, there’s a data breach by a hacker at one of the advertising companies that holds your profile information. Maybe it’s the store that also uses the cameras to recognize your face. That hacker may sell your personal information to criminals. The store might alert you if your credit card information has been stolen, but it’s too late. Your information is out, and no one can get it back. All your social media data, photos, location information and habits are now in the hands of criminals who then re-sell it on the dark web.

Let’s say that information is now in the hands of hackers. Did you know that facial recognition software is not 100% accurate? If you’ve ever used a Fitbit, you know that data collection is often inaccurate. It will say you climbed 24 floors when you went up an elevator. It’s the same with facial recognition software. There can be hiccups that get your biometrics wrong. What if your biometrics are not only inaccurate but now also match a criminal’s? Just as someone can steal your identity via a credit reporting agency, your online identity can be hacked, stolen or just plain wrong. If it is hacked, a computer expert can alter your facial data just enough that you become identified as a criminal, a decoy for the actual criminal.

Now, you walk into a bank like Wells Fargo. Their cameras are scanning for criminals. Your face is flagged as a potential criminal or an associate of a criminal, and you are refused banking services. Since Wells Fargo has stopped taking cash deposits, they are scanning for money launderers. Your online profile could wrongly associate you with someone who is a criminal, or maybe you are distantly connected to a criminal in your social network. Again, you could be refused services.

Now let’s go back to Cambridge Analytica, which sold your data to governments. Let’s say your next trip is to a country where facial recognition is used everywhere as a method of population control. Your face is correctly recognized and matched with your social media profile. And maybe you have said some critical things about the country’s government. Could you be detained and arrested as punishment for your criticism? Maybe you visit another country where it’s illegal to speak out against the government, and the same thing happens – you’ve been critical of a government leader on social media, and you are flagged as hostile to the government.

Even at home, let’s say you participate in a protest march and the facial recognition software in the streets identifies you. Could your own government mark you as a potential threat? Could you be put on a watch list, making it more difficult for you to travel and fly?

Yes, these are dystopian nightmares, but as more surveillance is used to deter crime and catch criminals, the chance of false identity and of your information being used against you is on the rise.

On most social media, your data’s journey doesn’t end with your posts. You don’t control where your data goes, or how it is used or misused. Companies are willing to spend a significant amount of money to access your private information. Many have built entire empires on freely using people’s personal information. If it’s worth that much to them, shouldn’t we do more to protect it?

Here are a few things you might consider to keep your data safer:

  1. Use a privacy-focused search engine like DuckDuckGo, or a privacy-minded browser like Mozilla Firefox.
  2. Store photos in secure places like Idka, not on social media or free services like Google Photos.
  3. Don’t use facial recognition technologies on your devices.

The good news is that we can do more to protect our personal data and our identity. There are alternative ways to connect and share online, via networks and platforms like Idka, that are private and that do not sell data to third parties. We believe that your privacy is a basic right — that you should be able to decide with whom you share, and what journey you and your data take.