What kinds of things should we look out for, or do, to protect our privacy while managing our health? And when, if at all, should we share our information? We invited Lucia Savage, Chief Privacy and Regulatory Officer at Omada Health, to the GoodTech Vidcast to help us answer those questions and more. Read on for a recap of the show, or watch it in its entirety here.
Omada and the Healthcare Industry
Lucia Savage is the Chief Privacy and Regulatory Officer at Omada Health, a healthcare service delivery company. She tells us that Omada is just like a doctor’s office, but that they provide their services through a secure and regulated technology platform, through which trained healthcare professionals communicate with individuals (equivalent to patients). The platform allows Omada to collect the digital signals that people want them to collect about their lives. Part of Lucia’s job is to make sure Omada’s engineers and product designers understand why healthcare rules operate the way they do, and to explain to regulators how tech can fit into traditional healthcare and make it better. Lucia says, “I’ve been in healthcare for almost 20 years. I really believe in the powers of these digital signals, to make our healthcare professionals smarter, better, more accurate and more relevant to each patient they see.”
As Omada is a platform handling a lot of sensitive information, we ask Lucia what they use for communication. She tells us that they’ve built their own secure platform for their professionals to talk to their patients. “We also have a bunch of satellite services, for example through our customer support unit, and other things that create security in the communications. We know that as a healthcare organization, there’s going to be health information on our platform that we want to secure as best we can. I think the healthcare system really has to wrap its head around the fact that there is a point in the communication chain where responsibility falls to the receiver. Does the receiver use a secure messaging app, or do they have a Hotmail account? Those are things the healthcare provider literally can’t control, and we have to assume that people are grownups,” she says.
As Lucia mentions, a lot of sensitive information is shared and stored within the communication systems of any health organization. It is therefore critical to protect that information with a system that is as secure as possible. Omada Health uses its own secure platform for communication, but building a proprietary communication platform is not an option for every organization. Luckily, there are safe and secure alternatives you can turn to, without having to use platforms like Facebook or Slack, download software, or rely on IT support. Idka is one such example. Communication between doctor and patient is made easy on Idka, with multiple features such as chat, video chat, groups, and storage. Idka is fully encrypted, and nothing on the platform is searchable online outside of it. Everything you post, share, or store on Idka remains under your control, meaning you can easily share information about your health in a chat with your patient or doctor without worrying that an outside party might access it. Read more about how Idka works, and get started with a FREE account on idka.com!
What are some challenges for the innovators in the healthcare industry? Lucia says, “I think in the US, we’ve had a period of time in healthtech innovation where there’s a branch of products and ideas that are trying to specifically stay in this low-regulated consumer retail space. Then there is another branch of people, like Omada, that are trying to actually change the healthcare delivery system, and stay squarely within healthcare. We’ve done a lot of work in that space, and our view at Omada is that healthcare should embrace digital tech for clinically efficacious services. Because, when you’re in the healthcare system, a lot of things get solved. How you account for fraud, how you bill – the privacy rules kind of click into place automatically. But, different people choose different business models, so I think the first thing innovators have to think about is their idea compared to their revenue model, and how they are going to make money off of their idea, because there are definitely two distinct branches in the US. That’s not going to be the case in Europe, because there, the regulations follow the data and not the economic sector. So, you may be a health tech company in Europe and fit into some specialized GDPR rules for healthcare providers. Innovators worldwide really have to think about what their idea is, where they’re going to do it, how it is going to make money, and how they are going to comply with any laws that apply to that method.”
Privacy in the United States
Last week, January 28th to be exact, was Data Privacy Day in the US, or Data Protection Day, as it’s known in Europe. Lucia tells us that she saw a lot of discussion around this on Twitter, where people were asking what they should do on this day. “My comment was, ‘check your app settings, to make sure they’re still what you want them to be, and then teach a friend how to check theirs.’ When apps update, they can override your settings and revert you to the defaults, and people don’t know how to check their settings. You should also use add-ons for ad blocking. Do whatever you can to manage whatever apps you’re using.”
Lucia tells us that there’s a very vigorous debate about privacy in the US right now. It’s clear that the US doesn’t have a countrywide consumer privacy law, like GDPR. “We have different privacy laws for different economic sectors, and as our uptake of tech has grown and the Internet of Things has become ubiquitous, the gaps in that approach have become apparent,” she says. “The key thing here is that to really solve the issue of how consumers are best protected from a privacy perspective in the States, we probably need a new law, and that conversation has to be taken to Congress.”
So, why is Big Tech suddenly pushing for greater regulation of data? Lucia states that it’s because Big Tech companies want the same law in every state and territory. “What’s really hard for businesses of any kind, whether you’re little or big, is to manage a different law in every place you have a person,” she says. “Tech has employees and users everywhere. That’s why there’s this dialogue about having regulation nationwide, because it’s easier to manage when it’s the same everywhere.”
The Privacy Dangers in Healthcare Today
Google is currently collaborating with a hospital health system that puts identifiable patient data in the hands of the tech giant’s engineers, for use in projects on machine learning and artificial intelligence. We ask Lucia what she thinks of this. She tells us, “In the American system, healthcare institutions have always been able to hire data analysis firms to help them. Google is just joining that fray, and they’ve said that they’re doing this by the rules. Among the rules are requirements that the people handling the patient-identifiable information be distinct and segregated from the people who, for example, are handling all the rest of Google’s business. Another rule is that the people handling the information have to have access based on their role and their need. So, all of that is really constrained by regulation in the US. Although the government has announced they’re investigating the relationship, Google has been very adamant that they’re following the rules.”
Lucia says that for some patients, the ability to have ready access to their own information can be a matter of life and death. “That’s what America has been trying to build, this interoperable system where wherever you go as a patient, the new doctor you’re seeing there can find out what they need to know to avoid causing you harm as they treat you. Until we have interoperability, we can’t really have a situation that advances patient health, because we keep the data in discrete systems and don’t allow the data to go where it needs to go for care. If we’re really going to talk about this and really solve this problem, we have to look at all the issues on the table and really decide if the system we have in the US is working, or if we need a new system.”
Spreading Health Information on Social Media
A lot of people make use of social media support groups to feel connected to other people who are going through what they’re going through. But is it safe to share information about your healthcare on social media platforms that take your information to sell to third parties? Lucia says that there are significant physical and emotional health benefits to a support network, both for people managing other people’s conditions and for people with conditions themselves. “You may not have people in your community to talk to, but you can find 10 or 15 like-minded people on social media. In addition, there’s a further benefit that through social media, people with super rare diseases have been able to find each other and create really important pods of patients that can collaborate with researchers to find cures.” On the other hand, Lucia adds that people should be really thoughtful. If they want to keep their health information private, but want to talk to people on social media, they have to make sure that the people they talk to don’t spread their information. “You can easily control what you say about yourself, but somebody else could write about you. It becomes a process of educating your network about maintaining your privacy,” Lucia says.
We agree that one should educate their network about privacy to make sure that sensitive information does not get into the wrong hands. At the same time, we believe the danger remains that personal information, such as health-related data, could be leaked by people you know. Social media platforms themselves also hold your information in an iron grip, and can and will take advantage of that. It’s no secret that Big Tech companies, like Facebook, have been involved in some pretty heavy health data privacy issues. To get a grip on what happens to your health data on Facebook, we recommend reading articles like Facebook’s role in the health data crisis, or You Don’t Want Facebook Involved With Your Health Care. The latter explains it well: “As tech companies move into health care, these digital profiles will become part of our medical records, with the potential to shape the care we receive, the resources we can access, and the bill we pay at the end.”
We understand that a sense of community is important, and social media has helped many people find that. But we also know that many social media platforms take advantage of what you say, share, and do on their platform. Even if you join a “private” group on most public platforms, your personal information is still at risk of being exposed to invested third parties. It is, of course, up to you how you share information about your health, but we all have to consider the risks and consequences of our choices. If you’re looking for a place to safely share information about your health, we recommend trying a communication platform like Idka, which will never share or sell what’s rightfully yours to protect. The data you leave on Idka stays on Idka. If you’re looking for a community specifically for support on health issues, or a community for questions and advice around specific conditions, you can easily create or join groups on Idka. You get the same sense of community, without feeling like you’re leaving sensitive information out there for the platform to share or sell for profit.
If you have any questions for us, we’re always happy to chat! Reach out on www.idka.com/contact, and try Idka for FREE today!