Privacy from the Inside Out: How secure is data collected by digital pills?

Summary

Ingestible electronic sensors (IESs or “digital pills”) can be taken with or as a part of a drug in order to collect and record patient data, such as medication adherence or physiological metrics. This information can then be shared with relevant parties, including the patient, family members, and health care providers. This episode will address the complex legal, ethical, and legislative issues raised by digital pills. Cynthia Chauhan (a patient advocate and two-time cancer survivor) will share her experience using an implantable sensor, and Ari Waldman (an authority on the nexus of law and technology) will describe the challenges for security, privacy, and ethics.

Episode

Transcript

Cynthia Chauhan: What things have they put in place to stop hacking and how long will I live without the device, as opposed to how long I may live with the device? Because the ultimate hack is when our body kills us anyway. 

I. Glenn Cohen: I’m Glenn Cohen. I’m the Faculty Director of the Petrie-Flom Center, the James A. Attwood and Leslie Williams Professor of Law, and the Deputy Dean of Harvard Law School and your host. You’re listening to Petrie Dishes, the podcast of the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School. 

We’re taking you inside the exploding world of digital health, one technology at a time. Today, it’s what are sometimes called smart sensors and smart pills, which we’ll also refer to as ‘digital pills’ and ‘implantable sensors’: technologies that monitor patients from inside their own bodies. I’m very familiar with these since I was a bioethics advisor for Otsuka on the first of these pills to come to market, ‘Abilify MyCite.’

Ari Waldman: Digital pills have the opportunity to enhance medical care. I think one of the main reasons why people support or are excited about this technological advancement is for things like ensuring adherence to a medical regimen. An ingestible sensor can monitor when you need to take a particular medication, and I’m sure it could even monitor your sugar levels, for people who have different types of diabetes.

I. Glenn Cohen: That’s Ari Waldman, a Professor of Law and Computer Science. He’s describing the key benefit of a digital pill: once the sensor is ingested by a patient, it monitors their vitals in real time. Let’s hear from Cynthia Chauhan, who has real-life experience with an implantable sensor.

Cynthia Chauhan: I have medical devices in my body and one of them uses AI. And I, I really like it. It makes me feel much safer. It’s called a CardioMEMS. It’s in my pulmonary artery. And every day I lie on this special pillow and it reads my pressures and sends them to my doctor immediately. If they’re out of line, the one in his office alarms and they call me. So, that to me is a good use of AI to help me stay safe and be aware. It heightens my awareness, or supports my awareness, of my body and what it’s doing.

I. Glenn Cohen: Cynthia’s experience exemplifies how smart ingestible and implantable technologies can fill in the gaps that patients cannot address on their own. One potential benefit of real-time monitoring is providing doctors with more information to inform patient care. Still, there are gaps that these technologies cannot fill.

Ari Waldman: But as we’re thinking about those benefits, we should ask ourselves the question, ‘Will an ingestible sensor or a digital pill solve the problems that are posed in these areas?’ So, adherence to medical care is an enormous problem. My sister is a physician. She is a pediatric endocrinologist. She’s worked in hospitals in urban areas and in suburban areas. And she’s seen all sorts of patients, many for whom English is not their first language, whose parents work three jobs in order to make ends meet, who don’t have healthcare. There are lots of social and cultural reasons why adherence is so difficult. It’s not entirely clear to me, as a sociologist, although admittedly not a sociologist of health, that given the social and cultural factors that contribute to lack of adherence to certain medical care, a digital pill is going to be the answer.

I. Glenn Cohen: What Ari is saying is that despite their potential, smart sensors and pills cannot solve all monitoring related problems, particularly those that involve a patient’s own choice to act based on that information. 

Ari Waldman: There’s such trust in technology that many people think that the introduction of a piece of technology will solve the whole problem, that this is ‘the answer,’ when technology really should be a supplement that enhances traditional forms of care.

I. Glenn Cohen: Smart pills and sensors, and health technologies more generally, should be treated as just one tool to meet a patient’s healthcare needs. For some patients, they can definitely be helpful, but they might not be for everyone. Cynthia does a nice job of delineating some of the considerations in deciding whether to use a smart sensor or pill.

Cynthia Chauhan: I think it’s not just whether or not it’s safe, although that’s primary, but secondly, is it being used because it’s important to my care or because it’s the cute new tool on the block. For example, I am a very good pill taker. I take my pills when I’m supposed to. If I choose not to take one, I talk to my doctor about it first. So, I don’t think I need a pill tracker. And that’s just one more extra thing that makes no sense to use on me. On the other hand, if I don’t take my pills or I forget my pills, or I take them at the wrong time, then a pill tracker would be an important part of my care. 

I. Glenn Cohen: What are some other considerations? Well, data privacy is a big one. 

Cynthia Chauhan: I think that you always have a right. If you are sharing your health data or giving parts of your body for use, it’s not enough to trust it. You have to be told: this is how we take care of your data. This is what we do to keep your data safe.

I. Glenn Cohen: Now Cynthia breaks down further how she thinks about trust. 

Cynthia Chauhan: There’s, there’s responsible trust and irresponsible trust. To me, irresponsible trust is saying, ‘Oh, he’s the doctor, he knows everything. Or she’s the doctor. She knows everything. I don’t have to worry.’ That’s irresponsible trust. Responsible trust is saying ‘We’re partners in my care. I really want to help in this way. Tell me how you protect my data so that I can be comfortable and you can have the tissue or blood or whatever from me that you need to do the research.’ I think it needs to be thoughtful decision-making that we do together. 

I. Glenn Cohen: In other words, we shouldn’t blindly trust our doctors without hearing more information and coming to an informed decision together. But the digital pill itself might undermine that responsible trust, as Ari reflects.

Ari Waldman: Digital pills essentially reframe or reorient the relationship between the doctor and a patient in that when you and I normally go to a physician we have agency. We take the initiative to share information about what’s wrong with us for diagnostic purposes, and we sign forms and give over information. Once you ingest a sensor, once you ingest a digital pill, that information is automatically being sent to a physician. 

I. Glenn Cohen: Of course, informed consent is a touchstone of ethical medical practice. But how does it work with ingestible smart pills and implantable smart sensors?

Ari Waldman: When we normally go to a doctor, a doctor may describe the risks and benefits of a particular procedure, provide us with a form that lists them, and we can consider and ask questions and then sign off on the procedure. That’s traditional informed consent. But consent for a digital pill strikes me as much more like the consent for data collection online. It happens when you click a button, when you click agree. The first time you set up an account, that’s when you consent. And that consent is now operable almost in perpetuity. Well, you consented, you agreed to the terms of service, so you consented. Whereas in a traditional doctor-patient relationship, we have opportunities to continually update and revise our consent and get more information for consent, here, I worry that it’s going to be placed into a system that allows companies to just take one set of consents when we sign up or when we ingest it, and that’s it.

I. Glenn Cohen: Just as we might want to opt out of data collection online, we also might want to opt out of data collection taking place inside our own bodies. And while there are privacy laws that govern health data collection, they’re not as broad as you might imagine.

Ari Waldman: Everyone seems to be a HIPAA expert, but of course, none of them know that HIPAA is actually a really narrow privacy law. It only protects certain types of data in the hands of certain types of healthcare entities, certain covered entities, such that there are companies like Apple that actually have control over far more health data than anyone that’s covered by HIPAA.  

I. Glenn Cohen: Because HIPAA’s coverage is narrow, applying only to covered entities and their business associates, it will not cover all health data collection. And thus, there may not be robust privacy protection. Data security is another key topic where the regulatory standard may not be broadly applicable.

Ari Waldman: There are security issues posed by any new technology that’s Wi-Fi enabled. So, in this way, digital pills are no different than smart homes, smart thermostats, Wi-Fi enabled products. They can be hacked. You have more information, more options for unauthorized access. But there are amplified concerns because of the sensitivity of the particular information.

I. Glenn Cohen: While security is a concern shared across any product collecting data, the security of something inside one’s body, either implanted for a longer-term or swallowed for a short term in the case of a smart pill, may be inherently more concerning. Ari shares some horror stories related to other digital technologies. 

Ari Waldman: When I teach my information privacy course, I share a story, I share a couple of stories and this is one of them, about a family who had smart thermostats in their homes. And their reason is because their utility company offered them lower prices to have them installed. And it turned out that because of how the automatic system was set up, air conditioning would go on, lights would go on, et cetera, you have a smart home, not just a smart thermostat, the utility company knows when temperatures go up in a bedroom or when temperatures go up in a kitchen. And when you see patterns of temperatures going up in a bedroom at a particular time in the evening on a regular basis, then there’s a company that pretty much knows when you’re having sex and where you’re having sex on a regular basis. And even if they’re not broadcasting that information, a lot of my students find that really, really creepy. 

I. Glenn Cohen: If you think that an air conditioning unit making inferences about your personal habits sounds creepy, consider an even more invasive example. 

Ari Waldman: Another story I share, again, is not a digital pill story, but it is very much related. This is a story out of the United Kingdom, where a same-sex couple were going to spend several months apart. One partner bought a Wi-Fi enabled sex toy, which vibrated. And you can download an app, and someone with the app can turn on the vibration tool. Right, so this is an internet-of-sexual-things story. So, this was a way for them, even though they were apart, to literally feel each other’s presence. It just so happens that at one point, the partner who was using it, and this was a device that could be used in public, hidden, so no one would know you were using it, this person was in a meeting at work. And the app turned on and it took the vibration up to its maximum. And that can be very startling and very disturbing in the middle of a professional meeting. So he called his partner and he said ‘you have to turn this off,’ and it turns out he wasn’t doing it. The sex toy had been hacked, and someone was putting it up to maximum because they had gained unauthorized access. This poses a stark example of the security issues that arise whenever we have a Wi-Fi enabled tool.

I. Glenn Cohen: That’s quite a startling example of just how important data security is when considering digital tools that involve a person’s own body. 

Ari Waldman: It’s one thing if you have a Wi-Fi enabled thermostat, someone can hack into that. Fine. Yeah, those pose issues. But digital pills and sex toys, which get inserted into your body, pose different concerns because they can exert control and cause damage and harm from within, which are far more significant and far more dangerous.

I. Glenn Cohen: Given these concerns, how can we better secure digital tools to prevent harm from within? What might an ideal regime look like?

Ari Waldman: We need a more, I think, a more aggressive healthcare privacy regime, one that not only covers companies like Apple, companies that have lots of health information outside of healthcare providers, because it’s not only a concern posed by digital pills, but that also puts specific restrictions and specific affirmative obligations, security-related obligations and privacy-related obligations, on all companies that make these types of products. Security obligations can’t just be ‘you need to provide reasonable care,’ which is the standard that the FTC generally uses when it comes to security. Although even the power of the FTC to require that has been eroded by federal appellate courts over the last several years. It needs to be more specific yet flexible enough to be enhanced as time goes on. And there’s no reason why we can’t use an independent review board.

I. Glenn Cohen: Reasonable care is a standard that says, ‘this is what a reasonable person would do to protect against harm.’ What Ari is saying is that we need to do more, and so far the FTC hasn’t stepped up.

Ari Waldman: One of the reasons why the FTC likes to stay out of a lot of technical controversies is because they say they don’t have the expertise, that ‘let’s just let the companies deal with it because they have the security experts.’ Well, maybe then public governance should involve significant technical expertise to be able to know what is and what is not appropriate. That builds in flexibility because the experts are going to be up to date on the newest and the best forms of security. So that’s a radical reorientation of what public governance with respect to security really means.  

I. Glenn Cohen: So, utilizing expert knowledge at the public governance level can help address knowledge gaps in regulating new technologies. And there are other areas of the law that may be helpful.  

Ari Waldman: Unsafe design is something that tort law started to focus on in the 1960s, when we started to build new, highly technical products that were really dangerous. And there’s no reason why we can’t have an updated form of products liability and design protections for new technologies, again, making sure that public governance is associated with or has the ability to understand this technology.

I. Glenn Cohen: As Ari suggests, an updated products liability regime that considers how technologies are designed and how they function is also a key security measure. He suggests that doctors can lend their expertise in creating these changes.

Ari Waldman: These debates, and these problems with these technologies, are going to be wrapped into broad debates in our society about the role of corporate power and the imbalance and asymmetries that consumers have in informational capitalism and private rights of action and how insulated companies are going to be from accountability. At some point medical professionals who have taken an important political stance before are going to have to get involved in how we fix public governance to deal with really dangerous or potentially dangerous medical tools. 

I. Glenn Cohen: In other words, since medical professionals will ultimately advise on, use, and prescribe these technologies, it’s important that they play a role in developing these product safety norms. And these safety standards will become increasingly high stakes as technologies develop. 

Ari Waldman: As we look forward to what digital pills or ingestible sensors are going to be, we are not too far off from what used to be science fiction, where you have ingestible machines that are able to perhaps engage in repairs of broken bones or tissues. So, if those are hacked, you can murder people, you can program people. What happens if those go into your brain, and those affect how you feel? There’s a great potential, and I don’t mean to conjure scary sci-fi scenarios like Battlestar Galactica or anything. It’s a long way of saying that in a world where security is pretty much a private concern, where regulators don’t always get involved because they don’t feel that they are capable and think the technologists themselves are the better ones to do it, security is going to be even more important when we start putting things inside our bodies.

I. Glenn Cohen: Given that a digital pill or implantable sensor might be hacked, will patients still assume the risks? Let’s hear from Cynthia.

Cynthia Chauhan: When I think about the risk/benefit of implantable or ingestible devices, I think about, how risky is this to hacking? If it’s hacked, what is the risk to my health and wellbeing? For example, if the hack on my CardioMEMS is just to read it, well, I don’t like it, but it’s not going to kill me. If I have an implantable device that hacking makes work differently, to my detriment, then I would want to have a more intensive conversation about how are you protecting me from hacking.

I. Glenn Cohen: For Cynthia, her doctor’s transparency in communicating the risks is critical to her decision whether to use the device. There will never be zero risk when it comes to data security, she acknowledges. The question is, just how much risk are you willing to accept? As Cynthia suggests, this depends on each individual patient and the benefits they think they will achieve through the digital pill or implantable sensor.

Cynthia Chauhan: If this is a device that is going to lengthen my life by possibly five years and is possible to hack, what is their experience with being hacked? What things have they put in place to stop hacking? And how long will I live without the device, as opposed to how long I may live with the device? Because the ultimate hack is when our body kills us anyway. So, that to me is, is the kind of thinking I would do. I would not say no to a device that might be hacked just out of pocket, but I would say let’s talk about it. Let’s talk about the risk.  

I. Glenn Cohen: As we’ve heard today, ingestible and implantable electronic sensors can be very useful in making sure patients stick to their medication regimens, evaluating the efficacy of a therapy, and collecting physiological metrics. And in the future, we’ll see many other use cases. But these pills and devices also have a key vulnerability: they may be susceptible to hacking, just like any other digital device. But these aren’t just any digital devices. They are devices that involve personal health information and are inside our bodies. Our current regulatory regime was not designed with these kinds of products in mind, and some view it as too hands off when it comes to shoring up the privacy and security of this information. Now, as we wrap up this episode, we’ll leave you to consider: will these tools become health enhancing technologies or just another vulnerability?

I. Glenn Cohen: If you liked what you heard today, check out our blog ‘Bill of Health’ and our upcoming events. You can find more information on both at our website, petrieflom.law.harvard.edu. And if you want to get in touch with us, you can email us at petrie-flom@law.harvard.edu. We’re also on Twitter and Facebook @petrieflom. 

Today’s show was written and produced by Chloe Reichel. Nicole Egidio is our audio engineer. Melissa Eigen provided research support. We also want to thank Ari Waldman and Cynthia Chauhan for talking with us for this episode. 

This podcast is created with support from the Gordon and Betty Moore Foundation and the Cammann Fund at Harvard University. 

I’m Glenn Cohen and this is Petrie Dishes. Thanks for listening.