The recent push from both Apple and the CMS to give patients more control of their own health data stands to boost patient engagement, which most in the industry consider a good thing. But moving data outside of the relatively safe confines of an electronic health record adds another layer of risk and vulnerability.
As more parties gain access to the data, more avenues for breaches open up, potentially jeopardizing not just information security but also patient privacy. Bad actors can now target not only EHR systems but also patients' phones, where health data reside.
Those pools of data will only grow larger. Last week, Apple officially launched the next iteration of its Health app, which allows people to pull their electronic health record data onto their iPhones. Thirty-nine health systems have partnered with the tech giant to make their patients' data available. "When you have this data on your phone, you have risks that would traditionally not have existed," said Daniel Farris, chair of the technology practice at Fox Rothschild.
Even Facebook's Mark Zuckerberg can attest that no organization is immune from public—and potentially political—backlash when sensitive consumer data is compromised. Many argue that Facebook's problems with Cambridge Analytica and access to user data should serve as a wake-up call to technology executives who have been entrusted to protect consumer information.
Allowing patients to put data on their phones isn't just a matter of flipping a switch and then walking away. It's first and foremost the patient's responsibility to keep the data secure and private. Health systems and vendors alike are encouraging patients to understand the risks and take precautions. "The patient who downloads this information absolutely must secure their device to protect their own records," said John Kravitz, chief information officer at Danville, Pa.-based Geisinger, one of the first health systems to link its records with Apple's Health app.
But getting the data onto those phones requires security safeguards on the part of the health system and EHR vendor too. Because of those safeguards, the actual movement of the data shouldn't pose too great a security risk, argued Steve Dunkel, chief information security officer at Geisinger.
Just as a patient has to go through authentication to access a patient portal on their phone, they do the same when granting the Health app access to their records. The app then makes a secure connection and receives the data via the FHIR standard.
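FHIR resources are plain JSON documents, so once that authenticated transfer completes, an app consumes them like any other structured payload. A minimal sketch of what that parsing step looks like, using an illustrative FHIR Observation rather than a real portal response (the sample values and the `summarize_observation` helper are assumptions for demonstration, not Apple's actual implementation):

```python
import json

# A minimal FHIR R4 Observation, of the kind a patient portal might return
# after the app authenticates. Values here are illustrative, not real data.
SAMPLE_OBSERVATION = json.dumps({
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "Heart rate"},
    "valueQuantity": {"value": 72, "unit": "beats/minute"},
})

def summarize_observation(raw: str) -> str:
    """Parse a FHIR Observation payload into a one-line summary."""
    resource = json.loads(raw)
    if resource.get("resourceType") != "Observation":
        raise ValueError("expected a FHIR Observation resource")
    name = resource["code"]["text"]
    qty = resource["valueQuantity"]
    return f"{name}: {qty['value']} {qty['unit']}"

print(summarize_observation(SAMPLE_OBSERVATION))
# Heart rate: 72 beats/minute
```

Because the payload is self-describing JSON, any app the patient grants access to can read it just as easily — which is precisely the double-edged property the security experts quoted here are worried about.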
While security can be built directly into the app, some encouraged using security controls already available on phones. "Patients have requirements for strong passwords, and we can make them more secure by using newer features like Touch ID on the patient's mobile phone," said Janet Campbell, vice president of patient engagement for Epic Systems Corp.
This drive to get patient data into patients' hands comes not only from companies and health systems but also from the federal government. Jared Kushner, senior adviser to President Donald Trump, and CMS Administrator Seema Verma want to increase interoperability and give patients more control of their own data, they announced at the Healthcare Information and Management Systems Society's annual meeting in March.
But the federal government's new zeal for data mobility should be accompanied by a push for security standards, said John Riggi, an FBI veteran who's now senior adviser for cybersecurity and risk at the American Hospital Association. "The issue surrounding apps in particular is that (the Office of the National Coordinator for Health Information Technology) has not promulgated specific security standards," he said.
While some apps, like those made by EHR vendors, abide by HIPAA security and privacy rules by law, once data are on a patient's phone, the patient might unwittingly share them with an app that doesn't have such stringent security and privacy controls. "There should be a measured approach in collaboration with HHS and the ONC and the providers to ensure whatever platform a patient uses to access the EHR has been fully vetted and complies with all HIPAA privacy and security rules," Riggi said.
There's also the threat of malicious apps, which might be able to extract patient health information. Malware of this sort already exists to steal financial and other data, and even legitimate apps routinely share information with advertisers.
"This is not as much a concern when you're dealing with a large, trusted organization such as Apple," Riggi said. Apple requires patients to authenticate the data transfer by logging into their health systems' EHR patient portals. Data then travel straight from the portal into the app, never passing through Apple servers.
Patients can bear some of the responsibility, primarily by ensuring the app is from a trusted vendor. Indeed, information technologists stress the need for education.
"We enthusiastically support the consumer's right to access a copy of their data and to decide how it should be used," said Don Bisbee, senior vice president of clinical and business strategy for Cerner. "But continued education is needed around the potential risks associated with choosing to expose sensitive health data to broader groups than the covered entities where HIPAA protections apply."
Riggi compared the need for security in healthcare apps to the need for security in financial apps. But financial apps, he pointed out, are entirely controlled by the financial institutions they're related to, whereas health apps aren't necessarily controlled by health systems.
The type of data contained in the Health app (and others) complicates how HIPAA would apply, though. The Health app won't just hold protected health information from EHRs. It'll also hold patient-generated data from wearables and other sources—data that aren't considered protected health information but could, in theory, be turned into it.
"The same set of data may be subject to HIPAA or not depending on where it is or who accessed it, and that could create drastically different privacy and security requirements," Farris said.
For example, if a person has data from a wearable on their phone and only that person has accessed the data, then the data aren't considered protected health information. But as soon as that person shares the data with a provider, intending them to be used for healthcare purposes, the data become protected. If those data were then breached, the breach would fall under HIPAA. That, in turn, raises the question of who would be responsible. "If there's a failure of security at the phone level, maybe Apple is liable," Farris said. "If there's a failure at the app level, it might be the app maker, and if there's insecure transmission, it might be the EHR vendor."
For instance, if a patient downloads their information to the Health app and then posts it on Twitter, that's not a violation of HIPAA. But if that patient inadvertently lets a third-party app access their information, who's responsible? Even though the data may still be secure, they're no longer private—and that's concerning, Farris said.
"You can't have privacy without security, but you can have security without privacy," he added.