Companies that handle personally identifiable health information are subject to data privacy rules under the Health Insurance Portability and Accountability Act, rules that have grown more complex with the proliferation of mobile health applications (mHealth apps). Those that want to develop mHealth apps in a secure and compliant manner have two options: build a HIPAA-compliant application of their own, or buy one from a provider. This week, we ask UCLA how it weighed the pros and cons for its mHealth development.

The 563-page HIPAA Omnibus Rule is a package of privacy- and security-related rule changes that went into effect in 2013. The rule effectively sweeps up any app that collects, stores, or shares protected health information (PHI) with covered entities, particularly doctors, hospitals, and medical offices.

The medical school at the University of California, Los Angeles became one of the first organizations to confront this next generation of HIPAA compliance when, two years ago, UCLA’s Gambling Studies Program (UGSP) decided to create an app as part of a larger IT effort to treat patients using mobile and Web technologies. The apps built on UCLA’s platform are aimed at helping people with addictions, and UGSP was among the first to try the app route.

Patients can use the smartphone app to access information about sponsors, treatments, and techniques to help them deal with their alcohol, smoking, or gambling addictions. At the same time, “we’re constantly collecting data from the patients who are enrolled in our research study,” says Ardeshir Rahman, UGSP program manager at UCLA. Through the Web app, mental healthcare providers can collect and analyze patient data in real time to better understand patterns around their patients’ addictions.

“The biggest hurdle that we had was, ‘How do we make this entire platform HIPAA-compliant and secure from top to bottom?’” Rahman says. Before UGSP could build and deliver its mHealth app to physicians and therapists, it first needed to demonstrate to UCLA’s compliance department and Institutional Review Board (IRB) that it could securely store and transmit patient data to mobile devices in a way that is HIPAA-compliant.

Unlike the compliance department, the IRB focuses strictly on patient safety. With mHealth apps in particular, “they want to know how, and in what way, we are protecting patient data,” Rahman says.

Trial and Error

Starting in 2013, UCLA approached the task by brainstorming solutions that it could build on its own. “We were using some UCLA servers, making them HIPAA-compliant, and feeding data through that,” Rahman says.


As the process unfolded, however, the complexity of the project started to wear on the app developers. They estimated they would need three to four months to create a product that fully complied with both UCLA and HIPAA standards, and that actually worked, Rahman says. “That created quite a bottleneck” for clinicians eager to use the app, given how formidable the development obstacles were.

That’s when UCLA began a Google search for a HIPAA-compliant storage solution. “We needed something that was platform agnostic, meaning it had to be HIPAA compliant and that didn’t tie us down to a certain technology or platform,” Rahman says.

Under HIPAA, “business associates” must abide by some of the same requirements as the covered entities they work with, including encryption standards for patient data that may pass into their hands. The rule makes them directly liable for unintended disclosures. For these reasons alone, choosing an mHealth app provider that is itself HIPAA-compliant is an important consideration.

The problem is that, at the time UCLA started developing its mHealth app, “no one was doing that,” Rahman says. “If you wanted a HIPAA compliance service, you needed to buy into a lot of the provider’s technologies. What we really wanted was just this simple and clean way of sending and storing data that was completely agnostic on the technology that we wanted to build on.”

BUSINESS ASSOCIATES RULE

The following is an excerpt from the HIPAA Omnibus Rule as it applies to business associates.
The final rule implements the proposed revisions to § 164.500. While we understand commenters’ concerns regarding the uses and disclosures of health information by entities not covered by the Privacy Rule, the Department is limited to applying the HIPAA Rules to those entities covered by HIPAA (i.e., health plans, health care clearinghouses, and health care providers that conduct covered transactions) and to business associates, as provided under the HITECH Act.  As we discuss further below, section 13404 of the HITECH Act creates direct liability for impermissible uses and disclosures of protected health information by a business associate of a covered entity ‘‘that obtains or creates’’ protected health information ‘‘pursuant to a written contract or other arrangement described in §164.502(e)(2)’’ and for compliance with the other privacy provisions in the HITECH Act. Section 13404 does not create direct liability for business associates with regard to compliance with all requirements under the Privacy Rule (i.e., does not treat them as covered entities). Therefore, under the final rule, a business associate is directly liable under the Privacy Rule for uses and disclosures of protected health information that are not in accord with its business associate agreement or the Privacy Rule. In addition, a business associate is directly liable for failing to disclose protected health information when required by the Secretary to do so for the Secretary to investigate and determine the business associate’s compliance with the HIPAA Rules, and for failing to disclose protected health information to the covered entity, individual, or individual’s designee, as necessary to satisfy a covered entity’s obligations with respect to an individual’s request for an electronic copy of protected health information. See §164.502(a)(3) and (a)(4). Further, a business associate is directly liable for failing to make reasonable efforts to limit protected health information to the minimum necessary to accomplish the intended purpose of the use, disclosure, or request. See §164.502(b). Finally, business associates are directly liable for failing to enter into business associate agreements with subcontractors that create or receive protected health information on their behalf. See §164.502(e)(1)(ii). As was the case under the Privacy Rule before the HITECH Act, business associates remain contractually liable for all other Privacy Rule obligations that are included in their contracts or other arrangements with covered entities.
Source: HHS.

That’s where TrueVault came into play. “We started doing research...and serendipitously, we found TrueVault had developed a solution for that,” Rahman says.

Jason Wang, founder of TrueVault, says the company approaches security in multiple ways. It encrypts customer data, yes, but it also parcels data among multiple servers, so that a hacker who breaks into one of them does not walk away with all of a customer’s data. An outside firm audits TrueVault’s security quarterly.
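For developers weighing a similar build-or-buy decision, the general pattern Wang describes, encrypting each record and then spreading the pieces across independent stores, can be sketched in a few lines. The snippet below is a hypothetical illustration only, not TrueVault’s design; the key handling, field names, and shard storage are all assumptions.

# Hypothetical sketch of the "encrypt, then shard" pattern; not TrueVault's actual design.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_and_shard(record: bytes, key: bytes, num_shards: int = 3) -> list:
    """Encrypt a record, then split the ciphertext so no single store holds all of it."""
    ciphertext = Fernet(key).encrypt(record)
    size = -(-len(ciphertext) // num_shards)  # ceiling division
    return [ciphertext[i:i + size] for i in range(0, len(ciphertext), size)]

def reassemble_and_decrypt(shards: list, key: bytes) -> bytes:
    """Recombine the shards and decrypt; fails if any shard is missing or altered."""
    return Fernet(key).decrypt(b"".join(shards))

if __name__ == "__main__":
    key = Fernet.generate_key()  # in production, held in a key-management service
    record = b'{"patient_id": "anon-123", "craving_score": 4}'
    shards = encrypt_and_shard(record, key)  # each shard would go to a different storage backend
    assert reassemble_and_decrypt(shards, key) == record

Sharding only limits what any one compromised store exposes; the encryption key itself still has to be protected separately, which is part of what a managed service takes off the developer’s plate.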

Rahman says UCLA also chose TrueVault for its ease of use. “We designed TrueVault to speak ‘developer,’” Wang explains, meaning its tools are designed to behave like tools developers are already familiar with, rather than reinventing the wheel when it comes to saving, collecting, and storing data. “By doing that, there is no learning curve for developers,” he says.
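In practice, an API that “speaks developer” tends to look like an ordinary document store: save a JSON object, get an ID back, fetch it later over HTTPS. The sketch below is purely illustrative; the endpoint, authentication scheme, and response fields are hypothetical, not TrueVault’s published API.

# Illustrative only: a REST-style wrapper that feels like an ordinary document store.
# The base URL, auth scheme, and response fields are hypothetical, not TrueVault's real API.
import requests

API_KEY = "example-api-key"  # hypothetical credential
BASE_URL = "https://vault.example.com/v1/documents"  # hypothetical endpoint

def save_document(doc: dict) -> str:
    """POST a JSON document over HTTPS and return the server-assigned ID."""
    resp = requests.post(BASE_URL, json=doc, auth=(API_KEY, ""), timeout=10)
    resp.raise_for_status()
    return resp.json()["document_id"]

def load_document(document_id: str) -> dict:
    """GET a previously stored document by its ID."""
    resp = requests.get(f"{BASE_URL}/{document_id}", auth=(API_KEY, ""), timeout=10)
    resp.raise_for_status()
    return resp.json()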

TrueVault’s robust security and ease of use helped make the approval process with UCLA’s Institutional Review Board seamless, “because it didn’t take a lot to describe how the [application program interface] works and how we’re using the API,” Rahman says. It was a matter of telling the IRB: “‘The data is not stored permanently on the device. It’s immediately sent using this highly encrypted protocol to a highly encrypted server, which is HIPAA-compliant.’”
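A stripped-down version of the flow Rahman described to the IRB might look like the following: check-ins are held only in memory on the device, pushed immediately over an encrypted HTTPS (TLS) connection, and discarded once the server acknowledges them. This is an assumption-laden sketch, not UCLA’s actual client code; the endpoint and field names are invented for illustration.

# Hypothetical client-side flow: nothing is written to local storage. Check-ins are
# held in memory, sent immediately over HTTPS (TLS), and dropped once the server
# acknowledges them. Endpoint and field names are invented for illustration.
from datetime import datetime, timezone
import requests

UPLOAD_URL = "https://vault.example.com/v1/documents"  # hypothetical endpoint
pending = []  # in-memory buffer only; never persisted to the device

def record_checkin(patient_id: str, craving_score: int) -> None:
    """Queue a patient check-in in memory and attempt to send it right away."""
    pending.append({
        "patient_id": patient_id,  # in practice, a de-identified study code
        "craving_score": craving_score,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })
    flush()

def flush() -> None:
    """Send queued check-ins over TLS; discard each one only after the server accepts it."""
    while pending:
        resp = requests.post(UPLOAD_URL, json=pending[0], timeout=10)
        resp.raise_for_status()
        pending.pop(0)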

In a medical university setting, where IRB members can be physicians who are not necessarily tech savvy, the broader lesson for other mHealth app developers is the importance of speaking in plain terms. Rahman strove to describe how the technology works in simple terms, so the IRB clearly understood the measures implemented to handle the data in a HIPAA-compliant way.

For UCLA, the implementation process itself was anticlimactic. “It took us about a week to get it up and running on our system,” Rahman says.

Such results may not be typical for every organization. A healthcare provider with a decade-old electronic health record system that now has to migrate onto a HIPAA-compliant solution like TrueVault, for example, might take a little bit longer, Wang says.

Still, developing an mHealth app on your own can take significantly longer. Wang says some customers will spend up to six months with a dedicated team of engineers, and as much as $500,000 in manpower and equipment costs, before realizing they still aren’t achieving what they need to.

The manpower and expense involved in keeping up with such a project can be overwhelming. Hackers find new ways to infiltrate systems every day. “What they realize is building a solution is not a one-time event. Once you build it, you have to maintain it,” says Wang. “You don’t want to have a leaky pipe.”

The biggest lesson UCLA learned in developing its mHealth app is the importance of being as clear and concise as possible when seeking support and buy-in from senior officers and the board. In a university setting, for example, you have to know how to speak with both physicians and IT developers, a mix of people with varying levels of technical knowledge, to make all those pieces come together, Rahman says.

“How are you clearly and concisely conveying the steps that you’re taking to secure that data?” he says. “That’s a very important step.”