Before you start spending money on regulatory consultants, get back to basics.
Do you or do you not need to comply with the HIPAA Security Rule?
If you do – what is the very first thing you should do?
In this post – we will get back to basics with 3 practical ways of complying and reducing your regulatory risk.
I specialize in cyber security and privacy compliance consulting for medical device companies in Israel. While this may sound like a niche, it is actually a very nice and not-so-small niche – with over 700 biomed vendors and a market growing 7-8% a year.
Israeli biomed startups are incredibly innovative and it’s fun working with smart people. Here are 3 ways to improve your HIPAA risk analysis in just a few minutes:
Check # 1 – Maybe you are not a HIPAA Business associate
If you are a medical device vendor and you connect to a hospital network or share data with doctors, you are automatically a BA – a business associate according to the HIPAA business associate definition. If you are a BA, you need to comply with the HIPAA Security Rule. But maybe you are not a BA.
By law, the HIPAA Privacy Rule applies only to covered entities – health plans, health care clearinghouses, and certain health care providers. A “business associate” is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity. If an entity does not meet the definition of a covered entity or business associate, it does not have to comply with the HIPAA Rules. See definitions of “business associate” and “covered entity” at 45 CFR 160.103.
So – suppose you developed a consumer mobile app that monitors stress, with a cool dashboard visualizing data stored in the cloud, enabling your users to reduce their stress and compare themselves with other people like them. As long as the software doesn’t share data with a covered entity (like their doctor) – you are not a BA. Check.
Check # 2 – Maybe you don’t even need to comply with HIPAA
The HIPAA Security Rule applies to EPHI (electronic protected health information) that you create, store or transmit. Simply put – EPHI is the combination of PII (personally identifiable information – such as a name, email address or social security number) and healthcare / clinical data.
So – getting back to our hypothetical mobile medical app for personal stress tracking, let’s suppose that the stress data includes heart rate and respiratory rate, in addition to how many times a day the consumer called their girlfriend to make sure she still loves them (when they are freaking out with stress before a big exam). If your mobile medical app for stress management doesn’t store personal information together with the clinical data, then you don’t have to comply with HIPAA, because you don’t create, store or transmit EPHI in the cloud. Check.
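The design idea behind Check #2 – keep identifiers apart from clinical data – can be sketched in a few lines of code. This is a minimal, hypothetical sketch: the record fields and the `split_record` helper are illustrative assumptions, not a legal determination (whether data truly falls outside HIPAA is governed by the de-identification standard at 45 CFR 164.514):

```python
import uuid

# Hypothetical record for a stress-tracking app. Identifiers plus
# clinical data in one record is what makes it EPHI:
record = {
    "name": "Dana Levi",          # PII
    "email": "dana@example.com",  # PII
    "heart_rate": 92,             # clinical data
    "respiratory_rate": 18,       # clinical data
}

def split_record(record, pii_fields=("name", "email")):
    """Separate PII from clinical data.

    Returns (local_map, cloud_record): the PII stays on the user's
    device keyed by a random pseudonym, and only the pseudonymized
    clinical data goes to the cloud.
    """
    pseudonym = str(uuid.uuid4())
    local_map = {pseudonym: {k: record[k] for k in pii_fields if k in record}}
    cloud_record = {k: v for k, v in record.items() if k not in pii_fields}
    cloud_record["subject_id"] = pseudonym
    return local_map, cloud_record

local_map, cloud_record = split_record(record)
# The cloud record now carries clinical data only – no name, no email.
assert "name" not in cloud_record and "email" not in cloud_record
```

The point of the sketch is architectural: if the cloud side never holds the identity-to-data mapping, you have a much easier compliance story to tell.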
Check # 3 – Using contractors for software development? Vet them and oversee them
There is a commonly-used expression in Hebrew – “launch and forget” (שגר ושכח). I believe that this is a Hebrew translation of the American English term “fire and forget”, which refers to a type of missile guidance that does not require further guidance after launch.
When it comes to contractors you do not want to “launch and forget”.
Maybe you are a BA and you do have to comply with HIPAA. The HIPAA Security Rule does not include a standard safeguard for vetting contractors – but can you afford to gloss over this area just because the regulation doesn’t mention it?
This is a big one, boys and girls. If you use contractors for code development, make sure you thoroughly vet them before engaging with them on Upwork. I am not talking about quality of work – it is a given that you need someone highly competent in whatever problem domain you are trying to crack (your girlfriend’s brother-in-law may not be your best fit). I am talking about the threat of a contractor stealing your code, dropping Android malware into your app, or enabling tap-jacking to steal personal data.
Even if they don’t steal your code, consider the threat of your contractor working for a competitor, leaking your IP, or being hired by a business partner who hijacks your entire business model.
It’s tempting to work with software developers in Ukraine or China, but be careful. Consider the case of the Russian programmers who wrote code for U.S. military communications systems, or the DOJ accusing the firm that vetted Snowden of faking 665,000 background checks.
In the Federal space, it is sad but true that there is a huge Federal procurement machine that enables these kinds of attacks to happen, because of organizational complexity (a nice way of saying that accountability falls between the cracks) and greed (a not-so-nice way of saying that big contracts are a challenge to morality and governance).
US Federal agency purchasing managers who sign purchase orders without appropriate contractor vetting and oversight are not a justification for a privately-held Israeli medical device company to sin…
The calls for more legislation are a knee-jerk reaction to horses that left the barn years ago, but when common sense and good business practices yield to greed, perhaps we do need a tail wind. Vetting software development contractors should be a standard requirement in core security guidance such as HIPAA, PCI DSS and FDA cyber security – but in the meantime, don’t wait for the US Federal Government to tell you to vet your contractors and oversee their work. Put an accountability clause into your contracts.