In my previous post Risk does not walk alone – I noted both the importance of internal audit and corporate risk management to the business of cyber security and their often-ignored lack of direct relevance as lines of defense.
Israeli biomed companies often ask us about the roles of audit and risk management in their HIPAA security and compliance activities. At the eHealth conference in Israel last week – a lawyer gave a presentation on HIPAA compliance and stated:
If you have to do one thing, make sure everything is documented – your policies and procedures, corrective action you took. Everything. That is your best line of defense.
Security is not an exercise in paperwork.
With all due respect to lawyers – no. Your best line of defense is implementing real security countermeasures in a prioritized way and ensuring that you are doing the right stuff all the time by integrating your HIPAA Security Rule and compliance activities with your internal audit and risk management teams.
Risk does not walk alone
Risk is not an independent variable that can be managed on its own. It is not an exercise in paperwork. Risk is a function of external and internal attackers that exploit weaknesses (vulnerabilities) in people, systems and processes in order to get something of value (assets). The HIPAA Security Rule prescribes in a well-structured way – how to implement the right security countermeasures to protect EPHI – the key assets of your patient customers.
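To make that concrete, here is a minimal sketch of risk as a function of threats, vulnerabilities and assets. The scenarios, probabilities and dollar figures are illustrative assumptions, not values from any real risk analysis:

```python
# Risk = chance the threat materializes, times the chance the vulnerability
# is successfully exploited, times the value of the asset damaged.

def risk(threat_probability, vulnerability, asset_value):
    """Annualized risk in dollars for one threat scenario."""
    return threat_probability * vulnerability * asset_value

scenarios = [
    # (description, P(threat)/yr, P(exploit succeeds), asset value in $)
    ("Stolen laptop with unencrypted EPHI backup", 0.30, 0.90, 500_000),
    ("SQL injection against cloud EPHI store",     0.50, 0.20, 500_000),
    ("Insider emails patient list to competitor",  0.05, 0.80, 200_000),
]

# Rank scenarios by annualized risk, highest first.
for name, p_threat, p_vuln, value in sorted(
        scenarios, key=lambda s: -risk(*s[1:])):
    print(f"${risk(p_threat, p_vuln, value):>10,.0f}/yr  {name}")
```

Ranking threat scenarios this way is what lets you implement countermeasures in a prioritized way instead of spreading budget evenly across paperwork.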
The importance of audit for HIPAA
While audit is not specifically mentioned in the HIPAA Security Rule – security review and risk management are the key pieces – audit is crucial for staying on track over time.
According to the Institute of Internal Auditors, internal auditing is an “independent, objective assurance and consulting activity designed to add value and improve an organization’s operations.” Internal audits provide assurance and consulting services to management in an independent and objective manner. But what does that mean? It means that internal auditors can go into your business operation and determine if your HIPAA security and compliance is a story on paper or a story being acted out in real life.
Audit – necessary but not sufficient
However, neither internal audit nor a corporate risk management function is a line of defense.
HIPAA Security and Privacy Rule compliance involves investigating plausible threats, valuable assets, vulnerabilities and the security countermeasures that mitigate asset vulnerabilities – reducing the risk that results from threats exploiting vulnerabilities to damage assets.
When we frame security defenses in terms of mitigating attacks – we immediately see that neither audit nor corporate risk management fall into the category of countermeasures.
So why are audit and risk management important?
Audit is crucial to assuring that the security portfolio is actually implemented at all levels. Yes – all levels – from the CEO office to the last of the cleaning team. Audit’s strength is also its weakness – auditors generally do not understand the technical side of security, and therefore audit must work hand in glove with the operational and engineering functions in an organization.
Risk management is key to prioritizing implementation of security countermeasures – because – let’s face it – business and engineering operations functions are not qualified to evaluate asset value.
Your HIPAA and Security Rule compliance is not just about paper-work. It’s about getting it right – day in and day out.
Modern system architecture for medical devices is a triangle of Medical device, Mobile app and Cloud services (storing, processing and visualizing health data collected from the device). This creates the need for verifying a chain of trust: patient, medical device, mobile app software, distributed interfaces, cloud service software, cloud service provider.
No get-out-of-jail-free card if your cloud provider is HIPAA compliant.
Medical device vendors must implement robust software security in their device, app and cloud service layers and implement regulatory compliance in people and technical operations. If you are a medical device vendor, you cannot rely on regulatory compliance alone, nor can you rely on your cloud provider being HIPAA compliant. I’ve written here and here how medical devices can be pivot points for attacking other systems including your customers’ and end users’ devices.
Regulatory compliance is not security
There are two notable regulatory standards relating to medical devices and cloud services – the HIPAA Security Rule and the FDA Guidance for Management of cybersecurity in medical devices. This is in addition to European Data Protection requirements and local data security requirements that a particular country such as France, Germany or New Zealand may enforce for protecting health data in the cloud.
The American security and compliance model is unique (and it is typically American in its flavor) – it is based on market forces – not government coercion.
Complying with FDA Guidance is a requirement for marketing your medical device in the US.
Complying with the HIPAA Security Rule is a requirement for covered-entity customers and their business associates to buy your medical device. You can have an FDA 510(K) for your medical device and still be subject to criminal charges if your cloud services are breached. HHS has announced in the Breach Notification Rule and here that they will investigate all breaches of 500 records or more. In addition, FDA may enforce a device recall.
But – compliance is not the same as actual enforcement of secure systems
Verifying the chain of trust
Medical device vendors that use cloud services will generally sign upstream and downstream business associate agreements (BAA) but hold on:
There is an elephant in the room: how do you know that the cloud services are secure? If you have a data breach, you will have to activate your cyber-security insurance policy, not your cloud provider’s sales team.
Transparency of cloud provider security operations varies widely, with some providers being fairly non-transparent and others fairly transparent (Rackspace Cloud is excellent in its levels of openness before and after the sale) in sharing data and incidents with customers.
When a cloud service provider exposes details of its own internal policy and technology, its customers (and your medical device users) will tend to trust the provider’s security claims. I would also require transparency by the cloud service providers regarding security management, privacy and security incident response.
One interesting and potentially extremely valuable initiative is the Cloud Trust Protocol.
The Cloud Trust Protocol (CTP) enables cloud service customers to request and receive data regarding the security of the services they use in the cloud, promoting transparency and trust.
The source code implements a CTP server that acts as a gateway between cloud customers and cloud providers:
- A cloud provider can push security measurements to the CTP server.
- A cloud customer can query the CTP server with the CTP API to access these measurements.
The source code is available here on Github.
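As a sketch of what the customer side of that exchange might look like – the endpoint path, attribute names and JSON shape below are my assumptions for illustration, not the actual CTP API; consult the specification and the GitHub source for the real interface:

```python
# A cloud customer queries a CTP server for security measurements of a
# service it uses. We build the query URL and parse a (hypothetical)
# measurement response.
import json
from urllib.parse import urljoin

def measurement_url(ctp_server, service_id, attribute):
    """Build the URL for one security attribute of one cloud service."""
    return urljoin(
        ctp_server,
        f"serviceViews/{service_id}/attributes/{attribute}/measurements")

# Example of parsing a hypothetical measurement response:
sample_response = json.loads("""
{"attribute": "availability",
 "measurements": [{"value": "99.95", "unit": "percent",
                   "time": "2015-09-01T00:00:00Z"}]}
""")

latest = sample_response["measurements"][-1]
print(f'{sample_response["attribute"]}: {latest["value"]} {latest["unit"]}')
```

The point is not the plumbing – it is that security claims become queryable data instead of marketing copy.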
You are VP R&D or CEO or regulatory and compliance officer at a medical device company.
Your medical devices measure something (blood sugar, urine analysis, facial anomalies, you name it…). The medical device interfaces to a mobile app that provides a User Interface and transfers patient data to a cloud application using RESTful services over HTTPS.
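A minimal sketch of the mobile-app-to-cloud leg might look like this – the endpoint URL, bearer token and JSON fields are hypothetical, invented for illustration:

```python
# The mobile app posts one device reading as JSON to a RESTful endpoint
# over HTTPS. We only build the request here; in a real app you would
# send it and handle errors, retries and certificate validation.
import json
import urllib.request

def build_upload_request(reading, token):
    """Prepare an HTTPS POST carrying one device reading as JSON."""
    body = json.dumps(reading).encode("utf-8")
    return urllib.request.Request(
        "https://api.example-medcloud.com/v1/readings",  # hypothetical endpoint
        data=body,
        headers={
            "Content-Type": "application/json",
            # Never hard-code tokens in a real app; use secure storage.
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_upload_request({"device_id": "BG-1234", "glucose_mg_dl": 104},
                           "demo-token")
print(req.get_method(), req.full_url)
```

Note that HTTPS alone only secures the transport leg – the chain of trust still depends on the app, the cloud software and the cloud provider.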
The Medical device-Mobile app-Cloud storage triad is a common architecture today for many diagnostic, personal well-being and remote patient monitoring indications.
We have numerous clients with the Medical device-Mobile app-Cloud storage system architecture and we help them address 4 key security issues –
- How to ensure that personal data and user authentication data is not stolen from the mobile medical app,
- How to ensure that the mobile medical app is not used as an attack pivot to attack other medical device users and cloud servers,
- How to comply with the HIPAA Security Rule and ensure that health data transferred to the cloud is not breached by attackers who are more than interested in trafficking in your users’ personal health data,
- How to execute effective security incident response and remediation – it’s a HIPAA standard but above all – a basic tenet for information security management.
How effective is your security incident response?
The recent SANS Survey on Security Incident Response covers the challenges faced by incident response teams today—the types of attacks they detect, what security countermeasures they’ve deployed, and their perceived effectiveness and obstacles to incident handling.
Perceived effectiveness is a good way of putting it – because the SANS Survey on Security Incident Response report has some weaknesses.
First – the survey is dominated by large companies: over 50% of the respondents work for companies with more than 5,000 employees and fully 26% work for companies with more than 20,000 employees. Small companies with fewer than 100 employees – which account for almost all medical device companies – are underrepresented in the data.
Second – the SANS survey attempts, unsuccessfully, to reconcile reports by the companies they interviewed that they respond and remediate incidents within 24 hours(!) with reports by the PCI (Payment Card Industry) DSS (Data security standard) Association that retail merchants take over 6 months to respond. This gap is difficult to understand – although it suggests considerable variance in the way companies define incident response and perhaps a good deal of wishful thinking, back-patting and CYA.
Since most medical device companies have fewer than 100 employees – it is unclear whether the SANS findings (which are skewed to large IT security and compliance organizations) are in fact relevant at all to a medical device industry that is moving rapidly to the medical device-App-Cloud paradigm.
3 things a medical device vendor must have for effective incident response
- Establish an IRT. (Contact us and we will be happy to help you set up an IRT and train them on effective procedure and tools). Make sure that the IRT trains and conducts simulations every 3-6 months and above all make sure that someone is home to answer the call when it comes.
- Lead from the front. Ensure that the head of IRT reports to the CEO. In security incident response, management needs to be up front, not lead from behind.
- Detect in real time. Our key concern is cloud server security. Our recommendation is to install OSSEC on your cloud servers. OSSEC sends alerts to a central server where analysis and notification can occur even if the medical device cloud server goes down or is compromised.
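OSSEC does this far better than any homegrown script, but as a toy illustration of the idea – host-based detection of suspicious log activity, with alerts handed off for central notification – consider this sketch. The log format and threshold are illustrative assumptions:

```python
# Scan an auth log for repeated failed logins and flag source IPs that
# exceed a threshold -- the kind of rule an OSSEC decoder applies, with
# alerts forwarded to a central analysis server.
import re

FAILED_LOGIN = re.compile(
    r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def detect_bruteforce(log_lines, threshold=3):
    """Return (ip, count) pairs for source IPs at or over the threshold."""
    failures = {}
    for line in log_lines:
        m = FAILED_LOGIN.search(line)
        if m:
            ip = m.group(2)
            failures[ip] = failures.get(ip, 0) + 1
    return [(ip, n) for ip, n in failures.items() if n >= threshold]

log = [
    "Sep 11 10:01:02 cloud1 sshd[123]: Failed password for root from 203.0.113.9",
    "Sep 11 10:01:05 cloud1 sshd[124]: Failed password for root from 203.0.113.9",
    "Sep 11 10:01:09 cloud1 sshd[125]: Failed password for admin from 203.0.113.9",
]
for ip, count in detect_bruteforce(log):
    # In OSSEC this is where the alert goes to the central server.
    print(f"ALERT: {count} failed logins from {ip}")
```

The central-server design matters: analysis and notification keep working even if the monitored cloud server goes down or is compromised.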
Clients frequently ask us questions like this.
I have a quick question about our HIPAA compliance that we achieved back in early 2013. Since then we have released a couple of new software versions and we are wondering to what extent we need to perform another security and compliance assessment. Please let us know what sort of information you might require to evaluate whether or not a new HIPAA security rule assessment is required.
What about the upcoming changes in HIPAA in 2016?
Any software changes that increase the threat surface to attacks (new ports, new interfaces, new modules that use PHI) would be reason to take a look at your Security Rule compliance. Regarding HIPAA 2016 – OCR is still making plans but it is almost certain they will be doing audits. I believe that due to the sheer size of the program they will start with the biggest hospitals – I do not think that small medical device vendors will be on their radar – although the big guys that had serious adverse events (insulin pumps, implanted cardiac devices) will probably get audited.
Medical devices, mobile apps, Web applications – storing data in the cloud, sharing with hospitals and doctors. How do I comply with HIPAA? What applies to me – the Security Rule, the Privacy Rule or both?
Consider a common use case these days – you’re a medical device vendor and your device stores health information in the cloud. You have a web and/or mobile application that enables doctors/hospitals to access the data from your device as part of their healthcare services. If you operate in the United States, what HIPAA regulations apply? Do you need to comply with the Privacy Rule, the Security Rule or both?
There is a good deal of confusion regarding the HIPAA Privacy and Security Rules and how things work. In this article, we will examine the original content of the HIPAA regulation and explain who needs to do what.
What is the Privacy Rule?
The HIPAA Final Rule (enacted in Jan 2013) has 2 pieces – the Privacy Rule and the Security Rule.
The Privacy Rule establishes standards for the protection of health information. The Security Rule establishes security standards for protecting health information that is held or transferred in electronic form. The Privacy Rule broadly defines ‘‘protected health information’’ as individually identifiable health information maintained or transmitted by a covered entity in any form or medium. The Privacy Rule is located at 45 CFR Part 160 and Subparts A and E of Part 164.
Who needs to comply with the Privacy Rule?
By law, the HIPAA Privacy Rule applies only to covered entities – health plans, health care clearinghouses, and certain health care providers. However, most health care providers and health plans do not carry out all of their health care activities and functions by themselves. Instead, they often use the services of a variety of other persons or businesses – and transfer/exchange health information in electronic form to use these services. These “persons or businesses” are called “business associates”, defined in 45 CFR 164.502(e), 164.504(e), 164.532(d) and (e); see also 45 CFR §§ 160.102 and 164.500.
What is the Security Rule?
The Security Rule operationalizes the Privacy Rule by addressing the technical and non-technical safeguards that the “covered entities” and their business associates must implement in order to secure individuals’ “electronic protected health information” (EPHI). The Security Rule is located at 45 CFR Part 160 and Subparts A and C of Part 164.
Who needs to comply with the Security Rule?
Since it is an operational requirement, the Security Rule (by law) applies to covered entities, business associates and their sub-contractors. While the Privacy Rule applies to protected health information in all forms, the Security Rule applies only to electronic health information systems that maintain or transmit individually identifiable health information. Safeguards for protected health information in oral, written, or other non-electronic forms are unaffected by the Security Rule.
Business associate liability
Section 13404 of the HITECH Act creates direct liability for impermissible uses and disclosures of protected health information by a business associate of a covered entity “that obtains or creates” protected health information “pursuant to a written contract or other arrangement described in § 164.502(e)(2)” and for compliance with the other privacy provisions in the HITECH Act.
Section 13404 does not create direct liability for business associates with regard to compliance with all requirements under the Privacy Rule (i.e., it does not treat them as covered entities). Rather, under the final rule, a business associate is directly liable under the Privacy Rule for uses and disclosures of protected health information that are not in accord with its business associate agreement or the Privacy Rule.
Permitted use of EPHI by a business associate
While a business associate does not have health care operations, it is permitted by § 164.504(e)(2)(i)(A) to use and disclose protected health information as necessary for its own management and administration if the business associate agreement permits such activities, or to carry out its legal responsibilities. Other than the exceptions for the business associate’s management and administration and for data aggregation services relating to the health care operations of the covered entity, the business associate may not use or disclose protected health information in a manner that would not be permissible if done by the covered entity (even if such a use or disclosure is permitted by the business associate agreement).
Taken from the Federal Register
See § 160.103 for HIPAA general definitions used by the law – definitions of business associates, protected health information and more.
- The Privacy Rule establishes standards for the protection of health information.
- The Security Rule establishes operational security standards for protecting health information that is held or transferred in electronic form.
- The Security Rule applies only to electronic health information systems that maintain or transmit individually identifiable health information. Safeguards for protected health information in oral, written, or other non-electronic forms are unaffected by the Security Rule.
- Business associates do not have direct liability for compliance with all requirements under the Privacy Rule (i.e., the rule does not treat them as covered entities). A business associate is directly liable under the Privacy Rule for uses and disclosures of protected health information that are not in accord with its business associate agreement or the Privacy Rule.
Before you start spending money on regulatory consultants get back to basics. Do you or do you not need to comply with the HIPAA Security Rule? If you do – what is the very first thing you should do? In this post – we will get back to basics with 3 practical ways of complying and reducing your regulatory risk.
I specialize in cyber security and privacy compliance consulting for medical device companies in Israel. While this may sound like a niche, it is actually a very nice and not so small niche – with over 700 biomed vendors and a market growing 7-8% / year.
Israeli biomed startups are incredibly innovative and it’s fun working with smart people. Here are 3 ways to improve your HIPAA risk analysis in just a few minutes:
Check # 1 – Maybe you are not a HIPAA Business associate
If you are a medical device vendor and you connect to a hospital network or share data with doctors, you are automatically a BA – a business associate according to the HIPAA Business Associate definition. If you are a BA – you need to comply with the HIPAA Security Rule. But maybe you are not a BA.
By law, the HIPAA Privacy Rule applies only to covered entities – health plans, health care clearinghouses, and certain health care providers. A “business associate” is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity. If an entity does not meet the definition of a covered entity or business associate, it does not have to comply with the HIPAA Rules. See definitions of “business associate” and “covered entity” at 45 CFR 160.103.
So – if you developed a consumer mobile app that monitors stress with a cool dashboard to visualize data stored in the cloud enabling your users to reduce their stress and compare themselves with other people like them; so long as the software doesn’t share data with a covered entity (like their doctor) – you are not a BA. Check.
Check # 2 – Maybe you don’t even need to comply with HIPAA
HIPAA applies to storage of EPHI (electronic protected health information). Simply put – EPHI is the combination of PII (personally identifiable information – such as a name, email and social security number) and healthcare / clinical data.
So – getting back to our hypothetical mobile medical app for personal stress tracking, let’s suppose that the stress data includes heart rate and respiratory rate in addition to how many times/day the consumer called their girlfriend to make sure she still loves them (when they are freaking out with stress before a big exam). If your mobile medical app for stress management doesn’t store personal information with the clinical data, then you don’t have to comply with HIPAA because you don’t create, store or transmit EPHI in the cloud. Check.
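That check can be sketched in code. The field lists below are illustrative – the Privacy Rule’s actual set of 18 identifiers is longer – but the logic is the point: no identifiers stored with the clinical data, no EPHI:

```python
# A record is EPHI only when identifying data (PII) is stored together
# with clinical data. Field names here are illustrative assumptions.
PII_FIELDS = {"name", "email", "ssn", "phone", "address", "date_of_birth"}
CLINICAL_FIELDS = {"heart_rate", "respiratory_rate", "stress_score", "diagnosis"}

def is_ephi(record):
    """True if the record combines identifying and clinical data."""
    fields = set(record)
    return bool(fields & PII_FIELDS) and bool(fields & CLINICAL_FIELDS)

# Anonymous stress readings: clinical data but no identifiers -- not EPHI.
print(is_ephi({"device_id": "A1", "heart_rate": 88, "stress_score": 7}))  # False
# Same readings stored with the user's email -- EPHI, HIPAA applies.
print(is_ephi({"email": "user@example.com", "heart_rate": 88}))           # True
```

In practice the design lesson is to keep identifiers and clinical data in separate stores, linked only by a pseudonymous device or account ID.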
Check # 3 – Using contractors for software development? Vet them and oversee them
There is commonly-used expression in Hebrew – “launch and forget” (שגר ושכח). I believe that this is a Hebrew translation of the American English term “Fire and forget” that refers to a type of missile guidance which does not require further guidance after launch.
When it comes to contractors you do not want to “launch and forget”.
Maybe you are a BA and you have to comply with HIPAA. But just because the HIPAA Security Rule does not have a standard safeguard for vetting contractors, can you afford to gloss over this area?
This is a big one, boys and girls. If you use contractors for code development, make sure you thoroughly vet them before engaging with them on Upwork. I am not talking about quality of work – it is a given that you need someone highly competent in whatever problem domain you are trying to crack (your girlfriend’s brother-in-law may not be your best fit). I am talking about the threat of a contractor stealing your code, dropping Android malware into your app, or enabling tap-jacking to steal personal data.
Even if they don’t steal your code, consider the threat of your contractor working for a competitor, leaking your IP or being hired by a business partner who click-jacks your entire business model.
It’s tempting to work with software developers in Ukraine or China but be careful. Consider the case of the Russian programmers who wrote code for U.S. military communications systems, or the DOJ accusing the firm that vetted Snowden of faking 665,000 background checks.
In the Federal space, it is sad but true that there is a huge Federal procurement machine that enables these kinds of attacks to happen because of organizational complexity (a nice way of saying that accountability falls between the cracks) and greed (a not so nice way of saying that big contracts are a challenge to morality and governance).
US Federal agency purchasing managers who sign purchase orders without appropriate contractor vetting and oversight are not a justification for a privately-held Israeli medical device company to sin…
The calls for more legislation are a knee-jerk reaction to the horses having left the barn years ago but when common sense and good business practices yield to greed then perhaps we do need a tail wind. Vetting software development contractors should be a standard requirement in the core security guidance such as HIPAA, PCI and FDA cyber security but in the meantime – don’t wait for the US Federal Government to tell you to vet your contractors and oversee their work. Put an accountability clause into your contracts.
Thoughts for Yom Kippur – the Jewish day of atonement – coming up next Wed.
Security on modern operating systems (Windows, OS/X, iOS, Android, Linux) is getting better all the time – but Android using SELinux and MAC (mandatory access control) doesn’t make for catchy, social-media-sticky news items.
A client (a good one) once told me that people never remember your successes, only your failures. (He also believed that all software developers are innately incapable of telling the truth but that’s another story).
The corollary to this notion of failure-skew in the business (and security) world is media reporting. Consider the media emphasis on reporting violent and/or negative events. It’s not a hot news item to say that 39% of Israeli Arabs are proud to be Israeli, nor is it newsworthy to report that 29% are very proud. The world (Middle East included) is actually a much better place than it seems when not viewed through the lens of social media news reporting and re-purposing (I’m not sure what the correct term for the Huffington Post is so I’ll just use the word repurpose).
FB and Twitter create discussion threads, not examination-of-empirical-data threads. Discussion is easier, more fun and cheaper than collecting data and examining its quality.
In addition, radical voices are far more interesting than statistics. Who cares that according to World Bank statistics, in 1990 there were 1.91 billion people living on less than $1.25 a day and in 2011 it was just one billion? Radical voices (amusingly adopted by the US President) will continue to blame the rise in Islamic and Iranian terror on poverty even though it emanates from the wealthiest countries in the world.
Jews all over the world are up to bat this coming Wed on Yom Kippur. We can bemoan how bad things are and what a terrible President or PM we all have and how our society is falling apart, or we can take a little piece of our own life and fix it. Send thank-you notes to people. Patch your systems once/week. That’s a good start. And pretty easy to do.
Now what does this have to do with software security you ask?
Our clients read social media. They read about zero-days and they get all excited and then do nothing.
Yet another serious Android security issue was publicized this week, with the latest exploit rendering devices “lifeless,” and said to affect more than half of units currently on the market. Latest Android security exploit could leave more than half of current devices ‘dead’ & unusable
Now let’s check out that URL – it’s from Apple Insider. Hmm – somebody has an ax to grind, I bet.
So this year – I mean this Wednesday – don’t wring your hands. Do a security assessment on your systems and prioritize 1 thing, find that one weakest link in your system and harden it up.
Today, Friday, is the 14th anniversary of the Al Qaeda attack on the US in New York on 9/11/2001.
The world today is more connected, more always-on, more accessible…and more hostile. There are threats from Islamic terror, identity theft, hacking for pay, custom spyware, mobile malware, money laundering and corporate espionage. For those of us working in the fields of risk management, security and privacy, these are all complex challenges in the task of defending a business.
The biggest challenge is the divide between IT and management. It’s similar to the events leading up to 9/11: the FBI investigated and the CIA analyzed, but the two sides never discussed the threats and the potential damage of Saudis learning to fly airplanes but not how to land them.
A chain of risk analysis
The HIPAA Final Rule creates a chain of risk analysis and compliance from the hospital, downstream to the business associates who handle / process PHI for the hospital and sub-contractors who handle / process PHI for the business associate.
And so on.
The first thing an organization needs to do is a risk analysis. How important is a risk analysis? Ask Cancer Care Group, who just paid $750,000 to settle non-compliance with the Security Rule.
$750,000 HIPAA settlement emphasizes the importance of risk analysis and device and media control policies
Cancer Care Group, P.C. agreed to settle potential violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules with the U.S. Department of Health and Human Services (HHS), Office for Civil Rights (OCR). Cancer Care paid $750,000 and will adopt a robust corrective action plan to correct deficiencies in its HIPAA compliance program. Cancer Care Group is a radiation oncology private physician practice, with 13 radiation oncologists serving hospitals and clinics throughout Indiana.
On August 29, 2012, OCR received notification from Cancer Care regarding a breach of unsecured electronic protected health information (ePHI) after a laptop bag was stolen from an employee’s car. The bag contained the employee’s computer and unencrypted backup media, which contained the names, addresses, dates of birth, Social Security numbers, insurance information and clinical information of approximately 55,000 current and former Cancer Care patients.
OCR’s subsequent investigation found that, prior to the breach, Cancer Care was in widespread non-compliance with the HIPAA Security Rule. It had not conducted an enterprise-wide risk analysis when the breach occurred in July 2012. Further, Cancer Care did not have in place a written policy specific to the removal of hardware and electronic media containing ePHI into and out of its facilities, even though this was common practice within the organization. For more information see the HHS Press Release from Sep 2, 2015 $750,000 HIPAA settlement emphasizes the importance of risk analysis and device and media control policies
Risk analysis is the first step in meeting Security Rule requirements
I have written here, here, here and here about the importance of risk analysis as a process of understanding the value of your assets, the impact of your threats and the depth of your vulnerabilities in order to implement the best security countermeasures.
The HIPAA Security Rule begins with the analysis requirement in § 164.308(a)(1)(ii)(A). Conducting a risk analysis is the first step in identifying and implementing safeguards that comply with and carry out the standards and implementation specifications in the Security Rule. Therefore, a risk analysis is fundamental to the entire process of HIPAA Security Rule compliance, and must be understood in detail by the organization in order to specifically address safeguards and technologies that will best protect electronic health information. See Guidance on Risk Analysis Requirements under the HIPAA Security Rule. Neither the HHS Guidance nor NIST specify a methodology – we have been using the Practical Threat Analysis methodology for HIPAA Security risk analysis with Israeli medical device companies since 2009 and it works smoothly and effectively helping Israeli medical device vendors comply and implement robust security at the lowest possible cost.
§ 164.308(a)(1)(ii)(A) Risk Analysis (R1): As part of the risk management process, the company performs information security risk analysis for its services (see company procedure XYZ) analyzing software application security, data security and human related vulnerabilities. Risk analysis is performed according to the Practical Threat Analysis methodology.
1 A refers to addressable safeguards, and R refers to required safeguards.
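To illustrate how a PTA-style analysis drives prioritization, here is a sketch that ranks countermeasures by annual risk reduction per dollar of cost. The countermeasures and figures are invented for illustration, not output of a real PTA model:

```python
# Rank security countermeasures by how much annual risk each dollar of
# spend removes -- the most cost-effective mitigation comes first.
countermeasures = [
    # (name, annual risk reduced in $, annual cost in $)
    ("Full-disk encryption on laptops",        120_000,  5_000),
    ("Enterprise-wide risk analysis refresh",   60_000, 15_000),
    ("Media removal policy and training",       40_000,  3_000),
]

def effectiveness(cm):
    """Dollars of annual risk removed per dollar of annual cost."""
    name, risk_reduced, cost = cm
    return risk_reduced / cost

for name, risk_reduced, cost in sorted(
        countermeasures, key=effectiveness, reverse=True):
    print(f"{risk_reduced / cost:5.1f}x  {name}")
```

Note how the ranking echoes the Cancer Care case: encryption and media-control measures can be far cheaper than the breach they prevent.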