
Problems in current Electronic Health Record systems

Software Associates specializes in helping medical device and healthcare technology vendors achieve HIPAA compliance and improve the data and software security of their products in hospital and mobile environments.

As I noted here and here, the security and compliance industry is no different from other industries in having fashion and trends.  Two years ago, PHR (Personal Health Records) systems were fashionable and today they’re not – probably because the business model for PHR applications is unclear and unproven.

Outside of the personal fitness and weight-loss space, it’s doubtful that consumers will pay money for a Web 2.0 PHR application service to help them store personal health information, especially when they are already paying their doctor/insurance company/HMO for services. The bad news for PHR startups is that PHR is not really an app that runs well on Facebook, and on the other hand, the average startup is not geared for big 18-24 month sales cycles with HCPs (health care providers) and insurance companies. But really, business models are the least of our problems.

There are 3 cardinal issues with the current generation of EHR/EMR systems.

  1. EHR (Electronic Health Records) systems address the business IT needs of government agencies, hospitals, organizations and medical practices, not the healthcare needs of patients.
  2. PHR (Personal Health Records) systems are not integrated with the doctor-patient workflow.
  3. EHR systems are built on natural language, not on patient issues.

EHR – Systems are focused on business IT, not patient health

EHR systems are enterprise software applications that serve the business IT elements of healthcare delivery for healthcare providers and insurance companies; things like reducing transcription costs, saving on regulatory documentation, electronic prescriptions and electronic record interchange.1

This clearly does not have much to do with improving patient health and quality of life.

EHR systems also store large volumes of information about diseases and symptoms in natural language, codified using standards like SNOMED-CT2. Codification is intended to serve as a standard for system interoperability and enable machine-readability and analysis of records, leading to improved diagnosis.
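To make the codification point concrete, here is a rough Python sketch of what a single codified observation might look like alongside its narrative text. The field names are hypothetical, not a real EHR schema, and the SNOMED-CT codes are shown for illustration only:

```python
# Sketch: a natural-language observation with SNOMED-CT codification
# attached. Field names are hypothetical; codes are illustrative.
observation = {
    "patient_id": "P-1024",
    "narrative": "Patient complains of persistent headache and dizziness.",
    "coded_findings": [
        {"system": "SNOMED-CT", "code": "25064002", "display": "Headache"},
        {"system": "SNOMED-CT", "code": "404640003", "display": "Dizziness"},
    ],
}

def machine_readable_findings(obs):
    """Return only the coded part of the record - the part a machine
    can actually analyze, as opposed to the free-text narrative."""
    return [f["code"] for f in obs["coded_findings"]]

print(machine_readable_findings(observation))
```

The codes travel between systems; the narrative, which carries most of the clinical nuance, does not become any more analyzable for having codes bolted onto it.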

However, it is impossible to achieve a meaningful machine diagnosis of natural language interview data that was uncertain to begin with, and not collected and validated using evidence-based methods3.

PHR – does not improve the quality of communications with the doctor

PHR (Personal Health Records) systems, on the other hand, are intended to help patients keep track of their personal health information. The definition of a PHR is still evolving. For some, it is a tool to view patient information in the EHR. Others have developed personal applications such as appointment scheduling and medication renewals. Some solutions such as Microsoft HealthVault and PatientsLikeMe allow data to be shared with other applications or specific people.

PHR applications have a lot to offer the consumer, but even award-winning applications like Epocrates that offer “clinical content” are not integrated with the doctor-patient workflow.

“Today, the health care system does not appropriately recognize the critical role that a patient’s personal experience and day-to-day activities play in treatment and health maintenance. Patients are experts at their personal experience; clinicians are experts at clinical care. To achieve better health outcomes, both patients and clinicians will need information from both domains – and technology can play a key role in bridging this information gap.”4

EHR – builds on natural language, not on patient issues

When a doctor examines and treats a patient, he thinks in terms of “issues”, and the result of that thinking manifests itself in planning, tests, therapies, and follow-up.

In current EHR systems, when a doctor records an encounter, he records planning, tests, therapies, and follow-up, just not under the main entity, the issue. The next doctor that sees the patient needs to read about the planning, tests, therapies, and follow-up and then mentally reverse-engineer the process to arrive at which issue is ongoing. Again, he manages the patient according to that issue and records information, but not under the main “issue” entity.

Other actors such as public health registries and epidemiological researchers go through the same process. They all have their own methods of churning through planning, tests, therapies, and follow-up, to reverse-engineer the data in order to arrive at what the issue is.
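A rough Python sketch of the difference, using hypothetical record structures: in today’s flat model the issue must be guessed from the recorded artifacts, while in an issue-centric model it is stated directly as the main entity:

```python
# Hypothetical schemas, for illustration only.
# Flat model: encounter artifacts with no "issue" entity - the reader
# must reverse-engineer what is actually being treated.
flat_encounter = [
    {"type": "test", "detail": "HbA1c 8.1%"},
    {"type": "therapy", "detail": "metformin 500mg"},
    {"type": "follow-up", "detail": "3 months"},
]

# Issue-centric model: the same artifacts hang off the clinical issue
# itself, so no reverse-engineering is needed.
issue_centric = {
    "issue": "Type 2 diabetes",
    "status": "active",
    "plan": {
        "tests": ["HbA1c 8.1%"],
        "therapies": ["metformin 500mg"],
        "follow_up": "3 months",
    },
}

def guess_issue(records):
    """Crude stand-in for the mental reverse-engineering the text
    describes: infer the issue from circumstantial evidence."""
    if any("metformin" in r["detail"] for r in records):
        return "Type 2 diabetes?"  # a guess, not a recorded fact
    return "unknown"

print(guess_issue(flat_encounter))   # a guess with a question mark
print(issue_centric["issue"])        # stated directly
```

Every consumer of the flat record (the next doctor, a registry, a researcher) repeats the `guess_issue` step independently, which is exactly the waste the text describes.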

This ongoing process of “reverse-engineering” is the root cause for a series of additional problems:

  • Lack of overview of the patient
  • Insufficient connection to clinical guidelines; no indication of which guidelines to follow or which have been followed
  • No connection between prescriptions and diseases, except circumstantial
  • No ability to detect and warn about contraindications
  • No archiving or demoting of less important and solved problems
  • Lack of overview of the patient’s status, only a series of historical observations
  • In most systems, insufficient search capabilities
  • An excess of textual data that cannot possibly be read by every doctor at every encounter
  • Confidentiality borders that are very hard to define
  • Very rigid and closed interfaces, making extension with custom functionality very difficult

4 Patricia Brennan, “Incorporating Patient-generated Data in meaningful use of HIT” http://healthit.hhs.gov/portal/server.pt/

Tell your friends and colleagues about us. Thanks!
[Figure: Federal Healthcare Chart]

Healthcare data interoperability pain

Data without interoperability = pain.

What is happening in the US healthcare space is fascinating, as stimulus funds (or what they call in the Middle East, “baksheesh”) are being paid to doctors to acquire an Electronic Health Records system that demonstrates “meaningful use”. The term “meaningful use” is vaguely defined in the stimulus bill as programs that can enable data interchange, e-prescribing and quality indicators.

Our hospital recently spent millions on an EMR that does not integrate with any outpatient EMR. Where is the data exchanger and who deploys it? What button is clicked to make this happen? My practice is currently changing its EMR. We are paying big bucks for partial data migration. All the assurances we had about data portability when we purchased our original EMR were exaggerated to make a sale. Industry should have standards. In construction there are 2×4’s, not 2×3.5’s.
Government should not impinge on privacy and free trade, but they absolutely have a key role in creating standards that ensure safety and promote growth in industry.
Read more here: Healthcare interoperability pains

Mr Obama’s biggest weakness is that he has huge visions but can’t be bothered with the details, so he lets his team and party members hack out implementations – which is why his healthcare initiatives are on very shaky footing, as the above doctor aptly noted. But perhaps something more profound is at work. The stimulus bill does not mention standards as a prerequisite for EHR, and I assume that the tacit assumption (like many things American) is that standards will “happen” due to the power of free markets. This is at odds with Mr. Obama’s political agenda of big socialistic government with central planning. As the doctor said: “government absolutely (must) have a key role in creating standards that ensure safety and promote growth in industry”. The expectation that this administration set is that they will take care of things, not that free markets will. In the meantime, standards are being developed by private-public partnerships like HITSP – enabling healthcare interoperability:

The Healthcare Information Technology Standards Panel (HITSP) is a cooperative partnership between the public and private sectors. The Panel was formed for the purpose of harmonizing and integrating standards that will meet clinical and business needs for sharing information among organizations and systems.

It’s notable that HITSP stresses that its mission is meeting clinical and business needs for sharing information among organizations and systems. The managed-care organizations call people consumers so that they don’t have to think of them as patients.

I have written here, here and here about the drawbacks of packaging Federal money, defense contractors and industry lobbies as “private-public partnerships”.

You can give a doctor $20k of Federal money to buy EMR software, but if it doesn’t interact with the most important data source of all (the patient), everyone’s ROI (the doctor, the patient and the government) will approach zero.

Vendor-neutral standards are key to interoperability. If the Internet were built to HITSP style standards, there would be islands of Internet connectivity and back-patting press-releases, but no Internet.

The best vendor-neutral standards we have today are created by the IETF – a private group of volunteers, not by a “private-public partnership”.

The Internet Engineering Task Force (IETF) is a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet. It is open to any interested individual. The IETF Mission Statement is documented in RFC 3935.

However – vendor-neutral standards are a necessary but insufficient condition for “meaningful use” of data.  There also has to be fast, cheap and easy to use access in the “last mile”.  In healthcare – the last mile is the patient-doctor interaction.

About 10-15 years ago, interoperability in the telecommunications and B2B spaces was based on an EDI paradigm with centralized messaging hubs for system-to-system document interchange. As mobile evolved into 3G, cellular applications made a hard shift to a distributed paradigm with middleware-enabled interoperability from a consumer handset to all kinds of 3G services – location, games, billing, accounting etc. – running at the operator and its content partners.

The healthcare industry is still at the EDI stage of development, as we can see from organizations like WEDI and HIMSS:

The Workgroup for Electronic Data Interchange (WEDI)

Improve the administrative efficiency, quality and cost effectiveness of healthcare through the implementation of business strategies for electronic record-keeping, and information exchange and management...provide multi-stakeholder leadership and guidance to the healthcare industry on how to use and leverage the industry’s collective technology, knowledge, expertise and information resources to improve the administrative efficiency, quality and cost effectiveness of healthcare information.

What happened to quality and effectiveness of patient-care?

It is not about IT and cost-effectiveness of information (whatever that means). It’s about getting the doctor and her patient exactly the data they need when they need it.   That’s why the doctor went to medical school.

Compare EDI-style message-hub-centric protocols to RSS/Atom on the Web, where any Web site can publish content and any endpoint (browser or tablet device) can subscribe easily. As far as I can see, the EHR space is still dominated by the “message hub, system-system, health-provider to health-provider to insurance company to government agency” model, while in the meantime, tablets are popping up everywhere with interesting medical applications. All these interesting applications will not be worth much if they don’t enable the patient and doctor to share the data.

Imagine the impact of IETF style standards, lightweight protocols (like RSS/Atom) and $50 tablets running data sharing apps between doctors and patients.

Imagine vendor-neutral, standard middleware for EHR applications that would expose data to patients and doctors using an encrypted Atom protocol – very simple, very easy to implement, easy to secure and with very clear privacy boundaries. Perhaps not my first choice for sharing radiology data, but a great way to share vital signs and significant events like falls and BP drops.
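As a sketch of how lightweight this could be, here is a minimal Atom feed of vital-sign events built with Python’s standard library. The patient ID and event payloads are invented, and the encryption layer proposed above is deliberately omitted:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)  # emit Atom as the default namespace

def vitals_feed(patient_id, vitals):
    """Build a minimal Atom feed of vital-sign events.
    Hypothetical payloads; encryption is out of scope for this sketch."""
    feed = ET.Element(f"{{{ATOM}}}feed")
    ET.SubElement(feed, f"{{{ATOM}}}title").text = f"Vitals for {patient_id}"
    ET.SubElement(feed, f"{{{ATOM}}}updated").text = (
        datetime.now(timezone.utc).isoformat()
    )
    for v in vitals:
        entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}title").text = v["event"]
        ET.SubElement(entry, f"{{{ATOM}}}content").text = v["value"]
    return ET.tostring(feed, encoding="unicode")

feed_xml = vitals_feed("P-1024", [
    {"event": "blood pressure", "value": "90/60 (significant drop)"},
    {"event": "fall detected", "value": "patient fell at home"},
])
print(feed_xml)
```

Any endpoint that can parse XML can subscribe to such a feed; no message hub, no system-to-system integration project.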

This would be the big game changer  for the entire healthcare industry.  Not baksheesh. Not EDI. Not private-public partnerships.


HIPAA and cloud security

In almost every software security assessment that we do of a medical device, the question of HIPAA compliance and data security arises. The conversation often starts with a client asking the question – “I hear that Amazon AWS is HIPAA compliant. Isn’t that all I need?”

Well – not exactly. Actually, probably not.

As Craig Balding pointed out on his blog post Is Amazon AWS Really HIPAA Compliant Today? there are some basic issues with AWS itself.

There is no customer accessible AWS API call audit log
In other words, you have no way to know if, when and from where (source IP) your AWS key was used to make API calls that may affect the security posture of your AWS resources. (S3 is an exception, but only if you turn on logging, which is off by default.)

There is no way to restrict the source IP address from which the AWS API key can be used.
The AWS API interface can be used from any source IP at any time (and as above, you have no audit trail for EC2 API calls). This is the equivalent of exposing your compute and storage management API to the entire planet.

Each AWS account is limited to a single key, so unauthorized disclosure of the key results in a total breakdown of security.
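To make the S3 caveat concrete: once access logging is turned on, the log lines do record the source IP and operation that the API itself won’t show you. Here is a Python sketch of pulling the audit-relevant fields out of a log line; the sample line and field layout follow AWS’s documented S3 server-access-log format, but treat the details as illustrative:

```python
import re

# S3 server access logs are space-separated with a bracketed timestamp.
# The leading fields (per AWS's documented format) are:
# owner, bucket, [time], remote IP, requester, request ID, operation, key.
LOG_RE = re.compile(
    r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] '
    r'(?P<remote_ip>\S+) (?P<requester>\S+) (?P<request_id>\S+) '
    r'(?P<operation>\S+) (?P<key>\S+)'
)

def audit_fields(line):
    """Extract the who-from-where-did-what fields from one log line."""
    m = LOG_RE.match(line)
    return (m.group("remote_ip"), m.group("operation")) if m else None

# Sample line (values invented, layout follows the documented format).
sample = ('79a59df900b949e5 mybucket [06/Feb/2011:00:00:38 +0000] '
          '192.0.2.3 79a59df900b949e5 3E57427F3EXAMPLE '
          'REST.GET.OBJECT photos/scan.jpg "GET /photos/scan.jpg HTTP/1.1" 200')

print(audit_fields(sample))  # -> ('192.0.2.3', 'REST.GET.OBJECT')
```

This is a poor substitute for a real API-call audit trail, but at the time it was the only per-request source-IP evidence available, and only for S3.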

It only gets worse.
Web services and storage are just a small part of data security.

Even if Amazon AWS were perfect in terms of its data security countermeasures, there would still be plenty of opportunity for a data breach of PHI.

There are multiple attack vectors from the perspective of HIPAA compliance and PHI data security. The following schematic (inspired by my colleague Michel Godet) gives you an idea of how an attacker can use any combination of no less than 15 attack vectors to abuse and steal PHI:

There are potential data security vulnerabilities in the client layer, transmission layer, platform layer (Operating system) and cloud services (Amazon AWS in our example).

Note that the vulnerabilities enabling a PHI data breach exist not only inside each layer but, in particular, in the system interfaces between layers.
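A back-of-the-envelope sketch of why the attack surface multiplies: count one attack surface inside each layer plus one at each interface between adjacent layers. The layer names follow the text; the count is illustrative, not a formal threat model:

```python
# Layers as named in the text, client to cloud.
layers = ["client", "transmission", "platform (OS)", "cloud services"]

# One attack surface inside each layer...
in_layer = [f"vulnerability inside {layer}" for layer in layers]

# ...plus one at each interface between adjacent layers.
between = [f"interface: {a} <-> {b}" for a, b in zip(layers, layers[1:])]

surface = in_layer + between
for s in surface:
    print(s)
print(len(surface), "distinct places to probe, before counting specific exploits")
```

Even this crude count shows that auditing only the cloud provider covers a single entry in the list.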

Let’s take a specific example.

Consider a remote medical diagnostic service that collects information and transmits it over secure channels (HTTPS, for the sake of argument) to a centralized facility for processing and diagnosis. The entire transmission stream can be secure, but if the processing and diagnosis facility uses Microsoft IIS as an interface, an attacker can target the IIS Web server, create denial of service, and exploit IIS7 and Windows operating system vulnerabilities to gain access to the machine itself and the data in motion, and possibly compromise the internal network.

A discussion of HIPAA compliance needs to include a comprehensive threat analysis of the entire supply chain of data processing and not just limit itself to the cloud services that store electronic medical records.

For further reading, see the resources below on HIPAA compliance with Amazon Web Services and the work Software Associates has done on threat modeling.

