The ultimate technical challenge of interoperability is not defining the standards but getting the standards adopted and used. David Hancock, healthcare executive advisor at InterSystems, highlights the biggest technical issues in adoption and how they should be overcome.
Those of you who have read my three previous blogs will know that one of the main barriers to successfully implementing interoperability standards is the variability of endpoints. Systems in health and social care have different interoperability capabilities, and many are already legacy systems that will never be updated.
Interoperability – we’re not in Kansas anymore
Interoperability has moved on considerably from the late 1980s, when HL7 v2 messaging was defined. Many people still regard interoperability as the support of HL7 v2 messaging, but today we are no longer in that familiar place where people were, perhaps, far more comfortable – “We’re not in Kansas anymore”, to borrow a phrase!
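To see why HL7 v2 feels like another era, it helps to look at the wire format itself: pipe-delimited segments, one per line. Below is a toy sketch of parsing such a message; the sample ADT (admission) message is hypothetical, and a real parser would also handle escaping, field repeats and the MSH segment’s special rules.

```python
def parse_hl7_v2(message):
    """Split an HL7 v2 message into {segment_id: [list of field lists]}."""
    segments = {}
    # HL7 v2 separates segments with carriage returns and fields with pipes.
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

# Hypothetical ADT^A01 (admission) message, for illustration only.
sample = (
    "MSH|^~\\&|SENDER|FAC_A|RECEIVER|FAC_B|20190101120000||ADT^A01|MSG0001|P|2.4\r"
    "PID|1||123456^^^NHS||SMITH^JOHN||19700101|M"
)
parsed = parse_hl7_v2(sample)
print(parsed["PID"][0][2])  # patient identifier: 123456^^^NHS
```

The point is not the parsing itself but how little structure the format carries: the meaning of field 3 in a PID segment lives in the specification and in local agreements, not in the message.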
Today, interoperability goes beyond just messaging as it becomes more sophisticated, allowing information to be reasoned on. As a result, it includes three additional types of information exchange:
– Document exchange – e.g. Transfers of Care, Referrals, etc.
– REST APIs – Web services that provide interoperability between computer systems on the Internet.
– Services – e.g. support of more complex architectural patterns such as IHE XDS.b, or workflow patterns such as publish and subscribe models.
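The last of these, publish and subscribe, can be sketched very simply. The in-process event bus below illustrates the pattern only; the topic name and payload are hypothetical, and a real deployment would use proper messaging infrastructure rather than a Python dictionary.

```python
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to be invoked whenever `topic` is published."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver `payload` to every subscriber of `topic`."""
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
received = []
# A hypothetical downstream system subscribes to admission events.
bus.subscribe("patient.admitted", received.append)
bus.publish("patient.admitted", {"nhs_number": "9000000009", "ward": "A1"})
print(received)
```

The design point is decoupling: the publisher does not know which systems consume the event, which is exactly what makes the pattern attractive across many loosely connected health and care systems.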
Driving the future of healthcare
Irrespective of whether the comparisons are fair, many of our expectations are set by our experience in other parts of our lives. The banking system, mobile phone networks and global supply chains all work together pretty seamlessly, and we expect the same of healthcare.
Arguably, healthcare is more complex. What makes it different is the shape and scope of the data itself – reflecting both the complexity of human physiology, disease, and related processes, and the very distributed, specialised and oftentimes siloed nature of much of health and social care practice. A clear example is the complexity of medication management and dose timing instructions, in comparison to, say, financial ledgers.
We can break the technical barriers into two parts: defining realistic standards that meet the three types of interoperability described above, and then driving their adoption. There are huge challenges in this technical work that must now be overcome quickly – this is next-generation integration. Our standards and systems must negotiate security, and access and return information in a common, structured way using open standard APIs.
Interoperability – failing fast, learning quickly
In this complex landscape where demand is high, the possibilities for error increase. I have seen numerous initiatives try to move fast but miss out key stages, so they become unstuck later on. To be successful we need to accept failing fast and learning quickly, whilst keeping an eye on our objective: usability and adoption.
The process is clear:
– Identify interoperability requirements and a first of type to use it
– Define a standard information model
– Implement the information model in a computable format including the definition of APIs
– Test using test harnesses, hackathons and connectathons
– Implement in first of type and review clinical assurance and validation
Identify interoperability requirement
Within INTEROPen we work with NHS England, NHS Digital (and now NHSX) and the rest of the service to identify the clinical needs and define the use cases. We need a compelling requirement and an organisation prepared to implement it as a first of type. This allows the work to be commissioned by whoever wants to commission it – NHS England, a Local Health and Care Record Exemplar (LHCRE), a Global Digital Exemplar (GDE), and so on.
Define a standard information model
We must define a standard information model for the scope of this work, and it must reference, and relate to, previous information modelling. When talking about concepts such as medications, allergies or encounters, we must be able to ensure we are all talking about the same thing and that they are defined consistently across use cases. If the data is not modelled in the same way, then systems won’t be able to talk to each other, because information is organised and stored differently. In the UK, the best organisation to do this is the Professional Records Standards Body (PRSB). In this way it is clinically driven and validated.
Implement in a computable format
The next challenge is to implement the model and accompanying standards using HL7 FHIR and, where relevant, clinical coding schemes such as the Dictionary of Medicines and Devices (dm+d) and SNOMED, in a computable format that allows data to be exchanged.
Right now, there is still information in patient records that is free text, whether prescribing information, medical notes or otherwise. Standardising will allow NHS and social care systems to classify and code diagnoses, procedures, events, medications, observations, vital signs… the list goes on. All systems must be able to accept the standards, which can then allow information to flow at scale – not only consuming data from other systems, but also processing and reasoning on it.
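To make “computable” concrete, here is a sketch of a coded diagnosis expressed as a FHIR-style Condition resource carrying a SNOMED CT coding. The SNOMED code 38341003 (hypertensive disorder) is a real concept, but the resource as a whole is hand-built for illustration and not validated against any published profile.

```python
import json

condition = {
    "resourceType": "Condition",
    "code": {
        "coding": [
            {
                # SNOMED CT code system URI, as used in FHIR codings.
                "system": "http://snomed.info/sct",
                "code": "38341003",
                "display": "Hypertensive disorder",
            }
        ],
        # Free text survives alongside the code, so both humans and
        # systems can work from the same record.
        "text": "High blood pressure",
    },
    "subject": {"reference": "Patient/example"},
}

print(json.dumps(condition, indent=2))
```

Because the concept is coded rather than buried in free text, a receiving system can do more than display it – it can trigger decision support, aggregate across populations, or reconcile it against other records.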
Test using test harnesses, hackathons and connectathons
Setting up test harnesses and running hackathons is key to starting to identify what works and, importantly, is our opportunity to fail. Software can be run against test harnesses to ensure it works, but when you have a hackathon – where real software from different vendors is brought together and integrated – you can tell if it really is working. The rationale is that issues often don’t emerge until developers dive into the detail and get stumped or confused. Likewise, end users may not fully articulate their requirements until presented with something tangible they can touch and feel which doesn’t quite work as they imagined.
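For a flavour of what a test harness does, the sketch below runs a couple of structural checks over a resource before it ever reaches a connectathon. The rules shown are illustrative assumptions, not any official validation suite.

```python
def check_resource(resource):
    """Return a list of validation errors; an empty list means the check passed."""
    errors = []
    if "resourceType" not in resource:
        errors.append("missing resourceType")
    # Hypothetical harness rule: every coding must name its code system.
    for coding in resource.get("code", {}).get("coding", []):
        if not coding.get("system"):
            errors.append("coding entry without a code system")
    return errors

# A well-formed resource passes; a malformed one fails fast, which is the point.
assert check_resource({"resourceType": "Patient"}) == []
assert check_resource({}) == ["missing resourceType"]
print("harness checks passed")
```

A harness like this catches the cheap, mechanical failures early, leaving hackathons and connectathons free to surface the harder integration problems that only appear when real systems meet.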
Clinical validation and assurance of implementation made by suppliers
Having tested it, we need to understand whether it works in reality and whether clinicians can work with it. This goes beyond the technology, standards and data models delivering against a technical specification, to making sure the result is clinically acceptable, validated and assured. As you move from requirements through design and implementation, the clinical requirements and usability can get “lost in translation”, and it is vital to check early that what is developed remains clinically valid, safe and usable.
Technical implementation of interoperability in a specific care landscape with specific systems
Once the technology has been through hackathons and connectathons and has been clinically validated, it needs to be tried out in the user environment. We must identify whether it works in the complexity of the real world using first of types. This could be with any organisation, but we would expect it to happen through the leadership of GDE and LHCRE organisations. Once it meets the requirement, there is a very sustainable basis for widespread adoption.
Turning standards definition on its head
Having defined the mechanics of how this should be done technically, I suspect that there are many of you shouting at your screens saying that “this is all very admirable David, but we are never going to get anywhere doing this in a centralised way; the central groups will be too much of a bottleneck”.
This is absolutely true, and therefore it is important that local health economies are able both to commission and to do this work themselves, and for it to move from local to national. In this way it becomes a true community endeavour.
This does have a significant impact on skills required. We will need the PRSB to work with local multi-disciplinary teams who have the skills to define the Care Connect FHIR profiles. These will need to have a light touch review from the centre to ensure the profiles are consistent with others defined elsewhere. It is great that the OneLondon LHCRE is the first local health economy embarking on defining Care Connect FHIR profiles for their programme and then offering them to the rest of the NHS as national profiles.
Turning standards definition on its head means we have far more capacity to define profiles rapidly, and to develop them close to where they’ll be used. This means there is a high chance of successful implementation and adoption.
It’s all about adoption (stupid)
I have worked a long time in this field and seen too many initiatives fail because of the myriad issues facing interoperability, but especially because of a lack of focus on adoption – to misquote Bill Clinton, it’s all about adoption (stupid). Today I think we are better placed than ever to land this, although there is still much work to do to overcome the technical barriers and achieve wide-scale adoption.
Original Article – https://www.governmentcomputing.com/health/comment/technical-challenges-interoperability