Web3, as we know, is not a new concept, but it has gained importance in the last few years as machines have taken a substantial role in how we interface with data and make contextual sense of it with greater accuracy. The points below cover different aspects of Web3 that are rapidly changing the healthcare landscape.
Rise of Data & Immersive Experiences
In the last few years, data has grown exponentially and is collected across systems, devices, and sensors. As data is collected in the form of patient symptoms and issues, observations, diagnoses, medications taken, and procedures applied, it is captured in systems of record such as EHRs, EMRs, practice management systems, and HIMS. With these different types of data being collected, intelligent systems are enabling doctors to create linkages between different symptoms, arrive at a more accurate diagnosis, and provide appropriate treatment.
With more processing power and faster internet over 5G, immersive experiences will no longer be restricted to games; they will extend to some of these medical interventions and, eventually, be immersive everywhere. The ability to analyze data and build context around it will drive much of the Web3 innovation happening in the healthcare space.
Imagine a use case where doctors speak naturally with patients while a machine generates context and feeds the learnings to an EHR and a pharmacy system, identifying and automating the creation of a treatment plan, or simply generating a prescription that then flows automatically into an Rx order workflow.
Imagine another use case where doctors simply share their availability by interacting naturally, and appointment slots are marked available in the EHR. Similarly, when a patient calls the front desk number, they interface with a voice recognition system that can set up a doctor's appointment without any doctor's intervention.
Imagine these booked appointments then being automatically shared across the network of other connected systems, along with more contextual information.
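One way to picture this connected network is a slot store that notifies subscribed systems whenever an appointment changes. The sketch below is purely illustrative: the `SlotStore` class, its method names, and the event payload are all hypothetical, and a real deployment would exchange standardized resources (for example, HL7 FHIR Slot and Appointment resources) over secure APIs rather than in-memory callbacks.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an appointment network: subscribers (a reminder
# engine, a transport service, a billing system) are notified with context
# whenever a slot is booked. All names here are illustrative assumptions.

@dataclass
class SlotStore:
    slots: dict = field(default_factory=dict)        # slot_id -> "free"/"busy"
    subscribers: list = field(default_factory=list)  # callables notified on change

    def publish_availability(self, slot_id: str) -> None:
        """Doctor shares availability; the slot becomes bookable."""
        self.slots[slot_id] = "free"

    def book(self, slot_id: str, patient_id: str) -> None:
        """Book a free slot and share the event across connected systems."""
        if self.slots.get(slot_id) != "free":
            raise ValueError(f"slot {slot_id} is not available")
        self.slots[slot_id] = "busy"
        for notify in self.subscribers:
            notify({"slot": slot_id, "patient": patient_id, "status": "busy"})

store = SlotStore()
events = []
store.subscribers.append(events.append)  # stand-in for a reminder engine
store.publish_availability("2024-06-01T09:00")
store.book("2024-06-01T09:00", "P-001")
print(events)  # the reminder engine has received the booking, with context
```

The key design point is that downstream systems react to the booking event rather than polling the EHR, which is what makes the later reminder and transport scenarios possible.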
For example, a reminder system can then prompt the patient to book a nearby Uber or Lyft just by speaking naturally. Imagine the Uber or Lyft fleet knowing that a patient drop-off is required and pricing the ride accordingly, saving cost on the patient transfer. This may mean the next Uber arriving to pick up a patient knows the patient's condition and can act accordingly.
While ambulance services remain for emergencies, the reminder engine can decide, based on the context of the information, to call a nearby Uber or Lyft instead of booking an ambulance.
Once the patient walks in, the self-service check-in kiosk already identifies them and pulls their full medical history, which is pushed to the doctor's device based on the patient's consent. This saves patients the time spent filling out forms, and they can simply converse naturally with the kiosk to add any additional information.
With the above use cases, data remains at the core of these connected networks. Companies working on data infrastructure have often created business logic after data is integrated across different apps, services, or AI models, and they adopt warehouses and data lakes to clean and normalize that data. Data-centric initiatives, however, will center on creating a data culture and bringing uniformity to process and data definitions, so that data integration is efficient, secure, and clean enough to derive uniform context as data is shared across the network. Another aspect of sharing data is the implementation of technologies like blockchain in healthcare, which can ensure higher security and data integrity. Blockchain can enable accurate data sharing between healthcare providers, which translates into more accurate diagnoses and more effective treatment. Look at companies like Patientory Inc., which democratizes individual ownership of the world's health data and provides incentives to improve health outcomes.
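The data-integrity property that blockchain brings can be illustrated with a minimal hash chain: each record's hash covers the previous record's hash, so altering any earlier entry invalidates everything after it. This is a toy sketch of the underlying idea, not how Patientory or any production ledger is implemented; the record fields are made up for illustration.

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 hash of a record's contents."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, payload: dict) -> None:
    """Append an entry whose hash also covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {"payload": payload, "prev_hash": prev_hash}
    entry["hash"] = record_hash({"payload": payload, "prev_hash": prev_hash})
    chain.append(entry)

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampered entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        expected = record_hash({"payload": entry["payload"], "prev_hash": prev_hash})
        if entry["hash"] != expected or entry["prev_hash"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_record(chain, {"patient": "P-001", "event": "diagnosis", "code": "E11.9"})
append_record(chain, {"patient": "P-001", "event": "prescription", "drug": "metformin"})
print(verify_chain(chain))   # True: chain is intact
chain[0]["payload"]["code"] = "I10"  # tamper with an earlier record
print(verify_chain(chain))   # False: tampering is detected
```

In a real blockchain the chain is replicated across providers and extended by consensus, which is what lets independent parties trust the shared history without trusting each other.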
Once the patient is being diagnosed, 3D imaging technology can present X-ray images in a three-dimensional view to build better context around the patient's problem. For example, cone-beam computed tomography (CBCT) systems are a popular alternative to traditional computed tomography (CT) systems. A CBCT system rotates around the patient, capturing data using a cone-shaped X-ray beam. That data is then used to reconstruct a three-dimensional (3D) image of the patient's anatomy: dental (teeth); oral and maxillofacial region (mouth, jaw, and neck); and ears, nose, and throat (ENT).
As the patient goes into a medical procedure, these images, along with the collected data, can provide more real-time contextual insight into the patient's condition, giving doctors recommendations for an alternate intervention.
Thanks to the HL7 FHIR movement, which has now come as a mandate, we are a step closer to semantic interoperability. FHIR's primary purpose is to address interoperability with well-structured, expressive data models that support and simplify efficient data exchange.
FHIR also supports clinical terminology linkage and validation, a big step toward uniform, data-driven understanding. With XML and JSON documents, syntactic validation as well as validation against a defined set of business rules promotes high data fidelity, and can therefore be considered a step very close to semantic interoperability.
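To make the terminology linkage concrete, here is a minimal FHIR R4 `Observation` resource for a body-temperature reading, built as a plain Python dictionary. The LOINC code (8310-5, "Body temperature") and the UCUM unit code (`Cel`) are standard bindings FHIR uses; the `Patient/example` reference is a placeholder and would point at a real Patient resource in practice.

```python
import json

# Minimal FHIR R4 Observation: a body-temperature reading bound to
# standard terminologies (LOINC for the code, UCUM for the unit).
# "Patient/example" is a placeholder reference.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8310-5",
            "display": "Body temperature",
        }]
    },
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {
        "value": 37.2,
        "unit": "Cel",
        "system": "http://unitsofmeasure.org",
        "code": "Cel",
    },
}

print(json.dumps(observation, indent=2))
```

Because every system names the concept with the same LOINC code and the same UCUM unit, two EHRs exchanging this JSON agree not just on syntax but on what the value means, which is exactly the semantic-interoperability step the mandate is pushing toward.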