Connected Health / iDAAS (Intelligent Data As A Service) - Common Use Cases

Because "Data is the asset!!!" iDAAS is about enabling information for knowledge that can be used for action in a secure and scalable manner. iDAAS is a powerful healthcare industry design pattern/framework that is a blue print for connecting, processing and leveraging clinical, financial and life sciences data at scale to build solutions for today and tomorrow in a consistent manner.

The following are very common use cases; each one includes references to all the relevant repositories needed to get up and running and processing data. For most people the platform can be up and running in about 40 minutes, with the first-time configuration of AMQ Streams (Kafka) and validation of connection settings taking the majority of that time.
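Since most of that first-time setup comes down to making sure the accelerators can reach the AMQ Streams (Kafka) cluster, a quick connectivity check can save time. The following is a minimal sketch using the standard Kafka AdminClient; the bootstrap address is an illustrative assumption and should be replaced with your own AMQ Streams listener.

```java
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

// Minimal sketch: verify the AMQ Streams (Kafka) broker is reachable
// before starting any of the iDAAS accelerators.
public class KafkaConnectionCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumed broker address; replace with your AMQ Streams bootstrap servers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(AdminClientConfig.REQUEST_TIMEOUT_MS_CONFIG, "5000");

        try (AdminClient admin = AdminClient.create(props)) {
            // Listing topics forces a round trip to the broker; a timeout here
            // usually means the bootstrap address or listener settings are wrong.
            Set<String> topics = admin.listTopics().names().get(10, TimeUnit.SECONDS);
            System.out.println("Connected. Existing topics: " + topics);
        }
    }
}
```

If this check times out, revisit the bootstrap server address and any listener or TLS settings on the AMQ Streams cluster before troubleshooting the accelerators themselves.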

If you need to support HL7 processing, the following repositories are required (a minimal connectivity sketch follows the table).
| Capability | Component | Component Description |
| --- | --- | --- |
| Receiving | iDAAS Connect Clinical - HL7 | This iDAAS Connect accelerator specifically and ONLY supports the clinical integration standards of HL7. From an integration connectivity and standards perspective it provides HL7 MLLP servers that support the following message types, from any vendor and any message version from 2.1 to 2.8: ADT (Admissions), ORM (Orders), ORU (Results), SCH (Schedules), PHA (Pharmacy), MFN (Master File Notifications), MDM (Medical Document Management) and VXU (Vaccinations). Apart from handling the needed connectivity, this accelerator also performs a minimal initial routing of data and has complete auditing integrated. |
| Routing | iDAAS Connect Data Distribution | This iDAAS Connect accelerator is implemented to keep the iDAAS Connect (HL7 and FHIR) accelerators focused solely on connectivity, simplified initial data processing and auditing; it is also intended to be used by the iDAAS Connect Third Party accelerator. Its purpose is to route and move data, which it does by implementing an enterprise integration pattern named HCDD-EIP (Healthcare Data Distribution Enterprise Integration Pattern). Apart from handling the needed connectivity, this accelerator has detailed routing logic built in and complete auditing integrated. |
| Run (Transform and Re-Shape Data) | Event Builder | This component enables Red Hat, partners, SIs and developers to develop, extend or enhance the platform's ability to process data into any needed custom format for any type of processing. iDAAS Event Builder is designed to call out and invoke needed events and can be customized to business needs very quickly. Beyond cloning the source code, the only setup is including the iDAAS Event Builder jar files so they can be referenced. Developers who do not wish to leverage, enhance or extend the existing code can add their own custom code for processing and object building. |
| Run | iDAAS DREAM (Data Runtime Enterprise Automated Mgmt) | DREAM's design intent is to enable Red Hat, partners, SIs and developers to implement iDAAS and/or other capabilities, internal or external to iDAAS, in a dynamic manner: new features can be added, and data can continue to be worked with, without stopping and restarting the platform. |
| Research and Resolve | iDAAS Data Hub | iDAAS Data Hub is where audit and other related data for the platform is stored. Like the rest of iDAAS, it is intended to be extensible. iDAAS Data Hub is a platform for processing data into the various components and data models included; it is meant to ensure resources have data-driven insights into ANY activity the iDAAS platform performs. A key aspect of the data model and events is the ability to associate one organization to many healthcare entities, each entity to many applications, and within each application any components the implementation wishes to define; this is ALL up to the implementation. Because of this data enablement, iDAAS sends a detailed event to iDAAS Data Hub for every activity the system performs, specifically through a transaction event that carries a rich set of data attributes to enable detailed insight. |
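For orientation, here is a minimal sketch of the receive-and-forward pattern iDAAS Connect Clinical - HL7 implements, written with Apache Camel's camel-mllp and camel-kafka components. This is not the accelerator's actual route: the listening port, topic name and broker address are assumptions for illustration, and the real accelerator adds per-message-type routing and complete auditing on top of this.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

// Sketch of an MLLP listener that forwards raw HL7 v2.x messages to Kafka.
// Port and topic names are illustrative, not the accelerator's configuration.
public class Hl7AdtRouteSketch {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // camel-mllp accepts HL7 v2.x messages and returns an ACK by default.
                from("mllp:0.0.0.0:10001")
                    .routeId("hl7-adt-inbound")
                    .convertBodyTo(String.class)
                    .log("Received ADT message: ${body}")
                    // Publish the unaltered message so downstream routing
                    // (e.g. Data Distribution) can pick it up from the topic.
                    .to("kafka:hl7_adt?brokers=localhost:9092");
            }
        });
        main.run();
    }
}
```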

If you need to support FHIR processing, the following repositories are required (a minimal connectivity sketch follows the table).

| Capability | Component | Component Description |
| --- | --- | --- |
| Receiving | iDAAS Connect Clinical - FHIR | This iDAAS Connect accelerator specifically and ONLY supports FHIR, and it supports ALL FHIR resources. From an integration perspective it enables processing of over 60 specific FHIR resources spanning base, clinical, financial (all) and specialized (all but testing). Another benefit of this platform is that you DO NOT require a FHIR server for it to be leveraged; however, it has been tested to work with several FHIR servers: HAPI JPA Server, Microsoft Azure FHIR Server and IBM FHIR Server. Apart from handling the needed connectivity, this accelerator also performs a minimal initial routing of data and has complete auditing integrated. |
| Routing | iDAAS Connect Data Distribution | This iDAAS Connect accelerator is implemented to keep the iDAAS Connect (HL7 and FHIR) accelerators focused solely on connectivity, simplified initial data processing and auditing; it is also intended to be used by the iDAAS Connect Third Party accelerator. Its purpose is to route and move data, which it does by implementing an enterprise integration pattern named HCDD-EIP (Healthcare Data Distribution Enterprise Integration Pattern). Apart from handling the needed connectivity, this accelerator has detailed routing logic built in and complete auditing integrated. |
| Run (Transform and Re-Shape Data) | Event Builder | This component enables Red Hat, partners, SIs and developers to develop, extend or enhance the platform's ability to process data into any needed custom format for any type of processing. iDAAS Event Builder is designed to call out and invoke needed events and can be customized to business needs very quickly. Beyond cloning the source code, the only setup is including the iDAAS Event Builder jar files so they can be referenced. Developers who do not wish to leverage, enhance or extend the existing code can add their own custom code for processing and object building. |
| Run | iDAAS DREAM (Data Runtime Enterprise Automated Mgmt) | DREAM's design intent is to enable Red Hat, partners, SIs and developers to implement iDAAS and/or other capabilities, internal or external to iDAAS, in a dynamic manner: new features can be added, and data can continue to be worked with, without stopping and restarting the platform. |
| Research and Resolve | iDAAS Data Hub | iDAAS Data Hub is where audit and other related data for the platform is stored. Like the rest of iDAAS, it is intended to be extensible. iDAAS Data Hub is a platform for processing data into the various components and data models included; it is meant to ensure resources have data-driven insights into ANY activity the iDAAS platform performs. A key aspect of the data model and events is the ability to associate one organization to many healthcare entities, each entity to many applications, and within each application any components the implementation wishes to define; this is ALL up to the implementation. Because of this data enablement, iDAAS sends a detailed event to iDAAS Data Hub for every activity the system performs, specifically through a transaction event that carries a rich set of data attributes to enable detailed insight. |
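As a rough illustration of the FHIR pattern, the sketch below exposes a plain HTTP endpoint for a single FHIR resource (Patient) and publishes whatever is posted to a Kafka topic, using Apache Camel's camel-jetty and camel-kafka components. The port, path, topic and broker address are illustrative assumptions rather than the accelerator's actual configuration, and the real accelerator covers the full set of FHIR resources with auditing built in.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

// Sketch of an HTTP endpoint that accepts a posted FHIR resource and hands it
// to Kafka. No FHIR server is required, matching the accelerator's premise.
public class FhirPatientRouteSketch {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // camel-jetty exposes a plain HTTP endpoint for the Patient resource.
                from("jetty:http://0.0.0.0:8888/fhir/Patient")
                    .routeId("fhir-patient-inbound")
                    .convertBodyTo(String.class)
                    .log("Received FHIR Patient resource: ${body}")
                    // Publish the raw resource for downstream routing.
                    .to("kafka:fhir_patient?brokers=localhost:9092");
            }
        });
        main.run();
    }
}
```

With that route running, posting a resource with, for example, curl -X POST -H "Content-Type: application/fhir+json" -d @patient.json http://localhost:8888/fhir/Patient would land the payload on the fhir_patient topic.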

If you need to support third-party data processing, the following repositories are required (a minimal connectivity sketch follows the table).

| Capability | Component | Component Description |
| --- | --- | --- |
| Receiving | iDAAS Connect - Third Party | This iDAAS Connect accelerator is specifically designed to receive data from several dozen connectors. The connectors include JDBC (any JDBC-compliant data source with a driver jar), Kafka, FTP/sFTP, AS400, HTTP(S), REST and many more. Since this accelerator is built atop upstream Apache Camel, it can leverage any supported Camel component. Apart from handling the needed connectivity, this accelerator also performs a minimal initial routing of data and has complete auditing integrated. |
| Routing | iDAAS Connect Data Distribution | This iDAAS Connect accelerator is implemented to keep the iDAAS Connect (HL7 and FHIR) accelerators focused solely on connectivity, simplified initial data processing and auditing; it is also intended to be used by the iDAAS Connect Third Party accelerator. Its purpose is to route and move data, which it does by implementing an enterprise integration pattern named HCDD-EIP (Healthcare Data Distribution Enterprise Integration Pattern). Apart from handling the needed connectivity, this accelerator has detailed routing logic built in and complete auditing integrated. |
| Run (Transform and Re-Shape Data) | Event Builder | This component enables Red Hat, partners, SIs and developers to develop, extend or enhance the platform's ability to process data into any needed custom format for any type of processing. iDAAS Event Builder is designed to call out and invoke needed events and can be customized to business needs very quickly. Beyond cloning the source code, the only setup is including the iDAAS Event Builder jar files so they can be referenced. Developers who do not wish to leverage, enhance or extend the existing code can add their own custom code for processing and object building. |
| Run | iDAAS DREAM (Data Runtime Enterprise Automated Mgmt) | DREAM's design intent is to enable Red Hat, partners, SIs and developers to implement iDAAS and/or other capabilities, internal or external to iDAAS, in a dynamic manner: new features can be added, and data can continue to be worked with, without stopping and restarting the platform. |
| Research and Resolve | iDAAS Data Hub | iDAAS Data Hub is where audit and other related data for the platform is stored. Like the rest of iDAAS, it is intended to be extensible. iDAAS Data Hub is a platform for processing data into the various components and data models included; it is meant to ensure resources have data-driven insights into ANY activity the iDAAS platform performs. A key aspect of the data model and events is the ability to associate one organization to many healthcare entities, each entity to many applications, and within each application any components the implementation wishes to define; this is ALL up to the implementation. Because of this data enablement, iDAAS sends a detailed event to iDAAS Data Hub for every activity the system performs, specifically through a transaction event that carries a rich set of data attributes to enable detailed insight. |
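Because this accelerator sits on Apache Camel, swapping the source system is largely a matter of changing the consumer endpoint URI. The sketch below uses a local file drop directory as the simplest self-contained example; the directory, topic and broker address are illustrative assumptions, and in practice the endpoint could be sftp:, jdbc:, http: or any other supported Camel component.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

// Sketch of a third-party ingestion route: poll a drop directory and publish
// each file's contents to Kafka. Directory and topic names are illustrative.
public class ThirdPartyFileRouteSketch {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // Swap this endpoint for sftp:, jdbc:, http:, etc. to change the
                // source system without touching the rest of the route.
                from("file:data/third-party-inbound?noop=true")
                    .routeId("third-party-file-inbound")
                    .convertBodyTo(String.class)
                    .log("Picked up file ${header.CamelFileName}")
                    .to("kafka:third_party_data?brokers=localhost:9092");
            }
        });
        main.run();
    }
}
```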