Clinical Decision Support for Chronic Pain Management and Shared Decision-Making IG
0.1.0 - CI Build

Clinical Decision Support for Chronic Pain Management and Shared Decision-Making IG, published by CQF. This guide is not an authorized publication; it is the continuous build for version 0.1.0 built by the FHIR (HL7® FHIR® Standard) CI Build. This version is based on the current content of https://github.com/cqframework/cds4cpm/ and changes regularly. See the Directory of published versions.

Implementation Resources

The following implementation resources are adapted from the CDS4CPM initial pilot sites and are provided here to support future implementations of MyPAIN and PainManager. These site-oriented supporting materials are intended to ease some of the toughest friction points of implementing new SMART-on-FHIR technologies. By supporting implementation sites with Implementation Guides augmented with implementation resources, interoperability teams can move FHIR technologies beyond demonstration projects into production environments and achieve long-term sustainability.
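MyPAIN and PainManager are delivered as SMART-on-FHIR applications, which read patient data as FHIR resources over the standard FHIR REST API. As a minimal sketch only (the resource content below is illustrative, not from a pilot site), a client might parse a FHIR `Patient` resource it has retrieved like this:

```python
import json

# Illustrative FHIR Patient resource, as a SMART-on-FHIR app might
# receive it from GET [base]/Patient/[id]; the values are hypothetical.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient: dict) -> str:
    """Build a human-readable name from the first HumanName entry."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

patient = json.loads(patient_json)
print(display_name(patient))  # Peter James Chalmers
```

In a real deployment the resource would come from the site's EHR FHIR endpoint after a SMART launch, not from an inline string; the parsing step is the same.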

Use Case materials

Personas

Why this is helpful: Personas represent archetypal system end-users and serve to ground development. By following data from its creation and showing how it moves through the system and/or workflow, personas help illustrate and track the provenance of data.

Often created or initiated by a design team or developer, personas also require insight from a subject-matter expert and/or from key stakeholders at implementing sites. Ultimately, personas help confirm that the intended requirements will meet the needs of realistic users.

Next steps: Feedback from the implementation site and the development team will lead to changes in the personas. In turn, refinement of the personas will inform the development team and implementation team. It is important to establish an iterative process for incorporating this feedback loop.

[Download] Example Persona templates

Use Cases

Why this is helpful: Example use cases serve to extend the personas and detail specific aspects of workflow. In addition to input from implementing sites and the development team, use cases draw on clinician review to confirm the relevance of the intervention. Use cases also ensure that system requirements are met and can be fully tested, and they are the starting point for the data element crosswalk.

Next steps: To the greatest extent possible, develop and utilize explicit use cases early to help establish requirements and guide progress in development.

[Download] Example use cases

Wireframes

Why this is helpful: Wireframes provide an early visual example of what the application is supposed to look like, what it is supposed to do (what problem is it solving?), and how users might interact with it. This helps to quickly articulate the application's look and feel and facilitate input from key stakeholders at implementation sites. By demonstrating how the application is situated in the information-sharing space, wireframes also provide a preview of the workflow that supports the intervention.

Next steps: Wireframes are a key connection between the system integrators and the rest of the project. It is common to use wireframes as a scaffold to iteratively refine design aspects and validate these among different user groups.

[Download] Example wireframes

Test Scripts

Why this is helpful: Test scripts provide implementing sites with the ability to run explicit tests to ensure the system produces the expected outcome. This in turn uncovers flaws and plays a part in the iterative development process. Test scripts are often developed by the implementing site, with guidance from the development team. Changes to the personas have a cascading (and reciprocal) effect through to the test scripts.

Next steps: Before using test scripts, prepare implementation sites to anticipate testing needs and unanticipated errors (e.g., data not being pulled in and/or presented) and plan the timeline accordingly.

[Download] Example test scripts

Evaluation Components

Why this is helpful: Evaluation forms a critical piece of the artifact lifecycle by providing the evidence that informs the revision of an artifact or an implementation. Evaluation components include planned activities associated with capturing data to inform the evaluation, and a summary report that describes lessons learned for future implementations. The evaluation plan is often developed and shared among site implementers and the development team. Settling on an evaluation plan that includes examples of data output allows the teams to plan for capture, testing, review, and analysis of the data.

Next steps: There needs to be a mechanism for how evaluation data will inform updates to the IG. The development of this mechanism may lie with standards development groups.

[Download] Example evaluation components