Study Protocol | Open access

Michigan Model for Health™ Learning to Enhance and Adapt for Prevention (Mi-LEAP): protocol of a pilot randomized trial comparing Enhanced Replicating Effective Programs versus standard implementation to deliver an evidence-based drug use prevention curriculum

Abstract

Background

School-based drug use prevention programs have demonstrated notable potential to reduce the onset and escalation of drug use, including among youth at risk of poor outcomes such as those exposed to trauma. Researchers have found a robust relationship between intervention fidelity and participant (i.e., student) outcomes. Effective implementation of evidence-based interventions, such as the Michigan Model for Health™ (MMH), is critical to achieving desired public health objectives. Yet, a persistent gap remains between knowing what works and effectively translating these findings into routine practice. The objective of this study is to design and test a multi-component implementation strategy to tailor MMH to meet population needs (i.e., students exposed to trauma) and improve the population-context fit to enhance fidelity and effectiveness.

Methods

Using a 2-group, mixed-method randomized controlled trial design, this study will compare standard implementation versus Enhanced Replicating Effective Programs (REP) to deliver MMH. REP is a theoretically based implementation strategy that promotes evidence-based intervention (EBI) fidelity through a combination of EBI curriculum packaging, training, and as-needed technical assistance and is consistent with standard MMH implementation. Enhanced REP will tailor the intervention and training to integrate trauma-informed approaches and deploy customized implementation support (i.e., facilitation). The research will address the following specific aims: (1) design and test an implementation strategy (Enhanced REP) to deliver the MMH versus standard implementation and evaluate feasibility, acceptability, and appropriateness using mixed methods, (2) estimate the costs and cost-effectiveness of Enhanced REP to deliver MMH versus standard implementation.

Discussion

This research will design and test a multi-component implementation strategy focused on enhancing the fit between the intervention and population needs while maintaining fidelity to MMH core functions. We focus on the feasibility of deploying the implementation strategy bundle and of our costing methods, as well as preliminary information on cost input distributions. The substantive focus on youth at heightened risk of drug use and its consequences due to trauma exposure is significant because of the public health impact of prevention. Pilot studies of implementation strategies are underutilized and can provide vital information on designing and testing effective strategies by addressing potential design and methods uncertainties and the effects of the implementation strategy on implementation and student outcomes.

Trial registration

NCT04752189—registered on 8 February 2021 on ClinicalTrials.gov PRS


Background

School-based universal prevention interventions have demonstrated notable potential to reduce the onset and escalation of drug use and mental health problems, including among youth exposed to trauma, marginalization, and socioeconomic disadvantage [1, 2]. Universal prevention interventions, also referred to as Tier 1, are delivered to an entire population regardless of risk [3]. These interventions can have a lasting impact on youth by reducing or preventing multiple interrelated outcomes (e.g., drug use and poor mental health) that share common risk factors [4,5,6]. School-based prevention can also reach large populations of young people, including those underserved in other settings [7]. Thus, schools are a critically important setting in which to support well-being and mitigate the effects of risk exposure among children and youth. Tier 1 prevention that is responsive to population needs offers a promising opportunity to reduce the short- and long-term consequences of exposure to stress and adversity, including substance abuse, the development of substance use disorders, mental illness, and academic failure, by enhancing resilience, providing a supportive context, and avoiding stigmatization and retraumatization [1, 8].

Recent research indicates that trauma exposure is pervasive among youth. An estimated 30.5% of youth ages 12–17 are exposed to multiple (2 or more) Adverse Childhood Experiences or ACEs [9]. ACEs are potentially traumatic events that occur during childhood including abuse, neglect, witnessing violence, parental substance abuse, and mental health problems [9, 10]. School-based, trauma-informed interventions represent a promising way to mitigate the impact of exposure to adversity on children and youth, especially given the reach of universal prevention and the high prevalence of trauma exposure in the general population [1]. Researchers have found higher rates of ACEs, other trauma, and toxic stress exposure among youth experiencing marginalization and socioeconomic disadvantage [11].

Taken together, this research indicates that school-based universal prevention would benefit from incorporating approaches to meet the needs of trauma-exposed youth. By incorporating trauma-informed approaches, teachers and other school professionals can reduce the risk of additional adversity exposure and retraumatization and strengthen factors that support resilience [12, 13]. Evidence-based interventions (EBIs), however, have rarely been designed to remain responsive to student needs, such as trauma exposure [14]. As a result, EBIs frequently fail to achieve desired public health outcomes, including among those who would most benefit [15, 16]. Researchers suggest that the public health impact of EBIs can be improved by addressing key determinants (or barriers) and facilitators of successful implementation (see Fig. 1). This would enhance the adoption, delivery, and sustainment of EBIs, as well as bridge the sizable gap between knowing which prevention strategies work and effectively translating such strategies into routine practice [17, 18]. Designing and deploying implementation strategies for existing universal prevention EBIs, such as the Michigan Model for Health™ (MMH), offers an efficient way to address key barriers to implementation, meet population needs, and achieve public health objectives.

Fig. 1 Conceptual model for applying implementation strategies to evidence-based interventions (EBIs), adapted from Proctor et al. [19] and Lyon & Bruns [20]

MMH is a theoretically based, universal prevention curriculum that has demonstrated efficacy in randomized trials in reducing substance use and improving mental health outcomes among high school students [21, 22]. The curriculum is grounded in social cognitive theory [23] and the health belief model [24]. Our current study focuses on three core units of the MMH curriculum: the foundational skills unit, social and emotional health, and alcohol, tobacco, and other drugs. The MMH curriculum is recognized as an evidence-based intervention by CASEL (the Collaborative for Academic, Social, and Emotional Learning) and is aligned with Michigan and National (USA) Health Education standards [25, 26]. The curriculum is widely adopted across Michigan, with 91% of health teachers using MMH [27]. Yet, similar to other EBIs, MMH is infrequently delivered with fidelity [28]. A statewide study found that 58% of educators failed to meet state-designated MMH fidelity standards (i.e., delivering 80% or more of the curriculum) [27]; the proportion is even higher among schools in economically challenged communities, where 73% did not meet state fidelity standards [29].

Replicating Effective Programs (REP) is an implementation strategy well suited to school-based prevention. REP is a multi-component strategy used in community settings, including schools, focused on enhancing fit between the intervention and context while maintaining fidelity to core EBI functions. It is based on the CDC’s research-to-practice framework [30, 31] and guided by social cognitive [32] and diffusion of innovations theories [33]. REP is a low-level strategy that is consistent with the standard implementation of the MMH curriculum and includes three primary components: curriculum packaging materials, teacher training, and as-needed technical assistance (Table 1). As REP is not always sufficient to effectively implement complex behavioral interventions [34], researchers developed Enhanced REP, which adds tailoring of curriculum materials, tailored training, and ongoing provider consultation, or Facilitation. Facilitation is based on the integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework and provides more intensive implementation support [34, 35]. Researchers have found enhanced program uptake among clinical sites deploying multi-component implementation strategies such as Enhanced REP [36]. This additional tailoring and support can aid in mitigating barriers, enhancing intervention-context fit, and ultimately achieving desired public health outcomes [14]. Additionally, the cost of implementation fundamentally influences program delivery in schools, which often have competing demands and carefully allocated resources [7, 37]. To date, most economic evaluation has focused on intervention costs rather than the costs of the implementation strategies required to deploy and sustain them [37]. Systematic examination of costs and outcomes for multi-component implementation strategies is vital for scale-up and sustainability in community settings [37, 38].

Table 1 Standard implementation and Enhanced Replicating Effective Programs (Enhanced REP) components for drug use prevention intervention implementation (adapted from Kilbourne et al. [39])

Previous research identifying key determinants of MMH curriculum implementation established that the intervention’s inability to consistently meet students’ needs (i.e., the context), particularly among students experiencing marginalization, trauma, and disadvantage, posed challenges to intervention acceptability, which, in turn, reduced fidelity [40]. While teachers reported that the curriculum is adaptable, they also reported that more intensive adaptations, including those designed to meet the needs of specific subgroups of students, were time- and resource-intensive [41]. Teachers nonetheless considered these adaptations critical; one teacher reported that “(the curriculum) turn(ed) students away, and it turned the group of students away who needed it most. The students who have substance abuse problems, have emotional/mental health problems” [40]. Collectively, this research underscores the need for implementation strategies that make EBIs responsive to those disproportionately at risk of marginalization, trauma, and ultimately substance misuse, abuse, and the development of substance use disorders [42, 43]. Deploying implementation strategies that incorporate systematic adaptations to improve intervention-context fit, along with needed training and support, is a promising approach to enhancing intervention acceptability and fidelity and, ultimately, achieving improved health outcomes. An important first step, however, is executing a pilot study designing and testing the feasibility of the implementation strategies; pilot studies of implementation strategies are underutilized and can provide vital information on designing and testing effective strategies by addressing potential design and methods uncertainties and assessing potential effects of the implementation strategy on implementation and student outcomes [44].

Aims/objectives

The goal of the study is to design and test a multi-component implementation strategy (Enhanced REP) to enhance the effective delivery of a comprehensive health curriculum in community schools serving youth exposed to trauma and compare it to standard implementation.

Primary aims

The primary aim of this study will be to design and test Enhanced REP to deliver MMH. We will evaluate the feasibility, acceptability, and appropriateness of the implementation strategy compared to standard REP (i.e., standard implementation), which will form the basis of a larger hybrid type 3 cluster-randomized trial.

The second primary aim is to conduct an economic evaluation to estimate implementation strategy costs and cost-effectiveness of Enhanced REP versus standard implementation to deliver MMH in preparation for a larger trial.

Secondary aims

The secondary aims are to evaluate the potential effectiveness of Enhanced REP versus standard implementation on youth outcomes including student drug use, drug use risk perceptions, quality-adjusted life years, and student-level fidelity (e.g., satisfaction, curriculum engagement). We will also assess fidelity to the MMH curriculum using a teacher dose delivered measure, consistent with previous MMH implementation research [27, 29, 45].

Methods/design

Sample and setting

This pilot study will include ten schools from two intermediate school districts (ISDs), or Regional Educational Service Agencies, in Michigan; ISDs provide general education and curriculum support to multiple school districts. In 2019, 20.5% of youth nationally and 22.0% in Michigan experienced 2 or more ACEs, and the risk of exposure increased to 24.2% for youth who experienced economic hardship [46].

Procedures

The implementation strategy will be deployed by the school health coordinators as they are key MMH implementation intermediaries. This is also consistent with current practices; school health coordinators work with schools across Michigan serving in 24 regional hubs to support health programming such as MMH [47]. The health coordinators maintain relationships with school districts and health teachers and provide support including training, technical assistance, and consultation for school health programs, practices, and policies [47]. Thus, by using existing infrastructure and capacity for deploying Enhanced REP to support teachers in delivering MMH with fidelity, we will enhance the likelihood of sustainment.

As a first step in designing Enhanced REP, we will convene an advisory board of stakeholders (e.g., school health coordinators, teachers, other school professionals, and administrators). Given preliminary data findings and the rates of community-level trauma and drug use in the state, we will consult experts in trauma-informed approaches to support curriculum tailoring and advise on trauma-informed training options. We will work with the advisory board to identify key areas of the curriculum that can be tailored, that is, the form aspects of the intervention. We will distinguish these elements from the core elements that must be retained to support MMH effectiveness. This process will be guided by the systematic adaptation steps described by Escoffery et al. [48] and the foundational theories (the health belief model and social cognitive theory) of the intervention, to ensure that we incorporate fidelity-consistent adaptations that will not compromise the curriculum’s effectiveness [23, 24, 49]. We will then integrate the proposed changes into the curriculum materials. This will include replacing out-of-date materials, incorporating trauma-informed mental and emotional health resources, making the format user-friendly and flexible for teachers, updating drug use information and resources, and adding new activities for students (e.g., online interactive activities). The advisory committee members will each receive $250 remuneration for participation in study activities.

The facilitation component will be based on the Quality Enhancement Research Initiative (QUERI) implementation facilitation strategy and iPARIHS framework [50, 51]. Facilitation promotes provider (i.e., teacher) capacity and self-efficacy in addressing barriers to MMH implementation [31]. The health coordinators will receive specialized training in facilitation based on the QUERI training program and a school-based trial deploying facilitation [31], adapted to Tier 1 prevention. Health coordinators will support teachers in implementing the tailored curriculum by engaging in the facilitation activities (see Table 2 for sample facilitation schedule and activities).

Table 2 Facilitation component of Enhanced REP, schedule of activities (adapted from Kilbourne et. al. [31])

The training component will include asynchronous and synchronous modules focused on trauma-informed approaches with teachers as the intended audience. Before conducting training, we will assess teachers’ exposure to and awareness of trauma-informed practices to tailor the training to meet their needs. Sample modules for the trauma-informed training component of the implementation strategy bundle are included in Table 3.

Table 3 Trauma-informed training component of Enhanced REP

Taking into consideration school closures and inconsistent education delivery in the 2020–2021 school year, we delayed the beginning of the study to September 2021. The primary reason for this was to ensure conditions among intervention and control groups were as similar as possible and to decrease the likelihood that outcomes would be confounded by a school’s learning modality (e.g., virtual compared to in-person learning). During this time, we planned to conduct user testing of the tailored curriculum component of Enhanced REP. Two teachers who did not qualify to participate in the trial portion of the study tested the digital content and provided feedback. Following this user testing, the study team will meet with teachers and health coordinators to discuss feedback and identify areas for refinement prior to pilot study initiation.

In cooperation with the school health coordinators, we will identify and recruit 10 high schools currently using MMH that fail to meet state standards for implementation (implementing less than 80% of the curriculum) and/or facing one or more barriers to MMH implementation, for participation. We will focus on schools where at least 20% of students are eligible for free and reduced lunch, as socioeconomic disadvantage is an additional risk factor for ACEs. We will match participating schools a priori on key characteristics such as school size and percent of students eligible for free/reduced lunch to ensure balance across study conditions [54]. We will randomize schools to either standard implementation (akin to standard REP) or Enhanced REP (see Table 1, Fig. 2). For the standard implementation condition, teachers will receive the MMH curriculum manual, standard training, and as-needed technical assistance, provided to them by the health coordinators. The Enhanced REP condition will receive an MMH curriculum manual tailored to incorporate trauma-informed approaches, tailored trauma-informed training, and implementation facilitation.
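The a priori matching and randomization step can be sketched in a few lines: pair schools on the named characteristics, then randomize within each pair. A minimal illustration in Python, assuming pairing on enrollment and percent free/reduced-lunch eligibility (the study's actual matching algorithm and variable set are not specified here):

```python
import random

def matched_pair_randomize(schools, seed=2021):
    """Pair schools a priori on size and % free/reduced-lunch eligibility,
    then randomly assign one school per pair to Enhanced REP and the other
    to standard implementation. `schools` is a list of
    (name, enrollment, frl_pct) tuples. Illustrative sketch only."""
    rng = random.Random(seed)  # seeded for a reproducible allocation
    # Sorting places the most similar schools adjacent to one another,
    # so consecutive schools form the matched pairs.
    ordered = sorted(schools, key=lambda s: (s[1], s[2]))
    assignment = {}
    for i in range(0, len(ordered) - 1, 2):
        enhanced, standard = rng.sample(ordered[i:i + 2], 2)
        assignment[enhanced[0]] = "Enhanced REP"
        assignment[standard[0]] = "standard implementation"
    return assignment
```

With ten schools this yields five matched pairs and a 5:5 allocation, and rerunning with the same seed reproduces the allocation.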

Fig. 2 Group randomized controlled trial pilot study design. ATOD: alcohol, tobacco, and other drugs

The health coordinators and study team will meet with teachers and administrators at participating schools to share study information and discuss specific procedures around data collection; these meetings will take place in person, by video conferencing, or via email. Teachers will provide informed consent prior to the start of the study. We will conduct semi-structured interviews with teachers pre-implementation to assess readiness and post-implementation to assess the feasibility, acceptability, and appropriateness of standard implementation and Enhanced REP. Teachers will also complete a post-implementation survey to evaluate these constructs quantitatively. All students in participating classrooms will be eligible for the student-level surveys, and students will be recruited in conjunction with school district partners.

The student survey questions will be similar to those of other school-based surveys (e.g., the Michigan Profile for Healthy Youth, Mi-PHY), and we will follow a similar procedure for survey administration. The teachers will send a letter to parents to provide information and an opportunity to opt out prior to the start of the study. Following the parental letter and participant assent, students will complete a self-administered questionnaire through a secure, online server. As MMH is integrated into the school curriculum, students will complete the initial survey during their health class before MMH delivery and again at the end of the term. Students who do not assent will receive an alternate activity. We expect 300 students to participate, based on an 80% response rate across the schools, similar to other youth studies [55]. Each school will receive $500 to support its participation in the study, and teachers will receive up to $300 per academic term as remuneration for study activities.

Primary aim 1

Compare deploying Enhanced REP with standard implementation to deliver the MMH and evaluate the feasibility, acceptability, and appropriateness of the implementation strategy among teachers.

Measures

Feasibility, acceptability, and appropriateness

To comprehensively evaluate feasibility, acceptability, and appropriateness, we will adopt a convergent mixed methods design (see Fig. 3). The purpose of a convergent design is to obtain “different but complementary data on the same topic” [56]. We will use Weiner et al.’s [57] measures to assess acceptability, appropriateness, and feasibility. Each construct has four items (e.g., REP is appealing, REP seems suitable), rated on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree). The interview guide will focus on eliciting feedback on specific Enhanced REP components (manual, training, and facilitation) and existing challenges with curriculum implementation.
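Scoring these measures is simple arithmetic: each construct is summarized as the mean of its four Likert items. A minimal sketch (the construct keys and item values are illustrative, not the published item wording):

```python
from statistics import mean

def construct_scores(item_responses):
    """Score Weiner et al.-style implementation measures: each construct
    (acceptability, appropriateness, feasibility) is summarized as the
    mean of its four 5-point Likert items (1 = strongly disagree,
    5 = strongly agree). Keys and item values here are illustrative."""
    return {construct: mean(items) for construct, items in item_responses.items()}
```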

Fig. 3 Aim 1 convergent mixed methods design (adapted from Creswell & Plano-Clark [58])

Data analytic approach

Qualitative data

We will use an inductive/deductive thematic analytical approach outlined by Hsieh and Shannon for interview transcripts [59]. First, each member of the study team will review the transcript material to develop a broad understanding of the content [60]. Second, the empirical material contained in the interviews will be independently coded by project team members to condense the data into analyzable units. Segments of text ranging from a phrase to several paragraphs will be assigned codes based on a priori or emergent themes (also known as open coding; [61]). Codes will also be assigned to describe connections within and between categories and subcategories (axial coding; [61]). Third, the text will be independently coded by at least two study team members. Disagreements will be resolved through consensus and the team will develop a final codebook. Using this codebook, two study team members will separately review transcripts to determine the level of agreement in the codes applied [62]. Fourth, based on these codes, we will use qualitative software to generate a series of categories connecting text segments grouped into separate “nodes.” We will use these nodes to examine the association between different a priori and emergent categories. Fifth, the different categories will be further condensed into broad themes.
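The inter-coder agreement check in the third step can be quantified with a chance-corrected index. The protocol does not name an agreement statistic; Cohen's kappa is one common choice for two coders and can be computed as follows:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders applying codes to the same text
    segments: observed agreement corrected for the agreement expected
    by chance. One common index; the study's actual agreement statistic
    is not specified in the protocol."""
    if len(codes_a) != len(codes_b) or not codes_a:
        raise ValueError("coders must rate the same non-empty segment list")
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a | freq_b)
    if expected == 1.0:  # degenerate case: a single code used throughout
        return 1.0
    return (observed - expected) / (1 - expected)
```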

Quantitative data

We will evaluate appropriateness, acceptability, and feasibility using descriptive analyses from teacher surveys. The analyses will focus on descriptive statistics for the quantitative data from providers, including means, standard deviations, and proportions as appropriate.

Data integration

Results from each data set will be examined side-by-side to explore convergence (i.e., comparing analysis conclusions to investigate whether qualitative and quantitative results concur) and complementarity (i.e., one set of findings elaborates on another). We will also investigate how interview results elaborate on quantitative results (expansion) to deepen our understanding of why and how Enhanced REP may or may not be acceptable, feasible, and appropriate and how this may influence fidelity [58]. Should discordant findings arise, we will manage them as described by Creswell and Plano Clark, such as by collecting additional data, re-analyzing existing datasets, and identifying potential sources of bias [58].

Primary aim 2

Conduct an economic evaluation to estimate the costs and cost-effectiveness of deploying Enhanced REP versus standard implementation to deliver MMH.

Procedures

We will monitor the activities listed in Table 4 to estimate implementation strategy costs. The study team will track time for labor costs, which will constitute most of the resources/costs. This will include time logs for health coordinators to track Enhanced REP Facilitation activities, costs for training, and indirect costs associated with teacher time to attend tailored training and participate in implementation support. The study team will also track and compile all non-labor costs (e.g., adapted curriculum website updates and maintenance).

Table 4 Cost inputs for the economic evaluation of the Enhanced REP implementation strategy

Measures

Costs

Researchers have used micro-costing approaches frequently in implementation science [37]. One approach is a modified cost calculator approach that has been applied in the Costs of Implementing New Strategies (COINS); this approach identifies a range of costs across phases of implementation (e.g., pre-implementation, implementation, and sustainability) tailored to the strategies utilized for a specific implementation effort and focused on the perspective of the organization/provider deciding to adopt the EBI [63,64,65,66]. This is useful in identifying implementation-related costs for several reasons: (1) it aids in determining the direct costs of implementation by tallying time spent on activities in each phase of implementation strategy deployment (often the bulk of implementation strategy costs), (2) this practical approach can provide needed guidance and scaffolding for stakeholders and decision-makers to determine implementation costs so that organizations can accurately estimate the resources necessary for implementation success, and (3) this approach has been used previously with Enhanced REP to estimate costs as the first step in a cost-effectiveness analysis for a community-based clinical trial [67]. Table 4 provides a list of anticipated activities whose costs will be estimated prospectively as part of the pilot trial. We will also assess the costs of the REP condition, or standard implementation of MMH, for the intervention materials and training using available data from MDHHS and the Michigan School Health Coordinators Association (MiSHCA). For the standard implementation condition, we will ask coordinators to track time spent on as-needed technical assistance.

Health outcomes

We will use the EQ-5D to assess quality-adjusted life years (QALYs). The EQ-5D is a multi-attribute utility instrument that yields interval-level scores ranging from 0 (dead) to 1 (perfect health) [68]. This mapping provides a health utility measure for each health state experienced by participants in the study and can be used to calculate QALYs, the preferred measure of health benefit in cost-effectiveness analysis [52].
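Given repeated EQ-5D assessments, QALYs are the area under the utility-over-time curve. A minimal trapezoidal sketch, assuming utility scores have already been derived from the instrument (the study's EQ-5D value set and scoring algorithm are not specified here):

```python
def qalys(utilities, times_in_years):
    """Quality-adjusted life years as the area under the health-utility
    curve (trapezoidal rule). `utilities` are EQ-5D index scores
    (0 = dead, 1 = perfect health) observed at `times_in_years`.
    Illustrative sketch of the standard QALY calculation."""
    if len(utilities) != len(times_in_years):
        raise ValueError("one utility score per assessment time")
    area = 0.0
    for i in range(1, len(utilities)):
        dt = times_in_years[i] - times_in_years[i - 1]
        area += dt * (utilities[i] + utilities[i - 1]) / 2
    return area
```

For example, a participant at utility 0.8 at baseline and 0.9 half a year later accrues 0.5 × 0.85 = 0.425 QALYs over that period.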

Data analytic approach

All costs will be adjusted to current-year US dollars. Costs will include the implementation strategy costs (i.e., inputs) listed in Table 4, such as labor costs for meetings, costs associated with training, and labor costs associated with the provision of facilitation. We will estimate the costs of Enhanced REP and standard implementation using the cost data, with standard implementation as the comparator strategy. Our primary utility measure will be the change in reported quality-adjusted life years (QALYs). We will also assess changes in student drug use during the intervention period. We will use net costs (the net increase in costs from Enhanced REP compared to standard implementation) and net effectiveness (the net change in drug use for Enhanced REP versus standard implementation) to estimate the incremental cost-effectiveness ratio for student outcomes. We will conduct a one-way sensitivity analysis on all cost input parameters listed in Table 4, as well as health utilities, to provide estimates of the costs and incremental cost-effectiveness to decision-makers. The analysis will also include multi-way sensitivity analyses on the parameters that most strongly influence the cost-effectiveness ratio [52]. This will inform the feasibility of our costing approach and provide preliminary information on cost input distributions, including which inputs may be especially influential on cost-effectiveness ratios, to inform the detail of future data collection efforts. This pilot study will also provide preliminary information on the cost-effectiveness of Enhanced REP to inform the utility of undertaking a CEA in a larger trial [53].
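The incremental cost-effectiveness ratio and one-way sensitivity analysis described above reduce to straightforward arithmetic. A hedged sketch with illustrative parameter names (the study's actual cost inputs are those listed in Table 4):

```python
def icer(cost_enhanced, cost_standard, effect_enhanced, effect_standard):
    """Incremental cost-effectiveness ratio: net cost of Enhanced REP over
    standard implementation divided by net effectiveness (e.g., QALYs
    gained or reduction in drug use)."""
    net_effect = effect_enhanced - effect_standard
    if net_effect == 0:
        raise ValueError("ICER undefined when effectiveness does not differ")
    return (cost_enhanced - cost_standard) / net_effect

def one_way_sensitivity(base_params, compute_icer, ranges):
    """Vary one input at a time between its low and high value, holding all
    others at base case; returns the resulting ICER range per parameter
    (the inputs to a tornado diagram). Parameter names are illustrative."""
    out = {}
    for name, (low, high) in ranges.items():
        vals = []
        for v in (low, high):
            params = dict(base_params, **{name: v})
            vals.append(compute_icer(params))
        out[name] = (min(vals), max(vals))
    return out
```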

Secondary aims

Our secondary aim is to evaluate the potential effectiveness of Enhanced REP versus standard implementation on student outcomes. Student outcomes include drug use, drug use risk perceptions, and student-level fidelity (e.g., satisfaction, curriculum engagement) as described by Barrera et al. [69]. We will also assess MMH fidelity using the teacher dose delivered.

Measures

Secondary aim study measures are summarized in Table 5.

Table 5 Secondary outcome measures

Behavioral outcomes

We will assess past 3- and 12-month substance use using items from Monitoring the Future (MTF; [70]) with adapted response options and timeframe.

Implementation outcomes, student-level

Fidelity: engagement

Engagement has been identified as a fidelity dimension that provides information on participant responsiveness to the intervention and is key to intervention success [69]. As described in Barrera et al. [69], we will assess engagement using student satisfaction and key intervention skills. The satisfaction measure, comprising four items, will be adapted from a scale with good psychometric properties developed by Giles et al. for another drug prevention intervention [72]. We will evaluate key intervention skills: assertive communication, refusal skills, and decision-making. These dimensions are identified in the MMH curriculum summative evaluation materials, have been assessed in previous MMH studies, and are based on National Health Education Standards [21, 26]. Students will rate their level of agreement, using a 5-point Likert scale, on their proficiency with specific elements related to each skill.

Fidelity, teacher-level: MMH dose delivered

We will assess the dose or amount of the intervention delivered using a curriculum fidelity tracking form. Teachers will complete a brief form following each lesson and unit included in the study. Lessons are grouped into multiple units, including the alcohol, tobacco, and other drug prevention unit, the skills unit, and the social and emotional learning unit. We will assess the dose delivered by calculating the total number of lessons completed within each unit (10 lessons/unit). We will also ask teachers to report any adaptations or modifications, guided by the framework proposed by Wiltsey-Stirman et al. [74] on the tracking form. Modifications include adding, removing, and changing content, substituting activities, and changing activity formats.

Data analysis

Secondary outcomes

The focus of this pilot trial will be to generate information to inform our approach to collecting student-level data and to estimate treatment effect variances for a larger study [75]. We will calculate and examine treatment effects using linear mixed-effects models (LMMs), with an understanding of the potential challenges of estimating treatment effects in pilot studies [76]. LMMs are appropriate for analyzing clustered data and may include fixed and random effects [77]. We will control for key demographic variables in all analyses, including race/ethnicity, gender, and socioeconomic status.
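One variance quantity such a pilot can supply is the intraclass correlation (ICC) among students within schools, which inflates the variance of treatment effect estimates in a cluster-randomized design. A minimal sketch of the standard design-effect arithmetic, DEFF = 1 + (m − 1) × ICC (the ICC value used below is hypothetical, not a study estimate):

```python
def design_effect(icc, cluster_size):
    """Standard variance inflation factor for cluster randomization:
    DEFF = 1 + (m - 1) * ICC, where m is the average number of students
    per school and ICC is the intraclass correlation."""
    return 1 + (cluster_size - 1) * icc

def effective_sample_size(n_students, icc, cluster_size):
    """Number of independent observations a clustered sample is worth."""
    return n_students / design_effect(icc, cluster_size)
```

For example, with 300 students in clusters of 30 and a hypothetical ICC of 0.05, DEFF = 2.45, so the sample carries the information of roughly 122 independent students.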

We will assess MMH dose delivered using the sum score across all units as well as unit-specific calculations. Consistent with previous research, we will also calculate the proportion of units delivered, individually and combined, focusing on descriptive statistics (means, standard deviations).
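The sum-score and proportion calculations described above can be sketched as follows. The lesson counts are illustrative; only the 10-lesson unit structure comes from the protocol:

```python
from statistics import mean, stdev

LESSONS_PER_UNIT = 10

# Hypothetical lessons-completed counts per unit for three teachers
teacher_doses = [
    {"atod_prevention": 9, "skills": 10, "sel": 8},
    {"atod_prevention": 7, "skills": 8, "sel": 10},
    {"atod_prevention": 10, "skills": 9, "sel": 9},
]

# Sum score across all units for each teacher
sum_scores = [sum(d.values()) for d in teacher_doses]

# Proportion of all possible lessons delivered, per teacher
proportions = [
    s / (LESSONS_PER_UNIT * len(d)) for s, d in zip(sum_scores, teacher_doses)
]

# Descriptive summary across teachers
summary = {"mean": mean(proportions), "sd": stdev(proportions)}
```

Unit-specific proportions follow the same pattern, dividing each unit's count by 10 instead of pooling across units.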

Discussion

This research has the potential to support the design and deployment of effective and sustainable strategies for implementing drug use prevention interventions in schools. Preventing or tempering the onset and escalation of drug use can reduce the burden of social, emotional, and economic costs placed on youth and their families, communities, and society [3]. The proposed research can contribute to improving public health and reducing health disparities related to the disparate consequences of drug use and addiction among youth exposed to trauma. Systematically designing and testing implementation strategies to address key determinants of implementation, including poor fit between the intervention and context/population needs, will support refinement and, ultimately, leadership engagement in ongoing implementation support once the research study ends. Economic evaluation can provide vital information for stakeholders and decision-makers regarding implementation strategies to support and sustain quality EBI delivery.

This project has several strengths. First, we will design and systematically deploy a theoretically grounded bundle of implementation strategies, Enhanced REP, to address key determinants. Second, researchers have noted that many full implementation trials have progressed without initial pilot trials [44]; this research will therefore support the quality and rigor of a full implementation trial by providing an opportunity to refine the implementation strategy and address the feasibility of the research methods, enhancing the contribution to the field [44]. In doing so, it can advance the research-to-practice translation of substance use prevention programs through implementation strategies that are effective, sustainable, and that improve the quality of program delivery. Third, we will use a community-engaged approach to designing the implementation strategy by incorporating an advisory board composed of multiple stakeholders. Fourth, we will estimate the costs and cost-effectiveness of the implementation strategies versus standard implementation to help address a central challenge to the effective implementation and sustainment of EBIs; this can provide useful, accessible information for communities to make well-informed decisions about resource allocation and implementation. Fifth, this project has the potential to advance implementation strategies for universal substance use prevention interventions: researchers have estimated (in 2002 dollars) that effective implementation of school-based prevention for early adolescents would avert an estimated $33.5 billion in lifetime monetary costs and $98.6 billion in total costs to society [78]. Finally, this research will support using implementation science to meet the needs of underserved families and communities who are at increased risk of trauma exposure, drug use, and its consequences.

We also note several potential study limitations. While the study will work with schools serving under-resourced families across two counties in Michigan, schools are heterogeneous settings with different organizational structures. Thus, different schools may experience other relevant barriers at various levels not addressed in the Enhanced REP strategy bundle. Alternative strategies may be needed to support successful implementation, including strategies in the outer setting, and may be an important focus of future research. We did not include students explicitly in the implementation strategy design process. Although implementation strategies often focus on the intervention deliverer and relevant contextual considerations, incorporating the input of the intervention recipients in addition to assessing their responsiveness (e.g., engagement) is an important future direction and potential focus of a larger trial.

The next step in this program of research would be to test the implementation strategy on a larger scale and over a longer duration, including a large-scale Type 3 hybrid cluster-randomized trial. This trial would be conducted in schools serving youth experiencing ACEs and other forms of adversity across multiple states that have adopted the MMH curriculum. This research will also support scaling up these strategies to improve health curriculum delivery and advance substance use prevention.

Availability of data and materials

Electronic copies of publications will be made available in PubMed Central. This article is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, sharing, adaptation, distribution, and reproduction provided appropriate credit is given to the original author(s), a link to the Creative Commons license is included, and any changes are indicated. De-identified primary participant (i.e., student)-level data will be made available through an appropriate data repository, such as the NIH HEAL (National Institutes of Health: Helping to End Addiction Long-termSM) Initiative central data repository, per HEAL guidelines. These data will be available upon acceptance for publication of the main findings from the final student-level dataset. Access to individual-level data will require entering into a data-sharing agreement that includes requirements to protect participants’ privacy and data confidentiality.

Abbreviations

ACEs:

Adverse childhood experiences

CASEL:

Collaborative for Academic, Social, and Emotional Learning

CDC:

Centers for Disease Control and Prevention

COINS:

Costs of Implementing New Strategies

DSMB:

Data Safety and Monitoring Board

EBI:

Evidence-based intervention

Enhanced REP:

Enhanced Replicating Effective Programs

HIPAA:

Health Insurance Portability and Accountability Act

iPARIHS:

Integrated Promoting Action on Research Implementation in Health Services

IRB:

Institutional Review Board

ISDs:

Intermediate School Districts

LMM:

Linear mixed-effects models

Mi-LEAP:

Michigan Model for HealthTM: Learning to Enhance and Adapt for Prevention

MMH:

Michigan Model for HealthTM

MTF:

Monitoring the Future

NIDA:

National Institute on Drug Abuse

NIH HEAL:

National Institutes of Health Helping to End Addiction Long-termSM

NSDUH:

National Survey on Drug Use and Health

QALYs:

Quality-adjusted life years

QUERI:

Quality Enhancement Research Initiative

SUDs:

Substance use disorders

References

  1. Herrenkohl T, Hong S, Verbrugge B. Trauma-informed programs based in schools: linking concepts to practices and assessing the evidence. Am J Community Psychol. 2019;64(3–4):373–88.

  2. Temkin D, Harper K, Stratford B, Sacks V, Rodriguez Y, Bartlett JD. Moving policy toward a whole school, whole community, whole child approach to support children who have experienced trauma. J Sch Health. 2020;90(12):940–7.

  3. O’Connell M, Boat T, Warner K. Preventing mental, emotional and behavioral disorders among young people: progress and possibilities. Washington, DC: Board on Children, Youth, and Families, Division of Behavioral and Social Sciences and Education, National Academies Press; 2009. p. 549.

  4. Greenberg M. School-based prevention: current status and future challenges. Effect Educ. 2010;2(1):27–52.

  5. Hale D, Fitzgerald-Yau N, Viner R. A systematic review of effective interventions for reducing multiple health risk behaviors in adolescence. Am J Public Health. 2014;104(5):e19–41.

  6. Hill KG, Bailey JA, Steeger CM, Hawkins JD, Catalano RF, Kosterman R, et al. Outcomes of childhood preventive intervention across 2 generations: a nonrandomized controlled trial. JAMA Pediatr. 2020;174(8):764–71.

  7. Lee R, Gortmaker S. Health dissemination and implementation within schools. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 401–16.

  8. SAMHSA. Trauma-informed approach and trauma-specific interventions: Substance Abuse and Mental Health Services Administration (SAMHSA); 2014. Available from: https://www.samhsa.gov/nctic/trauma-interventions. Cited 2019 Feb 4

  9. Bethell CD, Newacheck P, Hawes E, Halfon N. Adverse childhood experiences: assessing the impact on health and school engagement and the mitigating role of resilience. Health Aff. 2014;33(12):2106–15.

  10. CDC. Preventing Adverse Childhood Experiences (ACES): leveraging the best available evidence. Atlanta: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention; 2019.

  11. Klevens J, Metzler M. Bringing a health equity perspective to the prevention of child abuse and neglect. In: Lonne B, Higgins D, Scott D, Herrenkohl, editors. Re-visioning public health approaches for protecting children; 2019. p. 330–59.

  12. Chafouleas SM, Johnson AH, Overstreet S, Santos NM. Toward a Blueprint for Trauma-Informed Service Delivery in Schools. School Mental Health. 2016;8(1):144–62.

  13. Durlak JA, Weissberg RP, Dymnicki AB, Taylor RD, Schellinger KB. The impact of enhancing students’ social and emotional learning: a meta-analysis of school-based universal interventions. Child Dev. 2011;82(1):405–32.

  14. Castro FG, Yasui M. Advances in EBI Development for Diverse Populations: Towards a Science of Intervention Adaptation. Prev Sci. 2017;18(6):623–9.

  15. Durlak J, DuPre E. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3):327.

  16. U.S. Department of Education. Prevalence and implementation fidelity of research-based prevention programs in public schools: final report. Washington, D.C.: U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Programs Study Service; 2011. Report no.: ED-00-CO-0119

  17. Grol R, Grimshaw J. Evidence-based implementation of evidence-based medicine. Jt Comm J Qual Improv. 1999;25(10):503–13.

  18. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implementation Science. 2013;8(1).

  19. Proctor E, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(1):24–34.

  20. Lyon A, Bruns E. From Evidence to Impact: Joining Our Best School Mental Health Practices with Our Best Implementation Strategies. School Mental Health. 2019;11(1):106–14.

  21. O’Neill J, Clark J, Jones J. Promoting mental health and preventing substance abuse and violence in elementary students: a randomized control study of the Michigan Model for Health. J Sch Health. 2011;81(6):320–30.

  22. Shope J, Copeland L, Maharg R, Dielman T. Effectiveness of a high school alcohol misuse prevention program. Alcohol Clin Exp Res. 1996;20(5):791–8.

  23. Bandura A. Human agency in social cognitive theory. Am Psychol. 1989;44(9):1175–84.

  24. Rosenstock IM. Historical origins of the health belief model. Health Educ Monogr. 1974;2(4):328–35.

  25. CASEL. Program Guide CASEL: Collaborative for Academic Social and Emotional Learning. [Internet]. Program Guide. 2022. Available from: https://pg.casel.org/. Cited 8 Feb 2022.

  26. CDC. National Health Education Standards [Internet]. CDC Healthy Schools. 2019. Available from: https://www.cdc.gov/healthyschools/sher/standards/index.htm. Cited 21 Dec 2021.

  27. Rockhill S. Use of the Michigan Model for Health Curriculum among Michigan Public Schools. Lansing MI; 2017.

  28. Durlak J, Weissberg R, Pachan M. A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. Am J Community Psychol. 2010;45(3–4):294–309.

  29. Eisman A, Kilbourne A, Ngo Q, Fridline K, Zimmerman M, Greene D, et al. Implementing a state-adopted high school health curriculum: a case study. J Sch Health. 2020;90(6):447–56.

  30. Neumann M, Sogolow E. Replicating effective programs: HIV/AIDS prevention technology transfer. AIDS education and prevention: official publication of the International Society for AIDS Education. 2000;12(5 Suppl):35–48.

  31. Kilbourne AM, Smith SN, Choi SY, Koschmann E, Liebrecht C, Rusch A, et al. Adaptive School-based Implementation of CBT (ASIC): clustered-SMART for building an optimized adaptive implementation intervention to improve uptake of mental health interventions in schools. Implementation Sci. 2018;13(1):119.

  32. Bandura A. Social learning theory: Prentice-Hall; 1977. Available from: http://mirlyn.lib.umich.edu/Record/000313256 CN - LB1084 .B3571

  33. Rogers E. Diffusion of innovations. 2003. Available from: http://mirlyn.lib.umich.edu/Record/004335364 CN - HM 101 .R72 2003

  34. Kilbourne AM, Almirall D, Eisenburg D, Waxmonsky J, Goodrich DE, Fortney JC, et al. Adaptive Implementation of Effective Programs Trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. 2014;9(1):132.

  35. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2015;11(1):33.

  36. Kilbourne AM, Almirall D, Goodrich DE, et al. Enhancing outreach for persons with serious mental illness: 12-month results from a cluster randomized trial of an adaptive implementation strategy. Implement Sci. 2014;9:163. Published 2014 Dec 28. https://doi.org/10.1186/s13012-014-0163-3.

  37. Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. Available from: http://mirlyn.lib.umich.edu/Record/011515836 CN.

  38. Eisman A, Kilbourne A, Dopp A, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433.

  39. Kilbourne A, Goodrich D, Nord K, Van Poppelen C, Kyle J, Bauer M, et al. Long-Term Clinical Outcomes from a Randomized Controlled Trial of Two Implementation Strategies to Promote Collaborative Care Attendance in Community Practices. Administration and policy in mental health. 2015;42(5):642–53.

  40. Eisman A, Kiperman S, Rupp L, Kilbourne A, Palinkas L. Understanding Key Implementation Determinants for a School-Based Universal Prevention Intervention: A qualitative study. Translational Behavioral Medicine. 2022;12(3):411–22.

  41. Eisman A, Palinkas L, Herrenkohl T, Boyd C, Kilbourne A. Meeting the needs of trauma-exposed youth: enhancing implementation fidelity to a state-wide health curriculum. Washington DC: Academy Health and NIH; 2020.

  42. Baumann A, Cabassa L. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(190):9.

  43. LeTendre M, Reed M. The effect of adverse childhood experience on clinical diagnosis of a substance use disorder: results of a nationally representative study. Subst Use Misuse. 2017;52(6):689–97.

  44. Pearson N, Naylor P, Ashe M, Fernandez M, Yoong S, Wolfenden L. Guidance for conducting feasibility and pilot studies for implementation trials. Pilot Feasibility Stud. 2020;6(1):167.

  45. Eisman A, Kilbourne A, Greene D, Walton M, Cunningham R. The user-program interaction: how teacher experience shapes the relationship between intervention packaging and fidelity to a state-adopted health curriculum. Prev Sci. 2020;21(6):820–9.

  46. United Health Foundation. Adverse childhood experiences: America’s health rankings analysis of U.S. HHS, HRSA, Maternal and Child Health Bureau (MCHB), Child and Adolescent Health Measurement Initiative (CAHMI), National Survey of Children’s Health Indicator Data Set, Data Resource Center for Child and Adolescent Health. 2020. Available from: https://www.americashealthrankings.org/explore/health-of-women-and-children/measure/ACEs/state/MI. Cited 2020 Jul 10.

  47. MiSHCA. MiSHCA. Available from: https://mishca.org/. Cited 2021 Feb 25.

  48. Escoffery C, Lebow-Skelley E, Haardoerfer R, Boing E, Udelson H, Wood R, et al. A systematic review of adaptations of evidence-based public health interventions globally. Implement Sci. 2018;13(1):125.

  49. Wiltsey Stirman S, Baumann A, Miller C. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.

  50. Kirchner J, Waltz T, Powell B, Smith J, Proctor E. Implementation strategies. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018.

  51. Ritchie M, Dollar K, Miller C, Oliver K, Smith J, Lindsay J, et al. Using implementation facilitation to improve care in the Veterans Health Administration (version 2): Veterans Health Administration, Quality Enhancement Research Initiative (QUERI) for Team-Based Behavioral Health; 2017.

  52. Neumann PJ, Sanders GD, Russell LB, Siegel JE, Ganiats TG. Cost-effectiveness in health and medicine: Oxford University Press; 2016. p. 537.

  53. Drummond M, Coyle D. The role of pilot studies in the economic evaluation of health technologies. Int J Technol Assess Health Care. 1998;14(3):405–18.

  54. Crespi CM. Improved designs for cluster randomized trials. Annu Rev Public Health. 2016;37(1):1–16.

  55. Brener N, Kann L, Shanklin S, Kinchen S, Eaton D, Hawkins J, et al. Methodology of the youth risk behavior surveillance system--2013. MMWR Recomm Rep. 2013;62(RR-1):1–20.

  56. Morse JMR. Approaches to qualitative-quantitative methodological triangulation. Nurs Res. 1991;40(2):120–3.

  57. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.

  58. Creswell J, Plano Clark V. Designing and conducting mixed methods research. 3rd ed. Los Angeles: SAGE; 2018. p. 492.

  59. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

  60. Miles MB, Huberman AM, Saldaña J. Qualitative data analysis: a methods sourcebook. 3rd ed. Thousand Oaks: SAGE Publications, Inc; 2014. p. 381.

  61. Corbin JM, Strauss AL, Strauss AL. Basics of qualitative research: techniques and procedures for developing grounded theory. 1st ed; 1998. p. 312.

  62. Boyatzis RE. Transforming qualitative information: thematic analysis and code development. Thousand Oaks: Sage Publications; 1998. p. 184.

  63. Saldana L. The stages of implementation completion for evidence-based practice: protocol for a mixed methods study. Implement Sci. 2014;9(1):43.

  64. Bowser D, Henry B, McCollister K. Cost analysis in implementation studies of evidence-based practices for mental health and substance use disorders: a systematic review. Implement Sci. 2021;16(26):15.

  65. Holmes L, Landsverk J, Ward H, Rolls-Reutz J, Saldana K, Wulczyn F, et al. Cost calculator methods for estimating casework time in child welfare services: a promising approach for use in implementation of evidence-based practices and other service innovations. Child Youth Serv Rev. 2014;39:169–76.

  66. Chamberlain P, Snowden L, Padgett C, Saldana L, Roles J, Holmes L, et al. A strategy for assessing costs of implementing new practices in the child welfare system: adapting the english cost calculator in the United States. Adm Policy Ment Health. 2011;38(1):24–31.

  67. Eisman A, Hutton D, Prosser L, Smith S, Kilbourne A. Cost-effectiveness of the Adaptive Implementation of Effective Programs Trial (ADEPT): approaches to adopting implementation strategies. Implement Sci. 2020;15(1):1–13.

  68. EuroQol Research Foundation. EQ-5D-5L User Guide, 2019. Available from: https://euroqol.org/publications/user-guide.

  69. Barrera M, Berkel C, Castro FG. Directions for the advancement of culturally adapted preventive interventions: local adaptations, engagement, and sustainability. Prev Sci. 2017;18(6):640–8.

  70. Johnston LD, Miech RA, O’Malley PM, Bachman JG, Schulenberg JE, Patrick ME. Monitoring the future national survey results on drug use: 1975-2017. Ann Arbor: Institute for Social Research; 2018. p. 126.

  71. 2018 National Survey on Drug Use and Health (NSDUH): CAI Specifications for Programming. Rockville, MD: Center for Behavioral Health Statistics and Quality; 2017.

  72. Giles SM, Harrington NG, Fearnow-Kenney M. Evaluation of the All Stars Program: Student and Teacher Factors That Influence Mediators of Substance Use. J Drug Educ. 2001;31(4):385–97.

  73. Home | Michigan Model for Health [Internet]. Available from: https://www.michiganmodelforhealth.org/. Cited 14 Dec 2020.

  74. Stirman S, Miller C, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(1):65.

  75. Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, et al. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol. 2010;10(1):10.

  76. Sim J. Should treatment effects be estimated in pilot and feasibility studies? Pilot Feasibility Stud. 2019;5(1):1–7.

  77. West BT, Welch KB, Galecki AT. Linear Mixed Models: A Practical Guide Using Statistical Software. 2nd ed. Chapman and Hall/CRC; 2014. [cited 2020 Feb 28].

  78. Miller T, Hendrie D. Substance abuse prevention dollars and cents: a cost-benefit analysis. Rockville: Center for Substance Abuse Prevention, Substance Abuse and Mental Health Services Administration; 2008.

Acknowledgements

We would like to acknowledge David Hutton, Lisa Prosser, Dana Greene, Jr., Carol Boyd, Christina Harvey, and Kelly Walsh for their contributions to the project.

Funding

This research is supported by the National Institute on Drug Abuse, K01DA044279, PI: Eisman.

Author information

Authors and Affiliations

Authors

Contributions

AE wrote the first draft of the protocol, assembled the study team, secured the grant funding, and provided overall project leadership. AE, AK, CK, TH, and JF designed the implementation intervention strategies. AE, LL, and CK provided input on study measures for primary and exploratory outcomes, including implementation fidelity and student outcomes assessments, and study protocol timeline. LP provided input on qualitative and mixed methods data collection and analysis. JF and CK provided input on specific aspects of the implementation strategy design and tailoring. UA, LL, LP, AK, TH, and JF provided input on drafts of the study protocol, including critical review and writing. All authors read and approved the final protocol manuscript.

Corresponding author

Correspondence to Andria B. Eisman.

Ethics declarations

Ethics approval and consent to participate

This study is recruiting high school teachers from two regions in Michigan. Before participation, teachers will be randomized into two groups: REP and Enhanced REP. The study was approved under Expedited Review Category 7 by the Wayne State University Institutional Review Board (IRB-20-10-2821) and registered on ClinicalTrials.gov (NCT04752189, https://clinicaltrials.gov/ct2/show/NCT04752189) on February 12, 2021. We will obtain teacher consent prior to participation in the survey and semi-structured interviews using an online consent form. Parental opt-out consent and student assent will be obtained before student participation in the survey. Electronic data files will be password-protected and stored on a secure institutional server with restricted access. Consent forms will be stored separately because they contain identifying information. Qualitative data will be transcribed using a HIPAA-compliant transcription service. The proposed research will include a data safety and monitoring board (DSMB) composed of individuals who are not affiliated with the project and who have expertise and experience in data safety monitoring and human subjects research.

Consent for publication

Not applicable.

Competing interests

The authors declare they have no competing interests to report.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Eisman, A.B., Palinkas, L.A., Koffkey, C. et al. Michigan Model for HealthTM Learning to Enhance and Adapt for Prevention (Mi-LEAP): protocol of a pilot randomized trial comparing Enhanced Replicating Effective Programs versus standard implementation to deliver an evidence-based drug use prevention curriculum. Pilot Feasibility Stud 8, 204 (2022). https://doi.org/10.1186/s40814-022-01145-6

  • DOI: https://doi.org/10.1186/s40814-022-01145-6

Keywords