
Researching the Impact of Service provider Education (RISE) Project — a multiphase mixed methods protocol to evaluate implementation acceptability and feasibility

Abstract

Background

Health and social service providers receive limited education on recognizing and responding to family violence. With adequate education, providers could be prepared to identify individuals subjected to family violence and help reduce the risk of associated impairment. Informed by the Active Implementation Frameworks, our research will determine the scope of strategies needed for the uptake and sustainability of educational interventions focused on family violence for providers. It will also determine the acceptability, feasibility, and proof-of-concept for a new educational intervention, called VEGA (Violence, Evidence, Guidance, Action), for developing and improving primary care provider knowledge and skills in family violence.

Methods

This paper details the protocol for the Researching the Impact of Service provider Education (RISE) Project. The RISE Project follows a sequential multiphase mixed method research design; qualitative and quantitative data are being collected and integrated over three conceptually and methodologically linked research phases. Activities primarily occur in Ontario, Alberta, and Quebec. Phase 1 uses a sequential exploratory mixed method research design to characterize the scope and salience of learning and implementation needs and preferences for family violence education. Phase 2 will use an embedded mixed method research design to determine whether VEGA technology supports providers to achieve their family violence learning goals with effectiveness, efficiency, and satisfaction. Phase 3 will use a concurrent mixed method research design to determine acceptability, feasibility, and proof-of-concept for evaluating whether VEGA improves primary care providers’ knowledge and skills in family violence. This final phase will provide information on implementation strategies for family violence education in the “real world.” It will also generate data on provider recruitment, retention, and data completeness, as well as exploratory estimates of the effect for provider outcome measures proposed for a randomized controlled trial.

Discussion

The RISE Project comprehensively integrates an implementation approach to improve family violence education for the health and social service professions. It will provide important information about factors that could influence the uptake and effectiveness of a health profession’s educational intervention into the real world, as well as provide foundational evidence concerning the tenability of using a randomized controlled trial to evaluate the impact of VEGA in primary care settings.


Background

Research consistently details the negative physical, emotional, and economic consequences of intimate partner violence (IPV), child maltreatment, and children’s exposure to IPV (collectively hereafter referred to as “family violence”) on the development and well-being of individuals, families, and communities [1,2,3,4]. Importantly, evidence indicates that exposure to one or more forms of family violence in childhood significantly increases an individual’s risk for further victimization over the life course [5,6,7]. Given the considerable overlap in the occurrence and health-related burdens associated with the various forms of family violence, health and social service providers (HSSPs), including physicians, social workers, nurses, midwives, as well as others, have been recognized as having a critical role in prevention and early intervention [8, 9]. However, several studies indicate that HSSPs have limited formal education related to family violence and that they experience challenges recognizing and responding to the various forms of family violence in their practice encounters [10,11,12,13]. In addition, evidence indicates an urgent need to increase the amount and quality of family violence education for HSSPs [11,12,13,14,15]; a scalable and efficacious educational intervention for HSSPs that addresses all forms of family violence has yet to be identified.

This paper details the protocol for the Researching the Impact of Service provider Education (RISE) Project. The RISE Project is a novel, multiphase mixed method research project. It is informed by the Active Implementation Frameworks (AIFs) and aims to determine the scope of strategies needed for the uptake and sustainability of educational interventions focused on family violence for HSSPs, as well as evaluate the uptake and educational impact of the Violence, Evidence, Guidance, Action (VEGA) educational intervention. VEGA (detailed below) is a publicly available, online educational intervention that was released in February 2020; it includes several pedagogical strategies to support increases in HSSP knowledge, attitudes, skills, and behaviors (KASB) related to recognizing and responding to all forms of family violence.

Educational interventions for HSSP recognition and response to family violence

Five systematic reviews provide important information regarding existing educational interventions focused on family violence that have been empirically evaluated [16,17,18,19,20]. This literature indicates that available interventions have tended to focus on physical/sexual IPV and on physician, nurse, and dental professionals in the USA, UK, and Australia. Importantly, these interventions do not accord with recent evidence about best practices for safely recognizing and responding to family violence in clinical encounters. Several interventions have emphasized screening for family violence exposures, despite evidence indicating that screening does not lead to measurable improvements in health outcomes among individuals exposed to family violence and can, in fact, place victims at greater risk of harm [21, 22].

Similarly, few of the educational interventions acknowledge the significant evidence detailing the complex overlap between IPV, children’s exposure to IPV, and other forms of child maltreatment [23,24,25,26,27,28,29]. Given the disproportionate impacts of IPV for women, the extent of overlap between IPV and child maltreatment, and that mandatory reporting laws have progressively recognized the harmful effects of children’s exposure to IPV, advocates have been clear about the need for educational programs to carefully consider responses to suspected and disclosed IPV in clinical practice.

Existing interventions have also focused on measuring changes to health professional attitudes and knowledge and have less often incorporated reliable assessments of change in practice skills and behaviors; for example, knowing that a client presentation meets the threshold for suspicion of child maltreatment and requires reporting to child protection authorities, referral to resources, or other types of intervention. One exception is the study by Pelletier and colleagues [30] which demonstrated significant improvements in reporting accuracy using vignette-based assessment methods, from pre- to post-education [30,31,32].

In sum, peer-reviewed empirical literature which evaluates the value and impact of educational interventions focused on family violence among HSSPs is disparate, does not consider all forms of family violence, does not reflect current evidence for best practices in recognition and response, and is inadequately designed for scalability across disciplines and contexts. We do not know which educational approaches reliably change HSSP knowledge and skill related to family violence.

Though not yet formally evaluated, the VEGA Family Violence Educational Resources (see: https://vegaproject.mcmaster.ca/) offer important potential to address the critical education gaps related to family violence among HSSPs. VEGA is a free, brief, and evidence-informed health professions educational intervention that aims to improve HSSP KASB related to recognizing and responding to family violence in clinical practice. Informed by the AIF, the RISE Project (outlined below) is the first formal evaluation of VEGA. The RISE Project is using a robust, multiphase mixed method research design that will (1) identify and develop implementation and evaluation supports for the pan-Canadian dissemination and implementation of family violence education, including VEGA, to HSSPs; (2) use user-testing methodology to determine if the technology of VEGA allows its users to achieve their learning goals effectively, efficiently, and satisfactorily in a self-directed learning format; and (3) determine acceptability, feasibility, and exploratory estimates of impact for implementing and evaluating VEGA as a health professions education intervention in the Canadian primary care setting. The RISE Project has recently completed the qualitative strand of Phase 1 data collection; the quantitative strand for Phase 1 is currently ongoing. No results manuscripts have been generated or are under consideration.

The specific qualitative, quantitative, and mixed method research questions [33] informing each phase of the RISE Project are included in Table 1.

Table 1 RISE Project research questions

Implementation and research framework

All phases of the RISE Project are informed by a novel application of the Active Implementation Frameworks (AIFs) [34,35,36,37]. The AIFs are a determinant framework that acknowledges the importance of considering multiple levels and types of influence in the uptake and sustainability of educational interventions [38]. The AIFs outline five key determinants of implementation and evaluation success, including (1) a usable innovation (i.e., educational intervention); (2) implementation stages; (3) implementation teams; (4) the identification and enactment of implementation drivers; and (5) the incorporation of improvement cycles [34,35,36,37, 39]. Previous work by members of our own team [40,41,42,43,44,45], as well as others [37, 46,47,48], indicates that implementation and evaluation efforts guided by the AIFs have resulted in effective practice change. With respect to implementation stages, it is important to note that stage 1 (exploration) is the point at which an organization (or stakeholder group) considers the need and fit of a usable innovation. This stage of the AIF was addressed in the preparatory work for the RISE Project via the engagement of “champions” representing eight national organizations of HSSPs, primarily physicians and social workers (see Table 2).

Table 2 Collaborating organizations

Methods

Overall design

To achieve its aims, the RISE Project is using an emergent, sequential, multiphase mixed method research design [33]. We are collecting and integrating qualitative and quantitative data over the course of three conceptually and methodologically linked research phases. Each of the three phases is driven by its own mixed method research design, with each phase given equal priority and purposefully connected via incorporating the findings of previous phases [33, 49]. Informed by the guidelines detailed by Creswell and Plano Clark [33], Fig. 1 provides a graphical overview of our multiphase mixed method research design. Details concerning the procedures for each phase are described below. The RISE Project has been approved by the Hamilton Integrated Research Ethics Board (Project #: 11295), the University of Calgary Conjoint Research Ethics Board (Project #: REB20-0338), and McGill University’s Research Ethics Board (Project #: 20-06-038). The Standards for Reporting Implementation Studies (StaRI) checklist has guided the reporting of our protocol [1]. Our completed checklist can be found in Supplementary File 1.

Fig. 1

Overview of the RISE Project’s multiphase mixed method research design. The figure gives an overview of the RISE Project’s multiphase mixed method research design; it details three phases of research, each characterized by its own mixed method research design. Phase 1 uses a sequential exploratory mixed method research design, given by the notation QUAL ➔ quan. This notation indicates that the qualitative strand occurs first, is given more weight, and informs the quantitative research strand. Phase 2 uses an embedded mixed method research design, given by the notation qual(QUAN). This notation indicates that although the phase starts with qualitative data collection, it is embedded within a larger quantitative paradigm; data for both strands are collected in the same data collection visit. Phase 3 uses a concurrent mixed method research design, which includes parallel collection of qualitative and quantitative data that occurs over multiple visits and is analyzed separately (notation is QUAL + QUAN). The qualitative and quantitative data for each site in each province are weighted equally, and findings for each strand, for each site, are integrated to create a comprehensive interpretation. The arrows connecting each phase indicate that some aspects of the findings and methods (e.g., measures) from each phase inform the next phase

Setting and participants

The majority of the RISE Project activities will take place in the Canadian provinces of Alberta, Ontario, and Quebec. Up to one-third of adult Canadians have reported exposure to child maltreatment or intimate partner violence in their lifetime [50, 51], and police-reported family violence constitutes 26% of all police-reported crime in the nation [52]. Our focus on the social work and medical disciplines, as well as on the provinces of Alberta, Ontario, and Quebec, is justified for five key reasons: (a) social workers and physicians are among the three largest groups of health care and social service providers in Canada; (b) Alberta, Ontario, and Quebec are among the most populous Canadian provinces and contain a high proportion of the HSSPs with membership in our collaborating organizations; (c) the self-reported prevalence rates of family violence in each of these three provinces are greater than 25%; (d) together, the three provinces represent the range in the proportion of provincial residents able to speak both of Canada’s official languages, English and French (Quebec = high proportion of the population is bilingual (44.5%); Ontario = moderate proportion (11.2%); Alberta = low proportion (6.6%)); and (e) each province has created primary care teams that are inclusive of a range of HSSPs who are likely to come into contact with individuals exposed to family violence, which is of particular relevance for phase 3 of our work [53,54,55,56,57,58,59,60].

Phase 1: The scope and salience of learning and implementation needs and preferences — a sequential exploratory mixed method study

Guided by stage 2 (installation) of the AIF, phase 1 of the RISE Project will determine the scope and salience of the learning and implementation needs and preferences of social workers and physicians related to continuing education in family violence. According to Metz et al. [36], the installation phase of a new intervention is an often overlooked but necessary stage for implementation success. During this stage, the focus is on identifying factors for optimizing intervention delivery, uptake, sustainability, and impact (e.g., policy and practice drivers). The objectives for phase 1 will be addressed using a sequential exploratory mixed method design [33]; qualitative and quantitative data will be collected sequentially from trainee and licensed physicians and social workers who are training or practicing in each of the three provinces of focus. The term “exploratory” denotes that this phase of the project will be qualitatively dominant, with the measures for the quantitative strand influenced by the initial qualitative findings [61].

Qualitative research strand

Design, sampling, and recruitment

Our team has drawn on the principles of qualitative description to guide sampling, data collection, and analysis procedures for the qualitative strand of phase 1 [62, 63]. Criterion-based sampling strategies will be used to recruit approximately 100 participants for this strand of data collection [64, 65]. We will operationalize criterion-based sampling via predefined eligibility criteria to recruit participants who (a) are 18 years of age or older; (b) are an undergraduate or graduate-level medical or social work trainee (with at least one clinical placement/practicum in the last 12 months) or a practicing physician or social worker; and (c) reside in the province of Quebec, Alberta, or Ontario. Recruitment will occur via a three-pronged approach. First, recruitment e-mails will be distributed via our collaborating organizations. Second, this process will be supplemented with the posting of recruitment materials on social media platforms, as well as e-mail requests for participation circulated to the professional networks of the study team. Third, snowball sampling methods will be used with each “source” participant to accumulate additional participants over time [65].

Qualitative data collection and analysis

Enrolled participants will be asked to complete a one-on-one, semi-structured qualitative interview with a member of the research team by Zoom or telephone [66]. A semi-structured interview guide consisting of 5–7 key, open-ended questions, informed by the research objectives for phase 1, will guide data collection. In keeping with the traditions of qualitative description, interview questions will be adapted throughout data collection to explore patterns in the data. Interviews will last between 30 and 45 min and will be audio-recorded and transcribed verbatim by a professional transcription service working with the research team. Demographic data will be collected from each participant using a short demographic questionnaire. Each participant will receive a $75.00 honorarium in the form of an e-gift card as a token of appreciation for completing the interview.

This phase will use an inductive and deductive approach to analyze participant descriptions of their needs and preferences related to education on family violence to produce (a) new practice-based insights about the type and extent of family violence education that is needed among our disciplinary groups and (b) a detailed summary of the drivers needed to scale up and sustain family violence education, and VEGA more specifically, among HSSPs in Canada. Inductive conventional content analysis [67] of interview transcripts using the constant-comparison technique will identify pertinent concepts and constructs related to participants’ perceived learning and implementation needs and preferences related to education in family violence; it will also allow for an examination of the extent to which needs and preferences are consistent versus distinct across disciplines and fields of specialty (e.g., community mental health, emergency medicine). Summative content analysis, which is a deductive analytical technique [67], will provide counts of the needs and preferences identified by participants; this information will aid in the interpretation of the results by demonstrating the learning and implementation needs and preferences that are most relevant for physicians versus social workers, as well as those that may be more or less salient across each discipline and field of specialty [68, 69].

Quantitative research strand

Design, sampling, and recruitment

The quantitative strand will corroborate and extend the findings from the qualitative strand of phase 1 and determine the frequency of family violence education learning and implementation needs and preferences within and across groups of Canadian providers and trainees at a national level; this will allow our team to make empirically supported decisions related to strategies for supporting the scalability and sustainability of family violence education — and VEGA more specifically, should it prove to be an effective educational intervention. The quantitative strand will follow a quantitative, cross-sectional survey research design [70]; the primary sampling frame will be the membership registries of our collaborating organizations, which provide lists of members who have consented to the respective organization making their e-mail address available for research purposes. Using non-probability, opportunity-based sampling, eligible participants will be those who (a) are 18 years of age or older, (b) are a trainee or practicing physician or social worker residing in any Canadian province or territory, and (c) can provide informed consent and complete the self-report survey in either English or French. A request will be made to the administrator of each collaborating organization to distribute our recruitment and data collection materials via e-mail to individuals on the research registries, directing interested participants on how to complete research activities. This process will be supplemented with the posting of recruitment materials on social media platforms, as well as e-mail requests for participation circulated to the professional networks of the study team.

Quantitative data collection and analysis

Informed consent and data collection procedures will be completed anonymously via Lime Survey at the participant’s convenience during a 4-week study window. Survey items will be informed by the coding categories and constructs identified in the qualitative strand of phase 1 (described above) and will include validated measures from the health professions education, implementation science, and family violence literature. Specifically, items will ask participants to self-report on their (a) socio-demographic characteristics and previous training in family violence; (b) readiness to undertake family violence training (e.g., Brief Readiness to Change Scale [71]); (c) attitudes toward incorporating research evidence into their practice (e.g., Evidence Based Practice Attitudes Scale); (d) preparedness to address family violence in practice (e.g., an adapted Physician Readiness to Manage Intimate Partner Violence Survey (PREMIS) [72, 73] and the Healthcare Provider Attitudes toward Child Maltreatment Reporting Scale [74, 75]); and (e) preferences, barriers, and facilitators related to participating in education activities focused on family violence (e.g., online vs. face-to-face learning) [76]. Participants who complete the anonymous survey will be given the option to enter their name in a draw to win one of six $500.00 honoraria.

All data from our Lime Survey platform will be exported into our data analysis software, SPSS (version 28). Estimates of the variability and salience of HSSP learning and implementation needs and preferences will be generated via statistics of dispersion and central tendency. Differences across provider groups and specialties will be evaluated using single-level correlation and regression analysis.
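As an illustration only (the protocol specifies SPSS version 28 for the actual analysis), the planned descriptive statistics and single-level regression comparison could be sketched in Python as follows; the variable names and Likert values are hypothetical, not study data.

```python
import statistics

# Hypothetical self-rated preparedness scores (1-5 Likert) for two
# provider groups; values are illustrative only.
physicians = [2, 3, 3, 4, 2, 3]
social_workers = [4, 4, 3, 5, 4, 4]

def describe(scores):
    """Central tendency and dispersion for one group."""
    return {
        "mean": statistics.mean(scores),
        "median": statistics.median(scores),
        "sd": statistics.stdev(scores),
    }

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Regressing scores on a 0/1 group indicator yields the between-group
# mean difference, mirroring a single-level regression comparison.
group = [0] * len(physicians) + [1] * len(social_workers)
scores = physicians + social_workers
effect = ols_slope(group, scores)
```

With a binary group indicator, the fitted slope equals the difference in group means, which is why a regression framing generalizes the simple group comparison.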

Phase 2: VEGA usability and exploratory assessment of education outcomes

According to the AIFs, stage 3 (initial implementation) refers to the point at which changes begin to occur within the overall practice environment. For the purposes of this project, this would include not only changes in HSSP KASB, but also changes in the overall capacity of the health and social service sector to safely recognize and respond to family violence. Guidelines from the developers of the AIF indicate that stage 3 is the point at which implementation challenges present themselves, as do opportunities to refine and expand the suite of strategies to support successful implementation and realize intervention impact [36]. Given the probability of implementation failure at this stage, data-driven, pilot-based approaches are encouraged [77, 78]. For this reason, we will pilot VEGA through the application of usability testing. Usability testing focuses on the evaluation of intervention technology and is an essential step for realizing e-learning impacts; this is especially the case in health professions education [79,80,81,82].

Usability testing follows a data transformation variant of an embedded mixed method research design to generate a description of (a) intervention usability (i.e., the extent to which VEGA can be used by HSSP users to achieve their learning goals with effectiveness, efficiency, and satisfaction) and (b) educational impact [83]. Using this design, qualitative and quantitative data are collected from the same participants during a single data collection visit.

The intervention: Violence, Evidence, Guidance, Action (VEGA) project

VEGA is an online educational intervention that was created for Canadian HSSPs to develop and improve their KASB related to recognizing and responding to all forms of family violence. The intervention was developed using an iterative design process that incorporated systematic evidence reviews of the family violence literature, environmental scans of existing and relevant training resources, and consultation with clinical and research experts in the areas of family violence, instructional design, and health professions education. An important element of VEGA’s development also included repeat consultation from clinicians and scientists belonging to 22 national-level HSSP organizations in Canada.

VEGA follows a participatory, encounter-based curriculum over the course of four core learning modules; learning module content and pedagogical approaches are informed by evidence-based models of adult learning and cognitive processing [84,85,86,87,88] and follow the VEGA Competency Framework for Recognizing & Responding Safely to Family Violence. VEGA can be completed as a self-directed educational activity or it can be delivered by trained facilitators in a virtual or face-to-face workshop. Time to completion is approximately 3 h. Although freely available to HSSPs across Canada, the intervention has yet to undergo formal evaluation to determine its effectiveness. More information about VEGA and its competency framework can be found at vegaproject.mcmaster.ca.

Qualitative research strand

Sampling and qualitative data collection

Evidence indicates that 95% of usability problems can be identified with usability testing among a purposeful sample of five to ten potential end-users. Where an intervention is intended to meet the learning needs of several different end-users, sufficient user-testing is needed with each user type [89, 90]. Given this information, we will use purposeful, criterion-based sampling [64, 65] to recruit a convenience sample of up to 20 trainees (~10 social work; ~10 medicine) and 20 licensed practitioners (~10 social workers; ~10 physicians) to participate in this phase of the project. Eligible participants will be those who are (a) 18 years of age or older; (b) a trainee or practicing physician or social worker residing in Hamilton, Calgary, or Montreal; and (c) willing to complete user-testing procedures synchronously with a member of the research team.

Enrolled participants will complete the self-directed format of the VEGA intervention using a think-aloud protocol [89]. Specifically, the participant will be prompted by a member of the research team to navigate through the VEGA intervention to complete a series of learning tasks that follow VEGA’s recommended learning pathway while “thinking aloud.” That is, participants will be prompted by the research team member to say their thoughts and activities “out loud” as they move through the intervention (e.g., “I am now scrolling down to see more content;” “I can’t locate the button to move forward in this module”). The think-aloud protocol will be audio-recorded and transcribed verbatim for qualitative data analysis. All participants will be offered continuing medical education credits (physicians; physician trainees) or a certificate of continuing education participation (social workers; social work trainees) for completing the VEGA intervention via the think-aloud protocol.

Qualitative data analysis and data transformation

De-identified think-aloud transcripts will be analyzed using framework analysis [91, 92]. This process involves the a priori indexing of the types of usability problems detailed by Hornbaek [93] and Hvannberg and Law [94] into NVivo data management software and completing iterative reviews of the transcripts to apply the indexed usability problems to the transcribed data. Application of usability problems to the data will be completed independently by two members of the research team and will allow for the determination of usability problems (if any), problem types, their frequency, and their severity (1 = mild problem, 2 = moderate problem, 3 = serious problem, 4 = critical problem). Differences in indexing of user problems will be resolved via consensus discussion among the analysts and the leads of the research team.
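The determination of problem types, frequencies, and severities described above is a coding and tallying exercise carried out in NVivo; purely as a hedged sketch, the tallying step could look like the following in Python, with entirely hypothetical coded data.

```python
from collections import Counter, defaultdict

# Hypothetical consensus-coded usability problems from think-aloud
# transcripts; each entry is (problem_type, severity), with severity
# on the protocol's 1-4 scale (1 = mild ... 4 = critical).
coded_problems = [
    ("navigation", 2), ("navigation", 3), ("content_clarity", 1),
    ("navigation", 2), ("layout", 4),
]

# Frequency of each problem type across transcripts
type_counts = Counter(ptype for ptype, _ in coded_problems)

# Worst observed severity per problem type, to help prioritize fixes
worst_severity = defaultdict(int)
for ptype, severity in coded_problems:
    worst_severity[ptype] = max(worst_severity[ptype], severity)
```

Reporting both frequency and worst severity distinguishes problems that are common but mild from those that are rare but critical.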

Quantitative research strand

Data collection

The quantitative strand of data collection follows a pre-post research design [95]. Specifically, the participants who enroll in the qualitative portion above will self-complete a series of research assessments on our Lime Survey platform at two timepoints: immediately prior to (time 1; pre-assessment) and immediately following (time 2; post-assessment) their think-aloud protocol. Research assessments will be embedded in a hyperlink that is sent to participants via e-mail 25 min prior to the start of the think-aloud protocol, as well as again immediately after the think-aloud protocol. In addition to socio-demographic characteristics (sex at birth, gender identification, age, location, professional status and discipline, years of practice), the pre- and post-assessments will include the same validated measures of KASB administered in the quantitative strand of phase 1, as well as a vignette-based assessment of knowledge and skill accuracy related to recognizing and responding to family violence. Vignette-based assessment methods are a common, robust measure of practitioner knowledge and skill accuracy related to family violence [30,31,32], as well as in medical and health professions education more generally [96,97,98].

The post-assessment will also ask participants to self-report the extent to which they perceive their current clinical environment to be safe for discussing complex issues related to family violence (e.g., Safety Culture Scale [99, 100]), as well as their satisfaction with the usability of the VEGA intervention (e.g., System Usability Scale (SUS)) [101, 102]. Validated measures will be supplemented with data compiled and tracked by the project’s research staff; this will include tracking (a) VEGA usability effectiveness via “learning task completion” during the think-aloud protocol with a yes/no checklist, (b) user time on each “learning task,” (c) missing data at the item and group level at the pre- and post-assessment timepoints, and (d) the average time needed to complete pre- and post-research assessments. An honorarium ($150.00 for a practicing social worker or physician; $75.00 for resident physicians and trainee social workers) in the form of an e-gift card will be provided as a token of appreciation for the completion of quantitative measures.

Quantitative and integrated data analysis

Quantitative data will be analyzed and interpreted via estimates of central tendency and dispersion. Specifically, the proportion of participants reporting “satisfactory” usability for the VEGA intervention (i.e., a score of > 70) will be generated for trainees and practitioners by discipline (social work, medicine) [89, 93, 94]. In addition, the mean and range of SUS scores for the entire sample will be reported, as will the mean and range for participants who reported an SUS adjective rating of (a) poor, (b) “OK,” (c) good, (d) excellent, and (e) best imaginable. We will compute the range of missing data for all quantitative measures, with feasibility of collecting quantitative outcome data indicated by less than 20% missing data at the participant and group level for each timepoint. We will also generate and present (a) correlations between continuous user satisfaction scores and pre/post-practitioner scores on KASB measures and (b) cross-tabulations of satisfaction scores, the rate of usability problems, problem types, and problem severity [33, 103]. Upon review of results, team members will decide which usability changes — if any — need to occur prior to initiating phase 3 of the RISE Project. We will partner with the VEGA team to implement those changes to the intervention.

Phase 3: Determining the acceptability and feasibility of implementing and evaluating VEGA in primary care

Continuing to be guided by stage 3 (initial implementation) of the AIF, the primary aim of phase 3 is to determine the acceptability and feasibility of implementing and evaluating VEGA to improve HSSP KASB related to recognizing and responding to family violence in the primary care setting. Our secondary objectives for phase 3 are to determine exploratory estimates of the educational impact of the VEGA intervention, as well as to describe the usefulness of implementation strategies to support and sustain VEGA educational impacts in the primary care setting. We will address our objectives using a concurrent mixed method research design; quantitative and qualitative strands of data collection and analysis will occur in parallel and be given equal priority [49]. Quantitative data will provide essential information about HSSP enrollment, retention, attrition, and data completeness; we will also generate exploratory estimates of education effect and variance. The qualitative strand of data collection will provide corroborating information about the acceptability and feasibility of the VEGA intervention in the primary care setting, as well as HSSP perceptions of the value and impact of the VEGA intervention and our implementation strategies. Table 3 details the specific acceptability, feasibility, and proof-of-concept objectives for phase 3 mapped to their type of outcome assessment, any relevant hypotheses, and analysis.

Table 3 Primary and secondary objectives, outcome variables, hypotheses, and analysis for phase 3 of the RISE Project

Quantitative research strand

Design, sampling, and recruitment

The quantitative strand will follow a non-experimental, repeated measures design [104]. HSSPs working in primary care clinics in the provinces of Ontario, Alberta, and Quebec will be recruited to undergo the VEGA intervention and complete quantitative measures of education outcomes at multiple timepoints. Measures will be administered to (a) determine the acceptability and feasibility of collecting data on proposed educational outcome measures for a definitive trial and (b) generate preliminary estimates of effect (i.e., proof-of-concept) and variability, which can inform sample size estimations for a definitive trial. To achieve our research aims, we will use a three-stage sampling strategy; two of the three stages are relevant for the quantitative strand of data collection.

Stage 1 will occur at the clinic level. Criterion and simple random sampling strategies will be used to enlist two primary care clinics in each of the provinces of Ontario, Quebec, and Alberta [64, 65]. Given that samples of approximately 40 participants are generally sufficient for acceptability and feasibility studies [105,106,107], a roster of primary care clinics with a front-line complement of between 10 and 60 HSSPs who provide health and/or social services to individuals and families in each of the eligible provinces will be generated via provincial registries made publicly available on each province’s Ministry of Health website. Two clinics per province will be selected from these rosters for participation via a simple random sampling algorithm in SPSS. Directors of the selected clinics will be contacted by the research team via email and provided with a one-page overview of the project and the opportunity to meet synchronously with the research team to address study-related questions. Directors will be asked to provide consent for clinic participation, as well as the email addresses of all HSSPs within the clinic. To be eligible to participate, the Director of each selected clinic must (a) provide permission for the full complement of their HSSP staff to complete study-related activities and (b) have no ongoing participation in other education-related research projects. Random selection of replacement clinics will continue until we achieve our clinic sample aims.
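The protocol specifies SPSS for the random draw; purely as an illustrative equivalent, the selection of two clinics per province from eligibility-filtered rosters could look like the following Python sketch (rosters, clinic names, and the seed are all hypothetical):

```python
import random

# Hypothetical rosters of eligible clinics (10-60 front-line HSSPs) per province.
rosters = {
    "Ontario": ["ON-clinic-%02d" % i for i in range(1, 21)],
    "Quebec": ["QC-clinic-%02d" % i for i in range(1, 16)],
    "Alberta": ["AB-clinic-%02d" % i for i in range(1, 12)],
}

CLINICS_PER_PROVINCE = 2

def draw_clinics(rosters, k, seed=2024):
    """Simple random sample of k clinics per province, reproducible via a fixed seed."""
    rng = random.Random(seed)
    return {prov: rng.sample(clinics, k) for prov, clinics in rosters.items()}

selected = draw_clinics(rosters, CLINICS_PER_PROVINCE)
for prov, clinics in selected.items():
    print(prov, clinics)
```

If a selected clinic declines, the same procedure would be rerun on the roster minus the declining clinic to draw a replacement, consistent with the replacement rule described above.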

Stage 2 sampling will occur at the individual level; full-population sampling will involve inviting all full-time and part-time HSSPs at each of the enrolled clinics to participate in the quantitative strand of our study [108]. We anticipate that this individual, convenience sampling approach will yield between 5 and 20 practitioners participating at each of the respective clinics. Eligible practitioners will be those who (a) have worked at the selected clinic for a minimum of 2 months, (b) intend (to the best of their knowledge) to work at the clinic until the end of the study, and (c) provide informed consent and are willing to complete quantitative and qualitative study procedures.

Implementation and educational intervention

Preparatory clinic webinars

Preparatory webinars with each of the enrolled clinics and their participating HSSPs will be conducted prior to launching baseline data collection (detailed below). Two members of the research team will facilitate the webinars with the purpose of increasing engagement in project activities, addressing any process queries prior to the launch of the data collection, and supporting readiness for implementation and evaluation activities [109,110,111].

Educational intervention

Enrolled HSSPs at each of the participating clinics will undergo a course of self-directed VEGA or workshop VEGA, as outlined in phase 2. This phase will be influenced by the perspectives of our collaborating organizations, the health professions education literature, as well as findings of phase 1 and phase 2 of the RISE Project. Should a clinic pursue the self-directed option of the VEGA intervention, each participant within the clinic will be provided with VEGA login and password information, as well as instructions to complete all VEGA learning modules within a 4-week period (i.e., the intervention period). Reminders for intervention completion will be automatically generated and sent to participants every week via the Lime Survey interface until intervention completion or until the participant’s intervention period has passed. Should the workshop format of VEGA be selected, all HSSPs in each clinic will be invited to participate in a VEGA workshop (one per clinic), co-facilitated by at least two trained facilitators from the VEGA team. Following the completion of intervention and data collection activities, workshop participants will be provided the log-in information to have unrestricted access to the self-directed format of the VEGA modules.

Participant consultation

Tri-weekly participant consultation will be provided via Zoom by two clinician members of the research team following each clinic’s intervention period. Consultation will focus on supporting HSSP clinical application of family violence recognition and response principles detailed in the VEGA curriculum [110, 112]. Consultation sessions will be 45 min in duration, audio-recorded, and transcribed verbatim for qualitative data analysis (detailed below).

Quantitative data collection

Quantitative data related to acceptability and feasibility will be collected by the project’s research coordinator (RC); this will include tracking the number of clinics that are approached for participation, request preparatory webinars, and enroll. It will also involve tracking the number of HSSPs within each clinic who (a) inquire about participation, (b) are eligible, and (c) enroll. The RC will also record the number of (d) contacts needed to (i) complete consent procedures and (ii) arrange all research assessments, as well as the number of HSSPs who (e) complete the VEGA training modules; (f) drop out following consent; (g) could not be reached for follow-up; (h) complete quantitative research assessments at each timepoint; and (i) are approached for, agree to, complete, and withdraw from the qualitative data collection strand.

Quantitative data regarding VEGA educational outcomes will be collected via HSSPs’ self-completion of assessments administered by email at three timepoints: 1 week before the intervention period (time 1; baseline), immediately following completion of the intervention/intervention time period (time 2; post-intervention), and 3 months following intervention completion (or following the intervention period for participants who take the full time frame or who do not complete all of the VEGA learning modules) (time 3; 3-month follow-up). Given that this work is based within the overall emergent, multiphase, mixed method research design, we anticipate that measures capturing education outcomes will be the same as those administered in phase 2 of the RISE Project, which includes a brief assessment of socio-demographic characteristics. However, it is possible that these measures may expand or change throughout the duration of the research program. Additionally, at each timepoint, we will ask participants to report on the number of referrals made over the previous month to (i) intimate partner violence services, (ii) parenting services/interventions, (iii) child welfare services, or (iv) psychotherapy services.

Quantitative data analysis

Given our primary focus on acceptability and feasibility, quantitative data for our primary objectives in phase 3 will be analyzed using descriptive statistics. In addition, based on our integration of sampling and recruitment recommendations in the literature [2, 3], we have proposed a priori thresholds for acceptability and feasibility as follows: (a) the proportion of primary care clinics and their HSSPs agreeing to participate will be 60% or greater; (b) the proportion of enrolled HSSPs who complete all modules of the VEGA intervention will be 70% or greater; (c) missing data for each timepoint will be less than 20% at the HSSP and clinic levels; and (d) our team will be able to generate estimates of effect and variability for education outcome measures. Informed by the AIF and the broader implementation science literature, if more than one of these thresholds is not met, our team will consider revisions to our implementation and research procedures before proceeding to a definitive trial. Given our use of mixed methods, we expect that our qualitative data (outlined below) will be especially helpful for understanding how and why thresholds were or were not met and what can or should be augmented in the implementation and research procedures to bolster the possibility of future evaluation success. Secondary objectives will be addressed using regression analysis, with results reported as estimates of effect (95% confidence intervals) and associated p-values. Analyses will be exploratory, with no adjustments for multiple comparisons.
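The a priori thresholds above lend themselves to a simple programmatic check. The Python sketch below uses invented tallies and field names to flag which thresholds are unmet and whether the revision rule (more than one unmet threshold) is triggered; threshold (d), generating effect and variability estimates, is a judgment by the analysis team and is not modeled here:

```python
# Hypothetical tracking tallies; names and numbers are illustrative only.
tallies = {
    "clinics_approached": 10, "clinics_enrolled": 7,
    "hssps_invited": 60, "hssps_enrolled": 40,
    "hssps_completed_all_modules": 30,
    "max_missing_rate": 0.12,  # worst-case missing-data rate across timepoints
}

def feasibility_flags(t):
    """Evaluate the a priori acceptability/feasibility thresholds (a)-(c)."""
    return {
        "clinic_participation_ge_60pct": t["clinics_enrolled"] / t["clinics_approached"] >= 0.60,
        "hssp_participation_ge_60pct": t["hssps_enrolled"] / t["hssps_invited"] >= 0.60,
        "module_completion_ge_70pct": t["hssps_completed_all_modules"] / t["hssps_enrolled"] >= 0.70,
        "missing_data_lt_20pct": t["max_missing_rate"] < 0.20,
    }

flags = feasibility_flags(tallies)
unmet = [name for name, ok in flags.items() if not ok]
# Per the protocol, revisions are considered if more than one threshold is unmet.
needs_revision = len(unmet) > 1
print(flags, "needs_revision:", needs_revision)
```

With the invented tallies shown, all thresholds pass; in the study itself, the qualitative strand would then be used to interpret any unmet thresholds.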

Qualitative research strand

Design, sampling, and recruitment

The qualitative strand of phase 3 will follow the principles of qualitative description [62, 63]. Driven by eligibility and enrollment procedures for the quantitative strand of data collection, we will use criterion-based sampling to select a sub-sample of HSSPs (n = 5–10 per clinic) who provided quantitative data to participate in a qualitative semi-structured interview. This sub-sample of HSSPs will be asked to complete an interview at two timepoints: (a) within 1 week of intervention completion and (b) 2 weeks following their submission of the 3-month quantitative research assessment. We will invite HSSPs who represent various genders, education levels, employment tenures, and levels of previous training in family violence to participate in this strand of data collection. We will also recruit up to three managers, directors, or administrators (i.e., “managers/management”) from each of the enrolled clinics to participate in this strand of data collection. Management interviews will begin immediately following the intervention period for enrolled HSSPs in the same clinic. Eligibility and consent of HSSPs will have been obtained during quantitative study enrollment; consent for qualitative data collection will be verbally reconfirmed by the RC prior to qualitative data collection. Eligible managers will be those who have been working in their management role at the enrolled clinic for at least 6 months and who intend to work at the same clinic for the duration of the study.

Qualitative data collection

One-on-one interviews are a recommended method of data collection in applied qualitative research on interventions; they are a flexible approach that enables the gathering of an in-depth, first-hand account of a phenomenon in its given context [113]. A semi-structured interview guide consisting of 5–7 key open-ended questions will guide data collection at each of the clinics; the guide will focus on participants’ perceptions of the educational intervention and its perceived impact, as well as the acceptability and burden of research, educational, and implementation support activities. HSSP and management interviews will be scheduled for between 45 and 60 min via Zoom at a time that is convenient for participants. Zoom is an externally hosted cloud-based service provided to the research team through our university. A link to Zoom’s privacy policy (https://zoom.us/privacy#_Toc44414835) will be provided to all potential participants in the consent form. Our research team will take all available precautions to reduce the risk of a privacy breach, including generating a unique Zoom link for each interview and providing a unique password for entry to each interview. In addition, each participant will be asked to refrain from using the video feature of Zoom, so that only the verbal content of each interview is audio-recorded for verbatim transcription. Management participants will also be asked to complete a brief demographic information form. Qualitative interview participants will be provided a $75.00 honorarium in the form of an e-gift card at the completion of the interview.

Qualitative and integrated analyses

Qualitative data collection and analysis for each clinic will occur concurrently, beginning immediately after the clinic’s intervention period; interim analysis of transcripts of both data types will thus be able to inform the remaining clinical consultations and semi-structured interviews within and across clinics. Analysis of interview and clinical consultation transcripts will involve conventional and summative content analysis [67] to generate “within clinic” and “cross-clinic” summaries of acceptability, feasibility, and impact. Our data report will also incorporate counts of any perceived barriers and facilitators to acceptability and feasibility (alongside excerpts of qualitative data) identified in the interview or clinical consultation data via a mixed methods joint display. In addition, a modified stem-and-leaf plot will cross-tabulate scores on VEGA education outcome measures with qualitative excerpts describing the perceived value and impact of the VEGA intervention and our implementation supports [114].

Rigor and integration across research phases

Strategies for ensuring the quality of mixed methods research continue to emerge in the methodological literature; the specific strategies for ensuring the rigor of multiphase mixed method research designs remain unclear. There is consensus, however, that rigor in any mixed methods research study requires prerequisite consideration of the expectations for rigor in the qualitative and quantitative strands of data collection. Informed by this, as well as guidelines articulated by Krefting [115] and O’Cathain [116], a range of strategies will be applied within and across all phases of our research program to achieve credible (internally valid), dependable (reliable), applicable (transferable, externally valid), and confirmable (neutral) findings [115, 116]. These include (a) the commitment to publish the results for each phase of the RISE Project according to the Good Reporting of Mixed Methods Study Guidelines [116], as well as guidelines relevant to each phase of our work, for example, the Standards for Reporting Framework for Implementation Studies [117] and the CONSORT extension to pilot trials [118, 119] (both relevant for phase 3); (b) the purposeful integration of qualitative and quantitative approaches at the design, sampling, data collection, data analysis, or interpretation stages for each of the three phases of work; (c) employing psychometrically validated quantitative research instruments and repeat measurement approaches for phase 2 and phase 3; (d) a detailed audit trail throughout the entire research program; (e) field notes and analytical memos during qualitative data collection and analysis; and (f) investigator, data source, and data method triangulation where possible and appropriate [120, 121]. A detailed description of the strategies used to ensure methodological rigor in each phase of the RISE Project will be reported in that phase’s respective results publication.

Discussion

The key factors for successful uptake and sustainability of evidence-based educational interventions for HSSPs remain speculative. Recent work by Thomas [122], Price [123], and Carney [124] speaks to the potential for models of implementation science to narrow the chasm between the development, implementation, and sustainability of educational interventions among the health and social service professions. Critically, our model of implementation science acknowledges that interventions to improve HSSP KASB related to family violence (including the uptake, impact, and sustainability of VEGA) take place within a complex context of macro (e.g., regulatory college support, accreditation), meso (e.g., institutional policies and support), and micro (e.g., provider) factors; each of these factors needs to be identified and considered in implementation efforts, given their potential role in influencing intervention uptake and, therefore, the ability to actualize the intervention’s primary (i.e., educational) and secondary (i.e., health) outcomes [111, 125,126,127,128].

The present study has several strengths. First is its commitment to identify and operationalize the key drivers of family violence education uptake (and VEGA uptake more specifically) and sustainability throughout the remaining phases of the work and over the long term, which is likely to yield more rapid translation of education and health outcomes attributable to educational interventions. In addition, the triangulation of data types from trainees and practitioners in three provinces and from two of Canada’s largest HSSP disciplines [53] will enhance the credibility, transferability, and trustworthiness of our findings; it will also generate a more comprehensive understanding of family violence education drivers, perceived value and impact, and the acceptability and feasibility of evaluating VEGA in the real world. A third and critical strength of the RISE Project is a strong collaboration with eight national-level HSSP organizations that are key advocacy bodies for continuing health professions education among social workers and physicians in Canada.

There are three principal limitations of the RISE Project. The first is the exclusion of other disciplines in two of the three phases of work. Second, the quantitative strands of data collection in phase 2 and phase 3 follow non-randomized designs, which precludes the possibility of making causal claims of intervention impact. Third, none of the phases will involve the collection of data from the perspective of individuals exposed to family violence. We have designed a separate study to elicit the perspectives of those who have survived family violence or who work as community advocates for survivors, and we expect to integrate the learning from that project into the present work, as project activities progress.

The RISE Project was developed with the overall objective of initiating a robust evidence base concerning the needs and preferences related to family violence education among HSSPs in Canada, as well as generating initial information about the value and impact of VEGA for improving HSSP recognition of and response to family violence in their practice encounters. Informed by a model of implementation science, the AIF, the research program has a central emphasis on identifying and addressing key drivers of family violence education uptake, sustainability, and impact among HSSPs throughout project activities. This model provides a constructive framework for considering how the broader impacts of VEGA, or any other family violence educational intervention, can be quantified, explained, and leveraged within and across HSSPs and their service contexts more generally.

Availability of data and materials

Not applicable.

Abbreviations

AIF:

Active Implementation Frameworks

HSSP:

Health and social service provider

KASB:

Knowledge, Attitudes, Skills, and Behaviors

RISE Project:

Researching the Impact of Service provider Education Project

VEGA:

Violence, Evidence, Guidance, Action

References

  1. Carr CP, Martins CM, Stingel AM, Lemgruber VB, Juruena MF. The role of early life stress in adult psychiatric disorders: a systematic review according to childhood trauma subtypes. J Nerv Ment Dis. 2013;201(12):1007–20.

  2. McCrory E, De Brito SA, Viding E. The link between child abuse and psychopathology: a review of neurobiological and genetic research. J R Soc Med. 2012;105(4):151–6.

  3. Naughton AM, Maguire SA, Mann MK, Lumb RC, Tempest V, Gracias S, et al. Emotional, behavioral, and developmental features indicative of neglect or emotional abuse in preschool children: a systematic review. JAMA Pediatr. 2013;167(8):769–75.

  4. World Health Organization. World report on violence and health. Geneva: World Health Organization; 2002.

  5. Capaldi DM, Knoble NB, Shortt JW, Kim HK. A systematic review of risk factors for intimate partner violence. Partn Abus. 2012;3(2):231–80.

  6. Kimber M, Adham S, Gill S, McTavish J, MacMillan HL. The association between child exposure to intimate partner violence (IPV) and perpetration of IPV in adulthood-a systematic review. Child Abuse Negl. 2018;76:273–86.

  7. Li S, Zhao F, Yu G. Childhood maltreatment and intimate partner violence victimization: a meta-analysis. Child Abuse Negl. 2019;88:212–24.

  8. World Health Organization. Responding to intimate partner violence and sexual violence against women: WHO clinical and policy guidelines. Geneva: World Health Organization; 2013.

  9. National Institute for Health and Care Excellence (NICE). Domestic violence and abuse: how health services, social care, and the organizations they work with can respond effectively. London: NICE Department of Health; 2014.

  10. Wathen CN, Tanaka M, Catallo C, Lebner AC, Friedman MK, Hanson MD, et al. Are clinicians being prepared to care for abused women? A survey of health professional education in Ontario, Canada. BMC Med Educ. 2009;9:34.

  11. McTavish JR, Kimber M, Devries K, Colombini M, MacGregor JCD, Wathen CN, et al. Mandated reporters’ experiences with reporting child maltreatment: a meta-synthesis of qualitative studies. BMJ Open. 2017;7(10):e013942.

  12. McTavish JR, Kimber M, Devries K, Colombini M, MacGregor JCD, Wathen N, et al. Children’s and caregivers’ perspectives about mandatory reporting of child maltreatment: a meta-synthesis of qualitative studies. BMJ Open. 2019;9(4):e025741.

  13. Crombie N, Hooker L, Reisenhofer S. Nurse and midwifery education and intimate partner violence: a scoping review. J Clin Nurs. 2017;26(15-16):2100–25.

  14. Lewis NV, Feder GS, Howarth E, Szilassy E, McTavish JR, MacMillan HL, et al. Identification and initial response to children’s exposure to intimate partner violence: a qualitative synthesis of the perspectives of children, mothers and professionals. BMJ Open. 2018;8(4):e019761.

  15. Ambikile JS, Leshabari S, Ohnishi M. Curricular Limitations and Recommendations for Training Health Care Providers to Respond to Intimate Partner Violence: An Integrative Literature Review. Trauma Violence Abuse. 2021:1524838021995951. https://doi.org/10.1177/1524838021995951. Epub ahead of print. PMID: 33622184.

  16. Davidson LL, Grisso JA, Garcia-Moreno C, Garcia J, King VJ, Marchant S. Training programs for healthcare professionals in domestic violence. J Womens Health Gend Based Med. 2001;10(10):953–69.

  17. Sawyer S, Coles J, Williams A, Williams B. A systematic review of intimate partner violence educational interventions delivered to allied health care practitioners. Med Educ. 2016;50(11):1107–21.

  18. Turner W, Hester M, Broad J, Szilassy E, Feder G, Drinkwater J, et al. Interventions to improve the response of professionals to children exposed to domestic violence and abuse: a systematic review. Child Abuse Rev. 2017;26(1):19–39.

  19. Zaher E, Keogh K, Ratnapalan S. Effect of domestic violence training: systematic review of randomized controlled trials. Can Fam Physician. 2014;60(7):618–24 e340-7.

  20. Divakar U, Nazeha N, Posadzki P, Jarbrink K, Bajpai R, Ho AHY, et al. Digital education of health professionals on the management of domestic violence: systematic review and meta-analysis by the Digital Health Education Collaboration. J Med Internet Res. 2019;21(5):e13868.

  21. MacMillan HL, Wathen CN, Jamieson E, Boyle MH, Shannon HS, Ford-Gilboe M, et al. Screening for intimate partner violence in health care settings: a randomized trial. JAMA. 2009;302(5):493–501.

  22. McLennan JD, MacMillan HL, Afifi TO, McTavish J, Gonzalez A, Waddell C. Routine ACEs screening is NOT recommended. Paediatr Child Health. 2019;24(4):272–3.

  23. Hester M. The three planet model - towards an understanding of contradictions in approaches to women and children's safety in contexts of domestic violence. Br J Soc Work. 2011;41:837–53.

  24. García-Moreno C, Pallitto C, Devries K, Stöckl H, Watts C, Abrahams N. Global and regional estimates of violence against women: prevalence and health effects of intimate partner violence and non-partner sexual violence. Geneva: World Health Organization; 2013.

  25. Wathen CN, MacMillan H. Children's exposure to intimate partner violence: impacts and interventions. Paediatr Child Health. 2013;18(8):419–22.

  26. Vu NL, Jouriles EN, McDonald R, Rosenfield D. Children’s exposure to intimate partner violence: a meta-analysis of longitudinal associations with child adjustment problems. Clin Psychol Rev. 2016;46:25–33.

  27. Evans SE, Davies C, DiLillo D. Exposure to domestic violence: a meta-analysis of child and adolescent outcomes. Aggress Violent Behav. 2008;13(2):131–40.

  28. Wood SL, Sommers MS. Consequences of intimate partner violence on child witnesses: a systematic review of the literature. J Child Adolesc Psychiatr Nurs. 2011;24(4):223–36.

  29. Fry D, Fang X, Elliott S, Casey T, Zheng X, Li J, et al. The relationships between violence in childhood and educational outcomes: a global systematic review and meta-analysis. Child Abuse Negl. 2018;75:6–28.

  30. Pelletier HL, Knox M. Incorporating child maltreatment training into medical school curricula. J Child Adolesc Trauma. 2017;10(3):267–74.

  31. Markenson D, Tunik M, Cooper A, Olson L, Cook L, Matza-Haughton H, et al. A national assessment of knowledge, attitudes, and confidence of prehospital providers in the assessment and management of child maltreatment. Pediatrics. 2007;119(1):e103–8.

  32. Tufford L, Bogo M, Katz E, Lee B, Ramjattan R. Reporting suspected child maltreatment: educating social work students in decision making and maintaining the relationship. J Soc Work Educ. 2019;55(3):579–95.

  33. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 3rd ed. Los Angeles: SAGE; 2017.

  34. Fixsen DL, Blase KA, Timbers GD, Wolf MM. In search of program implementation: 792 replications of the Teaching-Family Model. In: Bernfeld GA, Farrington DP, Leschied AW, editors. Offender Rehabilitation in Practice. England: John Wiley & Sons, Ltd.; 2001.

  35. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. Tampa: University of South Florida; 2005. Report No.: FMHI Publication #231

  36. Metz A, Bartley L. Active Implementation Frameworks for program success: How to use implementation science to improve outcomes for children. Zero to Three. 2012 (March 2012):11-8.

  37. Metz A, Bartley L, Ball H, Wilson D, Naoom S, Redmond P. Active implementation frameworks for successful service delivery: Catawba County child wellbeing project. Res Soc Work Pract. 2015;25(4):415–22.

  38. Nilsen P. Overview of theories, models, and frameworks in implementation science. In: Nilsen P, Birken SA, editors. Handbook on implementation science. United Kingdom: Edward Elgar Publishing; 2020.

  39. Fixsen DL, Blase KA, Metz A, Van Dyke M. Statewide implementation of evidence-based programs. Except Child. 2013;79(3):213–30.

  40. Barac R, Kimber M, Johnson S, Barwick M. The effectiveness of consultation for clinicians learning to deliver motivational interviewing with fidelity. J Evid Inf Soc Work. 2018;15(5):510–33.

  41. Barwick M, Barac R, Kimber M, Akrong L, Johnson SN, Cunningham CE, et al. Advancing implementation frameworks with a mixed methods case study in child behavioral health. Transl Behav Med. 2020;10(3):685-704. https://doi.org/10.1093/tbm/ibz005.

  42. Couturier J, Kimber M, Barwick M, Woodford T, McVey G, Findlay S, et al. Themes arising during implementation consultation with teams applying family-based treatment: a qualitative study. J Eat Disord. 2018;6:32.

  43. Couturier J, Lock J, Kimber M, McVey G, Barwick M, Niccols A, et al. Themes arising in clinical consultation for therapists implementing family-based treatment for adolescents with anorexia nervosa: a qualitative study. J Eat Disord. 2017;5:28.

  44. Kimber M, Barac R, Barwick M. Monitoring fidelity to an evidence-based treatment: practitioner perspectives. Clin Soc Work J. 2017;47(2):207–21.

  45. Kimber M, Barwick M, Fearing G. Becoming an evidence-based service provider: staff perceptions and experiences of organizational change. J Behav Health Serv Res. 2012;39(3):314–32.

  46. Velonis AJ, O'Campo P, Rodrigues JJ, Buhariwala P. Using implementation science to build intimate partner violence screening and referral capacity in a fracture clinic. J Eval Clin Pract. 2019;25(3):381–9.

  47. Saldana L, Chamberlain P, Wang W, Hendricks BC. Predicting program start-up using the stages of implementation measure. Admin Pol Ment Health. 2012;39(6):419–25.

  48. Romney S, Israel N, Zlatevski D. Exploration stage implementation variation: its effect on the cost-effectiveness of an evidence-based parenting program. Z Psychol. 2014;222:37–48.

  49. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Thousand Oaks: SAGE Publications; 2011.

  50. Stewart DE, MacMillan H, Kimber M. Recognizing and responding to intimate partner violence: an update. Can J Psychiatry. 2021;66(1):71–106.

  51. Afifi TO, MacMillan HL, Boyle M, Taillieu T, Cheung K, Sareen J. Child abuse and mental disorders in Canada. CMAJ. 2014;186(9):E324–32.

  52. Statistics Canada. Family Violence in Canada: A Statistical Profile, 2019. Ottawa: Statistics Canada; 2021. Available from: https://www150.statcan.gc.ca/n1/daily-quotidien/210302/dq210302d-eng.htm.

  53. Canadian Institute for Health Information. Canada’s health care providers: provincial profiles (2007-2016 Data Tables). Ottawa: Canadian Institute for Health Information; 2017.

  54. MacLaurin B, Trocme N, Fallon B, Sinha V, Freehan R, Enns R, et al. Alberta incidence study of reported child abuse and neglect (2008): major findings. Calgary: University of Calgary, Faculty of Social Work; 2013.

  55. Clement ME, Chamberland C, Bouchard C. Prevalence, co-occurrence and decennial trends of family violence toward children in the general population. Can J Public Health. 2016;106(7 Suppl 2):eS31–7.

  56. Afifi TO, MacMillan HL, Taillieu T, Cheung K, Turner S, Tonmyr L, et al. Relationship between child abuse exposure and reported contact with child protection organizations: results from the Canadian Community Health Survey. Child Abuse Negl. 2015;46:198–206.

  57. Afifi TO, McTavish J, Turner S, MacMillan HL, Wathen CN. The relationship between child protection contact and mental health outcomes among Canadian adults with a child abuse history. Child Abuse Negl. 2018;79:22–30.

  58. Burczycka M. Section 2: Police-Reported Intimate Partner Violence in Canada, 2017. In: Family Violence in Canada: A Statistical Profile [Internet]. Ottawa: Statistics Canada; 2018. Available from: https://www150.statcan.gc.ca/n1/pub/85-002-x/2018001/article/54978/02-eng.htm.

  59. Statistics Canada. English-French Bilingualism Reaches New Heights. In: Census in Brief [Internet]. Ottawa: Statistics Canada; 2017. Available from: https://www12.statcan.gc.ca/census-recensement/2016/as-sa/98-200-x/2016009/98-200-x2016009-eng.cfm.

  60. Canadian Institute for Health Information. Canada’s health care providers, 2016 to 2020 - data tables. Ottawa: CIHI; 2022.

  61. Leech NL, Onwuegbuzie AJ. A typology of mixed methods research designs. Qual Quant. 2009;43(2):265–75.

  62. Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: a systematic review. Res Nurs Health. 2017;40(1):23–42.

  63. Neergaard MA, Olesen F, Andersen RS, Sondergaard J. Qualitative description - the poor cousin of health research? BMC Med Res Methodol. 2009;9:52.

  64. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Admin Pol Ment Health. 2015;42(5):533–44.

  65. Patton MQ. Qualitative research and evaluation methods. 4th ed. Thousand Oaks: SAGE Publications; 2015.

  66. Rubin HJ, Rubin IS. Qualitative interviewing: the art of hearing data. Los Angeles, CA: SAGE; 2012.

  67. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

  68. Crowe M, Inder M, Porter R. Conducting qualitative research in mental health: thematic and content analyses. Aust N Z J Psychiatry. 2015;49(7):616–23.

  69. Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: implications for conducting a qualitative descriptive study. Nurs Health Sci. 2013;15(3):398–405.

  70. Olsen J, Christensen K, Murray J, Ekbom A. An introduction to epidemiology for health professionals. New York: Springer; 2010.

  71. Goldman GD. Initial validation of a brief individual readiness for change scale (BIRCS) for use with addiction program staff practitioners. J Soc Work Pract Addict. 2009;9(2):184–203.

  72. Connor PD, Nouer SS, Mackey ST, Tipton NG, Lloyd AK. Psychometric properties of an intimate partner violence tool for health care students. J Interpers Violence. 2011;26(5):1012–35.

  73. Short LM, Alpert E, Harris JM Jr, Surprenant ZJ. A tool for measuring physician readiness to manage intimate partner violence. Am J Prev Med. 2006;30(2):173–80.

  74. Foster RH, Olson-Dorff D, Reiland HM, Budzak-Garza A. Commitment, confidence, and concerns: assessing health care professionals’ child maltreatment reporting attitudes. Child Abuse Negl. 2017;67:54–63.

  75. Singh S, Knox M, Pelletier H. Exploratory factor analysis and psychometric evaluation of the healthcare provider attitudes toward child maltreatment reporting scale. Children’s Health Care. 2017;46(4):356–65.

  76. Hamzehgardeshi Z, Shahhosseini Z. Psychometric properties of an instrument to measure facilitators and barriers to nurses’ participation in continuing education programs. Global J Health Sci. 2014;6(5):219–25.

  77. Akin B, Strolin-Goltzman J, Collins-Camargo C. Successes and challenges in developing trauma-informed child welfare systems: a real-world case study of exploration and initial implementation. Child Youth Serv Rev. 2017;82:42–52.

  78. Collin-Vézina D, McNamee S, Brazeau C, Laurier C. Initial implementation of ARC Framework in juvenile justice settings. J Aggress Maltreat Trauma. 2019;28(5):631–54.

  79. Asarbakhsh M, Sandars J. E-learning: the essential usability perspective. Clin Teach. 2013;10(1):47–50.

  80. Sandars J. The importance of usability testing to allow e-learning to reach its potential for medical education. Educ Prim Care. 2010;21(1):6–8.

  81. Sandars J, Lafferty N. Twelve tips on usability testing to develop effective e-learning in medical education. Med Teach. 2010;32(12):956–60.

  82. Sandars J, Brown J, Walsh K. Producing useful evaluations in medical education. Educ Prim Care. 2017;28(3):137–40.

  83. Abran A, Khelifi A, Suryn A, Seffah A. Usability meanings and interpretations in ISO standards. Softw Qual J. 2003;11(4):325–38.

  84. Adams NE. Bloom’s taxonomy of cognitive learning objectives. J Med Libr Assoc. 2015;103(3):152–3.

  85. Armson H, Elmslie T, Roder S, Wakefield J. Is the cognitive complexity of commitment-to-change statements associated with change in clinical practice? An application of Bloom’s taxonomy. J Contin Educ Health Prof. 2015;35(3):166–75.

  86. Shannon S. Educational objectives for CME programmes. Lancet. 2003;361(9365):1308.

  87. Su WM, Osisek PJ. The revised Bloom’s taxonomy: implications for educating nurses. J Contin Educ Nurs. 2011;42(7):321–7.

  88. Anderson LW, Krathwohl DR, Bloom BS. A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. Pennsylvania: Longman; 2001.

  89. Bastien CJM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform. 2010;79(4):e18–23.

  90. Faulkner L. Beyond the five-user assumption: benefits of increased sample sizes in usability testing. Behav Res Methods Instrum Comput. 2003;35(3):379–83.

  91. Kimber M, McTavish JR, Luo C, Couturier J, Dimitropoulos G, MacMillan H. Mandatory reporting of child maltreatment when delivering family-based treatment for eating disorders: a framework analysis of practitioner experiences. Child Abuse Negl. 2019;88:118–28.

  92. Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:117.

  93. Hornbaek K. Current practice in measuring usability: challenges to usability studies and research. Int J Hum Comput Stud. 2006;64:79–102.

  94. Hvannberg ET, Law LC. Classification of usability problems (CUP) scheme. In: Proceedings of INTERACT; 2006. p. 655–62.

  95. Millsap RE, Maydeu-Olivares A, editors. The SAGE handbook of quantitative methods in psychology. Thousand Oaks: SAGE Publications; 2009.

  96. Gonsalvez CJ, Terry J, Deane FP. Using standardised vignettes to assess practicum competencies in psychology and other disciplines. Australia: Australian Government, Department of Education and Training; 2016.

  97. Peabody JW, Luck J, Glassman P, Dresselhaus TR, Lee M. Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA. 2000;283(13):1715–22.

  98. Sowden GL, Vestal HS, Stoklosa JB, Valcourt SC, Peabody JW, Keary CJ, et al. Clinical case vignettes: a promising tool to assess competence in the management of agitation. Acad Psychiatry. 2017;41(3):364–8.

  99. Vogus TJ, Cull MJ, Hengelbrok NE, Modell SJ, Epstein RA. Assessing safety culture in child welfare: evidence from Tennessee. Child Youth Serv Rev. 2016;65:94–103.

  100. Vogus TJ, Sutcliffe KM. The Safety Organizing Scale: development and validation of a behavioral measure of safety culture in hospital nursing units. Med Care. 2007;45(1):46–54.

  101. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud. 2009;4(3):114–23.

  102. Brooke J. SUS: a retrospective. J Usability Stud. 2013;8(2):29–40.

  103. Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Ann Fam Med. 2015;13(6):554–61.

  104. Verma JP. Repeated measures design for empirical researchers. Hoboken: Wiley; 2016.

  105. Lancaster GA, Dodd S, Williamson PR. Design and analysis of pilot studies: recommendations for good practice. J Eval Clin Pract. 2004;10(2):307–12.

  106. Whitehead AL, Julious SA, Cooper CL, Campbell MJ. Estimating the sample size for a pilot randomised trial to minimise the overall trial sample size for the external pilot and main trial for a continuous outcome variable. Stat Methods Med Res. 2016;25(3):1057–73.

  107. Billingham SA, Whitehead AL, Julious SA. An audit of sample sizes for pilot and feasibility trials being undertaken in the United Kingdom registered in the United Kingdom Clinical Research Network database. BMC Med Res Methodol. 2013;13:104.

  108. Morse JM. Qualitative nursing research: a contemporary dialogue. 2nd ed. Newbury Park: SAGE Publications; 1991.

  109. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.

  110. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  111. Colombini M, Alkaiyat A, Shaheen A, Garcia Moreno C, Feder G, Bacchus L. Exploring health system readiness for adopting interventions to address intimate partner violence: a case study from the occupied Palestinian Territory. Health Policy Plan. 2020;35(3):245–56.

  112. Albers B, Metz A, Burke K, Bührmann L, Bartley L, Driessen P, et al. Implementation support skills: findings from a systematic integrative review. Res Soc Work Pract. 2020;31(2):147–70.

  113. O'Cathain A, Croot L, Duncan E, Rousseau N, Sworn K, Turner KM, et al. Guidance on how to develop complex interventions to improve health and healthcare. BMJ Open. 2019;9(8):e029954.

  114. Happ MB, Dabbs AD, Tate J, Hricik A, Erlen J. Exemplars of mixed methods data combination and analysis. Nurs Res. 2006;55(2 Suppl):S43–9.

  115. Krefting L. Rigor in qualitative research: the assessment of trustworthiness. Am J Occup Ther. 1991;45(3):214–22.

  116. O'Cathain A, Murphy E, Nicholl J. The quality of mixed methods studies in health services research. J Health Serv Res Policy. 2008;13(2):92–8.

  117. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI): explanation and elaboration document. BMJ Open. 2017;7(4):e013318.

  118. Eldridge SM, Chan CL, Campbell MJ, Bond CM, Hopewell S, Thabane L, et al. CONSORT 2010 statement: extension to randomised pilot and feasibility trials. BMJ. 2016;355:i5239.

  119. Lancaster GA, Thabane L. Guidelines for reporting non-randomised pilot and feasibility studies. Pilot Feasibility Stud. 2019;5:114.

  120. Mann CJ. Observational research methods - research design II: cohort, cross sectional, and case-control studies. Emerg Med J. 2003;20(1):54–60.

  121. Law M, Stewart DE, Pollock N, Letts L, Bosch J, Westmorland M. Guidelines for critical review form - quantitative studies. Hamilton: McMaster University; 1998.

  122. Thomas DC, Berry A, Djuricich AM, Kitto S, Kreutzer KO, Van Hoof TJ, et al. What is implementation science and what forces are driving a change in medical education? Am J Med Qual. 2017;32(4):438–44.

  123. Price DW, Wagner DP, Krane NK, Rougas SC, Lowitt NR, Offodile RS, et al. What are the implications of implementation science for medical education? Med Educ Online. 2015;20:27003.

  124. Carney PA, Crites GE, Miller KH, Haight M, Stefanidis D, Cichoskikelly E, et al. Building and executing a research agenda toward conducting implementation science in medical education. Med Educ Online. 2016;21:32405.

  125. Childs S, Blenkinsopp E, Hall A, Walton G. Effective e-learning for health professionals and students--barriers and their solutions. A systematic review of the literature--findings from the HeXL project. Health Inf Libr J. 2005;22(Suppl 2):20–32.

  126. Curran VR, Fleet L, Kirby F. Factors influencing rural health care professionals’ access to continuing professional education. Aust J Rural Health. 2006;14(2):51–5.

  127. Mansouri M, Lockyer J. A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007;27(1):6–15.

  128. Marinopoulos SS, Dorman T, Ratanawongsa N, Wilson LM, Ashar BH, Magaziner JL, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007;149:1–69.

Acknowledgements

The authors acknowledge the permission from McMaster University and Dr. Harriet MacMillan to implement and evaluate VEGA in phase 2 and phase 3 of the RISE Project. We would also like to thank the “champions” of our Collaborating Organizations (https://riseproject.mcmaster.ca/collaborators) and the RISE Project research staff for their support of project activities, including Ilana Allice, Alice Cavanagh, Anita Acai, Ash Lowenthal, Anaïs Cadieux Vanvliet, Manya Singh, Rosemary Perry, Savanah Smith, Ayda Ferdossifard, Ruby Mann, Shipra Taneja, and Kinza Syed.

Funding

The RISE Project is funded by the Public Health Agency of Canada (PHAC; Agence de la santé publique du Canada; Agreement Number/Numéro d’entente: 1920-HQ-000088). The PHAC did not have any role in the development of the RISE Project protocol. The views expressed in this paper do not necessarily represent the views of the Public Health Agency of Canada.

Author information

Authors and Affiliations

Authors

Contributions

MK, MV, GD, DCV, and DS were all involved in the development of this protocol. MK drafted this manuscript. All authors were involved in the critical revision of the paper for intellectual content and its final approval before submission. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Melissa Kimber.

Ethics declarations

Ethics approval and consent to participate

The RISE Project has been approved for human research by the Hamilton Integrated Research Ethics Board (Project #: 11295), the University of Calgary Conjoint Research Ethics Board (Project #: REB20-0338), and McGill University’s Research Ethics Board (Project #: 20-06-038).

Consent for publication

Not applicable

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

 Standards for Reporting Implementation Studies: Completed StaRI Checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Kimber, M., Vanstone, M., Dimitropoulos, G. et al. Researching the Impact of Service provider Education (RISE) Project — a multiphase mixed methods protocol to evaluate implementation acceptability and feasibility. Pilot Feasibility Stud 8, 135 (2022). https://doi.org/10.1186/s40814-022-01096-y

Keywords