
King County School-based SBIRT Program Evaluation Plan

Seattle Children’s Research Institute


Kate Katzman, MPH, Maria Stepanchak, MPH, Cari McCarty, PhD

Executive Summary
Public Health – Seattle & King County, along with Reclaiming Futures, has adapted a
Screening, Brief Intervention, and Referral to Treatment (SBIRT) model to promote health and
well-being among middle school students. The school-based SBIRT program (SBIRT-SB) consists
of: Screening for substance use, mental health issues, and strengths; a Brief Intervention
based on motivational interviewing that involves semi-structured 15-20 minute sessions with
both the youth alone and together with their caregiver; and Referral To assessment and/or
other community-based services and supports, including counseling, mentoring, and youth
leadership opportunities. This program is being implemented across 48 middle schools in King
County.

Seattle Children’s Research Institute will conduct a process evaluation of SBIRT-SB over the
next 15 months. As part of the process evaluation, we aim to understand whether this is the
most appropriate model of care for youth in middle school settings, whether the program was
implemented as intended, and how it can be improved. To help answer these questions, we
will collect qualitative data across stakeholders and quantitative indicators of program
implementation, satisfaction, and success. Descriptive analyses will be used to understand
whether additional modifications are needed to the intervention model. Reporting of results
will be consistent with data suppression standards mandated by FERPA and aggregate data will
be periodically shared back with key stakeholders and participants. This evaluation is limited
by several factors, including the sample of schools participating in the program, the timing of
implementation, and the fact that many participating schools are implementing multiple
programs to improve student health and well-being, making it challenging to assess the impact
and implementation of SBIRT-SB alone.

Seattle Children’s Research Institute and Public Health – Seattle & King County have also
submitted a grant proposal to the Conrad N. Hilton Foundation to fund an outcome evaluation
of SBIRT-SB from April 2019 to December 2021. The outcome evaluation will focus on changes
in key outcomes, including youth intention to use substances, mean number of days of
substance use over the previous month, and youth perceptions of connection to adults at
school. A power calculation was used to identify required sample sizes to assess change and
we have outlined a plan for data collection. Indicators and performance measures for the
outcome evaluation are in development and will be revised and refined based on program
staff, interventionist, and school administration feedback over the next year. The outcome
evaluation will be limited by factors similar to those of the process evaluation, including limited
generalizability of findings outside of the participating schools, the lack of a control group,
and the potential for differences in implementation strategies due to the fact that this
program is being implemented in a real-world setting.

Table of Contents

Page 1 of 8
SBIRT-SB Program Overview .......... 3

Process Evaluation
  Overview .......... 3
  Analytic Plan .......... 4
  Potential Limitations .......... 4
  Communication Plan .......... 5

Outcomes Evaluation
  Overview .......... 5
  Main Outcomes Analytic Plan .......... 6
  Potential Limitations .......... 7
  Communication Plan .......... 8


SBIRT-SB Program Overview
Screening, Brief Intervention, and Referral to Treatment (SBIRT) is a public health approach to
identifying and addressing substance use and related risks. The King County School-based
SBIRT intervention model (also known as SBIRT-SB) has been adapted from Conrad N. Hilton’s
Reclaiming Futures Project in collaboration with Evan Elkin, and is novel in its application to
middle school students. This novel implementation of SBIRT consists of: Screening for
substance use, mental health issues, and strengths; a Brief Intervention based on motivational
interviewing that involves semi-structured 15-20 minute sessions with both the youth alone
and together with their caregiver; and Referral To assessment and/or other community-based
services and supports, including counseling, mentoring, and youth leadership opportunities. As
part of this model, individual school districts and middle schools are empowered to determine
what screening approach (indicated or universal), staffing, and implementation strategies will
work best for their setting. Schools are responsible for maintaining and expanding their
referral network and relationships with community-based organizations to ensure student
needs are met. Screening is conducted using a version of the Check Yourself tool designed for
middle school settings. Check Yourself is a multi-risk electronic screening tool, developed by
Drs. Cari McCarty and Laura Richardson from Seattle Children’s together with Tickit Health,
that provides students with personalized feedback based on their responses. Based on their
screening results, students are prioritized into one of three tiers defined by the risk factors
they endorse. SBIRT interventionists at the school conduct brief intervention (15-20 minute)
sessions with youth using motivational interviewing strategies to assess strengths, facilitate
goal setting, provide referrals, and follow-up as needed. If appropriate, SBIRT interventionists
will engage with caregivers/trusted adults to participate in brief intervention sessions with
the youth. SBIRT interventionists will also conduct group sessions with caregivers/trusted
adults outside of school hours. These sessions will offer caregivers an opportunity to reinforce
and strengthen their communication skills as well as their ability to relate to the youth they
support.

SBIRT-SB Process Evaluation


Year 1: Sept. 2018 – Dec. 2019

Process Evaluation Overview

In the next 15 months, we will conduct a process evaluation of the King County School-based
SBIRT Program to address the following three questions:

1. Is this the most appropriate model of care for youth in middle school settings?
(This includes understanding the youth experience with the Check Yourself tool,
assessing the usefulness and appropriateness of brief intervention (BI) sessions, and
assessing any changes in attitudes or intentions to use substances.)
2. Was the program implemented as intended? Why or why not?
3. How well did we do and how can the program be improved?

The process evaluation will include the collection of both quantitative indicators of program
implementation, satisfaction, and success as well as qualitative data across stakeholders
(youth, caregivers, interventionists, school administrators, and King County staff). For this
evaluation we will not receive or collect any student names or contact information.

Process Evaluation Analytic Plan

Analyses will be descriptive in nature in order to help inform whether additional modifications
are needed to the intervention model. Quantitative data will be evaluated using descriptive
statistical analyses. All results will be reported at the district-level only, unless school-level
reporting is required and does not compromise data security or confidentiality. Qualitative
data (including all notes and transcripts of interviews and focus groups) will be analyzed using
thematic analyses to contextualize quantitative results and generate ideas for improvement
to the intervention.

When reporting results of the process evaluation, we will apply Family Educational Rights and
Privacy Act (FERPA) mandated data suppression standards, which require that all cell sizes <10
be suppressed to ensure that a member of the community cannot reasonably identify any
individual student. We will report these cells as “<10” or apply other blurring techniques as
needed.
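As a concrete illustration, the suppression rule can be expressed in a few lines of Python; the district names and counts below are hypothetical, not actual program data.

```python
def suppress(count, threshold=10):
    """Return a display value for a cell count, reporting cells with
    fewer than `threshold` students as "<10" per the FERPA-mandated
    suppression standard described above."""
    return str(count) if count >= threshold else f"<{threshold}"

# Hypothetical district-level counts of students screened
counts = {"District A": 142, "District B": 7, "District C": 23}
report = {district: suppress(n) for district, n in counts.items()}
# "District B", with a cell size of 7, is reported as "<10"
```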

Process Evaluation Potential Limitations

We identified some potential limitations to the process evaluation including (but not limited
to):
• Schools implementing SBIRT-SB applied for funding to a Request for Proposals from
Public Health – Seattle & King County. Intervention middle schools may differ from
other middle schools in the county who did not apply for funding depending on existing
needs for services, ability to take on a new project, and other influencing factors at a
school or district level.
• It will be difficult to tease out if trends in the data are specifically due to the SBIRT-SB
Program and not due to other factors or services in place at participating schools. We
will use the qualitative interviews to understand the role of SBIRT-SB in the context of
other school resources. Findings from the small qualitative sample may not be
generalizable to all youth and caregivers in King County. We will minimize this by
recruiting a diverse sample of youth and caregivers (across gender, age, race, and
ethnicity) from multiple schools. We will use the themes from the
qualitative methods to generate ideas for improvement in the intervention rather than
provide conclusive results.
• Due to timing and the way this intervention is structured, we will need to select our
sample for this evaluation based on school readiness and implementation of the
intervention as intended. We will maximize diversity in our sample by selecting schools
from different geographic areas; however, sampling will be constrained by school
implementation timelines and progress.

Process Evaluation Communication Plan

As part of the rapid-cycle evaluation, we recognize the need to share process evaluation
results with the program team on a regular basis. When available, data will be
presented over time by district to identify trends and give the program team information to
make modifications to the intervention as needed. Progress reports (due 1/15/19, 4/15/19
and 7/15/19) will include a written summary of the findings as well as any recommendations
for changes to the intervention. Our final report (due 12/31/19) will include a thorough
description of the data, analyses, and recommendations for modifications to the intervention
for subsequent years. As requested, we will provide King County staff with data and
presentation materials to assist with further discussion and dissemination of the findings. The
table below details the audiences, formats, and frequency with which information will be
shared throughout the process evaluation.

Table 1. Process Evaluation communication plan and audiences.


Output from SCRI: Deliverables (Workplan + timeline; Performance Measurement Plan;
Progress Reports; Final Evaluation Report)
Audience: King County Public Health, to be distributed by them to other audiences as needed
Format: In accordance with the BSK Deliverable Templates & Evaluation Plan Guidance
document
Frequency: As described in the Workplan + timeline (working document)

Output from SCRI: Narrative Report Summary
Audience: SBIRT-SB program staff
Format: Report of aggregate data reported by district
Frequency: February 2019 and August 2019 (1 month after grantee due date)

Output from SCRI: Narrative Report Slide deck
Audience: SBIRT-SB Evaluation Workgroup
Format: Slides of aggregate data overall
Frequency: February 2019 and August 2019

Output from SCRI: REDCap Data Snapshot
Audience: SBIRT-SB Evaluation Workgroup
Format: 3-5 slides of aggregate data in REDCap overall
Frequency: Monthly, March to June 2019

Output from SCRI: Summary of youth and caregiver quantitative data collection
Audience: SBIRT-SB Evaluation Workgroup; KII school staff sample; KII SBIRT-SB program
staff sample
Format: Data placemat of aggregate data by district
Frequency: June-July 2019

SBIRT-SB Outcomes Evaluation

Years 2&3: April 2019 – September 2021

Outcomes Evaluation Overview

We have outlined a plan to conduct an outcomes evaluation of the King County School-based
SBIRT Program, addressing the following questions:

1. What is the impact of SBIRT on intermediate youth outcomes, including preventing or
delaying the onset of substance use and improving protective factors?

2. To what extent and in what ways does SBIRT lead to a coordinated and integrated
system of emotional/behavioral support in middle schools?

3. How does SBIRT impact youth academic performance, discipline, and attendance?

Together with Public Health – Seattle & King County we submitted a grant proposal to the
Conrad N. Hilton Foundation to fund some of this work from April 2019 to December 2021.
Indicators and performance measures for the outcomes evaluation are in development and
will be revised and refined based on program staff, interventionist, and school administration
feedback over the next year.

The outcomes evaluation will include the collection of survey data and rigorous statistical
analyses to examine change over time in adolescent self-report of substance use, feeling
supported, and school connectedness. Public Health – Seattle & King County will contribute to
the analyses by collecting and analyzing additional available data sources including the
Healthy Youth Survey, school climate surveys, and Office of Superintendent of Public
Instruction (OSPI) performance data, and/or other academic data requested from schools.

FERPA-mandated data suppression standards will be applied when reporting the results of the
outcome evaluation, following the process described above for reporting the results of the
process evaluation. Qualitative reflection events with youth, interventionists, and school
district administration will be used to understand why expected changes were or were not
observed and what the results imply.

Main Outcomes Analytic Plan


Outcome 1: Change in future intention to use substances

We will ask participants about their future intentions to use substances (likely/maybe/
unlikely) at three time points: baseline, after their final SBIRT meeting, and two months after
baseline assessment. We will use mixed-effect logistic regression to model the odds of
participants reporting future intentions to use substances and allow random intercepts in the
model to account for both within-participant and within-school correlations. We will include a
fixed effect for time point in order to assess if the odds of reporting future intentions to use
substances decreased immediately after participating in the SBIRT and/or two months later,
compared to baseline. We will also adjust the model for relevant confounders identified a
priori.

Power: Prior research has estimated that the intra-class correlation (ICC) among students in
the same school is 20%.1 Assuming we will observe a similar ICC, we will have 99% power to
detect an effect size as small as 0.2 when comparing the proportion of participants
reporting intentions to use substances in the future at each of the post-SBIRT time points with
baseline if we enroll 800 students from 15 schools.

Outcome 2: Change in mean number of days of substance use in the past month

We will ask participants how many days in the past month they used substances at both
baseline and two months after participation in SBIRT. We will use linear mixed-effect
regression to model the change in the mean number of days participants reported using
substances prior to participating in the SBIRT intervention and two months after participation.
The models will include a fixed effect for time point to evaluate the change in mean number
of days of substance use reported by participants two months after they participated in the
intervention compared to baseline, and random intercepts to account for within-participant
and within-school correlations. We will also adjust the model for relevant confounders
identified a priori.
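A linear mixed model of this form could be sketched in Python with statsmodels as below; the data are simulated and the variable names (`days`, `time`, `school`, `student`) are illustrative assumptions, not the evaluation's actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Simulate hypothetical data: days of substance use in the past month
# at baseline (time=0) and two months post-intervention (time=1), with
# school- and student-level random variation around a mean decrease.
rows = []
for school in range(15):
    school_effect = rng.normal(0, 1)
    for student in range(40):
        student_effect = rng.normal(0, 1.5)
        for t in (0, 1):
            days = 5 - 1.5 * t + school_effect + student_effect + rng.normal(0, 2)
            rows.append({"school": school, "student": f"s{school}-{student}",
                         "time": t, "days": days})
df = pd.DataFrame(rows)

# Linear mixed model: fixed effect for time point, a random intercept
# for each school (groups), and a variance component for students
# nested within schools.
model = smf.mixedlm("days ~ time", df, groups="school",
                    vc_formula={"student": "0 + C(student)"})
result = model.fit()
# result.fe_params["time"] estimates the mean change in days of use
# between baseline and the two-month follow-up.
```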

Power: Assuming an ICC of 20% among students in each school, we will have 98% power to
detect a medium effect size for the reduction in the mean number of days of substance use
reported in the past month between baseline and two months after participation in the
SBIRT intervention, if we enroll 800 students from 15 schools. A medium effect size refers to a
difference that is half of the standard deviation for the outcome.2
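For intuition, the clustering adjustment behind a calculation like this can be sketched with a design effect. The version below assumes equal cluster sizes and uses a normal approximation, so it illustrates the approach rather than reproducing the original power calculation.

```python
from math import sqrt
from statistics import NormalDist

def cluster_adjusted_power(effect_size, n_students, n_schools, icc, alpha=0.05):
    """Approximate power for a pre/post comparison after deflating the
    sample size by the design effect for clustering within schools.
    Assumes equal cluster sizes and uses a normal approximation."""
    m = n_students / n_schools                 # average students per school
    deff = 1 + (m - 1) * icc                   # design effect for clustering
    n_eff = n_students / deff                  # effective sample size
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(effect_size * sqrt(n_eff) - z_crit)

# 800 students from 15 schools, ICC = 0.20, medium effect (0.5 SD)
power = cluster_adjusted_power(0.5, 800, 15, 0.20)  # roughly 0.99
```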

Outcome 3: Increase in score on school connection scale

We will ask participants at baseline and two months after participation in SBIRT to rate how
connected they feel to adults at their school using the school connectedness scale. We will
calculate the proportion of participants whose ratings on the school connectedness scale were
at least one point higher two months after participating in SBIRT than at baseline. We will use
the binomial test to assess if the proportion of participants with increased ratings on the
school connectedness scale is statistically significant.
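One way to implement this test is with SciPy's `binomtest`; the counts below are hypothetical, and the null proportion of 0.5 (an increase being no more likely than a non-increase) is an assumption made for illustration.

```python
from scipy.stats import binomtest

# Hypothetical counts: of 600 students assessed at both time points,
# 380 rated school connectedness at least one point higher at follow-up.
# One-sided test against a null proportion of 0.5: are increases more
# common than would be expected by chance?
result = binomtest(k=380, n=600, p=0.5, alternative="greater")
significant = result.pvalue < 0.05  # True for these hypothetical counts
```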

Outcomes Evaluation Potential Limitations

The outcomes evaluation limitations include (but are not limited to):
• This is a real-world implementation of SBIRT, not a randomized trial. Thus, the
evaluative design lacks a control group, making it difficult to fully account for trends
over time and other factors in schools that may influence outcomes. The collection of
primary data from non-intervention schools is not feasible within the budgetary
constraints; however, we hope to work with King County Public Health to capitalize on
existing data from other sources for some outcomes, such as attendance and academic
performance.
• Over the two years of data collection, it will be difficult to account for differences in
the way the intervention is implemented over time and by school. To overcome this,
we will gather intervention implementation details via a grantee narrative report
(challenges, successes, etc.) and use reflection events with interventionists and
school administration staff to provide additional context.
• A single follow-up assessment of youth self-reported substance use, mental health,
and other symptoms two months after the intervention is limited in its ability to
demonstrate change over time, due to the short follow-up window and the reliance
on adolescent report of behaviors.
• Results from this outcome evaluation, drawn from a subset of middle schools in King
County, may not be generalizable to middle school populations in other settings,
which may differ by population, availability of resources, political environment, and
other factors.

1 Shackleton, N., Hale, D., Bonell, C., & Viner, R. M. (2016). Intraclass correlation values for
adolescent health outcomes in secondary schools in 21 European countries. SSM - Population
Health, 2, 217-225.
2 Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale,
NJ: Lawrence Erlbaum Associates.

Outcomes Evaluation Communication Plan

As part of communicating the findings and understanding the meaning of the data, we have
included two types of reflection events in the evaluation plan. With focus groups of youth, we
will present the data and gather their perspectives on why we did or did not see changes in
youth-reported substance use behaviors or academic outcome measures. Through group
reflection events with interventionists and school administrators at two time points in the
evaluation, we aim to understand their interpretation of trends in the data and also the
context of why there may be differences at a school and/or district level.

As described in the evaluation plan, we will generally report on output-level data on a
quarterly basis, project objective-level research findings at the mid-point and end of the
project, and strategic objective-level findings at the end of the project. In addition to regular
progress reporting and a final report of all findings, we will provide King County staff with
data and presentation materials as requested. As our evaluation findings develop, we will
discuss with Public Health – Seattle & King County opportunities for presentation at
conferences and publication of results in peer-reviewed journals.

