To evaluate the effects of Project ALERT on adolescents' lifetime and 30-day use of cigarettes, alcohol, marijuana, and inhalants.
Cluster randomized trial.
Schools from 11 states were enrolled in 2 successive cohorts from 2004 to 2008.
All public schools in the United States that included grades 6 through 8 and enrolled at least 100 students in sixth grade were recruited. Of the 40 schools that began the study, 34 (17 per condition) completed it. Data were analyzed from 5883 unique participants.
Project ALERT, a manualized classroom-based substance use prevention curriculum for the middle grades, was taught to sixth and seventh graders.
Students were surveyed before the onset of the intervention, as sixth graders, and after the completion of the 2-year intervention, as seventh graders. Outcome measures included lifetime and 30-day use of cigarettes, alcohol, marijuana, and inhalants.
At baseline, students in the intervention condition were slightly to moderately more likely to report use for each of the 8 measures examined than were students in the control condition. For all measures except lifetime use of cigarettes, these differences were less pronounced at follow-up and therefore were in the direction of favorable program effects. These changes were statistically significant, however, for only 1 outcome measure, past 30-day use of alcohol (reduction in the adjusted odds ratio from 2.07 at baseline to 1.32 at follow-up; P = .006).
Project ALERT was not effective when delivered to the sixth grade population we targeted.
clinicaltrials.gov Identifier: NCT00650585
During the past several decades, the Safe and Drug-Free Schools Program of the US Department of Education has served as the primary sponsor for school-based substance use prevention curricula. A number of such curricula are now recognized as evidence based by at least 1 of several actively maintained, federally sponsored registries (eg, the National Registry of Evidence-Based Programs and Practices1 or Blueprints for Violence Prevention2). As of 2005, 43% of the nation's public middle schools implemented at least 1 evidence-based substance abuse prevention curriculum.3
Although there is a substantial and growing body of research literature documenting the positive effects of school-based substance use prevention programs,4 many have been evaluated only within the context of efficacy trials. Such trials are typically conducted under optimal conditions, with program delivery usually assessed or supervised by the developer. Effectiveness trials, on the other hand, are typically delivered under large-scale, real-world conditions. As might be expected, the results of program evaluations tend to attenuate as they move along a continuum from efficacy to effectiveness.5,6
In this article, we report the results of an effectiveness trial of Project ALERT, a substance use prevention curriculum that, along with LifeSkills Training, was the most prevalent evidence-based program in the nation's public middle schools as of 2005.3 Project ALERT has been listed on various registries as “model”7 and “exemplary,”8 but it is identified as “promising” (as opposed to model) on the most rigorous of these, Blueprints for Violence Prevention.9 This is, in part, because the curriculum has yet to be successfully replicated in multiple sites and to demonstrate sustained effects over time on targeted risk behaviors.2 It is also of concern that 2 of the 3 published evaluations of Project ALERT have been conducted by its developer10; studies of this nature have typically yielded stronger results than those conducted by independent evaluators.11
Because Project ALERT is a school-based program and students are naturally clustered in schools, cluster designs have been the standard for evaluations of such programs. To date, there have been 3 discrete cluster randomized trials of Project ALERT in which core lessons were taught to seventh graders and booster lessons to eighth graders.12-14 In a seminal trial, the developer used outside health educators to deliver the curriculum to seventh and eighth grade students. She found that seventh grade students who experimented with cigarettes were less likely to have smoked in the past month as eighth graders and more likely to have stopped using cigarettes. Project ALERT had little effect on baseline nonusers and a negative effect on committed smokers. For seventh graders who indicated that they had never used cigarettes or marijuana, Project ALERT reduced eighth grade initiation and current use of marijuana. No demonstrable program effect was found at the individual level for eighth graders who were classified as marijuana users at baseline.
Although some modest decrease in alcohol use was found at the end of seventh grade, the results disappeared in eighth grade.12 The investigators followed these students through 12th grade and found that, once the lessons stopped, Project ALERT's effects on substance use also disappeared.15 Since this study, there have been 2 further randomized controlled trials of a revised version of Project ALERT16 that added 3 lessons addressing smoking cessation skills, alternatives to alcohol use, and the consequences of alcohol and inhalant use.14 The first, conducted by the developer, found positive effects after the eighth grade booster lessons on the initiation of cigarette and marijuana use, current cigarette use, and alcohol misuse, whereas no significant effects were found for the initiation of alcohol use, current alcohol use, or current marijuana use.13 The second, conducted by an independent evaluator, used Cooperative Extension agents to teach the curriculum and found no program effects, overall or by students' level of risk.14 In the cluster randomized trial reported herein, we assessed the curriculum's effects on lifetime (ie, initiation or uptake) and 30-day (ie, recent) use of cigarettes, alcohol, marijuana, and inhalants.
All public schools in the United States that housed at least grades 6 through 8, did not currently use an evidence-based substance use prevention program in those grades, and committed themselves to including all of their sixth grade students were eligible for participation in the study. We recruited schools in 2 cohorts spaced 1 year apart (in the 2004-2005 and 2005-2006 academic years) to minimize management burden and allow extended time for recruitment. We used the same selection criteria for both cohorts, with the following exception: we reduced sixth grade enrollment requirements from 200 to 100 students in the second cohort to expand our pool of eligible schools. We recruited from among the 8059 eligible schools indicated by the Common Core of Data.17 All participating schools pledged not to administer any evidence-based curricula to the cohort of students targeted or to students in higher middle school grades.
We selected a school-level cluster randomized design rather than randomization of students or classes within schools to avoid attenuating study effects through contamination. Sample size requirements were determined a priori by examining the number of schools per condition needed to detect relative differences of 33% and 50% between the intervention and control conditions in prevalence rates for the primary outcomes of interest at follow-up. Clustering effects were accommodated by assuming an intraclass correlation of 0.01 after adjustment for covariates, which was higher and thus more conservative than the 0.003 value reported by Ellickson and Bell.12 The assumed average number of students per school was 240, which was expected to provide 144 completed surveys per school at follow-up after allowing for 40% attrition. Estimated prevalence rates for the substance use measures were based on national data for seventh grade students derived from the Pride Questionnaire Report.18 Analyses based on a formula by Murray19 indicated that 17 schools per condition would be sufficient, with 80% power and an α of .05 (2-tailed), to detect a 50% relative difference at follow-up for all outcomes of interest (except marijuana use, for which the power for 30-day marijuana use would be only 70%) and a 33% relative difference for lifetime use of alcohol, cigarettes, and inhalants.
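The logic of the power calculation can be sketched as follows. This is an illustrative approximation using the standard two-proportion sample-size formula inflated by the design effect 1 + (m − 1) × ICC, not the exact Murray formula the authors applied, and the prevalence values below are hypothetical stand-ins for the Pride survey estimates, which are not reported here.

```python
import math
from scipy.stats import norm

def schools_per_arm(p_ctl, p_int, m_per_school, icc, alpha=0.05, power=0.80):
    """Approximate schools needed per condition in a cluster randomized trial
    comparing two proportions, using the design-effect inflation for clustering."""
    z_a = norm.ppf(1 - alpha / 2)          # critical value for 2-tailed alpha
    z_b = norm.ppf(power)                  # critical value for desired power
    var_sum = p_ctl * (1 - p_ctl) + p_int * (1 - p_int)
    # unadjusted individual-level sample size per arm
    n_individuals = (z_a + z_b) ** 2 * var_sum / (p_ctl - p_int) ** 2
    deff = 1 + (m_per_school - 1) * icc    # design effect for clustering
    return math.ceil(n_individuals * deff / m_per_school)
```

With hypothetical prevalences of 12% vs 8% (a 33% relative difference), 144 completed surveys per school, and an ICC of 0.01, this yields 15 schools per arm, in the same range as the 17 the authors derived; the exact figure depends on the outcome-specific prevalence estimates.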
One of us (R.L.F.) randomly assigned schools to the experimental condition, blocked by school district. Assignments were made on a flow basis as soon as a district's schools entered the study. Single schools from different districts were paired and randomly assigned to a condition. Assignment was implemented through the use of computer-generated random numbers.20 Students were unaware of their school's assignment status.
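The pairwise, district-blocked assignment described above can be illustrated with a short sketch. This is a hypothetical reconstruction of the procedure, not the authors' actual code; school identifiers and the seeded random source are assumptions for reproducibility.

```python
import random

def assign_blocked_pairs(schools_by_district, seed=0):
    """Randomly assign schools to conditions in pairs within each district,
    mirroring blocked assignment with computer-generated random numbers."""
    rng = random.Random(seed)
    assignment = {}
    for district, schools in schools_by_district.items():
        # take schools two at a time within the district block
        for i in range(0, len(schools) - 1, 2):
            pair = [schools[i], schools[i + 1]]
            rng.shuffle(pair)  # random order decides condition
            assignment[pair[0]] = "intervention"
            assignment[pair[1]] = "control"
    return assignment
```

Each within-district pair always yields exactly one intervention and one control school, which is what blocking by district guarantees.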
Of the 10 schools enrolled in the first cohort, we dropped 3 schools from the study that were assigned to the intervention group because they failed to include a majority of their sixth graders in the intervention, much less a census as requested. Investigators compared the baseline rates of substance use at the 1 remaining intervention school in that district and selected the most comparable control school to remain in the cohort. Of the 3 residual control schools in the district, 1 elected not to participate further. The remaining 2 were reenrolled in the 2005-2006 academic year in the study's second cohort of 30 schools and then randomly assigned to the intervention or control group. All schools in the second cohort completed the study. The study was completed in 34 schools in 21 school districts from 11 states. Study participants consisted of all sixth graders in recruited schools, with the exception of those in self-contained classrooms, whom we excluded because of concerns about their ability to respond to our surveys in a meaningful fashion.
Schools and teachers received compensation from the funding organization. To assist us with collecting active parental consent forms, all schools were given a choice between an incentive of $1000 to be given to the school or $200 to be given to each classroom of sixth graders. The latter incentive was designed to be split evenly between teachers (in the form of a gift card) and their students (in the form of a pizza party or something similar). The incentive was payable when 90% of the parental consent forms were signed and returned at the school or classroom level, regardless of whether parents provided or withheld consent. To facilitate our school recruitment effort, each school was promised $500 for each year that it participated in the study. Schools assigned to the control group were offered an additional $1000 as well as training and curriculum materials to be used with the cohorts after ours. In addition, teachers implementing the Project ALERT program were given $60 to videotape each lesson, with a bonus of $100 if they taped all 11 core lessons and $30 if they taped all 3 booster lessons. Their schools were allowed to keep the video cameras once the booster lessons were completed. Each of these payments was or will be made by the funding organization.
We hired data collectors in each participating school district who administered surveys before students' exposure to the first Project ALERT lesson, as sixth graders, between January 25 and April 13, 2005, for the first cohort and between October 18, 2005, and March 9, 2006, for the second cohort. The data collectors surveyed students in each cohort a second time at least 30 days after the implementation of the last booster lesson, when they were seventh graders, between April 28 and May 30, 2006, for the first cohort and March 9 and June 5, 2007, for the second cohort. We enclosed each student's assent form and survey in an envelope, on the outside of which was a removable label with the student's name. Data collectors distributed each survey to the appropriate student and instructed the student to remove the name label. At that point, each survey was identified only by a unique code number that had been previously assigned by the research team, which maintained exclusive possession of the link to their names. Students were assured of confidentiality on their assent forms and verbally by their data collectors. Data collectors remained in the front of the classroom during survey administration to ensure students' privacy. Students were instructed to enclose their completed surveys in the envelopes before returning them to the data collectors. All study procedures were approved by the institutional review board of the Pacific Institute for Research and Evaluation.
Project ALERT is a 2-year manualized, classroom-based substance use prevention curriculum, originally designed for seventh and eighth graders and subsequently disseminated to sixth through eighth graders. The program targets cigarette, alcohol, marijuana, and inhalant use and seeks to motivate students not to use substances, to provide the skills to resist inducements from peers to use substances, and to support attitudes and beliefs that mitigate substance use. The curriculum itself consists of 11 lessons the first year, followed by 3 booster lessons the following year. These lessons were designed to fit within 45-minute class periods and are considered most effective when taught once a week.21 Activities include guided class discussions, small group activities, role-playing exercises, and videos.
Principals decided which school staff members would participate in our study. Altogether, 45 teachers and 1 counselor administered the core curriculum to sixth graders in the intervention schools, and 44 teachers and 1 counselor administered the booster lessons the following year to seventh graders. Instructors completed the Project ALERT training program at the study's expense. We suggested that lessons be taught weekly, but because of conflicts some were taught on a different schedule (mean [SD], 1.21 [0.42] lessons/wk). To reach all of the students, multiple teachers taught the program in each school, and many teachers taught multiple classes of sixth graders (mean [SD], 3.2 [1.8]; range, 1-10) and seventh graders (mean [SD], 3.7 [2.3]; range, 1-13). No adverse events or negative side effects from the program were reported. Students in the control group completed the same Drug Abuse Prevention survey as students in the intervention group but did not receive the Project ALERT intervention. All schools were allowed to administer non–evidence-based programs if they so desired.
We used 2 strategies to determine whether participating instructors administered all Project ALERT lessons. First, instructors recorded all sessions of the first class period to which they delivered the curriculum, using the video cameras that we provided. We received a complete set of recordings from all but 4 instructors, confirming that 633 of the 641 lessons (98.8%) scheduled for those classes were taught. Second, instructors completed and returned attendance logs that tracked students' attendance at each of the lessons. We received completed attendance logs for the 8 lessons with missing recordings, giving us confidence that all 641 lessons in those classes were actually taught. In addition, we received completed logs from 82 of 84 instructors (98%) who administered lessons to additional classes. Altogether, we were thus assured that at least 2074 of the total of 2129 lessons (97.4%) were actually taught.
This study sought to evaluate individual level effects of the 2-year Project ALERT curriculum on adolescents' lifetime and recent use of cigarettes, alcohol, marijuana, and inhalants. Students completed an 81-item self-report questionnaire that has been used in previous evaluations of the curriculum.13,14 The main outcomes examined were measures of the 4 substances constituting the study's primary outcomes. Lifetime use questions asked whether the respondent had ever used the substance (yes or no); those concerning recent use asked how many days they had used the substance in the preceding 30 days. Response options for the latter question included none, 1 or 2 days, 3 to 5 days, 6 to 19 days, and 20 or more days in the past month. Response options were dichotomized into none and at least 1 day.
We also examined other variables as covariates. Specifically, we assessed 2 potential mediators of substance use: expectations of future use and offers to use (1 item each), which were specific to cigarettes, alcohol, and marijuana. Data on the percentage of students at each school receiving free and reduced-price lunches and on urbanicity (urban, suburban, or rural) were obtained from the 2004-2005 Common Core of Data.17 To describe our participants, students reported their race and ethnicity by using questions developed by the US Census Bureau.22 At the end of each school year, we asked the schools which, if any, prevention activities (other than Project ALERT in intervention schools) they implemented; responses were scored 0 (no programs), 1 (no evidence-based programs), or 2 (≥1 evidence-based program) and were averaged across the first and second years of participation. The semester that students completed the pretest (fall vs spring of sixth grade) also was examined as a covariate.
We began by cleaning our data to remove cases with more than 1 logical inconsistency (eg, respondents who reported that they had used alcohol within the preceding 30 days but never in their lifetime), which decreased our pretest sample by 3.6% of respondents and our posttest sample by 7.5%. There was no evidence to suggest differential inconsistency as a function of intervention group. Missing covariate data were imputed by means of the expectation-maximization algorithm, which uses maximum likelihood estimation to ensure consistency between the variance-covariance matrices of the observed and imputed data,23 as implemented in the missing value analysis module of SPSS statistical software (version 13.0; SPSS Inc, Chicago, Illinois). The proportion of missing values was minimal for covariates related to sex (1.1%), race (African American [1.7%] and white [1.7%]), and ethnicity (Hispanic [7.7%]).
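SPSS's expectation-maximization routine is proprietary, but the idea can be sketched for the bivariate normal case: the E-step fills each missing value with its conditional expectation given the observed variable, and the M-step re-estimates the moments from the completed data. This is a simplified illustration, not the SPSS implementation; a full EM would also carry the conditional variance into the sufficient statistics.

```python
import numpy as np

def em_impute_bivariate(x, y, n_iter=50):
    """Simplified EM-style imputation for missing y values (NaN) given fully
    observed x, assuming an approximately bivariate normal relationship."""
    y = y.copy()
    miss = np.isnan(y)
    y[miss] = np.nanmean(y)  # crude starting value: observed mean
    for _ in range(n_iter):
        mu_x, mu_y = x.mean(), y.mean()
        cov = np.cov(x, y)
        slope = cov[0, 1] / cov[0, 0]  # regression coefficient of y on x
        # E-step: conditional expectation of y given x for missing cases
        y[miss] = mu_y + slope * (x[miss] - mu_x)
        # M-step: moments are recomputed from the completed data next pass
    return y
```

On simulated data with y ≈ 2x plus noise, the imputed values converge toward the conditional mean 2x rather than the crude marginal mean.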
We then compared key baseline scores of schools assigned to the intervention and control groups with regard to student- and school-level characteristics and potential confounders as measured at the school level (ie, concurrent exposure to prevention and when students took the pretest). These tests were performed using t tests (using the binomial approximation to the variance for dichotomous characteristics) for school-level characteristics. Student-level characteristics were compared using a hierarchical linear model analog to the t test, assuming a Bernoulli-distributed outcome. Students did not differ on outcomes as a function of when they completed the pretest (fall vs spring). There was no evidence of a cohort effect; therefore, cohort was not considered further. A Heckman 2-step procedure24,25 was used to produce a student-level variable representing selection biases as measured by these covariates (ie, the inverse Mills ratio). Similar comparisons were used to determine whether attrition was systematically related to these student-level characteristics.
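The inverse Mills ratio produced by the Heckman second step is simply φ(z)/Φ(z) evaluated at each student's fitted probit linear predictor. A minimal sketch of that quantity follows; the first-step probit fit itself is omitted, and the linear predictor is assumed given.

```python
import numpy as np
from scipy.stats import norm

def inverse_mills(z):
    """Inverse Mills ratio lambda(z) = phi(z) / Phi(z), where z is the fitted
    probit linear predictor from the first-step selection model."""
    z = np.asarray(z, dtype=float)
    return norm.pdf(z) / norm.cdf(z)
```

At z = 0 the ratio equals 2φ(0) ≈ 0.798, and it shrinks toward 0 as selection into the sample becomes near certain, which is why it serves as a covariate capturing selection pressure.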
The study's primary analyses were performed using hierarchical nonlinear modeling. Each model regressed 1 of the 8 substance use outcomes on (1) perceived availability/offers of use, (2) intentions to use, (3) the orthogonal interactions of the preceding 2 predictors with intervention status, (4) the inverse Mills ratio, (5) an intervention status dummy variable, (6) a time dummy variable, and (7) an orthogonal interaction term for intervention status by time. Models took the following general form:
Log[p/(1 − p)] = π0 + π1(Time) + π2(Time × Intervention), in which p indicates the probability of the outcome.
Further information concerning our analysis strategy may be obtained from one of us (C.L.R.). All models were run using HLM software, version 6.04.26
For cohorts 1 and 2, the flow diagrams presented in Figure 1 and Figure 2, respectively, display the assignment of schools and students to the intervention and control groups and their movement through the study.
Flow diagram showing participation in cohort 1. Ten schools were randomly assigned to an intervention or a control group. After consent was obtained, pretest evaluations were administered to sixth graders. Posttest evaluations were administered after the booster lessons (in the intervention group) to the seventh graders.
Flow diagram showing participation in cohort 2. Thirty schools were randomly assigned to an intervention or a control group. After consent was obtained, pretest evaluations were administered to sixth graders. Posttest evaluations were administered after the booster lessons (in the intervention group) to the seventh graders.
Students' baseline characteristics, disaggregated by treatment group, are displayed in Table 1. Schools in the control group were more likely to offer prevention programming not related to Project ALERT (1.12 vs 0.85; P = .01), and intervention group students were more likely to have used alcohol in the past 30 days than control group students (7.5% vs 5.5%; P = .03). As can be seen in Table 1, students who dropped out of the study were less likely to be white (47.6% vs 52.5%). Differential attrition was not a problem in our study because attrition between times 1 and 2 was approximately 21% in both the intervention and control groups.
There were 5883 unique participants who contributed data to the analyses. We chose an intent-to-treat model for our analysis strategy and therefore used all available data, regardless of whether there were repeated observations for any particular case. After data cleaning, we had repeated observations for 4466 of the 5883 unique participants (75.9%). Examining these data by time and intervention group, our final analyses were based on 2765 intervention pretest observations, 2324 intervention posttest observations, 2805 control pretest observations, and 2358 control posttest observations.
Raw baseline and follow-up prevalence rates for all 8 outcome measures in both study conditions are provided in Table 2. As expected, use of all substances increased over time in both conditions. Results of the tests to determine the statistical significance of apparent intervention effects, after covariate adjustment, are displayed in Table 3. Intervention effects are captured with the time × intervention terms. Because the odds ratios for these interaction terms are not readily interpretable, we decomposed the interactions by calculating discrete odds ratios for times 1 and 2 using predicted probabilities from our models. With the exception of lifetime cigarette use, the 2 groups' time-related patterns of use were either nearly identical or differed in the desired direction. More specifically, although intervention students tended to exhibit more use at baseline, rates of use for the 2 groups generally converged at follow-up. A significant difference in the expected direction was found for 1 of the 8 outcome measures examined: 30-day alcohol use. This effect exhibited a similar pattern: although intervention students were twice as likely to have drunk alcohol in the past 30 days at time 1 (odds ratio, 2.07), intervention and control students were more similar at time 2 (odds ratio, 1.32).
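Decomposing the interaction in this way reduces to computing the odds ratio at each time point from the model-predicted probabilities. The sketch below uses the raw (unadjusted) baseline 30-day alcohol prevalences reported in Table 1 for illustration; the article's 2.07 and 1.32 are covariate-adjusted values from the fitted models and cannot be reproduced from the raw rates alone.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def discrete_odds_ratio(p_intervention, p_control):
    """Odds ratio comparing intervention with control at a single time point."""
    return odds(p_intervention) / odds(p_control)

# Raw baseline 30-day alcohol prevalences reported in the article: 7.5% vs 5.5%
baseline_or = discrete_odds_ratio(0.075, 0.055)
```

The raw baseline odds ratio works out to about 1.39, smaller than the adjusted 2.07, which shows how much the covariate adjustment shifts the baseline contrast.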
We conducted a large randomized trial of Project ALERT, one of the nation's preeminent school-based substance use prevention curricula, to test for 8 potential program effects on lifetime and 30-day use of cigarettes, alcohol, marijuana, and inhalants. Of these, we found a single positive effect on 30-day alcohol use. This finding might be considered an anomaly in that none of the 3 previous evaluations of Project ALERT reported a main effect of this nature; positive results have only been found for alcohol misuse, cigarette use, and marijuana use.
Although this study was conducted largely under real-world (or effectiveness) conditions, a number of circumstances should be considered when interpreting its results. First, we tested the program's effects on sixth graders, not the seventh graders for whom it originally was developed, and it is possible that the material may have been developmentally inappropriate for some of the students targeted. Had we targeted older students, substance use would likely have been more frequent, in which case we would have had more analytic power to detect study outcomes. Our statistical power appears to have been further eroded by the somewhat larger-than-typical intraclass correlations observed for a number of the outcome measures, marijuana use in particular. Second, although our study schools were national, we recruited a convenience sample, and the external validity of the study findings is thus limited. Third, we waited until the conclusion of the 3 booster lessons administered in the study's second year to administer our initial posttest evaluation. Because main effects for evaluations of this nature generally attenuate after a prevention curriculum's initial year, the likelihood that we would have found significant effects for Project ALERT may have suffered as a result. On the other hand, our strategy of waiting at least 30 days after the final booster lesson before administering the posttest evaluation and our use of data collectors who were unknown to the students outside the context of the study should have minimized any contextual demand effects that may otherwise have exaggerated programmatic outcomes.
We believe that our analysis strategy was quite conservative because it made use of all available data, regardless of students' actual level of exposure to the intervention. We used a repeated-measures strategy, rather than a regressor approach, to statistically control for baseline status on key measures and then to examine differences between intervention and control groups at the posttest evaluation. We chose the former strategy because our use of hierarchical nonlinear modeling enabled us to use all respondents, regardless of whether they yielded missing data at the pretest or the posttest evaluation, which is more consistent with an intent-to-treat approach to analysis. Also, when we compared the effects of both analysis strategies using only those cases with data at the pretest and posttest evaluations, the pattern of results was nearly identical.
Taking into consideration our findings in conjunction with those reported earlier,12-14 should Project ALERT be considered evidence based? The answer, of course, depends on the standards of evidence used. We believe that the methodological rigor of this and previous evaluations of the curriculum has been high. All 4 of these studies have used randomized controlled trials at different sites with large samples, have achieved reasonably high follow-up rates, and have taken into account preexisting differences at baseline and the clustering effects associated with schools as the unit of assignment. However, the program effects yielded by these evaluations have been inconsistent, and our 1 positive finding related to 30-day alcohol use has not been replicated in any of the others. Study findings suggest that Project ALERT was not effective when delivered to the sixth grade population we targeted.
Correspondence: Christopher L. Ringwalt, DrPH, Pacific Institute for Research and Evaluation, 1516 E Franklin St, Ste 200, Chapel Hill, NC 27514 (email@example.com).
Accepted for Publication: December 19, 2008.
Author Contributions: Dr Ringwalt had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: Ringwalt and Flewelling. Acquisition of data: Ringwalt, Clark, and Hanley. Analysis and interpretation of data: Ringwalt, Shamblen, and Flewelling. Drafting of the manuscript: Ringwalt. Critical revision of the manuscript for important intellectual content: Clark, Hanley, Shamblen, and Flewelling. Statistical analysis: Shamblen and Flewelling. Obtained funding: Ringwalt. Administrative, technical, and material support: Clark and Hanley. Study supervision: Ringwalt and Clark.
Financial Disclosure: None reported.
Funding/Support: This study was supported by grants 2003-DR-FX-001 and 2007-JF-FX-0064 from the Office of Juvenile Justice and Delinquency Prevention, Office of Justice Programs, US Department of Justice (Dr Ringwalt).
Role of the Sponsor: The Office of Juvenile Justice and Delinquency Prevention is a governmental agency that had no role in the design and conduct of the study; collection, management, and analysis of the data; or preparation, review, or approval of the manuscript.
Additional Contributions: Sharon Fowler and Tina Owen, of the Pacific Institute for Research and Evaluation, assisted with manuscript preparation. Chris Wiesen, PhD, of the Odum Institute for Research in Social Sciences, provided statistical expertise. Members of our Evaluation Advisory Board provided guidance throughout the project. We thank the participating students, parents, teachers, school and school district staff, and local data collectors, without whom this study would not have been possible.
Ringwalt CL, Clark HK, Hanley S, Shamblen SR, Flewelling RL. Project ALERT: A Cluster Randomized Trial. Arch Pediatr Adolesc Med. 2009;163(7):625-632. doi:10.1001/archpediatrics.2009.88