It wasn't effective anywhere; that's the problem. Look at the studies done on it, which are listed in smjjames' link. They either find no effect whatsoever, or they find that participants took more drugs, or tried them at younger ages, than non-participants. I'll just drop the overview of the studies here:
Studies on effectiveness
1992 – Indiana University
Researchers at Indiana University, commissioned by Indiana school officials in 1992, found that those who completed the D.A.R.E. program subsequently had significantly higher rates of hallucinogenic drug use than those not exposed to the program.[24]
1994 – RTI International
In 1994, three RTI International scientists evaluated eight earlier quantitative analyses of DARE’s efficacy that met their requirements for rigor.[25][26] The researchers found that DARE’s long-term effect couldn’t be determined, because the corresponding studies were “compromised by severe control group attrition or contamination.”[26] However, the study concluded that in the short term “DARE imparts a large amount of information, but has little or no impact on students’ drug use,” and that many smaller, interactive programs were more effective.[25][27]
After the 1994 Research Triangle Institute study,[28][29] an article in the Los Angeles Times stated that the “organization spent $41,000 to try to prevent widespread distribution of the RTI report and started legal action aimed at squelching the study.”[30] The director of publication of the American Journal of Public Health told USA Today that "D.A.R.E. has tried to interfere with the publication of this. They tried to intimidate us."[31]
1995 – California Department of Education
In 1995, a report to the California Department of Education by Joel Brown, Ph.D., stated that none of California's drug education programs worked, including D.A.R.E.: "California's drug education programs, D.A.R.E. being the largest of them, simply don't work. More than 40 percent of the students told researchers they were 'not at all' influenced by drug educators or programs. Nearly 70 percent reported neutral to negative feelings about those delivering the antidrug message. While only 10 percent of elementary students responded to drug education negatively or indifferently, this figure grew to 33 percent of middle school students and topped 90 percent at the high school level." Some educators and administrators have acknowledged that DARE may in fact have increased students' exposure to and knowledge of unfamiliar drugs and controlled substances, resulting in experimentation and consumption of narcotics at a much younger age. Criticism focused on the program's failure and its misuse of taxpayer dollars, with ineffective or negative results statewide.[30]
1998 – National Institute of Justice
In 1998, a grant from the National Institute of Justice to the University of Maryland resulted in a report to the NIJ, which, among other statements, concluded that "D.A.R.E. does not work to reduce substance use."[32] D.A.R.E. expanded and modified the social competency development area of its curriculum in response to the report. Research by Dr. Dennis Rosenbaum in 1998[33] found that D.A.R.E. graduates were more likely than others to drink alcohol, smoke tobacco and use illegal drugs. Psychologist Dr. William Colson asserted in 1998 that D.A.R.E. increased drug awareness so that "as they get a little older, they (students) become very curious about these drugs they've learned about from police officers."[34] The research evidence available in 1998 indicated that the officers were unsuccessful in preventing that increased awareness and curiosity from being translated into illegal use. The evidence suggested that, by exposing young, impressionable children to drugs, the program was in fact encouraging and nurturing drug use.[35] Studies funded by the National Institute of Justice in 1998[32][36] and the California Legislative Analyst's Office in 2000[37] also concluded that the program was ineffective.
1999 – Lynam et al.
A ten-year study involving one thousand D.A.R.E. graduates was completed by Donald R. Lynam and colleagues in 1999 in an attempt to measure the effects of the program. After the ten-year period, no measurable effects were noted. The researchers compared levels of alcohol, cigarette, marijuana and illegal substance use before the D.A.R.E. program (when the students were in sixth grade) with post-D.A.R.E. levels (when they were 20 years old). Although the program had some measurable short-term effects on students' attitudes towards drug use, these effects did not persist over the long term.[38]
2001 – Office of the Surgeon General
In 2001, the Surgeon General of the United States, David Satcher, M.D., Ph.D., placed the D.A.R.E. program in the category of "Ineffective Primary Prevention Programs".[6] The U.S. General Accounting Office concluded in 2003 that the program was counterproductive in some populations, with those who graduated from D.A.R.E. later having higher than average rates of drug use (a boomerang effect).
2007 – Perspectives on Psychological Science
In March 2007, the D.A.R.E. program was placed on a list of treatments that have the potential to cause harm in clients in the APS journal, Perspectives on Psychological Science.[39]
2008 – Harvard
Carol Weiss, Erin Murphy-Graham, Anthony Petrosino, and Allison G. Gandhi, “The Fairy Godmother—and Her Warts: Making the Dream of Evidence-Based Policy Come True,” American Journal of Evaluation, Vol. 29, No. 1, 29–47 (2008). Evaluators sometimes wish for a Fairy Godmother who would make decision makers pay attention to evaluation findings when choosing programs to implement. The U.S. Department of Education came close to creating such a Fairy Godmother when it required school districts to choose drug abuse prevention programs only if their effectiveness was supported by "scientific" evidence. The experience showed advantages of such a procedure (e.g., reduction in support for D.A.R.E., which evaluation had found wanting) but also shortcomings (limited and in some cases questionable evaluation evidence in support of other programs). Federal procedures for identifying successful programs appeared biased. In addition, the Fairy Godmother discounted the professional judgment of local educators and did little to improve the fit of programs to local conditions. Nevertheless, giving evaluation more clout is a worthwhile way to increase the rationality of decision making. The authors recommend research on procedures used by other agencies to achieve similar aims.
2009 – Texas A&M
Dennis M. Gorman and J. Charles Huber, Jr., “The Social Construction of ‘Evidence-Based’ Drug Prevention Programs: A Reanalysis of Data from the Drug Abuse Resistance Education (DARE) Program,” Evaluation Review, Vol. 33, No. 4, 394–414 (2009). Studies by Dennis Gorman and Carol Weiss argue that the D.A.R.E. program has been held to a higher standard than other youth drug prevention programs. Gorman writes, “what differentiates D.A.R.E. from many of the programs on evidence-based lists might not be the actual intervention but rather the manner in which data analysis is conducted, reported, and interpreted.”
As the D.A.R.E. program has been subjected to increasing scrutiny over the years, its overall effectiveness has become controversial and remains much debated.
The U.S. Department of Education prohibits its funding from being used to support drug prevention programs that have not demonstrated their effectiveness.[40] Accordingly, D.A.R.E. America instituted a major revision of its curriculum in 2004, which is currently being evaluated for possible effectiveness in reducing drug use.[41]
The U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) identified alternative start-up regional programs, none of which has longevity or has been subjected to intense scrutiny.[42]
So, DARE seems to make things worse. It started in 1983, and if you look at violent crime rates, you can see that they were falling until the very year DARE started, then began to skyrocket.
My point here isn't that DARE caused this; it's that just looking at correlations is meaningless. If crime had fallen immediately after DARE was initiated, they'd be taking credit for that. But that's not what happened: an unprecedented crime spike occurred after DARE was initiated, and they wash their hands of it ("clearly not our fault, because we're an anti-crime initiative"). Taking credit for any positive change while absolving oneself of blame for any negative change is clear institutional confirmation bias.
However, since the studies suggest DARE makes drug-taking behavior more likely, we can make a solid claim that DARE contributed some small part to the late-1980s crime wave and impeded the subsequent declines, rather than contributing to them. Notably, the homicide rate fell strongly again from 2005 to 2010, at the same time that DARE funding was slashed because of negative political reactions to the earlier studies.
When I said I hadn't looked into it "too much," I meant that I've read the descriptions of the studies relating to DARE's effectiveness without chasing up the original research or looking for other articles.