Department of Alcohol and Drug Programs
Office of Applied Research and Analysis
Substance Abuse and Crime Prevention Act of 2000
Evaluation Advisory Group (EAG)
Meeting Notes
February 28, 2001


Kathy Jett, Director, Department of Alcohol and Drug Programs, welcomed participants and issued a challenge to the group. She invited attendees to assist the Department in developing a strategy for the implementation and evaluation of Proposition 36 - the Substance Abuse and Crime Prevention Act of 2000 (the Act), which becomes effective on July 1, 2001. The Department will be responsible for producing an annual evaluation report and a longitudinal (5-year) evaluation, by which the Act's success will be measured.

She spoke to the administration of funds, which flow from the state to the counties; counties have discretionary spending authority so long as they comply with the provisions of the Act. One pressing issue involves the assessment tools currently in use and the need to standardize the core assessment data to be collected, since existing tools may emphasize one area (such as mental health) while not adequately addressing others.

Ms. Jett also noted that many agencies already collect enormous amounts of data; the group's challenge will be to determine whether the data collected provide all of the information necessary to complete a comprehensive evaluation. The evaluation, and the process, is the most important piece of Proposition 36.

Ms. Jett thanked the group for attending and for their participation in the process, and encouraged continued refinement of the group's membership so that all issues and client populations are fully represented.

Evaluation Requirements, Process and Status

Susan Nisenbaum, Deputy Director, Office of Applied Research and Analysis, within the Department of Alcohol and Drug Programs, also welcomed the group. Susan then asked each participant to introduce themselves, express their areas of interest with regard to the group, and share any concerns about what they expected to accomplish in this first meeting.

Susan stated that she (within the Department) is responsible for the evaluation of the Act, and expects that the group will help determine what needs to be done. Through careful, effective, supportable research, the evaluation should measure the outcomes and impacts of the Act. This could bring about statewide, and possibly national, policy changes, and - within a very tight timeframe - examine whether treatment works and whether treatment pays. The passage of Proposition 36 indicates the inclination of California's citizens toward humane treatment instead of incarceration, and that this population needs to be treated as patients rather than criminals.

The evaluation should address what has been accomplished so far, and focus on annual, long-term, and public safety issues.

The evaluation will be conducted by a public university:

  • Funding of $300,000 to start
  • Administered by the Department

Purpose of the EAG

Samantha briefly summarized the process of selecting the public university to conduct the longitudinal study required by the legislative language. ADP received four proposals: from UCLA, UCSF, UC Riverside, and Cal Poly San Luis Obispo.

Group Discussion: Supporting and Advising on the Evaluation

There are different ideas of what constitutes success with regard to an effective evaluation.

The group offered that definitions must be clear.

What is doable:

  • what data is already in place
  • what are the common elements of data being gathered
  • what must be collected in addition to what is being collected
  • what must be done to rework existing systems to include additional requirements
  • how to keep the process as user friendly as possible, without adding layers and placing an additional burden on providers
  • what existing tools are available
  • are they linked in any way; how can linkages be accomplished
  • do they provide all the data required for the annual, longitudinal, and county annual reports
  • who paid for services
  • what is the impact of the Act
  • what will be the influence of this group
  • what must be collected at each level, county and statewide
  • how to identify expertise to develop/operationalize process
  • research questions: what do we need answers to
  • what measurable results/outcomes at provider level need to be addressed
  • legislation drives and constrains what the group can/should do (the language contained in the legislation will determine actions).

The group will impact state level leadership on Proposition 36 implementation and evaluation. The process evaluation is extremely important.

The group discussed the provisions of the law, and discussed studying different portions of the required evaluation at different times throughout. Year one does not have to be the same as year two, since there will be limited information available at the end of the first year. The evaluation should be driven by the research questions. It was suggested that an open interpretation of the legislative language would be useful, and the group should not be overly focused on the treatment side. The composition of the group should reflect diversity, and have adequate representation for all populations.

The following items were offered for thought:

  • Section 11999.9: Annual evaluation and cost effectiveness
  • Cost effectiveness/process evaluation
  • Year one does not have to be the same as year two
  • Is sampling the best method?
  • CALTOPPS captures some of the necessary data
  • Learn what we can from the CALWORKS model
  • Outside evaluation process - longitudinal study by outside evaluator
  • Latitude in legislative language
  • Flexibility: "aimed at program effectiveness"
  • No explicit definition of "program" in the law
  • Must have feedback; cannot just monitor
  • Changes of cost in crimes; descriptive information
  • No control group
  • Limited information available at end of first year
  • Cost benefit analysis
  • Treatment vs. incarceration
  • Impact on criminal behavior
  • Reduction of costs on all impacted systems
  • Community safety (public)
  • What constitutes effectiveness
  • Financial impact
  • Drug Courts that already exist

County Reports

  • Drug Court experience; stick with agreed on forms
  • What is a result of Prop 36
  • When does client become eligible (clock starts ticking upon conviction)
  • Who are Prop 36 people
  • Net widening
  • Purpose and intent of law
  • Threats to internal validity

ADP: Revise Current Data Collection System

  • CADDS data collection
  • Does county annual data have consistent criminal justice reporting capability
  • Confidentiality: who owns the information
  • No/minimal duplication of effort
  • Well-being outcomes
  • Original vision/Timeline

Request for Application - Attachment A

  • Annual report (to Legislature)
  • Released when/budget issue
  • February of '02 problem?
  • Review period for selecting contractor much too short
  • Ask for correct skill sets
  • What is required background of contractor
  • Balance important/quick timeframe
  • Remain flexible; 2 cycle process

How should this group work: as advisors to ADP, or as evaluators? Who will facilitate: ADP, or the evaluators?

Next Steps

When shall we meet? Meet with state data people. Think of research questions. Email the group with suggestions.

Susan thanked the group for their participation and hard work, and invited continued communication and participation.


Overview of Evaluation Implementation Plan:

3-Point Process

  • Work Advisory Committee (Representatives of Corrections, Probation and Parole, Board of Prison Terms, Department of Justice, and the Universities)
  • Selected for knowledge of databases and evaluation research
  • Assists with resource identification, evaluation guidelines, design and data collection methods

Required Activities:

Annual evaluation:
  • Lower incarceration costs
  • Crime reduction
  • Reduced jail/prison construction
  • Reduced welfare costs
  • Adequacy of funds (?)
  • Other issues

Long-term evaluation:
  • Effectiveness and fiscal impact

County Annual Reports:

  • Number and characteristics of clients served
  • ADP to "promulgate" the form to be used for reporting
  • Other information - as required


  • Criminal justice system
  • Data/process
  • Composition of group and its diversity
  • Open interpretation of the law
  • Be question driven on research
  • Don’t be overly treatment focused

Research Frame:

  • Speak to changes in costs of crime
  • No control group - work limits
  • "Meat" is study of implementation process (especially years 1 and 2)
  • Outcome is evaluation of the last 3 years
  • Different approaches each year
  • Sampling methodology versus statewide

Annual Research Frame:

  • CALTOPPS gathers some of needed data currently
  • Is there enough funding? Can we obtain more?
  • ADP resources: enough?
  • Study different parts of required evaluation at different times
  • Begin with goal of monitoring change in basic ways. Establish what indicators of change we want most, first.

Research Frame:

  • Monitoring differences from other evaluation efforts (with feedback)
  • Recognize implementation as an ongoing process - ever changing - cyclical "annual process report"
  • Learn what we can from CALWORKS model

Outside Evaluation:

  • "Aimed at" program effectiveness (gives flexibility) and financial impact of SACPA programs
  • No explicit definitions of programs in the law
  • Cost benefit analysis of treatment versus incarceration and whether more/less likely to be involved in crimes; public safety impact (community)
  • Effectiveness of treatment on recidivism and cost benefit (financial impact)

County Reports:

  • Once form is decided on, stick with it
  • Treatment keeps lots of data already
  • Drug Court data includes criminal justice data
  • Define "Result" - not clear in language
  • How do we define when people are "Prop 36 eligible" (Points in time)
  • Net widening
  • Threats to internal validity
  • Annual reports to ADP include arrest and court data?
  • DOJ reports - don’t duplicate
  • Data linkage for longitudinal data
  • Confidentiality
  • Who owns what information
  • Longitudinal criminal and treatment histories - can we do this?

RFA Questions:

  • Do we submit the annual report to the legislature?
  • Is February of ’02 a problem?
  • Realistic timeframe to establish Interagency Agreement/select contractor (Review period too short)
  • Quality of selection process an issue
  • Remain flexible; take 21 days
  • 2 cycles for selection of contractor

Next Steps:

  • Meeting frequencies
  • Discuss data with state departments
  • Research Questions - think about this during interim

Next Meeting:
Thursday, March 22, 2001
The Department of Food and Agriculture Building
1220 N Street, Room A-477
10:00 a.m. - 3:00 p.m.
ADP (OARA) will facilitate