- Chapter 1—The Counties Have Provided Weak Oversight of the JJCPA
- Chapter 2—The State Has Not Provided Sufficient Oversight of the JJCPA
The Counties Have Provided Weak Oversight of the JJCPA
The five counties we reviewed did not adequately oversee their JJCPA planning efforts. One county did not have the required Coordinating Council, and two lacked some of the required representatives on their Coordinating Councils. We determined that up to 10 other counties in the State also lack Coordinating Councils. In addition, although state law requires counties to update their comprehensive plans annually, the five counties we reviewed have made only infrequent and limited revisions since initially developing their plans in 2001, despite significant changes in the statewide juvenile justice landscape. Moreover, the counties’ comprehensive plans generally lacked critical information, such as how they define at‑risk youth, despite state law requiring Coordinating Councils to describe how they will serve this population. (AB 413 (Chapter 800, Statutes of 2019) deleted the term “at‑risk” used to describe youth for purposes of various provisions in the California Education and Penal Codes and replaced it with the term “at‑promise.” However, the term “at‑risk” currently remains in JJCPA as part of the California Government Code. As a result, we use the term “at‑risk” consistent with the JJCPA throughout our report.) Plans that are outdated and lack critical information are of limited value for stakeholders and the public because they do not demonstrate how counties are adapting to changes in their juvenile justice environment.
Although counties have broad discretion in how they choose to spend JJCPA funds, they have not demonstrated that the programs they have chosen to operate are effective. State law requires counties to include a description or analysis in their year‑end reports of how their JJCPA‑funded programs may have contributed to or influenced countywide juvenile justice trends. However, the counties we reviewed have not submitted to Community Corrections meaningful evaluations of the effectiveness of their JJCPA programs, hindering the ability of decision makers and stakeholders to gauge whether the counties are using JJCPA funds in a manner that reduces juvenile crime and delinquency.
Coordinating Council Representatives
- One from the district attorney’s office.
- One from the public defender’s office.
- One from the sheriff’s department.
- One from the board of supervisors.
- One from the department of social services.
- One from the department of mental health.
- One from a community‑based drug and alcohol program.*
- One from a city police department.
- One from the county office of education or a school district.
- One from the community at large.
- Two from nonprofit CBOs providing services to minors.
Source: State law.
* This member can be one of the two representatives from nonprofit CBOs providing services to minors.
The Coordinating Councils We Reviewed Did Not Always Include Statutorily Required Representatives
The JJCPA requires each county’s Coordinating Council to develop, review, and annually update its comprehensive plan in part with the goal of reducing juvenile crime and delinquency through crime prevention strategies. A Coordinating Council must include representatives from at least 11 specific entities, which the text box lists. The Coordinating Councils we reviewed met at least annually but had varied processes to update their plans, which we describe later in this section. The diverse representation of Coordinating Councils is key to ensuring the multiagency approach that the JJCPA requires and to directing JJCPA funding toward the services, geographic areas, and programs that councils deem most important.
However, we found that some counties that receive JJCPA funding do not have Coordinating Councils. Of the five counties we reviewed, Mendocino has not had a Coordinating Council since 2009. The county’s chief probation officer was unsure why the county lacks a Coordinating Council. Because Mendocino lacks a Coordinating Council but has still received JJCPA funds, we researched whether any of the remaining 53 counties also lack Coordinating Councils. Six counties confirmed that they lacked Coordinating Councils during our audit period. The websites of another four counties do not make clear whether they have councils, and these counties did not respond to our inquiries; therefore, they may also lack Coordinating Councils. For instance, Plumas County received JJCPA funding during fiscal year 2018–19, and its probation department submitted a comprehensive plan to Community Corrections in May 2018. However, the county board of supervisors approved a resolution in September 2019 indicating that it was seeking to establish a Coordinating Council, thus acknowledging that it did not have one. We describe in Chapter 2 why counties without Coordinating Councils likely continued to receive JJCPA funding.
Of the four counties we visited that had established Coordinating Councils, only two—San Joaquin and Santa Barbara—had all of the required representatives in each of the five years we reviewed, as Table 2 shows. Kern’s Coordinating Council lacked a representative from a drug and alcohol abuse prevention program in fiscal years 2016–17 through 2017–18. The county’s probation department attempted to find a representative for the vacancy in August 2016 but did not receive any responses from interested applicants, and it did not revisit this vacancy until September 2019.
Table 2: Did the Coordinating Council include all statutorily required members throughout the fiscal year?
Source: State law and Coordinating Council meeting minutes and rosters.
* Mendocino did not have a Coordinating Council during our audit period.
Similarly, Los Angeles’s Coordinating Council lacked representatives from several required entities throughout each of the five years in our audit period. Although its April 2016 meeting minutes reflect that the Coordinating Council believed it had all of the required representatives, the county did not have a statutorily complete council until May 2018. Specifically, the county’s Coordinating Council did not include a representative from the county social services department until January 2017, from a community‑based drug and alcohol abuse prevention program until February 2017, and from two CBOs until January 2018. It then lacked a community representative from January 2018 through May 2018, when it finally had a fully constituted Coordinating Council. The chair of the county’s Coordinating Council assumed her position in February 2017 and told us that she immediately started working to add the missing representatives, but she did not know why the county did not take steps to fill the vacancies earlier. Without the diverse representation envisioned by the JJCPA, counties are unable to meet the JJCPA’s requirement to have a multiagency approach to juvenile justice planning.
A failure to adopt bylaws may have contributed to the lack of required representatives on some Coordinating Councils. As an established best practice in the absence of statutory or regulatory requirements, bylaws are the main governing document of a board or council and guide how the entity will operate. Bylaws increase the level of accountability, transparency, and effectiveness of entities and clearly outline authority levels, rights, and expectations. As a result, we expected counties to have established bylaws for their Coordinating Councils that describe how they fill their memberships and maintain required representation. San Joaquin and Mendocino, however, did not have any bylaws governing their Coordinating Councils. Mendocino has not had a Coordinating Council since at least 2009, and in response to our inquiry, San Joaquin said that it intended to establish bylaws going forward.
Further, the Coordinating Councils at the counties we reviewed had different processes for updating their comprehensive plans. For example, Kern’s and San Joaquin’s probation departments updated their counties’ comprehensive plans and submitted them to their respective Coordinating Councils for approval. The probation department in Mendocino, which did not have a Coordinating Council during our audit period, updated and submitted the county’s plans to Community Corrections. The remaining two counties—Los Angeles and Santa Barbara—have more inclusive plan development processes. Until recently, Los Angeles’s probation department updated its county’s comprehensive plan and submitted it to its Coordinating Council for approval. However, in 2019 Los Angeles’s Coordinating Council established an ad hoc subcommittee, whose members are proportionally representative of the full council’s composition, to update and revise its plan. The subcommittee’s first plan revision was for fiscal year 2019–20.
Santa Barbara used a work group during our audit period to develop and update its comprehensive plan. Before 2018 Santa Barbara used a temporary, informal work group with members appointed by the Coordinating Council to draft each year’s comprehensive plan. In 2018 the county formally established the work group, which meets every month to address issues that the Coordinating Council assigns to it. The work group is composed of members from each county agency required by statute to have a representative on the Coordinating Council, as well as from two city police departments and three CBOs. The county’s probation department told us that the Coordinating Council created the work group in part because it allows members to discuss the developing plan and provide input on its goals, objectives, and strategies. The probation department explained that having the work group ensures that juvenile justice agencies, county agencies, and community partners prepare in partnership the draft comprehensive plan that the Coordinating Council reviews and adopts.
Many Counties’ Comprehensive Plans Are Outdated and Incomplete
Although state law requires counties to annually update their comprehensive plans to reflect their current approaches to responding to at‑risk youth and juvenile offenders, the five counties we reviewed have rarely made substantial revisions to their plans over the last 20 years, despite significant changes in state law and decreases in juvenile arrest rates. Moreover, most of the counties’ comprehensive plans failed to define or explicitly identify at‑risk youth—a population that state law requires counties to address in their plans. When counties make only minimal updates to their comprehensive plans and fail to adequately identify services and strategies to address at‑risk youth, their comprehensive plans are likely to be outdated, incomplete, and of limited use for stakeholders and the public.
Despite Significant Changes in the Juvenile Justice Landscape, the Counties Have Rarely Modified Their Comprehensive Plans
State law requires Coordinating Councils to annually update their counties’ comprehensive plans and to submit them for the upcoming fiscal year in the format that Community Corrections specifies. Beginning in fiscal year 2002–03, Community Corrections implemented a template, referred to as an application for funding, that required each Coordinating Council to indicate either that the county was applying for continued funding without making changes to its plan or that it had made substantive modifications to its plan. However, even if a county had made a substantive modification to its plan, Community Corrections initially did not require its Coordinating Council to submit the revised plan. In fiscal year 2006–07, Community Corrections modified the application for funding to require a Coordinating Council to include its fully revised plan if it indicated on the application that the county had made substantial changes to the plan components. Such changes could include the removal or addition of a program, changes in the target population served by a program, or significant changes in a program’s outcomes. Other modifications to the plan might include changes in the prioritization of areas in the community that are affected by juvenile crime, changes in the resources that provide services to youth and their families, and changes to the county’s responses to at‑risk youth and juvenile offenders. In fiscal year 2016–17, Community Corrections significantly revised its template by consolidating into one plan the required information for the JJCPA and the Youthful Offender Block Grant, which we describe in the Introduction.
We expected that Coordinating Councils would periodically revise their counties’ plans to reflect major changes in the statewide juvenile justice landscape. Since the passage of the JJCPA, various state laws have substantially shifted the way that the State and local governing entities treat juvenile offenders. For example, as Figure 1 shows, in 2014 and 2016, California voters approved propositions that reduced certain crimes from felonies to misdemeanors and reduced the penalties for certain drug‑related offenses. These reductions and other shifts in state policy over the last two decades likely contributed to a decrease in statewide juvenile arrest rates, which declined by 76 percent from 2002 through 2018. We expected that in response to the decreasing number of juvenile offenders, counties would have periodically reassessed the areas where juvenile crime occurs and made changes to their strategies for addressing juvenile crime. In fact, according to Community Corrections, the comprehensive plans are intended to describe how JJCPA‑funded programs fit within the context of counties’ overall juvenile justice strategies. By updating their comprehensive plans, Coordinating Councils could demonstrate to their communities that their counties are appropriately modifying their strategies for serving juveniles to reflect changes in the State’s approach to addressing juvenile crime and delinquency.
Significant Changes in the Juvenile Justice Landscape Merited Revisions to Counties’ Comprehensive Plans
Source: State law, California Department of Justice’s Juvenile Justice in California annual reports, 2002 through 2018, and the five counties’ comprehensive plans.
However, the Coordinating Councils for the counties we reviewed generally did not update their counties’ comprehensive plans, and when they did, the counties made only limited revisions that failed to demonstrate how their strategies for addressing juvenile crime and delinquency had changed over the last 20 years. For example, as Figure 2 shows, San Joaquin has not reported any significant changes to its comprehensive plan that would indicate a shift in the county’s strategy for addressing juvenile crime and delinquency. Instead, its changes were at the program level, such as when it reported in fiscal years 2004–05 and 2010–11 that it removed programs, and in fiscal years 2015–16 and 2017–18 that it added programs operated by its probation department. However, San Joaquin did not explain whether or how either of these changes represented a shift in its approach to addressing juvenile crime and delinquency. In addition, Mendocino made some changes to its comprehensive plan in fiscal years 2004–05 and 2009–10, but it did so primarily to eliminate certain JJCPA‑funded programs, largely because of budget reductions. It did not make any further changes to its comprehensive plan until fiscal year 2019–20.
Coordinating Councils Have Made Few Changes to Their Comprehensive Plans
Source: Counties’ comprehensive plans submitted to Community Corrections, fiscal years 2002–03 through 2019–20.
Similarly, one of the few changes Kern made to its comprehensive plan was in fiscal year 2004–05, when it terminated a program operated by its probation department because of funding constraints. Kern did not update its plan again until fiscal year 2010–11, when it added a JJCPA‑funded program and revised its method for assessing whether juveniles are at risk of reoffending. Although this latter change is significant because it represents a shift in the county’s strategy for identifying and prioritizing juveniles, Kern made no further significant changes to its comprehensive plan until fiscal year 2019–20. Given that the State’s approach to juvenile justice has transformed significantly in the nearly 20 years since the Legislature enacted the JJCPA, we expected to see corresponding shifts in the strategies and services the counties describe in their annual plans. Because the counties’ Coordinating Councils generally did not revise their comprehensive plans to reflect changes in state policy, some of the plans are likely outdated and do not accurately reflect the counties’ strategies for addressing juvenile crime and delinquency.
Los Angeles and Santa Barbara recently conducted countywide evaluations of their respective juvenile justice systems, resulting in complete revisions of their comprehensive plans. In 2017 Los Angeles contracted with an external evaluator to assess the county’s implementation of JJCPA‑funded programs, determine the programs’ effectiveness, and make recommendations for system improvements. The review contributed to Los Angeles making some changes to its programs in fiscal year 2018–19 and to the county completely revising its comprehensive plan for fiscal year 2019–20. Previously, Los Angeles had acknowledged in its plan for fiscal year 2016–17 that it had not evaluated or redesigned its JJCPA‑funded service delivery system since the Legislature enacted the JJCPA in 2000.
Similarly, in 2017 Santa Barbara embarked on a review of its juvenile justice system by comparing various data elements of its system, such as juvenile hall population, against four counties it selected for proximity, demographic similarity, and progressive practices. This review resulted in its Coordinating Council completely revising the county’s comprehensive plan for fiscal year 2018–19. In addition, Santa Barbara was the only county we reviewed that described in its comprehensive plan how its juvenile justice system was affected by a state law change in 2007 that shifted the State’s responsibilities for housing certain types of juvenile offenders from the State to the counties. By conducting such countywide evaluations of their juvenile justice systems, the Coordinating Councils in Los Angeles and Santa Barbara provided valuable updates to their comprehensive plans about their current approaches to addressing juvenile crime and delinquency. However, had these two counties made significant changes to their plans regularly over the last two decades, these comprehensive revisions might not have been necessary because their plans would have already reflected changes in state policy and juvenile justice trends.
The other three counties’ Coordinating Councils cited different reasons for why they rarely revised their comprehensive plans. San Joaquin acknowledged that it did not make many changes to its plan but stated that it believed it met reporting requirements by noting the few changes it did make in its application for funding. Although it may have satisfied Community Corrections’ limited reporting requirements, San Joaquin did not make significant changes to its comprehensive plan to respond to trends in juvenile justice over the last 20 years, as we note in Figure 2. Because of turnover in the chief probation officer’s position, Mendocino could not explain why it rarely updated its comprehensive plan, whereas Kern indicated that it did not believe there was a need for substantial changes to its plan. We disagree because state law requires Coordinating Councils to annually reassess their countywide juvenile justice programs and strategies. Moreover, counties should update their plans to reflect changes both to the populations of at‑risk youth and juvenile offenders that they need to serve and to the areas in their communities at highest risk of juvenile crime.
Community Corrections’ limited oversight of the contents of counties’ comprehensive plans and its reliance on the application for funding, discussed earlier in this section, contributed to the inadequacies we identified in counties’ plans. As we describe in the Introduction, Community Corrections was responsible for reviewing and approving counties’ comprehensive plans until 2016. From fiscal years 2006–07 through 2016–17, the application for funding stated that Coordinating Councils must include counties’ comprehensive plans with their applications for funding if the plans were substantially changed; however, Community Corrections stated that in practice, it did not require Coordinating Councils to submit the revised plans. In addition, the application for funding allowed Coordinating Councils to check a box if they had not revised their plans, without requiring them to explain their reasons for leaving their plans unchanged. As a result, the applications for funding may not have always contained the most up‑to‑date information about counties’ juvenile justice strategies and may not have provided stakeholders with the reasons Coordinating Councils did not make changes to their plans for significant lengths of time.
Although Community Corrections revised its template for comprehensive plans for counties to use beginning with fiscal year 2017–18, its current instructions do not require counties to explain any updates to their comprehensive plans or to justify why their plans remain unchanged. Community Corrections explained that it assumes that counties are complying with their JJCPA responsibilities and does not believe that the JJCPA requires counties to explain why they have or have not modified their comprehensive plans. However, we believe that nothing prevents Community Corrections from collecting this information and trying to hold counties accountable for preparing comprehensive plans that provide meaningful, up‑to‑date information regarding their approaches to serving juvenile offenders and at‑risk youth.
County Plans Would Benefit From Defining At‑Risk Youth
The JJCPA requires counties to describe their approaches to responding to juvenile offenders and at‑risk youth in their comprehensive plans. Although it does not explicitly define the term at risk, the JJCPA suggests the term includes youth who are at risk of committing crimes. The JJCPA also does not identify risk factors—which, according to the National Institute of Justice, are preexisting personal characteristics or environmental conditions that increase the likelihood of delinquent behavior or other negative outcomes. For instance, repeated absences from school or an unstable home life are risk factors that may lead to a youth engaging in delinquent behavior, according to the Office of Juvenile Justice and Delinquency Prevention. Because counties must plan for responding to the needs of at‑risk youth, we expected their comprehensive plans to have clearly defined what type of youth they consider to be at risk.
However, four of the five counties we reviewed have not defined in their comprehensive plans the types of youth they consider to be at risk or formally identified the factors that make those youth at risk. Without specific, documented definitions of at‑risk youth, counties cannot effectively complete the required components of their comprehensive plans. Specifically, if counties do not identify the youth who are at risk, their comprehensive plans cannot identify all the resources available or their strategies for responding to those youth. Moreover, stakeholders cannot be certain whom the counties intend to serve, other than juvenile offenders, and which youth may be eligible to participate in both JJCPA‑funded services and other services that the counties provide. Of the comprehensive plans the five counties have submitted since the inception of the JJCPA, only Los Angeles’s included a definition of at‑risk youth, and only in its fiscal year 2019–20 plan. Its definition includes a robust list of risk factors that indicate when a youth is at risk of engaging in delinquent behavior.
According to the Office of Juvenile Justice and Delinquency Prevention, there is no single path to delinquency, but the presence of several risk factors often increases a youth’s chance of offending. Because counties’ youth populations may have unique needs and face different challenges, it may be reasonable for each to have a different definition of at‑risk youth. For instance, one county may focus on preventive programs that address truancy or literacy, while another may focus on rehabilitative services for formerly incarcerated juveniles. Since the Coordinating Councils of four counties we reviewed had not formally defined at‑risk youth in their comprehensive plans, we asked the county probation departments how their counties informally defined at‑risk youth. As Table 3 shows, these four counties’ definitions varied, and none of them formally identified risk factors. For example, we expected the counties to have specified risk factors in a manner similar to those Los Angeles outlined in its definition, which includes cognitive factors, family situations, peer associations, and academic factors.
Table 3: Does the county’s comprehensive plan formally define at‑risk youth and identify risk factors?
Definitions of at‑risk youth used by the counties:
- Youth—both those who have already committed a crime and those who have not—who are at risk of future criminal behavior if their needs are not addressed. This county uses the National Conference of State Legislatures’ description of risk factors that increase a youth’s likelihood to engage in delinquent behavior.
- Youth who do not successfully transition into adulthood—for example, youth whose delinquent behavior could lead them to not complete their high school education or to become involved with the justice system.
- Youth at risk of entering the juvenile justice system or increasing system involvement.
- Youth at risk of being removed from their homes.
Source: Comprehensive plans and interviews with county probation officials.
By defining their at‑risk populations, counties can effectively plan their comprehensive juvenile justice strategies and stakeholders can easily identify appropriate services for youth who are at risk. For example, Los Angeles’s probation department told us that parents often ask what services might be available for their children who are exhibiting delinquent behavior. Counties’ comprehensive plans could serve as helpful resources for stakeholders interested in knowing what services counties provide and the characteristics of the populations they serve. When counties do not specify and publicize who their at‑risk populations are, parents and stakeholders may not know where to turn for services to assist the youth in their care. Likewise, without this definition, the counties themselves cannot demonstrate that they have complied with state law requiring them to develop comprehensive plans that assess existing services for, and include responses to, juvenile offenders and at‑risk youth.
Counties Have Not Demonstrated Whether Their JJCPA‑Funded Programs Are Effective
The counties we visited generally have not demonstrated that the programs they have chosen to operate represent an effective use of JJCPA funds. Although counties have broad discretion to use their JJCPA funds for any element of response to juvenile crime that is proven effective, not evaluating the effectiveness of those uses hinders a county’s ability to maximize the use of the funds. Nonetheless, three of the five counties we reviewed have not evaluated the effectiveness of their JJCPA‑funded programs. Further, although the two other counties—Los Angeles and San Joaquin—contracted with external evaluators for several years to assess the effectiveness of their JJCPA‑funded programs, they did not include the results of the evaluations in their year‑end reports to Community Corrections. As a result, Los Angeles and San Joaquin missed an opportunity to inform decision makers, stakeholders, and other counties about the promising results from their program evaluations.
Four of the five counties we reviewed generally used JJCPA funds for probation department programs, which primarily serve juvenile offenders. As Figure 3 shows, with the exception of Los Angeles, the counties each used more than two‑thirds of their JJCPA funds for probation department programs in fiscal year 2017–18. This was a consistent theme in the four counties’ spending over our five‑year review period. Two of the counties, Kern and Mendocino, used their JJCPA funding solely for programs their probation departments operated, including gang prevention and suppression programs that provide supervision and supportive services to juvenile offenders who are involved with gangs. Kern’s probation department also operated a second JJCPA‑funded program that focuses on increasing efforts to ensure that juvenile offenders successfully transition from custody to their communities. Kern’s probation department indicated that the county funds programs for juvenile offenders because these youth have the most serious needs and require more intensive services to prevent them from reoffending than youth who have not yet committed offenses.
In Fiscal Year 2017–18, Most Counties We Reviewed Spent the Majority of JJCPA Funds on Programs Their Probation Departments Operated
Source: County expenditure records.
In contrast, San Joaquin and Santa Barbara both used JJCPA funds for school‑based programs that their probation departments provided. San Joaquin’s probation department operated a school‑based program that assigns probation officers to specific school sites where they work with school staff to supervise juveniles on probation and to ensure their educational needs are met. Although this program focuses primarily on juvenile offenders, the county’s program description indicates that probation officers at the school sites also have regular contact with at‑risk youth and provide them with intervention and referral services. Similarly, the probation department for Santa Barbara operated a school‑based program that combined probation supervision with counseling opportunities.
In addition, three of the five counties either contract with other local entities and CBOs or coordinate with probation departments or other agencies to operate programs that serve juvenile offenders or at‑risk youth. For example, San Joaquin contracts with a CBO to operate neighborhood service centers that work with juvenile offenders and at‑risk youth and their families by assessing their needs and connecting them with services such as health and nutrition education and counseling. Los Angeles, which has the most diverse blend of service providers for JJCPA‑funded programs, operates an after‑school program to provide juvenile offenders and at‑risk youth with enrichment programs, supervision, and individualized treatment through the coordinated services of CBOs; the probation department; and other local government entities, such as county and city parks and recreation departments and local school districts. It also contracts with a CBO that operates a writing program that teaches interpersonal skills to juvenile offenders subject to long‑term detention in juvenile hall. Santa Barbara, although it operates two probation programs, also coordinates with CBOs and the county behavioral wellness department to provide services to juvenile offenders. For the five counties we visited, we present in Appendix A program descriptions; program expenditures incurred by probation departments, other local government entities, and CBOs; and select demographic information for participants in their JJCPA programs.
Regardless of the programs they choose to operate with JJCPA funds, counties have not submitted meaningful evaluations of the effectiveness of those programs in their year‑end reports. Unlike their comprehensive plans, the year‑end reports that counties submit to Community Corrections must include an assessment of the effectiveness of their JJCPA‑funded programs. Specifically, counties must include descriptions or analyses of how their JJCPA‑funded programs may have contributed to or influenced countywide juvenile justice trends, such as declining arrests. However, the five counties we reviewed did not include such descriptions or analyses in their October 2018 year‑end reports—the most recent reports available during our audit—even though Community Corrections’ reporting template specifically directs them to do so. One county—Kern—failed to identify any juvenile justice trends or how its JJCPA‑funded programs may have affected those trends. The remaining four counties generally described juvenile justice trends within their counties but did not specifically identify whether or how their JJCPA‑funded programs may have affected those trends. For example, in its October 2018 year‑end report, Santa Barbara stated that the number of juveniles referred to probation had decreased, as had juvenile arrest rates, and it noted that these trends were reflective of similar statewide trends. Although the county concluded that its JJCPA‑funded strategies had undoubtedly played a role in the trends, it did not offer evidence to support this assertion. Similarly, Mendocino noted that its juvenile arrest rates had declined but did not specify whether or how its JJCPA‑funded programs might have contributed to this decrease.
Los Angeles and San Joaquin have contracted with external evaluators for several years to assess the effectiveness of some or all of their JJCPA‑funded programs. For our audit period of fiscal years 2013–14 through 2017–18, the counties’ most recent evaluations involved programs they operated during fiscal year 2016–17 and for which they could have reported information to Community Corrections in October 2018. When we reviewed the year‑end reports the two counties submitted to Community Corrections, we were surprised to find that they did not include key findings from their respective program evaluations. For example, San Joaquin’s external evaluator found that juveniles participating in one of its school‑based programs had lower rates of arrest and incarceration. However, in its 2018 year‑end report, San Joaquin made no mention of these positive outcomes. Instead, the county listed juvenile justice statistics and concluded that the programs it operated with JJCPA funds were highly effective, without citing any evidence. In the case of Los Angeles, its external evaluator concluded that participants in its school‑based probation program were less likely to reoffend within six and 12 months after program enrollment than youth on other forms of probation. However, Los Angeles’s 2018 year‑end report did not mention this positive result. Rather, the county briefly described its crime statistics and mentioned that one of its recently funded programs significantly improved educational outcomes for juvenile offenders, but it did not offer evidence for how it had reached this conclusion. By not including details of the reduced arrest and incarceration rates in their year‑end reports, these two counties missed an opportunity to inform decision makers, stakeholders, and other counties about the effectiveness of their use of JJCPA funds.
Counties Can Increase Their Ability to Measure Program Effectiveness by Using JJCPA Funds to Improve Their Data Collection
Since 2017 counties have been required to include in their comprehensive plans a description of data that they intend to use to measure the success of their JJCPA‑funded programs. All five counties we reviewed reported that they planned to use data primarily from their county probation departments’ case management systems, such as sentencing information, to track information and outcomes for participants in JJCPA‑funded programs. However, we found that the counties’ case management systems’ capabilities may be insufficient to track and produce information on program participants. Specifically, when we requested basic information about the participants in JJCPA‑funded programs, the five counties were not always able to provide this information and, in some instances, provided inaccurate information. Without reliable information about the individuals who participated in JJCPA‑funded programs, counties cannot adequately assess the effectiveness of those programs in reducing juvenile crime and delinquency.
Mendocino, Los Angeles, and San Joaquin could not provide any data, such as age, race, or gender, on participants in at least one of their programs for certain fiscal years. Specifically, Mendocino could not provide any information about the participants in its gang intervention program for fiscal years 2016–17 and 2017–18. Similarly, Los Angeles did not collect and therefore could not provide data on participants in one of its largest programs during fiscal year 2017–18—a mental health program on which the county spent roughly $4.5 million in JJCPA funds that year. Los Angeles explained that state law no longer required it to report these data. However, we question why Los Angeles stopped collecting data for this program but continued collecting data on participants in other JJCPA programs that it chose to fund. We also found a case in which Los Angeles could not identify the JJCPA program in which an individual participated or, once it identified the program, how long the individual had participated.
Mendocino and Los Angeles explained that they did not collect data for these programs because in 2017 the Legislature amended state law to remove the requirement to report on specific outcomes. Although the JJCPA no longer requires counties to report program‑specific outcome data, such as the arrest and probation violation rates for program participants, it requires counties to assess the effectiveness of their JJCPA‑funded programs. For example, counties must summarize or analyze, based on available information, how their funded programs may have contributed to or influenced countywide juvenile justice data trends, such as the number of incarcerations within the county. To determine how their funded programs may have contributed to countywide juvenile justice trends, counties must maintain data on participants in those programs.
San Joaquin could not provide data for three of its five programs because the probation department does not track the data for these programs in its juvenile probation case management system and therefore could not compile them for our request. San Joaquin indicated that its external evaluator collects and analyzes the data directly from the CBO operating one of these programs and that the probation department plays no part in this program other than providing funding. Similarly, the probation department explained that the county could not provide data for the other two JJCPA programs, which began operation in fiscal year 2017–18, because the data for these programs exist in its adult case management system and in a separate referral system. The probation department stated that the data for these programs were not formatted in a way that allowed it to retrieve or compile them in response to our request. However, the probation department indicated that moving forward, it will capture or integrate this information into its juvenile case management system and analyze it for the county’s annual JJCPA program evaluation report.
Although Kern asserted that it collects and tracks data on participants in its JJCPA programs, it too was unable to identify all who had participated. The probation department had difficulty identifying all of the participants in its JJCPA‑funded programs because of data issues it attributed to its case management system, issues it was not aware of until we requested the information. In fact, Kern’s juvenile programs probation director informed us that when she assumed her position in 2018, she wanted to determine whether the county’s two JJCPA‑funded programs were the most effective use of JJCPA resources. However, she said, the outcome data available were limited and did not allow for a review of outcome measures for the programs. The probation department stated that the county has had plans since 2015 to implement a countywide criminal justice information system, but it indicated that the county is still working to identify how best to implement such a system. Without accurate data on participants in its JJCPA‑funded programs, Kern cannot assess the effectiveness of those programs toward reducing juvenile crime and delinquency.
Santa Barbara tracks the individuals to whom it provides JJCPA services in its case management system, but it cannot always identify in which of its two programs they participated. As a result, the county cannot consistently assess how effective each of its programs is at reducing the likelihood of at‑risk youth or justice‑involved juveniles committing crimes.
In addition to the limitations we identified with the counties’ data and their case management systems, we found that Los Angeles has been aware of other issues with its data for several years but has not taken the steps necessary to improve its ability to conduct meaningful evaluations of its programs’ effectiveness. Since at least 2013, Los Angeles has contracted with the RAND Corporation (RAND) to evaluate the effectiveness of its JJCPA programs. From fiscal years 2013–14 through 2016–17, RAND consistently reported that Los Angeles did not maintain the information necessary to measure program‑specific outcomes for several programs the county operated with JJCPA funds. For example, Los Angeles’s Department of Mental Health administers an evaluation before and after an individual’s participation in one of the county’s JJCPA‑funded mental health programs to measure any changes in the participant’s overall psychological state. However, because few participants completed both evaluations for this program, RAND indicated that limited information was available to assess the impact of the program. In fact, in fiscal year 2016–17, only 13 percent of the participants in the mental health program completed both of the evaluations. As a result, RAND called into question the appropriateness and reliability of its findings on the effectiveness of this program and any programs that similarly lacked sufficient information. RAND noted that measuring these outcomes can be problematic because the probation department’s data are only as reliable as the information it obtains from the entities that operate programs, such as CBOs and other local entities.
Finally, the counties we reviewed have not maximized their use of JJCPA funding to improve their data collection and tracking efforts. Although counties may use this funding for system enhancements to provide data for measuring the success of their JJCPA programs and strategies, none of the five counties reported doing so over the past five fiscal years. This is particularly troubling given that each county has unspent JJCPA funds. As we discuss in Chapter 2, the counties we reviewed did not spend roughly 4 percent to 14 percent of the JJCPA funds they received from fiscal years 2013–14 through 2017–18. Counties are missing an opportunity to enhance their ability to conduct meaningful evaluations of their JJCPA programs when they do not use available funding to make improvements to data collection and tracking efforts.
To ensure that counties adequately identify how they serve at‑risk youth, the Legislature should require counties to define at‑risk youth—including identifying specific risk factors—in their comprehensive plans.
To ensure that counties comply with juvenile justice planning requirements to serve both juvenile offenders and at‑risk youth, the Legislature should require Community Corrections to review counties’ annual comprehensive plans to ensure that they include an adequate county‑specific definition of at‑risk youth.
The Legislature should direct Community Corrections to monitor counties’ year‑end reports to ensure that they include meaningful descriptions or analyses of how their JJCPA‑funded programs may have contributed to or influenced countywide juvenile justice trends, as required by state law.
To ensure that their Coordinating Councils meet statutory requirements and are transparent to stakeholders, both Mendocino and San Joaquin should develop and implement bylaws for their Coordinating Councils, and Mendocino should reinstate its Coordinating Council.
To determine the effectiveness of their use of JJCPA funds, Kern, Los Angeles, Mendocino, San Joaquin, and Santa Barbara should include in their year‑end reports to Community Corrections descriptions or analyses of how their JJCPA‑funded programs influenced their juvenile justice trends, as required by law.
To adequately assess the effectiveness of their programs at reducing juvenile crime and delinquency, Los Angeles, Mendocino, and San Joaquin should collect data on all participants in each JJCPA program and for each service they provide.
To accurately assess the effectiveness of their programs, Kern, Los Angeles, and Santa Barbara should determine how to accurately identify in their case management systems the JJCPA programs and services in which each individual participates or should enhance these systems to provide this capability.
To ensure that counties’ comprehensive plans are informative and up to date, Community Corrections should revise its comprehensive plan template to require Coordinating Councils to specify plan components their counties are changing and to describe those changes. If a county is making no changes, the template should require the Coordinating Council to explain why no changes to the plan are necessary.
The State Has Not Provided Sufficient Oversight of the JJCPA
In the previous chapter, we identify numerous shortcomings in the counties’ administration and planning of the JJCPA. These shortcomings—which include counties’ lacking Coordinating Councils, not having all the required representatives on their councils, and not always meaningfully updating their comprehensive plans—indicate the importance of effective state oversight. Community Corrections plays a key role in ensuring transparency related to the JJCPA, as state law requires it to collect and post information to its website that counties submit. Although it determines the format in which counties provide that information, Community Corrections does not review their reporting or require the counties to address deficiencies in that reporting. As a result, the value of the information on Community Corrections’ website is diminished. Additionally, state law does not include a mechanism for Community Corrections or any state agency to restrict the counties’ spending of JJCPA funding if they fail to comply with key legal requirements. Finally, although the amounts of JJCPA growth funding that counties receive have increased significantly in recent years, the State does not guarantee the amounts of this funding, and consequently some counties explained that they are hesitant to spend it on long‑term programs. Increasing the guaranteed amount of base JJCPA funds to capture and stabilize some of the growth funding would provide counties with a more reliable source of funding.
Community Corrections Does Not Provide Oversight of Counties’ Implementation of the JJCPA
Community Corrections plays a key transparency role with regard to the JJCPA because of its responsibility to collect and post information from counties to its website. Community Corrections has specified formats for counties to use in reporting their comprehensive plans and year‑end reports, which helps ensure consistency among the counties’ submissions. Consequently, we expected that it would review and assess whether the information it receives from counties is reasonable and provides a meaningful response to the elements the JJCPA requires. However, Community Corrections takes a narrow approach to its role with regard to the JJCPA. Specifically, it believes its JJCPA responsibility is limited to collecting information from the counties, posting that information to its website, and reporting a compilation of that information annually to the Governor and Legislature. Community Corrections states on its website that it will not review or make any changes to information counties submit.
We reviewed the fiscal year 2013–14 through 2017–18 year‑end reports that counties submitted to Community Corrections and found several instances in which counties did not report information correctly. For example, six counties reported in their October 2017 and 2018 year‑end reports that they operated a JJCPA‑funded program titled Salaries and Benefits. One of these counties—Calaveras County—explained in its description of its Salaries and Benefits program that it placed minors into one of two JJCPA programs, early intervention or intensive supervision. Although some expenses in Calaveras County may indeed have been for salaries and benefits to operate its two programs, we question why Community Corrections did not follow up with Calaveras or with other counties that similarly misreported their programs.
Moreover, although Community Corrections provides on its year‑end report template 35 program expenditure categories for direct services—such as after‑school services, gang intervention, and substance abuse screening—we found that counties relied heavily on nonspecific categorizations that were not helpful in determining the types of programs they operated. Specifically, we identified more than 200 instances in the past five fiscal years in which counties categorized their program expenditures as Other Direct Service. Of those, we identified nearly 80 instances in which counties could have categorized the activities as school‑based or truancy programs, which are not currently categories. We believe the counties’ overreliance on the category Other Direct Service reduces the usefulness of categorizing programs. If Community Corrections expanded its list to include more program categories, such as school‑based and truancy programs, other counties and stakeholders may find more value in its website as they search for specific types of programs.
In response to these issues, Community Corrections stated that it does not consider overseeing how counties name and describe their programs as part of its role. We believe these issues would be relatively simple for counties to correct if Community Corrections conducted a review of the information they submit to ensure that they have accurately reported and appropriately categorized their programs. Community Corrections could then request counties to fix the identified issues. By not reviewing the information counties submit, Community Corrections is missing an opportunity to expand its list of program classifications so that counties can appropriately categorize their programs and key stakeholders can properly identify them.
Although we believe it should oversee the information counties report to it and request that counties fix reporting errors, Community Corrections has no authority to compel counties to comply with key requirements of the JJCPA. Until 2017 state law required Community Corrections to review and approve only those comprehensive plans that fulfilled the plan requirements of the JJCPA. The law also prohibited the counties from allocating JJCPA funding until Community Corrections had approved their comprehensive plans. However, an amendment to state law that took effect in 2017 generally removed the requirement for Community Corrections to approve comprehensive plans, and thus the law no longer requires that the counties’ spending of JJCPA funds be contingent on approval from Community Corrections. Consequently, counties that do not meet the requirements of the JJCPA continue to receive and spend funding. Specifically, we identified up to 11 counties that may not have Coordinating Councils but have reported to Community Corrections that they are using JJCPA funds. Although Alpine County did not have a Coordinating Council, it did not spend any of its JJCPA funds during our audit period. Alpine County stated that it does not currently participate in the JJCPA but that it is considering establishing a Coordinating Council in the future so that it can spend JJCPA funds. To compel counties to comply with the requirements of the JJCPA, state law needs to provide authority for the State to prohibit counties from spending funding until they meet those requirements.
Because it receives the comprehensive plans and determines the format in which counties report those plans, Community Corrections is in a good position to provide oversight of counties’ implementation of the JJCPA. Specifically, Community Corrections should modify its template for comprehensive plans to require counties to report about their Coordinating Councils, thereby taking steps to mitigate the risk that a county would submit a plan that a Coordinating Council has not approved. Moreover, Community Corrections could take action to identify and mitigate other shortcomings we identified in counties’ implementation of the JJCPA that we list in Table 4. Taking such actions would help ensure not only that counties comply with state law but also that they meaningfully plan for and report on their JJCPA expenditures.
Table 4
Counties Did Not Always:
- Have Coordinating Councils
- Have all required representatives on Coordinating Councils
- Meaningfully update their comprehensive plans
- Include a definition of at‑risk* youth in their comprehensive plans
- Include meaningful descriptions or analyses of the effectiveness of their JJCPA programs in their year‑end reports
- Accurately report information in their year‑end reports to Community Corrections
Source: Counties’ documentation regarding their implementation of the JJCPA, interviews with county probation departments, and information Community Corrections collects from counties.
* AB 413 (Chapter 800, Statutes of 2019) deleted the term “at-risk” used to describe youth for purposes of various provisions in the California Education and Penal Codes and replaced it with the term “at-promise.” However, the term “at-risk” currently remains in JJCPA as part of the California Government Code. As a result, we use the term “at-risk” consistent with the JJCPA throughout our report.
Community Corrections Is Not Maximizing the Usefulness of the Information It Collects From Counties
State law requires Community Corrections to collect and post to its website a description or summary of the programs, strategies, and system enhancements that the counties have supported with JJCPA funds. Community Corrections is also required to submit an annual report to the Governor and Legislature that summarizes this information, along with countywide trend data. These requirements are part of Community Corrections’ mandate to collect and maintain information related to juvenile justice so that the public is aware of the impact of state and local programs on juvenile justice and so that local entities can access information about promising practices and innovative approaches to reducing juvenile crime and delinquency. As a result, we expected Community Corrections to maximize the utility of county‑reported data by presenting the JJCPA information on its website in a manner that enables users to review and compare the program information from multiple counties. However, Community Corrections does nothing beyond posting on its website the individual reports that counties submit, without synthesizing the information in those reports in a manner that is helpful to users.
According to Community Corrections, it posts counties’ information in the format in which it was submitted because it interprets its statutory responsibility to post a description or summary of counties’ JJCPA information narrowly. Moreover, Community Corrections has not calculated the cost of organizing and displaying JJCPA information on its website in more useful ways and therefore has not determined whether it would need additional resources to do so. Community Corrections already displays other statewide data, such as grants counties receive and jail population trends, on its website in a manner similar to the interactive graphic we describe later in this section. Therefore, it seems reasonable for Community Corrections to present JJCPA information in a way that adds value to the individual counties’ submissions by aggregating and enabling users to navigate the data. Doing so would help Community Corrections further satisfy its duty to identify and promote evidence‑based and innovative programs by enabling users to conduct searches or compare the programs counties are operating using JJCPA funding. At the least, it should determine the resources necessary to make this change. By simply posting to its website the information that counties submit, Community Corrections has missed an opportunity to provide local entities with a valuable resource.
Some of the counties we visited told us that Community Corrections could improve the information it displays about the programs that other counties are funding with their JJCPA allocations. According to Santa Barbara, for example, it would be helpful if Community Corrections provided more detail about individual programs, such as how a county funded the program and who operated it. Mendocino said that it would be helpful if Community Corrections provided additional analysis of the information that counties submit instead of serving only as a repository of documents. Similarly, San Joaquin agreed that it would be helpful and increase transparency if Community Corrections’ website allowed counties to easily compare information about JJCPA‑funded programs that counties operate. A more useful and navigable display would allow users to search for a specific program type, such as gang intervention, and would summarize information about gang‑intervention programs from other counties across the State. Community Corrections’ website could then provide a summary of program descriptions, funding levels, juvenile trend data, and the counties’ opinions about how these programs influenced juvenile trends. Users could then compare similar programs operated by multiple counties.
Using expenditure information from Community Corrections’ website, we created an interactive graphic on our website that allows users to search for programs operated by a specific county or to search for similar programs that fall under the same expenditure category operated by any county. To view a display of program budgets and information for all counties that participate in the JJCPA, visit our interactive dashboard in the online version of this report at www.auditor.ca.gov/reports/2019-116/supplementalgraphic.html. Because Community Corrections does not review or correct information counties submit but instead relies on them to submit accurate information, our graphic may contain some inaccuracies. Nonetheless, we believe it provides users with easy access to financial information for all of the JJCPA programs each county operated from fiscal years 2013–14 through 2017–18. In particular, users can select an individual county to view a summary of that county’s JJCPA‑funded programs. Users can also select a specific program type to view summary information about those programs, including which counties operated them. Community Corrections could incorporate other data that it collects from the counties, such as program descriptions and data the county used to measure program effectiveness, to increase the utility of the comparison between counties beyond the financial information we present. Given that it already collects the information counties report, Community Corrections is best positioned to provide additional value by presenting that information in a manner that enables users to easily review how counties across the State use JJCPA funds to address juvenile crime and delinquency. Moreover, Community Corrections has the capability to develop a more robust presentation of JJCPA information because it currently presents other statewide information using the same software that we used to create our interactive graphic.
The Current JJCPA Funding Process Is Not Predictable and Should Be Improved
As we describe in the Introduction, the State provides counties with JJCPA funding through an annual guaranteed amount, referred to as base funding, and—if funds are available—an additional variable amount, referred to as growth funding. Because growth funding relies on several factors that can change from year to year, it is not predictable. The amount of annual growth funding the State has provided to counties has increased significantly since fiscal year 2014–15 and represented about one‑third of the $159 million in total JJCPA funding counties received in fiscal year 2018–19. Because counties have difficulty anticipating how much JJCPA growth funding they will receive each year, they did not spend their total JJCPA allocations during our five‑year review period. To encourage counties to spend more of their JJCPA funding each year, the Legislature should act to stabilize the amount of JJCPA funding it allocates to counties.
The State allocates motor vehicle license fee revenues to a number of sources, including the Enhancing Law Enforcement Activities Subaccount (ELEAS account) that provides funding for local law enforcement activities. As Figure 4 shows, state law directs $490 million from motor vehicle license fees to the ELEAS account each year. From this account, state law designates $107 million as the initial allocation, or base funding, to counties for the JJCPA, which the State Controller’s Office pays to counties at regular intervals during each fiscal year. Since fiscal year 2012–13, the State has consistently provided the same amount of base funding to counties, which it allocates based on county populations. However, state law also provides for an additional allocation of funding for local law enforcement activities—including the JJCPA—in the event the State collects more motor vehicle license fees than required for its initial allocations. Once the ELEAS account reaches the established limit of $490 million, the State deposits additional funds into a separate growth account, then allocates this growth funding to counties in the same manner as the base funding.
The State Provides Counties With Both Base and Growth JJCPA Funding
Source: Government Code and Revenue and Taxation Code.
The annual amount of growth funding the State provided to counties increased by $53 million from fiscal years 2014–15 through 2019–20. The annual growth funding allocations depend on the amount the State collects in vehicle license fees, which reflects the number of vehicles purchased during the year and the market value of each vehicle. In addition, the local law enforcement allocation is only one of the motor vehicle license fee allocations established in state law. As a result of these factors, the amount of JJCPA growth funding the State provided counties in fiscal years 2014–15 through 2019–20 varied significantly, as Figure 5 shows. For example, the State distributed almost $7 million in JJCPA growth funding to counties in fiscal year 2014–15, nearly $16 million during fiscal year 2015–16, and $60 million in fiscal year 2019–20. Because base funding is fixed at $107 million, the growth funding the State allocated in fiscal year 2019–20 represents about a third of the counties’ JJCPA allocations.
Growth Funding Increased Significantly From Fiscal Years 2014–15 Through 2019–20
(Dollars in Millions)
Source: State law and payment records from the State Controller’s Office.
* If there are no changes to state law governing JJCPA allocations, the State will allocate $107 million in base funding to counties in fiscal year 2019–20.
The five counties we visited have not spent all of the JJCPA funding the State has provided. The State provides JJCPA funding to counties based on their populations. For example, Los Angeles, which had roughly one quarter of the State’s estimated total population for 2017, received more than $212 million in total JJCPA funding from fiscal years 2013–14 through 2018–19, while Mendocino received just $1.9 million during the same period because it has a much smaller population. However, none of the counties spent all of the JJCPA funding they received, as Figure 6 shows. When we asked the probation departments at the five counties why they had not spent all of their JJCPA funds, Kern, Los Angeles, and Santa Barbara indicated that they had not done so because of the variability in the amount of growth funding they receive each year. Mendocino explained that it had not spent growth funds because it had not yet fully spent the base funds it received and accumulated in years before our audit period. Finally, San Joaquin stated that there is a chance that the State could reduce or eliminate some funding.
Counties Have Not Spent All of the JJCPA Funds They Received From Fiscal Years 2013–14 Through 2017–18
Source: State Controller’s Office payment records and county accounting records.
Because the State does not guarantee the amount of growth funding counties receive, some counties informed us that they limit how they use the growth funds. For instance, in accordance with a county policy, Los Angeles has allocated its growth funding to what it considers to be one‑time uses of funds. San Joaquin’s chief probation officer stated that it maintains a 12‑ to 18‑month reserve because the State could reduce or eliminate some funds it provides to counties. Santa Barbara, which spent nearly all of the JJCPA base and growth funding it received from fiscal years 2013–14 through 2017–18, is also concerned that amounts of growth funding may decrease in the future. In fact, Santa Barbara explained that it is currently considering what actions it may take if it does not receive growth funding or if growth funding is reduced in future years because of a recession.
The counties’ approaches to managing growth funds indicate the challenge that the variability of this funding presents to them. Increasing the JJCPA base funding amount would enhance counties’ ability to accurately predict their future JJCPA allocations because the State would guarantee a greater amount of total JJCPA funding in law. This change could allow counties to rely on a greater percentage of their total JJCPA funds as a stable source of funding. In fiscal year 2014–15, growth funds represented just 6 percent of the total JJCPA funds that counties received; by fiscal year 2019–20, they represented more than one‑third. Moreover, the annual amount of growth funds the State allocated to counties consistently increased from fiscal years 2014–15 through 2019–20. Although the counties we reviewed did not spend all of the JJCPA funds they received, which include both base and growth funding, from fiscal years 2013–14 through 2017–18, all five counties spent more than the total base funds they received during those five years. If the Legislature were to use some growth funds to increase the amount of base funding counties receive, counties could realize a better balance between stable base funds and less predictable growth funds. Because state law requires Community Corrections to collect JJCPA expenditure information from counties annually, we believe it is well positioned to determine an appropriate higher amount of base funding.
Recommendations
To enable Community Corrections to provide effective oversight of the required elements of the JJCPA, the Legislature should amend state law to establish a process for restricting the spending of JJCPA funds by counties that do not meet the JJCPA’s requirements. As part of that process, the State should prohibit counties that have not established Coordinating Councils from spending JJCPA funds.
To make JJCPA funding more stable and predictable, the Legislature should amend state law to increase the amount of guaranteed JJCPA funding the State provides to counties. If the Legislature decides to stabilize JJCPA funding, it should direct Community Corrections to evaluate the expenditure information counties submit and identify an appropriate amount of base funding. The Legislature should further direct Community Corrections to assess every five years the percentage of total JJCPA funds that growth funds represent to determine whether the base funding needs to be adjusted.
To ensure that counties include accurate information in their comprehensive plans and year‑end reports, Community Corrections should review the information counties submit to it and follow up with them to obtain missing information or to clarify information that seems incorrect.
To better promote effective local efforts related to the JJCPA, Community Corrections should include on its website the capability for stakeholders, counties, and other interested parties to review and easily compare the JJCPA information of multiple counties. Specifically, its website should allow users to select a specific type of JJCPA‑funded program and easily review the information counties submitted for all programs of that type. Community Corrections should determine the cost of providing this capability and, if necessary, request additional resources.
We conducted this performance audit in accordance with generally accepted government auditing standards and under the authority vested in the California State Auditor by Government Code section 8543 et seq. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on the audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
ELAINE M. HOWLE, CPA
California State Auditor
May 12, 2020