October 17, 2019
Elaine M. Howle
California State Auditor
621 Capitol Mall, Suite 1200
Sacramento, CA 95814
Re: California State Board of Education Response to Audit Report 2019-101, “K-12 Local Control Funding”
Dear Ms. Howle,
On behalf of the State Board of Education, I appreciate the opportunity to respond to the California State Auditor’s report, entitled “K-12 Local Control Funding.”
Before addressing the substance of the report, I would first like to express appreciation for the professionalism and diligence of the audit team assigned to this audit. Implementation of the Local Control Funding Formula (LCFF) has been a complex undertaking that the State Board of Education (Board) takes very seriously. LCFF implementation involves practices that are complex in their own right, including local budget decisions and implementation of instructional programming. Moreover, throughout the implementation process, the Board has proactively solicited and balanced the perspectives of the diverse education stakeholders across the state. That background and context are important to understanding the present policy circumstances, and we appreciate the audit team’s effort to situate their work within that background and context, as well as their openness to considering issues that our team raised.
I also appreciate that the audit report acknowledges that Board staff and staff from the California Department of Education (CDE) are working on a revised template for the Local Control and Accountability Plan (LCAP) to implement changes to the statutes governing the LCAP template made by AB 1840, Chapter 426, Statutes of 2018. That legislation was enacted to improve the LCAP template based on lessons learned over the first few years of LCAP implementation.
We expect that the Board will adopt a new LCAP template at its January 2020 meeting. It is our hope that the revised template will enhance transparency around how funds are used within the LCAP. The anticipated template revisions will ensure the LCAP development process supports more meaningful evaluation of underlying performance in consultation with local stakeholders, prioritization of actions and budgetary resources in response to needs identified, and ongoing monitoring of effectiveness of those actions in improving opportunities and outcomes for students.
The enclosed attachment includes detailed responses to the audit report recommendations.
Karen Stapf Walters
California State Board of Education
The Legislature delegated to the State Board of Education (Board) key policymaking decisions related to Local Control Funding Formula (LCFF) implementation, and the Board has dedicated substantial time and attention to that important work. The perspective gained from that work over the last six years provides important context and insight as the Legislature and the public evaluate the findings from the audit and recommendations contained in the audit report.
Additionally, AB 1840, Chapter 426, Statutes of 2018, requires the Board to make significant changes to the Local Control and Accountability Plan (LCAP) template by January 2020. Local educational agencies (LEAs) (school districts, county offices of education, and charter schools) have not developed LCAPs using the new template that reflect the statutory changes enacted last year, and the audit therefore could not include review of LCAPs adopted under the new LCAP template. As a result, details on the work to update the LCAP template provide further context for the findings and recommendations included in the audit report.
One of LCFF’s key innovations was to shift the focus of state accountability from “inputs” to “outputs.” Instead of focusing on whether districts are simply spending money within a categorical program, LCFF holds districts accountable for improving opportunities and outcomes for students. This innovation has also led to significant changes in local planning and budgeting practices by bringing a more explicit focus through the LCAP on whether the decisions LEAs make about how to use their limited resources are improving student outcomes.
This shift in state policy responded to decades of experience with the former categorical approach, under which local accountability was driven by year-to-year accounting procedures and compliance monitoring rather than a focus on whether spending decisions lead to improved outcomes. A few concrete examples illustrate some of the limitations of the former categorical approach and the potential challenges of using such an approach within the context of the LCAP.
- An LEA’s LCAP sets a goal of improving student attendance and proposes an action to hire a new counselor. Because the identity of that person is unknown, the LCAP lists a planned expenditure of $100,000, which is in the middle of the pay scale. If the counselor is hired at the bottom of the pay scale, the actual expenditure is $70,000. Although the position is staffed and the services are provided as planned, under a “categorical-type approach” the LEA would be required to “make up” $30,000 in additional expenditures the following year. On the other hand, if the LEA hired a counselor at the top of the pay scale who earned $130,000, then that over-expenditure would need to be funded out of a non-restricted source, impacting its ability to fund other services.
- Another hypothetical illustrates a different challenge. An LEA’s LCAP proposes to hire three new counselors, but the LEA is unable to fill one of those positions in the first year. Under a “categorical-type approach,” the full cost of that unfilled position would carry over on a one-time basis to the following year. Hiring a fourth counselor would not be fiscally responsible, since the funding for that position is one-time.
Viewed in isolation as a single action within an LCAP, these scenarios may not seem significant, but across an entire LCAP this return to a categorical-era focus on actual spending rather than improved outcomes could pose serious challenges and substantially impact the LEA’s budgeting process. Experience from past categorical programs underscores that LEAs and schools sometimes struggle to identify meaningful and useful ways to expend time-limited, one-time dollars. LEAs generally spend 80 to 85 percent of their budgets on personnel, which are mostly ongoing costs. Much research about improvement in education settings has underscored the importance of sustainability and continuity.
The possibility that LEAs might be under-delivering for the student groups that generate additional funding is a concern to the Board. Although a categorical-type approach focused only on expenditures may have the advantage of being easy to tabulate, there is risk both that such an approach oversimplifies the relevant question (are dollars being spent versus are students receiving better services, either in quantity or quality) and that this oversimplification would turn the LCAP into an accounting exercise instead of a planning document focused on improving opportunities and outcomes for students.
As the audit report acknowledges, the Board is required to update the LCAP template by January 2020 in response to amendments made by AB 1840, Chapter 426, Statutes of 2018. That legislation, which reflected a compromise negotiated through the budget process, required significant changes to the LCAP template intended to enhance transparency around the use of funds within LCAPs, including the requirement to increase or improve services for low-income students, English learners, and foster youth.
Board staff and California Department of Education (CDE) staff are in the process of developing a recommendation for the revised template for consideration by the Board at its January meeting. As part of this revision process, staff have convened several stakeholder sessions in order to receive feedback and suggestions on proposed changes. Proposed changes include:
- A new requirement that LEAs aggregate planned expenditures, and estimated actual expenditures, for all actions included for each goal within an LCAP, including source of funding for those expenditures.
- A new requirement to aggregate the expenditures associated with actions that increase or improve services for low-income students, English learners, and foster youth, and to show that aggregated total in conjunction with the estimated additional revenue the LEA receives under LCFF based on those students.
Recent legislation also requires LEAs to include, with their adopted LCAP, an LCFF Budget Summary for Parents using a template developed by the Superintendent of Public Instruction. The summary requires LEAs to detail the total planned expenditures on actions that increase or improve services for low-income students, English learners, and foster youth; the estimated additional revenue the LEA receives under LCFF based on serving those students; and how services for those student groups are improved if the total planned expenditures are less than the estimated additional revenue. The LCFF budget summary document also compares the total planned expenditures on actions to increase or improve services with the total estimated actual expenditures on those actions, and requires LEAs to explain how any decline in actual expenditures impacted the LEA’s ability to deliver increased or improved services.
The audit report notes concerns that stakeholders cannot easily and systematically see how funds generated by low-income students, English learners, and foster youth are being spent within the LCAP and whether the actions planned to benefit those students were implemented as planned. We believe the revised LCAP template will address these concerns. The new template will consolidate, in one place, expenditures associated with all actions within the LCAP, broken down by funding source. The actions that contribute to increased or improved services will be clearly marked, and the template requires the expenditures for those actions to be totaled and compared to the additional funding generated by low-income students, English learners, and foster youth. To the extent stakeholders or policymakers desire to understand how LCFF funds support those actions, the total expenditures can be disaggregated by funding source based on the LCAP expenditure table.
In response to feedback from stakeholders, the new template that the Board will consider in January 2020 will also require LEAs to identify within the Annual Update all significant differences between planned actions and implemented actions, in addition to material differences between planned expenditures and actual expenditures. This new requirement, which has no counterpart in the current LCAP template, will enhance transparency as to whether an LEA implemented the actions it said it would and, if not, require an explanation for the departure.
As intended by AB 1840, the new LCAP template will provide enhanced transparency as to whether expenditures on actions to benefit low-income students, English learners, and foster youth are on par with the additional funding the LEA receives. The new template will provide this information without requiring a profound shift in the underlying policy behind LCFF and will therefore maintain as a primary focus whether the additional funds provided under LCFF are used to increase or improve services provided to high-need students.
It is also important to note that early evidence suggests that LCFF is, in fact, leading to improved outcomes for the students who generate the additional funds. Over the last 18 months, several researchers have evaluated whether “LCFF is working” using data-driven methodologies, with two showing positive evidence that the academic performance of the student groups that generate additional funds under LCFF has improved at a greater rate in school districts that have higher concentrations of those students (and therefore receive additional funds under LCFF) and a third showing strong academic gains for California relative to other states.
- Learning Policy Institute, 2018 (https://learningpolicyinstitute.org/product/ca-school-finance-reform-brief): Increased LCFF funding and the greater share of unrestricted funding that LCFF provided are correlated with greater gains in graduation rates and student performance on CAASPP, with particularly strong improvement on graduation rates and math for low-income students in those districts that receive additional funds for those students under LCFF.
- America’s Promise Alliance, 2019 (https://www.americaspromise.org/2019-building-grad-nation-report): California is one of three states for which improvement in graduation rates correlates with gains in three other measures of academic proficiency. The authors suggest this correlation shows that the graduation rate gains are accompanied by real gains in student knowledge and preparation, rather than lower standards.
- Learning Policy Institute, 2019 (https://learningpolicyinstitute.org/product/positive-outliers-districts-beating-odds-report); Learning Policy Institute, 2019 (https://learningpolicyinstitute.org/sites/default/files/product-files/Positive_Outliers_Qualitative_REPORT.pdf): A number of LEAs are beating the odds compared to their peers, showing gains for African-American and Latino students under LCFF and the new state academic standards. One factor leading to this improvement is the flexibility around use of funds that LCFF ushered in, and these LEAs consistently used that flexibility to recruit, support, and retain a strong teacher workforce.
A 2019 Public Policy Institute of California study of inputs under LCFF also shows that LEAs are using their LCFF funds consistent with the intent to increase or improve services for high-need students: https://www.ppic.org/publication/school-resources-and-the-local-control-funding-formula-is-increased-spending-reaching-high-need-students/. Although that study highlights continued challenges in equal access to qualified and experienced teachers, it shows that schools with more high-need students are receiving greater staffing resources under LCFF, on average, even if overall expenditures may not be substantially higher due to lower salaries paid to less experienced teachers who are often assigned to these schools.
Recommendations to the Board
The audit report includes four recommendations to the Board.
Recommendation #1: Annual Update: Merge with Goals Section within the LCAP. The audit report recommends that the Board merge the Annual Update section with the Goals, Actions, and Services section.
Board staff anticipate recommending that the Board adopt a revised template at its January 2020 meeting that integrates the annual update and the LCAP consistent with this recommendation.
AB 1840, Chapter 426, Statutes of 2018, substantially restructured the LCAP template statutes and consolidated the LCAP and Annual Update into a single section of statute. Under prior law, the annual update had been addressed in a separate code section from the LCAP itself, which limited the Board’s ability to integrate the annual update within the LCAP. The updated statute specifies that the LCAP and annual update can be part of the same template, which will allow the template to embed the progress monitoring features of the annual update within the planning sections.
We believe that this change will make it easier for stakeholders to understand whether the actions are being implemented as planned and how those actions are impacting opportunities and outcomes for students. It will also reinforce the expectation that the LCAP process support strategic planning, which will help LEAs monitor progress and evaluate whether the planned actions are improving student outcomes. Finally, this revision will substantially reduce the length of LCAPs, which should help improve transparency and accessibility for stakeholders.
Recommendation #2: LCAP Annual Update: Evaluating Implementation of Individual Actions. The audit report recommends that the Board amend the LCAP template to require LEAs to include analysis of the effectiveness of each individual action included in the LCAP, in addition to analyses for overarching goals.
Board staff do not anticipate recommending that the Board adopt a revised template that requires LEAs to evaluate the effectiveness of each individual action included in the LCAP, for several reasons.
First, such an approach assumes a linear causal chain between each individual action and a particular student outcome. Consistent with research and practical experience in education policy, several studies analyzing LCAPs have found that multiple, individual actions work together to support a broader goal to improve performance on a set of interrelated metrics.
For example, a school district might set a goal of improving literacy in third grade and identify several actions and services to achieve that goal, such as adopting a new instructional program; providing professional learning to teachers, administrators, and other personnel to implement that program effectively; and providing new instructional materials. Additionally, the district may adopt related actions such as hiring new counselors to support struggling students, hiring attendance counselors to help improve student attendance, or purchasing new data systems to provide teachers with analytics on individual student performance. This recommendation would artificially force LEAs to view each action in isolation, which is as likely to undermine meaningful evaluation of programmatic effectiveness as to enhance it.
Additionally, the audit report correctly notes the challenge presented for stakeholders when LCAPs are hundreds of pages long. A number of the changes to the LCAP template that staff expect to present to the Board in January 2020—such as integrating the annual update with the LCAP, incorporating summary tables for expenditures within the LCAP, and providing the required justification for LEA-wide and schoolwide actions that contribute to increased or improved services in a single section—will reduce the overall length of LCAPs and make it easier for stakeholders to see, in one place, key information about the relevant portion of the LCAP.
This recommendation is at odds with that goal. For a hypothetical LCAP with 120 actions, this recommendation would entail 120 fields of new text, which likely would be repetitive. Moreover, the audit report notes that some LCAPs include formulaic responses to narrative prompts that do not provide meaningful analysis or information for stakeholders. Mandating action-by-action analysis seems to invite precisely this type of approach.
Responsive to the underlying concern, the recommended LCAP template instructions will include language specifying that LEAs may group actions under a goal with a set of metrics and encouraging LEAs to do so if there are multiple, unrelated actions included under a single goal. This grouping would allow for more robust analysis of whether the strategy the LEA is using to impact a specified set of metrics is working.
Recommendation #3: Increased or Improved Services: Schoolwide and Districtwide Actions. The audit report recommends that the Board update the LCAP instructions to include key information from CDE’s Uniform Complaint Procedure (UCP) appeal decisions related to LCAPs. Specifically, the audit report references UCP decisions related to the requirement that LEAs explain how districtwide and schoolwide actions are principally directed toward serving low-income students, English learners, and/or foster youth if those actions are used to demonstrate increased or improved services for those student groups.
The revised LCAP template presented to the Board in January 2020 will include instructions that reflect these appeal decisions.
Additionally, Board staff will work with the CDE to include this information on the “Frequently Asked Questions” section of CDE’s website related to LCFF and the LCAP, and to include relevant information from any future UCP appeal determinations.
Recommendation #4: Accessibility of Language. The audit report recommends that the Board “instruct districts to ensure that their LCAPs are sufficiently clear and effective, including but not necessarily limited to ensuring that districts articulate a logical connection between their needs and goals, provide sufficiently detailed descriptions of services within the LCAP’s analysis section, and are written in a manner that is easily understandable.”
The revised LCAP template presented to the Board in January 2020 will include instructions emphasizing the LCAP’s purposes, which include ensuring that stakeholders can clearly see and understand how the LEA is aligning its budgetary resources in response to performance across the statutorily defined priorities and whether those strategies are working to improve opportunities and outcomes for students. The recommended instructions will also detail the purpose of each LCAP section to reinforce both the importance of conveying the information in each section understandably to stakeholders and how that particular information reinforces the planning process that is ultimately supposed to be memorialized in the adopted LCAP. Finally, the recommended instructions will encourage LEAs to avoid jargon and to review language in draft LCAPs for accessibility to non-educators and the broader public.
Although not addressed in the recommendations, the audit report includes a discussion of the transition from the state’s prior revenue limits and categorical funding system to LCFF. The report calculated the LCFF funding that LEAs received prior to 2018, as the state transitioned toward fully funding LCFF, using a breakdown of base, supplemental, and concentration funds within the LCFF formula that differed from the method LEAs applied pursuant to expenditure regulations adopted by the Board. The alternate calculation in the audit report assumed that the share of LCFF funds attributable to the supplemental and concentration factors would immediately transition to fully funded levels, prorated to the LEA’s then-total level of funding.
When LCFF funding began in 2013-14, the state was just recovering from the Great Recession and LEAs had gone through years of significant budget cuts. LCFF was a significant restructuring of the method under which LEAs received state funding, and the statutory formula included a gradual transition that would phase in full funding over a period of several years. The Board’s regulations reflected this gradual approach adopted by the Legislature: each LEA was required to calculate its obligation to increase or improve services annually relative to a baseline tied to actions that it provided to one of the three student groups that generate additional LCFF funds in the year immediately preceding LCFF’s enactment.
Notably, stakeholders did not submit comments in the regulatory process recommending that the Board adopt an approach similar to the alternative presented in the audit report, likely due to the recognition of how disruptive such an approach would have been. Using just one example from Figure 7 of the report, without a gradual phase-in period, the three school districts audited would have had to redirect $140 million, overnight, to actions that “increase or improve services.” However, a significant portion of that funding would have been budgeted in the prior year for core programs or other activities that would not meet the regulatory standard for increasing or improving services. The audit report’s alternative method of calculation would likely have resulted in substantial cuts to LEAs’ core programs, including layoffs of personnel, and implementation of a host of new programs all at once, which would likely have undermined the effectiveness of those programs.
CALIFORNIA STATE AUDITOR’S COMMENTS ON THE RESPONSE FROM THE CALIFORNIA STATE BOARD OF EDUCATION
To provide clarity and perspective, we are commenting on the State Board’s response to our audit. The numbers below correspond to the numbers we have placed in the margin of the State Board’s response.
An underlying theme within the first two pages of the State Board’s response is that implementation of some of our recommendations may lead the State down a path toward a return to categorical funding. During our work on this audit, representatives from the State Board and CDE made similar statements on various occasions; namely, that monitoring of district spending equates to categorical funding. We fundamentally disagree with that notion; in fact, we state in the Audit Results that tracking information about districts’ spending of their LCFF allocations does not represent a return to categorical funding.
A critical point of our recommendations from this audit is that the State needs to better establish the linkages among three key components of the LCFF process: funding from the State (or inputs) to districts, the services that districts purchase with that funding, and improvements in student educational performance (outcomes). We depict this point in Figure 9. Ignoring the linkages between these LCFF components or ignoring the inputs to the LCFF process significantly reduces stakeholder assurance that the billions the State invests annually in LCFF—$62 billion for fiscal year 2018–19 alone—have the desired effect of improving student achievement.
As we mention in the Audit Results, by collecting and reporting additional information about the districts’ use of supplemental and concentration funding, the State could better ensure that it and other stakeholders understand how the districts’ spending of these funds affects intended student groups and whether further action is necessary to close persistent achievement gaps.
Contrary to the State Board’s assertion, tracking the districts’ spending of LCFF funding is not merely “an accounting exercise.” As we indicate in Comment 1, the State needs to better establish the linkages between funding, services, and student achievements. Furthermore, we explain in the Audit Results that the current requirement that a district must spend supplemental and concentration funding to increase or improve services by a specific percentage is essentially meaningless because it is unclear how a district would demonstrate such improvement and neither the county offices nor CDE is responsible for verifying that the district actually achieved the specific percentage increase in services. This approach reduces transparency and accountability by leaving stakeholders without a legitimate, tangible measurement against which to hold districts accountable for using the funding they receive to provide services to improve student achievement.
The State Board takes a narrow view of our recommendation. We recommend that it require districts to include analyses of the effectiveness of individual services, in addition to analyses for overarching goals, so the broader perspective would not be lost. Our recommendation is consistent with state law, which, as we state in the Audit Results, requires the LCAP template to include an assessment of the effectiveness of specific services described in the LCAP toward achieving the goals. The analyses of individual services would allow districts to highlight the effectiveness of particular services. In fact, Oakland Unified provided precisely this kind of analysis in its LCAP, as we state in the Audit Results. Without this kind of information, it can be difficult to determine, from among dozens of services provided, which particular services were effective in improving outcomes.
We disagree. Rather than being repetitive like the current template, the analysis we recommend would provide unique, critical information that would enable stakeholders to hold districts accountable to use their limited resources to continue funding effective services and discontinue ineffective services. As we indicate in the Audit Results, the excessive length of LCAPs results from districts including descriptions of numerous services, which obscures whether any particular service was effective. We believe having 120 services/actions for a single goal would reduce clarity for stakeholders, as we had difficulty with the 38 services that Clovis Unified included for one of its goals.
We appreciate the State Board’s perspective on the State’s transition period for LCFF. Because a key part of our audit included examining how districts spent their LCFF funding, we estimated the difference in the results between the funding method used during transition and the method the State will use upon full implementation. Additionally, as we state in the Audit Results, the State’s decision to base supplemental and concentration funding amounts on prior year spending rather than proportions of intended student groups likely deferred improvements in performance outcomes for intended student groups.
We believe the State Board overstates the disruption of the approach we outlined, which applies the funding formulas described in state law. Our analysis of the three districts’ funding revealed that by basing supplemental and concentration funds on the proportions of intended student groups, districts would not have lost funding for their core programs. Rather, at that time, districts would have faced decisions about which categorical programs to retain to increase and improve services for intended student groups and how to use their new flexibility to address local needs. In fact, the three districts continue to provide services similar to those provided under some of the former categorical programs and fund them with supplemental and concentration funding by principally directing them toward intended student groups. As we state in the Audit Results, the approach the State chose likely delayed improvements in performance outcomes for intended student groups.