
Accountability has been a central plank in California’s — and our nation’s — school reform efforts for over two decades. Over nearly that entire period, California has been criticized (including by me) for being one of the few states that does not include a measure of student achievement growth in our accountability system. The current approach, exemplified in the California School Dashboard, rates schools on their average performance levels on the state’s standardized tests, and on the difference between the school’s average performance this year and last year.
But the state doesn’t have, and has never had, a student-level growth model for test scores. Student-level growth models are important because they do a much better job than the state’s existing measures of capturing school effectiveness at improving student achievement. This is because growth models directly compare students to themselves over time, asking how much individual children are learning each year and how this compares across schools and to established benchmarks for annual learning. The crude difference models the state currently displays in the dashboard could give the wrong idea about school performance, for instance, if there are enrollment changes over time in schools (as there have been since the pandemic).
Growth models can help more fairly identify schools that are often overlooked because they are getting outsize results with underserved student groups. In other words, they send better, more accurate signals to report card users and to the state Department of Education about which schools need support and for which students. California and Kansas have been the last two holdout states yet to adopt a report card that highlights a growth model.
Though the state’s task force on accountability and continuous improvement, on which I served, wrapped up its work and recommended a growth model almost nine years ago, the process of adopting and implementing a growth model has been — to say the least — laborious and drawn-out. Still, I was delighted to see that the California Department of Education (CDE) has finally started providing growth model results in the California School Dashboard! This is a great step forward for the state.
Beyond simply including the results in the dashboard, there are some good things about how the state is reporting these growth model results. The growth model figures present results in a way I think many users will understand (points above typical growth), and results for different student groups can be easily viewed and compared.
There is a clear link to resources to help understand the growth model, too. The state should be commended for its efforts to make the results clear and usable in this way.
It doesn’t take a detailed look at the dashboard to see, however, that there are some important fixes that the State Board of Education should require — and CDE should adopt — as soon as possible. Broadly, I think these fixes fall into two categories: technical fixes about presentation and data availability, and more meaningful fixes about how the growth model results are used.
First, the data are currently buried too deeply for the average user to even find them. As far as I can tell, the growth model results do not appear on the landing page for an individual school. You have to click through using the “view more details” button on some other indicator, and only then can you see the growth model results. The growth model results should, at minimum, be promoted to the front page, even if they are put alongside the other “informational purposes indicator” for science achievement. A downloadable statewide version of the growth model results should also be made available, so that researchers and other interested analysts can examine trends. Especially in light of the long shadow of Covid on California’s students, we need to know which schools could benefit from more support to recover.
Second, the state should prioritize the growth model results in actually creating schools’ dashboard ratings. Right now, the color-coded dashboard rating is based on schools’ status (their average scale score) and change (the difference between this year’s average score and last year’s). It would be much more appropriate to replace the change score with these growth model results.
There are many reasons why a growth model is superior, but the easiest to understand is that the “change” metrics the state currently uses can be affected by compositional changes in the student body (such as which kinds of students are moving into and out of the school). Researchers are unanimous that student-level growth models are superior to these change scores at accurately representing school effectiveness. And even with California’s highly mobile student population, growth models can accommodate mobility and give “credit” to the schools most responsible for each child’s learning during that academic year.
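A toy numerical sketch makes the compositional problem concrete. The figures below are invented for illustration and are not the state’s actual model or data: suppose every returning student in a school gains five points, but a new student with a lower starting score enrolls. A school-average “change” metric turns negative even though every individual student grew.

```python
# Toy illustration (hypothetical scores, not CDE's actual methodology):
# how a school-average "change" metric can signal decline when enrollment
# shifts, while a student-level growth measure correctly shows improvement.

year1 = {"A": 60, "B": 70, "C": 80}               # year 1 enrollees and scores
year2 = {"A": 65, "B": 75, "C": 85, "D": 40}      # A-C each gained 5; D is new

def avg(scores):
    return sum(scores.values()) / len(scores)

# "Change" metric: difference between this year's and last year's averages.
change = avg(year2) - avg(year1)                  # 66.25 - 70 = -3.75

# Student-level growth: average gain for students tested in both years.
stayers = year1.keys() & year2.keys()
growth = sum(year2[s] - year1[s] for s in stayers) / len(stayers)  # +5.0

print(f"change metric:      {change:+.2f}")       # looks like decline
print(f"avg student growth: {growth:+.2f}")       # every student improved
```

The two numbers point in opposite directions for the same school; only the growth figure reflects what the school actually did for the children it taught.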
To be sure, I think there are other ways the dashboard can likely be improved to make it more useful to parents and other interested users. These suggestions have been detailed extensively over the years, including in a recent report that dinged the state for making it difficult to see how children are recovering post-Covid.
The adoption of a growth model is a great sign that the state wishes to improve data transparency and utility for California families. I hope it is just the first in a series of improvements in California’s school accountability systems.
•••
Morgan Polikoff is a professor at the University of Southern California’s Rossier School of Education.
The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.