
Response from the California Charter Schools Association raising concerns about faulty data in a Fordham blog post critical of the SSM Accountability Model. November 18, 2015

Dear Dr. Kogan,


Thank you again for speaking with us and for making your data and code available.
We appreciated your time and found the materials useful in understanding the
issues associated with your analysis of the SSM regression.
We ran the Ohio data you sent us through our SSM code in Stata and have concluded that you did not accurately replicate California's model. To claim that you did is incorrect and troubling. We find two errors in your methodology that completely change the resulting predictions for schools (specifically Tables 1 and 4)
and therefore the conclusions that can be drawn from your findings. The result is
that your analyses fail to support two of the three findings in your abstract, along
with many of the claims made in your paper. In light of this, we request that you immediately retract your article and policy brief and replace them with versions that accurately apply the California Charter Schools Association's SSM metric, with corrected results as outlined below.
We have a call scheduled with Chad Aldis, Fordham's Vice President for Ohio Policy and Advocacy, for Friday afternoon on another topic. It would be helpful to hear
from you by end of day tomorrow so we have some sense as to whether we
need to address this matter with Fordham directly.
The California Charter Schools Association has consistently stated that a growth
model based on individual student data would be a very important addition to the
accountability landscape in California. We have also stated that the best framework
would be a combination of measures that take into account students' proficiency,
growth over time, and progress compared to other schools serving similar
demographics of students. It seems troubling that high-stakes closure decisions
would be made in Ohio solely on the basis of student growth in grades 4-8 in
combination with achievement data that are highly biased by student poverty,
ethnicity, and other student demographic factors.
We have found our Similar Students Measure (SSM) to be a useful lens through
which to assess school performance given the demographics of students served in
each school. We use it in combination with achievement, improvement, and
postsecondary readiness indicators to identify schools that appear to be
underperforming according to publicly available data. We then engage with each school in a deep-dive, multiple-measure review to ensure that our closure advocacy
decisions are based on a deep and careful assessment of school performance across
all grade levels. When charter schools are underperforming, the California Charter Schools Association has itself strongly advocated for those schools' closure.
Below please find more details on the errors we see in your Fordham
commentary and policy brief.
The first error you already acknowledged: the wrong kind of standard errors were
applied in your analysis. You used the standard error of the predicted means for
each observation, where we instead use the standard error of the forecast. Because the forecast standard error incorporates the residual variance on top of the sampling uncertainty in the predicted mean, it is a larger, more conservative measure of the uncertainty surrounding individual model estimates, varies less across observations, and is therefore more appropriate for this type of regression model. As indicated in Table 1 of your report, your standard errors placed 1,205 schools far above their predicted performance and 1,026
schools far below their predicted performance, with zero schools in between. If you were to use the correct standard errors, Table 1 would instead show a more
normalized distribution of performance where most schools are "within predicted,"
which is exactly what we see in California.
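To make the distinction concrete, the sketch below shows how the two kinds of standard error differ and why the choice drives the Table 1 classification. It is a minimal illustration in Python with statsmodels rather than our Stata code, using simulated data in place of the Ohio file; all variable names are illustrative. The key relationship is that the forecast variance equals the mean-prediction variance plus the residual variance.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in for a school-level file: demographics predicting achievement.
rng = np.random.default_rng(42)
n = 3000
demographics = rng.normal(size=(n, 3))
achievement = 50 + demographics @ np.array([5.0, -3.0, 2.0]) + rng.normal(scale=10, size=n)

X = sm.add_constant(demographics)
fit = sm.OLS(achievement, X).fit()
pred = fit.get_prediction(X)

se_mean = pred.se_mean                         # SE of the predicted mean (used in the brief)
se_forecast = np.sqrt(se_mean**2 + fit.scale)  # SE of the forecast: adds residual variance

def classify(se):
    """Count schools more than 1.96 SEs above, below, and within their prediction."""
    z = (achievement - pred.predicted_mean) / se
    return int((z > 1.96).sum()), int((z < -1.96).sum()), int((np.abs(z) <= 1.96).sum())

print("mean SE      (above, below, within):", classify(se_mean))
print("forecast SE  (above, below, within):", classify(se_forecast))
```

Because the mean-prediction standard error shrinks toward zero as the sample grows, virtually every school ends up "significantly" above or below its prediction, which is the pattern in your Table 1; with the forecast standard error, roughly 95 percent of schools fall within the predicted band.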
This undermines the first point in your abstract: that the SSM is ineffective at
measuring student achievement. You noted that the SSM is problematic for assessing the achievement of Ohio schools and that it incorrectly classifies a large number of Ohio schools as high- and low-performing. We now see that your
conclusion was based on inaccurate analysis.
The second error we have identified is that you used enrollment instead of tested
students to weight the regressions. When using the correct weights (see attached
data and code), the results in Table 4 are reversed: charters are the school type
most disadvantaged by the SSM. Charters are even more disadvantaged than the
Big 8 urban schools. After controlling for value added scores, charters are the school
type most likely to score low on the SSM. The unmeasured disadvantages that you
discuss in your paper (e.g. unsupportive parents) appear to be most common in
charters. This undermines the third point in your abstract: that the SSM artificially
inflates the measured performance of Ohio charter schools.
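The mechanics of the weighting issue are equally simple. The hedged sketch below (again Python with statsmodels, with simulated data and illustrative names such as n_tested and enrollment, since the attached Stata code is not reproduced here) shows that swapping the weight column changes the fitted coefficients, the predictions, and the R-squared.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in for the attached school-level file; names are illustrative.
rng = np.random.default_rng(7)
n = 800
pct_disadv = rng.uniform(0, 1, n)                                # demographic predictor
enrollment = rng.integers(100, 1200, n)                          # total enrollment
n_tested = (enrollment * rng.uniform(0.4, 0.95, n)).astype(int)  # tested students < enrollment
achievement = 80 - 30 * pct_disadv + rng.normal(scale=8, size=n)

X = sm.add_constant(pct_disadv)

# The SSM weights the regression by tested students; the brief used total enrollment.
fit_tested = sm.WLS(achievement, X, weights=n_tested).fit()
fit_enrolled = sm.WLS(achievement, X, weights=enrollment).fit()

print("tested-student weights: R-squared =", round(fit_tested.rsquared, 3))
print("enrollment weights:     R-squared =", round(fit_enrolled.rsquared, 3))
```

On simulated data the two fits differ only modestly, but on real school-level files, where the gap between enrollment and tested counts varies systematically by school type, the choice of weights can reverse substantive conclusions, as it does in your Table 4.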
Using the correct weights causes Ohio's R-squared values on the SSM to be between
0.8 and 0.9 for all grade spans. These are actually even higher than the R-squared values we see in California. In your abstract you stated that observable student
demographic differences account for only about half of the variation in achievement
among Ohio schools compared to 85% to 90% in California. As we explained on the
phone, you incorrectly referenced California's data: we cite in our SSM Technical Guide (pages 19-24) that the R-squared values we find in California for 2013 are actually between 0.72 and 0.81. Moreover, we now see, using the correct
methodology, that the SSM for Ohio would account for 80-90% of achievement. This
undermines the first point as well as the first policy implication in your abstract.
You note in your article that there is almost no relationship between student
achievement growth, as measured by value added, and the scores we get using the
Similar Students Measure. While there is only a 0.25 correlation between the Ohio
SSM (run using the corrected methodology) and the Ohio value added scores, the
exact same correlation exists between the Ohio Performance Index and value added
scores. We do find a moderately higher correlation (0.42) between the SSM
and value added scores when we look at Los Angeles schools (which had value
added scores issued by LAUSD). But ultimately the SSM is not a growth model.
Rather it is a measure of relative school achievement after accounting for
demographic factors known to affect achievement.
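For clarity about what is being correlated: one plausible reading of the description above is that a school's SSM score compares its actual achievement to the level predicted from its demographics, scaled by the forecast standard error. The sketch below is that simplified reading, not our production Stata code; value_added is an illustrative stand-in, not Ohio's actual value-added file, and with independently simulated growth the correlation is near zero by construction, so the point is only to show which quantity the 0.25 and 0.42 figures refer to.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 800
pct_disadv = rng.uniform(0, 1, n)
achievement = 80 - 30 * pct_disadv + rng.normal(scale=8, size=n)
value_added = rng.normal(size=n)   # illustrative stand-in for a growth measure

X = sm.add_constant(pct_disadv)
fit = sm.OLS(achievement, X).fit()
pred = fit.get_prediction(X)

# SSM-style score: achievement relative to the demographic prediction,
# scaled by the forecast standard error (see the first sketch above).
se_forecast = np.sqrt(pred.se_mean**2 + fit.scale)
ssm_score = (achievement - pred.predicted_mean) / se_forecast

print("corr(SSM-style score, value added):",
      round(float(np.corrcoef(ssm_score, value_added)[0, 1]), 3))
```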
In sum, the number of inaccuracies and mistakes in your policy brief and associated
article, combined with the sweeping leaps in logic used to draw your conclusions,
lead us to request that you retract and revisit your analysis. We may choose to
philosophically and respectfully disagree but we cannot sit by and allow a deeply
flawed analysis of our measure to stand without correction.
Thank you for your consideration.
Very sincerely,
Elizabeth Robitaille, Ed.D.
Senior Vice President, Achievement and Performance Management
California Charter Schools Association
250 E. 1st St., Ste. 1000, Los Angeles, CA 90012
Website: www.calcharters.org

California's charter public school movement grows as 36,000 more students choose to attend charters
this school year. Over 580,000 students are now being educated in 1,230 charters across the
state. Charter schools are delivering on the promise of a great public education. Visit our website for
more information.
