Re: Faulty data in a Fordham blog post critical of the SSM Accountability Model
November 18, 2015
If you were to use the correct standard errors, Table 1 would instead show a more
normalized distribution of performance in which most schools are "within
predicted," which is exactly what we see in California.
This undermines the first point in your abstract: that the SSM is ineffective at
measuring student achievement. You noted that the SSM is problematic for
assessing the achievement of Ohio schools and that it incorrectly classifies a
large number of Ohio schools as high- or low-performing. We now see that your
conclusion was based on an inaccurate analysis.
The second error we have identified is that you used enrollment instead of tested
students to weight the regressions. When using the correct weights (see attached
data and code), the results in Table 4 are reversed: charters are the school type
most disadvantaged by the SSM. Charters are even more disadvantaged than the
Big 8 urban schools. After controlling for value added scores, charters are the school
type most likely to score low on the SSM. The unmeasured disadvantages that you
discuss in your paper (e.g. unsupportive parents) appear to be most common in
charters. This undermines the third point in your abstract: that the SSM artificially
inflates the measured performance of Ohio charter schools.
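To make the weighting issue concrete, here is a minimal sketch (not the attached analysis code) of how the choice of regression weights changes a school-level fit. All variable names and numbers are hypothetical, simulated data; the point is only that weighting by tested students rather than total enrollment gives each school's mean score an influence proportional to how precisely it was measured.

```python
# Illustrative sketch only: why regression weights matter.
# All data below are simulated; column names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200
pct_disadvantaged = rng.uniform(0, 1, n)                       # demographic predictor
enrollment = rng.integers(200, 2000, n)                        # total enrollment
tested = (enrollment * rng.uniform(0.4, 0.95, n)).astype(int)  # tested students
score = 80 - 15 * pct_disadvantaged + rng.normal(0, 3, n)      # mean school score

def wls_fit(x, y, w):
    """Weighted least squares for y = b0 + b1 * x."""
    X = np.column_stack([np.ones_like(x), x])
    XtW = X.T * w                      # weight each observation's row
    return np.linalg.solve(XtW @ X, XtW @ y)

def weighted_r2(x, y, w, beta):
    """R-squared computed under the same weights used in the fit."""
    yhat = beta[0] + beta[1] * x
    ybar = np.average(y, weights=w)
    ss_res = np.sum(w * (y - yhat) ** 2)
    ss_tot = np.sum(w * (y - ybar) ** 2)
    return 1 - ss_res / ss_tot

beta_enroll = wls_fit(pct_disadvantaged, score, enrollment)  # incorrect weights
beta_tested = wls_fit(pct_disadvantaged, score, tested)      # correct weights
r2_tested = weighted_r2(pct_disadvantaged, score, tested, beta_tested)
```

With real school data, swapping the weights shifts both the fitted coefficients and the R-squared, which is the mechanism behind the reversal we describe above.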
Using the correct weights causes Ohio's R-squared values on the SSM to be between
0.8 and 0.9 for all grade spans. These are actually even higher than the R-squared
values we see in CA. In your abstract you cited that observable student
demographic differences account for only about half of the variation in achievement
among Ohio schools compared to 85% to 90% in California. As we explained on the
phone, you incorrectly referenced California's data: as we report in our SSM
Technical Guide (pages 19-24), the R-squared values we find in California for 2013
are actually between 0.72 and 0.81. Moreover, we now see, using the correct
methodology, that the SSM for Ohio would account for 80-90% of achievement. This
undermines the first point as well as the first policy implication in your abstract.
You note in your article that there is almost no relationship between student
achievement growth, as measured by value added, and the scores we get using the
Similar Students measure. While there is only a 0.25 correlation between the Ohio
SSM (run using the corrected methodology) and the Ohio value added scores, the
exact same correlation exists between the Ohio Performance Index and value added
scores. We do find a moderately higher correlation, 0.42, between the SSM
and value added scores when we look at Los Angeles schools (which had value
added scores issued by LAUSD). But ultimately the SSM is not a growth model;
rather, it is a measure of relative school achievement after accounting for
demographic factors known to affect achievement.
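To illustrate what a correlation of 0.25 implies, the hypothetical simulation below constructs two measures with that correlation built in. At r = 0.25 the two measures share only about 6% of their variance (r squared), which is why a status measure and a growth measure can both be valid while agreeing so little. The data are simulated, not actual SSM or value-added scores.

```python
# Hypothetical simulation: two school measures with a built-in 0.25
# correlation, mirroring the reported SSM vs. value-added relationship.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
r_target = 0.25
growth = rng.normal(0, 1, n)   # value-added-style growth score
status = r_target * growth + np.sqrt(1 - r_target**2) * rng.normal(0, 1, n)

r = float(np.corrcoef(status, growth)[0, 1])
shared_variance = r ** 2       # roughly 6% of variance shared at r = 0.25
```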
In sum, the number of inaccuracies and mistakes in your policy brief and associated
article, combined with the sweeping leaps in logic used to draw your conclusions,
lead us to request that you retract and revisit your analysis. We may choose to
philosophically and respectfully disagree but we cannot sit by and allow a deeply
flawed analysis of our measure to stand without correction.
Thank you for your consideration.
Very sincerely,
Elizabeth Robitaille, Ed.D.
Senior Vice President, Achievement and Performance Management
California Charter Schools Association
250 E. 1st St, Ste. 1000, LA, CA 90012
Website: www.calcharters.org
California's charter public school movement grows as 36,000 more students choose to attend charters
this school year. Over 580,000 students are now being educated in 1,230 charters across the
state. Charter schools are delivering on the promise of a great public education. Visit our website for
more information.