In the first post in our series dissecting the new school ranking system created by Governor John Kasich and the Republican legislature, we listed the six criteria that will be used to rank all school districts and charter schools in the state, explained why the decision to heap massive new responsibilities upon the Ohio Department of Education widely misses the mark (even though ODE is the correct agency for the job), and concluded by dismantling the first measure in the ranking system.  If you have not yet read that post, you can access it here.

In this second post in the series we will examine the requirements of item #2 in the list of ranking criteria.

Again, the six criteria that were adopted in the final bill are as follows [bold added for emphasis]:

  1. Performance index score for each school district, community school, and STEM school and for each separate building of a district, community school, or STEM school. For districts, schools, or buildings to which the performance index score does not apply, the superintendent of public instruction shall develop another measure of student academic performance and use that measure to include those buildings in the ranking so that all districts, schools, and buildings may be reliably compared to each other.
  2. Student performance growth from year to year, using the value-added progress dimension, if applicable, and other measures of student performance growth designated by the superintendent of public instruction for subjects and grades not covered by the value-added progress dimension;
  3. Performance measures required for career-technical education under 20 U.S.C. 2323, if applicable. If a school district is a “VEPD” or “lead district” as those terms are defined in section 3317.023 of the Revised Code, the district’s ranking shall be based on the performance of career-technical students from that district and all other districts served by that district, and such fact, including the identity of the other districts served by that district, shall be noted on the report required by division (B) of this section.
  4. Current operating expenditures per pupil;
  5. Of total current operating expenditures, percentage spent for classroom instruction as determined under standards adopted by the state board of education;
  6. Performance of, and opportunities provided to, students identified as gifted using value-added progress dimensions, if applicable, and other relevant measures as designated by the superintendent of public instruction.

The department shall rank each district, community school, and STEM school annually in accordance with the system developed under this section.

#2 – Value-Added Student Growth.  Value-added progress measures are produced through a complicated series of statistical calculations performed by an out-of-state company using the EVAAS model.  The exact calculations are generally considered proprietary and secretive, though the creator of the system, William Sanders, disputes that claim and directs people to a 1997 publication that he says explains his formula.  That controversy aside, value-added data is essentially a comparison of a student’s standardized test results from one year to the next, with the outcome measured against the student’s expected results relative to the pool of all tested students.  When a group of student results is combined, the teacher, school, and district are assigned a value based on the performance of those students.

As this measure is still based on standardized test score results, the objections we raised about test bias in our first post still hold true.  However, to be fair to proponents of this measure, value-added results are touted as removing that bias by comparing a student’s own growth from year to year to determine whether the student has gained “a year’s worth of growth.”  As you can see on the chart below, the distribution is more even, showing that the poverty level of a district does not appear to play a significant role in these results.  Note: a gain index value of greater than 2 is considered more than a single year of growth, between 2 and -2 is approximately one year’s growth, and less than -2 is less than a single year of growth.
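The gain-index bands described above amount to a simple classification rule. As a minimal sketch (the function name and structure are our own illustration, not ODE’s or EVAAS’s actual implementation):

```python
def classify_growth(gain_index):
    """Map a value-added gain index to the growth bands described
    above: > 2 is more than a year's growth, -2 to 2 is roughly a
    year's growth, and < -2 is less than a year's growth."""
    if gain_index > 2:
        return "more than one year of growth"
    elif gain_index >= -2:
        return "approximately one year of growth"
    else:
        return "less than one year of growth"
```

The hard (and proprietary) part, of course, is computing the gain index itself from each student’s expected versus actual results; the banding at the end is the only step that is publicly straightforward.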

Presuming this measure is more accurate for districts in terms of accounting for external factors, we would weight this criterion more heavily than the Performance Index Score.  But treating value-added results with such emphasis exposes the major flaw in this criterion: the limited pool of contributing students.

In Ohio, value-added results are based on the results of standardized tests that students take in back-to-back years (required to demonstrate yearly growth).  Because of this systemic limitation, only districts and schools that give the Ohio Achievement Assessments in reading and math in grades 3-8 receive these results.  So while the PI Score eliminated 65 schools from the rankings, a lack of value-added scores leaves 188 schools without data (19%).  And as in the PI Score scenario, charter schools account for most of the schools without value-added scores (137 total), owing to the large number of charters serving high school students.  In other cases, a charter school serves very few students across a large grade band, leaving too few students to produce statistically reliable value-added scores.  Traditional districts, because of their legal requirement to serve all students within their boundaries, have value-added results, though high school performance has absolutely no bearing on this category.  The inequity among districts therefore abounds with this measure, preventing districts with high-performing secondary programs from adequately demonstrating their students’ progress.
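The back-to-back-year requirement explains why so many schools fall out of this measure. As a rough sketch (the function and grade logic are our own simplification, not ODE’s actual eligibility rules):

```python
TESTED_GRADES = set(range(3, 9))  # OAA reading/math: grades 3-8 only

def contributes_to_value_added(prior_grade, current_grade):
    """A student contributes growth data only if tested in two
    consecutive years, i.e., both this grade and the prior grade
    fall in 3-8.  High school grades never qualify."""
    return (prior_grade in TESTED_GRADES
            and current_grade in TESTED_GRADES
            and current_grade == prior_grade + 1)
```

Under this simplification, a high-school-only charter has no eligible student pairs at all, which is consistent with the large share of charters among the 188 schools without data.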

And if that isn’t questionable enough, the most controversial aspect of this requirement is the clause requiring the Ohio Department of Education to create value-added measures for those grades and subjects that aren’t covered by standardized testing.  How do we even begin to explain to the legislators and lobbyists who entertain such an absurd idea that writing something down on paper doesn’t make it so?

William Sanders, often referred to as the Godfather of value-added reporting, has been working on his model since the 1990s and is STILL fine-tuning the process and defending its validity.  In a 2008 article, Audrey Amrein-Beardsley, an assistant professor at Arizona State, writes that “although the EVAAS model is probably the best and most sophisticated one we have of this type,” it has “significant issues”: “insufficiency of validity studies, the difficulties with the user-friendliness of the model, the lack of external reviews, and the methodological issues with missing data, regression to the mean, and student background variables.”

The nonprofit organization Battelle for Kids can be credited with bringing the concept of value-added reporting to Ohio (to be clear, it’s not inherently negative) in an effort to help school districts improve their educational practices.  In a 2011 report published by Battelle for Kids and commissioned by The Bill and Melinda Gates Foundation, an aggressive education reformer and school accountability advocate, a strong case is offered against Kasich’s brand of hurry-up-and-create-tests legislation:

While many inputs may go into growth models, the availability and use of high quality tests are essential. Growth models, in some way or another, rely on tests of students’ knowledge in a particular subject or content area.

Now connect that concept of needing “high quality tests” with the knowledge from our prior post that Kasich cut the Ohio Department of Education budget by 12.6% as you read the following information from that same Gates Foundation report:

Implementing growth measures—especially for the purpose of examining educator effectiveness—is a challenge in any environment. It is important to consider the experience, expertise and capacity you will need from internal resources as well as an external model provider in order for your organization to be successful.

While you may not need economists or statisticians on staff, internal capacity must be considered. State or district staff must have the ability and will to drive implementation decisions and oversee any work with an external provider to ensure success. If undertaking the analysis internally, your state or district should still consider outside support in the way of an external review of your systems, methods and processes to ensure your models are behaving as expected and producing valid results.

As with any investment, the financial costs to implement and sustain a program are important to consider. Education spending is scrutinized, so you need to be prepared to explain the short- and long-term costs of implementing a growth model. You also need to be able to make a sound argument for spending public dollars on these measures.

To make the investment worthwhile, consider how you will ensure financial sustainability for the continued use of growth data. Remember to account for internal personnel and infrastructure, as well as available funding, to determine whether you could eventually conduct the analysis in-house, or if you need a long-term partnership with a provider.

You need to consider the time and personnel your organization has to invest to make the implementation successful.

In early June, we calculated that the cost of creating the additional tests would be over $300 million.  That obscene figure did not even include a test for every grade and every subject, as this value-added criterion that Kasich signed into law requires.  While we calculated that figure based on student participation in courses annually, the fact is that schools actually offer many more specific courses that will need, according to Battelle for Kids, the Gates Foundation, and the budget bill, high-quality tests “for subjects and grades not covered by the value-added progress dimension.”  Currently, Ohio has 21 different standardized tests among grades 3-8 and grade 10.  By contrast, the Ohio Department of Education has approximately 500 unique courses on record that will, through this legislation, require that a high-quality test be created in order to measure student growth.  There are very few, if any, existing models for these tests, and even fewer that would correlate validly with the existing Ohio tests so they could be used to compare school performance.  The creation of measures to fill in all the gaps not only WILL cost a lot of money, but it SHOULD cost a lot of money if it is to be done properly.

How many more reasons do the education experts at the Statehouse need in order to comprehend that they have adopted a law that not only can’t be implemented fairly, but can’t be implemented AT ALL?

Since we’re only one-third of the way through the six criteria, we have many more reasons to offer.

  • I have friends who work for Battelle for Kids, and I can tell you that a) most of the people working there are teachers or former education administrators and b) they are genuinely interested in improving education in this country. The key to BFK is that they are a non-profit; this isn’t some attempt to funnel work to a private corporation to pay back campaign donations.

  • Anonymous

    I completely agree.  I’ll go as far as to say everyone at Battelle for Kids is interested in helping schools improve their educational practice.  BFK existed WAY before this current Governor and legislature were in place.  The only smart thing Ohio has done involving value-added was to retain BFK’s services in providing professional development for value-added.

    If I implied that BFK was anything but helpful in this process, it was not intended that way at all.

  • James

    One of the other issues to take into account is the fact that Ohio, as well as other states adopting the Common Core Standards, will be transitioning to new tests. In order for any value added model to be considered valid and reliable, it must look at several years of data before applying its formula to predict student growth. How will the state determine value-added scores during this period of time?

  • Anonymous

    Don’t even get me started on that one, James!  We could likely write another entire series of posts (and will do at least one full post) on the impending corruption of this year’s legislation by the new Common Core.  Like so many of our posts this year, it would be a barrel of laughs if it wasn’t all devastatingly true.

  • Value-added was introduced in Ohio some time ago. And true, it approaches finding progress from year to year for each student in a reasonable way; however, taken at face value (no pun intended), it basically puts each student on an Individual Education Plan. Individual Education Plans are great tools to track a student’s progress, but less than wonderful logistically for the regular classroom teacher. It takes a good special education teacher hours and hours of work to develop solid Individual Education Plans for small groups of students. Consider the size of a regular classroom, 18-24 students, and multiply that by an Individual Education Plan for each one. There are other considerations also, like the cognitive ability of a student to read at the third grade level, reaching that level and staying there over the course of a few years. Over those few years, who will take the rap for lack of progress measured by standards legislators decide are adequate? And the finished product may indeed be a student graduating from H.S. with the reading skills of a 3.5 or 4th grader. Progress measured on individual ability year to year that follows a student regardless of his actual grade placement makes sense; evaluating schools or teachers on that progress against standardized tests or “bars” set by legislators is nonsense.

  • How many years does it take for a community school to fail?  Some have been failing for six years!  They don’t deserve my tax money.  These community schools are just profit makers for corporations.  The children are in classrooms with little real school equipment, and some young children are forced to sit at large tables in folding chairs. That is not only uncomfortable but it adds to problems with classroom management and student fatigue.  Instructional materials and curriculum supplies often do not arrive until a few months after school starts.  Teachers at these charter schools are working 10-11 hours a day and must even eat lunch with their students.  When you figure the hours these teachers work against their salaries, their pay rates are close to the minimum wage.  Which fat cats are making money off of the charter schools?

  • Duckmonkeyman

    But if value-add is proprietary methodology that is so complex that few can understand it, then it serves a limited purpose, because 1) science demands that research be openly published so that the results can be independently verified, and 2) if those being measured by value-add (teachers and administrators) cannot understand it well enough to see a direct correlation to day-to-day practices for self-reflection and improvement, then the methodology becomes useless and is seen as more of a mysterious punitive tool.  At one private company, the bonus “formula” was not only secret, it was too complex to be taken seriously.  Hence, bonuses were seen as a joke except by those who (often unfairly) received the most regardless of merit.

    All teachers know that there is great variability in classrooms.  I often pose the scenario: “Who is the better teacher?  Teacher A has a class of motivated, high-achieving, well-fed, and parent-supported students and posts high value-add gains.  Teacher B is given a large class of unmotivated, disadvantaged, and disruptive students but works just as hard to achieve small value-add gains.”  No good answer yet.

    People want simplistic answers with simple numbers.  Humans cannot be reduced to numbers, nor can teaching be entirely reduced to engineered statistics.  I remain skeptical of value-add and PI measures, as should everyone – legislators included, ASSUMING their motives are to improve education and not just to cut costs.

  • Anonymous

    “Which fat cats are making money off of the charter schools?”

    White Hat Dave.  Except he and his wife recently wrote a memo to their employees that they (the Brennans) couldn’t afford to continue subsidizing White Hat schools and cost-cutting was on its way:
