Friday, April 06, 2007

Value-added or value-maybe

I have a variation on an old saying that reads, "If you have to join 'em, beat 'em."

Ohio is currently rolling out its value-added system. For years, many in education feared the concept of value-added. They feared a system that purported to analyze test results and determine whether teachers and schools had added value during the previous school year. The standard for added value is captured in the oft-used phrase "a year for a year": did the student gain a year of knowledge during the most recent school year?

The year-for-a-year standard has its own set of issues, such as: what is a year of growth in absolute terms, rather than one defined by historical gains from similarly situated students? For this discussion, I am going to ignore such issues and concentrate solely on the whitewash being used to spin lagging results into congratulatory press releases.

In logic, something is either A or not-A; there is no gray separating the two. And saying something is not-not-A implies that the thing is A; the nots cancel. But in statistics, the nots don't cancel: failing to show that a student did not grow a year is not the same as showing that the student did. So not-not-A does not imply A.

A closer look: my district uses the standard value-added definition for not achieving a year of growth: performance two or more standard deviations [1] below the expected result. The reason for this range is that the statistics used in value-added analysis cannot provide exact figures. No one can state with certainty that a child achieved growth of, say, 1.23 standard deviations from the expected growth.

We can state with great confidence that the child who achieved at 2.24 standard deviations below expectations did not grow an academic year. Because of the inexactness of the statistics used in value-added analysis, it would not be fair to designate a teacher, school, or district as a poor performer unless the students achieved below the confidence threshold: two standard deviations below expectation.

I can accept that. Statistics, an inexact science subject to noise, has to allow for a range of gray before making a positive statement. But the range of possible outcomes above the two-standard-deviations-below-expectation mark accounts for over 97% of all possible outcomes. Put differently, a score at roughly the 3rd percentile is merely not-not-A, yet that level is now treated as A, the expected result of a year for a year.
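A quick back-of-the-envelope check of that 97% figure, assuming (my assumption, not the district's) that growth scores are roughly normally distributed:

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

below = normal_cdf(-2)                        # share of outcomes at or below -2 SD
print(f"at or below -2 SD: {below:.4f}")      # ~0.0228, roughly the 2nd-3rd percentile
print(f"above the cutoff:  {1 - below:.4f}")  # ~0.9772, i.e., over 97%
```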

The sleight of hand: the Olentangy School District has reported to our community that its students are gaining a year for a year simply because they are achieving above the 3rd percentile. The statement that district students saw a year of growth means only that the students could not be shown to have missed a year of growth. That does not imply that district students actually saw a year of growth. Yet the not-not-A has become A through a sleight of hand. What is true in logic is not true in statistics, but it is good PR nonetheless.

As a recent board member (I resigned a little over six months ago), I saw the data from two years ago, which showed poor performance, but I was never able to get copies of last year's data (the latest reported data), since the administration was concerned about the results. I don't know for certain whether last year's data was as bad, but in my experience, possibly good results were easy to obtain, while possibly poor results were always a struggle, unless the results were bound for public release through the state or local press. [2]

Now when I say that two years ago the district showed poor performance, I mean that most combinations of school, grade level, and subject revealed achievement below expectations, including some combinations below the two-standard-error mark of failure.

The system that once feared value-added has simply spun the results into a story worthy of congratulations. The 3rd percentile: a very low standard indeed, but the new standard nonetheless.

"If you have to join 'em, beat 'em". They won!

Notes:

[1] It's actually two standard errors, but for a population of one, the standard error is the same as the standard deviation. Remember, the district is stating that all students will grow one academic year per school year. That means the focus is on individual students, not some aggregate class, school, etc.
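A minimal sketch of the arithmetic behind that claim, using the usual formula for the standard error of the mean, SE = SD / sqrt(n):

```python
from math import sqrt

def standard_error(sd: float, n: int) -> float:
    # Standard error of the mean shrinks as the group grows.
    return sd / sqrt(n)

print(standard_error(1.0, 25))  # 0.2 for a class-sized group
print(standard_error(1.0, 1))   # 1.0: for a single student, SE equals SD
```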

That said, the district may be using two standard errors of measurement, as stated at the board meeting. If that is indeed the case, the district has created its own system, one different from the now-standard Sanders value-added analysis. Sanders uses two standard errors, not standard errors of measurement.

Under the district's system, the cutoff would sit around the 25th percentile, give or take, depending on the test. Still a very low standard of achievement.
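Here is where a figure near the 25th percentile could come from, as a hedged sketch: if the cutoff is two standard errors of measurement below expectation, and the SEM follows the classical test-theory formula SEM = SD * sqrt(1 - reliability), then the cutoff percentile depends on the test's reliability. The reliability values below are illustrative assumptions, not the district's published figures.

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

def cutoff_percentile(reliability: float) -> float:
    # Classical test theory: SEM = SD * sqrt(1 - reliability), so a cutoff
    # two SEMs below expectation sits at z = -2 * sqrt(1 - reliability).
    return normal_cdf(-2 * sqrt(1 - reliability))

for r in (0.85, 0.90, 0.95):  # assumed reliabilities, for illustration only
    print(f"reliability {r:.2f}: cutoff near percentile {100 * cutoff_percentile(r):.0f}")
```

With reliability in the 0.85 to 0.90 range, the cutoff lands in the low-to-mid 20s, consistent with "25th percentile, give or take."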

[2] Consider the arrogance of an administration that wouldn't provide its own board member with timely data.
