I continue to suffer mixed emotions with regard to high-stakes testing. I blogged last year, here, on that topic and on the test we take here at the Martin J. Gottlieb Day School, the Iowa Test of Basic Skills or ITBS. Despite those mixed emotions, after conducting a thorough analysis, we did go ahead and publish our results. The “results” consisted of sharing the “Grade Equivalent Scores” for each grade in our school for each of the three major areas we test for: Language, Reading and Math. You can revisit how we analyzed the data and how we chose to present it, here.
I joked in that blog post that if I did not blog the results the following year, it would mean we had taken a dip! That was a joke, and in fact, as I will show below, we again scored quite well. Before posting, however, I want to state clearly that these results are gross oversimplifications: we have disparate class sizes and welcome a diverse student body. Still, it is valuable data – both the class averages and the tracking of classes over time. That is why we take the tests; they provide one valuable data point among many.
The other issue is in the proper understanding of what a “grade equivalent score” really is. For a detailed explanation, I encourage reading this source, here. But to quote the source:
Grade-equivalent scores attempt to show at what grade level and month your child is functioning. However, grade-equivalent scores are not able to show this. Let me use an example to illustrate this. In reading comprehension, your son scored a 7.3 grade equivalent. The seven represents the grade level while the 3 represents the month. 7.3 would represent the seventh grade, third month, which is December. The reason it is the third month is because September is zero, October is one, etc. It is not true though that your son is functioning at the seventh grade level since he was never tested on seventh grade material. He was only tested on fifth grade material. That’s why the grade-equivalent scores should not be used to decide at what grade level a student is functioning.
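The decimal arithmetic the source describes – grade before the point, school-year month after it, with September as month zero – can be sketched in a few lines of Python. (This is purely illustrative; the function name is my own, not anything from the test publisher.)

```python
# Decode a grade-equivalent score such as 7.3 into its grade and month parts.
# Month 0 is September, so month 3 is December, as in the quoted example.

MONTHS = ["September", "October", "November", "December", "January",
          "February", "March", "April", "May", "June"]

def decode_grade_equivalent(score: float) -> tuple[int, str]:
    """Split a grade-equivalent score into (grade level, month name)."""
    grade = int(score)
    month_index = round((score - grade) * 10)
    return grade, MONTHS[month_index]

print(decode_grade_equivalent(7.3))  # (7, 'December')
```

Which is exactly the point of the quote: 7.3 tells you *when* in a notional school year the score falls, not that the student can do seventh-grade work.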
So…not to put too fine a point on it…higher scores are better than lower scores. Tracking the grades over time, one would like to see…
- Each grade score as well as or better than the year before. BUT – it depends significantly on the makeup of the class and where they were prior. AND
- Each class show at least a year’s worth of growth. BUT – it depends significantly on the class remaining exactly the same (which is rare), and it is a pretty fuzzy statistic to begin with because it is an average.
With all those caveats in mind, in the spirit of full transparency, and with the attitude that all data is valuable data, allow me to present comparative data from last year and this year. How did we do?
First up? Language.
Remember…in order to track a class you have to compare 2011 to 2012. For example, in 2011, the average Language Grade Equivalent Score for Grade Two was 3.4. In 2012, those students, now in Grade Three, scored 5.1. That class “grew” 1.7 from last year to this. (Also, the scale tops out at 13…it is the highest score available.)
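That tracking step is just a subtraction, offset by one grade. A minimal sketch, using only the two scores published above (the function name is mine, for illustration):

```python
# Track a class's growth: compare its 2011 average with its 2012 average,
# one grade later. Scores here are the two published in the text.

def class_growth(prior_score: float, current_score: float) -> float:
    """Grade-equivalent growth of the same class from one year to the next."""
    return round(current_score - prior_score, 1)

# The 2011 Grade Two class, retested in 2012 as Grade Three:
print(class_growth(3.4, 5.1))  # 1.7
```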
What does this graph tell us?
It tells me that each grade scored at just about the same or higher all across the board. And in the one grade where it “slipped,” Grade Six? 10.3 is an awfully high number for Grade 6 (even if it doesn’t mean they are like a Grade Ten class)!
It also tells me that each class grew at least one grade equivalency from 2011 to 2012 (technically Grade One grew .9). Again, great data.
Let’s move on to Reading.
Very similar to the one before. Grades are maintaining excellence from last year and growth is nearly a year in each grade (and in some cases significantly higher)!
So far, so good…and frankly, what we would have expected. The one place where we might see some unpredictability is in Math. We went ahead and overhauled our entire Lower School Math curriculum by adopting Singapore Math in Grades K-5. We expected transition issues the first year. How did we do?
Here we find a few surprises. We would have assumed (and, in fact, did) that the transition would be easier in the lower grades and harder in the higher ones. And maybe it was for the students and the teachers. But our test scores reflect the opposite: the grade scores are flat (or slightly higher) in Grades K-3, but jump up in Grades 4 & 5. The class scores show tremendous growth – more than a year’s worth nearly across the board. (The only exception came in Grade One, which “grew” only .6 from the prior year. There are lots of factors involved in testing, and one hesitates to draw too many conclusions from a single test; it will be noted for observation. One is also sensitive to teachers’ feelings in being this transparent. Their courage at this level of exposure is to be commended.)
This, we hope, is the first bump of Singapore Math with bigger bumps to come. We are also pleased that our Middle School Math scores remain consistently excellent.
So, as with last year, all receiving teachers will have prior years’ data and be charged with making the next year even better. They have been up to the task the last two years and we look forward to more learning, more growth and more excellence in the year to come.
Speaking of the year to come? Wonder who will be teaching what next year? Stay tuned to next week’s blog post!