Since the first time I saw the school rankings in the Boston Globe over a decade ago, I have been frustrated by the simplistic and misleading approach this news outlet takes in publicizing the scores from our state’s high-stakes test. The approach is simply to rank schools from “number one” down to whatever the final number happens to be for the grade level tested. For instance, if your school had third graders in the building last spring, you had 954 other schools to compare yourself with.
As I discuss my thoughts here on these rankings, I need to make it clear that my intention is not to criticize or praise a school that I reference, but simply to clarify how this works for those who take these rankings too seriously.
Going back to third grade for a moment, the “number one” ranked school in the state in English Language Arts was the Richmond Consolidated School, which had 100 percent of its students score in either Advanced or Proficient. By the way, the Richmond Consolidated School tested only 19 students. Compare this to the school with the largest third-grade population in the state, the Woodland School in Milford, MA, which tested 303 students and ranked 571st. Clearly we are comparing apples and oranges, and it is unfair to the students and teachers to paint such a misleading picture. Countless comparisons of this same type can be made at every grade level, and that is without even getting into the demographics of individual schools and communities.
Here’s Another Thing That Irks Me About the Boston.com Ratings
Using the Grade 10 English Language Arts rankings as an example this time, I would like to ask a question: do you think a school ranked “number one” clearly outperformed a school ranked 99th? While the answer is an emphatic NO, if I were a typical parent from Andover, Brookline, or any of the 23 schools that were ranked 99th, I would probably be wondering why my child’s school is apparently so far away from “number one.”
The explanation is pretty straightforward: 28 schools had 100% of their students score either Advanced or Proficient, so all 28 were ranked “number one.” The next ranking was “number 29,” shared by 22 schools that had 99% of their students scoring in the top two levels of the ELA MCAS. So the good news for the folks whose schools ranked “number 99” is that 96% of their students scored either Advanced or Proficient.
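The tie-handling behind those numbers is what statisticians call standard competition ranking: tied schools share a rank, and the next distinct score is ranked by its overall position in the list. A minimal sketch in Python (the school names and scores below are invented for illustration; only the tie logic mirrors how the published rankings behave):

```python
from itertools import groupby

# Hypothetical percent of students scoring Advanced or Proficient.
scores = {
    "School A": 100, "School B": 100,
    "School C": 99,  "School D": 99,
    "School E": 96,
}

def competition_rank(scores):
    """Standard competition ('1224') ranking: tied entries share a rank,
    and the next distinct score takes its overall list position."""
    ordered = sorted(scores.items(), key=lambda kv: -kv[1])
    ranks, position = {}, 1
    for _, group in groupby(ordered, key=lambda kv: kv[1]):
        group = list(group)
        for name, _ in group:
            ranks[name] = position
        position += len(group)  # skip past the entire tied block
    return ranks

print(competition_rank(scores))
```

With two schools tied at rank 1, the 99% schools land at rank 3 rather than rank 2, and the 96% school lands at rank 5: exactly the mechanism that puts a 96%-proficient school all the way down at “number 99” in the real list.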
Growth Scores Are A Better Measure
Thankfully, our state’s Department of Elementary and Secondary Education (DESE) has moved to a growth model for its testing program. What is a growth model?
Here is a quick definition from the DESE’s website:
For K-12 education, the phrase “growth model” describes a method of measuring individual student progress on statewide assessments (tests) by tracking the scores of the same students from one year to the next. Traditional student assessment reports tell you about a student’s achievement, whereas growth reports tell you how much change or “growth” there has been in achievement from year to year.
Shouldn’t we be paying more attention to these measures? Isn’t it more important to show where students started, track their growth, and chart their progress against all of the students who had a similar score the previous school year? For example, if we had a student in the lowest category (Warning), shouldn’t we get some credit for moving that student along to the next level (Needs Improvement)? The obvious answer is yes!
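The peer-comparison idea behind growth reporting can be sketched in a few lines. This is NOT the DESE’s actual student growth percentile methodology (which uses quantile regression across the full score distribution); it is a deliberately simplified version that compares each student’s current score only against students who earned the same prior-year score, with all names and scores invented for illustration:

```python
# (name, prior_year_score, current_year_score) -- invented data
students = [
    ("Ana",  220, 236),
    ("Ben",  220, 228),
    ("Cara", 220, 244),
    ("Dev",  250, 252),
    ("Elle", 250, 262),
]

def growth_percentile(students):
    """For each student, report the percent of same-prior-score peers
    scoring at or below them this year -- a toy stand-in for SGP."""
    by_prior = {}
    for name, prior, current in students:
        by_prior.setdefault(prior, []).append((name, current))
    percentiles = {}
    for peers in by_prior.values():
        peers.sort(key=lambda nc: nc[1])  # order peers by current score
        for i, (name, _) in enumerate(peers):
            percentiles[name] = round(100 * (i + 1) / len(peers))
    return percentiles

print(growth_percentile(students))
```

Notice that a student can post a high growth number without a high absolute score: what matters is outpacing the peers who started in the same place, which is precisely the point of the argument above.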
In addition, I am sure there are students who walk in the door in September already capable of scoring at the Advanced level on that year’s MCAS on day one of the school year. It therefore tells us very little when those students score Advanced in May of the same school year. Again, we need to show that we are supporting student growth no matter where students start on day one.
One More Thing About Ranking Ourselves Based On Standardized Test Scores
For those who aren’t aware of the correlation between socioeconomics and standardized testing, there are clear connections between standardized test results and the median household income in a community or a state. Check out the graphic below depicting average NAEP scores across the country alongside the median household income in each state.