Following my post on the misleading nature of the Boston.com MCAS rankings, a blog commenter noticed that the three growth scores I referenced were our three highest areas of growth for Burlington in 2013. While that was true, I am happy to share all of our district growth scores from 2013 in the table below. The range between 40 and 60 is considered moderate growth, and anything below 40 is considered low growth.
This evening at our School Committee Meeting I shared some initial thoughts on our 2013 MCAS scores and why the Boston.com rankings are so misleading. I have shared my main points from the presentation in the video below.
As our students in Massachusetts settle in for their annual round of “high-stakes” testing, I think it is the perfect time for people to take a few minutes to watch the video from TEDxCreativeCoast titled The Future Will Not Be Multiple Choice. The presentation by Jaime McGrath (an elementary school teacher in Savannah, GA) and Drew Davies (a web designer) was posted on Mind Shift’s blog about a month ago, and I forgot about it until I saw a tweet last night with the link. It really is a must-watch for anyone who thinks that our current educational structure is adequate.
It’s no newsflash that the current structure of most classrooms is unchanged from the structure that was created to educate students for an industrial society back in the 19th century. At one point in our history, fitting the right piece in the right hole as quickly as possible and being able to retain large amounts of trivial information in order to regurgitate it or draw from it quickly may have actually been useful. However, in a day and age where asking the right questions is of more value than providing a quick response to a multiple choice question, we are past the point of needing a change.
In fact, the findings of educators like McGrath, who stray from the current script to explore problem-based education and a focus on “design thinking,” are clear.
“All we did was give them the challenge, point them in the right direction and give them the space to be creative,” noted McGrath.
Here are a few of my takeaways from the co-presenters:
- Reports predict that 65% of our students will be working in jobs that don’t exist yet.
- “Such simple tasks as manipulation of blocks helps infants and toddlers develop early skills, including math literacy – the language of numbers.” Huttenlocher, Jordan, and Levine 1994
- We don’t need students skilled in picking A, B, C, or D
- “A true understanding of reality is not possible without a certain element of imagination…” Lev Vygotsky
- Design in education complements all learning styles
- Will it be messy and risky? Sure, but what is the reality we are trying to prepare our kids for?
- The future is not a multiple choice test, it is a design challenge
So my question about the state assessment (or a national assessment) posed above was, “How will these students pass the state (or national) assessment?” Here’s my answer: “Who cares!”
I think the bottom line is that students in classrooms where they are being taught to think will be successful on any measure.
Since the first time I saw the school rankings in the Boston Globe over a decade ago, I have been frustrated by the simplistic and misleading approach that this news outlet has taken in publicizing the scores from our state’s high-stakes test. The approach is simply to rate the top schools from “Number One” to whatever the final number is, depending on the grade level that was tested. For instance, if you were a school that had third graders in your building last spring, then you had 954 other schools to compare yourself with.
As I discuss my thoughts here on these rankings, I need to make it clear that my intention is not to criticize or praise a school that I reference, but simply to clarify how this works for those who take these rankings too seriously.
Going back to third grade for a moment, the “number one” ranked school in the state in English Language Arts was the Richmond Consolidated School, which had 100 percent of its students score in either Advanced or Proficient. By the way, the Richmond Consolidated School tested only 19 students. Compare this to the school that had the largest third grade population in the state, the Woodland School in Milford, MA, which tested 303 students and ranked 571. Clearly we are comparing apples and oranges, and it is unfair to the students and teachers to portray such a misleading picture. There are countless examples of these same types of comparisons at every grade level. This is without even getting into the demographics of individual schools and communities.
Here’s another thing that irks me about the Boston.com ratings.
Using the Grade 10 English Language Arts rankings as an example this time, I would like to ask this question. Do you think that a school ranked “number one” clearly outperformed a school ranked 99th? While the answer is an emphatic NO, if I were a typical parent from Andover, Brookline or any of the 23 schools that were ranked 99 I would probably be wondering why my child’s school is apparently so far away from “number one.”
The explanation is pretty straightforward: there were 28 schools that had 100% of their students score either Advanced or Proficient, and they were therefore all ranked “number one.” The next ranking was “number 29,” a ranking shared by 22 schools that had 99% of their students scoring in the top two levels of the ELA MCAS. So, the good news for folks who ranked “number 99” is that 96% of their students scored either Advanced or Proficient.
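For readers who want to see exactly how ties produce those big jumps, here is a minimal sketch of “standard competition ranking,” which appears to be the scheme the Boston.com list uses: every school tied at a score shares the same rank, and the next distinct score skips ahead by the size of the tie. The score distribution below is hypothetical, built only to mirror the Grade 10 ELA example above.

```python
def competition_ranks(scores):
    """Map each score (percent Advanced/Proficient) to its shared rank."""
    ranks = {}
    rank = 1
    for score in sorted(set(scores), reverse=True):
        ranks[score] = rank
        rank += scores.count(score)  # next distinct score skips past the tie
    return ranks

# Hypothetical distribution echoing the example: 28 schools at 100%,
# 22 at 99%, then (invented) counts at 98% and 97%.
scores = [100] * 28 + [99] * 22 + [98] * 30 + [97] * 18 + [96] * 25
ranks = competition_ranks(scores)
print(ranks[100])  # → 1
print(ranks[99])   # → 29
print(ranks[96])   # → 99, even though 96% is a strong result
```

The takeaway is that the gap between “number one” and “number 99” can be a handful of percentage points, not a chasm in school quality.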
Growth Scores Are A Better Measure
Thankfully our state’s Department of Education has moved to a growth model in regards to testing. What is a growth model?
Here is a quick definition from the DESE’s website –
For K-12 education, the phrase “growth model” describes a method of measuring individual student progress on statewide assessments (tests) by tracking the scores of the same students from one year to the next. Traditional student assessment reports tell you about a student’s achievement, whereas growth reports tell you how much change or “growth” there has been in achievement from year to year.
Shouldn’t we be paying more attention to these measures? Isn’t it more important to show where students were and how we track their growth and chart their progress compared to all of the students who had a similar score during the previous school year? For example, if we had a student who was in the lowest category (warning), shouldn’t we get some credit for moving them along to the next level (needs improvement)? The obvious answer is – yes!
In addition, I am sure that there are students who walk in the door in September and could score in the Advanced level on that year’s MCAS test on day one of the school year. Therefore, it means little when these students score Advanced in May of the same school year. Again, we need to show that we are supporting student growth no matter where students are on day one of the school year.
One More Thing About Ranking Ourselves Based On Standardized Test Scores
For those who aren’t aware of the correlations between socioeconomic status and standardized tests, there are clear connections between standardized test results and the median household income in a community or a state. Check out the graphic below depicting average NAEP scores across our country and the median household income in each state.