Burlington Public Schools Assessment Update from October 18, 2016 School Committee Meeting

The presentation embedded above was shared at our School Committee Meeting last Tuesday. It is a brief overview of some of our assessment data from across the district. 
Elementary Assessment Data 
Slides 3-11 are a glimpse at our Response to Intervention (RTI) Assessment data that we utilize for monitoring the progress of our elementary students. The highlights are as follows:
Slide 3 – This slide shows the trends in our Burlington elementary data with a focus on the fact that we moved 207 students from Tier 2 back to Tier 1 last year.
Slides 4-5 – These two slides are graphs that show concretely the increase in Tier 1 students during the course of the school year along with a decrease in Tier 2 students.
Slides 6-7 – These slides compare both our mid-year data and our end-of-year data from 2015 to 2016. In both cases, there is a clear improvement from 2015 to 2016.
Slides 8-9 – These slides are further evidence of the fact that our elementary students are receiving focused instruction that is allowing them to improve upon deficits and move back to Tier 1.
Slides 10-11 – These are the district-wide elementary results for our ELA and Math assessments. As the results indicate, we are hitting our mark of at least 80% of our students in Tier 1, no more than 15% in Tier 2, and no more than 5% in Tier 3. 
Slides 12 and 14 – These are our PARCC scores from last year for grades 3-8. It is noteworthy that all of our students took their PARCC assessments on digital devices. It is also worth noting that the Massachusetts Department of Elementary and Secondary Education stated the following about computer-based testing:

“The preliminary PARCC results showed that in most grades, students who took PARCC math and English language arts tests on a computer were less likely to score in the “meeting expectations” range…(Link to source)”

With this in mind, our Burlington elementary and middle school students still scored near or above 750, which is the “meeting expectations” threshold. 
Slides 16-17 – These slides show our high school MCAS scores for ELA and Math. In both areas, our students continue to perform well, with 94% scoring Advanced or Proficient in ELA and 89% scoring Advanced or Proficient in Math.
Slide 18 – This slide shows the tremendous growth in our Advanced Placement courses at BHS. Since 2012, we have increased the number of AP tests given from 260 to 404. At the same time, we have increased the percentage of passing scores from 72.9% to 82.2%. In addition, it is worth noting that over the same timespan our percentage of students passing the AP test has gone from 1% below the state average to 11.7% above it.


MCAS Student Growth Scores Across All of our Burlington Schools


Following my post on the misleading nature of the Boston.com MCAS rankings, a blog commenter noticed that the three growth scores I referenced were our three highest areas of growth for Burlington in 2013. While that was true, I am happy to share all of our district growth scores from 2013 in the table below. A score between 40 and 60 is considered moderate growth, and anything below 40 is considered low growth.
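For readers who like to see the cut points spelled out, here is a minimal sketch of how those bands work, using the below-40 and 40-60 ranges mentioned above plus the implied above-60 band. The function name and the "high growth" label are my own illustration, not official DESE terminology.

```python
def classify_growth(median_sgp: float) -> str:
    """Bucket a median student growth percentile using the cut points above.

    Below 40 is low growth, 40 to 60 is moderate, and anything above 60
    (by implication) is high. Labels are illustrative only.
    """
    if median_sgp < 40:
        return "low growth"
    if median_sgp <= 60:
        return "moderate growth"
    return "high growth"


# Example: a grade level with a median growth score of 52 falls in the moderate band.
print(classify_growth(52))  # -> "moderate growth"
```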

We can report that all of our grade levels where growth is measured scored in the moderate range or higher. While we will review a number of different points at our annual MCAS Review at an October School Committee Meeting, I didn’t want it to appear that we were holding anything back. For those who are interested in digging deeper prior to our meeting, the DESE has a lot of data from our 2013 MCAS results available on its website. Here is the link to the Burlington section.

But How Will Their Students Pass The State Assessment?

As our students in Massachusetts settle in for their annual round of “high-stakes” testing, I think it is the perfect time for people to take a few minutes to watch the video from TEDxCreativeCoast titled The Future Will Not Be Multiple Choice. The presentation by Jaime McGrath (an elementary school teacher in Savannah, GA) and Drew Davies (a web designer) was posted on Mind Shift’s blog about a month ago, and I had forgotten about it until I saw a tweet last night with the link. It really is a must-watch for anyone who thinks that our current educational structure is adequate.

It’s no newsflash that the current structure of most classrooms is unchanged from the structure that was created to educate students for an industrial society back in the 19th century. At one point in our history, fitting the right piece in the right hole as quickly as possible and retaining large amounts of trivial information in order to regurgitate it or draw from it quickly may actually have been useful. However, in a day and age where asking the right questions is of more value than providing a quick response to a multiple-choice question, we are past the point of needing a change.

In fact, the findings of educators like McGrath, who stray from the current script and look at problem-based education and a focus on “design thinking,” are clear.

“All we did was give them the challenge, point them in the right direction and give them the space to be creative,” noted McGrath.  

Here are a few of my takeaways from the points made by the co-presenters:

  • Reports predict that 65% of our students will be working in jobs that don’t exist yet.
  • “Such simple tasks as manipulation of blocks helps infants and toddlers develop early skills, including math literacy – the language of numbers.” Huttenlocher, Jordan, and Levine 1994
  • We don’t need students skilled in picking A, B, C, or D 
  • “A true understanding of reality is not possible without a certain element of imagination…” Lev Vygotsky 
  • Design in education complements all learning styles 
  • Will it be messy and risky? Yes, but what is the reality we are trying to prepare our kids for? 
  • The future is not a multiple choice test, it is a design challenge

So my question about the state assessment (or a national assessment) posed above was – “How will these students pass the state (or national) assessment?” Here’s my answer – “Who cares!”

I think the bottom line is that students who are taught to think in their classrooms will be successful on any measure.

The Boston.com MCAS School Rankings Stink!

Since the first time I saw the school rankings in the Boston Globe over a decade ago, I have been frustrated by the simplistic and misleading approach that this news outlet has taken in publicizing the scores from our state’s high-stakes test. The approach is simply to rate the schools from “number one” to whatever the final number is, depending on the grade level that was tested. For instance, if you were a school that had third graders in your building last spring, then you had 954 other schools to compare yourself with.

As I discuss my thoughts here on these rankings, I need to make it clear that my intention is not to criticize or praise a school that I reference, but simply to clarify how this works for those who take these rankings too seriously.

Going back to third grade for a moment, the “number one” ranked school in the state in English Language Arts was the Richmond Consolidated School, which had 100 percent of its students score in either Advanced or Proficient. By the way, the Richmond Consolidated School tested only 19 students. Compare this to the school that had the largest third-grade population in the state, the Woodland School in Milford, MA, which tested 303 students and ranked 571st. Clearly we are comparing apples and oranges, and it is unfair to the students and teachers to portray such a misleading picture. There are countless examples of these same types of comparisons that can be done at every grade level, and this is without even getting into the demographics of individual schools and communities.

Here’s another thing that irks me about the Boston.com ratings.

Using the Grade 10 English Language Arts rankings as an example this time, I would like to ask this question: Do you think that a school ranked “number one” clearly outperformed a school ranked 99th? While the answer is an emphatic no, if I were a typical parent from Andover, Brookline, or any of the 23 schools that were ranked 99th, I would probably be wondering why my child’s school is apparently so far away from “number one.”

The explanation is pretty straightforward: there were 28 schools that had 100% of their students score either Advanced or Proficient, and all of them were therefore ranked “number one.” The next ranking was “number 29,” a ranking shared by 22 schools that had 99% of their students scoring in the top two levels of the ELA MCAS. So, the good news for folks who ranked “number 99” is that 96% of their students scored either Advanced or Proficient.
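To make the tie-handling concrete, here is a small sketch of the “standard competition” style of ranking that these lists appear to use (my own reconstruction, not Boston.com’s actual method): every school with the same percentage shares a rank, and the next distinct percentage picks up the rank as if each tied school had occupied its own slot.

```python
from itertools import groupby

def competition_rank(schools):
    """Rank (name, percent_advanced_or_proficient) pairs, with ties sharing a rank.

    Schools with the same percentage share a rank; the next group's rank is
    offset by the size of every group above it, so 28 schools at 100% would all
    be rank 1 and the first 99% school would be rank 29.
    """
    ordered = sorted(schools, key=lambda s: s[1], reverse=True)
    ranks, next_rank = {}, 1
    for pct, group in groupby(ordered, key=lambda s: s[1]):
        members = list(group)
        for name, _ in members:
            ranks[name] = next_rank
        next_rank += len(members)
    return ranks

# Illustrative (made-up) schools: three at 100%, two at 99%, one at 96%.
example = [("A", 100), ("B", 100), ("C", 100), ("D", 99), ("E", 99), ("F", 96)]
print(competition_rank(example))  # A, B, C -> 1; D, E -> 4; F -> 6
```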

Growth Scores Are A Better Measure

Thankfully, our state’s Department of Education has moved to a growth model with regard to testing. What is a growth model?

Here is a quick definition from the DESE’s website:

For K-12 education, the phrase “growth model” describes a method of measuring individual student progress on statewide assessments (tests) by tracking the scores of the same students from one year to the next. Traditional student assessment reports tell you about a student’s achievement, whereas growth reports tell you how much change or “growth” there has been in achievement from year to year.

Shouldn’t we be paying more attention to these measures? Isn’t it more important to show where students started and to chart their progress compared to all of the students who had a similar score during the previous school year? For example, if we had a student who was in the lowest category (warning), shouldn’t we get some credit for moving them along to the next level (needs improvement)? The obvious answer is – yes!
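For readers who want a concrete picture of the idea in that definition, here is a minimal sketch of computing a growth percentile for one student: compare this year’s score only against students who had a similar score last year. The grouping rule and the percentile arithmetic here are a deliberate simplification of my own, not DESE’s actual student growth percentile methodology.

```python
def growth_percentile(student, peers):
    """Percentage of similar-prior-score peers whose current score the student beat.

    `student` and each peer are (prior_score, current_score) tuples. "Similar"
    here means the exact same prior score, a simplification; the state's real
    model uses a more sophisticated statistical comparison.
    """
    prior, current = student
    peer_scores = [cur for (p, cur) in peers if p == prior]
    if not peer_scores:
        return None
    beaten = sum(1 for s in peer_scores if s < current)
    return round(100 * beaten / len(peer_scores))

# Illustrative numbers only: a student who scored 230 last year and 244 this year,
# compared against other students who also scored 230 last year.
peers = [(230, 236), (230, 240), (230, 244), (230, 250), (230, 252)]
print(growth_percentile((230, 244), peers))  # 40, i.e. beat 40% of similar peers
```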

In addition, I am sure that there are students who walk in the door in September and could score in the advanced level on that year’s MCAS test on day one of the school year. Therefore, it tells us very little when these students score advanced in May of the same school year. Again, we need to show that we are supporting student growth no matter where students are on day one of the school year.

One More Thing About Ranking Ourselves Based On Standardized Test Scores  

For those who aren’t aware of the correlations between socioeconomics and standardized tests, there are clear connections between standardized test results and the median household income in a community or a state. Check out the graphic below depicting average NAEP scores across our country and the median household income in each state.

Source: http://www.edpolicythoughts.com/2012/10/why-does-massachusetts-rank-highly.html
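For anyone who wants to check this kind of relationship for themselves, here is a minimal sketch of computing a correlation between state-level test averages and median household income. The numbers below are placeholders meant only to show the mechanics; they are not actual NAEP or Census figures.

```python
from statistics import correlation  # Python 3.10+

# Placeholder values, NOT real data: median household income and average score per state.
income = [52_000, 61_000, 70_000, 48_000, 75_000]
scores = [262, 271, 279, 258, 283]

# Pearson correlation coefficient; values near +1 suggest a strong positive association.
r = correlation(income, scores)
print(f"correlation between income and scores: {r:.2f}")
```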

Concluding Thoughts About Standardized Tests
In closing, I think that measuring student progress is critical. However, I think we have to keep standardized test results in the proper perspective. In Burlington, we are always of the opinion that we can do a better job for our students. There are certainly areas where we think our state test scores could be better, and we will have plans in place to accomplish this. However, we also have to be careful not to focus solely on these tests when we talk about our progress. Our feeling is that these tests are the floor and not the ceiling for what we hope to see our students accomplish. As a community, we need to make sure that we are utilizing multiple measures to chart the progress of our schools and our students.
As a parent of three children in another district (grade 1, grade 7, and grade 9), I am less concerned about my own children’s standardized test scores and more interested in whether or not they are developing the skills that they will need to be successful after their formal education is complete. I am fairly confident that their MCAS results, or their scores on whatever new federal or state standardized test comes down the pike, will not have a major impact on their success. If the major focus of their schools is on these results, then I am pretty sure I can find a computer program that can prepare them equally well.
Don’t get me wrong, I think we need schools more than ever. The dilemma is that we need schools that realize the world we are preparing our students for has changed dramatically, and that we cannot prepare students with business as usual.
Here are a few blog posts that reference this idea:

An Interesting Question To Ponder – Are Schools Killing Your Child’s Creativity?

A decade of No Child Left Behind: Lessons from a policy failure
