The Boston.com MCAS School Rankings Stink!
Since the first time I saw the school rankings in the Boston Globe over a decade ago, I have been frustrated by the simplistic and misleading approach that this news outlet has taken in publicizing the scores from our state’s high-stakes test. The approach is simply to rank schools from “number one” down to the last, with the total number depending on the grade level that was tested. For instance, if your school had third graders in the building last spring, then you had 954 other schools to compare yourself with.
As I discuss my thoughts here on these rankings, I need to make it clear that my intention is not to criticize or praise a school that I reference, but simply to clarify how this works for those who take these rankings too seriously.
Going back to third grade for a moment, the “number one” ranked school in the state in English Language Arts was the Richmond Consolidated School, which had 100 percent of its students score either Advanced or Proficient. By the way, the Richmond Consolidated School tested only 19 students. Compare this to the school with the largest third-grade population in the state, the Woodland School in Milford, MA, which tested 303 students and ranked 571st. Clearly, we are comparing apples and oranges, and it is unfair to students and teachers to paint such a misleading picture. Countless comparisons of this sort can be made at every grade level, and that is without even getting into the demographics of individual schools and communities.
Here’s another thing that irks me about the Boston.com rankings
Using the Grade 10 English Language Arts rankings as an example this time, I would like to ask this question. Do you think that a school ranked “number one” clearly outperformed a school ranked 99th? While the answer is an emphatic NO, if I were a typical parent from Andover, Brookline or any of the 23 schools that were ranked 99 I would probably be wondering why my child’s school is apparently so far away from “number one.”
The explanation is pretty straightforward: there were 28 schools that had 100% of their students score either Advanced or Proficient, and all were therefore ranked “number one.” The next ranking was “number 29,” a ranking shared by 22 schools that had 99% of their students scoring in the top two levels of the ELA MCAS. So, the good news for the schools ranked “number 99” is that 96% of their students scored either Advanced or Proficient.
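The tie-handling described above is what is often called “competition ranking”: every tied school gets the same rank, and the next group skips ahead by the size of the tie. Here is a minimal sketch of that idea (this is my illustration, not Boston.com’s actual code, and the percentages are hypothetical):

```python
# "Competition ranking": tied scores share a rank, and the next distinct
# score is ranked 1 + (number of strictly higher scores). This is how
# 28 schools at 100% can all be "number one" while the next group is "number 29".

def competition_rank(scores):
    """Map each score to its rank: 1 plus the count of strictly higher scores."""
    rank_of = {s: 1 + sum(1 for other in scores if other > s) for s in set(scores)}
    return [rank_of[s] for s in scores]

# Hypothetical percentages of students scoring Advanced or Proficient:
pcts = [100, 100, 100, 99, 99, 96]
print(competition_rank(pcts))  # [1, 1, 1, 4, 4, 6]
```

Notice that the school at 96% lands at rank 6 even though it trails the leaders by only four percentage points, which is exactly the distortion described above.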
Growth Scores Are A Better Measure
Thankfully, our state’s Department of Education has moved to a growth model with regard to testing. What is a growth model?
Here is a quick definition from the DESE’s website –
For K-12 education, the phrase “growth model” describes a method of measuring individual student progress on statewide assessments (tests) by tracking the scores of the same students from one year to the next. Traditional student assessment reports tell you about a student’s achievement, whereas growth reports tell you how much change or “growth” there has been in achievement from year to year.
Shouldn’t we be paying more attention to these measures? Isn’t it more important to show where students were and how we track their growth and chart their progress compared to all of the students who had a similar score during the previous school year? For example, if we had a student who was in the lowest category (warning), shouldn’t we get some credit for moving them along to the next level (needs improvement)? The obvious answer is – yes!
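The core idea above can be sketched in a few lines: compare each student’s current score only against students who earned the same score last year. To be clear, this is my simplified illustration of the concept, not DESE’s actual student growth percentile methodology, and the scores are hypothetical:

```python
# A minimal sketch of the growth idea: measure a student's progress against
# "academic peers" (students with the same prior-year score) rather than
# against the whole population.

def growth_percentile(prior, current, cohort):
    """Return the percent of academic peers (same prior-year score) whose
    current score falls below this student's. cohort: list of (prior, current)."""
    peers = [c for p, c in cohort if p == prior]
    below = sum(1 for c in peers if c < current)
    return round(100 * below / len(peers))

# Hypothetical (last year's score, this year's score) pairs:
cohort = [(220, 230), (220, 238), (220, 244), (220, 250), (230, 255)]
print(growth_percentile(220, 244, cohort))  # 2 of 4 peers scored below 244 -> 50
```

A student who starts in the warning category and moves to needs improvement can earn a high growth percentile even while a static achievement ranking still places them near the bottom, which is exactly why growth is the fairer measure.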
In addition, I am sure that there are students who walk in the door in September and could score at the advanced level on that year’s MCAS test on day one of the school year. Therefore, it tells us very little when these students score advanced in May of the same school year. Again, we need to show that we are supporting student growth no matter where students are on day one of the school year.
One More Thing About Ranking Ourselves Based On Standardized Test Scores
For those who aren’t aware of the correlations between socioeconomics and standardized tests, there are clear connections between standardized test results and the median household income in a community or a state. Check out the graphic below depicting average NAEP scores across our country and the median household income in each state.
Source: http://www.edpolicythoughts.com/2012/10/why-does-massachusetts-rank-highly.html
An Interesting Question To Ponder – Are Schools Killing Your Child’s Creativity?
A decade of No Child Left Behind: Lessons from a policy failure
Connected Educator Month Begins In Two Weeks! What Will You Do To Connect?
Connected Educator Month (CEM) is a month-long celebration of community, with educators at all levels, from all disciplines, moving toward a fully connected and collaborative profession.
The goals of Connected Educator Month include:
- Helping more districts promote and integrate online social learning into their formal professional development
- Stimulating and supporting collaboration and innovation in professional development
- Getting more educators connected (to each other)
- Deepening and sustaining the learning of those already connected
The Connected Educator Month District Tool Kit
Join Us For Our First Parent Technology Night – Elementary School 1:1 Program
Burlington Teachers Highlighted In MTA Today Cover Story
The Massachusetts Teachers Association highlighted the great efforts of our Burlington teachers in the most recent edition of the MTA Today. You can check out the entire article here. The article on Burlington begins on page six.
Here’s A Great Way To Keep Up With #BPSCHAT – Our District Twitter Hashtag
Paper.li is a great resource to create a newspaper highlighting the feed from a Twitter hashtag. We have our own weekly edition from our Burlington Public Schools hashtag #BPSChat which you can check out below.
If you would like to receive a weekly update by email, just click on the subscribe button on the top right hand side of the paper.

Our Newest Blog And Other Ways To Follow Burlington Public Schools
We are excited to announce the creation of our district’s newest blog – Burlington Public Schools Blog. The word “Blog” in the title is actually used as a verb to represent all of the active bloggers throughout our school district. This space will be utilized to share a blog post daily from one of our staff or student bloggers.
Here are a few other ways you can follow our district’s learning journey:
Our new district Instagram account – We will routinely post photos from around the district here.
Our new district Twitter Account – @BurlMASchools – While many of our staff and administrators have been on Twitter for quite some time, we have not had a general account for the district aside from @BPSAlerts, which we reserve for emergencies, and @BPSEdtech, the Twitter account of our EdTech Team.
Finally, we don’t want to forget our District’s Facebook Page which is another great way to stay on top of the happenings in our school district.
Let us know if there are any other social media resources that you would like to see Burlington Public Schools access to share information!
Strong Intentions Lead To Amazing Unintended Benefits #LeadershipDay13
- Presented at regional conferences
- Been highlighted by national educational media outlets EdWeek and Edutopia
- Served as consultants for app developers
- Consulted with our state’s Commissioner and Secretary of Education
- Run their own technology conference
- Created viral videos that have been noted by national media outlets
- Taught university professors
- Assisted with product design
- Consulted with districts from all over the country on 1:1
- Created a successful blog that has followers from around the globe
Plan C – Why I’m An Educator #SAVMP
photo via http://edelmaneditions.com/
Old News – Schools Have Dropped The Ball On Teacher Evaluation
Before I get into the specifics on this, I will put out a disclaimer that there may be outliers who feel that they have a meaningful process for conducting teacher evaluations. I am confident that there are some school communities out there who have initiated teacher evaluation processes that increase teacher capacity and improve student learning. Unfortunately, history has proven that these school communities are extremely rare.
One great reference point on this is a report completed in 2009 by The New Teacher Project titled The Widget Effect. This comprehensive study looked at 15,000 teachers and 1,300 administrators from the states of Ohio, Arkansas, Illinois, and Colorado. Here are the main points from the Executive Summary that are worth some thought:
- In districts that use binary evaluation ratings (generally “satisfactory” or “unsatisfactory”), more than 99 percent of teachers receive the satisfactory rating.
- When all teachers are rated good or great, those who are truly exceptional cannot be formally identified.
- 73 percent of teachers surveyed said their most recent evaluation did not identify any development areas, and only 45 percent of teachers who did have development areas identified said they received useful support to improve.
- 66 percent of novice teachers in districts with multiple ratings received a rating greater than “satisfactory” on their most recent performance evaluation.
- 41 percent of administrators reported that they have never “nonrenewed” a probationary teacher for performance concerns in his or her final probationary year.
Looking at the data from such a comprehensive study, it is clear that teacher evaluation has been treated as just another item on the lengthy to-do list in most schools instead of a critical component of school improvement. We can also quibble over semantics: the word “evaluation” implies something being done to the individual evaluated, not a two-way conversation in which teachers and administrators work together to improve individual classroom and school-wide outcomes.
The following statement from the Executive Summary of The Widget Effect explains clearly where the focus of teacher evaluation has been:
“…information on teacher performance is almost exclusively used for decisions related to teacher remediation and dismissal paints a stark picture: In general, our schools are indifferent to instructional effectiveness—except when it comes time to remove a teacher.”
Schools that succeed in changing this focus and making teacher evaluation meaningful will be those where administrators and teachers partner to ensure regular, focused discussions about teaching and learning. Given the amount of collaboration this requires, it will prove to be a cultural change as much as a technical one. However, with studies citing the average tenure of a school superintendent or principal in the vicinity of three years (with larger districts and schools averaging less), it is clear to me that if administrators take too much control, sustainability will be impossible.
Now that I’ve looked at a bit of the history surrounding teacher evaluations, and despite some big concerns, I will focus in my next post on some of the reasons I am optimistic about this undertaking.









