Wednesday, March 13, 2019

MVP Math Claims about Gains in Chapel Hill - Carrboro Debunked!

This is a long post, but worth the read if you really want to understand how you, parents and other MVP adopters, have been duped into believing the Chapel Hill - Carrboro City (CHCC) school system performance data about the MVP curriculum.  This article will debunk that data unequivocally using data from the NC Department of Public Instruction.

Cliff Notes Version

  1. CHCC / MVP published performance data that shows a consistent year after year positive trend.
  2. In reality, several segments of the data are conveniently incorrect and do not match what the NC Department of Public Instruction publishes.
  3. Using the corrected data, the "improvements" are no better than Wake County over the same period and worse than NC overall over the same period.
  4. Subgroup gains were insignificant and dubious compared to NC.
  5. As parents and taxpayers, you should be deeply concerned about how decisions are made in your school system based on such misrepresented results.

The Gory Details

Background

In the March 1, 2019, Green Hope High School MVP FAQ, Wake County (NC) Public School System (WCPSS) claimed, "The Chapel Hill - Carrboro City School System has been using MVP since 2015 and have shown gains in proficiency and closing of the achievement gap." Additionally, on September 22, 2017, the MVP Facebook page posted the graphic shown.

While Wake MVP Parent is generally suspicious of MVP improvement claims due to inconsistent implementation and poor results in Wake County, we felt it was worth looking behind the graphic to see what makes up these numbers, since these results seem to be such a point of pride for MVP apologists.

Fortunately for those of us in NC, this data is publicly available on the Department of Public Instruction's (DPI's) testing results webpage.  We encourage you to behold the results for yourself.  But we've done the work for you.

Let's recap what this data is telling us.  At the end of each Math 1 course, NC administers a standard end-of-course (EOC) exam to those students.  The scores are captured along with thousands of other data points about public school results.  Data is available at the district and school level and is broken down into 12 subgroups, such as race, gender, economic status, and English language proficiency.  For each student, the result of the exam is a score of 1 through 5.  If a student scores 3-5, they are considered grade level proficient (GLP).  If they score 4-5, they are considered college and career ready (CCR).  The CHCC graphic addresses the CCR results only, measuring the percent of exam takers who are considered college and career ready.
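To make the scoring rules concrete, here is a minimal sketch in Python.  The level cutoffs (3-5 = GLP, 4-5 = CCR) are the DPI definitions just described; the sample scores are hypothetical, purely for illustration.

```python
def is_glp(level: int) -> bool:
    """Grade Level Proficient: achievement level 3, 4, or 5."""
    return level >= 3

def is_ccr(level: int) -> bool:
    """College and Career Ready: achievement level 4 or 5."""
    return level >= 4

# Hypothetical EOC results for a small group of students (levels 1-5).
scores = [5, 4, 3, 3, 2, 4, 5, 1, 4, 3]

glp_pct = 100 * sum(is_glp(s) for s in scores) / len(scores)
ccr_pct = 100 * sum(is_ccr(s) for s in scores) / len(scores)

print(f"GLP: {glp_pct:.1f}%   CCR: {ccr_pct:.1f}%")  # GLP: 80.0%   CCR: 50.0%
```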

MVP was implemented in the 2014-15 school year for CHCC.  This data begins in the prior year, 2013-14, before MVP was implemented in CHCC.  The following three years, ending with the 2016-17 school year, seem to show a remarkable improvement, with each year increasing steadily, ostensibly due to the application of MVP.  Note that my home school system, WCPSS, did not begin implementing MVP until the 2017-18 school year.

Comparing Portion Improvements 

The first seemingly misleading number on this chart is the 27% increase shown in blue.  You have to read this carefully, because this is not the simple difference in the portion of students who are CCR.  The actual increase in CCR performance - according to this graphic - is 17 percentage points, which is 79% - 62%.  The 27% reflects the relative increase, the percent increase in percent: the 62% portion grew by 27% of itself to reach 79%.  62 x 1.27 = 78.74.  OK.  No problem.  This is an important comparison approach we will use later.
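If the distinction between a point increase and a relative (portion) increase is still fuzzy, this little snippet works both out from the graphic's own published numbers:

```python
# Two ways to describe the same change, using the graphic's published
# values: 62% CCR in 2013-14 and 79% CCR in 2016-17.
start, end = 62.0, 79.0

point_increase = end - start                     # absolute, in percentage points
relative_increase = (end - start) / start * 100  # relative, "percent of percent"

print(f"Point increase:    {point_increase:.1f} points")  # 17.0 points
print(f"Relative increase: {relative_increase:.1f}%")     # 27.4%

# Reading it the other way: growing 62% by 27% lands near 79%.
print(f"62 x 1.27 = {62 * 1.27:.2f}")  # 78.74
```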

2014 Subgroup Data Completely Incorrect

Now, here's where the chart integrity falls apart.  I went and actually looked up all this data, downloading the four spreadsheets corresponding to each year's drill-down data.  First of all, I was pleased that the data available at DPI matched the 2015-17 subgroup data provided in the CHCC graphic (green box) exactly.  This meant that the graphic author and I were indeed using the same publicly available official data.  However, the 2014 data did not match at all.  Each and every number from DPI was different from the CHCC data.  I show the corrected data in the table above left.
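For anyone who wants to repeat the cross-check, here is a rough sketch of the lookup, assuming you have downloaded a drill-down file (e.g. accdrilldwn15.xlsx) from the DPI testing results page.  The column names below are my illustrative guesses, not the actual DPI headers - inspect the spreadsheet and adjust them accordingly.

```python
import pandas as pd

# Load one year's drill-down file (downloaded from the DPI webpage).
df = pd.read_excel("accdrilldwn15.xlsx")

# Hypothetical filter: Math 1 EOC rows for the CHCC district.  The
# "District Name", "Subject", "Subgroup", and "Percent CCR" headers are
# assumptions for illustration only.
chcc = df[df["District Name"].str.contains("Chapel Hill", na=False)
          & (df["Subject"] == "Math I")]

# Compare the graphic's values against DPI, subgroup by subgroup.
print(chcc[["Subgroup", "Percent CCR"]])
```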


2014-2015 Summary Data Conveniently Incorrect, Misrepresenting Trend

Then I looked at the data for ALL students - in other words, the numbers which made up the summaries at the top of the graphic.  The DPI data carried an additional decimal place, and I am showing here that the 2016-17 numbers matched (green box).  Good job!  However, the 2015 number, which the graphic claimed was 70%, was actually 78.4%.  Uh oh.  I thought that this 70% was perhaps a typo (or wishful thinking) on someone's part, because the subgroup data for 2015 matched perfectly as noted previously (see green cells below).  How could someone have gotten the All Students data point wrong (red, below) when it's on the SAME ROW as the subgroup data, which was correct?

ACTUAL 2015 SPREADSHEET FROM DPI, file = accdrilldwn15.xlsx (color coding mine)

Likewise, here is the ACTUAL DPI data from 2014, file = acctdrilldwn14.xlsx (color coding mine).
The 71.5 is shown in red, as are the subgroup data points; each and every one of them is shown incorrectly on the CHCC graphic.

So let's recap.  The original CHCC graphic shows an increase EACH year, from 62% all the way to 79%.  But in reality, the numbers start at 71.5% pre-MVP, go up to 78.4% in 2015, drop BACK DOWN to 76.3% in 2016, then go back up in 2017.  In reality, the total increase was 7.9 points (71.5% to 79.4%), and the portion increase was a modest 11%.  That's a very different story than the one told on the CHCC MVP graphic, because the starting point pre-MVP was not nearly as low as the graphic stated.  Check out the bar chart.  The "Published - WRONG" data sure does look a lot more appealing than the CORRECT data, but the CORRECT data did not fit the narrative that MVP is an obvious contributor to student math performance improvement.  Were these incorrect data points used accidentally or intentionally?  Did MVP create the graphic, and did CHCC sign off on it?
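Here is the corrected trend recomputed from the DPI numbers cited above - note the 2016 dip that the graphic's ever-rising bars conveniently erase:

```python
# Corrected CHCC "All Students" CCR series, per the DPI spreadsheets.
years = ["2013-14", "2014-15", "2015-16", "2016-17"]
ccr = [71.5, 78.4, 76.3, 79.4]  # percent CCR

# Year-over-year changes in percentage points.
for (y0, v0), (y1, v1) in zip(zip(years, ccr), zip(years[1:], ccr[1:])):
    print(f"{y0} -> {y1}: {v1 - v0:+.1f} points")
# 2013-14 -> 2014-15: +6.9 points
# 2014-15 -> 2015-16: -2.1 points   <- down, not up
# 2015-16 -> 2016-17: +3.1 points

total_points = ccr[-1] - ccr[0]         # 7.9 points
relative = total_points / ccr[0] * 100  # ~11%
print(f"Total: {total_points:.1f} points ({relative:.0f}% portion increase)")
```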

No Better than Wake, Worse than State

Let's continue, shall we?  A 7.9-point increase and an 11% portion increase are still good, right?  I mean, this is due to implementing MVP, so it must be good!  Having been trained in that "old math" back in the '80s, I learned to look at data in context.

Since I had the four data sets open, I also looked at the Wake County and State of NC data.  Wake County, which did not implement MVP until the 2017-18 school year, somehow achieved the same portion increase as CHCC, which did implement MVP!  But let's look at an even bigger picture: the great State of NC.  Miraculously, NC overall was able to boost her CCR numbers by 7.2 points, almost as much as CHCC.  And when we look at portion increase, NC went up by a whopping 15% during this same span!  That's because, in relative terms, an increase from 46.9 to 54.1 (7.2 points) is more impressive than an increase from 71.5 to 79.4 (7.9 points).  So if CHCC had just not implemented MVP, perhaps they would have ridden the coattails of NC to even higher gains during this same period!
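And here is the context computation, using the CHCC and statewide numbers above.  (Wake's raw start/end values aren't reproduced here; per the data sets, its relative gain matched CHCC's despite not using MVP.)

```python
# Start/end "All Students" CCR percentages, 2013-14 vs 2016-17,
# from the DPI figures cited in this post.
series = {
    "CHCC (MVP)": (71.5, 79.4),
    "NC overall": (46.9, 54.1),
}

for name, (start, end) in series.items():
    points = end - start
    relative = points / start * 100
    print(f"{name}: +{points:.1f} points, +{relative:.1f}% portion increase")
# CHCC (MVP): +7.9 points, +11.0% portion increase
# NC overall: +7.2 points, +15.4% portion increase
```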

NC Subgroup Gains Were Also Good

So what about all those big subgroup gains?  Well, the NC data is not too shabby.  The state showed impressive gains in each and every subgroup, as did CHCC.  And let's not forget, the Chapel Hill - Carrboro City school system has the 2nd-worst achievement gap in the nation.  So there's nowhere to go but up for them.  By the way, the #1 gap in the nation is Berkeley, another MVP adopter.


Subgroups Performing Better/Worse than Others

Continuing onward...  Now, this is where I may be going too far in drawing a conclusion, so let me just present the data.  If we look at the portion increases of the subgroups for NC versus CHCC, we have the table to the right.  The subgroups where NC did better than CHCC: All, Asian, and White.  The groups where CHCC did better than NC: Economically Disadvantaged (EDS) and Limited English Proficiency (LEP).  These results present a real conundrum for people who endorse MVP and the graphic posted on the MVP Facebook page in September 2017.  First of all, what the White and Asian subgroups have in common is that they are the largest demographics taking Math 1 in middle school and honors classes in high school.  In WCPSS, this is where we have seen MVP adopted most strictly.  Did CHCC have the same implementation inconsistency?  And the conclusion that the MVP curriculum - which is heavily dependent on reading proficiency and a facilitation-and-discovery-learning teaching style - is somehow dramatically better for the EDS and LEP subgroups is a stretch beyond my wildest imagination.

Conclusion

If you've read this far, I encourage you to go check this data for yourself.  Links are provided.

I don't know if CHCC or MVP authored the graphic.  I hope that this post will lead to that answer.  I tried to give the author of the CHCC graphic the benefit of the doubt.  I tried to find the cited wrong data somewhere in the official data sets - perhaps a typo or a copy from the wrong row could explain it.  I tried to see if there was a standard adjustment that could have explained the wrong data shown.  But I just couldn't, faced with the real official DPI data in black and white.  Frankly, the 2015 summary result (70%) and all the 2014 data seem contrived, if not fabricated.  This graphic was and still is used to partially justify WCPSS's adoption of and confidence in MVP.  But the data shown from Chapel Hill - Carrboro was just too convenient and tidy to be true.  An increase every year in every category?  No context considered?  Consider this MVP claim DEBUNKED.

Update 7/31/19:

Around 4/24/19, I noticed that the aforementioned graphic was removed from MVP’s Facebook page.

To this day, I have not been able to determine if MVP or Chapel Hill created the graphic and no one will explain the mismatches.  Here are the steps I took in trying to get to the truth:

  • 3/13/19: Wrote this blog article explaining my findings and documenting my sources.
  • 3/13/19, 9:23 PM: I contacted Chapel Hill informing them of the blog I wrote and asking for an explanation of the mismatches.  (Email 1)
  • 3/15/19, 4:25 PM: Chapel Hill wrote back and said, "The CHCCS does not have any affiliation with the MVP Facebook account or data represented on their site." (Email 2)
  • 3/15/19, 6:04 PM: I wrote Chapel Hill back and challenged them to look into it and correct the record.  (Email 3)  No response.
  • 3/15/19, 9:23 AM: I also wrote WCPSS informing them of the findings, since they so highly valued CH's results in their praise of MVP.  (Email 4)  No response.
  • 3/19/19, 7:24 AM: I then wrote MVP and showed them the article I wrote and asked them if it was fabricated.  (Email 5)  No response.

Author: Blain Dillard

1 comment:

  1. wow...this is very revealing and disturbing. Great job breaking it down for us to see the inconsistencies.
