Thursday, March 28, 2019

Ready? Set? NO! The Story of Miranda and Neha - MVP and Your GPA

Legal Disclaimer

This blog article is a parody which tells the story of two fictional characters.  In this fictional story, the characters have experiences which I believe are representative of numerous anecdotes and concerns shared by parents.



If you've read articles on this blog, you've no doubt read about the impacts MVP is having on some students' confidence in math, the impact on wallets as "parents of means" hire tutors or buy other resources, and most importantly, the impact on learning, as many students in MVP Heavy classes continue to fail assessments despite considerable at-home effort.

At the end of the day (or semester, in this case), what will be left as a reminder of your child's MVP experience will be the grades on their transcript.  Do grades really matter?  Educators will answer that question in different ways depending on the circumstances.  But let's be honest.  Most parents who care enough about their kids' math class to get on a Facebook group to post and comment are probably parents expecting their kids to go to college.  And colleges care about grades and transcripts, including GPA and class rank.  They really do.  Volunteer service and sports are great, too.  But grades matter.

Ready?

So let's do an MVP exercise.  Miranda and Neha are best friends and have been going to school together since 3rd grade.  They are both great at science and math and have straight A's through 7th grade.  Miranda hopes to be a doctor after studying Chemical Engineering like her mother.  Neha is interested in fighting climate change and hopes to become a Data and Computer Scientist.  They hope to go to college together and be roommates!

Set?

Both Miranda and Neha sign up for Math 1 Honors in 8th grade.  Miranda gets a teacher who follows the MVP math program lightly, minimizing the class discovery exercises, and using her old Math 1 notes to make sure the students understand the math each day.  Neha gets a different teacher who is "being true to the curriculum."  Neha can't hang out with Miranda much any more because she is meeting with her tutor after school 4 days a week and on Sunday.

Miranda earns an A in her Math 1 Honors class, but despite all the extra help, Neha earns a D.  Neha is crushed because she thought she was a great math student.  She used to beat Miranda on every test in 6th and 7th grade!  Now discouraged, Neha decides it would be safer to take Math 2 as an Academic class.  Miranda continues with Math 2 Honors.  As luck would have it, Miranda and Neha are each again placed into MVP Math 2 classes taught with differing levels of MVP intensity.  Miranda makes a B this time, and Neha improves to a C.

Miranda finishes out high school alternating A's and B's all the way through AP Calculus and AP Stats.  Neha eventually gets back to a B in MVP Math 3 and 4 - both at the Academic level.  Deciding instead to focus on another field of study, Neha takes 3 elective Academic courses instead of Calculus and Statistics, earning 2 A's and a B.

NO!

The tale of two students.  Both equally excellent in math when 8th grade started.  On two different paths by the time high school is over.  Using the math we learned today, we can do some simple calculations and determine that the GPA impact to Neha, as compared to Miranda and based only on these 7 courses, is -0.297.  Let's round that to -0.3 using what we learned in Unit 1.
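
For the curious, here is a minimal sketch of that calculation in Python.  The grade points, Honors/AP bonuses, exact course lists, and 28-course transcript size are all assumptions on my part (the story above doesn't pin them down), so the result lands near, but not exactly on, -0.297.

    BASE = {'A': 4.0, 'B': 3.0, 'C': 2.0, 'D': 1.0}
    BONUS = {'Academic': 0.0, 'Honors': 0.5, 'AP': 1.0}  # assumed weighting

    def quality_points(courses):
        # Each course is a (letter grade, course level) pair.
        return sum(BASE[grade] + BONUS[level] for grade, level in courses)

    # Hypothetical 7-course paths consistent with the story above.
    miranda = [('A', 'Honors'), ('B', 'Honors'), ('A', 'Honors'), ('B', 'Honors'),
               ('A', 'Honors'), ('B', 'AP'), ('A', 'AP')]
    neha = [('D', 'Honors'), ('C', 'Academic'), ('B', 'Academic'), ('B', 'Academic'),
            ('A', 'Academic'), ('A', 'Academic'), ('B', 'Academic')]

    gap = quality_points(neha) - quality_points(miranda)
    print(round(gap / 7, 3))   # over these 7 courses alone: -1.286
    print(round(gap / 28, 3))  # diluted across an assumed 28-course transcript: -0.321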



Let's say Miranda and Neha make the same grades in every other class, and Miranda ends up with a 4.5 GPA, ranked 19th in her class.  Neha, having devolved from a "math whiz" to a "math avoider," ends up with a 4.2 GPA and is ranked 134th.  While Neha's 4.2 GPA is still excellent, it is unlikely these two young ladies will be roommates at the same college.

Try the MVP GPA Impact Calculator for yourself.

Conclusion

Well, booo hooo.  I'm not going to get a lot of sympathy from some people - I get that.  But is it really fair that two equally intelligent students have their grades and aspirations impacted by the luck of the draw in teacher assignment, combined with an unproven, experimental curriculum and teaching method that INTENTIONALLY WITHHOLDS a math education from Neha, BY DESIGN?  It could just as well be a 3.5 versus a 3.2 GPA - the consequences are similar.  Grades matter and GPA matters.

Granted, this scenario may be an extreme case.  Or is it?

Author: Blain Dillard

Wednesday, March 27, 2019

MVP Eradicates WCPSS's Performance Lead over NC in "Economically Disadvantaged" and "Limited English Proficiency" Subgroups


"MVP is low entry high ceiling," they tell us.  But how are subgroups of Economically Disadvantaged Students (EDS) and Limited English Proficiency (LEP) faring in Wake County?  Let's take a look.

Source

All data quoted here is available to the public at the Department of Public Instruction's (DPI's) testing results webpage.  We encourage you to check the results yourself.

Background

In the Wake County Public School System (NC), Math 1 courses adopted MVP last school year, 2017-18.  NC captures 2 metrics for the Math 1 EOC: Grade Level Proficiency (GLP) and College and Career Readiness (CCR).  There was a change to the way the EOC was reported in 17-18, making comparisons to prior years difficult without math acrobatics that are the exclusive intellectual property of WCPSS.  Therefore, I will call the raw data for 15-16 and 16-17 "Apples" and the raw data for 17-18 "Oranges".

However, there is meaning in comparing Wake to the state of NC, and then trending that number.  Wake County has been one of the top-performing school systems in the state for many years, consistently outperforming the state in all subgroups.  But, as you will see here, that gap is closing for the "all students" data set, and even more so for the EDS and LEP groups, especially in 17-18, after one year of using the supposedly "free" OER resource MVP.

"What's the Cliff Notes Version?" you ask.  

The pictures speak for themselves.

For Group 1, Economically Disadvantaged, Wake once led NC by 1.9% and 4.5% for GLP and CCR, respectively.  Now, we trail the state by 9.3% and 11.8%!  Thanks MVP!

For Group 2, Limited English Proficiency, Wake once led NC by 27.8% and 43.2% for GLP and CCR, respectively.  Now, the gap has crashed and we are about dead even at 0.4% and 5.0%!  Thanks MVP!

For Group 3, All Students, Wake once led NC by 19.2% and 26.9% for GLP and CCR, respectively.  Now, that gap has continued to decline to 13.4% and 18.9%!  Thanks MVP!
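
For anyone who wants to reproduce the arithmetic, here is a minimal sketch in Python using only the before/after figures quoted above (gap = Wake minus NC, in percentage points):

    # (earlier gap, 2017-18 gap) in percentage points, from the figures above
    gaps = {
        ('EDS', 'GLP'): (1.9, -9.3),  ('EDS', 'CCR'): (4.5, -11.8),
        ('LEP', 'GLP'): (27.8, 0.4),  ('LEP', 'CCR'): (43.2, 5.0),
        ('All', 'GLP'): (19.2, 13.4), ('All', 'CCR'): (26.9, 18.9),
    }
    for (group, metric), (before, after) in gaps.items():
        print(f"{group} {metric}: Wake's lead moved {after - before:+.1f} points")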

Conclusion

I continue to be skeptical of all of WCPSS's Year 1 (2017-18) MVP data claims due to the secret mystery algorithm used to calculate the performance numbers, which were reported without the context of the state numbers.  But mainly, given the wildly irregular adoption we have seen of MVP in Wake County, how can anyone make cause-and-effect claims, including me?  That said, it's the only data we have, and this report shows that EDS and LEP students are getting the shaft.

Author: Blain Dillard

Thursday, March 21, 2019

Why My MVP Golf Score Improved, and other Exaggerations

If I told you the average golf score for Wake County golfers improved 1.9 strokes last year as compared to the prior year, and then told you that was due to the new Golf Galaxy store on Hwy 55 in Cary which opened in 2016, what would you think of my statistical analysis abilities?  My guess is that you might question my logic.  You might wonder whether all golfers in Wake County visited that Golf Galaxy, and if so, whether they purchased any products or services (such as lessons) that would have helped their scores.  How much golf practice did they do on their own time?  Did they receive any other golf lessons or tips from others?  Did they watch any golf training videos on YouTube?  You might wonder how the scores fared for golfers who visited other golf stores, if at all.  What were the trends in prior years?  Up?  Down?  You might also wonder how the average scores across NC fared overall for the same period.  You would certainly want to look into all of those matters before making such a cause-and-effect claim about such an improvement.  To do otherwise would be an irresponsible use of golf statistics!

Too bad WCPSS and the Mathematics Vision Project don't approach stats like you and I would.  They published this graphic on Twitter in November 2017 boasting of WCPSS's improvement in Math 1 proficiency due to MVP math, a "free OER" according to the statement.

Side note: "OER" means "open educational resource".  "Free" usually means "of no cost", but that is a different story.

For this story, I'd like to say it's easy to verify the data using the NC Department of Public Instruction data website.  But that's not 100% true, because in the 2017-18 school year, the DPI changed the way it calculates the EOC for Math 1.  Therefore, comparisons to prior years are not as easy.

Case in point: when I looked up the data, the grade level proficient (GLP) and college & career readiness (CCR) numbers seem to be way down from 2016-17 to 2017-18 - for both WCPSS and the State of NC.  This is because the 2017-18 data is only reporting high school results, whereas in prior years, the middle school Math 1 EOCs were wrapped into the metrics.  Those middle school Math 1 EOCs are typically higher (because those are the kids taking higher math sooner), so omitting them brings the average down.

I have asked WCPSS for an explanation as to how they arrived at the 1.5 and 1.9 numbers, but have yet to receive a response other than a March 3, 2019 acknowledgement of the change confirming "an additional level of analysis was required at the district level in order to provide a more accurate cross-year comparison." Additionally, "Staff is working to get more information on the two data points in question and will follow up with you shortly to provide additional details and/or data that may be helpful."  Well, THAT is what I want to see!

Anyway, we can use these numbers as they are and still make a point or two.  Let's take a look at the 2 graphs, one for CCR and the other for GLP.  For the sake of simplicity, let's call the data points for 14-15, 15-16, and 16-17 the apples, and the 17-18 data points the oranges.



Both graphs show the 3 years preceding 17-18, the first year MVP was implemented.  During the apple years, the number went up for both WCPSS and the state.  Both graphs also show the apples to oranges drop from 16-17 to 17-18 due to the change in how the calculation was done.  So there's no benefit in comparing apples to oranges.

But there is benefit in demonstrating that even if the 1.9% and 1.5% were completely true and verifiable, those gains were lower than the PRIOR year's, when both metrics rose 2.3%!

This goes back to the golf score average improving because of the new Golf Galaxy.  If my score was already getting better the prior year, how can I say the new Golf Galaxy helped me this year?  I can't!

Additionally, let's look at the trends comparing WCPSS to the state overall.  For CCR, the gap between WCPSS and the state has declined from a peak of 13.4 points in 15-16 to 8.8 in 17-18.  For GLP, the gap has closed from 11.9 to 7.7.
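
In point terms, here is a quick sketch of that decline in Python (figures as quoted above):

    # WCPSS-minus-state gaps, in percentage points
    print(round(8.8 - 13.4, 1))  # CCR gap change since its 15-16 peak: -4.6 points
    print(round(7.7 - 11.9, 1))  # GLP gap change: -4.2 points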

So, just to be clear: I am comparing apples to apples for the 3 years shown (14-15, 15-16, and 16-17), and oranges to oranges in 17-18.  The gap is closing between WCPSS and the state.  Is that because the state is doing better?  Or something else?

So in the end, I don't think it matters how WCPSS arrived at the 1.5 and 1.9 numbers.  Either way, the gains are no better than the prior year, and the gap with the state is closing.  In other words, I bet that if we applied the "additional level of analysis" at the state level, we would find their gains to be more than 1.5 and 1.9 for 17-18, further trivializing the boastful graphic posted by WCPSS and MVP.

Therefore, I declare this graphic from WCPSS claiming an improvement in proficiency in year one DEBUNKED!  Now, go hit 'em long and straight - and NO three-putting!

UPDATE 3/29/19:
Dr. Denise Tillery provided the data I requested, showing these numbers in context, along with an explanation of why the DPI data is not usable as-is and has to be tweaked in order to make sense of it.  I'll save that debate for another day because it's irrelevant here.  Here is the data for the 2 years leading up to 17-18, which was the first year of MVP.

So this is eye-opening because, as I expected, the 1.5% and 1.9% claims were nothing remarkable.  In fact, in the prior year, the gains were 4.6% and 3.2%.  So the "rate of change" declined.  The 1.5 and 1.9 were nothing to brag about.  Plus, take into account that MVP adoption adherence is wildly irregular across the county AND the fact that parents are hiring tutors when their kid suddenly has become a math un-wiz and can't do problem #1 on homework.  There are always extenuating factors which contribute to the data.

When I confronted WCPSS leaders about this in a face-to-face meeting on April 8, 2019, they admitted the misleading nature of the graphic from MVP, but claimed their own chart was intended to say, "The numbers were actually up, when usually we have a decline when introducing a new curriculum."  So they were just pleased the numbers didn't go down!  What a low bar to set in an experiment conducted on our children.  So let's think this through... Our performance is increasing - let's introduce a new curriculum which we know will bring it down!

In a separate communication, WCPSS admitted the cost of "free OER resource" MVP to be $1.4M.

So, here is the updated graphic.  At that meeting, WCPSS said they would ask/demand that MVP take down the Twitter post which misrepresents the WCPSS chart.  It's still there as of 4/11/19.

UPDATE 4/24/19:  The tweet has been removed from the MVP website!!!  Thank you WCPSS for doing what's right!  Now if only Chapel Hill - Carrboro would do the same.




Author: Blain Dillard

Sunday, March 17, 2019

Guest Author: Berkeley Parent Advocate Shares MVP Experiences

Wake parents are partnering with Berkeley parents as well as those from Anacortes, Washington, to learn about their experiences in advocating for math education for their students impacted by MVP.  In this article, Berkeley High School (California) parent advocate and Wake MVP Parent guest author Donna Storey tells the history of BHS's MVP journey.  

    Cliff Notes Version

    1. Berkeley High School implemented MVP beginning in 2015, hoping to close its long-standing achievement gap.
    2. Students grew frustrated and confused, losing confidence in math.  The school system blamed the children.
    3. The school system attempted to hide data, but was eventually required to release it by a California Public Records Act request.
    4. D/F grade rates increased, even though 54% of surveyed families reported hiring private tutors.
    5. Parents created support websites and tutoring programs.

    Main Article


    In the fall of 2015, the Berkeley High School math department launched a brand-new Common-Core-aligned curriculum for the class of 2019: the Mathematics Vision Project.  Many hoped this new approach to teaching math would help close the long-standing achievement gap at the school.

    I was serving as the secretary of the Academic Choice Advisory Council, a sort of PTA for Berkeley High’s largest “small learning community.” As early as the fall of 2015, parents were coming to our monthly meetings to express concern that their children were struggling with the new curriculum. With no textbook and little guidance from the teachers, students found homework confusing and frustrating. Students who had once excelled in and loved math lost confidence in their abilities. During information nights, math department representatives shared very little information about the new curriculum, but merely assured parents that everything was on track and this new approach would end up being a great improvement. Many parents and students felt equally confused in the second year of MVP, but most assumed it must be their child’s personal issue.

    By the winter of 2018, the class of 2019 was now taking the junior-year MVP class called Math 3. The D/F rate for these students was reportedly at an all-time high. Although my children had both graduated from Berkeley High, I heard stories from parents of juniors that made me very concerned.

    I requested the D/F rates from the Berkeley Unified School District by means of the California Public Records Act and received the following information, updated March 7, 2019:

    Aggregate D/F Rate for MVP Math Courses at Berkeley High by Semester

    Course     2016-17 (S2)       2017-18 (S1)    2017-18 (S2)    2018-19 (S1)
    Math 1     28.4%              25.4%           28%             26.3%
    Math 2     25.2%              26.7%           26.9%           23.8%
    Math 3     not yet offered    18.1%           20.7%           6%**

    A request for more detailed math performance data in the years before MVP was denied by the Berkeley Unified School District, but the “Update on Common Core Mathematics” (item #12.2) presented to the Superintendent of the Berkeley Unified School District on March 22, 2017 contained the following data, which confirms that there has been a notable increase in D/F rates since the introduction of MVP in the 2015-2016 school year:

    TABLE 1: D/F Rates for all 9th grade math students semester 1 (Algebra I or Geometry)
    Academic Year           All students
    2013-2014                15%
    2014-2015                17%

    TABLE 2: Math 1 D/F rates over 1.5 years (this does not include Advanced Math 1 or Math 1X) [first year of MVP at Berkeley High School]
    Term                           All students
    Sem 1 15/16                         16.7%
    Sem 2 15/16                         19.4%

    Teachers and administrators continued to reassure parents all was well and they were handling any issues behind the scenes. Students of means relied on private tutors (54% of the 132 parents who took a community survey reported they’d hired a private tutor). Other concerned parents came together to create support websites and encourage the math department to make videos to explain concepts to students, although teachers did not emphasize the availability of this material. A group of parents started a free Sunday afternoon tutoring program to supplement after-school tutoring by teachers at Berkeley High. The Berkeley Unified School District hired a math coordinator to oversee math instruction in the Universal Ninth Grade program beginning with the class of 2022 to provide students with more support. We are also requesting that the District provide an update to the 2017 report to give the community vital information on student performance.

    MVP is now in its fourth year and many students and parents at Berkeley High are still struggling and feel their challenges are not being taken seriously. When a parent group from Wake County, North Carolina reached out to this Berkeley High parent advocate, the issues they were experiencing with the launch of MVP were sadly familiar: student frustration and confusion, with many losing confidence in math; a lack of data on student performance, so that the impact of MVP was hidden from the community; and assurances that MVP was a cutting-edge curriculum and any problem was the fault of students.

    I hope that students and families in all school districts who are using MVP can connect, share their stories, and work to open a dialogue with MVP and their school districts. MVP is a new curriculum, and no matter how carefully designed, the reality of student experiences must be taken into account in order to best serve our kids. For example, EdReports found that MVP did not meet expectations for differentiated instruction for diverse learners and only partially met expectations for gathering information about students' prior knowledge and providing guidance for remediation. We must not take MVP as a perfect program that cannot be changed or challenged, but rather as an experimental approach that will benefit from student feedback and careful monitoring by teachers and administrators. Indeed, one of the unfortunate side effects of the lack of transparency and dialogue is that teachers and administrators also suffer unnecessary stress when they take a defensive rather than collaborative position in relation to the community.

    I sincerely hope we can work together to support math success for all of our students.

    **I will be encouraging the community to ask for more information on the dramatic decrease in D/F rates in Math 3. We do know that the Math 3 staff was changed this year, and two teachers known to be especially demanding were reassigned to other courses in 2018-2019.

    Wednesday, March 13, 2019

    MVP Math Claims about Gains in Chapel Hill - Carrboro Debunked!

    This is a long post, but worth the read if you really want to understand how parents and other MVP adopters have been duped into believing the Chapel Hill - Carrboro City (CHCC) school system performance data about the MVP curriculum.  This article will debunk that data unequivocally using data from the NC Department of Public Instruction.

    Cliff Notes Version

    1. CHCC / MVP published performance data that shows a consistent, year-after-year positive trend.
    2. In reality, several segments of the data are conveniently incorrect and do not match what the NC Department of Public Instruction publishes.
    3. Using the corrected data, the "improvements" are no better than Wake County's over the same period and worse than NC's overall.
    4. Subgroup gains were insignificant and dubious compared to NC.
    5. As parents and taxpayers, you should be deeply concerned about how decisions are made in your school system based on such misrepresented results.

    The Gory Details

    Background

    In the March 1, 2019, Green Hope High School MVP FAQ, Wake County (NC) Public School System (WCPSS) claimed, "The Chapel Hill - Carrboro City School System has been using MVP since 2015 and have shown gains in proficiency and closing of the achievement gap." Additionally, on September 22, 2017, the MVP Facebook page posted the graphic shown.

    While Wake MVP Parent is generally suspicious of MVP improvement claims due to inconsistent implementation and poor results in Wake County, we felt it was worth looking behind the graphic to see what makes up these numbers, since these results seem to be such a point of pride for MVP apologists.

    Fortunately for those of us in NC, this data is publicly available on the Department of Public Instruction's (DPI's) testing results webpage.  We encourage you to behold the results yourself.  But we've done the work for you.

    Let's recap what this data is telling us.  At the end of each Math 1 course, NC administers a standard end-of-course (EOC) exam to those students.  The scores are captured along with thousands of other data points about public school results.  Data is available down to the district and school level, and for 12 subgroups such as race, gender, economic status, and English language proficiency.  For each student, the result of the exam is a score of 1 through 5.  If a student scores 3-5, they are considered grade level proficient (GLP).  If they score 4-5, they are considered college and career ready (CCR).  This graphic about CHCC is addressing the CCR results only and is measuring the percent of exam takers who are considered college and career ready.
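
    To make that scoring rule concrete, here is a minimal sketch in Python; the level cutoffs are as described above, but the sample scores are made up:

        def is_glp(score):
            # Grade Level Proficient: achievement level 3, 4, or 5
            return score >= 3

        def is_ccr(score):
            # College and Career Ready: achievement level 4 or 5
            return score >= 4

        scores = [2, 3, 3, 4, 5, 1, 4]  # hypothetical exam takers
        print(sum(map(is_glp, scores)) / len(scores))  # GLP rate: ~0.714
        print(sum(map(is_ccr, scores)) / len(scores))  # CCR rate: ~0.429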

    MVP was implemented in the 2014-15 school year for CHCC.  This data begins the prior year, 2013-14, before MVP was implemented in CHCC.  The following 3 years, ending with the 2016-17 school year, seem to show a remarkable improvement, with each year increasing steadily, ostensibly due to the application of MVP.  Note that my home school system, WCPSS, did not begin implementing MVP until the 2017-18 school year.

    Comparing Portion Improvements 

    The first seemingly misleading number on this chart is the 27% increase shown in blue.  You have to read this carefully, because this is not the actual increase in the portion of students who are CCR.  The actual increase in CCR performance - according to this graphic - is 17 points, which is 79% - 62%.  The 27% is actually reflecting the percent increase in the percent: the 62% portion increased by 27% to get to 79%.  62 x 1.27 = 78.74.  OK.  No problem.  This is an important comparison approach we will use later.
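
    In code, the distinction between the two measures looks like this (a minimal sketch using the graphic's 62% and 79% figures):

        def point_change(old, new):
            # Absolute change, in percentage points.
            return new - old

        def portion_change(old, new):
            # Relative change of the portion itself - what the graphic reports.
            return (new - old) / old * 100

        print(point_change(62, 79))              # 17 points
        print(round(portion_change(62, 79), 1))  # 27.4, i.e. the "27%" shown in blue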

    2014 Subgroup Data Completely Incorrect

    Now, here's where the chart integrity falls apart.  I went and actually looked up all this data.  I downloaded the 4 spreadsheets corresponding to each year's drill-down of data. First of all, I was pleased that the data available at DPI matched exactly with the 2015-17 subgroup data provided in the CHCC graphic (green box).  This meant that the graphic author and I were indeed using the same publicly available official data. However, the 2014 data did not match at all.  Each and every number from DPI was different from the CHCC data.  I show the corrected data in the table above left.


    2014-2015 Summary Data Conveniently Incorrect, Misrepresenting Trend

    Then I looked at the data for ALL students - in other words, the numbers which made up the summaries at the top of the graphic.  The DPI data had an additional decimal place, and I am showing here that the 2016-17 numbers matched (green box).  Good job!  However, the 2015 number, which the graphic claimed was 70%, was actually 78.4%.  Uh oh.  I thought that this 70% was perhaps a typo (or wishful thinking) on someone's part, because the subgroup data for 2015 matched perfectly as noted previously (see green cells below).  How could someone have gotten the All Students data point wrong (red, below) when it's on the SAME ROW as the subgroup data, which was correct?

    ACTUAL 2015 SPREADSHEET FROM DPI, file = accdrilldwn15.xlsx (color coding mine)

    Likewise, here is the ACTUAL DPI data from 2014, file = acctdrilldwn14.xlsx (color coding mine)
    The 71.5 is shown in red, as are the subgroup data points; each and every one of them is shown incorrectly on the CHCC graphic.

    So let's recap.  The original CHCC graphic data shows an increase EACH year, from 62% all the way to 79%.  But in reality, the numbers start at 71.5% pre-MVP, then go up to 78.4% in 2015, then BACK DOWN to 76.3% in 2016, then back up for 2017.  In reality, the total increase was 7.9 points (71.5% to 79.4%) and the portion increase was a modest 11%.  That's a very different story than was told on the CHCC MVP graphic, because the starting point pre-MVP was not nearly as low as the graphic stated.  Check out the bar chart.  The "Published - WRONG" data sure does look a lot more appealing than the CORRECT data, but the CORRECT data did not fit the narrative that MVP is an obvious contributor to student math performance improvement.  Were these incorrect data points used accidentally or intentionally?  Did MVP create the graphic, and did CHCC sign off on it?

    No Better than Wake, Worse than State

    Let's continue, shall we?  A 7.9-point increase and an 11% portion increase are still good, right?  I mean, this is due to implementing MVP, so it must be good!  Having been trained using that "old math" back in the '80s, I learned to look at data in context.

    Since I had the 4 data sets open, I also looked at the Wake County and State of NC data.  Wake County, which did not implement MVP until the 2017-18 school year, somehow was able to achieve the same portion-level increase as CHCC, which did implement MVP!  But let's look at an even bigger picture: the great State of NC.  Miraculously, NC overall was able to boost her CCR data by 7.2 points, almost as much as CHCC.  But, when we look at portion increase, NC went up by a whopping 15% during this same span!  That's because an increase from 46.9 to 54.1 (7.2 points) is really more impressive than an increase from 71.5 to 79.4 (7.9 points).  So if CHCC had just not implemented MVP, perhaps they would have ridden the coattails of NC to even higher gains during this same period!
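
    A quick check of that portion-increase comparison in Python, using the DPI figures quoted above:

        portion_increase = lambda old, new: (new - old) / old * 100
        print(round(portion_increase(71.5, 79.4), 1))  # CHCC (with MVP): 11.0%
        print(round(portion_increase(46.9, 54.1), 1))  # NC overall: 15.4%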

    NC Subgroup Gains were also Good

    So what about all those big subgroup gains?  Well, the NC data is not too shabby.  The state showed impressive gains in each and every subgroup, as did CHCC.  And let's not forget, Chapel Hill - Carrboro City school system has the 2nd worst achievement gap in the nation.  So there's nowhere to go but up for them.  By the way, the #1 gap in the nation is Berkeley, another MVP adopter. 


    Subgroups Performing Better/Worse than Others

    Continuing onward...  Now, this is where I may be going too far in drawing a conclusion, so let me just present the data.  If we look at the portion increases of the subgroups for NC versus CHCC, we have the table to the right.  The subgroups where NC did better than CHCC: All, Asian, and White.  The groups where CHCC did better than NC: Economically Disadvantaged (EDS) and Limited English Proficiency (LEP).  These results present a real conundrum for people who endorse MVP and the graphic posted on the MVP Facebook page in September 2017.  First of all, what the White and Asian subgroups have in common is that they are the largest demographics taking Math 1 in middle school and honors classes in high school.  In WCPSS, this is where we have seen MVP adopted more strictly.  Did CHCC have the same implementation inconsistency?  And the conclusion that the MVP curriculum - which is heavily dependent on reading proficiency and a facilitation-and-discovery teaching style - is somehow dramatically better for the EDS and LEP subgroups is a stretch beyond my wildest imagination.

    Conclusion

    So, in conclusion, if you've read this far, I encourage you to go check this data for yourselves.  Links are provided.  

    I don't know if CHCC or MVP authored the graphic.  I hope that this post will lead to that answer.  I tried to give the author of the CHCC graphic the benefit of the doubt.  I tried to find the cited wrong data somewhere in the official data sets - perhaps explaining a typo or a copy from the wrong row.  I tried to see if there was a standard adjustment made which could have explained the wrong data shown.  But I just couldn't, being faced with the real official DPI data in black and white.  Frankly, the 2015 summary results (70%) and all the 2014 data seem contrived, if not fabricated.  This graphic was and still is used to partially justify WCPSS's adoption of and confidence in MVP.  But the data shown from Chapel Hill - Carrboro was just too convenient and tidy to be true.  An increase every year in every category?  No context considered?  Consider this MVP claim DEBUNKED.

    Update 7/31/19:

    Around 4/24/19, I noticed that the aforementioned graphic was removed from MVP’s Facebook page.

    To this day, I have not been able to determine if MVP or Chapel Hill created the graphic and no one will explain the mismatches.  Here are the steps I took in trying to get to the truth:

    • 3/13/19: Wrote this blog article explaining my findings and documenting my sources.
    • 3/13/19, 9:23 PM: I contacted Chapel Hill, informing them of the blog I wrote, and asked for an explanation of the mismatches.  (Email 1)
    • 3/15/19, 4:25 PM: Chapel Hill wrote back and said "The CHCCS does not have any affiliation with the MVP Facebook account or data represented on their site." (Email 2)
    • 3/15/19, 6:04 PM: I wrote Chapel Hill back and challenged them to look into it and correct the record.  (Email 3)  No response.
    • 3/15/19, 9:23 AM: I also wrote WCPSS informing them of the findings, since they so highly valued CH's results in their praise of MVP. (Email 4). No response.
    • 3/19/19, 7:24 AM: I then wrote MVP and showed them the article I wrote and asked them if it was fabricated.  (Email 5)  No response.

    Author: Blain Dillard

    Thursday, March 7, 2019

    Myers-Briggs and Why MVP Doesn't Work at Scale


    I am no professional educator like the elite experts who created and/or selected this Mathematics Vision Project (MVP) curriculum for my school district - Wake County Public School System - but I have been through 6 years of undergrad and grad school, and countless hours of corporate training in my 30+ year career.  I even taught a few classes myself at the college and professional level.  So I'm no stranger to the education process, and like most parents, I've got some common sense, too.

    Cliff Notes Version

    Setting aside the debates about "integrated math vs. sequential math curricula" and "procedural fluency vs. conceptual learning," I think the crux of the problem with MVP in WCPSS can be explained with these 3 observations about education mechanics:
    1. Facilitation as a teaching style is very difficult to do well, especially for math personality types
    2. Facilitated discovery as a learning style does not match all students' personality types
    3. Facilitation as a primary teaching style is not a good fit for teaching math

    The Gory Details


    Introduction to Facilitation

    This may be oversimplifying it, but I think teachers are called to their field because 1) they care for children and 2) they have a knack for explaining things.  With MVP, the approach to pedagogy (the study of how knowledge and skills are exchanged in an educational context) has shifted from direct TEACHING to more FACILITATION.  With facilitation, EXPLAINING THINGS is removed from the equation.  The job of the facilitator is not to explain things, but to coerce and manipulate students into EXPLAINING THINGS or figuring things out for themselves.  This is especially challenging in MATH, where such a large component of MATH education involves explaining and understanding concepts never before encountered.

    Facilitation not a Good Fit for All Teachers

    I've been a student of excellent facilitation and I've tried to do it myself.  It is hard.  It requires psychology and a keen sense of observation of human interactions.  It is not a natural method for most people, and probably more difficult for those of us who are among the Sensing-Thinking-Judging personality types of the Myers-Briggs Type Indicator (MBTI) test - you know, like many scientists, mathematicians, and engineers.  On the other hand, facilitation as a profession may be better suited for people who are on the Intuitive-Feeling-Perceiving quadrants of Myers-Briggs.  These are just generalities, but corporations across America have used MBTI for years to help over 50 million employees do their jobs better.
    Note: The Myers–Briggs Type Indicator is an introspective self-report questionnaire with the purpose of indicating differing psychological preferences in how people perceive the world around them and make decisions.  I expect that many parents in the Triangle area are familiar with MBTI through professional training in their jobs.    

    Facilitation is not a skill set that can be taught in a 4-day (or 4-month) class to math teachers who, in general, have personalities which are not well-suited to leading teenagers in a pure facilitation experience 5 days a week.  I concede there may be a few MVP teachers around Wake County and other parts of the US who can do it satisfactorily 5 days a week, but in general, pure facilitation of MATH is NOT a transformation that is scalable to all math teachers.

    Additionally, the learning styles of students which work best in that type of model are unique to some students. How many students will continue to ask questions when the response from the teacher is another question, followed by another question, followed by another question? Only some. Others will pipe down and stop asking questions. My son did. So I will never say that MVP doesn't work for ANY students and ANY teachers. I think there are some combinations that might work. I'm saying that AT SCALE, and for ALL, if done AS INTENDED and AS DESIGNED - IT CANNOT WORK. And that is why we have intense pockets of complaints and random pockets of contentment around the county.

    Facilitation & Discovery Learning not a Good Fit for All Students

    Likewise, facilitation is not a one-size-fits-all method that works for all students.  HOW we learn is a function of the personality type of the student.  Forcing teachers and students into a teaching/learning model based largely on teacher facilitation and student discovery goes against the personality preferences of many teachers AND many students. This is neither EQUITY nor a recognition of the DIVERSITY of teaching/learning preferences.

    For most teachers, who at heart just want to use their own tools (which may or may not include facilitation) to educate their students, it is a square peg in a round hole.  Forcing teachers who are not predisposed to this model into it will inevitably lead to career dissatisfaction and attrition.  Teacher training is not the problem.  And student learning is not the problem.


    Facilitating Math is an Oxymoron

    Next, let's consider whether facilitating the subject of mathematics even makes sense.  According to the Journal of Extension, "Although teaching and facilitating are not mutually exclusive processes, each method has a set of characteristics that distinguishes it from the other."  Look at this chart comparing teaching and facilitating, and think through each item with MATH in mind.  I've highlighted some of the key differences particularly relevant to math.  I'm not down on facilitation as a practice.  It can be an excellent way to engage a group to solve a problem that has issues and perspectives and politics and points of view, and where the answer can be "it depends".  Is that the best model for teaching and learning MATH, where the answer is either right or wrong?  NO!  Indeed, there may be more than one way to solve every math problem.  Do we want to sacrifice learning at least one way that works at the expense of having a facilitated discussion about all the possible alternatives the student group can come up with?  Definitely NOT!  Can students learn significant mathematical concepts and skills from facilitation?  UNLIKELY!  Can students discover processes and concepts for which they do not have the prerequisite skills, like Pascal and Newton and Euclid did?  DOUBTFUL!  It's not to say that these students will never reach such heights, but we're in high school math.  We need the math teacher - not the students - to teach math.
    MVP apologists tout these methods as THE solution to all that will ail students in their future higher-education and workforce selves.  Yet, in college, it is highly unlikely you would encounter such teaching methods until senior- or graduate-level courses, if ever.  And it is certainly unlikely that would be in a class whose purpose is to establish the basic foundations of quantitative subjects like MATH 1, 2, and 3!  Where you see pure facilitated instruction is in the social sciences or professional continuing-education training.  Facilitation requires students who are mature enough to be facilitated and a subject suitable for debate, alternatives, discussion, opinion, flexibility, grey areas, and emotion.  Sensing-Thinking-Judging personality types + facilitating + teenagers + high school math = an equation that does not add up to anything but frustration for both teacher and student.

    Too Many Square Pegs and Round Holes to Scale

    All of the above doesn't mean that I am against a math teacher using facilitation skills to engage a class during a lesson.  What I am against is a prescriptive approach where teachers who are ill-equipped to be facilitators are mandated to teach a subject which is ill-fitted for facilitation to students whose personalities may or may not be receptive to such a style.  Otherwise, what we have is a square peg IN a round hole teaching about a square peg WITH a round hole to a square peg IN a round hole.  That is why MVP cannot scale as is, and therefore, since WCPSS has already scaled it, that is why it is failing.

    Author: Blain Dillard