Saturday, September 14, 2019

Let Wake County Teachers Create World Class Math Curriculum for NC

It's been a while since I've blogged, due to pending litigation against me by my county's math curriculum vendor, MVP.  The lawsuit is still in progress, and my attorney filed a response in a Utah court on September 9, 2019.  All those documents are at the link above.  I'm not going to allow MVP to silence my First Amendment rights, and I hope others will do the same and continue to speak out.  As my attorney, Jeff Hunt, wrote in my response:
This case is about a company attempting to use the judicial process to punish a parent who dared to voice reasonable concerns that the company’s educational program was not beneficial to his child and other similarly situated children. Instead of addressing such concerns in a productive dialogue, the company is seeking to silence them outright. But it is a parent’s obligation, right, and privilege to take action and, in this case, speak publicly to government officials and institutions and to other interested parents about matters of such important public concern as the well-being and proper education of children. Moreover, the Supreme Court has time and again emphasized that commentary like the statements at issue here—issues of public importance—“occupies the highest rung of the hierarchy of First Amendment values, and is entitled to special protection.”
With that said, here goes...

Please ponder these questions from a Wake County teacher: 

  • Several teachers have spoken for MVP because they didn’t have a curriculum before MVP, and they say that if MVP is taken away they won't have a curriculum to use. So the question is: what in the world were they teaching with before MVP?
  • Wake County has some of the best math teachers in the state. Why not instead get those top-performing teachers to pool their resources to create a curriculum?
  • Why does Wake County trust a curriculum written by folks from Utah (with a different standard course of study) more than it trusts its own teachers here?

My Proposal

Indeed.  I would like to share a radical proposal.  I work in the IT industry and we use this notion all the time:  Use what we sell and sell what we use.  A private company recognizes its most valuable resources are not the property it owns or the products it sells, but THE PEOPLE who work for that company.  Why should government agencies - such as a school system - think any differently?

So here’s an idea for Wake County Public School System (WCPSS) to address 3 current problems at once: 1) continued budget expansion, 2) underpaid teachers, and 3) pouring more money into a widely unpopular and problematic secondary math curriculum.

By my estimation, WCPSS must have nearly 600 math teachers teaching with MVP.  Before MVP, every school and every teacher had curriculum resources they used to teach Math 1, 2, and 3.  Those resources are still available on Google Drive or on teacher laptops. So the resources for an excellent, mostly-matched-to-standards curriculum exist in aggregate across this very large county.  With some level of adult coordination and project management, WCPSS could invest money in making robust system-wide math curriculum resources, including:

  • A database of high-quality class lessons which map to state standards and are made available to students (and parents) for use after class
  • A math problem bank (some with and some without worked examples) which could be used by teachers or students for class work, homework, quizzes, and tests.  
  • Problems could be mapped to lessons (which are mapped to standards). 
  • Problems could be rated for difficulty which would allow teachers to build assignments and assessments appropriate for scaffolding. 
  • This could include MVP problems which are utilized during the appropriate time at the teacher’s discretion.
  • Continued refinement or adjustments to standards changes year after year.
  • Teachers could use their own creativity to deliver the material using methods best suited for their style and students’ needs.
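The resource bank described above is, at its core, a small relational data model: problems map to lessons, lessons map to standards, and difficulty ratings drive scaffolding. Here is a minimal sketch; all class names, fields, and the standard code are my own illustration, not an actual WCPSS system:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    title: str
    standards: list  # state standard codes this lesson maps to

@dataclass
class Problem:
    prompt: str
    lesson: Lesson        # problems map to lessons; lessons map to standards
    difficulty: int       # 1 (intro) .. 5 (challenge), used for scaffolding
    worked_example: bool = False  # some problems include a worked solution

def build_assignment(bank, standard, max_difficulty):
    """Pick problems targeting a standard, capped at a difficulty level."""
    return [p for p in bank
            if standard in p.lesson.standards and p.difficulty <= max_difficulty]

# Tiny demonstration bank (hypothetical standard code for illustration)
lin = Lesson("Solving linear equations", ["NC.M1.A-REI.3"])
bank = [
    Problem("Solve 2x + 3 = 11", lin, difficulty=1, worked_example=True),
    Problem("Solve 5(x - 2) = 3x + 8", lin, difficulty=2),
    Problem("Solve |2x - 1| = 7", lin, difficulty=4),
]
easy_quiz = build_assignment(bank, "NC.M1.A-REI.3", max_difficulty=2)
```

With difficulty filters like this, a teacher could assemble a quiz for strugglers or a challenge set for advanced students from the same shared bank.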

The results would be truly high-quality resources, better-performing students, and increased teacher satisfaction and buy-in.  ALL teachers would have access to ALL resources for ALL students. Problem difficulty ratings would ensure ALL students are met where they are and can be challenged to go higher. Assessments would be fair because they would only contain problems within the realm of what is expected.  Parents would have resources (notes + examples) to help students if needed.

The investment could be in the form of tooling (software) and labor (paying teachers lucrative bonuses to contribute to the project).  Save the millions spent on one-time-use MVP workbooks and the teacher re-education required to teach/facilitate using discovery methods, and shower that money on the excellent teachers who know best how to teach Wake County students. Even paying 100 math teachers $50 per hour for 40 hours each would be a fraction of what we are spending on MVP annually.  And the result would be one of the best math curriculums in the country. It’s a win for taxpayers, teachers, and students - and common sense. 
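For a sense of scale, that labor figure is simple arithmetic; the post cites only "millions" for annual MVP spending, so no comparison figure is assumed here:

```python
# Back-of-envelope cost of the teacher-authored curriculum project described above
teachers = 100          # math teachers contributing
rate_per_hour = 50      # dollars per hour
hours_each = 40         # hours per teacher

labor_cost = teachers * rate_per_hour * hours_each
print(f"One-time labor cost: ${labor_cost:,}")  # One-time labor cost: $200,000
```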

And not to get ahead of myself, if we had such a WCPSS-created curriculum, couldn't we charge a nominal fee to other NC counties for access?  Or perhaps petition the State for additional funds to maintain the curriculum based on State standards?  Wouldn't the State of NC welcome this as a cost saving solution?

Hammer Time or Something Else?

I know I have over-simplified this proposal, and there will be skeptics.  This is intended to be a conversation starter, not a step-by-step implementation plan.  Some will say that MVP was purchased in the hope it would improve critical thinking in mathematics, in response to an audit done in ~2016-17 showing poor results or trends.  But what actions are Wake County school admins and parents taking to shore up gaps experienced with MVP?

  • extra tutoring resources at school
  • extra tutoring resources at home for parents who can afford it
  • a WCPSS website with the beginnings of what I am proposing here
  • links to videos with lessons
  • references to websites with direct instruction and examples
  • allowing some (not all, apparently) teachers to use their own resources to supplement MVP when needed

Aren't we going full circle with these actions / remediations?  Why purchase a resource which was supposed to be an end-to-end, stand-alone curriculum to ostensibly make things better, when you must use the pre-existing resources to shore up gaps?  What have we, as a county, gained by this - other than frustration?

By replacing existing or former curriculums with off-the-shelf, end-all-be-all curriculums, WCPSS leadership has fallen for the age-old phrase "if all you have is a hammer, everything looks like a nail."  They have incorrectly viewed the problems of declining performance (whether in math or other subjects) as fixable only by replacing curriculums with wildly different solutions.  This groupthink continues to be perpetuated by WCPSS leadership as justification for MVP or any other new idea that comes down the pike.  MY OPINION is that our prior curriculums were fine - excellent, in fact - though they perhaps needed some organization to make them more accessible.  I think our leadership has incorrectly diagnosed causation, and jumped ahead to feel-good, tweet-worthy edufad actions for the sake of "doing something."

Wednesday, July 24, 2019

Look No Further! Even More Analysis of MVP Ground Zero in American Fork, Utah

Mark Twain popularized a saying he attributed to British Prime Minister Benjamin Disraeli: "There are three kinds of lies: lies, damned lies, and statistics."  I have frequently debunked data showing supposed MVP "successes" by showing how it was used out of context (see Why My MVP Golf Score Improved, and other Exaggerations), or was outright wrong or falsified data (see MVP Math Claims about Gains in Chapel Hill - Carrboro Debunked!).

A few months ago I wrote about math proficiency data found in Utah in MVP Ground-Zero Math Performance Data Exposed, and it Ain't Pretty: An Analysis of American Fork Junior & Senior High Math Trends.  Recently, the original report I did about these Utah schools was called into question.  The purpose of this article is to elaborate on why I believe this data is golden and should bring grave concern to those hopeful that MVP will ultimately prove to be a success.  As time has passed, I am more and more convinced that THIS is THE most important MVP dataset to examine with care.


As noted in the prior article, American Fork Junior High is ground zero for MVP because this is where the MVP founder, who is also one of its authors, is a math teacher.  Certainly, there can be no other school in our solar system which has a better MVP situation:
  1. Worldwide subject matter & thought leader about MVP as a practitioner of the MVP "craft"
  2. The 9th grade math teachers love the program and buy into it, according to one teacher there ("the materials are excellent")
  3. When in doubt, any teacher certainly can obtain on-demand professional development by merely asking the founder for some pointers.  (Whereas the rest of us poor schmucks have to pay tax dollars to fly the MVP founder and team back to Wake County for more teacher refreshers on "How to Properly MVP")


I decided to take another look at what's going on in Utah.  In my prior article, I examined math proficiency scores available to the public at Utah's state education website.  I looked at data for 3 schools in particular:  

One was the founder's home school (American Fork Junior High).  Another was the high school fed by that school (American Fork High).  Both schools are in American Fork, Utah, which is north of Provo.  The third was Fairfield Junior High in Kaysville, UT, which, as the other article explains, was #1 in a list of 20 junior high schools similar to American Fork Junior High in 2017-18.  Kaysville is north of Salt Lake City.

The two American Fork schools are in the Alpine District, which has 12 junior high and 9 high schools, and ~79k students.   Fairfield is in the Davis District about an hour away, which has 17 and 9 junior and senior high schools, respectively, and has ~72k students.  

By comparison, Wake County Public School System has about 36 middle schools and 28 high schools, with 160k students.  So we are a little bit larger than both of these districts combined, in number of students and middle/junior/senior schools.

It's worth noting that WCPSS is considerably more diverse by race and ethnicity measures, and has a larger percentage economically disadvantaged and English Language Learners student groups than either of the Utah districts examined.

As noted in the prior article, American Fork Junior High School feeds American Fork High School exclusively.  This was told to me by a teacher at the high school.  It makes sense, given that the high school (2365 students) is only slightly larger than the junior high school (1962).  In fact, the school attendance coverage map confirms this, since the American Fork High School map completely contains the American Fork Junior High coverage map.  Portions of students from another junior high school make up the remainder of the American Fork High School population.  It might be worth learning more about the other junior high, but I've learned quite a bit with what is here already.


As a computer scientist and manager with 30+ years' experience in the IT industry, I know how important it is for potential customers to know my company uses its own products.  It is almost impossible to sell something if you cannot point to your own organization as "Customer Reference #1."

There is actually a Wikipedia page which defines this notion, also sometimes referred to as eating your own dog food or dogfooding.  It reads, "Eating your own dog food, also called dogfooding, occurs when an organization uses its own product. This can be a way for an organization to test its products in real-world usage. Hence dogfooding can act as quality control, and eventually a kind of testimonial advertising. Once in the market, dogfooding demonstrates confidence in the developers' own products."

I will give the MVP founder this credit: He has been able to convince his own junior high school to eat his own dog food.  In other words, they use MVP.  I know from writing to teachers at American Fork High School that they do not use MVP, except perhaps very rarely.  On July 23, 2019, I wrote to a Public Relations representative at Alpine School District to ask them these questions:
  1. Does your district, which includes 9 high schools and 12 junior high schools according to your website, use the MVP math curriculum as a matter of policy?  
  2. If not, then why not?
  3. If curriculum decisions are not made at the district level but at the school level, can you tell me which schools of the 21 do use MVP as their math curriculum?
As of this publication date, I do not yet have an answer.  I do expect to receive an answer because this person has been helpful and prompt in the past.  Stay tuned here for an update.

So, here's issue #1 for me - before we even get to data: Why has MVP not been solidly adopted in Alpine School District and perhaps broadly across other parts of Utah?  I understand that this is not their product directly, but the State of Utah has had its hands in the original funding for MVP back in the 2012 timeframe, according to "STEM IS DEAD IN UTAH COURTESY OF THE USOE."  If MVP was widely adopted locally, then I believe we would have heard about it as Customer Reference #1.  But, we have no customer references for MVP other than the ones we have discovered on our own, or from teachers tweeting about attending MVP training.  MVP will not tell us who their customers are, other than to state they have customers in 30+ states.  Why the big secret? 

Here's issue #2 for me: Given the obvious lack of broad adoption of MVP in its home district of Alpine or its home state of Utah (or any other large districts in the country) why would "very large WCPSS" bite off on this venture with a very small company which lacks the staff and a proven ability to scale its product successfully? 


In the prior article, I shared data for the 3 schools discussed above.  In this new chart, I am refining the chart to include additional important context, including each district and the whole state.  Additionally, I am homing in on just 9th grade for the junior high data because in most cases that is when Math 1 is taught, and that is where MVP is used at American Fork Junior High School.

This is an attempt to do a better job of convincing you that this data matters.  Specifically, we have 3 experimental groups:
  1. American Fork Junior High 9th grade which has the best case scenario of MVP
  2. American Fork High which receives the majority of its students from MVP-using American Fork Junior High
  3. Fairfield Junior High 9th grade which does not use MVP, but instead creates its own curriculum
We also have several pseudo-control groups.  These are groups which, granted, do include the experimental groups as well, but the rest of the makeup of the control groups may or may not use MVP.  The assumption is that by and large, they do not.  Control groups are:
  1. All Alpine District Junior Highs - 9th grade
  2. All Alpine District Senior Highs
  3. All Davis District Junior Highs - 9th grade
  4. All Utah Junior Highs - 9th grade
  5. All Utah Senior Highs
This is quite a busy chart, I know.  But please read it carefully to understand what it means.  I've used critical thinking to annotate it to assist with comprehension.

You can draw your own conclusions, but here is what I see:
  1. American Fork Junior High (MVP Home) - fared slightly better (-8.7% decline in 3 year proficiency rate) than Alpine district as a whole (-9.3%). We don't yet know what curriculum the rest of the district is using but that is pending a response from Alpine.   
  2. American Fork Junior High fared worse than Utah overall (+0.5% increase).  I see this as a RED FLAG because we have no indication that Utah at large is using MVP, so MVP may be negatively impacting this one school disproportionately compared to others in the state.  The MVP best-case scenario should have been on par with or better than the state if MVP truly is a superior curriculum.
  3. American Fork Junior High fared considerably worse than a similar school, Fairfield Junior High (+26.6%).  This was pointed out in the prior article, but there I was looking at the blended scores of the junior highs, which would include Math 7 and Math 8.  So when I focus on 9th grade (which would be Math 1 and sometimes Math 2), the results are even more stunning for Fairfield Junior High!  I see this as a RED FLAG for MVP because the contrast is HUGE.
  4. Fairfield fared quite a bit better than the other junior high schools in the Davis District (+11.2%).  Since Fairfield has already reached higher proficiency levels, it may be hard for it to keep improving at the district's pace.
  5. Fairfield obviously is outpacing the state overall (+0.5%).
  6. Davis District is clearly doing something well.  With the exception of Fairfield, they are using the Carnegie Curriculum.  My source tells me they are considering switching to more of a discovery learning model like MVP.  WHY. ON. EARTH??
  7. American Fork High School is fed with MVP students and a few others from the district.  Their results (-32.3%) were noted as abysmal in the prior article.  The other high schools in the district are not faring too well either (-20.9%), but remember the terrible results from the high school are embedded in that 20.9% decline.  It takes more than one bad apple to spoil the whole bunch, girl, but this one is pretty bad.  I see this as a HUGE RED FLAG for MVP.
  8. While all state high schools overall declined slightly (-4%), American Fork High School in comparison is way off the mark.  
  9. Note that when comparing American Fork Junior High to Fairfield Junior High, Fairfield has a more challenging student population to educate in that it has a larger percentage of economically disadvantaged, students with disabilities, and English language learners.  So maybe we should be taking #equity curriculum advice from them instead of MVP.  YELLOW FLAG for MVP.
So let's recap those RED FLAGS comparing 3 year change in proficiency rates:
  1. American Fork Junior High (-8.7) vs Utah (+0.5) = NET -9.2 WORSE
  2. American Fork Junior High (-8.7) vs Fairfield Junior High (+26.6) = NET -35.3 WORSE
  3. American Fork High (-32.3) vs Utah (-4) = NET -28.3 WORSE
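The "NET" figures above are just the difference between each school's 3-year change in proficiency rate and that of its comparison group; a quick check of the arithmetic:

```python
def net_change(school, comparison):
    """Difference in 3-year change in proficiency rate (percentage points).
    Negative means the school did worse than its comparison group."""
    return round(school - comparison, 1)

print(net_change(-8.7, 0.5))    # -9.2  (AF Junior High vs Utah)
print(net_change(-8.7, 26.6))   # -35.3 (AF Junior High vs Fairfield)
print(net_change(-32.3, -4.0))  # -28.3 (AF High vs Utah)
```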
To provide a little more detail, I did pull the actual test results for the 3 schools in question.  There was one anomaly in the data with the Math 3 scores for 2017.  The number of students assessed dropped considerably.  I saw this in other data in Utah as well, and have written them asking for an explanation.  So those data points are shown in red text below.

The table above color codes cells to attempt to show "cohorts" of students.  In other words, the yellow group takes Math 1 at American Fork Junior High in 2015, then Math 2 at the senior high in 2016, and Math 3 in 2017 (that is the suspicious data point).  The cyan and dark pink cohorts look more useful in that the number of students looks consistent.

You could probably find some other schools similar to American Fork Junior High which are not using MVP and which are doing worse.  I looked at this set of 3 schools due to one being the MVP home base (American Fork Junior High), and the other two having a data relationship with it: one based on student population progression (American Fork High) and the other on similarity (Fairfield Junior High).

I have no way of knowing the true cause of the steep decline in math proficiency at American Fork High School.  Is it that students coming in with 1 year (or 2 years, in the case of 68 students in 2016) of MVP in junior high school lack foundations, and that this impacts their high school performance in Math 2 and 3?  Or is the Math 2 and Math 3 curriculum at the high school so bad that it is disproportionately destroying these students' STEM potential?  Something is definitely wrong.

Alpine School District has a long history of trail-blazing experimental math programs, such as ones based on "investigations math" or "constructivism."  It is right next to Brigham Young University, the source of at least one MVP founder. According to BYU's Math Education Department Fails Students, "BYU has for several years been the hotbed of new-age education fads.  Constructivism was pushed into Alpine School District thanks to the people at BYU."   In Investigations Math Summary, one parent gives background and plenty of resources.  Just search on the word "alpine."

I have tried to avoid confirmation biases I might have about this high school, but I can't help but wonder, which is my issue #3: I cannot for the life of me explain their steep decline in math proficiency other than to recognize that they are largely receiving students who went through MVP Math 1 (and sometimes Math 2) in junior high school.  Either that, or a major change at the high school that has gone uncorrected for 3 years.  My concern is that our WCPSS Math 1-2-3 MVP alumni will meet the same fate when they reach Trigonometry, Calculus, and/or college level math. 

I want to stress that this is not a statistical analysis and article.  This is a common-sense analysis and article.  It doesn't calculate margins of error and confidence levels.  I have no control over these experiments.  The samples which make up this data are not completely known, other than what people have told me and what I can infer from the data.  I am just a parent reading data available to the public, created in a state far away, about an experimental fad program which has been foisted upon tens of thousands of students per year and 50+ schools in Wake County, NC, much to our dissatisfaction.

You draw your own conclusions, and if you can find an MVP silver lining in this data, I'd like to hear it.

Thursday, June 13, 2019

Instructional Update Reveals MVP Guidance to WCPSS Teachers - No Wonder There's Confusion and Failure!

Parents have been asking since February for clarity about what goes on in the MVP classroom and about the role of direct instruction, supplementation, etc.  WCPSS glossed over it in the Green Hope-only MVP FAQ they provided in March, but parents have continued to hear from students and teachers descriptions of a classroom experience which still resulted in a lack of learning and failing grades.

That input from students, parents, and teachers, led me to sketch this image of the spectrum of how MVP is used.
WCPSS staff, who will not engage in conversation with parents because they're hiding behind their lawyer, have said or written nothing to me to correct or clarify my assertion.  They broke yet another promise to deliver all things related to MVP understanding on June 7 by providing only a 1.5-page response containing MORE promises.

If it were not for teachers, parents would know very little of what is going on behind the school house doors.  I recently received an instructional update from someone at one of Wake County's 60+ MVP-adopting schools.  This conveys guidance received regarding how MVP is to be handled in class with respect to other instructional methods.

  • We have never said you can't supplement.  Teachers are expected to differentiate and supplement for students as needed.  But it must be done while staying true to the MVP structure. 
  • MVP must be used as the primary source in math instruction.  Students need to experience the productive struggle to form stronger math understanding and skills. Supplement with other materials as appropriate.  Examples:
    • Notes after the MVP lesson to highlight the key vocabulary from the lesson and for students to use as study guides (students copy or you provide guided notes they can glue into a notebook)
    • If a lesson requires prior knowledge (example: specific vocabulary or skill), and that is not the focus of the lesson, but you know the students have never learned that vocabulary/skill, you may use supplemental materials to ensure they have the knowledge needed prior to the lesson
    • If a standard is not found within any lessons, you must supplement to ensure students are taught that standard
    • You should not be providing supplemental packets prior to lessons where the material found in the packet is what the students will be discovering during the unit’s lessons
    • You should be having students work in small groups often/daily
      • As you walk around checking in with groups, you should facilitate and guide their discussion
      • If the majority of the class seems stuck, there’s no reason you can’t redirect their attention to you and you guide them as a whole group until they are “unstuck” then let them continue in small groups
      • You should not sit back and wait for them to ask you for help; you are there to ensure the discussions happen appropriately; ask probing questions or give reminders as needed 
      • If you have behavior concerns: 
        • assign the groups strategically
        • give them less time in groups
        • remind them of the rules/expectations often
        • assign roles within the group
        • sit with a group or hover in strategic areas, etc.
        • contact parent, reach out to the appropriate case manager and/or administrator 
    • If an activity focuses too heavily on a non-standard skill, alter the lesson by demonstrating that activity rather than having students take excessive time on a non-standard skill

On one hand, that is the most specific insight parents have had related to the MVP in-class guidance.  On the other hand, wow.  Look how much more WCPSS is asking teachers to JUGGLE while maintaining this facade of MVP fidelity.  It is no wonder we are seeing a breadth of results across the county.
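Read as a procedure, the guidance above amounts to a branching checklist. A rough sketch of my reading of those rules follows; this is my own interpretation for illustration, not an official WCPSS rule set:

```python
def supplement_allowed(timing, purpose, covers_discovery_material):
    """One reading of the supplementation guidance quoted above.
    timing: 'before' or 'after' the MVP lesson
    purpose: 'notes', 'prior_knowledge', 'missing_standard', or 'other'
    covers_discovery_material: True if the supplement pre-teaches what the
    MVP lesson expects students to discover on their own
    """
    if covers_discovery_material and timing == "before":
        return False  # no supplemental packets prior to discovery lessons
    if purpose == "notes":
        return timing == "after"  # notes only AFTER the MVP lesson
    if purpose in ("prior_knowledge", "missing_standard"):
        return True   # explicitly permitted; required for missing standards
    return False      # default: stay "true to the MVP structure"

print(supplement_allowed("after", "notes", False))            # True
print(supplement_allowed("before", "notes", False))           # False
print(supplement_allowed("before", "prior_knowledge", False)) # True
```

Even this simplified sketch shows how much case-by-case judgment each teacher must exercise on every lesson.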

I meticulously dissected those words to create a visual flowchart. This is not an official chart issued by WCPSS and it may not be 100% accurate, but it's pretty close to what the words say.  And the words ARE from WCPSS.  But if you follow the flow chart closely, it's understandable:

THIS ^^^^ is what WCPSS is going to continue to pour money and time into, sacrificing your students' learning experiences and grades for "improvements" over time, bit by bit, until every teacher can manage to "get by" without getting caught supplementing and without students' grades suffering enough to sustain the current level of parent pressure.  It's a game and it's a facade.  And your students are the losers.

Shown in a more simplified manner, this chart is equally as helpful:

So I have a few observations:
  1. There is an admission that the teacher needs to vet the materials for standards compliance.  I thought MVP matched NC standards, and that this was mandatory in the procurement vetting process?
  2. Likewise, there may be tasks MVP does that are NOT relevant to standards, and those too, must be vetted by the teacher and minimized. 
  3. The right side of the flow chart demonstrates a key weakness of MVP.  The teacher could very well spend the majority of class time on the MVP task where many students are stuck.  The teacher can iteratively redirect students and guide them, but eventually class-time may expire if the teacher doesn't pull the rip-cord in time to provide the notes (which are not allowed until after the lesson). 
  4. What happens when fewer than half the students are stuck?  You keep on truckin' until the bell rings and hope those kids 1) have internet at home, 2) know what to search for on the internet, or 3) can afford a tutor.

Maybe if Wake County can perfect the process of MVP, then our results will be as good as what they have where MVP guru and founder Travis Lemon is a teacher, in American Fork, Utah.  Their results have (NOT) been spectacular!  See MVP Ground-Zero Math Performance Data Exposed, and it Ain't Pretty: An Analysis of American Fork Junior & Senior High Math Trends

Hey, I have an idea for a new & improved process!

Wednesday, June 5, 2019

Witness Tampering, Suppressing Public Records - Will Anyone Be Held Accountable for the Latest Deceit from WCPSS?

This is an account of how I came to secure solid, damning testimony from a teacher in Utah about MVP.  And then, how one Wake County Public School senior leader later received a profuse apology directly from that teacher after the teacher suddenly and completely recanted their testimony to me.


How WCPSS found a loophole to avoid answering a public records request with an embarrassing finding, despite being given TWO chances to be honest and transparent


How I proved it by obtaining a truthful response to a public records request from the other party in Utah.

  • Around March 25, 2019, I decided I would like to investigate the performance of the school where MVP founder and creator Travis Lemon was teaching.  I had heard he was a teacher in Utah but didn’t know where. I searched on Google, and one of the first hits was the ratemyteacher site. That site listed Travis Lemon as a teacher at American Fork High School. There were 7 teacher ratings averaging 4.95, which was outstanding.  But it was the write-in comments which attracted my attention:
    • Feb 17, 2017: Awesome! He makes you think. Some students don't work.
    • Feb 08, 2015: Great teacher. Very nice! He doesn't tell you what to do but give you things to figure it out. Sometimes not very clear but in the end I learn so much.
    • Jun 16, 2013: Excellent teacher. Works hard to make you think and share. Pushes you to think and to make students come to the board. I like this class.
    • Oct 28, 2011: Awesome teacher, wish that I could have had a teacher like this before. Helps me and lets me retake and make up work. Do you things by the deadline!
    • Jul 16, 2011: Awesome teacher! He pushes you to think about things. Helps before and after school almost everyday. I have struggled with math but now I really get it!!!
    • Nov 29, 2010: Allows students to ask a lot of questions and helps before and after school. Makes you work and think.
    • Apr 08, 2010: I like how he teaches. He does a lot to let everyone help. He also answers a lot of questions.
  • Honestly, my first reaction was that most of those comments sounded awfully similar in their pattern:  (Short 1-2 word praise) + (Couple of short sentences saying things a teacher would love to hear - like class is not too easy and not too hard - it's JUST RIGHT!)
  • I went to American Fork High School’s website and didn’t see Travis Lemon listed as a teacher.  So on April 4 I sent an email to all math teachers there.
  • Within 80 minutes, I received the first of several responses which told me that Lemon was NOT a teacher at the high school, but he was a teacher at American Fork JUNIOR High School (not senior high).  Also, I was surprised to hear some critique of MVP.

  • I visited the American Fork Junior High website and sure enough, Travis Lemon was a teacher there.  I was curious about how they were using MVP so I used my same approach and quickly sent an email to those teachers as well.
  • The next day 4/5, I received an email from Karen Feld at the junior high who was not happy that I reached out to her and her peers.

  • Once I realized Lemon’s ratemyteacher data had him listed at the wrong school, I sent him an email:

  • I never did receive a response, but when I checked the website the data I found before was gone.  Fortunately, Google saved a cached version which I captured on 4/6 based on a 3/25 version.

  • I also received an email from one HIGH SCHOOL teacher who had strong words against MVP.  This high school teacher received students who had Lemon as a Junior High teacher, and claimed they were not prepared for the advanced class.  On April 8, I interviewed this teacher. I will call this person Teacher D as I later used this and other teachers’ comments in the material objection document.

  • On May 6 at 4:10pm, I received an email from Teacher D retracting their entire testimony.

  • I assumed the teacher got in trouble and texted them back and apologized.  
  • I proceeded to move the file from the folder in which it resided.
  • After receiving that text, I was pretty certain Teacher D was compelled to make sure I deleted the testimony.  So I submitted a public records request to WCPSS:

  • The next day around 12:30pm I received a follow-up text from Teacher D.  The file was still there.

  • I did not realize that Google keeps the file in a TRASH folder, so the link still worked.  This time I really did delete it (though I saved a new copy for myself). However, around 4:17pm, I received ANOTHER follow-up.

  • I guess that worked, because there were no more messages.

  • On May 15, I received a response to my public records request from WCPSS official Tim Simmons.

  • The only email produced was from Dr. Denise Tillery to Karen Feld, one of the Junior High teachers:

  • This email irritated me because it perpetuated the lie that I am having an “isolated experience” about MVP.

  • On 5/16, I wrote back to Tim Simmons questioning that there was only one email provided, and also complaining about Dr. Tillery’s characterization.  I copied many administrators, including Tillery, as well as the school board.

  • On 5/17, I received the following response from Simmons:

  • Back on 5/16, I also filed a public records request with the Alpine School District:

  • Apparently, on 5/28 I received the response, but I didn’t see that until 6/4.  The response had about ten emails between WCPSS and Alpine school district for various subjects other than MVP (which should have been sent to me by WCPSS also because I did not specify a subject in my public records request).  
  • I also was provided the two emails related to a student record request as well as the email between Tillery and Feld which WCPSS sent.
  • But most stunningly, I received evidence in (at least) two new emails that WCPSS (again) deceived the public because there was correspondence between Teacher D and Dr. Denise Tillery.
  • For some reason, it seems Teacher D was working on iterations of the email throughout the day on May 8 and 9 because all of the emails shown demonstrate a progression in editing.  I don’t know if all of these were SENT or if they were just drafts.

  • However, it is the final email and the subsequent response from Dr. Tillery which matter.  Please note: everything below the red line is the email which was sent to me from the teacher retracting their testimony.

  • And then the response back from Tillery:
  • Melody Apezteguia is the Assistant Principal at American Fork High School.
  • I don't know how Teacher D was contacted by WCPSS. It's possible that Dr. Tillery reached out to Melody Apezteguia, since she would have the authority to conduct an investigation to find out who the anonymous Teacher D was. I'm not sure how Teacher D would otherwise have thought to suddenly retract their statement to me, relentlessly follow up to make sure I deleted it, then send an apology email to someone they had never heard of 2,000 miles away. Connect the dots.
  • For what it's worth, I sincerely regret that Teacher D was caught up in this mess. Even through this presentation I have tried to maintain confidentiality, though the information is there in public records. Teacher D simply gave an opinion, one of many which ultimately have been suppressed by the heavy hand of Wake County Public School System.
  • It is very much worth noting that even without Teacher D's testimony, in the article "MVP Ground-Zero Math Performance Data Exposed, and it Ain't Pretty: An Analysis of American Fork Junior & Senior High Math Trends," I demonstrated the plummeting results of math students at Teacher D's high school. My understanding is that 100% of American Fork Junior High feeds into the high school. Something is happening to these students with respect to their math abilities. It can't be ignored. It is UNDENIABLE.


This concludes the presentation of evidence about this topic. On June 4, 2019, I subsequently presented a 3 minute summary version of this debacle to the Wake County Board of Education (1h:1m:30s mark in video).

I regretted having to make that speech, but there has been a pattern of deception from MVP and the WCPSS school system - especially Dr. Tillery - and it would have been irresponsible not to call her out on it as soon as I had the proof. I do believe it is completely unacceptable to tamper with a witness in a complaint and then lie or deceive about it by omitting public records response content. Frankly, I was not sure if I should call the police or write a speech, because I believe some law has been broken in this situation. I will find out. She should be held accountable and removed from her position or fired, in addition to the communications director who responded TWICE to me untruthfully after being given a 2nd chance. How can ANYONE trust anything she and her team say to THEM??? I CAN'T.
I am not alone in my complete and utter distrust of the school system underneath Superintendent Cathy Moore. This brings into question any of the other responses we have received to date as the result of public records requests. As parents fighting MVP in Wake County, our only leverage, it seems, is to publicly expose what we are finding, which is contrary to what we are being told, time and time again. These are not careless mistakes; they are outright intentional cases of deception... OF TAXPAYERS WHO ULTIMATELY PAY THEIR SALARIES.

UPDATE June 8, 2019

According to WRAL's Kelly Hinchcliffe, who followed up with WCPSS, they found a loophole.  I made my request to WCPSS on May 6, 5:29pm and asked for emails "from the dates April 1, 2019 to present," so they only provided emails through that May 6 timestamp, omitting the damaging ones.

If I had wanted WCPSS emails only from April 1 until May 6, 5:29pm, I would have stated that. The intent was that "present" moves along with time and changes according to when WCPSS runs the report.  Most honest adults and database query writers would know that.
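To illustrate the two readings of "present," here is a minimal sketch of a date-range search. The email records, subjects, and dates are invented for illustration; this is not WCPSS's actual system:

```python
from datetime import datetime

# Hypothetical email log; dates invented for illustration.
emails = [
    {"subject": "MVP question",        "sent": datetime(2019, 4, 15)},
    {"subject": "Teacher D follow-up", "sent": datetime(2019, 5, 9)},
]

def search(records, start, end=None):
    """'From April 1 to present' as a query: with no end bound, the
    filter is only 'sent >= start', evaluated whenever the report runs.
    Pinning end to the request's submission date silently drops
    anything sent after that date."""
    return [r for r in records
            if r["sent"] >= start and (end is None or r["sent"] <= end)]

start = datetime(2019, 4, 1)
open_ended = search(emails, start)                        # "present" = run time
pinned = search(emails, start, end=datetime(2019, 5, 7))  # the loophole reading

print(len(open_ended), len(pinned))  # 2 1 -- the May 9 email vanishes when pinned
```

An open-ended query (no end bound) naturally captures everything up to the moment the report is run; only an explicitly pinned end date can exclude later emails.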

They sent the request to Tech Services on May 7. 

The two damaging emails that were omitted occurred on May 9.  I received those from Alpine district in a request submitted later in May.  Alpine also confirmed the two confidential student emails.   

The response from WCPSS back to me citing only one email and 2 confidential ones was not until May 15, 3:10pm.  I would have expected the report to be emails with a date greater than or equal to April 1, with NO END DATE.  If an end date for the query was explicitly specified when it was executed, "present" should be sometime around May 14 or 15, which would have included May 9.  Additionally, on May 16, I copied the staff (including Tillery) and the Board on the fact that I got only the one response.  So she knew she was withholding those May 9 emails, IF she read my mail.  If they want to play that game, then I would argue "present" reset to May 16 once I sent that followup email.

PLUS they did not send me ANY of the 10 emails UNRELATED to MVP. Not that I wanted them, but the point is that they interpreted my intentions enough to avoid sending me mails not related to MVP, yet failed to interpret what "present" meant.

Did they give any explanation as to why they were in contact with my key witness?  NO.

All of the above points to one thing: Contempt for parents & students and deceit.  OK, that's two things.

UPDATE June 17, 2019

EMail from me to Tim Simmons on June 17:
Tim, If you will refer to the attached May 16 email I sent back to you and others (including Tillery) I was questioning the breadth of the response I received. So I DID ask you directly. So I would have assumed, obviously wrongly, that Dr. Tillery would have been forthcoming and said that she has 2 emails that would fit my search criteria if only "present" were closer to the May 15 date when I was sent the ONE email. So yes, in this case, the process time was an issue which created a convenient loophole.

Blain Dillard
On 6/15/19 8:48 AM, Timothy Simmons _ Staff - Communications wrote:
Hello Mr. Dillard,  Sorry for any confusion. The lag time between when a request is processed by tech services and when it is sent has never been an issue for me before. If someone feels an email or record has been missed in a search, they generally ask us directly.

This search produced only three emails, of which two involved a student transfer.

Regards, Tim
Tim Simmons
Chief of Communications

From: Blain Dillard <>
Sent: Thursday, June 13, 2019 11:27 AM
To: Timothy Simmons _ Staff - Communications
Subject: Re: Public record request
CAUTION: This email originated from outside of the organization! Proceed with caution!

Tim, Thanks for the explanation.

Indeed, I made my request to WCPSS on May 6, 5:29pm and asked for emails "from the dates April 1, 2019 to present." However, if I would have wanted WCPSS emails from April 1 until May 6, 5:29pm I would have stated that. The intent was that "present" moves along with time, and changes according to when WCPSS runs the report. Most database query writers would know that or at least would have asked for clarification. Anyway, I do understand if that is your normal practice. Lesson learned.

However, I'm still confused.
  • Did the report writer use May 6, 5:29pm as the end time of the report?
  • Or was the end time around May 7 6:18pm, the time you indicated you sent the request to Tech Services? Your note below says May 7 so I guess they (report writers) define "present" based on when YOU send them the request to them. Not "present" when they run the report or "present" when I sent the original request.
  • You say below that the report was sent to you on May 9. Did they run the report on May 7 and sit on it for 2 days? Why?

Once I did the request to Alpine district in mid May, I got 15 emails: 1 was the one you sent me. 2 were the confidential ones (redacted) you referenced. 2 were the damaging ones Dr. Tillery did not want me to see. And 10 were UNRELATED to MVP. Why not mention those in your response to me? Not that I wanted them, but the point is that you all interpreted my intentions enough to avoid sending me 10 mails not related to MVP, yet failed to interpret what "present" meant.

If you can just forward me the responses Tech Services sent to you between May 6 midnight and May 18 11:59pm inclusive, removing the confidential emails, then I think I will have clarity and be satisfied.

I apologize for being so suspicious and accusatory as I'm sure you are a nice person. But if you would put yourself in my (and other parents') shoes, certainly you can understand. We parents have numerous public records requests in your queue, and so far most of the responses have been quite delayed. I know some are lengthy, but I believe we've only received the SINGLE email you sent me in response to any request (no matter how big or small) involving EMAILS.

I appreciate your help.


Blain Dillard
On 6/12/19 9:42 PM, Timothy Simmons _ Staff - Communications wrote:
Mr. Dillard, Based on your comments at the June 4 school board meeting, it was apparent you had questions about how your May 6 public record request was handled. In checking our records against your comments, I believe I understand what occurred.

As you know, you asked for all email between and from the dates April 1, 2019 to present. I sent a confirmation email to you May 7 letting you know the request had been sent to tech services. On May 9 the results of that search from April 1, 2019 to May 7, 2019 were sent to me. As it contained two emails with confidential information regarding a student transfer, I scheduled a call with our district attorney to ensure it was being handled properly. That call, and the subsequent release of the single email to you that was public, occurred May 15.

On June 4 you informed school board members there were two emails between and dated May 9 that were withheld by us. As our email search ran from April 1 to May 7, it was not possible to withhold and email from May 9 as that date fell outside the dates provided to tech services and confirmed to you on May 7.

I understand why this might not have been immediately apparent as the dates involved were quite close to one another.


Tim Simmons
Chief of Communications

Sunday, June 2, 2019

With MVP Year 2 in the Books in WCPSS - What Now?

Recap and Level Set

WCPSS curriculum administrators have successfully run out the clock on Year 2 of its MVP implementation.  Faced with February 20, 2019, news coverage and a throng of parents from Green Hope High School - and a few other schools with news-watching parents who showed up unexpectedly - leaders from Crossroads One put on a one-directional information session hoping to assuage parents chomping at the bit for answers about why their smart math students suddenly could no longer do their homework or pass a test.  

Since that meeting and through today, crisis management and narrative control have been the game plan.  Little substantive change has been made as WCPSS leadership doubled down on MVP, blaming teachers for implementation inconsistencies, students for not working hard enough, or parents for not understanding the need for the 21st Century critical thinking skills MVP promised to deliver.

Despite numerous reports provided by me and others about unimpressive or declining math scores in every single MVP reference we’ve located and investigated (examples: Alpine, Wake #1, Wake #2, Wake #3, Berkeley, Chapel Hill, and Modesto), and numerous passionate 3-minute parent speeches at Board of Education meetings, there has not been one single admission that the curriculum concept itself could possibly be at fault or have systemic weaknesses.  And not one single research- or evidence-based defense of the program has been presented, other than claiming a mastery of the National Council of Teachers of Mathematics (NCTM) 8 principles of math education. Granted, WCPSS admits there are some errors and typos in the workbooks, which some lucky teachers will be rewarded with summer employment to help correct. But other than that, there’s nothing wrong that a few bits of grade adjustments or additional teacher professional development can’t fix.

Highlight of the past 3 months: as a result of my analysis and debunking reports, getting WCPSS to contact MVP, LLC, and ask them to remove the tweet which falsely advertised MVP’s year 1 “successes” in Wake County.  Side benefit: MVP also removed the false Facebook advertisement about Chapel Hill’s successes. While neither of those will do any failing student in Wake County any good, it may possibly slow the rapid spread of the MVP fever that seems to be catching across NC as other districts have adopted or are considering adopting it (Charlotte-Mecklenburg, Guilford County, Johnston County, Transylvania County, to name a few).

In April, approximately 20 parents from approximately 9 different schools filed formal complaints citing material objections to MVP related to ten NC or WCPSS policy violations.  After apparently consulting their attorney(s), WCPSS promised a response to all on June 7, along with responses to every other pending public records request. Since then, they’ve pretty much gone radio silent.  Tick. Tick. Tick.

I’m tempering my optimism AND pessimism about what might come from the June 7 response. Regardless, parents can either accept the results or appeal. So we’ll cross that bridge when we get to it.

If Only They Would Have Done This (or, It's Not Too Late to Do This!)

So here’s what WCPSS SHOULD have been doing in light of the MVP complaints that are more than a few isolated experiences:  

1. Decide whether current semester grades matter, and if they don’t, then state that and defend it.  

What this means is the following: One of the premises of MVP is that students will gain 21st Century critical thinking and math understanding skills, because the “old math” ways of “rote” memorization and teacher “lecturing from the chalkboard” are not working.  The implication is that the improvements promised by MVP may not manifest themselves until much later in high school or college, because it takes several years for the kinks to be worked out of a new curriculum. Therefore, if too much emphasis is placed on current semester grades, then we (parents) are being short-sighted and not seeing the long-term play of MVP.  If current grades don’t matter, and a 3-letter-grade drop should be expected for some students, then WCPSS needs to state those expectations clearly and strongly, AND provide evidence and research which will assure parents their students’ low grades now will be replaced with greater successes in school and career later.

2. If current grades DO matter and ARE a measure of tactical program success, then earnestly look at the data at a detailed level NOW.  

MVP is a radical change in the way teachers teach.  Picture a spectrum where, on one end, you have a teacher who is the perfect MVP implementer.  They are following MVP guidelines and scripts as precisely as the creator of MVP himself, regardless of whether this is their natural teaching style or not.  They may have to stifle some of their own creativity in order to follow the rigidity of MVP. They are implementing “with fidelity” and “being true to the curriculum” as described by WCPSS curriculum leaders to MVP teachers.  

On the other end of the spectrum is the teacher who is not using MVP as prescribed.  This teacher may be very creative and like to deliver math to their class using a variety of methods and styles, which may or may not include group work and facilitation.

In between freedom and fidelity sit probably the majority of teachers in WCPSS. They saw some potential with MVP and have tried it, but are seeing their students failing to grasp the math. So they are supplementing. They may still be using MVP some, but they are flying under the radar of the "MVP Police" at WCPSS who promise to reprimand teachers not following the program strictly.

I believe the implementation disparities across the county have led to the results we have been seeing.  Where there are pockets of problems, we find strict implementation fidelity. Where there are happy students making A’s and B's in math like always, we find teachers who are teaching “under the radar” and not using MVP strictly.  They are supplementing, answering questions, and skipping many of the MVP tasks.

WCPSS needs to understand the correlation between performance results and MVP implementation fidelity.  I believe this can best be done by looking at grades first and testing two hypotheses:
  1. Desirable results are correlated with implementing MVP with fidelity (one end of the spectrum)
  2. Undesirable results are correlated with not implementing MVP or implementing MVP with low fidelity (other end of spectrum)
I will define desirable and undesirable results below.  

These are complementary hypotheses.  And they both are testing the opposite of what I believe to be the case due to the many anecdotes I hear about.  But this is what WCPSS wants to believe, and so they should test it and prove it to themselves and the public.

Here’s how I would do this.  Don’t wait for EOGs or EOCs. First of all, those results are produced annually and are not publicly available until around November for the prior school year.  Recently, those tests seem to be less useful for year to year comparisons of the publicly available NC DPI data due to the changes. For example, the numbers published by NC DPI for 2017 and 2018 Math 1 results had to be massaged with a non-published WCPSS algorithm in order to produce numbers to compare MVP year one results.  Additionally, NC is changing assessments again this year, which includes Math 1 and Math 3. Furthermore, it is not clear to me if Math 2 is part of any standardized test this year. Even if our standardized tests were perfect measures, the point is that we can’t wait for an annual checkup on whether MVP is working or not. There are enough students with significant problems NOW that demand a more timely micro-analysis of possible systematic issues.

Therefore, WCPSS should look at student by student performance mid-course on a regular basis and compare to prior math course grades.  They have this data collectively because parents individually have this data in PowerSchool.  

Take my son for example.  Prior to Math 2 Honors, his final grades were 78 (Math 7), 92 (Math 8), and 93 (Math 1).  If you weight those 15%, 35%, 50% (I made up this weighting because it makes sense), his recent math weighted average (RMWA) was 90.4.  Going into Math 2 Honors was a stretch for him, but his confidence in Math was rising, so we went for it. I would have expected him to possibly make a low B in Math 2 Honors or maybe a high C, given his recent math successes and the extra challenge of an honors course.  But no. He encountered a teacher who strictly followed MVP “with fidelity” and was “true to the curriculum.” Luke’s averages for his first 5 quizzes and tests were 57.8 and 60.6, which represented a 3.3 and 3.0 letter grade drop from his RMWA of 90.4! This is a HUGE decline and believe me, it raised red flags for us well before the 5th quiz or test.  This is a similar theme I’ve heard from other parents - multiple letter grade drops.
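The RMWA arithmetic above can be checked in a few lines. The 15/35/50 split is the post's own made-up weighting, and one letter grade is treated as roughly 10 points:

```python
# Recent math weighted average (RMWA) as described above.
grades  = {"Math 7": 78, "Math 8": 92, "Math 1": 93}
weights = {"Math 7": 0.15, "Math 8": 0.35, "Math 1": 0.50}  # made-up weighting

rmwa = round(sum(grades[c] * weights[c] for c in grades), 1)

# Letter-grade drop from RMWA to the Math 2 Honors quiz and test averages,
# treating one letter grade as ~10 points.
quiz_drop = round((rmwa - 57.8) / 10, 1)
test_drop = round((rmwa - 60.6) / 10, 1)

print(rmwa, quiz_drop, test_drop)  # 90.4 3.3 3.0
```

The same calculation could be run for any student straight out of PowerSchool grade history, which is the point of the proposal above.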

Desirable vs Undesirable Results

WCPSS should do this analysis throughout the semester.  In classes where numerous students are falling more than 1.5 letter grades from their recent math course grades, a red flag (undesirable result) should be raised.  Likewise, in classes where a large majority of students are within +/- 0.5 letter grades of recent performance, a green flag (desirable result) should be raised. IN BOTH CASES, WCPSS should interview those students to understand how the teacher teaches class. Perhaps this could be done with a survey which checks criteria for "implementation fidelity" that MVP and WCPSS are expecting of teachers. For example, does the teacher start each class with a task, or does the teacher explain the math first? Does the teacher answer a question with a question, and will they eventually give an answer? Does the teacher demonstrate one best-practice example of how to do today's math problems? Do you feel equipped to do your homework at the end of class? Simply observing the teacher is not enough, because teachers know how to perform when someone is watching.
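A sketch of that flagging rule follows. The class data and the 30%/70% share thresholds are invented for illustration; only the 1.5 and 0.5 letter-grade cutoffs come from the proposal above:

```python
def letter_grade_delta(prior_avg, current_avg):
    """Change in letter grades, treating one letter grade as ~10 points."""
    return (current_avg - prior_avg) / 10.0

def flag_class(deltas, red_share=0.3, green_share=0.7):
    """deltas: per-student letter-grade changes vs recent math grades.
    red_share/green_share are illustrative thresholds, not from the post."""
    n = len(deltas)
    dropped = sum(1 for d in deltas if d < -1.5)      # fell > 1.5 letter grades
    steady = sum(1 for d in deltas if abs(d) <= 0.5)  # within +/- 0.5
    if dropped / n >= red_share:
        return "red"     # undesirable result: investigate this class
    if steady / n >= green_share:
        return "green"   # desirable result: also worth surveying
    return "no flag"

print(flag_class([-3.2, -2.1, -1.8, 0.1]))  # red
print(flag_class([0.2, -0.3, 0.4, 0.0]))    # green
```

A district data team would of course tune the thresholds and control for the extenuating factors discussed below, but the mechanics are this simple.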

Obviously, when implementing this analysis, WCPSS’s data scientists should decide how to tweak those criteria to isolate what would best constitute desirable and undesirable performance.  I would have liked to think that these criteria existed already. I would have thought that if a school system saw clusters of students suddenly dropping 3+ letter grades, it would have proactively investigated, but I am not aware of this happening.

Now, the other factors that have to be considered are the extenuating circumstances.  Are the grades being adjusted or fluffed with multiple mastery opportunities or group-testing?  How much tutoring is used by the students? Are there changes to the students which would explain the performance?

This analysis is how WCPSS can judge the magnitude of what we parents are seeing and hearing from others, OR if 20 students in all of Wake County are truly the only ones with isolated MVP problems.  

In other words, is the 2-part hypothesis true or false?

If the hypotheses are true, then the fix might be more professional development for teachers where there are MVP problems.  But data from the ground-zero home of MVP, American Fork Junior High School, would indicate that the hypotheses are false.

So if current grades do matter, and there is a systemic problem correlating to MVP implementation fidelity, then...

3. It’s time to face facts AND common sense and cut bait.  

The facts are based on the analysis done above plus the empirical evidence from other districts that are 3+ years into MVP with no improvements to show.  Examples noted above in introduction.

The common sense part is this: Before WCPSS went to block scheduling, students spent about 160 hours per year in math class (6 periods in one day all year long).  With block scheduling, this time is reduced to about 120 hours per semester for a math class (4 periods in one day each semester). So effectively, the change to block scheduling was a reduction by 25% in class time for a course.  

With MVP, class time is spent doing the MVP task-based activities.  No one argues that some of the MVP problems are good ones that should be understood and worked by students.  But how much class time should be spent - in cacophonous group work - “grappling” with a problem BEFORE learning the fundamental math required to ultimately solve it, while the other 2-4 students in each group wait for the 0-2 strong kids to partially figure it out?  One third of class time (40 hours per course)? One half (60 hours per course)?

When we take yet another portion of precious math class time and devote it to an activity that may or may not yield benefit for only some of the students, logic would indicate that is not a good decision.  Every student. Every day. Ask your child about their MVP math class experience, and you'll hear that some students - they are humans after all - get engaged more than others. It's just the way life is. The teacher cannot individually play the information hide & seek math facilitation game with 30-35 students to ensure every student every day "gets it." It's just not possible.

Every minute counts when we’ve already cut math course time by 40 hours.  With only 120 hours in class pre-MVP allotted for teachers to teach the material the best way they know how, how can turning over even a small portion of that time to a student-led “grappling” and “discovery” exercise be of benefit to all students?  It can’t.  

The sooner WCPSS and others cross that bridge and admit it, the better.  Cut bait. What does it look like to cut bait? I see there are 2 options:
  1. Allow teachers the flexibility to use MVP problems and tasks as they deem appropriate, OR
  2. Drop MVP completely and pursue another curriculum
Regardless of which option, WCPSS math teachers need curriculum resources to teach.  I, and others, believe there is a win-win opportunity here as described next.

4. Develop WCPSS teacher-authored curriculum resources.  

By my estimation, WCPSS must have 400-600 math teachers teaching MVP.  Every school and every teacher had curriculum resources they used to teach Math 1 - 2 - 3 before MVP.  Those resources are still available on Google drives or on teacher laptops.  So the resources for an excellent perfectly-matched-to-standards curriculum exist in aggregate across the county.  With some level of adult coordination, WCPSS could invest money otherwise spent on MVP training and invest in making robust system-wide curriculum resources.

This investment could be in the form of tooling (software) or labor (paying teachers extra to contribute to curriculum resource bank).  Proper project management and tooling could result in:
  • A database of high quality class lessons which map to state standards and made available to students (and parents) for use after class
  • A math problem bank (some with and some without worked examples) which could be used by teachers or students for class work, homework, quizzes, and tests.  
    • Problems could be mapped to lessons (which are mapped to standards). 
    • Problems could be rated for difficulty which would allow teachers to build assignments and assessments appropriate for scaffolding. 
    • This could include MVP problems which are utilized during the appropriate time at the teacher’s discretion.
  • Continued refinement year after year.
  • Teachers who have contributed to the project can be rewarded with recognition and/or additional pay, depending on contribution.
  • Teachers could use their own creativity to deliver material using methods best suited for their style and students’ needs.
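As a sketch of how such a resource bank might hang together, consider the following. Every name, ID, and standard code here is invented for illustration; this is not an existing WCPSS system:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    lesson_id: str
    standard: str   # state standard code (illustrative)
    title: str

@dataclass
class Problem:
    problem_id: str
    lesson_id: str        # problem -> lesson -> standard mapping
    difficulty: int       # 1 (intro) .. 5 (challenge), for scaffolding
    worked_example: bool  # whether a worked solution is attached

def build_assignment(bank, lesson_id, max_difficulty):
    """Select a lesson's problems at or below a difficulty ceiling,
    easiest first, so assessments stay within what was taught."""
    picked = [p for p in bank
              if p.lesson_id == lesson_id and p.difficulty <= max_difficulty]
    return sorted(picked, key=lambda p: p.difficulty)

algebra = Lesson("L1", "NC.M1.A-REI.1", "Solving linear equations")
bank = [
    Problem("P1", "L1", 2, True),
    Problem("P2", "L1", 5, False),
    Problem("P3", "L1", 1, True),
    Problem("P4", "L2", 3, False),
]
hw = build_assignment(bank, algebra.lesson_id, max_difficulty=3)
print([p.problem_id for p in hw])  # ['P3', 'P1']
```

The point is that the hard part - the lessons and problems themselves - already exists on teachers' drives; the coordination layer is just structured metadata like this.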

Why is this so difficult to envision?  

Again, we’re more than halfway there because this solution already exists in aggregate across the county.  We just need to apply a level of organization to it. This would result in a tailored solution that is on the mark with state standards AND teacher buy-in would be high AND teacher satisfaction could be improved.  The results:
  • Truly high quality resources not riddled with errors and confusion like MVP materials
  • Better performing students
  • Increased teacher satisfaction
  • ALL teachers have access to ALL resources for ALL students
  • Problem difficulty ratings ensure ALL students are met where they are and can be challenged to go higher
  • Assessments will be fair because they will only contain problems within realm of what is expected
  • Parents will have resources (notes + examples) to help students if needed
In conclusion, I implore our WCPSS leaders, including board members, to set aside their hashtags and preconceived biases favoring MVP as the solution to achieving mathematics utopia, AND LOOK AT THE FACTS and USE COMMON SENSE about what is happening. If the response to the material objections does not indicate that substantive change is coming regarding MVP, I expect to see protests escalate this fall as a new batch of students enters Math 1 at all schools, and Math 3 at the schools completing the rollout.