The state of California’s proposed framework for new math standards for K-12 schools hasn’t been adopted yet. Hopefully it won’t be adopted without significant changes. When the first draft of the framework was published last year, I wrote about its overwhelming focus on equity. The draft is long and broken into chapters. Chapter two is titled “Teaching for Equity and Engagement.” Here’s how Robby Soave at Reason described it:
The entire second chapter of the framework is about connecting math to social justice concepts like bias and racism: “Teachers can support discussions that center mathematical reasoning rather than issues of status and bias by intentionally defining what it means to do and learn mathematics together in ways that include and highlight the languages, identities, and practices of historically marginalized communities.” Teachers should also think creatively about what math even entails: “To encourage truly equitable and engaging mathematics classrooms we need to broaden perceptions of mathematics beyond methods and answers so that students come to view mathematics as a connected, multi-dimensional subject that is about sense making and reasoning, to which they can contribute and belong.”
Chapter nine returns to the focus on equity:
Equity cannot be an afterthought to more traditional mathematics content-centered offerings that do nothing to address the fact that “Black, Latinx, Indigenous, women, and poor students, have experienced long histories of underrepresentation in mathematics and mathematics-related domains” (Martin, 2019; see also Martin, Anderson, & Shah, 2017). Inequities caused by systemic issues means that a “culture of exclusion” persists even in equity-oriented teaching (Louie, 2017). Many of the stories that we use to define mathematics, and to talk about who does or is good at mathematics, are highly racialized and English language-centric, and are experienced that way by students (Lue & Turner, 2020). This means students’ mathematics identities are shaped in part by a culture of societal and institutionalized racism. Professional learning in mathematics can respond to these realities and aim for more than incremental change (which does little to change the framing narratives that drive inequities).
The framework also linked to this outside document titled “A Pathway to Equitable Math Instruction,” which is more explicit about what the new focus on equity should look like. [emphasis added]
We see white supremacy culture show up in the mathematics classroom even as we carry out our professional responsibilities outlined in the California Standards for the Teaching Profession (CSTP). Using CSTP as a framework, we see white supremacy culture in the mathematics classroom can show up when:
• The focus is on getting the “right” answer.
• Independent practice is valued over teamwork or collaboration.
• “Real-world math” is valued over math in the real world.
• Students are tracked (into courses/pathways and within the classroom).
• Participation structures reinforce dominant ways of being…

These common practices that perpetuate white supremacy culture create and sustain institutional and systemic barriers to equity for Black, Latinx, and Multilingual students. In order to dismantle these barriers, we must identify what it means to be an antiracist math educator.
Turning the focus of math education away from right answers and toward anti-racism is bad enough, but in recent weeks professional math educators have pointed to a number of other problems with the new framework. For instance, you may have noticed that my excerpt from chapter nine (above) includes a bunch of citations to published studies. Clearly a framework with this many references must be carefully researched, right? Well, not so much, it turns out. Professor Brian Conrad, the director of undergraduate studies in math at Stanford, decided to check the links, as it were, and found that a high percentage of the citations were misleading in significant ways.
I read the entire CMF, as well as many of the papers cited within it. The CMF contains false or misleading descriptions of many citations from the literature in neuroscience, acceleration, detracking, assessments, and more. (I consulted with three experts in neuroscience about the papers in that field which seemed to be used in the CMF in a concerning way.) Sometimes the papers arrive at conclusions opposite those claimed in the CMF…
A sample misleading quote is “Park and Brannon (2013) found that when students worked with numbers and also saw the numbers as visual objects, brain communication was enhanced and student achievement increased.” This single sentence contains multiple wrong statements: (1) they worked with adults and not students; (2) their experiments involved no brain imaging, and so could not demonstrate brain communication; (3) the paper does not claim that participants saw numbers as visual objects: their focus was on training the approximate number system…
The CMF selectively cites research to make points it wants to make. For example, Siegler and Ramani (2008) is cited to claim that “after four 15-minute sessions of playing a game with a number line, differences in knowledge between students from low-income backgrounds and those from middle-income backgrounds were eliminated”. In fact, the study was specifically for pre-schoolers playing a numerical board game similar to Chutes and Ladders and focused on their numerical knowledge, and at least five subsequent studies by the same authors with more rigorous methods showed smaller positive effects of playing the game that did not eliminate the differences.
More than once, the cited study actually made a point contrary to the framework authors’ intent, but they failed to tell readers that was the case.
In yet another case, the CMF cites Burris et al (2006) for demonstrating “positive outcomes for achievement and longer-term academic success from keeping students in heterogenous groups focused on higher-level content through middle school”. But the CMF never tells the reader that this paper studied the effect of teaching Algebra I for all 8th grade students (getting good outcomes) — precisely the uniform acceleration policy that the CMF argues against in the prior point…
The CMF claims that Sadler and Sonnert (2018) provides evidence in favor of delaying calculus to college, but the paper finds that taking calculus in high-school improved performance in college…
The CMF makes the dramatic claim that (Black et al, 2002) (really 2004) showed that students “are incredibly accurate at assessing their own understanding, and they do not over or under estimate it.” If this claim were true, no exams would be needed to assess student knowledge: we could just ask them. Unfortunately, the paper of Black et al contains nothing of the sort.
And in some cases the framework simply makes assertions with no attempt to back them up at all.
In some places, the CMF has no research-based evidence, as when it gives the advice “Do not include homework . . . as any part of grading. Homework is one of the most inequitable practices of education.” The research on homework is complex and mixed, and does not support such blanket statements.
All of that comes from the paper’s executive summary. The full paper is 25 pages long and contains many additional examples in which the framework’s authors presented research out of context in ways that are misleading or simply false.
Today, the LA Times published a piece critical of the framework, co-written by the dean of Berkeley’s College of Engineering and the provost of the school’s Division of Computing, Data Science, and Society. They write that the new framework is fundamentally flawed:
Finding a way to improve math performance is critical. However, the framework’s authors are wrong to suggest that the achievements of computing and wider access to data have made some advanced math courses irrelevant.
This rationale is no more valid than saying that grammar- and spell-checking tools have eliminated the need for students to learn how to write. If anything, the pervasiveness of computers means that we should focus more on mathematical reasoning, not less. As science and engineering educators, we have seen firsthand how students lacking a strong foundation in math struggle to learn both data science and engineering at the college level…
The proposed framework prioritizes providing students with multiple pathways in their math education and the option to choose their courses. But the efficacy of this approach is not supported by data and reflects a poor understanding of how fundamental math skills build on one another. The proposed choose-your-own-adventure approach to math pathways for high school juniors and seniors is fundamentally flawed…
These flaws in the proposal have prompted more than 2,000 STEM professionals and academics — including many in the field of data science — across the country to sign open letters raising concerns about the California Math Framework. The signatories include seven Nobel Prize winners, five Fields medalists and three Turing Award winners, as well as more than 200 professors from the University of California system, USC and Stanford University. Their concerns should be addressed.
So even if you set aside the focus on making math anti-racist, the new framework has significant flaws and is full of misleading descriptions of cited studies. And as I pointed out in March, the focus on detracking in the new framework was partly based on a similar program in San Francisco. But according to education policy analyst Tom Loveless, the San Francisco program was not a success.
SFUSD declared detracking a great success, claiming that the graduating class of 2018–19, the first graduating class affected by the policy when in eighth grade, saw a drop in Algebra 1 repeat rates from 40% to 8% and that, compared to the previous year, about 10% more students in the class took math courses beyond Algebra 2. Moreover, the district reported enrollment gains by Black and Hispanic students in advanced courses…
Families for San Francisco, a parent advocacy group, acquired data from the district under the California Public Records Act (the state’s version of Freedom of Information Act). The group’s analysis, available here, calls into question the district’s assertions. As mentioned previously, repeat rates for Algebra I dropped sharply after the elimination of Algebra I in eighth grade, but whether the reform had anything to do with that is questionable. The falling repeat rate occurred after the district changed the rules for passing the course, eliminating a requirement that students pass a state-designed end of course exam in Algebra I before gaining placement in Geometry. In a presentation prepared by the district, speaker notes to the relevant slide admit, “The drop from 40% of students repeating Algebra 1 to 8% of students repeating Algebra 1, we saw as a one-time major drop due to both the change in course sequence and the change in placement policy.”
The claim that more students were taking “advanced math” classes (defined here as beyond Algebra II) also deserves scrutiny. Enrollment in calculus courses declined post-reform. The claim rests on a “compression” course the district offers, combining Algebra II and Pre-Calculus into a single-year course. The Families for San Francisco analysis shows that once the enrollment figures for the compression course are excluded, the enrollment gains evaporate. Why should they be excluded? The University of California rejected the district’s classification of the compression course as “advanced math,” primarily because the course topics fall short of content specifications for Pre-Calculus.
More to the point, minority students who went through the detracked program were no better off. Black students in 11th grade were still doing math at a 4th-grade level and Hispanic students at a 5th-grade level, on average. The achievement gap actually widened after this program was implemented. That’s clearly not the kind of equity California is looking for, but it may be what the state gets if the new framework’s detracking is implemented.