Canada’s math wars and bad use of the PISA

Canada went through a bit of a panic recently when the PISA 2012 scores came out.

[Figure: Canada’s PISA scores.]

[Source.]

Oh no! Scores are dropping! Something must have been done wrong, so it’s time to change policy:

“If you look at what’s been happening, predominantly over the last decade, there’s been an unprecedented emphasis on discovery learning,” said Donna Kotsopoulos, an associate professor in Wilfrid Laurier University’s education faculty and former teacher.

Robert Craigen, a University of Manitoba mathematics professor who advocates basic math skills and algorithms, said Canada’s downward progression in the international rankings – slipping from sixth to 13th among participating countries since 2000 – coincides with the adoption of discovery learning.

[Source.]

As I pointed out in a recent post, PISA essentially measures problem solving, and it seems strange to beef up calculation in an attempt to improve problem solving, especially considering Canada’s performance on the TIMSS, which does tend to measure calculation. While Canada as a whole hadn’t participated in TIMSS since 1999 (it did in 2015, although that report isn’t out yet), some provinces did:

Ontario 8th grade: 2003 (521), 2007 (517), 2011 (512)
Ontario 4th grade: 2003 (511), 2007 (512), 2011 (518)
Quebec 8th grade: 2003 (543), 2007 (528), 2011 (532)
Quebec 4th grade: 2003 (506), 2007 (519), 2011 (533)

[Figure: provincial TIMSS score trends.]

So: Ontario had a minor dip in 8th grade and a rise in 4th grade, both changes close to the edge of statistical significance, while Quebec fluctuated down and then up in 8th grade and rose overall in 4th grade.
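
As a rough check on “close to the edge of statistical significance,” here is a minimal sketch of the usual two-sample z-test on scale-score differences. The standard error is a hypothetical placeholder (the real provincial SEs are published in the TIMSS technical reports), so the verdict flips depending on what you assume:

```python
import math

# Hypothetical standard error for a provincial TIMSS mean score; the real
# values (a few scale points) are in the TIMSS technical reports.
SE = 2.5

def change_is_significant(score_a: float, score_b: float, se: float = SE) -> bool:
    """Two-sample z-test for the difference between two mean scale scores,
    treating the cycles as independent samples with equal standard errors."""
    z = (score_b - score_a) / math.sqrt(se**2 + se**2)
    return abs(z) > 1.96  # 5% two-sided threshold

# Ontario grade 8, 2003 vs. 2011: a 9-point drop.
print(change_is_significant(521, 512))        # True with SE = 2.5
print(change_is_significant(521, 512, se=4))  # False with SE = 4.0
```

Either way, a 9-point move over eight years sits right at the edge of detectability.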

This does not sound like the sort of data to cause a major shift in education policy. If anything, the rising numbers in 4th grade (where the lack of drill gets decried the most) indicate that the discovery curriculum has helped rather than hurt calculation skills. (Ontario, for instance, while requiring 4th graders to be able to multiply numbers up to 9, does not require memorizing multiplication tables.)

Let’s also lay on the table these quotes on the troubled nature of PISA scores themselves:

What if you learned that Pisa’s comparisons are not based on a common test, but on different students answering different questions? And what if switching these questions around leads to huge variations in the all-important Pisa rankings, with the UK finishing anywhere between 14th and 30th and Denmark between fifth and 37th?

… in Pisa 2006, about half the participating students were not asked any questions on reading and half were not tested at all on maths, although full rankings were produced for both subjects.

While I wouldn’t say the scores are valueless, I think using them as the sole basis of an educational policy shift is troubling. Even if we take PISA scores at face value, the wide-open nature of the actual questions, which mimic a discovery curriculum, indicates you’d want more discovery curriculum, not less.
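
To make the quoted criticism concrete, here is a toy simulation (every number in it is invented) of what can happen to rankings when each country is measured with a different random booklet of questions: countries whose true scores sit a couple of points apart swap places from booklet to booklet.

```python
import random

random.seed(1)

N_COUNTRIES = 20
true_scores = [500 + 2 * i for i in range(N_COUNTRIES)]  # countries 2 points apart

def observed_mean(country: int, n_items: int = 30, noise_sd: float = 30.0) -> float:
    """Mean performance on one random booklet of items: the country's true
    score plus per-item noise (student sampling, item selection), all made up."""
    return sum(true_scores[country] + random.gauss(0, noise_sd)
               for _ in range(n_items)) / n_items

for booklet in range(5):
    order = sorted(range(N_COUNTRIES), key=observed_mean, reverse=True)
    print(f"booklet {booklet}: country 10 ranks {order.index(10) + 1} of {N_COUNTRIES}")
```

With only 30 items and that much per-item noise, a mid-pack country’s rank jumps around by several places purely from the luck of the draw: the UK-between-14th-and-30th phenomenon in miniature.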


15 Responses

  1. “As I pointed out in a recent post, PISA essentially measures problem solving, and it seems strange to beef up calculation in an attempt to improve problem solving,”

    You need both. It’s silly to think our successful students can only do one or the other. I have never worked with anyone who can do only one.

    • Sure. My main point, of course, is that you can’t use PISA alone as evidence of a lack of calculation skills (especially with the Ontario TIMSS numbers indicating things aren’t so bad).

      • We don’t use PISA “alone,” Jason; that’s a straw man.

        You are quite right that PISA tests “problem solving” (I’ll accept that rubric, close enough: it’s RME). Funny, educationists in Alberta have been arguing (as one put it on CBC recently) that PISA tests “only memorization and algorithms,” and therefore (a) we don’t value these results (so … ah … why are we doing PISA …?) and (b) that stuff was taught before the “new” methods came in. Glad to see you understand the nature of what PISA is telling us.

        But, Jason, you say “it seems strange to beef up calculation in an attempt to improve problem solving.” I think you mean fundamentals, which include “calculation,” as you put it.

        Well … look again at who tops the PISA rankings, and at who has slid over the 9-year period between the two years in which math was the primary domain: the top systems are those prioritizing basics-up education. They start by setting down proper foundations and build upwards.

        Those systems incorporating constructivist and indirect instruction methods … have slid. Sweden worst of all: more than any other PISA participant. The OECD wrote a 180-page report on what went wrong in Sweden. But … wait a minute: two provinces did even worse than Sweden over that period.

        What has happened in Canada during that period is that 9 provinces moved toward more of a top-down model (which Dr. Stokke’s C.D. Howe report places under the rubric of “discovery learning”): teaching understanding while assuming the basics will somehow take care of themselves.

        But such does NOT happen. It is teaching understanding without knowing; it’s nonsense. And this is borne out in the PISA data: though PISA supposedly tests “understanding,” the systems that succeed are those that best support foundations.

        It is borne out even better in TIMSS data, which *does* examine skills at a finer level. And Canada’s performance there has been a disaster. Consider that no participating Canadian province could do much better than random guessing on the simple question 1/3 − 1/4. This was in Grade 8! See the graphic I constructed here: http://www.theglobeandmail.com/incoming/article15763259.ece/BINARY/PDF%3A+How+Canadian+students+fared+on+two+problems%3F+Not+much+better+than+guessing
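
        For reference, the arithmetic the question calls for is a single common-denominator step,

        $$\frac{1}{3} - \frac{1}{4} = \frac{4}{12} - \frac{3}{12} = \frac{1}{12},$$

        so “much better than random guessing” is not a high bar here.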

        This is consistent across many studies, including the largest comparative education study ever, the ironically named Project Follow Through: systems emphasizing basics did better on COGNITIVE (i.e., “understanding”) outcomes.

        See here: http://psych.athabascau.ca/html/387/OpenModules/Engelmann/evidence.shtml

        Systems emphasizing the cognitive domain (the approach used in most of Canada today) not only did worse than the “basics” systems but also significantly worse than the control. So using NO intervention was better than that approach. It’s the disease, not the cure.

        Finally, consider Canada’s own national assessment, PCAP, which compared systems using indirect (i.e., “understanding”-first) instruction with those using direct instruction. Again, *direct* instruction is positively correlated with stronger outcomes; indirect is *negatively* correlated.

        See http://www.cmec.ca/docs/pcap/pcap2010/English/6_ContextualAnalyse/Context%20ReportEn.pdf

      • Hey Robert, thanks for stopping by.

        While perhaps you aren’t basing everything on PISA, based on all the news articles I read, some people are. I do find it interesting that anyone would make the argument that PISA only measures calculation, because it is really hard to try some sample questions and hold to that view (see my other post). Some of them read like they came straight off the MARS tasks, which I consider the prototypical constructivist framework.

        Here’s the thing about the fraction problem: you seem to have the implicit idea that students used to be able to solve it under the old curriculum. I don’t know the Canadian data well enough to say, but in the US that sort of problem never had a good result, no matter what era you pick. While I find it troubling, I think it suggests a microlevel rather than a macrolevel change. Common Core in the US makes an interesting go at fractions (a lot seems to be based on the work of Hung-Hsi Wu; some also seems to be swiped from Hong Kong), and if you’re unfamiliar with it there’s a good series over at Illustrative Mathematics. (Disclaimer: I worked on the videos myself; the ones up now are remix versions.)

        I’m going to have to settle down and read some of the other stuff you linked to. Maybe a followup post is in order.

      • Hi Jason. As I lurk around your blog today I see you replied to my prior comment. I’ll just reiterate that we are not the ones arguing that PISA tests only calculation skills. The most prominent such person in Canada is probably Dave Martin, a high school math teacher in Red Deer, Alberta, who is involved in committees dealing with their educational restructuring along their own Alberta-based variety of “21st Century Learning”: Inspiring Education, which seems to have at its heart a made-in-Alberta variant of discovery education called “Discipline-based Inquiry Learning,” a product of the Galileo Network there.

        Anyway, Martin is a strong proponent of discovery methodology and also a vocal opponent of testing (in the style of Alfie Kohn). Dave likes to join in these debates and has repeatedly dismissed the significance of PISA scores on the basis that “PISA tests only calculation and memorization.” His point is that, yes, the Alberta kids tested in 2012 had 4 years of the current WNCP curriculum prior to testing, but PISA only tests grade-level material from before that, so the decline should be attributed to the former curriculum. Alberta’s 9-year decline, by the way, was 32 points, steeper than the aforementioned catastrophic collapse of Sweden’s PISA score.

        Just wanted to make that clear, as I’m not sure who else you might be invoking as trying to make that case; a couple of other educationists (not fundamentals advocates) in Canada have echoed Martin’s line, but I think they’re just being too lazy to check it out for themselves. I recall one particular case where the conclusion was “… and since we don’t care about that stuff as much as problem solving, PISA scores are of no significance to us.” Anything to wave them off. What they don’t do is provide credible alternative explanations for the consistently falling scores across Canada during this period.

        There are people on the (let’s call it) “basics” side who argue that PISA is a lousy test. I won’t argue with that — I think its diagnostic value is below that of TIMSS, and this largely because of its RME philosophy. However, I’ve always been of the view that data is data as long as it is consistently handled and honestly analysed. And it is poignant to observe that those systems that perform best in PISA are the ones that most effectively support the teaching of basics in early years. To those of us who stand for laying a foundation in early years and deal honestly with the results of studies like PFT there is no surprise here. *Of course* those with their basics in place will do better, ultimately, in contextual and complex problem solving than those who have a weak foundation; that is almost a tautology.

        Something annoying I’d like to point out here: your repeated use of “calculation” as a shorthand (so I’m assuming) for the teaching of fundamentals is as bad as your use of “lecturing” as shorthand for “direct instruction.” I wish you would stop that; I know of few who reduce the teaching of basics to “calculation,” any more than to “memorizing math facts.” Those are components of basics instruction and do not capture the whole. Proper basics instruction lays a broad foundation of multiple elements necessary in early education and upon which later learning is built. For a good idea of what such a foundation consists of, I would commend examination of primary-grade JUMP Math, Singapore or Saxon math, or about 80% of all texts written after 1900 and before 1980, excluding the most virulent instances of the ’50s/’60s “New Math.”

        Anyway, let us deal in honest terms about each other’s positions, not joust with straw men.

        As for the fractions question and the “old curriculum,” let’s be clear that we’re talking about prior to 2006 in Canada (I won’t deal with the heterogeneous assortment of “old” American curricula). The WNCP curriculum used across 8 provinces was terrible starting in 1995 but became much worse 10 years later with the current revision. During the whole period, however, we saw a steady slide among Canadian provinces. On TIMSS 2011 you’ll see the three participating provinces about on par with the U.S. collectively but well below your top-performing states. Those are historically our top three provinces. Quebec, on the rise and a bit better than the others, is emerging from an earlier foray into discovery-based instruction, which did not work out very well. http://www.sciencedirect.com/science/article/pii/S027277571400034X
        All other provinces have been diving deeper into discovery during this period; their PISA scores are falling.

        I’m familiar with Hung-Hsi Wu’s work, having corresponded with him in the past about content questions. I doubt his connection to the CCSS fractions sequence, but I’d be interested in seeing any evidence of it you might be willing to share.

        I’m no big fan of CC Math. But for the record, the current grade-level outcomes for fractions therein are considerably superior to their counterparts in all 10 Canadian provinces today. CC also has the advantage of explicitly stating the expectation that standard algorithms be taught, something currently done in only 2 Canadian provinces.

      • I just looked at a couple of the fractions sequence videos. They are not bad; a few minor improvements would help. I would change the emphasis in a couple of places and articulate a few things differently. The videos don’t pretend to scaffold appropriately, but they do cover the practical side of the content pretty well. The approach in the videos I saw is conventional. I almost wept to see a relatively clear exposition of sums of fractions in Grade 4. Here it’s first seen in (yep, really!) Grade 7.

      • Sorry to blather on even more. I see I have perhaps been too oblique in answering this line in your post:
        “Even if we take PISA scores at face value, the wide-open nature of the actual questions which mimic a discovery curriculum indicate you’d want more discovery curriculum, not less.”

        I can’t tell if you are being ironic here. Yes, as I have touched on a couple of times, one would THINK that a “discovery” curriculum is the fix for doing well on discovery-based questions. You would think so, but you’d be wrong. That is what numerous strands of evidence (Project Follow Through, the PCAP direct/indirect instruction variable analysis, and the PISA scores themselves) indicate: this truthy notion is actually false. In fact, failure to adequately support the fundamentals, in favour of higher goals such as understanding and problem-solving skills … undermines BOTH.

        That’s really an adequate summary of my whole position on this matter; I probably should have just said that and shut up.

      • You would not believe how many revisions the fractions videos had to go through (and keep in mind that what is up there is remix 2.0, animated by a different group, so it has gone through even more). It is really hard to get things perfect.

        I am not super interested in who is arguing for what and whether there are political motives or some such. I just want to know the argument.

        I read and wrote about both PCAP and Project Follow Through, if you notice.

      • The fractions video project is a good initiative, and I wish you well. Please do resist, strenuously, the pressure to fuzzy it up. Stick with straight presentation of the math. Still, there is room for interesting “packaging,” as in this old 1966 educational clip, the Theorem of the Mean Policeman:

        Might even give you some ideas.

    • Well, I wouldn’t look for any sense in those arguing that PISA is a test of basic skills — they are simply displaying their own ignorance.

      I think there’s a good case that PISA data isn’t particularly helpful diagnostically, though it might be a half-decent room thermometer. (And there’s a good case against that too.) I’m happy to take data as data and treat it on its own terms, so I’m not anti-PISA, though my tone might sometimes sound that way. I think its sampling methodology is about as good as one could expect, and its data is handled professionally and (as far as this pure mathematician can tell) appropriately in terms of contemporary statistical practice.

      As for who has political motives, I thought this was pretty obvious. Progressive Education is intrinsically political. Even the terminology has political origins, arising from the group who worked with Dewey and Kilpatrick 100 years ago. Whatever else is true about Dewey, he was an irredeemably political animal, a prominent proponent of the international Democratic Socialist movement: socialists who regarded themselves as distinct from the Marxists in that they did not believe in violent REVOLUTION but in pacific EVOLUTION, or PROGRESS toward the great utopian socialist society. Dewey was an educational advisor to Mao, the two being mutual admirers, and a great friend and ally of Trotsky.

      If you’re looking for political motives, I’d start there and trace them through the Progressive Education movement up to and through the current Common Core initiatives and movements like the United Nations’ “World Curriculum” and Agenda 21. There are plenty of opportunists wanting to infuse education with their political agendas, from Dewey’s dreams of the schools as a primordial holistic collective of learners to the visionaries of the Frankfurt School who believed that socialism would come to the West only through a slow, “progressive” takeover of its cultural and power institutions: the courts, the government, the entertainment industry and, most importantly, the education system.

      As for why we, and the mathematicians in the U.S., make such a fuss: it is largely about being protective of the integrity of our subject. Our “politics,” as it were, concern our view that inappropriate tinkering with the subject matter and ill-conceived pedagogical advice to teachers are robbing future generations of a rich mathematical heritage. These motives are very similar to those of the teacher-led ResearchEd movement, which we regard as allies, and which is trying to reconnect teachers to educational research by cutting out the “consultancy class” that has laboured to colour, interpret, and, at times, run interference on the findings of research that pertain to the classroom.

      I’ve been around both the mathematical objectors to “progressive education” and the ResearchEd guys, and what strikes me about the people, politically, is that we are all over the map in terms of civic politics. In this business I am allied with people ranging from hard-knock old-school socialists of the Marx/Lenin school to free-thinking libertarians to hard-core conservatives, with a lot of squishy liberals along for the ride. It’s pretty well impossible to place us anywhere on the usual political spectrum.

      We often get portrayed as “rednecks” or old-school conservatives. I think that reflects a knee-jerk reaction by the “progressives” in education, who can’t seem to think in any other terms. Also, conservative policy think tanks tend to publish material in support of our positions, whereas the left-wing ones tend to ally with the progressive-education folks. Believe me, this really rankles the socialists on our side of the education debates. I found it remarkable that so many educationists spewed rhetoric about the C.D. Howe Institute being some fire-breathing hard-right think tank after Anna’s report came out. Actually, if C.D. Howe has any political orientation it is “hard center” Liberal. C.D. Howe himself was a highly respected Liberal Party cabinet member, but the institute works hard to distance itself even from that level of political affiliation, and has done a remarkably good job of representing issues from all sides of the spectrum.

      https://en.wikipedia.org/wiki/C._D._Howe_Institute
      https://www.cdhowe.org/about-us

      I think the fact that some characterize the institute in those partisan terms says more about them than about the institute. I guess, depending on where you’re standing, the center might look “far right”…

  2. Two and a half years to get the results out. Even Pearson does better than that!

  3. See also my post about PISA at mathinautumn.blogspot.ca

  4. But when you have an a priori political agenda for education (e.g., “back to basics über alles”), there’s a tendency to see one solution to every problem, including problems that don’t exist. Politicians as a group are not known to be deep mathematical thinkers, and when they take their advice from mathematicians who aren’t deep educational/pedagogical thinkers, the propensity is for pseudo-panic whenever ANY test scores related to math appear to be dropping, regardless of what’s on the test, whether the test is any good, how meaningful the sampling is, how statistically significant the fluctuation is, etc. The answer is always the same: more drill, more calculation, back to basics, dump the calculators, computers, and other electronic technology, and so forth. Canadians are no more immune to this silliness than Americans are, from what I’ve seen over the last 25 years, and Ontario seems to have a special attraction to panic and extreme shifts in policy. Ho, hum.

    As to the claim Mr. Hansen offers about those with whom he’s worked and what those people can or can’t do: 1) about whom are you speaking? Adult co-workers? Kids? Some random sampling of K-12 students you train? I can’t begin to guess; 2) are you seriously asserting that you’ve never encountered students (or adults, for that matter) who are competent calculators but go into mental arrhythmia when asked to solve a problem that doesn’t explicitly set up the required calculations? If so, you clearly are not a K-12 teacher. The mantra of countless K-12 students, generally starting in late elementary school, is: “I can do the math; I just can’t do those word/story/applied/real-world problems.”

    And in case the meaning of that isn’t crystal-clear to you, it translates, upon further questioning, into: “Ask me to crunch simple calculation problems and I’m golden; ask me to glean what calculations are required and I’m lost. And don’t even bother asking any sort of problem that requires modelling, non-routine mathematical thinking, thinking outside the box of school mathematics, etc.”

    I would guess that between 80% and 90% of the students one encounters in most American high schools would sign on to those statements. Of course, there are magnet schools, schools in very affluent communities, highly competitive private schools, etc. where it might be just the opposite. But they are the exception, not the rule. Nothing new here, nothing shocking. It’s always been that way in this country. It would take a radical restructuring of math education (and probably education in general) to change things for the better. But whenever anyone tries to move things away from the tried-and-failed methods of the last 150 years, the nay-sayers, conservatives, and reactionaries arise en masse to undermine and undo the changes. And method number one for doing that is to suggest that those of us who want to see a balance between number-crunching and problem-solving as part of what goes on from the beginning in our classrooms and schools are REALLY calling for throwing away any and all work on calculation. Just as you appear to be trying to do with your comment.

