It is tempting, when hearing about student performance on an international or national test, to assume such tests measure some monolithic mathematical ability. When a country does well on a test, its mathematics teaching must be doing fine; when a country does worse, its math teaching needs to be examined and changed.
Additionally, it is contended that countries doing well should have their strategies mimicked and countries doing badly should have their strategies avoided.
One issue with these thoughts is that the two major international tests — the TIMSS and PISA — measure rather different things. Whether a country is doing well or not may depend on what you think the goals of mathematics education are.
Here are some samples from PISA:
PISA Sample #1
PISA Sample #2
You are asked to design a new set of coins. All coins will be circular and coloured silver, but of different diameters.
Researchers have found out that an ideal coin system meets the following requirements:
· diameters of coins should not be smaller than 15 mm and not be larger than 45 mm.
· given a coin, the diameter of the next coin must be at least 30% larger.
· the minting machinery can only produce coins with diameters of a whole number of millimetres (e.g. 17 mm is allowed, 17.3 mm is not).
Design a set of coins that satisfy the above requirements. You should start with a 15 mm coin and your set should contain as many coins as possible.
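Since choosing the smallest legal next diameter can never reduce how many later coins fit, a greedy pass produces the largest possible set. A quick sketch (the function name is mine; integer arithmetic avoids floating-point rounding on the 30% step):

```python
def coin_set(smallest=15, largest=45):
    """Greedy coin design: each diameter is the smallest whole
    millimetre at least 30% larger than the previous one."""
    coins = [smallest]
    while True:
        # smallest integer d with d >= previous * 1.3 (ceiling division)
        nxt = -(-coins[-1] * 13 // 10)
        if nxt > largest:
            return coins
        coins.append(nxt)

print(coin_set())  # → [15, 20, 26, 34, 45]
```

So the best set has five coins: 15, 20, 26, 34, and 45 mm.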
PISA Sample #3
A seal has to breathe even if it is asleep in the water. Martin observed a seal for one hour. At the start of his observation, the seal was at the surface and took a breath. It then dove to the bottom of the sea and started to sleep. From the bottom it slowly floated to the surface in 8 minutes and took a breath again. In three minutes it was back at the bottom of the sea again. Martin noticed that this whole process was a very regular one.
After one hour the seal was
a. At the bottom
b. On its way up
c. On its way down
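The pattern can be checked with a few lines of arithmetic, assuming the initial dive also takes the same three minutes as the later ones (the problem says the whole process is "very regular"). The cycle is 3 minutes down plus 8 minutes up, so 11 minutes from breath to breath:

```python
def seal_position(minute, down=3, up=8):
    """Where the seal is `minute` minutes after its first breath,
    assuming a regular 3-minutes-down, 8-minutes-up cycle."""
    t = minute % (down + up)  # 11-minute cycle starting at a breath
    if t == 0:
        return "at the surface, taking a breath"
    if t < down:
        return "on its way down"
    if t == down:
        return "at the bottom"
    return "on its way up"

print(seal_position(60))  # → "on its way up"
```

After 60 minutes we are 5 minutes into a cycle (60 mod 11 = 5), past the 3-minute dive, so the seal is on its way up (choice b).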
Here are samples of TIMSS questions:
TIMSS Sample #1
Brad wanted to find three consecutive whole numbers that add up to 81. He wrote the equation
(n – 1) + n + (n + 1) = 81
What does the n stand for?
A) The least of the three whole numbers.
B) The middle whole number.
C) The greatest of the three whole numbers.
D) The difference between the least and greatest of the three whole numbers.
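For the record, the equation collapses to 3n = 81, so n = 27, the middle of the three numbers (choice B). A two-line check:

```python
# (n - 1) + n + (n + 1) simplifies to 3n, so 3n = 81
n = 81 // 3
assert (n - 1) + n + (n + 1) == 81
print(n - 1, n, n + 1)  # → 26 27 28, with n the middle number (choice B)
```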
TIMSS Sample #2
Which of these is equal to y^3?
A) y + y + y
B) y × y × y
C) y^2 + y
TIMSS Sample #3
To mix a certain color of paint, Alana combines 5 liters of red paint, 2 liters of blue paint, and 2 liters of yellow paint. What is the ratio of red paint to the total amount of paint?
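The answer here is 5 : 9, since there are 5 liters of red paint out of 5 + 2 + 2 = 9 liters total. Python's `fractions` module keeps the ratio exact:

```python
from fractions import Fraction

red, blue, yellow = 5, 2, 2            # liters of each color
ratio = Fraction(red, red + blue + yellow)
print(ratio)  # → 5/9
```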
The PISA tries to measure problem-solving, while the TIMSS focuses on computational skills.
This would all be a moot point if countries that did well on one test also did well on the other, but this is not always the case.
Possibly the most startling example is the United States, which scored below average in the 2012 PISA but above average in the 2011 8th-grade TIMSS, right next to Finland.
This is partly explained by the US having more students than any other country in the world “who thought of math as a set of methods to remember and who approached math by trying to memorize steps.”
The link above chastises the US for doing badly on the PISA without mentioning the TIMSS. It is possible to find articles with reversed priorities. Consider this letter from some Finnish educators:
The mathematics skills of new engineering students have been systematically tested during years 1999-2004 at Turku polytechnic using 20 mathematical problems. One example of poor knowledge of mathematics is the fact that only 35 percent of the 2400 tested students have been able to do an elementary problem where a fraction is subtracted from another fraction and the difference is divided by an integer.
If one does not know how to handle fractions, one is not able to know algebra, which uses the same mathematical rules. Algebra is a very important field of mathematics in engineering studies. It was not properly tested in the PISA study. Finnish basic school pupils have not done well in many comparative tests in algebra (IEA 1981, Kassel 1994-96, TIMSS 1999).
That is, despite the apparent objectivity of picking one test or another as a comparison, doing so raises the question: what is our goal in mathematics education?
(Continued in Canada’s math wars and bad use of the PISA.)