TIMSS and PIRLS
Posted: December 11, 2012
The latest instalment in the international educational data soap opera has been published today. TIMSS and PIRLS between them seek to measure comparative international performance in maths and science for 10- and 14-year-olds and in reading for 10-year-olds. They also seek to show how performance has changed in absolute terms in different countries over time.
There is something very beguiling about this kind of report. It has a wonderful air of precision, and it’s very easy to forget things like margins of error and differences between countries in how the tests are administered. For example, in 2007 Kazakhstan was apparently fifth in the world in maths for 10-year-olds. Really?
Having said that, everyone is going to be poring over the results and trying to draw lessons from them. So what do they show?
In terms of absolute performance for England they show:
- A big improvement in primary maths since 1995 with most of the improvement being between 1995 and 2003.
- Overall an improvement in secondary maths since 1995 but with a slight falling back between 2007 and 2011.
- A small decline in primary reading between 2001 and 2006 balanced by an improvement between 2006 and 2011.
- After improvement between 1995 and 2007 in primary science, some falling back between 2007 and 2011.
- A very modest improvement in secondary science between 1995 and 2003 but an equally modest decline since then.
In terms of relative performance, it’s much the same story. Up in some, down in others but in both cases within a narrow range. We also need to notice that participation varies – some countries didn’t take part every year and this affects league table position.
What conclusions can we draw from this? First, perhaps, in the immortal words of Corporal Jones: “don’t panic”. Other countries see small-scale improvements and declines too. For example, between 2003 and 2007 secondary maths performance declined in both Singapore and Hong Kong. Any individual result should be approached with some caution and could easily be statistical noise.
So can we draw any useful conclusions? I would suggest that in very broad terms we can say:
- England’s performance is pretty consistent overall and places it amongst the best performers in the world.
- It’s better than that of the great majority of European countries – in maths, even very close to the much-lauded Finland.
- So this data doesn’t support the Govian nightmare of plummeting standards across the board.
- We should see if there is other evidence to support or contradict the suggestion of a decline in science performance.
One other finding worth looking at is the range of achievement in England. The gap between the highest and lowest performers in England is wider than in virtually all the countries with similar overall results. This data supports the idea that we have a tail of under-achievement. The OECD would suggest that this has something to do with the fact that we have one of the most socially and economically segregated education systems.
The leading Asian countries remain, by this measure, the world’s highest achievers. But before we go chasing after what we think are the lessons to be learnt from them, there is a very telling passage in the Guardian’s article about these results:
“The paradox is that while Truss and her boss, the education secretary, Michael Gove, seek a move towards an Asian-style system of crammed facts and rigorous exams, educators in those countries are wrestling with the paradox that their pupils too often emerge competent but narrow and uncreative in their thinking. Earlier this year Singapore’s education minister proposed a long-term move towards a less restrictive system.”
That certainly is something we should be thinking about.