Thoughts on the Porkess Report

I’ve now worked my way through the Porkess Report. As it’s specific to the English system, and concentrates on school teaching, there are at least two good reasons why I can only comment on it as an interested amateur. However, if we’re lucky this document will do a lot to frame discussions of maths education in the UK more generally over the next few years, so it should probably be required reading for the whole of the profession.

Let’s get the obvious comment out of the way first. Having the committee fronted by Ms Vorderman — a TV presenter with good mental arithmetic skills but no experience of either teaching or real mathematics — was a gimmick, and one which presents the nature of mathematics and the credibility of the report in entirely the wrong way. (This is not, I hope, a sexist comment. I’d have been equally unhappy had the gig been given, say, to the great Johnny Ball; the closest parallel I can imagine would be inviting Michael Palin to chair a committee on foreign policy on the grounds that he’s good on telly and he’s been abroad.) A charitable explanation is that this gimmick was designed to win the report a level of publicity it wouldn’t otherwise have received; a less charitable explanation, reinforced by the first-name-dropping in the unspeakable Gove’s preface, is that it tells us something about Mr Gove’s afternoon viewing habits and something else about the depth of his understanding of mathematics.

Fortunately, the choice of chair doesn’t seem to have corrupted the rest of the choice of panel. We have one (apparently extremely successful) primary-school teacher, Ms Rahman-Hart; two former secondary school teachers now engaged in curriculum development (Dr Porkess and Mr Dunne); and one university mathematician (Professor Budd). I’ve not encountered the others, but Chris Budd is certainly widely respected in the mathematical community, and not known for pulling his punches when he disagrees with something.

So, what of the report itself? I was principally struck by how forthright it is: maths education in England is in a bad way; maths is an exceptional subject which requires exceptional treatment; much of the damage that has been done is because the people who’ve buggered about with education have failed to recognise this. (Ofqual and the erstwhile QCA come in for a particularly vigorous — and richly deserved — dose of the steel toecaps over this. It’s a shame the remit didn’t extend to Scotland, because I’d have loved to see the SQA treated similarly.)

I was familiar with a fair amount of the evidence presented in the report (e.g. the Nuffield report on uptake of post-16 mathematics and the PISA 2009 results, both of which caused some predictable soul-searching at the time before dropping off the news agenda as usual); little of the rest came as a surprise but it’s impressive to see it gathered.

One point that struck me (and that was returned to several times in the report) was the huge difference in the speed at which mathematical ability develops:

There is extreme variability in people’s aptitude for, and attainment in, mathematics; this has been estimated at ten years of learning by the age of 16. (p. 20)

A consequence of this variability is that early attainment in mathematics is a good predictor of later attainment:

The strong link between achieving a good level of performance in mathematics by the end of primary school and continuing success through secondary school is measurable. In 2007, of pupils who had not reached the expected standard by the end of primary school, only 10% went on to achieve at least grade C in GCSE mathematics at age 16. In contrast, 57% of pupils who had achieved the expected standard by the end of primary school, and 94% of those who achieved above the expected standard, achieved at least a grade C. (p. 35, quoting a 2008 DCSF / NAO report)

It is not clear whether this variability is regarded as largely innate or the effect of teaching: on p. 35 the report states fairly confidently that “probably the most important single factor in children’s success is their teacher”, but it’s not clear on what basis this is felt to be probable. This strikes me as an important issue: if mathematical ability is innately widely variable (my own personal guess), the emphasis should be on accommodating this variation; if it is largely the effect of variable teaching quality, the emphasis should be there. So, a slight black mark for finessing this point, but credit for raising the issue.

Another point that the report is refreshingly clear on is the different types and levels of “mathematics” that are appropriate to different students:

Too much of the curriculum experienced by students who are currently low attainers is a trickled down version of the requirements of the top 15%. (p. 22)

(Actually, from what I’ve seen of students with A-levels the trickling isn’t one-way: the notional content of A-levels may be based on the requirements of the top 15%, but the standards of mathematical reasoning required seem to be a lot lower than their needs imply.) This kind of complaint is familiar, especially from the other side of the Atlantic where “Calculus” seems to have become the pons asinorum used to sort the applicants for oversubscribed courses. (See, for example, Lockhart’s Lament or Doron Zeilberger’s Opinions 28 and 115.)

A third point that it was good to see being made is the difficulty (or nigh impossibility) of designing appropriate assessments, especially while working within an assessment framework prescribed by tumshies:

GCSE mathematics… is expected to cover the full range of students’ attainment at the age of 16. Research has shown that among 16-year-olds the range of mathematical development covers ten years; even without the upper and lower 20% tails, it is still about six years. Thus a GCSE in mathematics does not, and cannot, represent any particular standard. However, the expectation of end-users is that it should do just that…

With a large number of tiers, many weaker students find themselves entered for a tier where the papers are appropriate for them but their possible grades are restricted. However, with fewer tiers the papers have to cover wider ranges. This has two negative effects: the lowest grade on a paper is often awarded on a very low mark and syllabus coverage can only be achieved with more, shorter questions and so less emphasis can be placed on the fundamental mathematical skill of problem solving. (p. 51)

The present system encourages excessive re-sitting of AS units in order to gain an A level grade A… It is predicted that the amount of re-sitting will become even more of a problem with the introduction of the A* grade. This is awarded on a very high overall mark; while that sounds desirable, it is actually liable to be a reward for avoiding careless mistakes rather than for genuine mathematical performance. In our view it would be better to have an extra paper for the award of the A* grade. (p. 77)

Another example of such [an unnecessary] regulation is… that any unit (or module) may have only one mode of assessment. Thus a mathematics unit may be 100% examination or 100% coursework… but it may not contain a mixture. This precludes a model, which has a proven success in mathematics… (p. 97)

Ofqual has five Standards Objectives. These are flawed because they refer only to reliability and not to validity… Lack of validity in present qualifications lies at the root of many complaints from both employers and those in universities, but Ofqual is not accountable to them. (p. 97)

How, then, do the report’s authors see these points informing the various levels of qualification? Echoing most sane people, including the Royal Society, they recommend scrapping Key Stage 2 tests (up to age 11), and they point out in general that the practice of holding schools “accountable” via test results has a pernicious effect on education (p. 26). At GCSE, they endorse ACME’s proposal to expand the qualification to a double GCSE — echoing the double GCSE in English Language and Literature — comprising Applications of Mathematics and Methods in Mathematics. I’m not clear that this would necessarily solve the problems of inadequate time being spent on maths in schools, of the apparent irrelevance of much school maths or of meaningless assessment regimes; but it might at least provide space for some of them to be tackled. (An aside: during the upper Ordovician period when I did my Standard Grades, it seemed to be generally accepted that English and Maths each required double the classroom time of any other Standard Grade subject, even though each led to a single award. I guess that with league tables and inappropriate quantification this kind of common sense has been driven out of the school system: if so this says as much about the inanity of league tables as any other single fact I’ve come across.)

The headline-grabbing recommendation is that mathematics should be compulsory for all students aged 16 to 18 — that is, for the last two years of compulsory education once the leaving age in England has risen to 18. It’s a shame that attention has focussed here, because to my mind this section is a curate’s egg.

Let’s deal with the excellent parts first: the report identifies four bands of students with distinct needs (band A aiming for “STEM and other mathematics-heavy degrees”, band B for “degrees with some mathematical content”, band C for “degrees with little or no mathematics” and band D for “apprenticeships, employment, further education”; see p. 66), and it is very clear that these students require very different sorts of mathematics.

For band A, the recommendation is basically not to muck around with the existing A-levels in Mathematics and Further Mathematics, but that universities should generally require A-level Mathematics for all STEM subjects; as noted above, the authors are also sceptical of the validity of resits and of the A* grade — or “sampling noise at the tail of the distribution” as we might as well call it. A few cautious noises are made about introducing a mixture of linear and modular syllabuses (currently forbidden by regulations) but — in contrast to a fairly widely-held perception among university maths teachers — the report is reluctant to blame modularisation for many of the ills of A-level. (It’s understandable that Dr Porkess, as the designer of the first modular A-level (p. 104), might see merits in modules that are not widely perceived, but I felt there was a touch of Mandy Rice-Davies here.)

For band B, the recommendation is to oblige students to keep their hands in with “subsidiary mathematics” and statistics (basically AS levels) so that they don’t go completely rusty between GCSE and starting a degree course in which maths is required. As a teacher of “service courses” for first-year university students, I have to applaud this: after a single year away from maths after Higher, many students seem to have regressed to Standard Grade or lower, while those who have taken two years or more away from maths end up completely at sea. (The diagnostic tests I’ve run for the last few years with incoming engineering students suggest that an A or B at Advanced Higher is a reliable predictor of whether students have assimilated Higher material, while their Higher Maths grade tells you almost nothing: I suspect that “rustiness” is a large part of the explanation for this.)

For band C, the report recommends a new “Mathematics for Citizenship” course, but says nothing much about its content. This was a major disappointment for me: suddenly an opportunity arises to lay the restless spirit of CP Snow and immunise a large fraction of the population against numerological and numerical bullshit — and they say nothing about how it’s to be done. A faint pong of bad egg here, alas.

Finally, for band D, the report recommends a “mature GCSE” and “mathematics units in vocational courses”. Again, much less is said about these than I’d have liked to see: there’s a mention (pp. 68–69) of a forthcoming Engineering Diploma but nothing in more detail. Even with a target of 50% of the school population going on to university, band D surely includes half the students in the 16 to 18 age range; it also includes the estimated 22% of the 16–19 age group who are “functionally innumerate” (p. 3). Based on the report’s previous points about variable mathematical ability, many of these students can be expected to have levels of mathematical achievement more appropriate to those in the early years of secondary school and — at least for a decade or so until reforms to earlier stages have fed through — many of these students will already have been “turned off” maths. Finally, and speaking partly from personal experience, it is extremely hard to get basic mathematical ideas through to older students via what is effectively remedial teaching, once damage has been done at an earlier stage. (The report elsewhere places so much emphasis — correctly in my view — on teaching at primary level that it’s not clear why they believe much can be done to rescue the situation at this much later stage.)

In summary, band D represents the largest of the four cohorts and seems to offer peculiarly tough challenges, but the report doesn’t offer much beyond chaining students to the oars for an extra two years. Seven years after the Smith Report raised the need to look at “pathways” for maths for the 14–19 age range, this strikes me as underwhelming. If I were feeling particularly cynical, I might suspect that this recommendation had been given so much prominence in the media because somebody was already developing a pre-emptive rubbishing strategy. If such a strategy is being applied — and, given the number of people who will have been upset by the report, it seems unwise to rule this out — then it is very bad news. The report is a long way from perfect, but even the eggier parts of it make more sense than a great deal of what passes for informed discussion in educational policy, and it describes a landscape that I imagine most people in maths education will recognise.

Back to the recommendations. I don’t know enough about the frameworks for school teacher recruitment, training or CPD to comment on that chapter, so I’ll leave it alone and move on to HE. (This section is prefaced with the disclaimer “While we appreciate that responsibility for higher education lies outside the remit of DfE…”, which I find hard to read other than in a tone of weary irony, but maybe that’s just me.) Here, I admit, the authors are preaching to the choir: end caps on mathematics degree places; increase the mathematical entry requirements for degree courses that require maths (both STEM and non-STEM); generally admit that students’ mathematical underpreparation for university is a major cause of trouble and needs to be dealt with immediately:

It is hard to believe that this situation, in which large numbers of students are systematically under-prepared for their degree courses, has been allowed to arise and see it as a matter of extreme urgency that steps are taken to address it. At the moment all those involved are losers: students, universities and the country as a whole. (p. 93)


Finally, we have the section on government agencies (and if I were looking for a source of the rubbishing strategy, this is where I’d start my investigations). I’ve quoted a couple of the plums above, but here are some more, with a resonance that goes beyond the particular victims of the shoeing:

Until 2006, QCA [the Qualifications and Curriculum Authority] had a well-respected mathematics team. However, at that time it was decided that they did not need subject expertise and the team was disbanded. Thus QCA was left with almost no specialist expertise in mathematics (or, indeed, in other subjects) and a senior management that believed such expertise was unnecessary. We believe this was a very serious mistake: it seems manifestly obvious that the body responsible for the school curriculum should have expert knowledge of the various subjects involved and their particular requirements. (p. 96)

(Are you listening, HEA?)

Ofqual was set up as an independent body. It is thus accountable neither to the government nor to subject communities but at the same time its role in accrediting qualifications gives it enormous power over policy. It can block or undermine new initiatives and in the short time that it has been in existence has already shown a willingness to do so, disallowing potentially valuable mathematics qualifications that do not fit in with its general regulations. (p. 96)

(Are you listening, QAA?)

Ensuring that all syllabuses are of the same standard is Ofqual’s responsibility but, with no dedicated mathematicians on their staff, they are not in a position to do this… the judgements involved require a depth of mathematical understanding that cannot be replaced by ticking a list of boxes. Those without adequate subject knowledge are unable to make decisions about the quality of questions and the answers to them.

Clearly a major change is needed. (p. 98)

Not quite words of one syllable, but surely simple enough even for the average DfE, BIS or university apparatchik to get the message?

Finally, an extract from the Wolf Report on vocational education:

In recent years, both academic and vocational education in England have been bedevilled by well-meaning attempts to pretend that everything is worth the same as everything else. Students and families all know this is nonsense. (p. 97)

Not just students and families; and not just in England, alas. But if everyone involved in educational policy were to read the Porkess Report, and pay careful attention not just to the recommendations but to the tone, perhaps we might be able to force a little of the nonsense into retreat.

This entry was posted in Teaching.
