It’s the exam season, and I’ve been brooding on a question from an earlier post: can we develop useful theories of how and why students fail to learn?
A few weeks back I started toying with the idea of “junk thought”, tentatively defined as habits, beliefs or information that aren’t necessarily wrong or damaging in themselves, but that occupy the space that is needed by more valuable thought. Factorising quadratics, for example, isn’t necessarily a useless skill, but it becomes junk thought when it occupies the mental space marked “how to solve a quadratic equation” because it then becomes an obstacle to learning more general approaches to the problem. While reading around the literature on students’ approaches to problem-solving with this in mind, I found the paper-trail of an article by Shlomo Vinner, “The pseudo-conceptual and the pseudo-analytical thought processes in mathematics learning” [Educ. Stud. Math. 34: 97-129, 1997]. It’s a nicely forthright paper which contains some powerfully clarifying ideas, and I wish I’d discovered it before.
Rather than simply “junk thought”, Vinner describes two related thought processes which students may use — often unconsciously — to bypass genuine mathematical reasoning. Pseudo-conceptual thinking means, roughly speaking, navigating by vague association:
In mental processes that produce conceptual behaviours, words are associated with ideas, whereas in mental processes that produce pseudo-conceptual behaviours, words are associated with words; ideas are not involved.
(As I understand it, “words”, in the sense used above, include mathematical expressions, so pseudo-conceptual behaviour is what occurs when one mathematical expression triggers a vague association with a superficially similar expression.)
Pseudo-analytical thinking, also roughly speaking, means using shortcuts or templates to solve standard problems without properly considering whether they’re applicable. This is the kind of process that generates answers to the “how old is the shepherd?” problem, and it seems to be very common among mathematical learners. Vinner discusses one of the classic examples, and several more are informatively dissected in a paper by Johan Lithner [Educ. Stud. Math. 52: 29-55, 2003]. I’d put in the same category the odd strategies that students develop for dealing with trig problems: for example, students studying mechanics often find resolving vectors tricky because, as several have explained to me, “I thought cos meant the horizontal bit and sin meant the vertical bit”. And, of course, exam techniques such as looking for keywords fall into this category too. I encountered a good example in an “advanced” calculus exam in which students were asked to maximise a particular function over the unit sphere. Seeing the keyword “sphere”, a majority of those students who attempted the question promptly parameterised the surface and started calculating normal vectors, confident that the problem would require them to integrate something over it…
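(A made-up mechanics example, sketched in Python, of why the “cos = horizontal, sin = vertical” rule fails: the example and numbers are mine, not from any of the papers cited above.)

```python
import math

# A weight W resting on a slope inclined at angle theta to the horizontal.
# The component of W *along* the slope is W*sin(theta) and the component
# *perpendicular* to the slope is W*cos(theta) -- neither component is
# horizontal or vertical, so the memorised rule gives no guidance here.
# What matters is which angle the component makes with the force, not
# some fixed association between cos and "the horizontal bit".

W, theta = 10.0, math.radians(30)

along = W * math.sin(theta)   # down the slope
perp = W * math.cos(theta)    # into the slope

print(round(along, 2), round(perp, 2))  # 5.0 8.66
```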
Vinner suggests that these “pseudo” behaviours arise essentially from a flawed didactic contract. Students frequently want to please the teacher or the examiner by giving right answers, rather than to understand the material deeply — indeed, many students may not recognise a distinction here — and so it is natural for them to develop strategies for doing this which don’t require too much effort and will work enough of the time to keep everybody happy. He contrasts with this attitude the “cognitive commitment” required to genuinely learn. This “commitment” is not the same thing as the emotional states of wanting to learn or of enjoying learning; rather, it means being in a state where ideas can be assimilated or accommodated — in terms of the “junk thought” metaphor, having a mind that is sufficiently free from junk for new ideas to be fitted in somehow. (My mental image is of trying to reorganise a crowded shed…)
The other important concept in Vinner’s scheme is that of a control mechanism. Even as experienced mathematical thinkers we may associate ideas in a pseudo-conceptual manner or take pseudo-analytical approaches to problems (Doron Zeilberger, with his customary understatement, has argued that most or all mathematical thinking works this way); the difference is that we habitually deploy control mechanisms to check whether the thinking we’ve employed was valid. A simple example of a control mechanism is checking that one’s solution satisfies the original equation; more sophisticated controls may involve a deeper examination of concepts or of the context of a problem. (This application of controls is reminiscent of the systematic reworking using formal “concept definitions” of a proof that was obtained in outline using informal “concept images” — terminology which is also due to Vinner and his co-authors.) What characterises “pseudo” behaviours, and junk thought in general, is that appropriate control mechanisms are not activated.
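As a minimal illustration of the simplest control mechanism mentioned above (my example, not Vinner’s): squaring both sides of an equation is a standard manipulation, but it can introduce spurious roots, and only substitution back into the original equation catches them.

```python
import math

# Solving sqrt(x + 3) = x - 3 by squaring both sides gives
# x**2 - 7*x + 6 = 0, whose roots are x = 1 and x = 6. Squaring is a
# valid-looking step that silently enlarges the solution set, so we
# apply the control mechanism: substitute each candidate back into
# the *original* equation and keep only those that satisfy it.

def satisfies_original(x):
    return math.isclose(math.sqrt(x + 3), x - 3)

candidates = [1, 6]
valid = [x for x in candidates if satisfies_original(x)]
print(valid)  # [6] -- x = 1 gives sqrt(4) = 2 but x - 3 = -2
```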
All this, at one level, is simply fleshing out the generic idea of deep versus surface learning. What I think it adds to this is a sharper focus on the way that surface (or strategic) learning approaches manifest themselves in mathematics. This is important because in maths there can be an unusually large gap between the depth of learning that we want to instil and the shallowness of learning that will enable students to carry out procedures in assessments. In particular, I’d guess that the opportunities for pseudo-analytical thought are rather more pronounced in maths than in more verbal subjects — if only because we tolerate a lack of mathematical grammar in our students’ work far more than we would a lack of linguistic grammar.
In the sense that remedial teaching is mostly about eliminating junk thought and replacing it with genuine thought, a great deal of maths teaching is remedial. So, how do we do this? This brings me back to the post I was originally going to write, because a strategy that recurs throughout Vinner’s article is one I’d already stumbled on myself: setting what I call “gotcha” questions.
A “gotcha” question is one that implicitly invites the student into a trap. Frequently these questions aren’t designed as such, but it turns out in tutorials or homeworks that this is how they function. In Vinner’s terms, they’re “problems such that trying to solve them in the pseudo-analytical mode will result in an erroneous answer”; or in terms that he quotes from Vygotsky, “crucial questions” that “can reveal the true nature of the subject’s thought”. They’re also the questions that violate the didactic contract and provoke outrage and rebellion among the class. Depending on one’s attitude as a teacher, one either quietly removes such offending questions from the example sheet, or makes a point of setting them every subsequent year. As earlier posts, such as my rant about inequalities, make clear, I’m in the latter category.
I’ll return at the end of this post to the question of how effective “gotcha” questions can hope to be, but first I want to anatomise one in detail. This is a “gotcha” question which I’ve been setting to first-year engineering maths students for the last few years, and which causes grief so routinely that it’s become affectionately known to tutors as That Blasted Trough Problem.
A trough is 3 metres long. Its ends are equilateral triangles of side 1 metre. If water flows in at a rate of 0.5 cubic metres per minute, how fast is the water level rising when it is 0.3 metres deep?
(The answer given in the back of the book is $\sqrt{3}/216 \approx 0.008$ metres per second.)
I didn’t originate this question, but it turns out to contain a small assault course’s worth of traps.
The first trap shows up when the students, following a perfectly reasonable problem-solving procedure, try to sketch the problem. Roughly two thirds of students draw the trough as a triangular prism with the horizontal side of the triangle at the bottom, and many proceed happily to an answer on this basis. I think the junk thought here is the habit, acquired very early on in school, of always drawing a triangle with the “base” at the bottom. (Similarly, a triangular prism is so often illustrated as a Toblerone box that of course it must have a base at the bottom and an apex at the top.) This habit is so strong that it overrides the fact that these students — who are, remember, engineers — are generally quite good at visualising objects and applying mechanical common sense. In Vinner’s terms, a pseudo-conceptual definition of a triangle as an object with a horizontal base and an apex above it has survived the attentions of an inadequate control mechanism.
[Two asides. First: students could reasonably object that the question doesn’t state that the axis of the triangle must be vertical; an interesting discussion of modelling assumptions would then become possible, but in practice none of my students have ever raised this objection. Second: a very similar pseudo-conceptual response is what makes one of Henry Dudeney’s brainteasers an effective puzzler even for mathematicians. Following an increase in window tax, a man decides to reduce the size of his main window, a square three feet on each side. He boards up half of the area of the window, but it remains three feet wide and three feet high. How?]
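For contrast, here is what the first trap produces if carried through to an answer (my sketch; with the base at the bottom the cross-section of the water is a trapezoid rather than a triangle, and the resulting rate looks perfectly plausible, which is exactly why the error survives).

```python
import math

# The mistaken "base at the bottom" setup, worked through for contrast.
# With the apex at the top, the water's cross-section at depth h is a
# trapezoid: bottom width 1, top width 1 - 2*h/sqrt(3), so its area is
# h - h**2/sqrt(3) and V(h) = 3*(h - h**2/sqrt(3)). Differentiating:
# dV/dt = 3*(1 - 2*h/sqrt(3)) * dh/dt.

def dh_dt_base_down(h, dV_dt=0.5):   # dV/dt in cubic metres per minute
    return dV_dt / (3 * (1 - 2 * h / math.sqrt(3)))

print(round(dh_dt_base_down(0.3), 3))  # 0.255 -- plausible, but wrong
```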
A second trap comes when the students, having correctly sketched the trough, try to formulate the variables in the problem. They have of course marked the lengths of the sides of the triangle correctly as 1 metre, and for many of them it now becomes inconceivable that this dimension shouldn’t enter the problem somewhere. I’ve seen some remarkable algebraic gymnastics as they try to set the depth of water, the width of the free surface and even the cross-sectional area to be 1 metre. The junk thought here is procedural, a pseudo-analytical response triggered by context: this is a maths problem, so (like “how old is the shepherd?”) it must use all the information in the question and only that information.
A third trap, which is probably the first one intended by the designer of the question, comes once the students have correctly identified the cross-sectional area of the water as equal to half the depth multiplied by the width of the free surface: instead of then relating the depth to the width using the fact that the cross-section is equilateral, many students try to keep the width constant while varying the depth. I think the problem here is an example of the difficulty that students frequently have identifying constants and dependent and independent variables. Soon et al. [Int. J. Math. Ed. Sci. Tech. 42(8): 1023-1039, 2011] discuss this among other matters, and since reading their rather nice paper I’ve come to wonder whether this is why some of my students report that they find it much easier to tackle problems which involve numerical constants (“a particle of mass 10.5 kg”) than equivalent problems with symbolic constants (“a particle of mass m”). This trap is a little harder to describe in terms of junk thought, unless we take the junk to be the convention that maths problems — particularly calculus problems — involve a single variable and a single formula. If it turns out that this function isn’t called f and the variable isn’t called x, then that’s quite enough difficulty for one day, thank you. (This difficulty may also be analogous to the pseudo-conceptual conflation of a quadratic equation and a quadratic function discussed by Vinner (1997, pp. 105-6): the distinction depends on how such a statement fits into the context of the discussion, not merely on its mathematical shape.)
Finally, we have the dimensional sting in the tail: having worked through the problem correctly, many students evaluate the rate of rise of the free surface as roughly $0.48$ metres per minute and promptly panic. The trap, of course, is the need to convert between metres per minute and metres per second — surely a routine enough task for engineering students, but because the habit of overlooking units of measurement is so engrained (and so rarely penalised), both students and tutors can be completely blind to this. Can we see this habit as junk thought? I think we can, if we interpret it as the shortcut rule that units of measurement are adornments, which occur at the start and end of the problem but don’t really belong to the mathematics. (I think there may also be an issue of expected complexity: having got the wrong answer, it’s hard to believe that the error could be something simple — surely it’s more likely that the problem is in calculus, which is “hard”, than in units of measurement, which are “easy”. Again, this is probably reinforced by our habit of setting highly artificial questions where all the awkwardness is concentrated in a single step, and thus establishing a didactic contract in which every question has a unique point of difficulty.)
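For the record, the intended solution can be checked numerically (my sketch, assuming the apex-down “V” orientation), including the unit conversion that springs the final trap:

```python
import math

# Correct solution to the trough problem, apex-down orientation.
# Water of depth h forms an equilateral triangular cross-section of
# free-surface width 2*h/sqrt(3), so the area is h**2/sqrt(3) and
# V(h) = 3*h**2/sqrt(3). Differentiating with respect to time:
# dV/dt = (6/sqrt(3)) * h * dh/dt.

def dh_dt(h, dV_dt=0.5):             # dV/dt in cubic metres per *minute*
    return dV_dt / (6 / math.sqrt(3) * h)

rate_per_min = dh_dt(0.3)            # the number students actually compute
rate_per_sec = rate_per_min / 60     # the number in the back of the book

print(round(rate_per_min, 3), round(rate_per_sec, 4))  # 0.481 0.008
```

The gap between the two printed numbers — a factor of 60, not an error in the calculus — is precisely what the panicking students fail to consider.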
It’s possible that That Blasted Trough Problem contains too many traps for a single question. I don’t know, though: very few students can make their way through it without experiencing a “gotcha” at some point, and surely that has to increase the chance that they’ll identify some of their junk knowledge and be motivated to deal with it. Surely? Or will it just cement the reputation that I and my like-minded colleagues already have of being sadists whose one ambition is to come between our students and the pass mark they so dearly want?
That seems to be where the weakness of “gotcha” questions lies. If a student is already cognitively committed then a “gotcha” question may trigger a surprise reaction which makes it easier for them to rebuild their understanding. (Indeed, a recent paper by Cline et al. [PRIMUS 22(1): 43-59, 2012] argues that these questions are particularly good at prompting discussion when used as in-class exercises, and this agrees with my own experience using clicker systems — though the latter should be viewed in a context which suggests that some students are happy to use clicker systems in a pretty pseudo-educational way.) On the other hand, if a student isn’t cognitively committed to start with, s/he is liable to perceive “gotcha”s purely as a breach of contract and to resent them bitterly. Once a student has the idea that s/he is being treated unfairly, this tends to override all other considerations — hence, among other things, the perpetual strife over the fairness or otherwise of exam papers.
Some recent cognitive psychological work on “hypercorrection” effects provides an interesting perspective on this. There seems to be fairly robust evidence, originating with work by Butterfield & Metcalfe [J. Exp. Psych. 27(6): 1491-1494, 2001], that students are more likely to correct errors about which they were originally confident than those about which they were less confident. The surprise reflex triggered by “gotcha” questions may be part of this; another aspect is that hypercorrection occurs only when the students already have some knowledge of the topic — the so-called “knew it all along” effect — and otherwise overconfidence may be counterproductive, as one would naively expect. [See e.g. Metcalfe & Finn, J. Exp. Psych. 37(2): 437-448, 2011; Dunlosky & Rawson, Learn. Instruct. 22(4): 271-280, 2012.] Anastasia Efklides [Learn. Instruct. 22(4): 290-295, 2012] has also suggested that affective factors may be important: confidence is a positive emotion, and positive emotions make people more open to changing their minds. The flip side of this would seem to be that if students don’t have much existing knowledge, or if the effect of a “gotcha” question is to get their backs up with a sense of injustice, then asking such questions may be counter-productive.
As so often in education, the key to the box we need to open seems to be locked inside that box. If a student has entered into a teacher-pleasing contract based on pseudo-conceptual and pseudo-analytical activity then “gotcha” questions may be a way to break this contract — but at the risk of alienating the student entirely. If a student is committed to learning, on the other hand, “gotcha”s may be a good way to help them undo the damage caused by inadequate learning or pseudo-learning in the past — but then the student has already taken the most important step on the way.
I’m not sure whether this is a consoling thought, but even the greatest exponents of “gotcha” questions seem not to have discovered a way round this dilemma. Both the innocent-sounding questions of Socrates and many of the parables of Jesus had a “gotcha” element: they were designed to break through the pseudo-conceptual responses of Sophism and Pharisaism, and thus to force engagement with the demands that lurked behind the words. Both teachers were successful at opening the minds of their committed followers; both were also successful at arousing the sort of resentment that gets one executed as a public menace. At least neither my moral calibre nor my ability as a teacher is likely ever to place me in that category…
Where does this leave us? Junk thought, like junk food, is comforting: it gives one a nice sensation of being full already; of having fulfilled one’s side of the didactic contract and thus deserving a reward. Learning not to depend on it will inevitably be an uncomfortable, and even upsetting, experience for many students; some may feel it as a serious grievance. (Remember those infamous pictures of parents stuffing chips through the school railings when their kids protested that junk food had been banned at lunchtime?) I honestly don’t know whether the purgative of a “gotcha” question is among the best ways to deal with the junk that seems to clog so many students’ minds when they face a maths question. All I really know is that — unlike superficially kinder strategies — I’ve yet to be entirely convinced that these questions are altogether a failure.