When the National Benchmark Tests (NBTs) were first considered, it was suggested that the results would assess entry-level students’ academic literacy, quantitative literacy and mathematical competence; assess the relationships between higher education entry-level requirements and school-level exit outcomes; provide a service to higher education institutions with regard to selection and placement; and assist with curriculum development, particularly in relation to foundation and augmented courses. We recognise the need for better communication of the findings arising from analysis of test data, in order to inform teaching and learning and thus help narrow the gap between basic education outcomes and higher education requirements. Specifically, we focus on identifying the mathematical errors made by those who performed in the upper third of the cohort of test candidates. This information may help practitioners in both basic and higher education.

The NBTs became operational in 2009, and data have been systematically accumulated and analysed since then. Here, we provide some background to the data, discuss issues relevant to mathematics, present common errors and problems in conceptual understanding identified in the Mathematics (MAT) tests written in 2012 and 2013, and suggest how this information could be used to inform mathematics teaching and learning. While teachers may anticipate some of these issues, it is important to note that the identified problems are exhibited by the top third of those who wrote the Mathematics NBTs. This group constitutes a large proportion of first-year students in mathematically demanding programmes.

Our aim here is to raise awareness in higher education and at school level of the extent of the common errors and problems in conceptual understanding of mathematics. We cannot analyse all possible interventions that could be put in place to remediate the identified mathematical problems, but we do provide information that can inform choices when planning such interventions.

The National Benchmark Tests (NBTs), as with all high-stakes testing, are at times viewed with anxiety and scepticism. Parents, test candidates and teachers appear to want more information, even though a large amount of information is readily available on the National Benchmark Tests Project (NBTP) website.

We provide background to the NBTs together with an analysis of the results of the mathematics tests written in 2012 and 2013. The results are considered at a national level, and we cannot analyse specific interventions that could be put in place; that is more properly the province of teachers, and of the academics involved in teacher education, who we hope will be better informed by the results presented here.

The NBTP was commissioned in 2005 by Higher Education South Africa (HESA), now called Universities South Africa, with the following objectives (Griesel,

To assess entry-level academic and quantitative literacy and mathematics proficiency of students.

To assess the relationship between higher education entry-level requirements and school-level exit outcomes.

To provide a service to higher education institutions requiring additional information to assist in admission (selection and placement) of students.

To assist with curriculum development, particularly in relation to foundation and augmented courses.

At the end of Grade 12 all school leavers write the National Senior Certificate (NSC); those wishing to enter higher education also write the NBTs if required to do so by the institutions to which they intend applying. All NBT candidates must write the Academic and Quantitative Literacy (AQL) test; those who intend to study in an area requiring mathematics need to write the Mathematics (MAT) test as well.

The norm-referenced NSC mathematics exam necessarily attempts to reflect the entire school mathematics curriculum. The criterion-referenced NBT MAT tests do not test anything outside the school curriculum, but they are not constrained to include all NSC mathematics topics, and thus focus on those aspects of the school curriculum that have a greater bearing on performance in first-year mathematics courses. The NSC mathematics exams and the MAT tests should therefore be regarded as complementary forms of assessment: the NSC attempts to answer the question ‘To what extent do NSC candidates meet the curriculum statement expectations as expressed in the subject assessment guidelines?’, while the NBTP attempts to answer the question ‘To what extent do students aiming to enter higher education meet the core academic literacy, quantitative literacy and mathematics competencies required of school leavers on entry to higher education study?’ Whereas the AQL tests are intended as tests of generic skills in the domains of academic and quantitative literacy, the MAT tests are explicitly designed to measure the mathematical preparedness of candidates for mathematically demanding curricula in higher education. The Curriculum and Assessment Policy Statement (CAPS), as was also the case with Curriculum 2005, emphasises the ability of mathematics to provide the necessary conceptual tools for analysing, making and justifying decisions (Department of Basic Education (DBE),

We would need to be convinced about the need for additional testing. We need to be shown where the NSC is not adequate, and we need to be convinced that the NBT is a credible test. In the end we run the NSC at huge expense. … Is it really justifiable to introduce something else? (Paton,

The then deputy director-general Penny Vinjevold questioned their purpose: ‘What will they be used for?’ she asked (Paton,

Spaull and Taylor (

A Council on Higher Education (CHE) study (Scott, Yeld & Hendry,

In the period after the introduction of the NSC in 2008 up to 2012, less than a quarter of Grade 12 learners writing mathematics achieved more than 50%. In the same period the proportion of learners achieving between 70% and 100% fell from 8.3% in 2008 to 5.9% in 2011 and increased to 7.0% in 2012. In 2012, only 15 800 learners achieved between 70% and 100% for mathematics compared with 24 900 in 2008 (Snyman,

With challenging school conditions and changing school curricula (Curriculum 2005, examined in Grade 12 from 2009 until 2013, and then the CAPS, examined for the first time in 2014), teachers find it hard to meet the challenges of changed content, changed emphases and different forms of assessment. In many cases aspects of the mathematical curriculum are not taught, or poorly taught, leaving learners less well prepared for higher education study. Over many years, experience in first-year courses reflects a lack of alignment between school outcomes and higher education expectations. Any information that can provide insight into this lack of alignment should be given consideration.

National senior certificate mathematics achievement rates 2010–2014.

Year | % achieved at 30% or more | % achieved at 40% or more |
---|---|---|
2010 | 47.4 | 30.9 |
2011 | 46.3 | 30.1 |
2012 | 54.0 | 35.7 |
2013 | 59.1 | 40.5 |
2014 | 53.5 | 35.1 |

While the increase in the proportion of students achieving 40% or above is encouraging, it is sobering to consider the low proportion of candidates achieving results that enable them to be admitted to degree programmes requiring mathematics. Even when admitted, such students may ultimately take longer than expected to complete their degrees, or drop out altogether.

The DBE is concerned about the problem (DBE,

Broad references to algebraic skill, the language of mathematics, knowledge of basic competencies and foundational competence provide insufficient information. Specific skills and competencies that may require particular attention are identified below. Analysis of MAT test results identifies problem areas, and their extent, among otherwise high-performing learners. Teachers, and mathematicians involved in teacher education programmes, are best able to design strategies to remediate aspects of mathematics that are barriers to success in mathematically demanding programmes in higher education. The findings presented here could also assist higher education in providing appropriate support, since such support is clearly necessary: even though it is the highest achieving school leavers, representing 16% of the age cohort, who participate in higher education, for the majority of students the curriculum structures are clearly not working (Adler,

Assessment is an important tool in informing teaching and learning. The DBE requires teachers and district officials to monitor learner performance and report progress. Several regional and international evaluations that include mathematical performance have taken place, such as the Trends in International Mathematics and Science Study (TIMSS, 1995, 1999, 2003) and the Annual National Assessments.^{1}

Educational goals and assessment goals are linked: learning involves the acquisition of skills and knowledge; assessment identifies the level of knowledge or skill acquired. If assessment is to be meaningful, it needs to advance learning and not simply record its status. If teachers can engage with clear examples of the problems their learners exhibit in assessment tasks, they will be better able to communicate to the learners the underlying mathematics that would facilitate better comprehension. For example (to give a very trivial one), if a teacher knows the theory of addition, they could help their learners understand what it means to add ‘like’ terms: the denominator relates to the name (i.e. identifies ‘like’ terms), the numerator relates to ‘how many’, and so on. Fewer learners would then think that fractions are added by simply adding numerators and denominators.
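The ‘name’ and ‘how many’ idea can be made concrete with a worked example (ours, not drawn from the test data):

```latex
% like terms: the denominator names the parts, the numerator counts them
\frac{2}{7} + \frac{3}{7} = \frac{2+3}{7} = \frac{5}{7}

% unlike terms must first be renamed over a common denominator
\frac{1}{2} + \frac{1}{3} = \frac{3}{6} + \frac{2}{6} = \frac{5}{6},
\quad \text{not} \quad \frac{1+1}{2+3} = \frac{2}{5}
```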

Mathematics should be a gateway, and not a gatekeeper, to success in higher education. Students entering science, technology, engineering and mathematics fields need to be proficient in the requisite mathematics. At university the prior domain knowledge and previous learning experiences that students bring to their studies are acknowledged as significant factors influencing student success (Crawford, Gordon, Nicholas & Prosser,

Teachers who are pressed for time, and teachers who need or want to ensure specific test results, tend to teach to the test. Unfortunately, teaching (and possibly learning) may be driven by the extent and type of assessment involved (Jennings & Bearak,

The advent of the ANAs has given rise to additional expectations from teachers. See for example the following DBE statement (

ANA is intended to provide regular, well-timed, valid and credible data on pupil achievement in the education system. Assessment of pupils’ performance in the GET Band (Grades 1–9) has previously been done at school level. Unlike examinations that are designed to inform decisions on learner promotion and progression, ANA data is meant to be used for both diagnostic purposes at individual learner level and decision-making purposes at systemic level. At the individual learner level, the ANA results will provide teachers with empirical evidence on what the learner can and/or cannot do at a particular stage or grade and do so at the beginning of the school year. Schools will inform parents of their child’s ANA performance in March.

The above statement suggests that data from external assessments are intended to be used diagnostically. Shalem and Sapire (

Smith, diSessa and Roschelle (

The MAT diagnostic information can highlight some errors and misconceptions and at the same time help teachers use that information. Since methods of assessment should enable learners to demonstrate what they know rather than what they do not know, not all MAT test items offer identifiable error types or misconceptions among their options. Doing so would ‘trap’ many candidates into selecting the apparently obvious option; if it is not provided, they are more likely to try to solve the problem and find an answer. To give a trivial example, candidates could be asked to choose the correct answer for the following item:

The volume of the cylinder in the diagram, in cm^{3}, is

(A) 200

(B) 201

(C) 202

(D) 203

If the correct answer is 202, then none of the others would reflect any misconception such as using diameter instead of radius, or calculating area instead of volume. The decision to limit the number of questions probing misconceptions minimises to some extent possible diagnostic opportunities, but yields more accurate results in terms of a candidate’s mathematical proficiency.
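For illustration only (the dimensions of the actual item are not reproduced here, so the radius r and height h below are hypothetical), misconception-based distractors of the kind the MAT item above deliberately avoids could be built as follows:

```latex
V = \pi r^{2} h
% distractor: diameter d = 2r used in place of the radius
V_{\text{wrong}} = \pi (2r)^{2} h = 4\pi r^{2} h
% distractor: total surface area calculated instead of volume
A = 2\pi r h + 2\pi r^{2}
```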

NBT data are obtained from the NBTP at the University of Cape Town, while the NSC data are obtained from DBE reports. The data are analysed by year, for candidates writing in each DBE province as well as under the Independent Examinations Board (IEB). The total number of candidates writing both the NSC and the NBTs increased by 16 453 between 2009 and 2013, to a total of 182 156; there were 41 314 candidates in 2012 and 45 245 in 2013. The MAT tests take three hours and comprise 60 multiple-choice questions. Items are scored dichotomously: a correct response is given a score of ‘1’ and an incorrect response a score of ‘0’. The total raw score is obtained by summing the scored item responses. Test items^{2}

To determine whether learners are able to make the transition between mathematics at secondary and tertiary level, the competencies that are required, but not necessarily made explicit, by higher education need to be assessed. The choice of competencies was until 2013 influenced by the four Learning Outcomes (LO1, LO2, LO3 and LO4) that appeared in the Learning Programme Guidelines of the National Curriculum Statement for Mathematics for Grades 10–12 (DBE,

The MAT tests are embedded in the NSC curriculum, but cut across the different learning areas. This means that whereas the NSC Grade 12 exam assesses separately different learning areas such as algebra, trigonometry and Euclidean geometry and measurement, a MAT test may include a geometry question that will be solved using trigonometry and algebra. The MAT test specification spreads questions into six clusters: algebra, functions, transformations, trigonometry, spatial reasoning and data processing.

The NSC Subject Assessment Guidelines previously specified a taxonomy of categories of mathematical demand, which indicated that learners needed to perform at the levels of knowing (recall or basic factual knowledge), performing routine procedures, performing complex procedures and problem-solving (DBE,

Test results place candidates into three benchmark categories: Basic, Intermediate and Proficient (determined during standard setting, which took place in 2009, 2012 and again in 2015) (for further details see NBTP,

Less than 10% of

Distribution across benchmark levels for 2012 national benchmark test mathematics candidates.

NBT test | Basic | Intermediate lower | Intermediate upper | Proficient |
---|---|---|---|---|
Mathematics | 41.4% | 36.0% | 14.4% | 8.1% |

Benchmarks were reset in 2012, resulting in the following distribution for candidates in 2013 (see

Distribution across benchmark levels for 2013 national benchmark test mathematics candidates.

NBT test | Basic | Intermediate lower | Intermediate upper | Proficient |
---|---|---|---|---|
Mathematics | 49.0% | 27.1% | 13.7% | 10.3% |

Proportion of learners within NBT Mathematics performance levels for NBT 2012 and 2013 intake cycles.

For this article, we considered the results of prospective students, nationally, who wrote the MAT tests between May and November of 2012 (38 730 candidates) and 2013 (48 318 candidates) and who achieved scores in the top third of the cohort.

All data relate to candidates who

The difficulties or misconceptions experienced by these NBT candidates indicate the need for various forms of intervention to address the problems outlined here. Item responses from different mathematical topics that reflect similar misconceptions or errors in reasoning have been grouped together. Suggestions for possible interventions are noted, but in-depth analysis of such interventions is beyond the scope of this article. Teachers, and academics involved in teacher training, would be best placed to consider at what grade (even at primary school) and in how much depth various interventions should be targeted. They will be aware, for example, of the work of Parker and Leinhardt (

It may be common knowledge that errors such as a^{3} + a^{5} = a^{8} occur, as noted for example in the 2014 NSC Diagnostic Report (DBE, ), but here they are evident in the very group of candidates aspiring to enter university. While these common errors and misconceptions may be familiar to teachers, their extent and character in the cohort of students entering university is less well understood.

The purpose of

Algebraic processing.

Item ID | Outline of mistake made | Percentage of candidates in the upper third making the mistake |
---|---|---|
A5 | Solving an algebraic equation: problems subtracting and getting the correct sign. | 20%–28%; in 13 tests |
T105 | Correct form of expression but incorrect sign: incorrect algebraic manipulation when subtracting (e.g. assuming …). | 26%–31%; in 6 tests |
F137 | Finding the inverse of a function, using function notation: correct form of expression but incorrect sign, i.e. problems with subtracting and getting the correct sign. | 26%–35%; in 3 tests |
S95 | Forgetting to apply identical operations to both sides of an equation. | 20%; in one test |
T98 | Cancelling over a sum (of the form …). | 20%–32%; in 2 tests |
A21 | Finding proportional distance: candidates cannot reduce to the distance travelled in one minute and hence calculate the required distance. | 21%–29%; in 12 tests |
A58 | Binomial expansion: ignoring the middle term. | 34%–48%; in 12 tests |
AP63 | Assuming convergence is for –1 < … | 23%–32%; in 3 tests |
A172 | Multiplication of powers: multiply bases and add exponents (i.e. 2^{3} × 2^{4} = 4^{7}). | 27%; in one test |
A141 | Addition of powers: keep base and add exponents (i.e. a^{3} + a^{5} = a^{8}). | 20%; in one test |
A3 | Simplification with exponents: multiplying exponents inside or outside brackets, for example (a – b)^{2} = a^{2} – b^{2}. | 22%–29%; in 2 tests |
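The exponent errors in the last three rows of the table contrast with the correct laws of exponents as follows (these correct statements are added for reference):

```latex
2^{3} \times 2^{4} = 2^{3+4} = 2^{7}
\quad (\text{not } 4^{7}\text{: the common base is kept, not multiplied})

a^{3} + a^{5} = a^{3}(1 + a^{2})
\quad (\text{not } a^{8}\text{: addition of powers does not add exponents})

(a - b)^{2} = a^{2} - 2ab + b^{2}
\quad (\text{not } a^{2} - b^{2})
```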

Algebraic processing skills are fundamental to all aspects of mathematics in higher education and

If a student has, for example, understood differentiation but is unable to apply the necessary algebraic procedures correctly, further application and problem-solving is undermined. Analysis of the MAT test results shows that when solving algebraic equations, test candidates have forgotten the difference between an expression and an equation, and the need to apply identical procedures to both sides of an equation in order to find its solution. They have difficulty dealing with signs when subtraction is involved. It is necessary to revise expansion of brackets preceded by a minus, so that learners understand that, for example, −(2 − 3) = (−1)(2 − 3). It may be necessary to revise all operations involving negative integers.
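The bracket-expansion revision suggested above can be made explicit in a single worked line:

```latex
-(2 - 3) = (-1)(2 - 3) = (-1)(2) + (-1)(-3) = -2 + 3 = 1
```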

A number of other algebraic concepts are also problematic. ‘Cancelling’ is poorly understood (it appears to be the same as ‘crossing out variables or numbers that are the same’). This also relates to solving equations, where cross-multiplication is applicable (because the same procedure is in effect applied to both sides) whereas it is not applicable when simplifying a mathematical expression. Factorisation, fractions and equivalent fractions must be clarified, along with what ‘cancelling’ actually means. Ratio and proportion are poorly understood. Learners do not remember (from earlier grades) or know that a proportional statement is an equation involving two ratios.

All learners know that (a + b)^{2} = a^{2} + 2ab + b^{2}, but do not necessarily understand the origin of the middle term 2ab.
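The origin of the middle term in the square of a binomial can be shown directly from the distributive law:

```latex
(a + b)^{2} = (a + b)(a + b)
            = a(a + b) + b(a + b)
            = a^{2} + ab + ba + b^{2}
            = a^{2} + 2ab + b^{2}
```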

The concept of a function is fundamental to all first-year mathematics courses, whether they are pure mathematics courses or mathematics courses in other disciplines such as commerce or statistics.

Functions and the equations that define them.

Item ID | Outline of mistake made | Percentage of candidates in the upper third making the mistake |
---|---|---|
FG35, FG46 | Function defined differently over its domain: substitution into incorrect expression. | 24%–33%; in 4 tests |
FG27 | Confusing range of function with domain. | 21%–39%; in 4 tests |
12FG6 | Domain of inverse of exponential function assumed to be the same as the domain of the exponential function itself. | 35%; in one test |
12TR7 | Finding (from given graph) values of … | 36%; in one test |
12SP65 | For ‘highest point’ they choose the point which has the largest x value (rather than the largest y value). | 20%; in one test |
12FG31 | Number of points of intersection of a hyperbola and a straight line: assume that since the equation is quadratic, it must have two roots, i.e. there must be two points of intersection. | 52%; in one test |
12FG41 | No understanding of the difference between points at which a line touches a curve and points at which it cuts the curve. | 48%; in one test |
12FG42 | Cubic equation: has three roots, but graph may have fewer than three x-intercepts. | 56%; in one test |
12FG54 | … | 21%; in one test |
F93 | Finding the maximum distance between a line and a parabola: assume maximum occurs at turning point. | 21%; in one test |
12FG16 | Parameters of parabola: can’t use position of the graph of y = ax^{2} + bx + c. | 24%–35%; in 2 tests |
12FG20 | Parameters of parabola: can’t find sign of a parameter of y = ax^{2} + bx + c. | 28%; in one test |
FG44 | Parameters of parabola: can’t determine position of axis of symmetry from signs of the parameters of y = ax^{2} + bx + c. | 39%; in one test |
FG9 | Parameters of parabola: don’t know that if y = ax^{2} + bx + c has two x-intercepts, then b^{2} – 4ac > 0. | 26%; in one test |
T31 | Don’t understand the effect of the coefficient in front of … | 28%; in one test |
T10 | Range of shifted sine: assume that all sine curves have the same range (i.e. [–1; 1]). | 20%–21%; in 2 tests |
TG58 | Range of a shifted and stretched sine over a specific domain: ignore the fact that the stretch changes the original domain, and hence the range. | 63%; in one test |
F46 | Left shift means minus. | 21%–29%; in 8 tests |
F136 | Plus indicates right shift. | 33%–34%; in 4 tests |
T116 | Minus indicates left shift. | 21%–25%; in 2 tests |
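The shift misconceptions in the last three rows of the table can be addressed with the standard transformation rules, stated here for a generic function f and a constant c > 0:

```latex
y = f(x - c): \text{ graph of } y = f(x) \text{ shifted } c \text{ units to the right}
y = f(x + c): \text{ graph of } y = f(x) \text{ shifted } c \text{ units to the left}
% e.g. y = \sin(x - 30^{\circ}) is the sine curve shifted 30 degrees to the right
```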

The definition of a function, different representations of functions and the terminology associated with functions need to be revised. Ideally, generic graphs should be used to clarify function terminology such as domain, range, and where the function graph lies above, on or below the x-axis.

Trigonometric ratios are initially taught in terms of right triangles. This understanding is then often poorly extended to trigonometric functions in the Cartesian plane, as can be seen from

Trigonometric functions.

Item ID | Outline of mistake made | Percentage of candidates in the upper third making the mistake |
---|---|---|
T21 | Value of trigonometric expression: numerically correct but sign wrong – wrong quadrant used (i.e. only right triangle considered). | 20%–26%; in 2 tests |

Basic geometric concepts.

Item ID | Outline of mistake made | Percentage of candidates in the upper third making the mistake |
---|---|---|
S11 | Finding the area of a square inscribed in a circle: assuming radius of 1 ⇒ side of 2. | 22%–35%; in 11 tests |
S1 | Not halving diameter to find radius in order to calculate area. | 26%–34%; in 2 tests |
TG46 | Using radius instead of diameter for area calculation. | 24%; in one test |
12SP53 | Assume perimeter of rhombus is 2 × sum of diagonals; don’t know that diagonals bisect at 90° (and that Pythagoras could then be applied). | 41%; in one test |
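The error in item S11 (radius of 1 ⇒ side of 2) can be corrected with a short derivation: for a square inscribed in a circle of radius 1, it is the diagonal, not the side, that equals the diameter:

```latex
\text{diagonal} = 2r = 2, \qquad
\text{side} = \frac{2}{\sqrt{2}} = \sqrt{2}, \qquad
\text{area} = (\sqrt{2})^{2} = 2
% the diameter coincides with the diagonal of the square, not with its side
```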

It is important to reinforce the link between circles, pi, radius and diameter. It should not be assumed that learners have remembered or understood basic geometric concepts, such as perimeter, area, surface area and volume, especially in relation to objects that are shown from a perspective different from the standard perspective, or where composite shapes are involved. The terminology of geometry is possibly problematic: are learners familiar with the meaning of geometrical terms such as face, vertex, rhombus and so on? This needs to be addressed in the earlier grades.

Calculator dependence has resulted in a limited understanding of the number system. It is important to teach the structure of the number system, especially numbers (natural, integers, rational, etc.) in relation to one another.

Number sense.

Item ID | Outline of mistake made | Percentage of candidates in the upper third making the mistake |
---|---|---|
AP35 | Can’t find the smallest of a group of negative numbers: candidates don’t use number line. | 36%–46%; in 3 tests |
AP77 | Can’t find the smallest of a group of negative numbers: candidates don’t use number line. | 46%–50%; in 3 tests |
A96 | Fractions of the form 1/… | 20%–23%; in 3 tests |
12SP17 | In a calculation of area of a triangle in the plane: candidates assume … | 26%; in one test |
F125 | Candidates assume … | 25%; in one test |
AP113 | Don’t know position of … | 39%; in one test |
A108 | Can’t determine fraction size relative to respective sizes of numerator and denominator. | 20%–39%; in 2 tests |
AP40 | To find the value for which a square root expression is defined: possible negative values of the expression are not excluded. | 43%; in 2 tests |
L8 | Can’t select a valid statement based on an expression which is divisible by 7 (it appears no test case is selected). | 21%–22%; in 3 tests |
AP79 | Zero in denominator not excluded. | 20%; in one test |
L4 | Multiplication or division by a negative number: wrong inequality direction. | 35%–39%; in 4 tests |
L2 | Assume the only solution of … | 27%; in one test |
A110 | Percentage increase followed by equal percentage decrease: % cancels out. | 20%–36%; in 8 tests |

Multiplication and division of numbers are not always well taught in primary school, and the effects are still evident much later. Learners need to understand what the process of division actually means in order to understand why division by zero is impossible. Multiplication and division by negative numbers need to be revised, so that learners see that the relative order of numbers on the number line is reversed, which would clarify their understanding of < and >.
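A minimal number-line example of the order reversal discussed above:

```latex
-2 < 3 \quad\xrightarrow{\;\times(-1)\;}\quad 2 > -3
% multiplying by -1 mirrors both numbers about 0 on the number line,
% so their relative order, and hence the inequality sign, is reversed
```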

It should not be assumed that percentage is a well understood concept. For many candidates, ‘percentage’ appears to exist in isolation and there is no attempt to associate it with the quantity to which the percentage is applied. The meaning of percentage, and how percentage can be applied, needs to be revised. This concept is taught in earlier grades but has apparently been forgotten.
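A concrete instance of the item A110 error (a percentage increase followed by an equal percentage decrease is assumed to cancel):

```latex
100 \;\xrightarrow{+20\%}\; 100 \times 1.2 = 120
\;\xrightarrow{-20\%}\; 120 \times 0.8 = 96 \neq 100
% the decrease acts on the new, larger base (120), so equal percentages do not cancel
```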

This article presents some common errors made by high performing candidates in a large-scale study and indicates problems in the conceptual understanding and mathematical skills of these candidates. While teachers would anticipate some of these, it is important to note that the problems are exhibited by the top third of prospective applicants to higher education who wrote the NBT for Mathematics. This group constitutes a large proportion of first-year students in mathematically demanding programmes. The purpose of this article is to raise awareness in both higher and basic education about the type and extent of the problem. It is not the intention to engage in an analysis of all possible interventions that could be put in place. The diagnostic information provided identifies problems MAT candidates demonstrate regarding some of the essential mathematical concepts and procedures deemed necessary by higher education mathematicians. The interventions suggested in response to the diagnostic information from MAT tests can be of use to the school sector in foregrounding areas where mathematical comprehension is weak. These topics, as well as the related terminology and language, need to be given greater attention in the classroom. It is not the intention of this article to be prescriptive with respect to the suggested interventions; teachers should themselves determine how best to use the diagnostic information from the MAT tests to create learning environments which could be more responsive to the needs of higher education.

Policymakers rather than teachers need to consider that some topics may possibly need to be excluded from the school curriculum, without necessarily detracting from its value, in order to achieve greater understanding of key concepts relevant to higher education.

The authors declare that they have no financial or personal relationships that might have inappropriately influenced them in writing this article.

R.P. and A.D. were responsible for compilation and analysis of all data from the NBTs referred to in the document. C.B. was responsible for the purpose, background, literature review, methodology and analysis of the MAT tests to provide diagnostic information.

The ANAs are standardised national assessments for languages and mathematics in the senior phase (Grades 7–9), intermediate phase (Grades 4–6) and in literacy and numeracy for the foundation phase (Grades 1–3). The question papers and marking memoranda (exemplars) are supplied by the national DBE and the schools manage the conduct of the tests as well as the marking and internal moderation (DBE,

In the NBTs, questions are referred to as ‘items’.