### MU-MAP Case Studies

In the 2011 MU-MAP mini-project call for proposals, seven research projects were successfully funded. The funded mini-projects, with lead applicants, institutions and brief summaries, are listed below. Project leaders will disseminate their findings at the forthcoming British Mathematical Colloquium, University of Kent, April 17th and 18th.

### PAMPER: Performance Assessment in Mathematics – Preliminary Empirical Research

Closed book examinations dominate the assessment diet of undergraduate mathematics in the UK (Iannone and Simpson, 2011) and, while this is leavened by some variety, oral examinations as a core component at this level had disappeared by the twentieth century (Stray, 2001). However, there is renewed interest in this form of assessment as a more valid way of grading students' performance of mathematics (Levesley, 2011).

This mini-project aims to investigate the difficulties and advantages of introducing an oral performance component to the assessment of a pure mathematics course. Anecdotal evidence from staff at high-ranking mathematics departments suggests that the modal assessment method (closed book examination) does not provide them with the clearest insight into students' mathematical understanding. Those with experience of oral exams from their own mathematics education (such as those from most other European countries) note that these can provide such insight relatively quickly.

However, there is concern about the fairness and resource implications of such examinations. It is not clear whether examiners might be biased (consciously or unconsciously) towards certain students, whether student nervousness might overwhelm their performance, or how the standard of assessment can be assured. It is also felt that the time taken to assess each student individually, and the administration of such a process, could be disproportionate to the quality of the information gained.

This project will:

a) explore the perceptions of staff and tutors on the course about implementing the assessment

b) detail the process of undertaking performance assessment in a core module

c) outline student attitudes to the assessment process

The report of this project is included in the MU-MAP book.

**Adrian Simpson**

The Principal
Josephine Butler College
South Road
Durham
DH1 3DF
0191 3347261
Adrian.simpson@durham.ac.uk

**Paola Iannone**

School of Education and Lifelong Learning
University of East Anglia
Norwich Research Park
Norwich NR4 7TJ
01603 591007
p.iannone@uea.ac.uk

### Audience Response devices for formative and summative assessment

Audience response devices have received tremendous attention in the learning community. Kay and LeSage (2009) present a review of the classroom use of these devices ("Examining the benefits and challenges of using audience response systems: A review of the literature", Computers & Education 53: 819-827).

There are many advantages, and one notable potential pitfall: students may not appreciate being constantly assessed. However, our experience is that students do like constant assessment if it is accompanied by rapid feedback. Also, surprisingly, students have suggested they would prefer to take in-class tests (that is, summative assessment) using these devices. The aim of this project is to determine the suitability of audience response clickers for this kind of assessment. The literature is not clear on the problems: there is a great deal of emphasis on "fun" and "engagement".

We also note that we use clickers that allow numeric as well as multiple-choice input, which lets us set a wider range of questions than is possible with many systems. We also have experience in dealing with the equipment (again, the literature suggests this is the largest barrier to effective use of these devices). What we do not know is whether it is possible or desirable to incorporate the use of these clickers as part of the formal assessed work in a module. By integrating assessment more thoroughly within learning we may obtain better learning, rather than a focus on preparing for tests. On the other hand, flexibility over the questions used in class, which can be hugely advantageous (see, for example, Chin and Brown (2002) "Student-generated questions: A meaningful aspect of learning in science", International Journal of Science Education 24: 521-549), is more difficult to achieve with clicker-based learning.

The report of this project is included in the MU-MAP book.

**Paul Hewson**

School of Computing and Mathematics,
Plymouth University
Drake Circus
Plymouth
PL4 8AA
01752 586870
paul.hewson@plymouth.ac.uk

**David Graham**

School of Computing and Mathematics,
Plymouth University
Drake Circus
Plymouth
PL4 8AA
01752 586880
dgraham@plymouth.ac.uk

### Mathematics Lecturers' Practice and Perception of Computer-Aided Assessment

Mathematics lecturers at Loughborough University are in the position of being able to utilise Computer-Aided Assessment (CAA) without the need to develop their own questions. Two projects, undertaken a few years ago by colleagues at Loughborough University and elsewhere, have resulted in question banks containing thousands of ready-to-use CAA questions.

See, for example, the HELM project – http://helm.lboro.ac.uk.

This project aims to evaluate the issues arising for lecturers who utilise existing resources and adopt this method of assessment. It would appear at first sight that the ready availability of CAA questions is an extremely efficient way of assessing hundreds of first year students and would be welcomed by all involved.

Question banks are available for both practice and coursework tests, and lecturers are freed from marking students' work. The workload for lecturers is minimal, as dedicated e-learning staff are available to upload tests and the computer software provides pre-prepared feedback to the students and summary statistics for the lecturers. However, all is not necessarily as straightforward as it might appear. For most large classes it is not possible to invigilate the coursework tests due to the lack of availability of computer labs for this purpose. Some lecturers and/or departments are concerned that plagiarism is an issue, and in these cases paper-based versions of tests may need to be prepared and marked, thus reducing the efficiency of the system.

Other lecturers are concerned about the questions which are available for use. Sometimes they do not fully cover the required syllabus. However, the steep learning curve and associated time involved in developing new questions is prohibitive, and so lecturers may be tempted to 'make do'. Other concerns involve the procedural nature of many CAA questions. Clearly lecturers wish their students to be able to apply standard techniques to solving problems. But what of the students' conceptual understanding of the mathematics? Is CAA able to test this? Does it matter to lecturers if it does not?

The report of this project is included in the MU-MAP book.

**Carol Robinson**

Mathematics Education Centre
Loughborough University
Loughborough
LE11 3TU
01509 228252
c.l.robinson@lboro.ac.uk

**Dr Paul Hernandez-Martinez**

Mathematics Education Centre
Loughborough University
Loughborough
LE11 3TU
p.a.hernandez-martinez@lboro.ac.uk

### Summative peer assessment of undergraduate calculus using Adaptive Comparative Judgement

The project aims to demonstrate that sustained mathematical reasoning can be peer assessed with high reliability using Adaptive Comparative Judgement (ACJ). This will be a live trial and the outcome, checked against expert assessment, will be used as part (5%) of the summative assessment of first year undergraduates studying calculus at Loughborough University (first year modules do not contribute to overall degree credit).

This innovation has important implications for assessment of mathematical skills that are traditionally seen as difficult to test, such as making judgements about the relative quality of different mathematical explanations. Therefore it needs to be evaluated rigorously and the project will allow us to undertake this evaluation.
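As background to the technique named above, ACJ collects repeated pairwise judgements ("which of these two scripts is better?") and fits a statistical model to place the scripts on a quality scale; the Bradley-Terry model is commonly used for this. The sketch below is a hypothetical illustration of that fitting step, not the project's software; the function name and the simple iterative update are my own choices.

```python
# Hypothetical sketch: recover a quality scale for assessed scripts from
# pairwise "A beats B" judgements by fitting a Bradley-Terry model with
# a standard iterative (MM) update.
from collections import defaultdict

def bradley_terry(judgements, n_items, iters=100):
    """judgements: list of (winner, loser) item-index pairs."""
    wins = defaultdict(int)          # total wins per item
    pair_counts = defaultdict(int)   # number of comparisons per unordered pair
    for w, l in judgements:
        wins[w] += 1
        pair_counts[frozenset((w, l))] += 1

    p = [1.0] * n_items              # quality parameters, start equal
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # Sum over every comparison involving item i
            denom = sum(c / (p[i] + p[j])
                        for pair, c in pair_counts.items()
                        for j in pair if j != i and i in pair)
            new_p.append(wins[i] / denom if denom else p[i])
        s = sum(new_p)
        p = [x * n_items / s for x in new_p]   # normalise the scale
    return p
```

Judged scripts then get a rank order from the fitted parameters; the "adaptive" part of ACJ (choosing which pair to present next so judgements are maximally informative) is a scheduling layer on top of this model.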

The report of this project is included in the MU-MAP book.

**Ian Jones**

Mathematics Education Centre
Schofield Building
Loughborough University
Loughborough
LE11 3TU
01509 228217
i.jones@lboro.ac.uk

**Lara Alcock**

Mathematics Education Centre
Loughborough University
Loughborough
LE11 3TU
L.J.Alcock@lboro.ac.uk

### Towards an efficient approach for examining employability skills

With the recent shift in undergraduate funding, the future success or failure of a university department rests on student recruitment and, with it, on league-table performance. The employability of graduates has therefore never been more important. But how best should mathematics departments assess these skills in their students? A student's approach to an open-ended problem, one with no necessarily right or wrong answer, is crucial for their employability and is often used on graduate assessment days.

The report of this project is included in the MU-MAP book.

**Stephen Garrett**

Department of Mathematics
University of Leicester
University Road
Leicester
LE1 7RH
0116 252 3899
sjg50@le.ac.uk

### Evaluating assessment practices in a Business and Industrial Mathematics module

The Business and Industrial Mathematics module runs in the second year of the mathematics undergraduate degree at the University of Salford. The module is 100% coursework, carries 20 credits and spans two semesters. It aims to prepare students for, and assess them on, work-related skills important for mathematicians in the workplace. A variety of assessment methods are used to quantify this, and this project aims to compare and evaluate the various assessment practices used.

The report of this project is included in the MU-MAP book.

**Edmund Chadwick**

Newton Building room 271
Mathematics
School of Computing, Science and Engineering, Salford University
M5 4WT
0161 295 3259
e.a.chadwick@salford.ac.uk

### Assessing proofs in pure mathematics

Can the marking burden be reduced through innovative assessment, whilst keeping the task educationally rich and maintaining the same level of student engagement and learning from the assessment? What novel methods might be used to assess student comprehension? How can the ideas behind mathematical proof be assessed? Can an imaginative approach be developed to determine a student's ability to identify assumptions in a mathematical argument?

These are the questions that have provided the motivation for this project. So far the applicant has begun developing an innovative assessment: a multiple-choice test on mathematical proof. This project seeks to further the implementation of this novel assessment practice. It also aims to obtain detailed feedback from students to enable a comprehensive evaluation of this new method of assessing pure mathematics. Being part of the MU-MAP project will facilitate the dissemination of new ideas and the development of resources, and provide a mechanism to share good practice.

The report of this project is included in the MU-MAP book.

**Timothy Hetherington**

School of Science and Technology, Nottingham Trent University
Clifton Lane
Nottingham
NG11 8NS
0115 848 8417
timothy.hetherington@ntu.ac.uk