Exam Board Planning 2019/20
4 March 2020
Helen Matthews, Director of Academic Services, gives an update on improvements in year two of the new Exam Board process.
The Exam Board process first introduced in summer 2019 uses system-based rules to validate student progression and degree outcomes. This provides assurance that awards are correct, automates algorithmic decisions, and delivers results to students more quickly.
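As a rough illustration of how such rules-based validation works, the sketch below applies a single progression rule to one student record. The rule structure, thresholds and field names are assumptions made for the example, not the actual configuration in Portico or SITS.

```python
# Illustrative sketch only: rule thresholds and field names are assumptions
# for explanation, not the real Portico/SITS configuration.
from dataclasses import dataclass

@dataclass
class StudentYear:
    credits_passed: int   # credits passed this year
    weighted_mean: float  # credit-weighted mean for the year

def can_progress(record: StudentYear,
                 min_credits: int = 105,
                 min_mean: float = 40.0) -> bool:
    """Apply one simple progression rule to a student record."""
    return (record.credits_passed >= min_credits
            and record.weighted_mean >= min_mean)

# A rules engine of this kind evaluates every student automatically,
# leaving only the exceptions for the Board to consider.
print(can_progress(StudentYear(credits_passed=120, weighted_mean=58.2)))  # True
```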
There were a number of issues with the first-year roll-out, related both to the quality of data in the system and to user functionality and reports, which caused problems for many boards.
After months of consultation and improvements to systems and processes, the Academic Model Project and Student Records teams are embarking on testing and training to prepare for this year's exam boards.
Below are the issues that arose in 2018/19, with details of how they have been addressed by the Academic Model Project and Student Records teams.
Progression and Award rules data issues
Information on progression and award rules, initially collected via Programme Summaries in 2017/18, was not sufficiently detailed, and a further data collection and validation exercise was undertaken with faculties and departments during 2018/19. Despite this, there were still cases where specific local rules had not been articulated or captured sufficiently clearly.
Solutions:
- Automation tasks have been amended, the rules attached to programmes have been reviewed, and all outcomes from last year's data set are being re-run to check for inconsistencies (a sketch of this kind of check follows this list). So far, no anomalies have been identified that cannot be explained by other factors, such as suspension-of-regulations cases.
- Student Records managers have been working with faculties and departments to resolve outstanding queries about local rules and cross-check all rules on SITS. Departments will be able to verify the rules that have been applied to each programme in Portico. The information will ultimately be published in programme summaries, but for this year’s boards will be provided via Portico.
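To illustrate the re-run check mentioned above, here is a minimal sketch, assuming hypothetical record fields and function names, of recomputing last year's outcomes under the reviewed rules and flagging any that differ, while setting aside cases already explained (such as suspension of regulations):

```python
# Hypothetical sketch of a regression re-run: recompute last year's outcomes
# under the reviewed rules and flag any that now differ. All names are
# assumptions, not the real system's API.

def find_anomalies(records, recompute_outcome, known_exceptions):
    """Yield students whose recomputed outcome differs from the stored one,
    excluding cases already explained by other reasons."""
    for student in records:
        if student["id"] in known_exceptions:
            continue  # explained (e.g. suspension of regulations); not an anomaly
        new_outcome = recompute_outcome(student)
        if new_outcome != student["stored_outcome"]:
            yield student["id"], student["stored_outcome"], new_outcome

records = [
    {"id": "1001", "stored_outcome": "Progress", "mean": 62.0},
    {"id": "1002", "stored_outcome": "Progress", "mean": 38.5},
]
anomalies = list(find_anomalies(
    records,
    recompute_outcome=lambda s: "Progress" if s["mean"] >= 40.0 else "Refer",
    known_exceptions={"1002"},  # e.g. a suspension-of-regulations case
))
print(anomalies)  # [] -> no unexplained inconsistencies
```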
Data quality
Inaccuracies in the records meant that the system could not apply the correct rules or specific decisions by the Boards: for example, where students had interrupted their studies, or where students who had taken a year abroad were registered on the standard version of the programme rather than the 'with year abroad' variant. As a result, manual correction by the Student Records team was necessary and, while data quality reports were provided to assist with pre-board checking, there was insufficient time for this process to be genuinely helpful to Board administration.
Solutions:
- The majority of these errors have now been addressed and should not recur, as students have progressed through the system. However, student status or programmes may still be incorrectly recorded, or not recorded at all, in Portico, so special attention will need to be paid to these records in the run-up to Boards in Summer 2020.
- Student Records are compiling a list of record scenarios (e.g. students on interruption or who have interrupted in the past, on part-time or flexible programmes, on a year abroad, a year in industry, or Extra Mural Years) that could trip up the calculations, so that inaccuracies can be identified and addressed in advance (see the sketch after this list).
- In their meetings with faculties and departments, Student Records managers are discussing the set-up of programme data. Board administrators will be given detailed instructions on how to use the data quality reports to identify potential issues on their programmes.
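The following sketch illustrates the kind of scenario checks such a list might drive. The record fields and the specific checks are illustrative assumptions, not the actual Portico data model:

```python
# Illustrative data-quality scan over the record scenarios described above.
# Field names and checks are assumed for the example.

def flag_record_issues(student):
    """Return a list of data-quality flags for one student record."""
    flags = []
    if student.get("interrupted") and student.get("expected_credits") == 120:
        flags.append("interruption not reflected in expected credits")
    if student.get("year_abroad") and "year abroad" not in student.get("programme", "").lower():
        flags.append("year abroad taken but registered on standard programme variant")
    if student.get("mode") in {"part-time", "flexible"} and student.get("expected_credits") == 120:
        flags.append("part-time/flexible student with full-time credit load recorded")
    return flags

student = {"programme": "BSc Chemistry", "year_abroad": True,
           "interrupted": False, "mode": "full-time", "expected_credits": 120}
print(flag_record_issues(student))
# ['year abroad taken but registered on standard programme variant']
```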
Exam Board tasks
The process was designed on the assumption that assessment of taught modules completes at the end of the main exam period. It became clear that this did not hold for the majority of postgraduate taught programmes, and that there was no accommodation for flexible delivery and assessment.
Solution:
- Enhancements are being made to the processing tasks in the system. For postgraduate taught programmes, the task will be adapted to accommodate greater diversity in programme delivery structures, as sketched below.
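A minimal sketch of the timing change, assuming hypothetical module-result data: rather than assuming all assessment completes by the main exam period, the task would defer a student until every module result is confirmed:

```python
# Hedged sketch: a processing task defers a student until all module results
# are confirmed, accommodating late-finishing components such as a PGT
# dissertation. Data structure is an assumption for the example.
from datetime import date

def ready_for_board(module_results, board_date):
    """A student is ready only if every module has a confirmed mark
    by the board date; late-finishing modules defer the calculation."""
    return all(
        m["confirmed"] and m["date"] <= board_date
        for m in module_results
    )

modules = [
    {"code": "MODA001", "confirmed": True,  "date": date(2020, 6, 5)},
    {"code": "MODA002", "confirmed": False, "date": date(2020, 9, 11)},  # dissertation
]
print(ready_for_board(modules, date(2020, 7, 1)))  # False: defer to a later board
```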
Exam Board preparation (Student Record data issues)
With a Working Group of Examinations Liaison Officers (ELOs), we are building a new data quality report to assist in the preparation for Exam Boards by highlighting any issues with a student's marks, progression or classification. This will enable administrators to identify and rectify issues before boards meet, providing assurance that the data taken to exam boards is correct and instilling greater confidence in the exam board reports themselves.
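As a hedged illustration of what such a report might check, the sketch below builds a per-student list of issues covering marks, progression and classification. All field names and checks are assumptions for the example:

```python
# Illustrative pre-board data-quality report: one row of issues per student.
# Column names and checks are assumptions, not the report's actual design.

def preboard_issues(student):
    """Return the issues that would block a clean board decision."""
    issues = []
    missing = [m["code"] for m in student["modules"] if m.get("mark") is None]
    if missing:
        issues.append(f"missing marks: {', '.join(missing)}")
    if student.get("progression_outcome") is None:
        issues.append("no progression outcome calculated")
    if student.get("final_year") and student.get("classification") is None:
        issues.append("no classification calculated")
    return issues

report = [
    (s["id"], preboard_issues(s))
    for s in [{"id": "2001", "final_year": True, "classification": None,
               "progression_outcome": "Award",
               "modules": [{"code": "MODB001", "mark": 64},
                           {"code": "MODB002", "mark": None}]}]
]
print(report)  # [('2001', ['missing marks: MODB002', 'no classification calculated'])]
```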
Usability and design of reports
The reports were designed for straightforward cases, with a more detailed version for reviewing less straightforward ones. However, the reports displayed a simple credit-weighted mean for the year, while some local progression rules use a classification-weighted mean.
Solution:
- Retrieval options and formatting are being improved: more efficient filtering and better readability and structure in exam board reports will assist both with preparation for exam boards and with the conduct of the boards themselves. A worked example of the two means follows below.
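To illustrate the distinction between the two means, here is a short worked example; the module marks, credits and the double weighting of the dissertation are purely illustrative of a hypothetical local rule:

```python
# Worked example: a simple credit-weighted mean can differ from a
# classification-weighted mean when some components (e.g. a dissertation)
# carry extra weight under the local rule. All figures are illustrative.

modules = [
    # (mark, credits, classification weight under a hypothetical local rule)
    (65, 30, 1),
    (55, 30, 1),
    (70, 60, 2),  # dissertation double-weighted in the local rule
]

credit_weighted = sum(m * c for m, c, _ in modules) / sum(c for _, c, _ in modules)
class_weighted = sum(m * c * w for m, c, w in modules) / sum(c * w for _, c, w in modules)

print(f"credit-weighted mean: {credit_weighted:.1f}")          # 65.0
print(f"classification-weighted mean: {class_weighted:.1f}")   # 66.7
```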
Unclear practice regarding meetings held before the main Exam Board
A variety of approaches to the conduct of a Board meeting and different uses of terminology for earlier pre-meetings and checking processes led to a lack of clarity about which reports should be used for what purpose.
Solution:
- Exam Board Chairs and the ELOs working group are being consulted on definitions for the different types of meeting that take place, and who is in attendance at each, in order to ensure a common understanding so that clear guidance can be provided on the use of reports. A summary is attached in Annex A. If you would like to comment, please contact Seb Carrington (s.carrington@ucl.ac.uk, ext. 45221).
Uncertainty regarding best use of Exam Board reports in Board Meetings
Solution:
- The ELOs working group is developing guidance on best practice to enhance the existing Standard Exam Board agenda template.
Summary of timescales
| Action | Timescale |
| --- | --- |
| Re-testing of classification rules | February (complete) |
| Re-testing of progression rules | February to mid-March (in progress) |
| Internal testing of functionality | March |
| User acceptance testing (tasks and board reports) | Mid-March – 24 April |
| User acceptance testing (Statistical Reports) | 23 March – 24 April |
| Training (Student Records staff) | 23 March – 24 April |
| Training (ELOs/administrators) | From 27 April |
| Training on Board reports for Chairs | From late May |
| Drop-ins | Throughout exam board period |
For reference: Exam Board terminology
| Name | Definition | Academic Manual reference |
| --- | --- | --- |
| Full Board of Examiners | The annual meeting of the Board of Examiners with all members in attendance (subject to quoracy provisions), including the External Examiner(s). | Ch. 4: 13.2.1 a) |
| Sub Board of Examiners | The Full Board may delegate authority to a smaller Sub Board of itself to make decisions on its behalf for out-of-cycle matters such as Late Summer Assessments. | Ch. 4: 13.2.1 b) |
| Interim Board | Some programmes hold 'Interim Boards' to discuss results received so far, often at the end of the taught modules on a Masters programme. Interim Boards are a type of Sub Board and fall under the same requirements regarding membership, quoracy and candidate anonymity. | Ch. 4: 13.2.1 b) |
| Pre-Board | Meeting involving internal examiners only, to review marks and outcomes in advance of the Full Board. | |
| Pre-Meeting | Meeting involving the Chair, Board Administrator/ELO and other staff as appropriate, to review outcomes in advance of the Full Board. | |
| Scrutiny Meeting | Meeting at which exam papers are scrutinised and approved. | |
| Parity Meeting | Where an assessment includes multiple pairs of markers, it is good practice to hold a parity meeting at the start of the marking process where markers can discuss and develop a shared understanding of the marking criteria. This can include comparing marks for a small sample of student work. | Ch. 4: 7.6.2 |