“My goal is not to be better than anyone else, but to be better than I used to be.”
Dr Wayne W. Dyer
As we publish our public examination results – A-levels, EPQ and BTEC on 16 August and GCSEs on 23 August – it can be easy to focus purely on the facts and figures presented on those days. Exam results, though, are only part of the picture.
Throughout life we all progress at different rates, some walking and talking at different stages, some achieving the dexterity to hold a pencil confidently later than others, and so on. The same can be said of academic progress later in life.
So, looking at the progress a student has made on their journey from GCSEs to A-levels, rather than simply their exam results – their achievements, rather than simply attainment – can often show huge leaps forward and successes to be celebrated. This is what Value-added takes into account, measuring how well students have done against their prior attainment.
How does Value-added work?
Value-added can be used to better measure the achievements of an individual student, those of a group of students (a cohort) and those of a school. So, how does it work?
When measuring the achievements of a group of students over a given time period, some will do better, some worse and some will be in between. This gives us an average or ‘normal distribution’ of progress.
That progress is, of course, affected by the students’ environment. If they are in a setting that encourages and inspires learning and raises their self-confidence and self-esteem, with access to high quality teaching, resources and opportunity, a group of students is more likely to make better progress.
All schools look to improve their students in this way, but if a school is increasing the achievements of its pupils at a higher rate than expected, those studying there should have an advantage.
How is Value-added measured?
At St Joseph’s College we use Cognitive Ability Test (CAT) predictions as a measure to judge our performance at both GCSE and A-level. Each student is compared with the tens of thousands of other students nationally who achieved a similar CAT score, and with what those students went on to achieve at GCSE or A-level in each subject. A nationally standardised prediction can then be made for each student in each of their subjects. How far a student attains above or below this prediction is their Value-added score.
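The idea above can be sketched in a few lines of code. All the numbers here are invented for illustration – the real CAT prediction tables are nationally standardised and far more detailed – but the principle is the same: look up the prediction for a student's CAT score, then subtract it from what they actually achieved.

```python
# Illustrative sketch of CAT-based Value-added. The bands and point
# values below are made up for this example, not real CAT data.
NATIONAL_OUTCOME_BY_CAT_BAND = {
    # (CAT score range) -> average points achieved nationally in a subject
    (90, 100): 4.5,
    (100, 110): 5.5,
    (110, 120): 6.8,
}

def predicted_points(cat_score: float) -> float:
    """Look up the (illustrative) standardised prediction for a CAT score."""
    for (lo, hi), points in NATIONAL_OUTCOME_BY_CAT_BAND.items():
        if lo <= cat_score < hi:
            return points
    raise ValueError("CAT score outside the illustrative bands")

def value_added(cat_score: float, achieved_points: float) -> float:
    """Positive = attained above prediction; negative = below it."""
    return achieved_points - predicted_points(cat_score)

# A student with a CAT score of 105 is 'predicted' 5.5 points here;
# achieving 6.0 points gives a Value-added score of +0.5.
print(value_added(105, 6.0))
```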
As well as CAT predictions, at A-level we use prior attainment at GCSE to judge performance. The government also uses this measure and each school’s overall Value-added score is published nationally in January each year. This is known as the Level 3 Value-added Score (L3VA).
To measure L3VA, each student’s average GCSE score is calculated and this score is compared to what students with the same GCSE average went on to attain in their A-level subjects or Level 3 General Vocational qualifications nationally. Students who score above this average will show positive Value-added, students scoring below will have achieved negative Value-added.
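The L3VA calculation described above follows the same subtraction, then averages across the cohort. The figures and the linear 'national expectation' below are invented stand-ins purely to show the arithmetic, not the DfE's actual lookup tables.

```python
# Illustrative sketch of a school-level L3VA calculation.
# The expectation function and student data are invented.
def expected_alevel_points(gcse_average: float) -> float:
    # Hypothetical stand-in for the national lookup: what students
    # with this average GCSE score typically achieve at A-level.
    return 10.0 * gcse_average

students = [
    # (average GCSE score, achieved A-level points)
    (6.0, 62.0),
    (5.5, 54.0),
    (7.0, 71.0),
]

# Above expectation contributes positive Value-added, below negative;
# the school's score is the mean across the cohort.
per_student = [achieved - expected_alevel_points(gcse)
               for gcse, achieved in students]
school_l3va = sum(per_student) / len(per_student)
print(per_student)   # one Value-added figure per student
print(school_l3va)   # the cohort (school) average
```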
How does St Joseph’s College score?
School and subject performance can be measured by calculating the average Value-added score across a group of students or a subject and comparing it with average attainment nationally.
This is easily demonstrated in the data for St Joseph’s College’s 2016/17 GCSE cohort, for which the CAT prediction for Year 11 students attaining five GCSEs at A* to C (or equivalent) including English and Maths was 70%: our outcome was 76%.
At Advanced level in 2016/17 there was a similar story. Our Level 3 Value-added score for A-level in Year 13 (DfE calculation) was 0.21 – a figure that put St Joseph’s College students in the top 15% of schools nationally, with an ‘above average’ Value-added ranking. For 2017/18 this figure has been calculated by the college as 0.31, indicating an even higher Value-added score.
We look forward to welcoming our GCSE and A-level students as they collect their results and to celebrating their achievements, as well as their attainment, with them.
Mrs Vicky Fox