Value Added Score - KS2
Published October 2015
Understanding How Value Added is Calculated (and how will it work in 2016?)
Brief Guide to the Concept of the Value Added Calculation
The Value Added model is recalculated every year to take account of the national trends in pupil progress in that year. For each individual pupil, their rate of progress is compared with the national average rate of progress for children with the same prior attainment. If your pupils made more progress than this average, they will have contributed positively to your overall VA score; if they made less progress, your VA score will be reduced. If the overall progress rate of the cohort is exactly average, this produces a neutral VA score (100 in KS2 data; 1000 in KS4 data), meaning that your value added was neither positive nor negative.
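The principle described above can be sketched in a few lines of code. This is a minimal illustration only, not the DfE's exact method; the function names and the cohort figures are invented for the example.

```python
# Minimal sketch of the KS2 Value Added idea (illustrative only, not the
# DfE's published methodology). Each pupil's KS2 outcome is compared with
# the national average outcome for pupils with the same KS1 starting point.

def pupil_va(actual_ks2_aps, national_avg_ks2_aps):
    """Difference between a pupil's KS2 outcome and the national average
    outcome for pupils with the same prior attainment."""
    return actual_ks2_aps - national_avg_ks2_aps

def school_va(pupils):
    """Cohort VA: the mean of the pupil-level differences, centred on 100
    (the neutral score used in KS2 data)."""
    diffs = [pupil_va(actual, expected) for actual, expected in pupils]
    return 100 + sum(diffs) / len(diffs)

# Hypothetical cohort: (actual KS2 APS, national average for that start).
cohort = [(29.0, 27.8), (27.0, 27.8), (30.5, 27.8)]
print(round(school_va(cohort), 2))  # 101.03
```

A cohort averaging exactly the national rate of progress would score 100.0 here, matching the neutral outcome described above.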
This particular approach to calculating VA (where pupils’ progress is compared against the progress of other pupils nationally with the same prior attainment, but not taking into account any other factors) produces the measure that is used in the Performance Tables, RAISEonline and Inspection Dashboard. (The Fischer Family Trust produces two different measures of VA, using slightly different methodology, including a fully contextualised model, which enables you to really explore how well your pupils progressed compared with similar pupils in similar circumstances nationally.)
As well as the VA score, a statistical measure called the ‘95% confidence interval’ is produced. This provides an indication of the ‘statistical significance’ of the VA score: the extent to which we can say with confidence that pupils made better or worse progress in a particular school than they would have done if placed in an ‘average’ school. The confidence interval widens as the size of the dataset (i.e. the number of pupils/results) decreases.
Example of how the VA score and confidence interval work together - imagine the following for a primary school:
VA = 102.5 Confidence interval = 2.2
That gives us a range of possible values for the true VA score, of 102.5 plus or minus 2.2, i.e. the range is from 100.3 to 104.7
As the whole range is above 100.0, we can say with 95% confidence that the progress made by the pupils in this school is better than it would have been if those pupils had been taught in an ‘average’ school. RAISEonline would illustrate this as follows:
[Image: RAISEonline chart showing the VA score and confidence interval]
The new Ofsted Inspection Dashboards would illustrate it visually as follows:
[Image: Inspection Dashboard visualisation of the same result]
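The arithmetic in this worked example can be checked with a short sketch (the function names are our own; 100 is the neutral KS2 score):

```python
# How the published VA score and 95% confidence interval combine,
# using the figures from the worked example (VA 102.5, CI 2.2).

def va_range(va, confidence_interval):
    """Return the (low, high) range of possible values for the true VA score."""
    return va - confidence_interval, va + confidence_interval

def significance(va, confidence_interval, neutral=100.0):
    """'above' if the whole range sits above the neutral score,
    'below' if it sits wholly below, otherwise 'not significant'."""
    low, high = va_range(va, confidence_interval)
    if low > neutral:
        return "above"
    if high < neutral:
        return "below"
    return "not significant"

low, high = va_range(102.5, 2.2)
print(round(low, 1), round(high, 1))  # 100.3 104.7
print(significance(102.5, 2.2))       # above
```

Note that a school with the same VA score but a smaller cohort (and so a wider confidence interval, say 3.0) would have a range straddling 100.0, and its progress could not be called significantly above average.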
How many points progress did pupils need to make from KS1-KS2 to achieve Value Added in 2015?
The simple answer is – it depends on the starting point.
This year, across Key Stage 2, the national average rate of progress for many groups of children was 13.4 points (the difference between their KS2 Average Points Score and their Key Stage 1 Average Points Score). However, for some levels of prior attainment, the national average rate of progress was higher. The following Excel table shows some examples from this year’s national data (2014, KS2).
Note that for all children with the same KS1 level in reading, writing and maths, the average points score increase is consistently in the range 13.3-13.7. (It is 13.4 for children who were a 2b across the board or a 2a across the board.)
However it is also interesting to note the variations between the individual subject areas. For a child with Level 3 at KS1 in all 3 subjects (21 points), the average points score increase in reading is just 11.7, whereas in maths it is 14.3. This reflects the fact that very few children achieve Level 6 in reading, whereas significantly more achieve it in maths.
For children with different KS1 levels, there are some interesting variations. For example, for children with a Level 1 in both reading and writing but a 2c in maths, the national average points score increase was 14.6. But for children with KS1 levels of 2c, 2c and 1 in reading, writing and maths respectively, the average increase was just 12.5. See further examples in the spreadsheet available above.
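The point-score arithmetic behind these figures can be illustrated as follows. The KS1 level-to-points mapping is the standard one (W=3, 1=9, 2c=13, 2b=15, 2a=17, 3=21); the pupil in the example is hypothetical.

```python
# Illustrative sketch of the point-score arithmetic. KS1 APS is the mean
# of the three subject point scores; 'progress' is KS2 APS minus KS1 APS.
# The national averages quoted in the text come from the DfE model; the
# pupil below is invented.

KS1_POINTS = {"W": 3, "1": 9, "2c": 13, "2b": 15, "2a": 17, "3": 21}

def ks1_aps(reading, writing, maths):
    """Average points score across the three KS1 subjects."""
    return (KS1_POINTS[reading] + KS1_POINTS[writing] + KS1_POINTS[maths]) / 3

def progress(ks2_aps, ks1_aps_value):
    """Points progress from KS1 to KS2."""
    return ks2_aps - ks1_aps_value

# A hypothetical pupil who was 2b across the board (KS1 APS = 15.0) and
# finished KS2 with an APS of 28.4 made 13.4 points progress - the
# national average for that starting point.
start = ks1_aps("2b", "2b", "2b")
print(start, round(progress(28.4, start), 1))  # 15.0 13.4
```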
There is one interesting anomaly in the VA data, and it concerns writing. Whereas the VA model uses each individual child’s test score in reading and maths to produce a finely graded point score, writing is a teacher assessment and is expressed as a whole level only. So the only possible point score outcomes for an individual are 3, 9, 15, 21, 27, 33 and 39 (Levels W-6 respectively). For children with a 2b in each subject area at KS1, the national average writing outcome is 28.4 points (i.e. a Level 4a). This is because most children in this bracket reached Level 4 at KS2 (27 points) but some reached Level 5 (33 points). An individual child, however, cannot possibly score 28.4. Therefore all children from this starting point who finish KS2 on Level 4 will have negative value added (1.4 points below the average), even though they have achieved the most common outcome for that starting point (i.e. the mode average). So, looking across a whole cohort of children, in order to achieve a VA for writing of around 100, about one quarter of the children need to have made that extra level of progress (with no child making less than 2 levels).
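The ‘about one quarter’ figure follows directly from the arithmetic: if a fraction p of the cohort reaches Level 5 (33 points) and the rest reach Level 4 (27 points), the cohort average is 27 + 6p, so an average of 28.4 requires p of roughly 0.23. A short check:

```python
# Checking the 'about one quarter' claim for writing VA. Assumes a simple
# two-outcome cohort (Level 4 or Level 5 only), as the text describes.

LEVEL_4, LEVEL_5 = 27, 33

def level5_fraction_needed(target_avg):
    """Fraction of pupils needing Level 5 (with the rest on Level 4) so
    that the cohort's average writing score equals target_avg."""
    return (target_avg - LEVEL_4) / (LEVEL_5 - LEVEL_4)

print(round(level5_fraction_needed(28.4), 3))  # 0.233 - about a quarter
```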
To see how your children’s outcomes compared with the national average that is used in the Value Added model, log in to RAISEonline and scroll down the list of reports to find the report called KS1-2 VA Pupil List.
How will progress be measured in 2016?
The precise details are yet to be determined, but we do know that it will be a Value Added model. So, although the KS2 test outcomes will be expressed not in levels but as scaled scores, it is fair to assume that similar patterns will emerge in terms of the national average outcomes for pupils with different starting points. ‘Value Added’ will be achieved when pupils in a particular school attain a higher scaled score than the national average outcome for pupils with the same starting points. However, the change in the structure of the tests (there is just one set of tests for all pupils, with no extension test equivalent to the old Level 6) is likely to have an impact on the patterns of relative progress measures across subjects. The 2015 data shows some very high levels of progress in maths for children with higher prior attainment (e.g. 14.3 points progress in maths compared with 11.7 in reading for children who had Level 3 across the board at KS1). This wide difference between the subjects may not occur to such an extent under the new system.
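Assuming the same basic principle carries over to scaled scores, a sketch of the 2016 model might look like this. Both the expected-score table and the pupils below are entirely invented, since the real model had not been published at the time of writing.

```python
# Speculative sketch of a scaled-score VA model, on the assumption that
# the existing principle (compare each pupil with the national average
# for the same starting point) carries over. All figures are invented.

# Hypothetical national average KS2 scaled scores by KS1 starting point.
EXPECTED_SCALED = {"low": 97.0, "middle": 103.0, "high": 110.0}

def cohort_progress(pupils):
    """Mean difference between each pupil's scaled score and the national
    average scaled score for pupils with the same starting point."""
    diffs = [score - EXPECTED_SCALED[start] for start, score in pupils]
    return sum(diffs) / len(diffs)

pupils = [("middle", 104), ("middle", 101), ("high", 112), ("low", 99)]
print(round(cohort_progress(pupils), 2))  # 0.75 - positive value added
```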