This page dispels some common myths about OPs. For an explanation of what OPs are and how they are calculated, see the section on OPs and FPs.
Myth 1: Students with 5 or more VHAs and an "A" on the QCS Test automatically get an OP1
Levels of Achievement cover fairly broad ranges of achievement. Not all students awarded Very High Achievements (VHAs) are at the same standard. Some students may be at the top of the VHA range, while others may be doing just well enough to get a VHA. In addition, Levels of Achievement in different subjects do not represent equivalent standards.
To be awarded an OP1, students must be in the top 2% of all students in Queensland. There are many more students with 5 VHAs than there are OP1s.
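The arithmetic behind this can be sketched in a few lines. The 2% figure comes from the text above; the cohort size is an invented round number for illustration only:

```python
# The OP1 band is a fixed share of the cohort, not a fixed standard.
op_eligible = 40_000        # assumed cohort size, for illustration only
op1_share = 0.02            # top 2% of OP-eligible students (from the text)

op1_places = int(op_eligible * op1_share)   # 800 places in this example
```

Because the number of OP1s is capped at a percentage of the cohort, any year in which more than that percentage of students earn 5 VHAs must leave some 5-VHA students outside the OP1 band.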
Myth 2: Students with 5 SAIs of 400 will get an OP1
Not necessarily. In fact, usually not. An SAI of 400 only indicates that the student was the school's highest achieving student in that particular subject. This student may not be the best student overall in the school, nor among the top 2% of students in Queensland overall.
Myth 3: To get an OP1, it's better to study some subjects than others
All subjects are treated equally in the calculation of OPs. Any apparent inequality is the result of scaling, which takes into account the different overall capabilities of students in different subject groups and schools. To get a good OP, students must be ahead of strong competition. If the competition is not strong in some subjects, then a student needs to be a long way ahead of the other students to achieve a good OP.
It is possible to obtain an OP1 from any combination of subjects. However, students need to perform much better than other students in subjects where the competition is weak. An OP1 student must perform at the level of the top 2% of students in the state.
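The effect of scaling described above can be illustrated with a toy calculation. This is not the QSA's published algorithm; it is a minimal sketch assuming each subject-group is placed on a common baseline using a group mean and spread (in the real system these parameters come from group QCS Test data; all names and numbers here are invented):

```python
import statistics

def scale(score, group_scores, group_mean, group_spread):
    """Map a within-group score onto the common baseline scale.

    group_mean and group_spread are the group's assumed parameters on
    the common baseline.
    """
    mu = statistics.mean(group_scores)
    sigma = statistics.pstdev(group_scores) or 1.0
    z = (score - mu) / sigma          # standing within the group
    return group_mean + z * group_spread

# Two subject-groups with different overall capability on the baseline.
strong = [62, 65, 68, 70, 72]   # strong competition: baseline mean 70
weak = [40, 44, 48, 50, 55]     # weak competition: baseline mean 50

top_of_strong = scale(72, strong, 70, 8)
top_of_weak = scale(55, weak, 50, 8)
```

In this sketch the top student in the weak group actually has the larger lead over their classmates, yet still scales below the top student in the strong group, because the whole weak group sits lower on the baseline. That is why a student facing weak competition needs to be a long way ahead to achieve the same result.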
Myth 4: High achievers in a low-achieving group can't get a good OP
A student who wants a good OP must demonstrate outstanding achievement. In a low-achieving group, this outstanding achievement would be reflected in a large gap between the SAI of that student and the SAIs of other students.
Myth 5: There is a bias in favour of certain schools
The procedures followed for the calculation of OPs are exactly the same for students in every school. When comparing OPs, what a student needs to consider is where their teachers have placed them, and against what kind of competition. This applies whatever school a student attends.
Students, not schools, are awarded OPs. However, schools are not random collections of students. The quality, application and performance of students are unevenly distributed, so different performances at different schools are to be expected.
Myth 6: Students in a small group or a small school are disadvantaged
The QSA has special procedures in place for small groups and small schools to ensure that this doesn't occur. SAIs are assigned differently, and the scaling processes are adjusted to make sure that OPs reflect students' performances fairly.
Myth 7: Students who do poorly on the QCS Test can't get a good OP
It is important to realise that QCS Test results are used in the scaling procedures only to determine where the group fits on the baseline scale. What matters for the individual student are their SAIs. QCS Test data provide scaling parameters for each of a student's subject-groups and for the whole cohort of OP-eligible students at their school; the individual student's QCS Test result contributes only to that group data.
In determining the scaling parameters, the QSA checks to see whether any student has QCS Test results that seem quite different from their within-school performance, as might be the case if a high-achieving student were sick on one or both days of the QCS Test and did not perform as well as expected. The QCS Test data of these students are down-weighted so that these unusual results will not distort the group's mean and spread on the test.
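The down-weighting idea can be sketched as follows. This is an illustrative assumption, not the QSA's published procedure: the weight of 0.25, the cutoff of 40 points, and all scores are invented for the example:

```python
import statistics

def weighted_group_parameters(qcs_scores, predicted_scores, cutoff=40.0):
    """Return (mean, spread) of qcs_scores, down-weighting anomalies.

    A result is down-weighted when it differs from the level predicted
    by the student's within-school performance by more than `cutoff`
    points. The weights and cutoff are illustrative assumptions.
    """
    weights = [0.25 if abs(q - p) > cutoff else 1.0
               for q, p in zip(qcs_scores, predicted_scores)]
    total = sum(weights)
    mean = sum(w * q for w, q in zip(weights, qcs_scores)) / total
    var = sum(w * (q - mean) ** 2
              for w, q in zip(weights, qcs_scores)) / total
    return mean, var ** 0.5

# Five OP-eligible students; the last was predicted around 176 from
# within-school performance but scored 90 (e.g. sick on test day).
qcs = [180, 175, 170, 165, 90]
predicted = [178, 174, 171, 166, 176]

robust_mean, robust_spread = weighted_group_parameters(qcs, predicted)
plain_mean = statistics.mean(qcs)   # dragged down by the one bad result
```

With the anomalous result down-weighted, the group mean stays close to where the other four results place it, so one student's bad day does not pull down the scaling parameters applied to the whole group.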
Last reviewed: 21 December 2011