NAPLAN results inform schools, parents and policy. But too many kids miss the tests altogether
Today the NAPLAN testing window starts for more than a million students in Years 3, 5, 7 and 9. Over the next nine days students will sit literacy and numeracy tests which are designed to measure their reading, writing, numeracy, grammar, punctuation and spelling.
Education decision makers will be holding their breath about how many students turn up for NAPLAN. Last year saw the steepest declines on record in secondary school student participation.
This is an issue because NAPLAN results help inform parents, teachers, schools and education authorities about student learning and can influence decisions about policies, resources and additional supports for students. Declining NAPLAN participation may result in decisions being based on incomplete data.
In our new paper for the Australian Education Research Organisation, we look at who is not sitting the tests and why that matters.
Who is not sitting the tests?
While primary school student participation in NAPLAN has been steady at about 95% since 2014, secondary student participation has been in persistent decline. Last year only 87% of Year 9 students sat the tests.
A sharper decline in participation in 2022 was partly due to flooding in regions across Australia, high rates of illness and COVID-19 isolation requirements – circumstances we hope will not be repeated. It is the long-term decline in NAPLAN participation in secondary schools that needs attention.
The participation rate is alarmingly low for some groups of students. The figure below shows only 79% of Year 9 students living in remote Australia sat NAPLAN last year. First Nations students and students from educationally disadvantaged backgrounds also had low participation rates in 2022: 66% and 75% respectively.
Our analysis reveals low-performing students are also less likely to participate in the tests. Students who performed poorly in NAPLAN in Year 7 were nearly five times more likely to miss the Year 9 tests than high-performing students. These findings were replicated for primary students.
Students who are educationally at risk need the best decisions from schools and education authorities. If NAPLAN participation rates are low for these smaller populations, the data is less reliable and the ability to make informed decisions may be compromised.
Why aren’t students sitting the tests?
There are three official reasons students do not sit NAPLAN: they may be exempt from taking the tests, withdrawn by their parents, or absent on the day.
The main reason for the long-term decline in NAPLAN participation is that more parents have been withdrawing their children from the tests. In 2022 over 11,000 Year 9 students didn’t sit the writing test because they had been withdrawn from it.
Being absent is also a contributing factor in the decline in participation; more so for secondary students than primary. In 2022, more Year 9 students than usual were absent from the writing test (in total over 28,600).
There are many reasons students are absent from or withdrawn from NAPLAN. Parents who are worried about how their child may be affected by taking the tests and receiving results may choose to keep them at home or formally withdraw them from the tests. Anecdotally, there have also been reports of schools asking low-performing students to stay home on testing days, so they don’t “drag down” school averages.
On the positive side, our analysis showed Year 9 students with language backgrounds other than English participated in higher proportions than average (92% compared to 87%). This suggests cultural differences and family attitudes to education and testing might play an important role in participation.
Why is high NAPLAN participation important?
NAPLAN data is used by education authorities to better understand the learning progress of all Australian students to inform system-wide policies and support.
It also helps schools, systems and sectors monitor and evaluate the effectiveness of educational approaches, and identify schools that need more support.
For example, in NSW, NAPLAN data has been used to understand whether a new teaching role and giving students more practice time have been effective in improving students’ writing skills.
In Victoria, Brandon Park Primary School used its NAPLAN results to inform a whole school change to its teaching of reading, which brought remarkable success.
Given the benefits that good use of NAPLAN data can bring, it is critical the results are representative of the student groups being tested.
While the Australian Curriculum, Assessment and Reporting Authority (ACARA) estimates data for withdrawn and absent students, our analysis suggests these estimates are likely to overstate student proficiency.
That’s because students not sitting the test are more likely to be lower-performing students from their respective demographic groups. Real data is always better than estimates.
The Australian education system is meant to be about achieving equitable outcomes from education for all students.
Equity is something we should all expect and support.
To achieve it, we need accurate information about student progress on a national scale. NAPLAN is meant to provide that information, so we should support and encourage students to turn up for the tests and try their best.
Lucy Lu, Adjunct Senior Lecturer, Faculty of Education and Social Work, University of Sydney and Olivia Groves, Adjunct Research Fellow, Curtin University
This article is republished from The Conversation under a Creative Commons license. Read the original article.