For many educators and parents, Australia's NAPLAN is controversial. But whatever your opinion of NAPLAN, the data does provide some information about student achievement.
According to the Grattan Institute's new model for interpreting NAPLAN data, there is a widening achievement gap in Australian schools.
For example, there is a spread of achievement of up to 7 years between students in year 9, which has been described as alarming.
The Grattan Institute's Dr Peter Goss explained the findings in this episode of the Learning Capacity podcast, and discussed the implications of the widening achievement gap for Australian education.
Our discussion was so extensive that I have recorded it as two podcast episodes. Be sure to catch part two, episode 57, where we discuss the effects of disadvantage on NAPLAN results.
Listen to the podcast.
- NAPLAN (National Assessment Program - Literacy & Numeracy)
- Learning gaps
- Repeating a year
- The equivalent year of learning
- Years of progress
- The effects of disadvantage
- National minimum standards
- A proficiency standard
- A competency standard
If you would like to read the complete podcast transcript, here it is:
Episode 54 of The Learning Capacity Podcast
NAPLAN shows alarming achievement gaps. Grattan Institute’s Dr Peter Goss explains.
Colin Klupiec: We're here to talk about your latest report, Widening Gaps, with respect to NAPLAN. And if I could put it simply like this, there are two major areas that the report talks about and that is there is an enormous spread in student achievement, at least as reported by NAPLAN.
Dr Peter Goss: Yes.
Colin: And there's the effect of disadvantage, and we'll talk about what kinds of disadvantage there are in a moment, whether it comes from the school or from the student themselves. So, first of all, let's just talk about the spread of achievement. Let me start with something reasonably controversial. Many people look at NAPLAN and go, "NAPLAN, we will suffer through it. We do it. We get good results."
But, actually, it's like one of those standardised tests that you might see overseas, and they are awful. Now in your report, you talk about findings that seem to indicate widening gaps in student achievement. And to give us some perspective, what sort of gap are we talking about, and how wide is that?
Peter: So first, in terms of NAPLAN: having a standardised test like we do, I think we are actually fortunate as a nation to have, effectively, a mirror that reflects back how well we are doing, something we can compare against. And we need to take that into account. As for what we see in this mirror: the results we get from NAPLAN are quite tricky to interpret, but when we correct for that and translate them into what we call equivalent years of learning, we see some patterns about these widening gaps.
In year 3, the spread from the bottom 20 percent of students to the top 20 percent covers just over two and a half years' worth of learning. So that means those top students might be working at a year four and a bit level, while the bottom students are working closer to a year two level. That's a big spread already in year three.
Colin: Yeah, I was going to say already that's quite large.
Learning gaps increase as students go through school
Peter: And then the question is, does school actually function to close those gaps? Which is something that we would hope. That if a student is behind in their reading or behind in their mathematics that the teacher picks it up, identifies those gaps, helps the student close them. Or do the gaps increase? And the clear finding from our report was that they increase on average as students go through school.
By year 9, the top 20 percent of students are about five and a half years ahead of the bottom 20 percent. And if we were to look at the top 10 and bottom 10 percent of students, the gap would be wider again. Now, these are really enormous gaps, not only in terms of what the students know and can do but also in how well it will set them up for life as they finish school and move beyond.
Colin: So, I guess one way to look at it is that you might see a school that has different year levels. It's got year three, four, five, six, seven, eight, nine. And you might think of those different cohorts as students at unique stages of their learning, but if you were to take a helicopter view and sort of zoom out a little bit, it's all just kind of mixed because of the differences in achievement levels across all of the years.
Peter: It's very mixed. So, some students in year three have reading capabilities at the same level as some struggling students in year seven. That doesn't mean they should be put in a year seven class, because different things are expected and the content is different. They would read different types of books.
But in terms of their ability to be able to understand the text and respond to it, then naturally we should be expecting a lot more from some students. And then for students who are struggling, we need to recognise what they can actually do today and not try to expect them to close the gap in one bound but to close the gap step by step.
Colin: Now, your report talks about a different way of measuring progress, which we'll come to in a minute, but it also makes the comment that there's no way that a student can catch up just by doing more of what they do at school. You talk about the rate of learning needs to accelerate. There is also a comment that seems to suggest that there's an inbuilt stagnation or a problem that's entrenched in the system. Can you elaborate on that?
Peter: So, firstly, it's absolutely possible for many students to catch up and accelerate their rate of learning. The only way that's going to realistically happen is if teachers identify what they know now and what they don't know and teach them that next piece.
I'll give you a little bit of a story on that, and then we can explain why this is hard. A teacher told me about a year four student who was a couple of years behind in her writing, and the teacher had been struggling to get her to improve.
The teacher took what is called a learning continuum, and I talked about these in my last report, and asked the little girl, "What can you do now?"
She knew exactly what she could do. And instead of saying, "Here are the six steps you need to jump," she said, "Well, let's just focus on what the next thing is. At the moment, you can write one sentence well. Instead of asking you to write three paragraphs, tomorrow let's focus on writing two sentences or three sentences and then building from there."
And in a term, that girl made a year's worth of progress. And not only caught up in her learning but got an amazing life experience of saying, "Actually, when you put one step in front of the other, that's going to take you a long way." So catch up is possible.
The flip side is that if you have students who are two or more years behind, and the teacher is not taking any account of that, then a student in that position is going to be struggling to write two or three sentences but is going to be told to write three or four paragraphs. And she's going to fail at that.
She's going to feel terrible. She's not going to be learning the things that the other people in the class are learning, and next year she will have only moved a little way forward whereas her classmates will have moved further. Unless something different is done, she will fall further and further behind. So it's not inevitable, but we are never going to close the gap without recognising and supporting each student.
Colin: Are we talking about interventions now, as in direct interventions, or how a teacher might...? I mean the word that is used is differentiate. But on that scale, we're talking about massive differentiation for every single student. That's very difficult in practical terms. What you are suggesting is intervention and a differentiated approach and somehow trying to make that work.
Start differentiation from the first week of primary school
Peter: That's right: start the differentiation from the very first week of primary school. That's one of the recommendations in the report, particularly for disadvantaged areas, where students are more likely to have fundamental gaps, or to arrive at school not ready to thrive.
Starting to identify what each student knows now and getting each student to spend as much time as possible focusing on an achievable task that moves them forward. Whereas that little girl, if she had been asked to write three or four paragraphs, that's not an achievable task. That's not a good learning experience.
So the first, universal level is targeted teaching for all students. When students have highly specific learning needs or difficulties, have a fundamental gap, or are many years behind, that's when we bring in extra support.
But if we can differentiate the teaching from early in primary school, we should have fewer students falling a long way behind, and for those who have genuine learning difficulties, we should be able to focus our resources more precisely on them.
Repeating a year doesn’t help students catch up
Colin: Some people might suggest, though, that if there are students who aren't progressing at the same rate, to put it colloquially, "can't they just repeat a year?" Your report makes comment about that, and I would guess that almost anybody listening to this conversation would say, "I know someone who's repeated a year." Perhaps they have themselves. But your report suggests that that's not a good way to go. Can you talk us through that?
Peter: That's absolutely right. It's not that repeating a year should never happen. There may be specific circumstances, but repeating a year is happening too often in Australia. The reason I say too often is because the research evidence is very clear that it's ineffective. Most students who repeat a year don't progress and catch up.
In fact, their learning goes backward relative to their peers. Part of the explanation seems to be that if you didn't learn the material the first time around, and you go back into a class and get exposed to the same material in the same way, why would you learn it the next time around? And you are also stigmatised because, colloquially, you are the dumb kid who had to repeat a year. There are ways around this.
The second part of "too often" is that the statistics say 1 in 12 Australian students will repeat a year at some point in their school career, whereas in the UK the rate is only about a third of that.
I haven't had the chance to really look into where this is happening, but it costs a lot of money, because that's an extra year's worth of schooling for something that is ineffective. We should be aiming to reduce the number of kids held back a year and redirect that funding to help make sure they learn well the first time.
Colin: I guess the repeating a year thing also relates back to the entrenched failure from year to year, because if we accept that repeating is kind of okay, in spite of what the evidence says, then we're not really doing anything to stop that entrenchment.
Peter: So it's not clear how this plays out in Australia. In the US, it's crystal clear. There's been a lot of work done on it, and African-American students repeat a year dramatically more often, in some places up to one in three. So there it absolutely is entrenching disadvantage. I hope we don't go down that path.
Colin: Well, I hope so too. Let's now take a look at what your report is talking about with a new way to compare student progress.
So anyone who's ever looked at NAPLAN numbers, and not just the certificates that the students get, will probably say, "Okay guys, what's the problem? I see lines that go up, and I also see lines that suggest that the students at the bottom end are also making improvements as well as the students in the top end, and in some cases the lines seem to narrow."
So is that a problem? Your report seems to suggest just looking at that could be a little misleading. Can you talk us through your new measure for progress?
A new way of measuring progress
Peter: Absolutely. So, as I said, I think we are fortunate to have NAPLAN as a test. The things that it's measuring are real, and they predict important things that happen later on. But it's been known since NAPLAN started that students in earlier years in school tend to gain more points than students in later years of school.
It goes further. If you are in year three and you are a low performer in year three, you tend to gain a lot of points on average, whereas if you are a high performer in year three, you tend to gain fewer points. And so, observably, in different groups of students that we looked at, the lower you start in NAPLAN points, the faster you're moving.
Imagine a cycling analogy. It's a little bit like cycling up an increasingly steep hill. When you are on the flat you are moving fast, but the further ahead you get, the steeper the hill gets and the more you slow down. That's fine. That's what is being measured.
But that property of NAPLAN makes it really hard to compare different groups of students. Let me continue the cycling analogy. Imagine that I went out riding with Cadel Evans, who's a world champion rider.
Colin: I can imagine it very well, actually. Not that I know Cadel Evans, but I'm a cyclist. So I'm hearing you.
Peter: Fantastic. If I went out riding with you on a course that starts on the flat and goes upwards, then you or Cadel would shoot off ahead of me, and that's fine. At some point, if we took a snapshot of how fast we were each riding, Cadel, who would be riding up the hill, might be doing 15 kilometres an hour, while I'm still mostly on the flat and might be doing 30 kilometres an hour.
Now, on the speedos, as a matter of fact, I am riding faster than Cadel Evans in this scenario. But that's not really the question I want answered. What I want to know is: am I riding better than Cadel Evans? And the answer is clearly no, because when I get to the hill I'll be doing 10, whereas when he was on the flat, he was doing 45.
Unless we find a way of comparing how fast we were riding over the same part of the course, we can't make the kinds of comparisons we want to. Because in that case, again, I will never catch up to Cadel, even though I'm riding faster. I will never get to him, because as I get to the hill, I will always be going slower than he was at that point. So let's take this back to NAPLAN.
The equivalent year of learning
In NAPLAN, you gain a lot of points from a low base, and that slows down. What we have done is correct for that curve and translate it into a measure that is actually very intuitive, which we call the equivalent year of learning. So the equivalent year of learning for year three is what the national average student scored in year three. To be precise, we use the median; there are some technical reasons for that.
Likewise for year five, year seven and year nine. And in between those points we can estimate what a typical student would have scored had they sat the NAPLAN test at year six, or at year six and six months.
We can also go below year three and above year nine. A way to think about that: if in year 9 the average student scores 585 (I think that's the numeracy figure), there is also a bunch of students who scored 585 in numeracy in year 7. They were already on that part of the course, and we can see how much they progressed in the next two years.
So we can estimate what the typical student might score if they continued learning in the same way to year 11. Once we've done this, we can translate any NAPLAN score into an equivalent year of learning, and then we can say... let's say you and I both took the mathematics test in year five. You might be at an equivalent year of learning of year five and five months; I might be at year five and one month.
Then when we track later, we can actually meaningfully compare the difference. In this example, I was four months behind you in year five. Later on, we can see am I less than four months behind or more than four months behind, and get a new and clearer view of whether I've caught up.
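The translation Peter describes can be sketched as interpolation between the national median scores at each tested year level. Here is a minimal illustration in Python, using simple linear interpolation and made-up median scores (the real NAPLAN medians, and the Grattan Institute's curve-fitting and extrapolation below year 3 and above year 9, are more sophisticated):

```python
# Hypothetical national median NAPLAN scores at each tested year level.
# These numbers are illustrative only, not real NAPLAN medians.
MEDIANS = {3: 400, 5: 480, 7: 540, 9: 585}


def equivalent_year_of_learning(score):
    """Translate a NAPLAN score into an equivalent year of learning
    by linear interpolation between the year-level medians.

    A toy sketch: the Grattan model fits a smooth curve and also
    extrapolates outside the year 3 to year 9 range.
    """
    years = sorted(MEDIANS)
    # Clamp to the range covered by the medians, for simplicity.
    if score <= MEDIANS[years[0]]:
        return float(years[0])
    if score >= MEDIANS[years[-1]]:
        return float(years[-1])
    for lo, hi in zip(years, years[1:]):
        s_lo, s_hi = MEDIANS[lo], MEDIANS[hi]
        if s_lo <= score <= s_hi:
            # Fraction of the way from the lower to the upper median.
            frac = (score - s_lo) / (s_hi - s_lo)
            return lo + frac * (hi - lo)


# A score midway between the year 5 and year 7 medians maps to an
# equivalent year of learning of 6.0.
print(equivalent_year_of_learning(510))  # -> 6.0
```

The point of the measure is visible here: two students' scores can be compared on a common "how far along the course" scale, so a gap of four months in year five can be meaningfully checked again in later years.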
Colin: So instead of just looking at a number, like a gain score, we can then start to think about it in terms of, "Well, how far along the track am I?" Which is a much more...you used the word intuitive, but I guess it's a realistic way that people can look at their own progress.
I think to be fair when you do a test at school, you get a mark and you get a percentage. And one of the things that I found, particularly when I've been in the classroom and administered tests and then given results back, the first thing students do is they go for their calculators and they work out the percentage.
They only want to know what the number is, but they don't necessarily want to think about what that number represents relative to the last time they did something of a similar nature. What your report is suggesting is that this way of looking at NAPLAN results gives us a better way of saying, "Okay, that particular student is performing at this particular year level and has made this kind of progress, which we can more easily understand."
Peter: That's right, and, again, the only way that students are going to catch up is by making faster relative progress. And this is where the nature of the curve can lead to potentially misleading interpretations if we're not careful. I'll give a bit of a crazy example. It's a real one.
But suppose I took some of the most disadvantaged students in Australia (low achievers in year three whose parents didn't have a high level of education) and some of the most advantaged students (those who did well in year three and had high levels of parental education). Because of the curve, and because you slow down the further up you go, NAPLAN would suggest that the first group, the low achievers with less educated parents, make higher NAPLAN gain scores than the more advantaged kids.
That would be lovely if it's true, but that goes against a wealth of research that says if you've got the advantage of parental education you are going to learn faster, and if you did well earlier on you are going to learn faster. The property of the curve makes it very hard to interpret. When we put that in our equivalent years of learning, we find something that resonates far more, which is that the bright kids with highly educated parents are learning at a faster rate.
The gap between the top and the bottom students is 7 years
Colin: As I said before, curves are always nice to look at on charts, because if you see an upward-trending line you go, "What's wrong with that?" Just to come back and emphasise exactly how wide this gap is, I'm going to use this to preface another question. I'm actually going to have to read this out, because it's staggering and you want to get the numbers right. The report claims that by year 9 the top 10 percent of students could be as much as 8 years ahead of the bottom 10 percent of students in any given class. 8 years.
Peter: Two parts: across the population, eight years. Within any given class or any given school, seven years, because each school has a subset of the population. Seven years is still staggering.
Colin: I was just going to say let's call it seven years, all right? Seven years is still a lot. If you think about it, seven years is more than high school. High school is six years. Someone might say, "Well, all right, guys. You're really off on a wild tangent here. Clearly there's something either wrong with your methodology, or maybe NAPLAN, which we never really liked in the first place, has got problems." But you're suggesting that we actually stick with NAPLAN. Why is that?
3 reasons to stick with NAPLAN
Peter: A lot of work has been put into NAPLAN to give it a really good underlying progression of skill, so that if you score 500, someone else scores 510, and someone else scores 520, then on average they are doing better in that order. So it's a robust test. Secondly, it's done nationally. The third reason is that this really wide gap is not just a feature of the NAPLAN test and the way that we've interpreted it.
There are other tests out there that look at the specific skills we would expect in different years. One that I quoted in a previous report looks at that for multiplicative thinking.
And in year eight it will show that some students have all of the concepts required in year eight, while some students are still struggling with concepts from year one. Now, that's not a score on a test scale. That's asking, "Can you perform the skill that is required?"
So there are independent ways of verifying that the gap really is this large. NAPLAN gives us an unparalleled opportunity to understand these patterns across a system. We have to be careful about how we apply them, but we need to know what's happening.
Colin: We are quite lucky in Australia because we are one of the few countries that has a test like this, a national test. Is that right?
Peter: Yes, and one that is done across multiple year levels, three, five, seven, and nine, and marked on the same scale so that we can actually look at the progression of students.
National minimum standards are too low
Colin: Just before we talk about the effects of disadvantage, which I think is a very significant thing to talk about, your report suggests that national minimum standards are way too low, and that we could therefore be seeing, in effect, an overlap. In other words, some students who appear to be okay because they sit above the minimum standard are actually performing well below their year level. So we should either raise the standard or get rid of minimum standards altogether. If we push to remove minimum standards, does that make measuring progress harder?
Peter: No, it doesn't. So, two points: what did we see, and what should we do? What we saw is that a year nine student who is just above the cut point of the national minimum standard is performing considerably below the average year five student. In reading it's slightly worse: it's about a year four and a half level.
Colin: That's a staggering number.
Peter: That's astonishing. So when we report nationally, and we do, that 90-something percent of Australian students are reading at or above the national minimum standard, we are actually saying 90-something percent of the students are no more than 4 and a bit years behind the average of their peers. Not good enough.
Colin: So making the test easy makes everything look okay?
Peter: If you set the bar too low, it's hard to aim high. Reading at a year four and a half level, coincidentally, means you haven't fully made the transition from learning to read to reading to learn. To succeed in high school, you've got to be reading in order to learn other things, not still learning to read.
Colin: That's a critical tipping point, really. Isn't it?
Peter: Yeah, at about that year three, four level it is a critical tipping point. So it's set exceptionally low, and if we're only identifying that as the problem area, then I think we are giving ourselves false comfort and not supporting students who need it. What should we do about it? ACARA, to their credit, has recognised that it's set very low.
When they move to online NAPLAN, they're going to bring in two higher standards: a proficiency standard, which is what we'd hope students get to, and a competency standard, which is what a stronger student will show.
We are 100 percent supportive of those measures, but the idea of national minimum standards has stuck. And accepting that some students can be four years behind and still above the threshold is unhelpful. Either it needs to get raised, or it should get dropped when we have the other standards in place.