NAPLAN 2018 – Here we Go! (Where is that?)

A Bit of Perspective

So it’s that time of year again and we all get to chime in with our take on this thing called NAPLAN.

Here’s my short answer: a test that started out as a diagnostic assessment, meant for schools to use to improve their teaching and learning, has missed the mark because of poor execution. The skills assessed are mostly based on expected learning from the previous year, and the results only get back to schools toward the end of the year, so there is little chance to adjust or target strategies: the students are just about out the door for summer holidays and the next year level.  Add to this that the data schools receive are usually inscrutable and land only on the desks of learning leaders, and it’s not surprising that what was meant as a diagnostic hasn’t achieved its purpose.  On top of all that, of course, is the politics of league tables (that aren’t league tables): what was meant as information for schools has been shifted into a high-stakes test for students, so it’s no wonder people have strong opinions about NAPLAN.

But it’s not all bad. What we do get from NAPLAN is THE ONLY relatively valid and reliable assessment across year levels 3, 5, 7 & 9, and over time.  Now if we add some smart technology (instead of workshops on Excel) and put the data insights into the hands of teachers, that NAPLAN data set becomes quite useful.  Let me explain.

Insight from NAPLAN in 2 Screens

The Classroom Teacher Dashboard

Imagine starting a school year with information about students’ core skills, so that you can skip quite a few baseline assessments. Here’s an animation of the Teacher Dashboard in Literatu’s Diagnostic Explorer.  When I click on the NAPLAN Growth Weather card, notice that the class list in the right column is sorted.  If a Biology teacher were about to give a challenging reading assignment to this particular group of students, he or she would immediately see which students might need a little more support. These data are generated by comparing the two most recent NAPLAN tests and highlighting which students have grown or gone backward in this skill.  Of course NAPLAN is a snapshot, but it is data, and I have yet to meet a teacher who isn’t interested in which students could use more help or extension.
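The sorting behind that card can be sketched in a few lines. This is only an illustration of the idea, not Literatu’s actual code: the student names and scaled scores below are invented, and the rule (negative growth sorts to the top) is an assumption about how such a dashboard might rank a class.

```python
# Hypothetical sketch: rank a class by growth between the two most
# recent NAPLAN sittings, surfacing students who went backward first.
# All names and scaled scores are invented for illustration.

# (student, earlier reading score, later reading score)
results = [
    ("Asha", 512, 548),
    ("Ben", 530, 521),
    ("Carlos", 495, 560),
    ("Dina", 540, 538),
]

def growth(record):
    _, earlier, later = record
    return later - earlier

# Students with negative growth sort to the top, flagging who may
# need extra support on a challenging reading task.
by_growth = sorted(results, key=growth)

for name, earlier, later in by_growth:
    flag = "support" if later < earlier else "on track"
    print(f"{name}: {later - earlier:+d} ({flag})")
```

With the sample data above, Ben (−9) and Dina (−2) would appear first as candidates for support, while Carlos (+65) would be a candidate for extension.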

Student Level Skill Insight

Besides getting this information to classroom teachers and their students, for NAPLAN to make an impact we also need to get FAR BEYOND bands and dig down to skills and sub-skills at the student level.  Here’s one of my favourite reports: it applies traffic-light colours to show which areas are strong and which are weak, and uses the size of each bubble to indicate how many questions were asked in that skill area. In effect, this is NAPLAN telling us how important a skill is (e.g., “Applied comprehension in reading” requires more questions for an accurate measure than using contractions accurately).  The animation below shows this at the class level, then shows how we can find areas to help even very strong students. Notice that you can see the specific skill areas.  This is how schools and teachers can help their classes and individual students close the skill gaps identified by NAPLAN.  Many schools say they want to show this screen to parents to highlight how they are addressing student needs.


So as 2018 NAPLAN rolls past, we look forward to helping schools get the most from this data set. NAPLAN:  one useful means to target improvement in teaching and learning.



Disclaimer: Literatu provides the Diagnostic Explorer described above to schools; it gives these data insights and visualisations for NAPLAN, PAT and Allwell tests.  So you might say we have an interest in NAPLAN, and we do. However, Literatu also offers a full formative suite that directly addresses the Gonski 2.0 report’s call for richer, ongoing formative diagnostic assessment of students on a regular basis. As a company, Literatu will adapt the Diagnostic Explorer to whatever output ACARA provides for the “adaptive” NAPLAN or any iteration of tests that might follow NAPLAN.  The reason for this disclaimer is to highlight that Literatu’s business is to help schools make sense of data chaos, regardless of the assessments used. We help schools get better. Easier. Faster.

Get Insights from NAPLAN data – in 3 Screens

In response to recent media coverage of flat or backward NAPLAN results, I engaged in a correspondence with a reporter.  Here’s what I wrote:
The perspective I can offer is one that focuses on how schools get the data as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (from screenshots of our software). This said, my point is not to flog our software, but to highlight the value of EASY ACCESS to data insights and how, without this, the lack of growth is not a surprise, but is, in fact, what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows six years of NAPLAN Band achievement across years 3, 5, 7 & 9.  You can see that the real story here is one of No Growth – the results are essentially flat.  This is the story your report told today. The reason I see this slightly differently is that we have schools that are only just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to see this data easily (along with the next screens). Without easy access that unpacks the band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements?  So schools and teachers worked very hard, either doing the same things they have always done or guessing what needed fixing.
2) Unpacking the Data – from Skill problems to identifying Subskills 
No matter how hard teachers work, doing more of the same doesn’t necessarily address gaps in their students’ skills. Another visualisation shows how the data from the massive spreadsheets can be presented in a way that moves from seeing the problem to seeing what needs targeting. Here, “traffic light” colours signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know where to target their teaching:
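The traffic-light logic itself is simple enough to sketch. This is a hypothetical illustration, not Literatu’s implementation: the skill names, percentage-correct figures, and colour thresholds below are all assumptions chosen to show the idea that colour signals strength while bubble size (question count) signals how reliable the measure is.

```python
# Hypothetical sketch of the "traffic light" visualisation described
# above. Skill names, class results, and thresholds are invented.

skills = {
    "Applied comprehension": {"pct_correct": 41, "questions": 12},
    "Literal comprehension": {"pct_correct": 78, "questions": 9},
    "Using contractions":    {"pct_correct": 62, "questions": 2},
}

def traffic_light(pct):
    """Map a class's percentage correct to a colour (assumed cut-offs)."""
    if pct < 50:
        return "red"    # weak: target teaching here
    if pct < 70:
        return "amber"  # borderline: watch and support
    return "green"      # strong

for name, s in skills.items():
    # More questions -> bigger bubble -> a more reliable measure,
    # so a red bubble that is also large deserves attention first.
    print(f"{name}: {traffic_light(s['pct_correct'])}, "
          f"bubble size {s['questions']}")
```

On this sample data, “Applied comprehension” would show as a large red bubble (a reliable, serious gap), while “Using contractions” would be a small amber one (a weaker signal from only two questions).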
3) Give teachers Insight into the students right in their classes!
The fact that NAPLAN data is often 1–2 years old by the time it reaches schools and public attention makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), then schools find out about the results toward the end of their year with the students; and here we are, almost upon 2018 NAPLAN, with MySchool only now updated with 2017 NAPLAN data.  How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the “Teacher Dashboard”, where a school’s NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher.  Yes, the data may still be a year old, but now the classroom teacher can accommodate and differentiate what he or she does based on their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather report card for writing, I can see which 4 students went backward from their 2015 to 2017 NAPLAN tests and which 5 achieved above expected growth targets.  Then when I click the NAPLAN Skill Focus card (and its back side), I get details about the top 4 (then 8 when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest.  Again, clicking on the card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.

So, to sum up, I see a big part of the problem as this: classroom teachers have not been able to access the right kind of information easily enough to use the NAPLAN data (albeit a “snapshot” and a “diagnostic assessment being used as a high-stakes test” – two legitimate complaints against NAPLAN).  In fact, we have run into the situation where one of the leading state associations for schools takes the approach of helping schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only this year getting such access. We work with them to take charge of their remediation programs and initiatives, and expect to see upward trends as they continuously improve their teaching and learning practices.

I’d love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government – not something your reporting has ever done, but these bash-ups tend to be what’s buzzing in the media.  Perhaps a better, more productive approach is to use smart software to provide data insights?

NAPLAN Resources

As schools find gaps in students’ core skills across the NAPLAN domains, the following resources might be helpful.  Please use the comments link to share your favourite resources.





General Resources

  • ABC Splash – Games across subjects and year levels

From NSW SMART Teaching

Language Conventions (Spelling & Grammar)




Grattan Institute’s Adaptive Education Report

We welcome the Grattan Institute’s recent report, “Towards an adaptive education system in Australia.”  In it, researcher Peter Goss argues that “our current education system is not fit for purpose given the complex challenges it faces.”  These challenges are familiar to anyone interested in Australian education: flat or backward performance on important tests, the number of students not finding success after high school, and inequality between schools.  Goss rightly identifies two keys to addressing these: changes to education must be systemic, and they must be based on real evidence.

“The status quo is not working”, says Goss.  We see this in NAPLAN Band ranges.

Many have argued this case for years and championed specific pedagogical approaches such as Problem-based Learning, Understanding by Design and STEM, to name only a few.  In fact, I have been involved in many of these initiatives – and saw them fail to make the systemic change required and advocated for by Goss.  We are past the era of needing “new ideas”; instead, we need to put these (and many other) ideas to the test.  The “Adaptive Education” model put forth by Goss will be familiar to those who have pursued a “closed-loop” or “continuous improvement” process.  But like Goss, we find few such efforts used in ways that effect whole-school or sector change.  This is not for lack of trying on the part of schools and teachers, but for lack of good data.

Fortunately, using data as evidence is more possible today than it was a decade ago.  The reason for this readiness is twofold: a growing cultural appreciation of “Big Data”, and the sophistication of the tools now available to make these data insights accessible to schools and their communities.

For over four years, Literatu has been developing powerful analytical software for schools, and we can confirm a general “flat or backward” direction of student performance in NAPLAN scores.  But we are seeing something very powerful as well.  School leadership teams and whole staff rooms are excited and energised to engage in just the targeted type of teaching identified as essential by the Grattan Institute’s report.  At issue was not an unwillingness of schools to take such action, but the fact that students’ learning gaps were buried in spreadsheets and hard-to-use software.  What seems to be a dawning realisation by schools that “there must be a better way” has happily led to a boom in schools’ use of Literatu’s NAPLAN Explorer.  This diagnostic tool provides easy access to detailed information in a friendly dashboard, so that classroom teachers – not just school leaders – can quickly gain insights that naturally lead to targeted teaching and differentiation.  What’s even better is that these teacher actions generate new data on student performance, which feeds back to validate or challenge the effectiveness of the interventions trialled.  This is an exciting time to be an educator because, after decades of working “in the dark,” real evidence is at our fingertips and a single click away.  To repeat a very apt phrase, data-inspired teaching “is like what you’ve always done, but unlike anything you’ve done before.”

We encourage schools interested in seeing how easily teachers can grow an adaptive educational system to contact us for a friendly online demonstration.



NAPLAN and Career Aspirations


A recent article in the Sydney Morning Herald, “Year 5 NAPLAN scores could shape career goals: study” by Pallavi Singhal, shares research from Professor Jenny Gore of the University of Newcastle.  The main focus of the study is the correlation found between Year 5 NAPLAN results and student aspirations.  Clearly, “typecasting” students and their ability and potential is never helpful, but the last line of the article particularly resonated with the work we do at Literatu.  Our purpose is to create clever algorithms that spin out analytical insights, making it easy for teachers to see the real gaps in student skills – moving completely away from those unhelpful blanket statements about being “good” or “bad” at maths or English.  We need to move far deeper into NAPLAN than the band scores if we hope to lift the very typical “flat line” we see so often.

So what was the last line?

“It’s a messy, complex world we work in, in teaching.”

True enough, but I thought this should not be a lament but a starting point, given our promise to schools:

“Transform data chaos into student success.”

And this is the current reality!

The Twofold Importance of Evidence

Every school has a vision for student success – sometimes this is clearly defined and supported with learning principles, and at other times it’s less defined but clearly part of the school’s “ethos” or the values that inspire teachers every day.  This shared sense of purpose is essential so that everyone is working toward the same goal.

Once a rich vision for student achievement is shared by all staff, students and parents, I see “evidence” as the next important step. There are two main reasons.  First, in the spirit of “backward design,” we should develop measures that capture evidence of what we’re after before we consider the teaching strategies to achieve it.  This way, what we do in the classroom won’t just be “good strategies,” but good strategies chosen to produce the desired results.  In other words, if the measures for evidence are well designed, their fulfilment provides validation that the vision has been achieved.

The second reason for setting “evidence” as the next step is that “testing” communicates what “really matters.”  Within the last month, approximately one million students in Australia sat the NAPLAN tests for Literacy and Numeracy. I use NAPLAN here to illustrate a fundamental truth: the very act of assessing defines what’s important.  Literacy and Numeracy are two of the seven General Capabilities meant to underpin the Australian Curriculum, but how often do we target success in the other five?  How many parents or students could name even one? Yet few would deny the importance of the other five: ICT capability, Critical and creative thinking, Personal and social capability, Ethical understanding and Intercultural understanding.  Thus, even though all seven apparently warranted inclusion in the Australian Curriculum, it’s clear which two “matter.”

My point is not to downgrade the importance of being literate or developing mathematical capabilities, but to illustrate that the very act of testing something makes it important.  And important not only to teachers, but clearly also to students, parents and, ultimately, the wider community.

For this reason, I suggest that schools consider developing a few richer assessments and building these into the curriculum. Consider setting such performances across the year levels, so students can demonstrate their achievements with increasing sophistication as they mature and develop their abilities.  Many schools use such an approach to encourage student wellness and “21st Century” skills.

What are you doing at your school?  Please share your experiences and insights.
