NAPLAN 2018 – Here we Go! (Where is that?)

A Bit of Perspective

So it’s that time of year again and we all get to chime in with our take on this thing called NAPLAN.

Here’s my short answer: a test that started out as a diagnostic assessment, meant for schools to use to improve their teaching and learning, missed the mark because of poor execution. The skills assessed were mostly based on expected learning from the previous year, and the results only get back to schools toward the end of the year, so there’s little chance to adjust or target strategies; the students are just about out the door for summer holidays and the next year level. Add to this that the data schools get are usually inscrutable and land only on the desks of learning leaders, and it’s not surprising that what was meant as a diagnostic hasn’t achieved its purpose. On top of all that, of course, is the politics of league tables (that aren’t league tables): what was meant as information for schools has shifted into a high-stakes test for students, so it’s no wonder people have strong opinions about NAPLAN.

But it’s not all bad. What we do get from NAPLAN is THE ONLY relatively valid and reliable assessment that spans Years 3, 5, 7, and 9 and repeats over time.  Now if we add some smart technology (instead of workshops on Excel) and put the data insights into the hands of teachers, that NAPLAN data set becomes quite useful.  Let me explain.

Insight from NAPLAN in 2 Screens

The Classroom Teacher Dashboard

Imagine starting a school year with information about students’ core skills, so that you can skip many of the usual baseline assessments. Here’s an animation of the Teacher Dashboard in Literatu’s Diagnostic Explorer.  When the NAPLAN Growth Weather card is clicked, notice that the class list in the right column is re-sorted.  If a Biology teacher were about to give a challenging reading assignment to this particular group of students, he or she would immediately see which students might need a little more support. This insight is generated by comparing the two most recent NAPLAN tests and highlighting which students have grown or gone backward in this skill.  Of course NAPLAN is a snapshot, but it is data, and I have yet to meet a teacher who isn’t interested in which students could use more help or extension.
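To make the idea concrete, here is a minimal sketch (not Literatu’s actual implementation — the function names and sample scores are illustrative) of the logic behind the Growth Weather card: compare each student’s two most recent NAPLAN scaled scores and sort the class so that students who have gone backward surface first.

```python
def growth(prev_score, latest_score):
    """Change in NAPLAN scaled score between the two most recent tests."""
    return latest_score - prev_score

def sort_by_growth(students):
    """Students who have gone backward (most negative growth) come first."""
    return sorted(students, key=lambda s: growth(s["prev"], s["latest"]))

# Hypothetical class list with two NAPLAN reading scores per student.
class_list = [
    {"name": "Ava",  "prev": 520, "latest": 560},   # grown
    {"name": "Ben",  "prev": 540, "latest": 510},   # gone backward
    {"name": "Cara", "prev": 500, "latest": 500},   # flat
]

for student in sort_by_growth(class_list):
    print(student["name"], growth(student["prev"], student["latest"]))
```

A teacher scanning this ordering sees the students most likely to need support at the top of the list before the first lesson.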

Student Level Skill Insight

Besides getting this information straight to classroom teachers and their students, for NAPLAN to make an impact we also need to go FAR BEYOND bands and dig down to skills and sub-skills at the student level.  Here’s one of my favourite reports: it applies traffic-light colours to show which areas are strong and which are weak, and uses the size of each bubble to indicate how many questions were asked in that skill area. In effect, this is NAPLAN telling us how important the skill is (e.g., “Applied comprehension in reading” requires more questions to measure accurately than using contractions correctly).  The animation below shows this at the class level, then shows how we can find areas to help even very strong students. Notice that you can see the specific skill areas.  This is how schools and teachers can help their classes and individual students close the skill gaps NAPLAN identifies.  Many schools say they want to show this screen to parents to highlight how they are addressing student needs.
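The two visual encodings described above can be sketched in a few lines. This is a hypothetical illustration, not the report’s real code — the colour thresholds, scaling factor, and skill names are assumptions made for the example.

```python
def traffic_light(pct_correct):
    """Map a skill's percentage-correct to a traffic-light colour."""
    if pct_correct >= 75:
        return "green"
    if pct_correct >= 50:
        return "amber"
    return "red"

def bubble_size(question_count, scale=10):
    """More questions on a skill means a bigger bubble, a rough proxy
    for how reliably NAPLAN can measure that skill."""
    return question_count * scale

# Illustrative skill-area results for one class.
skills = {
    "Applied comprehension in reading": {"pct": 48, "questions": 12},
    "Using contractions":               {"pct": 90, "questions": 2},
}

for name, result in skills.items():
    print(name, traffic_light(result["pct"]), bubble_size(result["questions"]))
```

Reading the output, a large red bubble (many questions, low success rate) is exactly the kind of skill gap worth a teacher’s attention, while a small green one can safely be left alone.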


So as 2018 NAPLAN rolls past, we look forward to helping schools get the most from this data set. NAPLAN:  one useful means to target improvement in teaching and learning.



Disclaimer: Literatu provides the Diagnostic Explorer described above to schools, which gives such data insights and visualisations on NAPLAN/PAT/Allwell tests.  So you might say we have an interest in NAPLAN, and we do. However, Literatu also offers a full formative suite that directly addresses the Gonski 2.0 report’s call for richer, ongoing formative diagnostic assessment of students on a regular basis. As a company, Literatu will adapt the Diagnostic Explorer to whatever output ACARA provides for the “adaptive” NAPLAN, or for any iteration of tests that might follow NAPLAN.  The reason for this disclaimer is to highlight that Literatu’s business is to help schools make sense of data chaos, regardless of the assessments used. We help schools get better. Easier. Faster.

Teachers achieving “Conscious Competence” with technology

What can be done to ensure that technology truly improves learning outcomes?

For the last twenty years, educators, governments, technology companies and publishers have built a narrative that by introducing a new technology, be it a digital book, LMS, SIS, PC, tablet or iPad, there would be an immediate improvement in student learning.

The reality to date is that no-one has established an accepted nexus between learning outcomes and the use of technology. In 2012, Higgins and his colleagues, in their meta-analysis of the numerous studies on the impact of digital technology on student learning, concluded, “Taken together, the correlational and experimental evidence does not offer a convincing case for the general impact of digital technology on learning outcomes” (Higgins et al., 2012).

Multiple teacher surveys suggest that a large proportion of teachers’ technology skills lie somewhere between Conscious Incompetence and Conscious Competence. That is, teachers are either aware they lack specific technology skills, or they know the skills they have are not yet second nature or fluent. This being the case, the foundations on which technology can be relied on to support stronger learning outcomes need to be shored up.


We believe the tipping point at which technology will significantly contribute to stronger learning outcomes will be when teachers reach the level of Unconscious Competence with technology. This is when teachers, as a natural part of their professional repertoire, enhance pedagogy and student outcomes by blending the art of teaching with efficiencies and data delivered by supportive technology.

We have five suggestions we think will help technology improve learning outcomes.

1. Support teaching with technology.
Research consistently shows that teachers, not technology, have the biggest influence on learning outcomes. It is, however, far easier to make technology accessible than it is to lift teacher skills into a state of unconscious competence. We must refocus on supporting and encouraging teachers with intuitive tools that build capabilities to better inform teaching and learning.

2. Start measuring learning – stop the fixation on managing learning.
Learning management is not learning measurement. For too long we have invested in technology that does not inform daily teaching and learning in an exacting context for each student. The idea that ‘I have taught it because it’s in the LMS’ has become a proxy for ‘they have learned it’, without a need for any independent check on what (if anything) has actually been learned. Technology needs to help teachers assess and measure learning.

3. Give teachers the tools to personalise teaching.
We would argue that the perceived need for more standardised ‘digitised’ curriculum content detracts from teachers focusing on having the answers to three critical questions every day. What does each student know now? What is each student ready to learn next? Where should I target and adapt my teaching? Personalised teaching happens naturally when teachers with an unconscious competence for technology are supported with quantitative capabilities.

4. Leverage data to inform teaching.
The most under-utilised, un-leveraged asset of every school is the learning data it produces every day. Schools must build a data capability and culture to surface data insights and help teachers target teaching, improve feedback and lift learning outcomes. According to Scottish writer Arthur Conan Doyle, “It is a capital mistake to theorise before one has data.” Yet, for centuries, the education industry has implemented teaching practices without any data to prove their efficacy.

5. Extend strategic outcomes with data and technology.
Improving teaching and learning outcomes using data is operationally very effective. The same data builds the foundation of the next strategic step. Machine learning and assistive intelligence (commonly referred to as artificial intelligence) offer capabilities to scale finite teacher resources to automatically predict outcomes from captured learning data. A new teacher-dedicated digital assistant can suggest, adapt and prescribe personalised learning on demand.

Mark Stanley – CEO – Founder – Literatu


Horizon Report 2017 – Upcoming Trends

The Horizon Report is a useful marker of near- and long-term trends in educational technology. The Preview version is available for K-12 and highlights some interesting things. Of particular note is the mid-term trend of a growing focus on measuring learning; a “mid-term” trend is one estimated to be widely adopted in three to five years. As an early leader in this area, we’re glad to see such growing attention and interest:

The passage on Measuring Learning states:

The proliferation of data mining software and developments in online education, mobile learning, and learning management systems are coalescing toward learning environments that leverage analytics and visualization software to portray learning data in a multidimensional and portable manner.

We believe that when teachers have access to clever and easy learning analytics, it empowers them to either act on or challenge their instincts. We like to call what we do “AI,” but instead of “artificial intelligence” we try to make it more real and less scary: how about “Awesome Insights” or even “Actual Information”?

We’d love to hear from you about what you’re looking for as data-informed support.

Fantastic Resource Page on Assessment of PBL

Andrew Miller has gathered a great list of links to help teachers interested in strategies for assessing students engaged in Problem-based Learning.

What strategies have you found useful? How often do you use PBL?