NAPLAN 2018 – Here we Go! (Where is that?)

A Bit of Perspective

So it’s that time of year again and we all get to chime in with our take on this thing called NAPLAN.

Here’s my short answer: a test that started out as a diagnostic assessment, meant for schools to use to improve their teaching and learning, missed the mark because of poor execution. The skills assessed are mostly based on expected learning from the previous year, and the results only get back to schools toward the end of the year, leaving little chance to adjust or target strategies because the students are just about out the door for summer holidays and the next year level. Add to this that the data schools get are usually inscrutable and land only on the desks of learning leaders, and it’s not surprising that what was meant as a diagnostic hasn’t achieved its purpose. On top of all that, of course, is the politics of league tables (that aren’t league tables): what was meant as information for schools has been shifted into a high-stakes test for students, and it’s no wonder people have strong opinions about NAPLAN.

But it’s not all bad. What we do get from NAPLAN is THE ONLY relatively valid and reliable assessment across year levels 3, 5, 7 & 9, and across calendar years. Now if we add some smart technology (instead of workshops on Excel) and put the data insights into the hands of teachers, that NAPLAN data set becomes quite useful. Let me explain.

Insight from NAPLAN in 2 Screens

The Classroom Teacher Dashboard

Imagine starting a school year with information about students’ core skills, so that you can skip quite a bit of baseline assessment. Here’s an animation of the Teacher Dashboard in Literatu’s Diagnostic Explorer. When the NAPLAN Growth Weather card is clicked, notice that the class list in the right column re-sorts. If a Biology teacher were about to give a challenging reading assignment to this particular group of students, he or she would immediately see which students might need a little more support. This data is generated by comparing the two most recent NAPLAN tests and highlighting which students have grown or gone backward in this skill. Of course NAPLAN is a snapshot, but it is data, and I have yet to meet a teacher who isn’t interested in which students could use more help or extension.
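
If you’re curious what a comparison like that looks like under the hood, here’s a minimal sketch in Python. The field names, scores and the simple sort are illustrative assumptions – a toy version of the idea, not Literatu’s actual schema or code:

```python
# Toy sketch: compare each student's two most recent NAPLAN scaled scores
# in one domain, then sort the class so students who went backward surface
# first. All names, fields and scores are invented for illustration.

students = [
    {"name": "Avery", "reading_2015": 480, "reading_2017": 530},
    {"name": "Blake", "reading_2015": 510, "reading_2017": 495},
    {"name": "Casey", "reading_2015": 450, "reading_2017": 520},
]

def growth(student):
    """Change in scaled score between the two most recent tests."""
    return student["reading_2017"] - student["reading_2015"]

# Ascending sort puts students who went backward at the top of the list,
# where a teacher planning extra support would look first.
for s in sorted(students, key=growth):
    trend = "went backward" if growth(s) < 0 else "grew"
    print(f"{s['name']:8} {growth(s):+4d} ({trend})")
```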

Student Level Skill Insight

For NAPLAN to make an impact, besides getting this info directly to classroom teachers and their students, we also need to get FAR BEYOND bands and dig down to skills and sub-skills at the student level. Here’s one of my favourite reports: it applies traffic-light colours to show which areas are strong and which are weak, and uses the size of the bubbles to indicate how many questions assessed each skill area. In effect, this is NAPLAN telling us how important a skill is (e.g., “Applied comprehension in reading” requires more questions to discern an accurate measure than using contractions accurately does). The animation below shows this at the class level, but then shows how we can find areas to help even very strong students. Notice that you can see the specific skill areas. This is how schools and teachers can help their classes and individual students close the skill gaps identified by NAPLAN. Many schools say they want to show this screen to parents to highlight how they are addressing student needs.
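
For the technically minded, here’s a toy sketch of the two encodings this report combines: a traffic-light colour for cohort performance on a skill, and a bubble sized by how many questions assessed that skill. The thresholds and figures are assumptions for illustration, not the report’s actual cut-offs:

```python
# Toy sketch: colour each skill by percent correct and size its bubble by
# question count (more questions = a more reliable measure of the skill).
# Thresholds and data are illustrative assumptions only.

skills = {
    "Applied comprehension": {"pct_correct": 48, "questions": 12},
    "Using contractions": {"pct_correct": 83, "questions": 3},
}

def traffic_light(pct_correct):
    """Map percent correct to a colour band (assumed cut-offs)."""
    if pct_correct < 50:
        return "red"
    if pct_correct < 70:
        return "amber"
    return "green"

for skill, result in skills.items():
    colour = traffic_light(result["pct_correct"])
    bubble_size = result["questions"] * 10  # more questions -> bigger bubble
    print(f"{skill}: {colour}, bubble size {bubble_size}")
```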


So as 2018 NAPLAN rolls past, we look forward to helping schools get the most from this data set. NAPLAN: one useful means of targeting improvement in teaching and learning.


Disclaimer: Literatu provides the above Diagnostic Explorer to schools, which gives such data insights and visualisations on NAPLAN, PAT and Allwell tests. So you might say we have an interest in NAPLAN. And we do. However, Literatu also offers a full formative suite that directly addresses the Gonski 2.0 report’s call for richer, ongoing formative diagnostic assessment of students on a regular basis. As a company, Literatu will adapt the Diagnostic Explorer to whatever output ACARA provides for the “adaptive” NAPLAN or any iteration of tests that might follow NAPLAN. The reason for this disclaimer is to highlight that Literatu’s business is to help schools make sense of data chaos, regardless of the assessments used. We help schools get better. Easier. Faster.

Get Insights from NAPLAN data – in 3 Screens

In response to recent media coverage of flat or backward NAPLAN results, I corresponded with a reporter. Here’s what I wrote:
The perspective I can offer is one that focuses on how schools get the data, as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (screenshots of our software). That said, my point is not to flog our software but to highlight the value of EASY ACCESS to data insights – and how, without it, the lack of growth is not a surprise but is, in fact, what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows six years of NAPLAN Band achievement across years 3, 5, 7 & 9. You can see that the real story here is one of No Growth – the results are essentially flat. This is the story your report told today. The reason I see this slightly differently is that we have schools that are just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to see this data (and the next screens) easily. The point is: without easy access to unpack the band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements? So schools and teachers worked very hard, either doing the same things they have always done or guessing what needed fixing.
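
To make “essentially flat” concrete, here’s a tiny illustrative calculation. The band figures are invented to show what six years of no growth looks like as data, not any school’s real results:

```python
# Invented mean Year 7 band scores over six NAPLAN cycles.
mean_band_year7 = {2012: 5.6, 2013: 5.7, 2014: 5.6,
                   2015: 5.7, 2016: 5.6, 2017: 5.6}

# Net movement across the whole period: effectively zero.
change = mean_band_year7[2017] - mean_band_year7[2012]
print(f"Change over six years: {change:+.1f} bands")  # +0.0 -> flat
```
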
2) Unpacking the Data – from Skill Problems to Identifying Subskills
No matter how hard teachers work, doing more of the same doesn’t necessarily address gaps in their students’ skills. This screen shows how the data from the massive spreadsheets can be presented in a way that goes from seeing the problem to seeing what needs targeting. Here, “traffic light colours” signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know where to target their teaching.
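
For the curious, here’s a toy sketch of the drill-down a bubble click performs: a skill holds the subskills that were actually assessed, so a “red” skill can be unpacked into specific teaching targets. The structure, names and percentages are illustrative assumptions only:

```python
# Toy sketch: each skill bubble carries its assessed subskills, so a weak
# skill can be unpacked into the specific subskills to target first.
# Names and percentages are invented for illustration.

skills = {
    "Grammar": {
        "pct_correct": 46,  # would show red in the traffic-light view
        "subskills": {
            "Using contractions": 39,
            "Subject-verb agreement": 52,
        },
    },
}

def drill_down(skill_name):
    """Unpack one skill into its assessed subskills, weakest first."""
    skill = skills[skill_name]
    print(f"{skill_name}: {skill['pct_correct']}% correct overall")
    for subskill, pct in sorted(skill["subskills"].items(), key=lambda kv: kv[1]):
        print(f"  target -> {subskill}: {pct}% correct")

drill_down("Grammar")
```
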
3) Give teachers Insight into the students right in their classes!
The fact that NAPLAN data is often 1–2 years old by the time it reaches school and public attention makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), and schools find out about the results toward the end of their year with those students. Here we are, almost upon 2018 NAPLAN, and MySchool is only now updated with 2017 NAPLAN data. How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the “Teacher Dashboard”, where a school’s NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher. Yes, the data may still be a year old, but now the classroom teacher can accommodate and differentiate what he or she does based upon their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather report card for writing, I can see which 4 students went backward from their 2015 to 2017 NAPLAN tests and which 5 achieved above expected growth targets. Then, when I click the NAPLAN Skill Focus card (and its backside), I get details about the top 4 (then 8 when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest. Again, clicking on the card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.
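
Here’s a minimal sketch of the growth classification behind a card like the NAPLAN Weather report. The expected-growth threshold is a placeholder I’ve assumed for illustration, not ACARA’s actual figure:

```python
# Toy sketch: bucket each student's growth between two test sittings.
EXPECTED_GROWTH = 40  # scaled-score points; an assumed placeholder value

def growth_weather(score_2015, score_2017):
    """Classify growth as backward, below expected or above expected."""
    growth = score_2017 - score_2015
    if growth < 0:
        return "went backward"
    if growth >= EXPECTED_GROWTH:
        return "above expected growth"
    return "below expected growth"

print(growth_weather(470, 455))  # went backward
print(growth_weather(470, 520))  # above expected growth
```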

So, to sum up, I see a big part of the problem being that classroom teachers have not been able to easily access the right kind of information to use the NAPLAN data (albeit a “snapshot” and a “diagnostic assessment being used as a high-stakes test” – two legitimate complaints against NAPLAN). In fact, we have run into the situation where a leading state association for schools takes the approach of helping schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only this year getting such access. We work with them to take charge of their remediation programs and initiatives, and we expect to see upward trends as they continuously improve their teaching and learning practices.

I’d love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government – not something your reporting has ever done, but these bash-ups tend to be what’s buzzing in the media. Perhaps a better, more productive approach is to use smart software to provide data insights?

Grattan Institute’s Adaptive Education Report

We welcome the Grattan Institute’s recent report, “Towards an adaptive education system in Australia”. In it, researcher Peter Goss argues that “our current education system is not fit for purpose given the complex challenges it faces.” These challenges are familiar to anyone interested in Australian education: flat or backward performance on important tests, the number of students not finding success after high school, and inequality between schools. Goss rightly identifies two keys to addressing these challenges: changes to education must be systemic, and they must be based on real evidence.

“The status quo is not working,” says Goss. We see this in NAPLAN Band ranges.

Many have argued this case for years and championed specific pedagogical approaches such as Problem-based Learning, Understanding by Design and STEM, to name only a few. In fact, I have been involved in many of these initiatives – and saw them fail to make the systemic change required and advocated for by Goss. We are past the era of needing “new ideas”; instead we need to put these (and many other) ideas to the test. The “adaptive education” model put forth by Goss will be familiar to those who have pursued a “closed-loop” or “continuous improvement” process. But, like Goss, we find few such efforts used in ways that effect whole-school or sector change. This is not for lack of trying on the part of schools and teachers, but for lack of good data.

Fortunately, using data as evidence is more possible today than it was a decade ago. The reason for this readiness is twofold: a growing cultural appreciation of “Big Data”, and the growing sophistication of the tools required to make these data insights available to schools and their communities.

For over four years, Literatu has been developing powerful analytical software for schools, and we can confirm a general “flat or backward” direction of student performance in NAPLAN scores. But we are seeing something very powerful as well. School leadership teams and whole staff rooms are excited and energised to engage in just the targeted type of teaching identified as essential by the Grattan Institute’s report. At issue was not an unwillingness of schools to take such action, but the fact that students’ learning gaps were buried in spreadsheets and hard-to-use software. A dawning realisation by schools that “there must be a better way” has happily led to a boom in schools’ use of Literatu’s NAPLAN Explorer. This diagnostic tool provides easy access to detailed information in a friendly dashboard, so that classroom teachers – not just school leaders – can quickly gain insights that naturally lead to targeted teaching and differentiation. What’s even better is that these teacher actions generate new data on student performance, which feeds back to validate or challenge the effectiveness of the interventions trialled. This is such an exciting time to be an educator: after decades of working “in the dark”, real evidence is at our fingertips and a single click away. To repeat a very apt phrase, data-inspired teaching “is like what you’ve always done, but unlike anything you’ve done before.”

We encourage schools interested in seeing how easily teachers can grow an adaptive educational system to contact us for a friendly online demonstration.


World Champ Astros and Analytics

Part of the Washington Post’s coverage of the Houston Astros’ World Series triumph is especially interesting. In spite of the foreboding title:

Astros’ World Series win may be remembered as the moment analytics conquered MLB for good

much of the article focuses on the importance of the human – in concert with data analytics.

I suggest the article develops two main elements, both of which are worthwhile for education to consider. First, this statement:

“Our game has evolved to the point to where everyone has to choose to what extent they apply” analytics, Hinch said. “We all have them — really smart people that are working behind the scenes to provide that kind of information. How you use them is going to be the competitive advantage. If we think we have different ways to maximize performance, we’re going to use them.”

What I like about this is the double insight: it’s a “given” that data analytics are important and that we all need to use them, but the real trick is what you do with the insights. This goes to an idea I’ve talked about for a while: schools should take a “Big Mother”, not a “Big Brother”, approach to collecting student data. Caring for students, and trying to make the most significant contribution to their learning, is what drives schools – leveraging data to better reach these goals should drive the use of data analytics.

The second aspect that stood out in the article was the important role Astros’ manager A.J. Hinch played as the very human communications link between the “decision scientists” and the players. Similarly, school leaders need to inform, but not overwhelm, teachers (and students and parents) about the role and use of data, without losing sight of the “main game”.