Teachers achieving “Conscious Competence” with technology

What can be done to ensure that technology truly improves learning outcomes?

For the last twenty years, educators, governments, technology companies and publishers have built a narrative that introducing a new technology, be it a digital book, an LMS, an SIS, a PC, a tablet or an iPad, would deliver an immediate improvement in student learning.

The reality to date is that no-one has established an accepted nexus between learning outcomes and the use of technology. In 2012, Higgins and his colleagues, in their meta-analysis of the numerous studies on the impact of digital technology on student learning, concluded: “Taken together, the correlational and experimental evidence does not offer a convincing case for the general impact of digital technology on learning outcome” (Higgins et al., 2012).

Multiple teacher surveys suggest that a large proportion of teachers’ technology skills lie somewhere between Conscious Incompetence and Conscious Competence; that is, somewhere between teachers being aware they lack specific technology skills and knowing that the skills they do have are not yet second nature or fluent. While this remains the case, the foundations on which technology can be relied upon to support stronger learning outcomes need to be shored up.

Image: the Conscious Competence learning matrix (© http://www.athleteassessments.com/conscious-competence-learning-matrix/)

We believe the tipping point at which technology will significantly contribute to stronger learning outcomes will be when teachers reach the level of Unconscious Competence with technology. This is when teachers, as a natural part of their professional repertoire, enhance pedagogy and student outcomes by blending the art of teaching with efficiencies and data delivered by supportive technology.

We have five suggestions we think will help technology improve learning outcomes.

1. Support teaching with technology.
Research consistently shows that teachers, not technology, have the biggest influence on learning outcomes. It is, however, far easier to make technology accessible than it is to lift teacher skills into a state of unconscious competence. We must refocus on supporting and encouraging teachers with intuitive tools that build the capabilities to better inform teaching and learning.

2. Start measuring learning – stop the fixation on managing learning.
Learning management is not learning measurement. For too long we have invested in technology that does not inform daily teaching and learning in a precise way for each student. The idea that ‘I have taught it because it’s in the LMS’ has become a proxy for ‘they have learned it’, without any independent check on what (if anything) has actually been learned. Technology needs to help teachers assess and measure learning.

3. Give teachers the tools to personalise teaching.
We would argue that the perceived need for more standardised ‘digitised’ curriculum content distracts teachers from answering three critical questions every day: What does each student know now? What is each student ready to learn next? Where should I target and adapt my teaching? Personalised teaching happens naturally when teachers with an unconscious competence for technology are supported with quantitative capabilities.

4. Leverage data to inform teaching.
The most under-utilised, un-leveraged asset of every school is the learning data it produces every day. Schools must build a data capability and culture that surfaces insights and helps teachers target their teaching, improve feedback and lift learning outcomes. As the Scottish writer Arthur Conan Doyle put it, “It is a capital mistake to theorise before one has data”. Yet for centuries the education industry has implemented teaching practices without the data to prove their efficacy.

5. Extend strategic outcomes with data and technology.
Improving teaching and learning outcomes using data is operationally very effective, and the same data builds the foundation for the next strategic step. Machine learning and assistive intelligence (commonly referred to as artificial intelligence) offer the capability to scale finite teacher resources by automatically predicting outcomes from captured learning data. A new teacher-dedicated digital assistant can suggest, adapt and prescribe personalised learning on demand.

Mark Stanley – CEO – Founder – Literatu
http://www.literatu.com
mark@literatu.com

 

Get Insights from NAPLAN data – in 3 Screens

In response to recent media coverage of flat or backward NAPLAN results, I engaged in a correspondence with a reporter.  Here’s what I wrote:
The perspective I can offer focuses on how schools get access to the data, as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (screenshots of our software). That said, my point is not to flog our software but to highlight the value of EASY ACCESS to data insights, and how, without it, the lack of growth is not a surprise but in fact exactly what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows 6 years of NAPLAN Band achievement across Years 3, 5, 7 & 9. You can see that the real story here is one of No Growth – the results are essentially flat. This is the story your report told today. The reason I see this slightly differently is that we have schools that are only just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to see this data (and the next screens) easily. The point is this: without easy access that unpacks the band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements? Instead, schools and teachers worked very hard either doing the same things they have always done or guessing at what needs fixing.
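For readers curious about the mechanics behind a picture like this, the aggregation is simple. Below is a minimal sketch in Python (pandas), assuming a hypothetical CSV export with calendar_year, year_level and band columns; it is an illustration of the idea, not our actual code or schema.

```python
import pandas as pd

# Hypothetical flat export of NAPLAN results: one row per student per test sitting.
results = pd.read_csv("naplan_results.csv")  # assumed columns: calendar_year, year_level, band

# Average band achieved, by calendar year and cohort (Years 3, 5, 7 and 9).
trend = (
    results.groupby(["calendar_year", "year_level"])["band"]
    .mean()
    .unstack("year_level")
    .round(2)
)

# A six-year table of essentially unchanged means is the "No Growth" picture in numbers.
print(trend)
```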
2) Unpacking the Data – from Skill problems to identifying Subskills 
No matter how hard teachers work, doing more of the same doesn’t necessarily address gaps in their students’ skills. The second visualisation shows how data from the massive spreadsheets can be presented in a way that moves from seeing the problem to seeing what needs targeting. Here, “traffic light” colours signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know where to target their teaching:
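A rough sketch of the “traffic light” idea, for the technically inclined. It assumes a hypothetical item-level dataset (one row per student per question, tagged with skill and subskill, with correct recorded as 0 or 1) and illustrative thresholds; neither the column names nor the cut-offs are the ones used in our software.

```python
import pandas as pd

# Hypothetical item-level export: one row per student per question attempted.
items = pd.read_csv("naplan_items.csv")  # assumed columns: student_id, skill, subskill, correct

def traffic_light(pct_correct: float) -> str:
    """Illustrative thresholds only: red = needs targeting, amber = watch, green = on track."""
    if pct_correct < 0.5:
        return "red"
    if pct_correct < 0.7:
        return "amber"
    return "green"

# Roll the question-level data up to skill level and flag each skill with a colour.
by_skill = items.groupby("skill")["correct"].mean()
flags = by_skill.map(traffic_light)

# "Clicking a red bubble" is equivalent to drilling into that skill's weakest subskills.
for skill in flags[flags == "red"].index:
    subskills = items[items["skill"] == skill].groupby("subskill")["correct"].mean()
    print(f"\n{skill} (red): weakest subskills")
    print(subskills.sort_values().head(3).round(2))
```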
3) Give teachers Insight into the students right in their classes!
The fact that NAPLAN data is often 1-2 years old by the time it reaches school and public attention makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), schools find out about the results toward the end of their year with those students, and here we are, almost upon the 2018 NAPLAN, with MySchool only now updated with 2017 NAPLAN data. How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the “Teacher Dashboard”, where a school’s NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher. Yes, the data may still be a year old, but now the classroom teacher can accommodate and differentiate what they do based upon their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather report card for writing, I can see which 4 students went backward from their 2015 to 2017 NAPLAN tests and which 5 achieved above expected growth targets. Then, when I click the NAPLAN Skill Focus card (and its flip side), I get details about the top 4 (then 8 when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest. Again, clicking on the card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.
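Behind the dashboard cards, the growth calculation is conceptually very small. Here is a hedged sketch, again with hypothetical column names and an illustrative growth target rather than the official one:

```python
import pandas as pd

# Hypothetical export of writing results for the same students across two test years.
scores = pd.read_csv("writing_scores.csv")  # assumed columns: student_id, name, score_2015, score_2017

# Illustrative scaled-score growth target; not an official ACARA figure.
EXPECTED_GROWTH = 40

scores["growth"] = scores["score_2017"] - scores["score_2015"]

went_backward = scores.loc[scores["growth"] < 0, "name"].tolist()
above_expected = scores.loc[scores["growth"] > EXPECTED_GROWTH, "name"].tolist()

print("Went backward since 2015:", went_backward)
print("Above expected growth:", above_expected)

# Sorting a class by a clicked skill is simply ordering per-student results for that skill,
# e.g. class_skill_scores.sort_values("spelling_pct") would list who needs the most help first.
```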

So, to sum up, I see a big part of the problem being that classroom teachers have not been able to access the right kind of information easily enough to use the NAPLAN data (albeit a “snapshot” and a “diagnostic assessment being used as a high-stakes test” – two legitimate complaints against NAPLAN). In fact, we have run into the situation where one of the leading state associations for schools takes the approach of helping schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only this year getting such access. We work with them to take charge of their remediation programs and initiatives, and we expect to see upward trends as they continuously improve their teaching and learning practices.

I’d love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government. That is not something your reporting has ever done, but these bash-ups tend to be what’s buzzing in the media. Perhaps a better, more productive approach is to use smart software to provide data insights?

Grattan Institute’s Adaptive Education Report

We welcome the Grattan Institute’s recent report, “Towards an adaptive education system in Australia.”  In it, researcher Peter Goss argues that “our current education system is not fit for purpose given the complex challenges it faces.”  These challenges are familiar to anyone interested in Australian education: flat or backward performance on important tests, the number of students not finding success after high school, and inequality between schools.  Goss rightly identifies two key aspects of addressing these challenges: changes to education must be systemic, and they must be based on real evidence.

“The status quo is not working,” says Goss. We see this in NAPLAN Band ranges.

Many have been arguing this case for years and have championed specific pedagogical approaches such as Problem-based Learning, Understanding by Design and STEM, to name only a few.  In fact, I have been involved in many of these initiatives – and saw them fail to make the systemic change required and advocated for by Goss.  We are past the era of needing “new ideas”; instead we need to put these (and many other) ideas to the test.  The “Adaptive Education” model put forth by Goss will be familiar to those who have pursued a “closed-loop” or “continuous improvement” process.  But, like Goss, we find few such efforts used in ways that effect whole-school or sector change.  This is not for lack of trying on the part of schools and teachers, but for lack of good data.

Fortunately, using data as evidence is far more feasible today than it was a decade ago.  The reasons for this readiness are twofold: a growing cultural appreciation of “Big Data”, and the growing sophistication of the tools required to make these data insights available to schools and their communities.

For over four years, Literatu has been developing powerful analytical software for schools, and we can confirm a general “flat or backward” direction of student performance in NAPLAN scores.  But we are seeing something very powerful as well.  School leadership teams and whole staff rooms are excited and energised to engage in exactly the kind of targeted teaching identified as essential by the Grattan Institute’s report.  At issue was not an unwillingness of schools to take such action, but the fact that students’ learning gaps were buried in spreadsheets and hard-to-use software.

What seems to be a dawning realisation by schools that “there must be a better way” has happily led to a boom in schools’ use of Literatu’s NAPLAN Explorer.  This diagnostic tool provides easy access to detailed information in a friendly dashboard so that classroom teachers – not just school leaders – can quickly gain insights that naturally lead to targeted teaching and differentiation.  Better still, these teacher actions generate new data on student performance, which feeds back to validate or challenge the effectiveness of the interventions trialled.  This is an exciting time to be an educator because, after decades of working “in the dark”, real evidence is at our fingertips and a single click away.  To repeat a very apt phrase, data-inspired teaching “is like what you’ve always done, but unlike anything you’ve done before.”

We encourage schools interested in seeing how easily teachers can grow an adaptive educational system to contact us for a friendly online demonstration.

 

 

Future of Technology Essay Contest

How Will Technology Impact Our Future?

Prompt: Reflect on how technology touches your life and how rapid advancements might change the way we live, learn, work or connect with others in the future.

Teachers can Register here

Here’s an animation of possible topics to get you thinking!

Fantastic Resource Page on Assessment of PBL

Andrew Miller has gathered a great list of links to help teachers interested in strategies for assessing students engaged in Problem-based Learning.

What strategies have you found useful? How often do you use PBL?

Data Helps Students Dodge a Hurricane

Eighth-grade US teacher Sarah Beachkofsky tells the story of how she was able to help students progress their learning even after suffering the devastation of Hurricane Matthew. As an educator committed to using data to learn about her students as well as to motivate them, she shares an insightful and compelling story. When you lose precious class time (due to a natural disaster or a school-wide sports carnival 🙂), how can data help make up for lost time?

How do you use data to support your goals for students?

Please share your ideas!
Image thanks to NASA Goddard Space Flight Center.