NAPLAN: only the test to blame for results

The disappointing preliminary results from the 2015 National Assessment Program – Literacy and Numeracy (NAPLAN) indicate a problem with the test rather than the teaching, according to a Charles Sturt University (CSU) literacy expert.

Preliminary results from the NAPLAN tests, released in early August, showed no major improvement in most measures since the tests were introduced in primary and secondary schools in 2008.

Senior lecturer at CSU’s School of Teacher Education, Dr Jae Major, believes this is consistent with international results.

“The fact that improvements are not being seen mirrors what has happened internationally with high-stakes standardised testing,” Dr Major said.

“There is usually a short-term improvement followed by a plateauing of results. The typical response is to call this stagnation and blame teachers, suggesting they get ‘back to basics’ in curriculum and pedagogy. In other words, blame anything except the test itself for the problem.”

Dr Major believes that the plateauing of results should encourage parents and education professionals to question the test rather than place the blame on teachers.

“It can be easy to blame teachers for results, but there is increasing evidence that high-stakes, national standardised testing has little impact on achievement, which begs the question of why we are spending so much money on something that doesn’t enhance learning outcomes,” Dr Major said.

“Tests such as NAPLAN provide a snapshot in time, and measure a very limited set of specific and narrow skills. This is not an indication of the holistic and complex ways we use literacy skills in the real world. The resulting statistical data seems to be used purely for the purposes of comparing schools.

“The fact that changing the persuasive writing task within the test changed results for Year Three students supports the notion that the test itself may be flawed, raising questions about its use and worth.”

Dr Major believes there are a number of ways NAPLAN could be improved to become more effective and valuable.

“One way to make NAPLAN more useful is to change it from being a high-stakes test that is administered on a single day to a suite of standardised tests that are nationally normed,” Dr Major said.

“NAPLAN would be more useful if results were provided to teachers in a more timely way so they could be used to inform teaching. Teachers should be able to use NAPLAN, along with other assessment tools, to build a rich understanding of the progress and areas of need amongst their students.”

Dr Major hopes the overall plateauing of results does not overshadow the positive results in reading.

“Over 90 per cent of children in most parts of the country are achieving at or above the national minimum standard in reading,” Dr Major said.

“I think that is a great achievement for which our students and their teachers should be congratulated.”

Dr Major also believes that teachers need more support when interpreting NAPLAN results and translating them to learning programs.

“The missing component in the process is professional learning to help teachers understand standardised test results in depth, and how to use them, along with the other assessment data they collect, to develop targeted learning programs,” Dr Major said.


This article was extracted from the September edition of Independent Voice.


Authorised by Terry Burke, Independent Education Union of Australia – Queensland & Northern Territory Branch, Brisbane.