Clearly, continuous improvement has become much more important for RTOs as many more companies realise the value of a robust approach to the collection and management of improvement data. One of the keys to improving processes and resources is to understand cause and effect relationships. We often know what the effect is but can't figure out what caused it.
The National Regulator, which is soon to oversee regulation of VET providers, is being introduced as one of many reforms. One of the primary reasons is that the various State and Territory regulators' auditors have written audit reports which highlight enormous variations in the interpretation of the national framework, the AQTF. This short think piece is not a diatribe aimed at regulators; in fact, I have enormous respect for the difficult job they do.
This article is more about cause and effect and how we, as a sector, must learn.
I doubt that any auditor in the country will disagree that the most prominent non-compliance resulting from audits relates to assessments, under the old Standard 8 and the newer Element 1.5 of the AQTF. For those who understand and/or practise Pareto's principle, we might say that 80% of non-compliances stem from 20% of the standards or elements, and the poor assessment is the most prominent. What is really concerning is that it has been so for a long time.
So we have the effect. But can we find the cause? Is there in fact one single cause?
Let us look at some common factors.
Assessments are generally based around a unit of competency or a cluster of units of competency. Since the first training packages were endorsed by the National Training Framework Committee in 1997 (BSA, TDT, ICT and MEA), the format of training packages has not significantly changed. Sure, some terminology has; key competencies have been replaced by Employability Skills, etc. But the relationship between elements, performance criteria, the range statement, the evidence guide and so on hasn't changed. We are thirteen years on from those early days, yet we still struggle to write assessments based on these 'syllabus'-style documents. Millions of dollars have been spent on professional development programs over these years aimed at showing people how to interpret units and write assessments, but we must ask the question: 'are we, as a sector, getting better at writing assessments?' Some might say that we are not. Audit findings would certainly endorse that.