Michelle Charlton is a VET consultant who works alongside VET PD Group, a community committed to sharing educational strategies and improving capability within the Australian vocational sector. An experienced speaker, Michelle will be presenting at the National VET Conference later this year. She is also the driving force behind Write On Resources, which specialises in bespoke, written-to-order course materials.
Michelle talks about some of the challenges associated with assessment in VET:
Assessment seems to be one of the most discussed, most written about, most questioned and most difficult aspects of vocational education and training to get right. Why is that?
What makes assessment so difficult to get right? My money is on the fact that, in large part, it is because assessment in the VET sector is exactly that – assessment for a vocation: a trade, a workplace role, a career calling. It requires an evaluation of practical skills and the application of knowledge specific to that profession. After all, how can you tell whether a hairdresser can cut your hair unless they do it? Or a mechanic fix your car? An early childhood educator change nappies on your toddler? An electrician safely wire your house?
These instances evoke a strong mental image of the actual task, and it’s easy to see why an evaluation of practical skills would be required. I would argue that many providers do this without difficulty, geared up as they are to accommodate the necessary performance components of a particular vocation.
But what happens when the vocation calls for effectiveness in a team, the use of emotional intelligence to foster positive workplace relationships, or the ability to test plan performance? Although no less important to their respective industries, these skills are a little less concrete than the haircutting or nappy changing mentioned above.
Already, we can imagine the impact of trying to assess these different types of skills – hard skills versus soft skills – through a one-size-fits-most approach.
Far beyond the scope of this article is another viewpoint, one that might analyse private versus public education providers, their commercial drivers, funding arrangements and allocations, and the influence all of that combined might have on how materials are procured, developed and delivered through the RTO. That thought aside, it bears recognising that providers have varying degrees of understanding of, and funds to invest in, assessment materials.
Myriad causes may explain why this is the case, and the rigour required for trainer/assessor accreditation is but one piece of the puzzle of what makes assessment so difficult to get right. Through no fault of the assessor, other downward pressures often mean they are asked to be the trainer, the assessor, the validator, the industry contact, and the content developer for the RTO. Each of those areas is a specialism in itself, so how many specialists-in-one is a realistic expectation?
Returning to the micro level of analysis of what makes assessment difficult, we are forced to acknowledge the content itself and user expectations of that content. We have a system designed to cater to personal and individual differences, to encompass a variety of delivery methodologies, and to allow for workplace, classroom, blended, and remote (online) assessment. Designing and administering appropriate content is a skill, as what is suitable for one delivery methodology is rarely compatible across the board. Note: there is a hint of generosity in the use of the word ‘rarely’.
The level of creativity required to ensure the right type of evidence is collected against the requirements of a unit of competency has skyrocketed compared with the “good old days”, when vocational education and training usually meant an apprenticeship or traineeship – both of which have inbuilt on-the-job time and, therefore, an opportunity for skills performance (and observations) in the work environment.
Creating an assessment task that is valid in terms of its authenticity and its ability to replicate workplace conditions, pressures, resources, equipment and expectations is one thing. Creating an assessment task that does all of that AND allows assessment to occur according to different cohort needs – on the job, in a simulated environment, or even in cyberspace – is the ongoing challenge.
There is no doubt technology has improved many aspects of our lives, but some things just cannot replace the critical eye of an experienced mentor. There are levels of response that cannot (and should not) be ‘automarked’ by an LMS, cost efficiencies that do not translate to long-term value, and certain unit requirements where a case study or a hypothetical this-is-what-I-would-do-if-ever-in-that-situation type of question just doesn’t cut it.
In embracing the concept of flexibility, and in its bid to be a competitive, viable option, has VET become an education marketplace that is trying to be everything to everyone? When, through the use of undiluted, industry-specific assessment, all it really needs to be is a solid, respected option for learning the skills of a vocation.
These are some of the challenges that present systemic shortcomings in the materials used to assess vocational courses. Combined with the other issues raised earlier in this piece – understanding of requirements (or lack thereof), resources invested in the development of adequate and appropriate materials (or lack thereof), specialists to develop such materials (or lack thereof), and realistic contexts and resources to replicate actual workplace conditions (or lack thereof) – we have a recipe for non-compliance. And lots of fodder for discussion, debate, industry articles and ongoing PD.
Michelle Charlton, July 2019