
A Competency Assessment Process

Natalie Werbitski
Meteorological Service of Canada
October 2012
Outline

• Define competency-based assessment
• Value of competency assessment
• Tools/techniques for competency assessment
• Selection and training of assessors
What is Competency Assessment?

• A process of verifying that a worker has the required knowledge and skills to do his or her job
• Must have an established set of standards
The Assessment Process

• Self-assessment
• Initial assessment
• Identify gaps
• Learning
• Reassessment/certification

Image credit: http://www.cognology.com.au/cbawhatis.htm


Why is assessment important?

• May be an organizational requirement
• To achieve certification
• To identify training needs
• To determine whether a training initiative had value
Organizational requirements

• Quality Management System principles:
  • Say what you do
  • Do what you say
  • Verify continually
  • Continuously improve

Image credit: www.greenwich.com.sg/qms2.gif


ISO 9001:2008 requirements for QMS:

• “An organization needs to demonstrate its ability to consistently provide product that meets customer and applicable statutory and regulatory requirements, and
• aim to enhance customer satisfaction through the effective application of the system, including processes for continual improvement of the system and the assurance of conformity to customer and applicable statutory and regulatory requirements.”
Achieve certification

• This may tie into organizational requirements as well
• Examples:
  • AMS certifications
  • P.Met (Canada)
To identify training needs

• Initial assessment
• Identify gaps in knowledge/skills
• Create training plans to target gaps
• Follow up with a re-assessment
To determine value of training

• Kirkpatrick Level 3 assessment model
  • Behaviour evaluation to assess the extent of applied learning back on the job
    • Is there a measurable change in activity/performance?
    • Were the relevant skills/knowledge applied?
    • Is the new level of skill/knowledge sustained over time?
A real-life example

• WMO/ICAO competencies for aeronautical meteorological forecasters (AMF)
  • By December 2013, every aeronautical meteorological service provider in the world must demonstrate that its forecasters meet these competencies
  • Here at CMAC/DWS we have established a rigorous methodology, selected and trained assessors, and started to carry the assessments out
Conducting a Competency Assessment

• Identify your standards
• Develop your methodology
  • Tools/techniques
  • Assessors
  • Documents
  • Resources
  • Timelines
• Carry out your methodology
• Fine-tune the process for next time
CMAC/DWS Competency Assessment Methodology

Direct observation
• Observing the forecaster at work, in an operational setting
• Pros:
  • Can glean a lot of information through observation
• Cons:
  • Resource-intensive
  • Some processes/routines are internalized; the desired competency may be difficult to observe
  • Can be a distraction for some forecasters (nerves)
Direct questioning
• Questions based directly on the situation during the direct observation
  • Example: contingency procedures
• Pros:
  • Very flexible/adaptable
• Cons:
  • Not every assessor finds it natural to come up with questions on the spot (no preparation)
  • Can interfere with the work
Post-mortem analysis
• An analysis of the forecaster's previous work
• Pros:
  • Easy to maintain objectivity
  • Clear-cut evidence
• Cons:
  • Many other competencies are not assessed (such as internal/external communication)
  • Time-consuming; tedious
Experiential questions
• Asking questions to assess knowledge based on the forecaster's experience
• Pros:
  • One can glean a lot of information from one well-crafted question
    • Example: What would you do if unforecast thunderstorms started to develop over your forecast area?
  • Can be prepared prior to the assessment (for example, winter-weather questions during the summer)
• Cons:
  • Can be difficult to avoid yes/no questions without practice
  • Questions can sometimes be leading, which doesn't truly assess the competency
  • Can interfere with the operational work that needs to be done
Written examination
• A traditional tool for assessing knowledge
• Pros:
  • Every forecaster is asked the same questions (fair, repeatable)
  • Once the exam and answer key are established, this tool is not quite as resource-intensive as some of the other tools
• Cons:
  • Assesses the knowledge base, not the skill base
  • Knowing about something isn't the same as being able to do it

Image credit: http://www.canadiangovernmentjobs.ca/images/gocwrittenexamsgroups.jpg
Case studies
• Asking the forecaster to submit case studies demonstrating their work
• Pros:
  • Involves an aspect of self-assessment, as the forecaster gathers and compiles his or her own evidence (builds self-awareness)
• Cons:
  • Does not assess every competency
    • It would not be feasible to ask for case studies demonstrating each competency
  • Can be time-consuming (finding time for each forecaster to put together case studies)
Simulator
• Tests particular competencies while the forecaster is not on the desk (simulates desk work)
• Pros:
  • Since the forecaster is not actually producing a product that goes out, the assessment process is less of a distraction
  • Similar simulations can be run for all forecasters (repeatable), testing the same competencies
• Cons:
  • A lot of preparation required
  • Very time-consuming to run for all forecasters in large offices
Convergence of Evidence

• Best to use a variety of tools rather than rely on one alone
• Look for evidence converging toward one conclusion
  • Does the forecaster meet the competency or not?
• If evidence diverges, try another tool/technique to gather more evidence supporting one conclusion or the other
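The convergence rule above can be sketched in code. This is purely illustrative: the tool names and the simple all-agree voting model are assumptions for the sketch, not part of the CMAC/DWS methodology.

```python
# Illustrative sketch: treat each assessment tool's finding as a vote and
# check whether the evidence converges. The unanimity rule here is an
# assumption for illustration only.

def converge(evidence):
    """Return the convergence outcome for one competency.

    evidence: dict mapping tool name -> True (competency demonstrated)
              or False (not demonstrated).
    """
    votes = set(evidence.values())
    if votes == {True}:
        return "meets competency"
    if votes == {False}:
        return "does not meet competency"
    # Mixed findings: the slide's advice is to gather more evidence
    # with another tool/technique.
    return "diverges - gather more evidence"

print(converge({"direct observation": True,
                "experiential questions": True,
                "written exam": True}))
print(converge({"direct observation": True,
                "written exam": False}))
```

In practice an assessor would weigh evidence qualitatively rather than count votes; the point of the sketch is only the decision structure: converged evidence yields a verdict, diverging evidence triggers another tool.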
Assessor Training

• Select the assessors
• Train the assessors
  • Workshop design
Assessor Selection
• Experienced forecasters who have expressed interest in this project
• Looking for individuals with the following traits:
  • Objective
  • Good listener
  • Self-disciplined
  • Observant
  • Ethical
  • Open-minded
  • Perceptive
  • Patient
Assessor Training Workshop

• Auditing principles
• Communication skills
• Practice!

Image credit: http://blog.ellusionist.com/wp-content/uploads/Practice-11309047.jpeg


Auditing Principles

• Quality Management System:
  • Say what you do
  • Do what you say
  • Verify it continually
  • Continually improve

Slide adapted from an “Internal Auditing Workshop” presentation by the Management Support Network Inc.
Auditing Principles
• Maintain control of the assessment
  • Be prepared
  • Be on time
  • Stay focused (remember your purpose)
  • Don't intimidate
• Be clear in your communication
  • Be professional and polite
• Remember you are there to gather facts
  • Don't make assumptions or presumptions
  • Don't make personal judgements
• Listen and observe
  • Take notes
  • Avoid misunderstandings
• Summarize and review results with the assessee

Slide adapted from an “Internal Auditing Workshop” presentation by the Management Support Network Inc.
Communication Skills

• Questioning techniques
  • Be brief, clear, and concise
  • Cover one point per question
  • Focus on the topic
  • Avoid yes/no questions
  • Use “Show me”, “Tell me”, and “Why/what/how” questions

Slide adapted from an “Internal Auditing Workshop” presentation by the Management Support Network Inc.
Communication Skills
• Listening techniques
  • Be neutral when listening
  • Restate/repeat/summarize what you have heard to make sure you have understood the assessee correctly
  • Be reflective: show the assessee that you care about what he/she is saying

Slide adapted from an “Internal Auditing Workshop” presentation by the Management Support Network Inc.
Practical application

• We found it very valuable to hold trial/mock assessments
• They allow the assessors to practice some of these tools/techniques:
  • Direct observation
  • Direct questioning
  • Experiential questioning
Summary

• Stages of the assessment process
• Reasons why competency-based assessment is beneficial
• Tools/methods of assessment
• Selection and training of assessors
Acknowledgements
• www.cognology.com
• www.businessballs.com/kirkpatricklearningevaluationmodel.htm
• “Internal Auditing Workshop” presentation by the Management Support Network Inc.
• http://www.ametsoc.org/amscert/
• http://www.iso.org/iso/home.htm
• http://www.eco.ca/certification/professional-meteorologists/about/962/
Thank you!

I appreciate your time and participation.


Any questions?
