
Competency-Based Learning Material

PLAN TRAINING SESSIONS
CORE COMPETENCIES:

No.  Unit of Competency                           Module Title                                  Code
1.   Plan Training Sessions                       Planning Training Sessions                    TVT232301
2.   Facilitate Learning Sessions                 Facilitating Learning Sessions                TVT232302
3.   Supervise Work-Based Learning                Supervising Work-Based Learning               TVT232303
4.   Conduct Competency Assessment                Conducting Competency Assessment              TVT232304
5.   Maintain Training Facilities                 Maintaining Training Facilities               TVT232305
6.   Utilize Electronic Media in                  Utilizing Electronic Media in                 TVT232306
     Facilitating Training                        Facilitating Training

MODULE CONTENT
UNIT OF COMPETENCY : PLAN TRAINING SESSIONS
MODULE TITLE : PLANNING TRAINING SESSIONS
MODULE DESCRIPTOR : This module covers the knowledge, skills and attitudes in
planning a training session. It includes identifying the learner's training
requirements, preparing session plans, preparing instructional materials and
organizing learning, teaching and assessment resources.
NOMINAL DURATION : 40 hours
SUMMARY OF LEARNING OUTCOMES :
Upon completion of this module the students/trainees will be able to:
LO1. Identify learner’s training requirements
LO2. Prepare session plans
LO3. Prepare instructional materials
LO4. Prepare assessment instruments (Institutional)
LO5. Organize learning and teaching resources
LEARNING OUTCOME NO. 4

Prepare Assessment Instruments (Institutional)

Contents:

● Institutional Competency Evaluation

● Evidence Plan

● Table of Specification

● Written Test

● Performance Test

● Questioning Tool

Assessment Criteria

1. Relevant modules of instruction are identified, read and interpreted to
   identify the required evidence.

2. Evidence requirements are determined which will show full coverage of the
   training module to be assessed and are consistent with the performance of
   the training activities.

3. Suitable assessment methods are identified which are appropriate to the
   learning outcomes of the module of instruction.

4. Assessment instruments are prepared in accordance with the content and
   learning outcomes specified under the assessment criteria of the module of
   instruction.

5. Assessment instruments are checked for validity, fairness, safety and
   cost-effectiveness.

Conditions

The participants will have access to:

● Computers and relevant modules of instruction
● Evidence plan of the qualification
● Learning materials

Assessment Method:
● Portfolio
● Written Test/Oral interview
● Performance Criteria Checklist
LEARNING EXPERIENCES
LEARNING OUTCOME 4

PREPARE ASSESSMENT INSTRUMENTS (Institutional)

Activities:

1. Read Information Sheet 1.4-1 on Institutional Competency Evaluation.
2. Answer Self-Check 1.4-1. Compare answers to Answer Key 1.4-1.
3. Read Information Sheet 1.4-2 on the Evidence Plan.
4. Answer Self-Check 1.4-2. Compare answers with Answer Key 1.4-2.
5. Perform Task Sheet 1.4-2 on how to prepare an evidence plan. Evaluate
   performance using the Performance Criteria Checklist.
6. Read Information Sheet 1.4-3 on the Table of Specification.
7. Answer Self-Check 1.4-3. Compare answers to Answer Key 1.4-3.
8. Perform Task Sheet 1.4-3 on how to prepare a Table of Specification.
   Evaluate performance using the Performance Criteria Checklist.
9. Read Information Sheet 1.4-4 on the Written Test.
10. Answer Self-Check 1.4-4. Compare answers to Answer Key 1.4-4.
11. Perform Task Sheet 1.4-4 on how to construct a written test. Evaluate
    performance using the Performance Criteria Checklist.
12. Read Information Sheet 1.4-5 on the Performance Test.
13. Answer Self-Check 1.4-5. Compare answers with Answer Key 1.4-5.
14. Perform Task Sheet 1.4-5 on how to construct a Performance Test. Evaluate
    performance using the Performance Criteria Checklist.
15. Read Information Sheet 1.4-6 on the Questioning Tool.
16. Answer Self-Check 1.4-6. Compare answers to Answer Key 1.4-6.
17. Perform Task Sheet 1.4-6 on how to construct the questioning tool.
    Evaluate performance using the Performance Criteria Checklist.
18. Perform Job Sheet 1.4-6 on how to construct an Institutional Competency
    Evaluation Tool. Evaluate performance using the Performance Criteria
    Checklist.

Special Instructions:

This Learning Outcome deals with the development of the Institutional
Competency Evaluation tool, which trainers use in evaluating their trainees
after finishing a competency of the qualification.

Go through the learning activities outlined above to gain the necessary
information or knowledge before doing the tasks to practice making the parts
of the evaluation tool.

The output of this LO is a complete Institutional Competency Evaluation
Package for one competency of your qualification. Your output shall serve as
part of your portfolio for your Institutional Competency Evaluation for Plan
Training Sessions.

Feel free to show your outputs to your trainer as you accomplish them for
guidance and evaluation.

After performing the activities of LO4, you may proceed to LO5.
Information Sheet 1.4-1
Institutional Competency Evaluation

Learning Objective:

After reading this INFORMATION SHEET, YOU MUST be able to:

● determine the objectives of an institutional competency evaluation; and
● identify the parts of an Institutional Competency Evaluation Tool.

Evaluation is a very significant element of the teaching-learning process. It
is done to verify the acquisition of the knowledge, skills and attitudes
gained from the training.

As a trainer, it is a must that you know how to test or verify that the
assessment criteria are addressed during the training.

Institutional Competency Evaluation

Institutional Competency Evaluation is the assessment of the knowledge,
skills and attitudes acquired from the training. In CBT, evaluation is the
systematic collection and analysis of the data needed to decide whether a
trainee is competent or not yet competent.

The Institutional Competency Evaluation is administered by the trainer within
the training duration. Trainees should be evaluated after every competency.
No trainee should be allowed to transfer to another competency without having
been assessed.

For the purpose of CBT, assessments are usually given for the
following purposes:

● To validate the current competencies of trainees
● To measure how much trainees have learned in the training sessions given
● To help diagnose trainees' problems and guide future instruction
● To decide whether trainees are competent or not

The Institutional Competency Evaluation Tool

The competency evaluation tool should be carefully developed so that it will
be able to assess the four dimensions of competency, namely:

● Task Skills
● Task Management Skills
● Job Role and Environment Management Skills
● Contingency Management Skills

An analysis of the Modules of Instruction or the Competency Standards is
critical in the preparation of the assessment tool. The performance criteria
for the competency are the main basis for the competency assessment. You
should carefully examine your competency standards so that these criteria are
included as part of the evidences to be gathered during assessment.

Characteristics of Good Evaluation Tools

1. Reliability

This refers to the consistency of scores obtained by the same person when
re-examined with the same test on a different occasion. Your test is reliable
if it is consistent in testing what it is trying to test.

Factors that may affect reliability:

● Length of the test – the longer the test, the higher the reliability.

● Difficulty of the test – the bigger the spread of the scores, the more
  reliable the measured difference is likely to be. Items should not be too
  easy or too difficult.

● Objectivity – this is achieved if scores are independent of the subjective
  judgment of individual examiners.

To increase the reliability of the written test, we do item analysis, that
is, analyzing the degree of difficulty and the index of discrimination of the
test items. Standard written test items should be neither too easy nor too
difficult, and they should discriminate between those who learned and those
who did not learn anything.
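As an illustration only (the CBLM does not prescribe any software for this),
the two item-analysis indices can be computed with a short script. A minimal
sketch in Python, assuming the common textbook definitions: the difficulty
index is the proportion of examinees who answered an item correctly, and the
discrimination index is the difficulty in the upper 27% of examinees minus
the difficulty in the lower 27%; the function name and sample scores are
hypothetical:

    # Item-analysis sketch: scores holds one list per examinee,
    # with 1 = correct and 0 = wrong for each test item.

    def item_analysis(scores, item):
        """Return (difficulty, discrimination) for one test item."""
        totals = [sum(s) for s in scores]                 # total score per examinee
        ranked = [s for _, s in sorted(zip(totals, scores), reverse=True)]
        k = max(1, round(len(ranked) * 0.27))             # size of upper/lower groups
        upper, lower = ranked[:k], ranked[-k:]

        difficulty = sum(s[item] for s in scores) / len(scores)
        discrimination = (sum(s[item] for s in upper) / k
                          - sum(s[item] for s in lower) / k)
        return difficulty, discrimination

    # Example: six examinees, four items.
    scores = [[1, 1, 1, 0], [1, 1, 0, 1], [1, 0, 1, 0],
              [0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0]]
    for i in range(4):
        p, d = item_analysis(scores, i)
        print(f"Item {i + 1}: difficulty = {p:.2f}, discrimination = {d:.2f}")

Items with difficulty near 0 or 1, or with low or negative discrimination,
are the candidates for revision or deletion.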
2. Validity

This is the degree to which the test actually measures what it purports to
measure. It provides a direct check on how well the test fulfils its
functions.

Factors that influence the validity of a test:

● Appropriateness of test items;
● Directions;
● Reading vocabulary and sentence structure;
● Difficulty of items;
● Construction of test items – no ambiguous or leading items;
● Length of the test – sufficient length;
● Arrangement of items – from easy to difficult; and
● Patterns of answers – no patterns.

To ensure the validity of the evaluation tool, prepare an Evidence Plan based
on the CS. To increase the validity of the written test, you should prepare a
table of specifications.

3. Objectivity

The test must be fair to all examinees.

4. Discrimination

It must distinguish the good examinees from the poor ones.

5. Ease of Administration and Scoring

The test must have the right length and level of sophistication to do the
job.

Parts of the Competency Evaluation Tool

Evidence Plan

Written Test

Performance Test

Questioning Tool (with answers)

Information Sheet 1.4-2
Evidence Plan
Learning Objectives:

After reading this INFORMATION SHEET, YOU MUST be able to:

● explain the purpose of preparing an evidence plan;
● determine the sources of the contents of the evidence plan; and
● identify methods appropriate for evaluating performance criteria.

One essential part of Competency-Based Training delivery is the institutional
assessment. Assessment is the process of collecting evidence and making
judgments on whether competency has been achieved. The purpose of assessment
is to confirm that an individual can perform to the standards expected in the
workplace as expressed in the relevant competency standards.

In this lesson you will learn how to prepare the evidence plan of your
competency.

The Evidence Plan

In developing evidence gathering tools for an institutional assessment, the
first stage is to prepare an evidence plan.

Evidence plans are designed to:

● Serve as a planning tool
● Support the assessment process
● Assist with the collection of evidence
● Inform the learners of what is expected of them before they begin the
  assessment
● Serve as a guide for the trainer in determining the method of assessment
  to be used

In making an Evidence Plan you should have the Competency Standards (CS) of the
chosen competency and the Evidence Plan Template.
Critical aspects of competency are the performance criteria that are
listed in the evidence guide of the Competency Standard (CS) as critical.
These criteria are required to be demonstrated by the trainee for him to
be evaluated as competent. You should prepare an institutional
competency assessment tool that will show these evidences.

Parts of the Evidence Plan

1. Competency Standard – this is the title of your qualification.

2. Unit of Competency – the institutional evaluation tool is packaged by
   competency. The name of the competency is written in this portion.

3. Evidence Requirements – the criteria for judging the competency of the
   trainee. These are written in the competency standards. Critical aspects
   of competency should be marked with an asterisk (*). Refer to the CS for
   the identification of the critical aspects of competency.

4. Methods of Assessment – the methods of collecting evidences for each
   performance criterion. At least two methods of assessment should be chosen
   for each criterion to allow for corroboration of evidences (a short
   validation sketch follows the list of methods below).

Knowledge, skills and attitudes and the four dimensions of competency are to
be assessed. To do this, the following methods are recommended:

4.1 Written test – to test the acquisition of knowledge.

4.2 Performance test – to test the demonstrated skills.

    4.2.1 Demonstration method – this is the method used when the performance
          of a particular skill is to be assessed within the workshop.

    4.2.2 Observation method – this is used when the assessment is done by
          observing the trainee on the actual job site while the trainee is
          doing his job.

    4.2.3 Portfolio evaluation – this is used when projects or outputs are
          required to collect evidences of competency. In institutional
          evaluation, we use the Performance Criteria Checklist to evaluate
          the output/project.

4.3 Interview/questioning – this is used to verify evidences which are not
    clearly demonstrated during the performance test. This is also the part
    of the competency evaluation where you can ask questions to verify Job
    Role and Environment Management Skills and Contingency Management Skills.
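As an illustration only (the prescribed form is the Evidence Plan shown
below), the rule that every performance criterion needs at least two
assessment methods can be checked mechanically. A minimal sketch in Python;
the data structure and the sample criteria are hypothetical, not taken from
an actual CS:

    # Hypothetical evidence plan: each performance criterion is mapped to the
    # assessment methods ticked for it in the evidence plan matrix.
    evidence_plan = {
        "Journal entries are prepared correctly*": ["Written",
                                                    "Demonstration & Questioning"],
        "Source documents are checked":            ["Observation & Questioning"],
    }

    # At least two methods per criterion are needed for corroboration of evidences.
    for criterion, methods in evidence_plan.items():
        if len(methods) < 2:
            print(f"Needs a second method of assessment: {criterion}")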

Evidence Plan

Competency standard: Bookkeeping NC III
Unit of competency:  Journalize Transactions

Ways in which evidence will be collected [tick the column]:

● Demonstration & Questioning
● Observation & Questioning
● Portfolio
● Third-Party Report
● Written

The evidence must show that the trainee…

NOTE: * Critical aspects of competency

Information Sheet 1.4-3
Table of Specifications

Learning Objective:

After reading this INFORMATION SHEET, YOU MUST be able to:


● define a table of specifications;
● discuss the importance of preparing a table of specifications;
● determine the parts of the table of specifications; and
● explain how the table of specifications is prepared.

The Evidence plan is a plan for the institutional evaluation tool. After
preparing the evidence plan, we are now ready to prepare for the
development of the other parts of the evaluation tool such as the written
test.

To ensure the validity of your written test, you should prepare a table
of specification so that all contents to be tested have a representative
question.

In this lesson, you will learn how the table of specifications is prepared.

Table of Specifications


A table that shows what will be tested (taught) is the table of
specifications. For our purpose of institutional evaluation, we shall be
preparing a table of specifications for our written test. This will help us plan
how many items we need to prepare to cover all the contents or objectives
that we need to assess based on the evidence plan you previously prepared.

A table of specifications is a two-way table that matches the objectives or
content you have taught with the level at which you expect students to
perform. It contains an estimate of the percentage of the test to be
allocated to each topic at each level at which it is to be measured. In
effect, we have established how much emphasis to give to each objective or
topic.

Parts of the Table of Specifications

1. Objectives/Content/Topics – these are the contents to be tested.

2. Levels of learning – your questions shall be divided into the levels of
   learning: knowledge, comprehension and application.

   o Factual/Knowledge – recognition and recall of facts

     Example:
     The figure 1 in the symbol E6013 signifies
     a. Tensile strength
     b. Welding position
     c. Material thickness
     d. Maximum weld length

   o Comprehension – interprets, translates, summarizes or paraphrases given
     information

     Example:
     The megger is used to
     a. Measure the amount of illumination
     b. Determine the speed of an electric motor
     c. Measure the resistance of a lightning cable
     d. Test the insulation resistance of a circuit

   o Application – uses information in a situation different from the
     original learning context

     Example:
     To measure the voltage of a line, you should connect
     a. A voltmeter across the line
     b. An ammeter across the line
     c. A voltmeter in series with the line
     d. An ammeter in series with the line

3. Percentage/number of items

TABLE OF SPECIFICATIONS

Objectives/Content area/Topics           Knowledge  Comprehension  Application  # of items/% of test

learner's training requirements                                                 20%
Session Plan                                                                    20%
assessment instruments (Institutional)                                          20%
basic instructional materials                                                   30%
learning and teaching resources                                                 10%
TOTAL                                                                           100%

We also have to take into account the type of thinking skills we wish to
assess. Whether you use Bloom's taxonomy or another structure,
the levels of learning can help you identify the types of questions (or other
type of assessment) that are appropriate. For ease of use we have used only
three levels: knowledge (recall or recognition), comprehension (or
understanding) and application (or skill), and labeled the columns
accordingly. The important thing is to use levels of thinking that are relevant
for your students and have been incorporated in your instruction. At this
stage it can be helpful to mark an "x" or make a check mark in the cells to
show the levels at which each objective will be measured, as shown in the
example below.

TABLE OF SPECIFICATIONS

Objectives/Content area/Topics           Knowledge  Comprehension  Application  # of items/% of test

learner's training requirements          x (10%)    x (5%)         x (5%)       20%
Session Plan                             x (5%)     x (5%)         x (10%)      20%
assessment instruments (Institutional)              x (10%)        x (10%)      20%
basic instructional materials            x (10%)    x (10%)        x (10%)      30%
learning and teaching resources                     x (5%)         x (5%)       10%
TOTAL                                    25%        35%            40%          100%

At this point we recognize that 25% of our test is to be on knowledge, 35% on
comprehension, and 40% on application. This does not mean that we must have
25 knowledge questions; it does mean that the score on the test will reflect
comprehension and application in nearly equal amounts, and knowledge to a
lesser degree than comprehension or application.

It may be that at this point you want to compare the test(s) provided
by the textbook publisher with your completed table of specifications. If they
match and you think the questions are well written, you may decide to use
the test (or parts of the test) provided with the text. On the other hand, you
may find that it will be necessary for you to create a test to provide an
accurate assessment of what the students in your class have learned.

One question frequently asked is how many questions are needed to adequately
sample the content representing an objective or topic. Increasing the number
of questions increases the probability that we will have a good estimate of
what the learner knows and can do.

When translated to the number of items per topic, the Table of Specifications
for a 40-item test may look like this (a short script after the table shows
how the counts can be derived):

TABLE OF SPECIFICATIONS
(Test item distribution by level of learning)

Content/Objectives               Factual    Comprehension  Application  Total No.  Percentage
                                 Knowledge                              of Items   (%)

Training requirements            4          2              2            8          20%
Session Plan                     2          2              4            8          20%
assessment instruments                      4              4            8          20%
basic instructional materials    4          4              4            12         30%
learning and teaching resources             2              2            4          10%
Total                            10         14             16           40         100%

Note: This is a sample. The number of items is not prescribed. The trainer should
decide on the number of items based on the contents of the competency.
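As an illustration only (the counts above can of course be worked out by
hand), translating the percentage cells of the Table of Specifications into
item counts is simple arithmetic: items per cell = cell percentage x total
items. A minimal sketch in Python, using the hypothetical weights from the
sample tables above:

    # TOS weights per topic: (knowledge %, comprehension %, application %),
    # taken from the sample Table of Specifications above.
    weights = {
        "Training requirements":           (10, 5, 5),
        "Session Plan":                    (5, 5, 10),
        "assessment instruments":          (0, 10, 10),
        "basic instructional materials":   (10, 10, 10),
        "learning and teaching resources": (0, 5, 5),
    }
    total_items = 40

    for topic, cells in weights.items():
        # percentage of the whole test -> number of items per level
        k, c, a = (round(pct / 100 * total_items) for pct in cells)
        print(f"{topic}: K={k} C={c} A={a} total={k + c + a}")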

For purposes of validating the current competencies of the trainees or for
identifying mastered contents, item placement may be identified in the Table
of Specifications for easier analysis. At this point you also have to decide
how many questions are needed to measure learning, what type of questions
will be asked and whether a written assessment is sufficient to measure the
competency. In most cases, for skills training, performance evaluation with
interview may be more appropriate as an assessment instrument, but the
effectiveness of written assessment instruments may be harnessed through the
ingenuity and skills of the trainer. If, however, the trainer decides on a
performance evaluation, it should be reflected in the evidence plan.

Information Sheet 1.4-4
Written Test

Learning Objectives:

After reading this Information Sheet, you must be able to:

● explain the advantage of preparing a reliable test item;
● determine the type of test appropriate for testing knowledge contents; and
● enumerate guidelines in preparing a written test.

Evaluation of competency should assess knowledge, skills and attitudes. A
written test is a method of assessment which can measure the knowledge,
skills and attitudes learned in a training program, but sometimes trainers
fail to develop questions that test the level of skills and attitudes.

In this lesson, we will discuss some tips and guidelines in preparing the
written test. The written test that you will write after this lesson should
follow the guidelines in preparing a test item.

In developing test items, always consider the five (5) characteristics of a
good test – validity, reliability, objectivity, discrimination, and ease of
administration and scoring.
As in the construction of a workable and functional project in shop work,
test construction should follow the same steps. In the construction of a
competency assessment instrument, the following steps are recommended:

1. Examine the established Training Regulations and determine your
   objectives. This will help in the analysis of the basic skills and
   knowledge requirements of the trade.

2. Construct the table of specifications. This will be your blueprint in
   constructing individual test items; it will serve as a guide in the
   preparation of a set of competency assessment methodology for a certain
   trade.

3. Construct more test items than the number required for a set of
   Competency Assessment Instruments. This will facilitate item banking and
   will give an allowance for correction when the test items are deliberated,
   whereby some items might be deleted.

4. Assemble the items for the test. After grouping the items by type, arrange
   them such that related items are together. The reason for this is obvious:
   it saves examinee time as the test is taken, and it will be easier to
   point out where the examinee failed. In assembling items for the test, the
   specification table should be followed.

5. Write clear and concise directions for each type of question. The
   directions should tell the examinee what to do, how to do it and where to
   place the responses. They should also contain an example taken from the
   subject matter being tested.

6. Study every aspect of the assembled test. After the test is assembled and
   directions are written, it is a good policy to lay it aside for several
   days, then pick it up again and review each part critically. Consider each
   item from the point of view of the workers who will take the competency
   assessment. Try to determine which items are ambiguous. Check the grammar
   and be sure that the words used will be understood by the workers who will
   take the competency assessment.

The written test that we shall prepare as part of the institutional
assessment will largely measure the acquisition of knowledge. Skills and
attitudes shall be measured using performance tests with questioning.

Guidelines for Teacher-Made Tests as to Format

● Include easiest items first.

● Group similar item types together, e.g. matching, completion, etc.

● Put all of an item on the same page. Avoid splitting a matching exercise or
  the responses to a multiple-choice question between pages.

● Number continuously.
● Write clear, precise directions.

● For ease of correcting, place blanks for responses to one side of the paper, or
use a separate answer sheet.

● Avoid patterned responses in true-false, multiple-choice, or matching
  exercises.

● Proofread the test carefully for clarity, errors, etc.

● Make sure copies of the test are dark and legible.

Pointers in the formulation of test questions for written test

● Keep in mind that it is not possible to measure all outcomes of instruction
  with one type of test.

● Devise your items so that they require the trainee to actually apply things
  learned rather than merely recall or recognize facts.

● Make certain that the type of test item used for measuring each objective
  is the one that will best measure the objective.

● Avoid "tricky" or catchy questions. Do not construct puzzling items in
  which hidden meanings or subtle clues provide the correct answer.

● Do not lift statements directly from books and use them as test items.

● Check to make sure that no item can be answered simply by referring to the
  other items. Make each item independent of the answers to the others.

● Do not include an item for which the answer is obvious to a person who does
  not know the subject matter.

● Word the items in the simplest manner possible. Confine the items to the
  vocabulary level of the examinees. State questions clearly and eliminate
  ambiguous items.

● Arrange the items so that the responses will not form a particular pattern.

Guidelines for Constructing Effective True-False Items

● Use true-false items only when there is a clear-cut true or false answer to
  the question.

● Construct items that are entirely true or entirely false.

● Avoid using specific determiners, i.e. "never", "always", "generally".
  (Statements that include all or always are usually false; those including
  sometimes are usually true.)

● Rephrase textbook and lecture material rather than quoting it directly.

● State items positively rather than negatively. If a negative is used,
  underline words like NO or NOT.

● Construct approximately equal numbers of true and false statements and
  avoid setting up an answering pattern.

● Avoid testing for trivial details.

● If a controversial statement is used, quote the authority.

Guidelines for Constructing Effective Multiple Choice Items

● Present a single definite concept in the stem.

● Place all common wording in the stem.

● Make the alternatives grammatically consistent with the stem and with each
  other.

● Avoid verbal associations between the stem and the correct response
  (grammatical clues).

● Construct items with a single best answer.

● Include four or five alternatives.

● Make all choices plausible.

● Arrange alternatives in a logical sequence.

● Avoid using opposites or mutually exclusive alternatives.

● Eliminate option length and specificity as clues to the correct response.
  Make options of similar length.

● Delete specific determiners from the alternatives.

● Avoid using "all of the above" and "none of the above" unless these are
  used in questions where "all of the above" and "none of the above" are not
  desirable responses.

● Phrase stems positively unless emphasizing an exception. If the desired
  response is an exception to the question, underline except or not in the
  question.

● Vary the position of the correct answer in a random manner (see the sample
  script below).
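As an illustration only, the last pointer – varying the position of the
correct answer in a random manner – can be done mechanically when assembling
the test. A minimal sketch in Python; the item format is hypothetical, and
the sample question is the megger item used earlier in this module:

    import random

    # Hypothetical item format: (stem, options, index of the correct option).
    item = ("The megger is used to",
            ["Measure the amount of illumination",
             "Determine the speed of an electric motor",
             "Measure the resistance of a lightning cable",
             "Test the insulation resistance of a circuit"],
            3)

    def shuffle_item(stem, options, answer_index):
        """Shuffle the options while keeping track of the correct answer."""
        correct = options[answer_index]
        shuffled = options[:]            # copy, so the original item is untouched
        random.shuffle(shuffled)
        return stem, shuffled, shuffled.index(correct)

    stem, options, key = shuffle_item(*item)
    print(stem)
    for label, option in zip("abcd", options):
        print(f"{label}. {option}")
    print("Answer:", "abcd"[key])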

Information Sheet 1.4-5
Performance Test
Learning Objectives:

After reading this INFORMATION SHEET, YOU MUST be able to:

● define performance evaluation; and
● differentiate the procedures of a Job Sheet from those of the instructions
  for demonstration in an institutional competency evaluation.

Evaluation of competency covers knowledge, skills and attitudes. To assess
knowledge, we can use a written test as a method of assessment, but to
effectively assess the skills and attitudes acquired by the trainee in CBT,
we should use performance evaluation, which will include a demonstration of
the skill and an interview to follow up the demonstration.

In this lesson, the format and structure of the prescribed performance test
shall be discussed to help you develop your own instructions for
demonstration.

Performance Evaluation

Performance evaluation is the formal determination of an individual's
job-related competencies and their outcomes.

Performance evaluation is accompanied by interview questions which are used
during the actual conduct of the test. This is to support the evidences
gathered by the facilitator/trainer.

GUIDELINES IN FORMULATING PERFORMANCE TEST

This is the practical portion of the competency assessment instrument. This
part measures the skills possessed by the examinee in relation to the
occupation. It consists of the General and Specific Instructions, the List of
Materials, Equipment/Tools and the Marking Sheets.

A. GENERAL INSTRUCTIONS

This refers to the overall conduct of the test (before, during and after),
which concerns both the testing officer and the examinee. This part of the
competency assessment specifies the do's and don'ts inside the testing area.

The format of the general instructions includes:

● Performance or what must be done
● The conditions or what is given
● The standard of performance expected of the examinee

B. SPECIFIC INSTRUCTIONS

This provides the instructions which the examinee must follow in the
performance of the test.

C. LIST OF MATERIALS, EQUIPMENT

This provides the listing of the materials and equipment/tools needed in the
performance of the skills test. It also contains the complete specifications
of each item in the listing.

Pointers to follow in the construction/formulation of a good Test of Skills:

● The test coverage must be consistent with the job description and skills
  requirements.

● The test must not take more than 8 hours to complete.

● The test statement must specify the exact time within which the examinee is
  expected to finish the task and the tools/equipment that will be issued to
  the examinee.

● The work performance/specimen or whatever is being tested must be
  observable and measurable.

● The test should be feasible. Do not design tests which make use of rare or
  too expensive equipment.

● Where applicable, there must be a working drawing which is clear and
  accurate.

● The standard performance outcome, if possible, should be stated, such as
  surface finish, clearance or tolerance and the number of allowable errors.

● Directions must be clear, simple, concise and accurate.

Information Sheet 1.4-6
Questioning Tool

Learning Objectives:

After reading this INFORMATION SHEET, YOU MUST be able to:

● determine the purpose of the questioning tool;
● enumerate the types of questions that are in the questioning tool; and
● explain how corroboration of evidences will be achieved using the
  questioning tool.

Corroboration of evidences should be achieved when gathering evidences of
competency. In case the evidences from the written test and the performance
test are not enough to decide on the competency of a trainee, the questioning
tool should be used.

In this lesson, we shall discuss the structure of the questioning tool so
that it will help the trainer gather evidences of the knowledge, skills and
attitudes and the four dimensions of competency needed for the competency
being assessed.

Questioning Tool

The questioning tool is a must in an institutional competency evaluation tool
package. It will be used to verify evidences that were not clearly
demonstrated in the other methods of assessment, such as the written test and
the performance test.

The questioning tool should be able to evaluate the four dimensions of
competency. To be able to do this, your questioning tool should contain
questions:

● to follow up the demonstration of task skills and task management skills.

  All possible questions should be written here. Although the trainer is not
  required to ask questions about evidences already observed in the
  demonstration of skills, you should write all possible questions so that
  these questions are ready for use.

● to verify OHS practices.

  Safety practices are a very important aspect of the demonstration. List
  down questions on safety related to the competency being assessed.
  Questions should concentrate on safety practices for the competency being
  assessed.

● to verify Job Role and Environment Management Skills.

  Questions that will verify the responsibility of the worker towards his
  customers, co-employees, employer and the environment are very important
  because oftentimes this dimension of competency needs to be verified using
  these questions; it is not demonstrated in most demonstration tests.

● to gather evidences for Contingency Management Skills.

  Infrequent events may arise on the job that would require the worker to
  adjust. These are the contingency management skills questions that you need
  to construct to verify this dimension of the competency.

● on knowledge of laws, rules and regulations.

  Knowledge of laws, rules and regulations critical to the job should also be
  verified. Prepare questions to gather evidences for the specific
  competency.

Questioning Tool Template
