
The Art of Questioning

TRACE Faculty Development Program


26-27 October 2012

Dr. Josephine S. Cacdac, VPAA


Three features of a "good" question:

 It requires more than recall or reproduction of a skill;
 It has an educative component; that is, the student will learn from attempting to answer it and the teacher will learn about the student from the attempt;
 It is, to some extent, open; that is, there may be several acceptable answers.
Characteristics of a Good Question
(Capulong)

1. It is simple and clear.
2. It is definite.
 It permits one answer.
3. It is challenging and thought-provoking.
 It stimulates students to compare, evaluate, draw conclusions and appraise results.
4. It is adapted to the age, abilities and interests of the students.
5. It requires an extended response.
Techniques of Questioning
Questioning requires skill.
1. Questions should be asked in a natural and well-modulated voice.
2. A teacher should ask the question first and then wait for the class to think about it before calling on a student to answer.
3. A sufficient number of questions should be asked to stimulate students to activity.
4. A teacher should refrain from repeating questions (to challenge attention).
5. Questions should be evenly distributed so that the majority of the pupils can take part in the discussion.
6. A teacher should avoid resorting to any mechanical system of fielding questions to the class, such as alphabetical order, row by row, etc.
7. A teacher should ask questions that are really interesting and thought-provoking.
Techniques in Handling Students' Responses
1. A teacher should make every effort to show an appreciative attitude towards students' answers.
 The teacher should refrain from giving sarcastic comments on wrong answers.
2. A teacher should never allow wrong answers to slip by.
3. Correct answers should be followed with encouraging remarks by the teacher.
4. Clearness in every point expressed by the students should be insisted upon by the teacher.
5. Answering in concert should be discouraged.
6. Students should be encouraged to answer in a loud and clear voice.
7. Students should be encouraged to answer in complete thought units and grammatically correct statements.
8. A teacher should refrain from marking the students in his record book during the class recitation.
Techniques in Handling Students' Questions
1. Students' questions should be encouraged by the teacher.
2. A teacher should not answer a student's question right away.
3. Indiscriminate student questions should not be allowed.
4. A teacher should require students to form grammatically correct questions.
5. If a teacher is asked a question he cannot answer, he should promptly admit this.
Bloom’s Taxonomy
(Clark, 2009/Tormod, 2009)
 Knowledge, or the recall of data and information, is the ability to recall previously learned material.
 Comprehension is the ability to grasp meaning, explain, restate ideas, understanding the
basic information and be able to understand, interpret or extrapolate it.
 Application is the ability to use learned material in new situations or the unprompted use of
an abstraction; involves using information, ideas, and skills to solve problems, then selecting
and applying them appropriately.
 Analysis involves separating information, or separate material, into component parts and
showing the relationships between parts. This includes breaking apart information and ideas
and the ability to distinguish between fact and inference.
 Synthesis is the ability to put together separate ideas to form a new whole, or establish new relationships: putting together ideas and knowledge in a new and unique form. The student can build a structure or pattern from diverse elements, potentially creating new meanings.
 Evaluation is the ability to judge the value or worth of material and ideas against stated
criteria. This involves reviewing and asserting evidence, facts, and ideas, then making
appropriate statements and judgments.
Bloom’s Taxonomy: Key Verbs
(Anderson et al., 2001)

 Remember (Knowledge Level): name, list, state, describe, recall, label, retrieve, recognize.
 Understand (Comprehension Level): paraphrase, identify, explain, translate, interpret, classify.
 Apply (Application Level): execute, compute, demonstrate, modify, discover, predict, show, solve, implement.
 Analyze (Analysis Level): diagram, illustrate, outline, infer, conclude, differentiate, attribute, compare, contrast.
 Create (Synthesis Level): create, compose, design, reorganize, formulate, write a new ending, tell.
 Evaluate (Evaluation Level): judge, appraise, compare, contrast, criticize, justify, critique.
BLOOM'S TAXONOMY OF THINKING SKILLS

1. Knowledge
Who are the different people we find at the lunch table?

2. Comprehension
What do we call families like this?

3. Application
Do you know of families like this one?

4. Analysis
Why do you think Mama likes everyone to eat together?

Higher order questions ask for analysis, synthesis or evaluation, the last three categories of Bloom's Taxonomy, which demand more complex and thus 'higher' levels of thinking. For example:

5. Synthesis
What would happen if the family stopped eating together at lunch time?

6. Evaluation
What are the advantages and disadvantages of living in an extended family?

Illustrative examples from Let's Eat!, written by Ana Zamorano and illustrated by Julie Vivas.
Taxonomy of Personal Engagement
(Morgan & Saxton, 1988)
1. Interest: being curious about what is presented
2. Engaging: wanting to be, and being, involved in the task
3. Committing: developing a sense of responsibility towards the task
4. Internalising: merging objective concepts (the task or what is to be learned) with subjective experience (what is already owned), resulting in understanding and therefore ownership of new ideas
5. Interpreting: wanting and needing to communicate that understanding to others
6. Evaluating: wanting and being willing to put that understanding to the test

Incorporates thinking and feeling.

TPE Guide Questions
 What questions will I ask which will attract their
attention? (Interest)
 What questions will I ask which will draw them into
active involvement, where their ideas become an
important part of the process? (Engaging)
 What questions will I ask which will invite them to take on
responsibility for the inquiry? (Committing)
 What questions will I ask which will create an environment
in which they will have opportunities to reflect upon their
personal thoughts, feelings, attitudes, points of view,
experiences and values in relation to the text?
(Internalising)
TPE Guide Questions (cont.)
 What questions will I ask which will invite them to express
their new understanding to others and further adapt their
ideas in light of the feedback they receive? (Interpreting)
 What questions will I ask which will provide them with
opportunities to test their new thinking in different ways?
What opportunities will I provide which will enable them to
formulate new questions which arise from their new
understanding? (Evaluating)
Three things to foster student questions
(Dillon, 1988)
 Provide for student questions
• make systematic room for them by asking fewer questions yourself
• invite them in by the way we plan for and respond to them
• wait patiently for them by helping students see silent reflection as
acceptable
 Welcome the question
• communicate this through what you say and how you act
• model active listening
 Sustain the asking
• don't automatically answer the question
• help students clarify their question - the spoken question is often not
the question in mind
• reinforce and reward the experience of perplexity and the
expression of inquiry
• restate the question with praise or interest
• bring other students into the discussion
Other strategies
 establish an agreed set of guidelines for asking questions in the
classroom
 develop a list of questions to be used for self- and peer-assessment
 incorporate learning logs or dialogue journals into your program; stop the activity of the lesson at strategic times and ask students to reflect and ask questions of the text they are studying
 allocate time at the end of the lesson for student questions
 have a 'stop and ask' time where you ask students to pose a question that comes to mind
 provide opportunities for students to share their questions about
texts with each other and the whole class
 role play interviews with a character from a text
Other strategies
 model self-talk when working with a text
 praise students' questions - 'that's an interesting question', 'I hadn't
thought of that', etc.
 keep a record of interesting questions students ask and take time to
deconstruct them with students, focusing on what makes them
interesting
 ask a couple of students to keep a record of questions asked during a
discussion and identify the proportion of 'on the line', 'between the
lines' and 'beyond the lines' questions
 after studying a text, have students play a game of twenty questions, for example, to guess an important symbol in a film
 in pairs, have students read each other's work and ask a question about
it
Test Construction

TRACE Faculty Development Program


26-27 October 2012
Rationale for Tests
(Delano Wegener)

 Student Placement
 Diagnosis of Difficulties
 Checking Student Progress
 Reports to Student and Superiors
 Evaluation of Instruction
Principles of Test Construction
For a psychological test to be acceptable it must
fulfill the following three criteria:

1. Standardization
2. Reliability
3. Validity

Standardization
Standardizing a test involves administering the
test to a representative sample of future test
takers in order to establish a basis for
meaningful comparison.
Normal Curve
Standardized tests establish a normal distribution
of scores on a tested population in a bell-shaped
pattern called the normal curve.
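A standardized score is usually reported as a z-score (distance from the norm mean in standard-deviation units) and a percentile under the normal curve. A minimal sketch in Python; the norm mean of 100 and SD of 15 are illustrative values, not figures from this program:

```python
import math

def z_score(raw, norm_mean, norm_sd):
    """Distance of a raw score from the norm mean, in SD units."""
    return (raw - norm_mean) / norm_sd

def percentile(z):
    """Cumulative probability under the normal curve, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

z = z_score(115, norm_mean=100, norm_sd=15)
print(round(z, 2), round(percentile(z) * 100, 1))  # 1.0 84.1
```

A raw score one SD above the norm mean thus sits at roughly the 84th percentile of the standardization sample.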
Reliability
A test is reliable when it yields consistent results. To establish reliability, researchers use several procedures:
1. Split-half Reliability: Dividing the test into two equal halves and assessing how consistent the scores on the two halves are.
2. Alternate-forms Reliability: Using different forms of the test to measure consistency between them.
3. Test-Retest Reliability: Using the same test on two occasions to measure consistency.
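The split-half procedure can be sketched in code: correlate students' totals on the two halves (here, odd-numbered vs even-numbered items), then step the half-test correlation up to a full-test estimate with the Spearman-Brown formula (a standard correction the slides do not name explicitly). The 0/1 score matrix is invented illustrative data:

```python
def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(scores):
    """scores: rows = students, columns = 0/1 item scores."""
    odd = [sum(row[0::2]) for row in scores]   # totals on items 1, 3, 5, ...
    even = [sum(row[1::2]) for row in scores]  # totals on items 2, 4, 6, ...
    r_half = pearson_r(odd, even)
    return 2 * r_half / (1 + r_half)           # Spearman-Brown correction

scores = [
    [1, 1, 1, 1, 1, 1],
    [1, 0, 1, 1, 0, 1],
    [0, 1, 0, 1, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 0],
]
print(round(split_half_reliability(scores), 2))
```

Test-retest reliability uses the same `pearson_r` on the two administrations' total scores instead.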
Validity
The validity of a test refers to the extent to which it measures or predicts what it is supposed to measure or predict.
1. Content Validity: Refers to the extent a test measures
your definition of the construct
2. Criterion-related validity: Relationship between scores
on a test and an independent measure of what the test
is supposed to measure
1. Predictive Validity: Refers to the function of a test in
predicting a particular behavior or trait. For instance, we
might theorize that a measure of math ability should be able
to predict how well a person will do in an engineering-based
profession.
2. Convergent Validity: We might correlate the scores on our
test with scores on other tests that purport to measure basic
math ability, where high correlations would be evidence of
convergent validity.
TEST PLAN (TABLE OF SPECIFICATIONS) TEMPLATE
(CERBIN, 2009)
A Test Plan (Table of Specifications) classifies each test item
according to what topic or concept it tests AND what objective it
addresses. The table can help you write a test that has content
validity—there is a match between what was taught and what is
tested. The table helps ensure that you
1.emphasize the same content you emphasized in day-to-day instruction (e.g.,
more items about topic X and fewer about topic Y because you consider X to
be more important and you spent more time on X)
2.align test items with learning objectives (e.g., important topics might
include items that test interpretation, application, prediction, and
unimportant topics might be tested only with simpler recognition items)
3.do not overlook or underemphasize an area of content
You can create the Test Plan as you teach the class by inserting the
topics/concepts covered each day, and to the extent possible writing 1-2 test
items while the class period is still fresh in your mind.
TEST PLAN (TABLE OF SPECIFICATIONS) TEMPLATE
(CERBIN, 2009)

Subject Matter                  Learning Objectives
(concepts, topics, ideas -      (in these columns insert the learning objectives
insert the topics and           of the course or unit)
concepts from the
course material)                Knowledge   Analyze   Apply   Interpret   Total
                                (recall)
Topic A
Topic B
Topic C
Etc.
TABLE OF SPECIFICATIONS
(Sample courtesy of Mr. J. Banaag)
                                                        ITEM DISTRIBUTION
Content/Objectives                                      Factual     Compre-   Applica-   Total No.     %
                                                        Knowledge   hension   tion       of Items
Procedures in taking vital signs                            4          6        10          20        20%
Specifications on different types of thermometer            4          6        10          20        20%
Procedures in bathing & dressing/undressing of infant       5          6        14          25        25%
Bathing paraphernalia, uses & types, specifications         4          6        10          20        20%
Specifications of uses of non-slip rubber mat               3          5         7          15        15%
Total                                                      20         29        51         100       100%
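A table of specifications is easy to sanity-check in code: row totals should equal the stated item counts and the percentage column should follow from the grand total. A small sketch using the figures from Mr. Banaag's sample table (topic labels abbreviated):

```python
# Each topic maps to its (Factual Knowledge, Comprehension, Application) counts.
tos = {
    "Procedures in taking vital signs":         (4, 6, 10),
    "Specifications on types of thermometer":   (4, 6, 10),
    "Bathing & dressing/undressing of infant":  (5, 6, 14),
    "Bathing paraphernalia, uses & types":      (4, 6, 10),
    "Uses of non-slip rubber mat":              (3, 5, 7),
}

grand_total = sum(sum(row) for row in tos.values())
for topic, (recall, comp, app) in tos.items():
    row_total = recall + comp + app
    print(f"{topic}: {row_total} items ({100 * row_total / grand_total:.0f}%)")
print("Grand total:", grand_total)  # 100 items
```

If the printed percentages disagree with the weights you planned, either the item counts or the syllabus weighting needs revisiting.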
QUESTION PAPER & MARKING SCHEME:
INTERNATIONAL SPECIFICATION STANDARDS
(UCLES/CIE/NCC-UK)

 General Guidelines
 Setting Questions
 Rubric & Question Paper Format
 Balance of Questions and Marks
 Question Structure Checklist
 Mark Scheme
GENERAL GUIDELINES

 A section must have a heading (eg, test type).
 The heading shall be followed by an instruction or direction (eg, write your answers in the spaces provided for…).
SETTING QUESTIONS

 Aim to cover the entire syllabus in proportion to the weighting of the topics.
 Avoid setting questions that have been previously used.
 Ensure that questions are not lifted from worked examples and calculations in the textbook or reference material.
RUBRIC & PAPER FORMAT

 Decide on the manner in which the questions are numbered.
 Use keywords to start questions. Examples: Define, Describe, Distinguish between, Calculate, Compare, List, State.
 Place emphasis on the number of points required of the candidate using bold face, and always in words, not numbers.
RUBRIC & PAPER FORMAT (CONT.)

 Place emphasis on terms that the candidate must focus on, particularly for contrast, eg, by using italics.
 Use a question stem convention for multiple choice type questions.
 Avoid straightforward True/False questions, where guesswork alone can earn 50%.
RUBRIC & PAPER FORMAT (CONT.)

 Choose a question format that is sensible.
 For modules where candidates write their answers in a combined question paper/answer booklet, ensure that there is adequate space for candidates' answers.
BALANCE OF QUESTIONS AND MARKS

 Marks available on a question for a topic should reflect the relative % weighting of that topic in the syllabus.
 Provide a marks map showing the proportion of marks.
 Question difficulty should be balanced to cater to the majority of average candidates.
QUESTION STRUCTURE CHECKLIST

 Make sure that the questions are clear and unambiguous.
 Make sure that the questions can be understood by candidates from different backgrounds.
 Use simple and plain English. Do not use
complicated words or idiomatic phrases.
 Ensure questions do not assess knowledge
outside the syllabus.
QUESTION STRUCTURE CHECKLIST (CONT.)

 For multiple choice questions:
• make sure the distractors are sensible;
• check that no distractor could also be a key;
• ensure that there are enough distractors.
 Check time allotment.
 The mark allocation must be clearly
shown or indicated.
MARKING SCHEME OR KEY TO CORRECTION

Check that
 the mark scheme is complete for each question.
 all parts of a question have all marks allocated.
 the mark scheme is clear and unambiguous.
 the mark scheme is not too prescriptive.
 the mark scheme only awards marks for points asked for in the question.
MARKING SCHEME (CONT.)

Check that
 the mark scheme agrees with the question.
 the number of marks awarded for a question and its parts tally with the question.
 the number of marks allocated is appropriate for the level of difficulty.
MARKING SCHEME (CONT.)

 Avoid cascade errors.
 Avoid half-marks.
 If two or more marks are available for a question part, check that the mark scheme stipulates what candidates must do to earn less than the maximum.
MARKING SCHEME (CONT.)

 The mark scheme wordings must have a standard structure. Example:
 Introduce the instructions.
 Show each point with a mark allocation in brackets [ ].
 Accept appropriate alternative answers (if applicable).
 Show maximum marks available at the end, eg, [max 6].
Types of tests: choosing what fits
http://www.uleth.ca/edu/runte/tests/conmc/wrtmc/wrtmc.html

The type of question or test to use depends on the skill you're testing:
• obviously, do not use multiple choice to test creative writing
• don't ask students to write an essay to find out if they can identify the capital of Canada
• evaluation should always match as closely as possible the actual activity you're teaching
 - if teaching oral French, give oral tests
 - if testing the ability to write in French, better to give an essay: "write a letter to the grocer"
 - but if testing reading, the best way to see if students have understood what they have read may be multiple choice, true/false, or short answer questions.
Multiple Choice

Example: The shortage of a product or resource is called --        STEM
a) mass consumption      \
b) planned obsolescence   } DISTRACTORS   (a-d together are the ALTERNATIVES)
c) materialism           /
*d) scarcity                KEY
* keyed answer

Guide questions:
• Does your question have everything it needs?
• Will your stem give away the answer?
• Are all the distractors believable?
Checklist for Multiple-Choice Questions

 Always use the same number of alternatives
 NEVER use "all of the above" or "none of the above"
 Avoid "both a and b" etc
 Words common to all the alternatives should be placed in the stem
 All distractors should be plausible
All distractors must remain plausible even for items testing higher levels of
learning
 Keyed answer must be only correct or clearly the best answer
 Avoid overlapping distractors, e.g., synonyms
 Alternatives must be grammatically consistent with the stem, and
parallel in form
 Avoid stating the correct answer at greater length
 Avoid stating the correct answer in textbook language or stereotyped
phraseology
Checklist for Multiple-Choice Questions

 In science or math, you must have the correct number of significant digits
 Avoid absolute terms like "always" or "never"
 Avoid saying UNTRUE things in alternatives (ideal to aim for)
 Vary the distribution of the keyed alternative in a random manner
 Pyramiding alternatives: distractors should be arranged in ascending or descending order
• dates in chronological order
• lines from a passage in the order they appear in the passage
• numbers in order
• alphabetize
• or by length
 Ensure that every item is independent of every other item
 Each multiple choice item should be worth the same number of marks
True-False Questions

 also "yes/no" or "agree/disagree"
 measures facts --> cannot be used where there is an interpretation
 good for vocabulary (definitions), formulas, dates, names, etc.

Advantages of True-False Questions
 (apparent) ease of construction
 can test a lot of different subject areas
 generally easy to read, so good for younger kids or poor readers
 one way of modifying evaluation for weaker students
 easy to score --> both fast and objective
 (but set it up so they are circling True or False, not writing T or F, because some students' Ts can look like other students' Fs)
 can cover a lot of ground in a given time because fast to answer
 good for testing popular misconceptions
Disadvantages of True-False Questions

 DO NOT LIFT STATEMENTS DIRECTLY FROM TEXT
- encourages memorization rather than understanding
- becomes a reading test
- promotes guessing
- copyright problems
 Students have a 50-50 chance of getting them right
 Guessing is encouraged
- "right minus wrong" scoring is worse because it measures timidity and self-assurance, not knowledge --> nothing to do with what you think you're measuring
 Great for cheating: peer over a shoulder and get a whole row of answers
 Offers little diagnostic value
- nothing to indicate why the child got it wrong (e.g. no math work to inspect)
- may get it right on the basis of misinformation
- may get it wrong as a result of misreading or misinterpreting the statement
- or may just be a "lucky guesser"
Matching Questions

Basic Form
 Same basic principles as multiple-choice
 basically just a bunch of m-c questions which share same
alternatives
 the left column is called the "premise"
 the right column is called the "option" or "response"
Advantages
 same as Multiple Choice or True False except have
additional advantage of:
 reduces guessing
-- ten options = 1 chance out of 10, then 1 out of 9, etc.
-- if options can be reused = 1 out of 10 (or whatever) each time
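The guessing arithmetic above can be made concrete: without option reuse, guessing every match in sequence multiplies 1/10 × 1/9 × … (i.e. one chance in n!); with reusable options every guess stays at 1/n. A short sketch:

```python
import math

def p_all_correct(n_options, n_items, reuse=False):
    """Probability of guessing an entire matching exercise correctly."""
    if reuse:
        # each item is an independent 1-in-n guess
        return (1 / n_options) ** n_items
    # successive guesses without replacement: 1/n * 1/(n-1) * ...
    return math.factorial(n_options - n_items) / math.factorial(n_options)

print(p_all_correct(10, 10))              # 1/10! -- vanishingly small
print(p_all_correct(4, 4))                # 1/24
print(p_all_correct(10, 10, reuse=True))  # (1/10)**10
```

Either way, guessing a whole matching set is far less likely than guessing a single four-option multiple-choice item, which is the advantage the slide describes.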
Guidelines on Matching Questions

 Items have to be homogeneous
Example: Don't mix dates and names
 Label columns
 Generally write the questions/premises on left, "options" on right
 Put the shorter words/phrases in second column
 Arrange items alphabetically or numerically
 Generally have more options than items
 Alternatively, set it up so they can use particular option more than
once
 Provide clear directions
 Whole question has to be on same page
 Assess higher thinking skills same way as with multiple-choice
Completion and Short Answer Questions

DIFFERENCE BETWEEN COMPLETION AND SHORT ANSWER

EXAMPLE OF A COMPLETION QUESTION


The first Prime Minister of Canada was _________________.

EXAMPLE OF A SHORT ANSWER QUESTION


Who was the first Prime Minister of Canada? _____________________

KEY: Sir John A. Macdonald

Completion = fill in the blank; Short Answer = answer the question
 "short answer" here means a sentence or less
 responses of more than one sentence, or a paragraph, are short written responses, because they are no longer objective items --> they need a more complex scoring scheme
Completion and Short Answer Tests

ADVANTAGES
 Easy to build
 No guessing --> the kid has to come up with the answer rather than just recognize it
 Actually faster to answer than multiple choice because there are no alternatives to read through
 Better diagnostic info (see where they go wrong)

DISADVANTAGES
 Hard to reach higher levels of Bloom's taxonomy because answers are limited to a few words
 Handwriting and spelling become a bit of an issue
• students may be afraid to respond because they are embarrassed by their spelling
 Hard to write so it is clear to the student what is expected
 Hard to write because students can fill the blank with unintended responses
 Easier to bluff through --> teachers hate giving filled-in blanks zero
• one remedy is to announce that irrelevant responses will get zero
• could specify the expected form under the blank, though usually unnecessary if carefully written
Essay Tests

Advantages

 No guessing
 have to generate answers rather than select them
 students can demonstrate their knowledge within broad limits
 Allows divergent thinkers to demonstrate originality, creativity
 in contrast to multiple-choice, which is terrible for divergent thinkers who always choose the answer that isn't in the list of alternatives provided
 of course, leaves open danger of diverging a bit too far...
 Reduced lead time required to produce
 less stuff to type, run-off if caught short
 write 'em on the blackboard if you have to
 Less work to administer for smaller number of students
 easier to mark ten essays than to build a multiple-choice test for ten students; advantage
decreases with class size
 but also have to consider long-term: teaching this course again next year?
 Can be rich in diagnostic information
 one topic in real depth
Essay Tests

Disadvantages
 Impossible to mark objectively
 Not a valid measure of social studies or English if handwriting gets in the way
 teacher expectations more important than actual performance
 halo effects: student's earlier work prejudices expectations
 Even different times of day make a difference
 First paper to be read often sets standard, or if you decide later to lighten up,
the first paper's in trouble
 Can sample only limited range of course CONTENT
 Time consuming for student to write
 Time consuming (and mind numbing) to mark
 Certainly kills revision; Puts a heavy emphasis on student's writing, not thinking,
skills
 poor writers are slaughtered, no matter how well they know content

 Ultimately, have to evaluate writing or students won't learn to do it, but term paper may be better
approach to this
 Only use essays when essays are called for
Increasing Objectivity of Essay Scoring

 Score blind – keep student names out of sight (e.g., have names on the back of the title page)
 Read one question at a time, AT THE SAME TIME
 Halo effects – Keep scores of previously marked
questions out of sight; shuffle papers after first run
through
 Have a policy on irrelevant answers, errors
 Mark paper twice or get a colleague to mark papers
 Tell students why they got the mark, not just grade
 Prepare rubrics
Sample Rubrics for Essay Questions
Organizing your test

 Arrange items appropriately on tests


 Patterns on test must be logical
 Group by type of question
 By difficulty, from easy to hard
Item Analysis: Purposes
 Fix marks for current class that just wrote the test
 More diagnostic information on students
 Classroom Level – will tell which questions they were all guessing on or which
most of them found very difficult
 Individual Level – isolate specific errors a student made
 Build future tests, revise test items to make them better
 Really pays off the second time you teach the same course
 SHOULD NOT REUSE WHOLE TESTS
 Part of the continuing professional development
 Help teach us to become better test writers
 Useful for dealing with parents or principals in case of a dispute
 Documenting your good performance/evaluation
 Before and After Pictures
 Long-term payoff
 Over time, a way to find out if innovation is working
Eight Simple Steps to Item Analysis
1. Score each answer sheet; write the score total in the corner
2. Sort the pile into rank order from top to bottom score
3. For a normal class of 30 students, divide the pile in half (put aside the middle paper if odd)
4. Take the top pile and count the number of students who chose each alternative. Repeat the process with the bottom pile.
5. For each item, subtract the number in the lower group who got the question right from the number in the upper group who got it right
6. Divide the difference by the number of students in the upper or lower group (15 for 30 students) to get the "discrimination index" (D)
7. Total the number of students who got the item right
8. Divide that total by the total number of students. Difficulty (p) = proportion who got it right
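The eight steps above can be sketched as a short program: rank the answer sheets, split them into upper and lower halves, then compute the discrimination index D and difficulty p for each item. The 0/1 response data is invented for illustration:

```python
def item_analysis(answer_sheets):
    """answer_sheets: list of per-student lists of 0/1 item scores.
    Returns a (D, p) pair per item."""
    # Steps 1-2: score each sheet and sort from top to bottom total.
    ranked = sorted(answer_sheets, key=sum, reverse=True)
    # Step 3: split the class in half (the middle sheet is skipped if odd).
    half = len(ranked) // 2
    upper, lower = ranked[:half], ranked[-half:]
    results = []
    for item in range(len(answer_sheets[0])):
        # Step 4: count correct responses to this item in each half.
        u_right = sum(s[item] for s in upper)
        l_right = sum(s[item] for s in lower)
        # Steps 5-6: discrimination index D.
        d = (u_right - l_right) / half
        # Steps 7-8: difficulty p = proportion of the class who got it right.
        p = sum(s[item] for s in ranked) / len(ranked)
        results.append((d, p))
    return results

sheets = [
    [1, 1, 1], [1, 1, 0], [1, 0, 1], [0, 1, 0], [0, 0, 0], [0, 0, 1],
]
for i, (d, p) in enumerate(item_analysis(sheets), start=1):
    print(f"item {i}: D={d:.2f}  p={p:.2f}")
```

A D near 1.0 means the item separates strong from weak students well; a D near 0 (or negative) flags the items discussed in the interpretation slide below.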
Interpreting Item Analysis
1. Potential Miskey
2. Identifying ambiguous items
3. Equal distribution to all alternatives
4. Alternatives are not working
5. Distractor too attractive
6. Question not discriminating
7. Negative discrimination
8. Too easy
9. Omit
10. Relationship between D index and Difficulty (p)
Evaluating a Test (Statistics & Math)
 Measures of Central Tendency (Mean, Median, Mode)
 Frequency Distribution - Graphs (symmetrical, positively or
negatively skewed), Histogram
 Measures of Variability (Range, Standard Deviation, Standard
Scores, T Scores, Percentile, Quartiles, Deciles, Stanines)
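Most of the measures listed above can be computed directly with Python's statistics module; the score list here is made-up illustrative data:

```python
import statistics as st

scores = [55, 62, 68, 70, 70, 75, 78, 82, 88, 95]

# Measures of central tendency
mean = st.mean(scores)
median = st.median(scores)
mode = st.mode(scores)

# Measures of variability
rng = max(scores) - min(scores)   # range
sd = st.stdev(scores)             # sample standard deviation
quartiles = st.quantiles(scores, n=4)

print(mean, median, mode, rng)
print(round(sd, 2), quartiles)
```

T scores and stanines are linear transformations of the z-scores described earlier (T = 50 + 10z), so they follow directly from the mean and standard deviation computed here.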
Workshop:
1: Arrangement of Test Questions
2: Critiquing of True-False Test
3: Critiquing a Matching Test
4: Critiquing of MCQs
5: Critiquing a Completion/Short Answer Test
6: Critiquing of Essay Test
7: Preparing a TOS
Acknowledgements
Dr. Robert Runté, Education 3604, University of Lethbridge,
http://www.uleth.ca/edu/runte/test/conmc/wrtmc.html
Dr. Nicolas T. Capulong, “The Art of Questioning”
Kit Pertram, “Bloom’s Taxonomy: Levels of Understanding”
Bill Cerbin, CATL Workshop on Writing Better Objective Tests
Informatics Education, (Singapore), 2001
Jerome E. Banaag, “Table of Specification”, TRACE College
[email protected], “Test Construction and Principles Study Guide”
http://www.mindtools.com/pages/article/new/TMC_88.htm
http://psychology.about.com/od/research methods/f/validity.htm
http://academic.luzerne.edu/kdroms/staffdev/valrel.htm
And more…
