WHY NOT MAKE YOUR TESTS BETTER? By: Snežana Filipović
Why do we test?
“The assumption is that the best teacher is the teacher who devises classroom methods and techniques that derive from a comprehensive knowledge of the total process of language learning, of what is happening within the learner and within the teacher and the interaction between the two. All of this knowledge, however, remains somehow abstract in the mind of the teacher unless it can be empirically tested in the real world. Your theory of second language acquisition can be put into practice every day in the classroom, but you will never know how valid your theory is unless you systematically measure the success of your learners – the success of your theory-in-practice” Douglas Brown (1987:218)
What makes a good test?
Tests are good only when they are used for a particular purpose with the students for whom they are intended
TEST DEVELOPMENT
STAGE 1 The design stage or  THINK
STAGE 2 The operationalisation stage or THINK & WRITE
STAGE 3 The administration stage or TEST & THINK & CHANGE
Some reasons why I like multiple choice tests
ONE REASON WHY I DO NOT LIKE THEM They are notoriously difficult to write
Anatomy of a Multiple Choice Item
1. How did Tina go to the airport? (stem)
a) by bus (distractor)
b) by car (answer)
c) on foot (distractor)
d) by taxi (distractor)
The options a)–d) together are the alternatives
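The same anatomy can be written down as a small data structure. Below is a minimal, illustrative Python sketch (not part of the original slides; the class and field names are my own) that labels the parts of the example item.

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str                # the question or problem statement
    alternatives: list[str]  # every option shown to the student
    answer: str              # the single correct (or clearly best) option

    @property
    def distractors(self) -> list[str]:
        # every alternative that is not the answer is a distractor
        return [a for a in self.alternatives if a != self.answer]

item = MultipleChoiceItem(
    stem="How did Tina go to the airport?",
    alternatives=["by bus", "by car", "on foot", "by taxi"],
    answer="by car",
)
print(item.distractors)  # ['by bus', 'on foot', 'by taxi']
```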
GUIDELINES FOR MAKING MULTIPLE CHOICE TESTS
Each item should assess a single written objective. Before writing an item, think about what it is that you want to test
Include one and only one correct (or clearly best) answer in each item
The stem should not be burdened with irrelevant material, but should contain as much of the item as possible
Layout of the answers should be clear and consistent. The alternatives should be listed vertically
Avoid answering one item in the test by giving the answer  somewhere else in the test
Keep the items mutually exclusive
The problem should be stated clearly in the stem. The students should not have to infer what the problem is
Avoid changing pages in the middle of an item
The alternatives should be kept homogeneous in content. They should not consist of a potpourri of statements related to the stem but unrelated to each other
You should not provide clues as to which alternative is correct. Keep the grammar of each alternative consistent with the stem
Distractors should be as plausible as possible. They should sound plausible only to a student who has not mastered the material
Put the correct answer in each of the alternative positions an approximately equal number of times, in a random order
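If the test is assembled programmatically, this balancing can be automated by shuffling each item's options independently. A minimal sketch, assuming items are held as plain lists of options (the function name and seed are illustrative):

```python
import random

def shuffle_alternatives(alternatives: list[str], answer: str,
                         rng: random.Random) -> tuple[list[str], str]:
    # Shuffle the options and return the new order plus the letter of the key
    options = alternatives[:]
    rng.shuffle(options)
    key = "abcd"[options.index(answer)]  # assumes four options per item
    return options, key

rng = random.Random(7)  # fixed seed only so the example is reproducible
options, key = shuffle_alternatives(
    ["by bus", "by car", "on foot", "by taxi"], "by car", rng)
print(options, key)
```

Over many items, independent shuffles place the correct answer in each position roughly equally often.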
Avoid the use of specific determiners (never, always, only)
Keep the alternatives similar in length
Try to make the first few items relatively easy. Make sure you have items at different levels of difficulty
Do not try to write the entire test in a day. It takes time, creativity and thinking to write  good multiple choice items. Come back to the test a few days later, with a fresh eye
Analyse the effectiveness of each item. Item analysis is an excellent tool for this
ITEM ANALYSIS
PURPOSE OF ITEM ANALYSIS
- Evaluates the quality of each item
- The quality of items determines the quality of the test
- Suggests ways of improving the test
- Suggests ways of improving teaching
CLASSICAL ITEM ANALYSIS
- Item Facility Analysis
- Item Discrimination Analysis
- Distractor Efficiency Analysis
ITEM FACILITY INDEX
- The proportion of students who answered the item correctly
- IF = (number of students who answered the item correctly) / (total number of students who attempted the item)
- Ranges from 0 (nobody answered correctly) to 1 (everybody answered correctly)
General Rules for Item Facility
- IF less than 0.20 - difficult test items
- IF 0.20 - 0.80 - moderately difficult items
- IF more than 0.80 - easy items
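As a rough illustration (not taken from the slides), the item facility index can be computed directly from a column of 0/1 scores for one item; the thresholds below simply mirror the general rules above.

```python
def item_facility(item_scores: list[int]) -> float:
    # Proportion of students who answered the item correctly (1 = correct, 0 = wrong)
    return sum(item_scores) / len(item_scores)

def classify(if_value: float) -> str:
    # Thresholds taken from the general rules above
    if if_value < 0.20:
        return "difficult"
    if if_value <= 0.80:
        return "moderately difficult"
    return "easy"

scores = [1, 0, 1, 1, 1, 0, 1, 1, 0, 1]  # ten students' results on one item
if_value = item_facility(scores)
print(round(if_value, 2), classify(if_value))  # 0.7 moderately difficult
```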
ITEM DISCRIMINATION ANALYSIS
"We test because we want to find out if the students know the material, but all we learn for certain is how they did on the exam we gave them. The item discrimination index tests the test in the hope of keeping the correlation between the knowledge and exam performance as close as it can be in an admittedly imperfect system." Zurawski 1998: 2
ITEM DISCRIMINATION ANALYSIS - Compares the performance of the upper group of students (high test scorers) and the lower group (low test scorers) on each item in the test
ITEM DISCRIMINATION INDEX
- ID = (number of correct answers in the upper group - number of correct answers in the lower group) / number of students in one group
- Ranges from -1 to +1: the higher the value, the better the item separates strong students from weak ones
- A negative value means more low scorers than high scorers answered the item correctly, which signals that the item needs revision
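A minimal sketch of the upper/lower comparison described above (illustrative, not from the slides); here the upper and lower groups are simply the top and bottom thirds of total scorers, which is one common convention.

```python
def discrimination_index(students: list[tuple[int, int]]) -> float:
    # students: (total test score, 0/1 result on this item) for each student
    ranked = sorted(students, key=lambda s: s[0], reverse=True)
    n = len(ranked) // 3                        # size of the upper and lower groups
    upper = sum(item for _, item in ranked[:n])
    lower = sum(item for _, item in ranked[-n:])
    return (upper - lower) / n

# six students: (total score, result on the item being analysed)
data = [(45, 1), (42, 1), (38, 1), (30, 0), (25, 1), (20, 0)]
print(discrimination_index(data))  # (2 - 1) / 2 = 0.5
```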
DISTRACTOR EFFICIENCY ANALYSIS How many students chose each option?
A perfect item would have two characteristics:
- Everyone who knows the material tested in the item would get it right
- Students who do not know the material would have their answers equally distributed among the options
DISTRACTOR EFFICIENCY ANALYSIS Distractor Efficiency Table
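The slide's table itself is not reproduced here, but a distractor efficiency table is essentially a count of how many students in each group chose each option. A small sketch with made-up responses to the Tina item (correct answer b):

```python
from collections import Counter

def distractor_table(choices: list[tuple[str, str]]) -> dict[str, Counter]:
    # choices: (group, option) pairs, where group is "upper" or "lower"
    table: dict[str, Counter] = {"upper": Counter(), "lower": Counter()}
    for group, option in choices:
        table[group][option] += 1
    return table

responses = [("upper", "b"), ("upper", "b"), ("upper", "a"),
             ("lower", "b"), ("lower", "a"), ("lower", "d")]
for group, counts in distractor_table(responses).items():
    print(group, dict(counts))
# upper {'b': 2, 'a': 1}
# lower {'b': 1, 'a': 1, 'd': 1}
# Option c attracts nobody here, so it is doing no work as a distractor
```

A good distractor draws some answers, mostly from the lower group; an option that nobody chooses can be replaced.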
IT IS TIME FOR A COFFEE BREAK! (Thank you all for coming!)