ITEM ANALYSIS AND PEER-REVIEW EVALUATION OF SPECIFIC HEALTH PROBLEMS AND APPLIED RESEARCH BLOCK EXAMINATION

https://doi.org/10.22146/jpki.49006

Novi Maulina(1*), Rima Novirianthy(2)

(1) Syiah Kuala University
(2) Syiah Kuala University
(*) Corresponding Author

Abstract


Background: Assessment and evaluation of students are essential components of the teaching and learning process. Item analysis is the technique of collecting, summarizing, and using students' response data to assess the quality of a Multiple Choice Question (MCQ) test by measuring its difficulty index, discrimination index, and distractor efficiency. Peer-review practices improve the validity of the assessments used to evaluate student performance.
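For orientation, the conventional definitions of these indices in the item-analysis literature are given below; the abstract does not spell out the exact formulas used, so these standard forms are an assumption:

    p  = (H + L) / N x 100%   difficulty index: H and L are the numbers of correct
                              answers in the upper and lower scoring groups, and N
                              is the total number of students in the two groups
    D  = 2 (H - L) / N        discrimination index over the same two groups
    DE                        distractor efficiency: graded by how many distractors
                              are "functional", i.e. selected by at least 5% of
                              examinees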
Method: We analyzed 150 students' responses to 100 MCQs from a Block Examination for the difficulty index (p), discrimination index (D), and distractor efficiency (DE) using Microsoft Excel formulas. The correlation between p and D was analyzed using the Spearman correlation test in SPSS 23.0. The results were then used to evaluate the peer-review strategy.
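The indices themselves are straightforward to compute; the following is a minimal Python sketch of the calculation for a single item, illustrative rather than the authors' actual Excel workbook, and assuming the conventional 27% upper/lower group split, five-option items, and the 5% functionality threshold for distractors (the abstract does not state the exact conventions used):

    import numpy as np
    from scipy.stats import spearmanr

    def analyze_item(total_scores, correct, choices, key, group_frac=0.27):
        # total_scores: (n,) array of total test scores, used only to rank students
        # correct:      (n,) array of 1/0, whether this item was answered correctly
        # choices:      (n,) array of the option each student picked, e.g. 'A'..'E'
        # key:          the keyed (correct) option for this item
        n = len(total_scores)
        k = max(1, int(round(group_frac * n)))        # size of each extreme group
        order = np.argsort(total_scores)              # rank students by total score
        low, high = order[:k], order[-k:]

        p = correct.mean() * 100                      # difficulty index, in %
        D = (correct[high].sum() - correct[low].sum()) / k   # discrimination index

        # A distractor is "functional" if at least 5% of examinees selected it.
        options = ["A", "B", "C", "D", "E"]           # assumed five-option MCQs
        functional = sum((choices == c).mean() >= 0.05
                         for c in options if c != key)
        return p, D, functional

    # Across all 100 items, the p-D correlation reported in the paper
    # (computed there with SPSS 23.0) corresponds to:
    #   rho, pval = spearmanr(p_all, D_all)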
Results: The median difficulty index (p) was 54%, within the excellent range (p = 40–60%), and the mean discrimination index (D) was 0.24, which is reasonably good. Seven items had both an excellent p (40–60%) and an excellent D (≥0.4); nineteen items overall had an excellent discrimination index (D ≥ 0.4). However, 9 items had a negative discrimination index and 30 items had a poor discrimination index; these should be fully revised. Forty-two items had four non-functioning distractors (DE 0%), which suggests that item writers should construct their distractors more precisely and carefully.
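Turning these indices into revision decisions can be as simple as the triage sketch below; the discrimination bands follow Ebel's widely used rule of thumb (D >= 0.40 excellent, 0.20-0.39 acceptable, < 0.20 poor), which matches the cut-offs quoted above, though the intermediate categories are our assumption rather than the paper's own table:

    def triage_item(p, D):
        # Classify one item from its difficulty (p, in %) and discrimination (D).
        if D < 0:
            return "fully revise or discard (negative discrimination)"
        if D < 0.20:
            return "fully revise (poor discrimination)"
        if D >= 0.40 and 40 <= p <= 60:
            return "keep (excellent difficulty and discrimination)"
        if D >= 0.40:
            return "keep, but review difficulty"
        return "keep with minor revision (acceptable discrimination)"

    # Example: the 9 negative-D items fall in the first branch, the 30 poor-D
    # items in the second, and the 7 items with excellent p and D in the third.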
Conclusion: Based on the item analysis, a number of items need to be fully revised. For better test quality, feedback and suggestions for the item writers should also be provided, on the basis of the item analysis, as part of the peer-review process.


Keywords


Item Analysis, Multiple Choice Questions (MCQs), Difficulty Index, Discrimination Index, Peer Review









Copyright (c) 2020 Novi Maulina, Rima Novirianthy

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
