Validity and Reliability of the SOCA (Student Oral Case Analysis) Examination: A Study at a Medical Faculty in Indonesia

https://doi.org/10.22146/jpki.25374

Rizka Aries Putranti(1*), Ova Emilia(2), Efrayim Suryadi(3)

(1) Master's student, Medical Education Graduate Program, Faculty of Medicine, Universitas Gadjah Mada
(2) Department of Medical Education, Faculty of Medicine, Universitas Gadjah Mada
(3) Department of Medical Education, Faculty of Medicine, Universitas Gadjah Mada
(*) Corresponding Author

Abstract


Background: A medical faculty has to ensure that its students meet the minimal required competence using an appropriate examination, while the examination itself should also facilitate student learning. Oral examination is known for its ability to facilitate learning but is low in validity and reliability. The Medical Faculty of Lampung University (FK Unila) applies the student oral case analysis (SOCA) examination as one of its block assessment components, alongside the MCQ, tutorial, and laboratory examinations. This study aimed to evaluate the validity and reliability of the SOCA examination at FK Unila.

Method: Videos of 65 students taking the SOCA examination, together with 28 question rubrics, were collected during the odd-semester examination of academic year 2014-2015 at FK Unila. The videos and question rubrics were assessed by 5 panelists and analysed using Lawshe's content validity ratio (CVR) to determine content validity. Student performance on the videos was re-assessed by another assessor to examine inter-rater reliability, then analysed using Cohen's kappa. Two experts in medical education assessed the cognitive complexity of the question rubrics. SOCA marks of students from years II, III, and IV were analysed for construct validity and internal consistency.
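Lawshe's CVR mentioned above is a simple ratio of panelist agreement. As an illustrative sketch (not part of the study's actual analysis), it can be computed as:

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number
    of panelists rating an item 'essential' and N is the panel size."""
    half = n_panelists / 2
    return (n_essential - half) / half

# With a 5-member panel, Lawshe's critical CVR is 0.99,
# so an item is retained only on unanimous agreement:
print(content_validity_ratio(5, 5))  # -> 1.0 (retained)
print(content_validity_ratio(4, 5))  # -> 0.6 (rejected)
```

CVR ranges from -1 (no panelist rates the item essential) to +1 (all do); the minimum acceptable value depends on panel size.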

Results: According to the 5 panelists, 93.7% of the questions across the 65 videos were valid (CVR ≥ 0.99), and 71.8% of the items in the 28 question rubrics were also valid. The cognitive complexity of the SOCA questions was at the levels of analysis, "knows how", and 4a. Inter-rater reliability analysis showed a kappa value of 0.549 (moderate agreement). Mann-Whitney analysis for construct validity showed no significant difference across the years. Cronbach's alpha analysis showed an internal consistency of 0.575.
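The Cohen's kappa and Cronbach's alpha statistics reported above can be sketched as follows; the data in the usage comments are hypothetical and serve only to illustrate the formulas, not the study's computation:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical ratings:
    (p_o - p_e) / (1 - p_e), observed vs chance agreement."""
    n = len(a)
    cats = set(a) | set(b)
    p_o = sum(x == y for x, y in zip(a, b)) / n
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (p_o - p_e) / (1 - p_e)

def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of per-item score lists,
    all over the same respondents, using sample variance."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical pass/fail ratings by two raters on four students:
print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 1]))  # -> 0.0 (chance-level)
# Two perfectly correlated items across three respondents:
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))    # -> 1.0
```

By the conventional Landis-Koch interpretation, kappa values between 0.41 and 0.60 (such as the 0.549 reported here) indicate moderate agreement.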

Conclusion: FK Unila's SOCA examination of the odd semester of academic year 2014-2015 had sufficient content validity, cognitive complexity, and inter-rater reliability, but was lacking in construct validity and internal consistency.

 

Keywords: SOCA, validity, reliability









Copyright (c) 2017 Jurnal Pendidikan Kedokteran Indonesia; The Indonesian Journal of Medical Education

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
