
Synthesizing and Reporting Milestones-Based Learner Analytics

2020, Academic Medicine

https://doi.org/10.1097/ACM.0000000000002959

Abstract

Coordinating and operationalizing assessment systems that measure the fine-grained progression of residents at various stages of graduate medical training can be challenging. This article describes the development, administration, and psychometric analysis of a learner analytics system, the Scoring Grid Model, designed to resolve challenges in implementing milestones; the model was operationalized in an internal medicine (IM) residency program. A three-year longitudinal cohort of 34 residents at the University of Illinois at Chicago College of Medicine used this learner analytics system from entry (July 2013) to graduation (June 2016). Scores from 23 assessments administered throughout the three-year training were synthesized with the Scoring Grid Model to generate scores corresponding to the 22 reportable IM subcompetencies. A consensus model, drawing on feedback from IM faculty members and residents, was used to develop and pilot test the model. Scores from the scoring grid informed promotion decisions and the reporting of milestone levels. Descriptive statistics and mixed-effects regression were used to examine data trends and gather validity evidence. Initial validity evidence for content, internal structure, and relations to other variables is presented, including the composite score reliability of scores generated from the learner analytics system, for an approach that systematically integrates assessment scores aligned with the reportable milestones framework. The scoring grid provided fine-grained learner profiles and showed predictive utility in identifying low-performing residents.
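The abstract does not specify the exact weights or structure of the scoring grid. The sketch below, a minimal illustration only, assumes the common weighted-composite interpretation: each reportable subcompetency score is a weighted average of the contributing assessment scores, with the grid holding one row of weights per subcompetency. The dimensions and weights here are toy values, not the article's actual 23-assessment, 22-subcompetency grid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (the article uses 23 assessments and 22 subcompetencies).
n_assessments = 5
n_subcompetencies = 4

# Hypothetical scoring grid: rows = subcompetencies, columns = assessments.
# Each row is normalized to sum to 1, so every composite score is a
# weighted average of the assessment scores that feed it.
grid = rng.random((n_subcompetencies, n_assessments))
grid /= grid.sum(axis=1, keepdims=True)

# One resident's assessment scores, expressed on the 1-5 milestone scale.
scores = rng.uniform(1, 5, size=n_assessments)

# Composite subcompetency scores via the scoring grid.
composites = grid @ scores
```

Because the weights in each row are nonnegative and sum to 1, each composite necessarily falls within the range of the assessment scores that contribute to it, which keeps the synthesized values on the same milestone scale as the inputs.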
