Matches in SemOpenAlex for { <https://semopenalex.org/work/W3123528123> ?p ?o ?g. }
Showing items 1 to 75 of 75, with 100 items per page.
- W3123528123 endingPage "160" @default.
- W3123528123 startingPage "159" @default.
- W3123528123 abstract "In this issue of Academic Medicine, we have assembled a set of articles on emerging issues in assessment in medical education. The collection consists of systematic reviews, reports of empirical studies and surveys, scholarly Perspectives, and letters. The authors are learners, educators, and leaders (sometimes all three), and their work examines a wide range of considerations and challenges in assessment across the stages of medical education. Most of these articles came to our journal as spontaneous submissions, signaling interest in and the ever-increasing importance of assessment in medical education. The articles in this collection address topics in assessment that are surfacing in the current context of medical education. That context is uniquely complex and constellated because of the global pandemic, the greater attention being paid to professional well-being in medical education, and the relatively recent changes in grade reporting for national examinations. Recognition of inequity and evidence regarding bias in standardized testing have further heightened concerns regarding assessment in medical education. Viewed in the past as something of a curricular “afterthought,” assessment has evolved to become a rich area for inquiry, with new constructs for defining and evaluating competence, new and different approaches to testing, new emphasis on holistic evaluation, and new and highly informative applications of expertise from other fields, such as the quantitative sciences, humanities, cognitive neuroscience, and, most recently, artificial intelligence/machine learning. Assessment is integral to the learning process and is used as a means of ensuring accountability in medical education and society at large. For these reasons, it is no surprise that many of the articles in this collection call for greater focus, rigor, research, and intentionality in assessment in medical education. 
The Articles

In an extensive systematic review, Brydges et al 1 closely examined reports published over 18 years regarding competence-based medical education, finding that a large but “mixed evidence base, static assumptions, and limited research practices” are hampering advances in assessment and medical education research. Looking intensively at one discipline in postgraduate education, on the basis of a large systematic review of assessment approaches in training in the surgical specialties, Hanrahan et al 2 similarly note that current efforts to evaluate resident competence, strengths, and deficits are fragmented. Hanrahan et al state that published evidence is insufficient, of low quality, and typically based on small-scale initiatives. The authors conclude with a call for a paradigm shift in surgical education, entailing national and international collaboration “to optimize design and validation so that a comprehensive assessment of surgical competence can be implemented.” Reflecting on the impact of the pandemic on medical education, Hauer et al 3 articulate that there is an imperative to further embrace competence-based, rather than time-based, training objectives. Hauer et al reason that the field should expand assessment methods and prioritize “useful, meaningful assessment data” and outcomes, especially in relation to the transition from undergraduate to graduate training stages. This transition in training stages is also the focus of a report by Geraghty et al, 4 who describe six domains of “tension” identified by medical students involved in the early implementation of 13 Core Entrustable Professional Activities (EPAs) for Entering Residency. The student leaders who authored this piece came from 5 of 10 pilot institutions and emphasized the need for additional research to “explore the perspectives of students throughout the process of implementing the Core EPAs” in light of students’ role as “end users” of new curricula. 
In their Invited Commentary on reevaluating teacher and learner roles and responsibilities, Prober and Norden 5 ask that we place greater importance on interactions between faculty and students and peer-to-peer student collaborations and reemphasize “competence, communication, and compassion” in developing more attuned assessments. This theme of faculty and student interaction emerged in the report by Ingram et al, 6 who examined 3,947 completed evaluations by faculty from the internal medicine, pediatrics, and surgery clerkships at the University of Alabama at Birmingham School of Medicine. The investigators found that 5 characteristics predicted whether students would be recommended for “honors.” One of these characteristics (i.e., contact time with a supervisor) related to clerkship structure rather than explicit clinical competences. For this reason, the authors suggest that the structural elements of clerkships deserve our attention and that evaluation rubrics deserve rigorous “scrutiny.” Hernandez et al 7 performed a survey study that included responses from 110 of 134 internal medicine clerkship directors in the United States. The authors found that most programs rely on clinical performance assessments and the subject exam of the National Board of Medical Examiners to arrive at student grades. Clerkship directors expressed concerns about grade inflation, evaluation inconsistencies, and students’ emphasis on exam performance, which may detract from clinical learning. An overreliance on standardized testing may result in achievement gaps for some students, especially students who identify as belonging to groups underrepresented in medicine, as noted by Jones et al, 8 who argue that such practices contribute to discrimination in medical education. 
Clerkship grading was the focus of a novel article by Ryan et al, 9 who argue for a transition from grades to a federally regulated competence-based assessment model and development of a standardized letter to communicate accurately the competence, strengths, and weaknesses of students. Many challenges and concerns, some far-reaching in their scope and consequences, rest behind recommendations to reconsider assessment and grading in medical education. The unintended uncoupling of assessment from the goals of medical education and the recognition of bias and inequity in standardized testing have led to examination of the role of assessment in medical education and licensure. As in the past, the articles in this collection reinforce the continuing need to build more psychometrically robust approaches to assessment of specific clinical skills. Our authors raise other issues that are more technical or tactical in nature, such as the need to create more refined evidence-based evaluation tools and to develop assessments that may be conducted in remote learning situations due to the pandemic. Change is difficult, however, as illustrated in the report by McDonald et al, 10 who studied the impact of the elimination of tiered grades and the expansion of 1:1 feedback on core clerkships at the University of California, San Francisco, School of Medicine. Their investigation brought into clarity the many—and sometimes unexpected—effects of curricular change for both faculty and students, even when such change is embraced pedagogically and culturally. (See the AM Last Page by Palaganas and Edwards 11 in this issue for insights about approaching and avoiding common pitfalls in feedback conversations, and see the Perspective by Bearman et al 12 for a discussion of feedback processes that may be implemented in situations with little or no supervision.) 
The need to engage and support faculty during cultural change was also a theme in the Invited Commentary by van Loon and Scheele, 13 who emphasize empowerment of faculty as the key to effective educational innovation. Similarly, the extensive project by Ryan et al 14 provides validity evidence for assessment built on the reporter–interpreter–manager–educator framework and better-delineated faculty-related and student-related dimensions for this approach to assessment. As reflected in several articles in this collection, empirical study of assessment is valuable in that it can provide clues as to how to better or more effectively implement novel educational approaches, for example, through faculty engagement or empowerment and attention to students’ voices and recommendations. In addition, the value of creative approaches to assessment is illustrated in two reports in this collection. Chang et al 15 describe one of the first longitudinal studies of the progression of metacognition, critical thinking, and regulated learning strategies of medical students, with the intention of helping educators to strengthen the learning skills of all students as well as to identify students at risk of falling behind. In an Innovation Report, Patwari et al 16 describe their early experience using a diagnostic objective structured clinical examination to identify clinical reasoning and knowledge-based deficits in students who may have a disability requiring accommodations to support learning.

A Welcome Collection

The contribution of thoughtful and carefully derived articles on the topic of assessment from our colleagues across the field of academic medicine is most welcome. The editors of the journal are especially delighted that so many of the articles assembled here were co-authored by medical students and residents. 
The collection serves to highlight crucial and evolving issues at this moment in medical education and calls upon us to do more on behalf of our learners and our field." @default.
- W3123528123 created "2021-02-01" @default.
- W3123528123 creator A5056364052 @default.
- W3123528123 date "2021-01-27" @default.
- W3123528123 modified "2023-10-03" @default.
- W3123528123 title "Emerging Issues in Assessment in Medical Education: A Collection" @default.
- W3123528123 cites W3015689498 @default.
- W3123528123 cites W3018023220 @default.
- W3123528123 cites W3041023464 @default.
- W3123528123 cites W3044065237 @default.
- W3123528123 cites W3081598233 @default.
- W3123528123 cites W3082362098 @default.
- W3123528123 cites W3084082345 @default.
- W3123528123 cites W3084194547 @default.
- W3123528123 cites W3089458035 @default.
- W3123528123 cites W3090113099 @default.
- W3123528123 cites W3090926741 @default.
- W3123528123 cites W3092128826 @default.
- W3123528123 cites W3092538434 @default.
- W3123528123 cites W3095174594 @default.
- W3123528123 cites W3096130180 @default.
- W3123528123 cites W3097218075 @default.
- W3123528123 doi "https://doi.org/10.1097/acm.0000000000003855" @default.
- W3123528123 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33492817" @default.
- W3123528123 hasPublicationYear "2021" @default.
- W3123528123 type Work @default.
- W3123528123 sameAs 3123528123 @default.
- W3123528123 citedByCount "3" @default.
- W3123528123 countsByYear W31235281232022 @default.
- W3123528123 countsByYear W31235281232023 @default.
- W3123528123 crossrefType "journal-article" @default.
- W3123528123 hasAuthorship W3123528123A5056364052 @default.
- W3123528123 hasBestOaLocation W31235281231 @default.
- W3123528123 hasConcept C120912362 @default.
- W3123528123 hasConcept C133462117 @default.
- W3123528123 hasConcept C144024400 @default.
- W3123528123 hasConcept C15744967 @default.
- W3123528123 hasConcept C17744445 @default.
- W3123528123 hasConcept C199539241 @default.
- W3123528123 hasConcept C2779473830 @default.
- W3123528123 hasConcept C36289849 @default.
- W3123528123 hasConcept C509550671 @default.
- W3123528123 hasConcept C71924100 @default.
- W3123528123 hasConceptScore W3123528123C120912362 @default.
- W3123528123 hasConceptScore W3123528123C133462117 @default.
- W3123528123 hasConceptScore W3123528123C144024400 @default.
- W3123528123 hasConceptScore W3123528123C15744967 @default.
- W3123528123 hasConceptScore W3123528123C17744445 @default.
- W3123528123 hasConceptScore W3123528123C199539241 @default.
- W3123528123 hasConceptScore W3123528123C2779473830 @default.
- W3123528123 hasConceptScore W3123528123C36289849 @default.
- W3123528123 hasConceptScore W3123528123C509550671 @default.
- W3123528123 hasConceptScore W3123528123C71924100 @default.
- W3123528123 hasIssue "2" @default.
- W3123528123 hasLocation W31235281231 @default.
- W3123528123 hasLocation W31235281232 @default.
- W3123528123 hasOpenAccess W3123528123 @default.
- W3123528123 hasPrimaryLocation W31235281231 @default.
- W3123528123 hasRelatedWork W1965802029 @default.
- W3123528123 hasRelatedWork W1999407557 @default.
- W3123528123 hasRelatedWork W2073393242 @default.
- W3123528123 hasRelatedWork W2534774209 @default.
- W3123528123 hasRelatedWork W2748952813 @default.
- W3123528123 hasRelatedWork W2899084033 @default.
- W3123528123 hasRelatedWork W2972513998 @default.
- W3123528123 hasRelatedWork W3031052312 @default.
- W3123528123 hasRelatedWork W3032375762 @default.
- W3123528123 hasRelatedWork W4386157523 @default.
- W3123528123 hasVolume "96" @default.
- W3123528123 isParatext "false" @default.
- W3123528123 isRetracted "false" @default.
- W3123528123 magId "3123528123" @default.
- W3123528123 workType "article" @default.
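The listing above is the result of the quad pattern `{ <https://semopenalex.org/work/W3123528123> ?p ?o ?g. }`. A minimal sketch of reproducing such a query programmatically follows; it assumes a public SemOpenAlex SPARQL endpoint at `https://semopenalex.org/sparql` and standard SPARQL 1.1 JSON results (the endpoint URL and result shape are assumptions, not part of the listing).

```python
import json
import urllib.parse
import urllib.request

# Assumed public SPARQL endpoint for SemOpenAlex; verify before relying on it.
ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_id: str) -> str:
    """Build the quad-pattern query shown in the listing for one work IRI."""
    iri = f"https://semopenalex.org/work/{work_id}"
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{iri}> ?p ?o . }} }}"


def parse_bindings(results_json: str) -> list[tuple[str, str]]:
    """Extract (predicate, object) value pairs from SPARQL JSON results."""
    data = json.loads(results_json)
    return [(b["p"]["value"], b["o"]["value"])
            for b in data["results"]["bindings"]]


def fetch_triples(work_id: str) -> list[tuple[str, str]]:
    """Run the query against the endpoint (requires network access)."""
    url = ENDPOINT + "?" + urllib.parse.urlencode({"query": build_query(work_id)})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"})
    with urllib.request.urlopen(req) as resp:
        return parse_bindings(resp.read().decode("utf-8"))
```

For example, `fetch_triples("W3123528123")` would return pairs such as `(".../title", "Emerging Issues in Assessment in Medical Education: A Collection")`, matching the rows above; `parse_bindings` is separated from the network call so the result handling can be tested offline.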