
ACGME Issues

Measurement of the General Competencies of the Accreditation Council for Graduate Medical Education: A Systematic Review
Stephen J. Lurie, MD, PhD, Christopher J. Mooney, MA, and Jeffrey M. Lyness, MD

Abstract
Purpose
To evaluate published evidence that the Accreditation Council for Graduate Medical Education’s six general competencies can each be measured in a valid and reliable way.

Method
In March 2008, the authors conducted searches of Medline and ERIC using combinations of search terms “ACGME,” “Accreditation Council for Graduate Medical Education,” “core competencies,” “general competencies,” and the specific competencies “systems-based practice” (SBP) and “practice based learning and improvement (PBLI).” Included were all publications presenting new qualitative or quantitative data about specific assessment modalities related to the general competencies since 1999; opinion pieces, review articles, and reports of consensus conferences were excluded. The search yielded 127 articles, of which 56 met inclusion criteria. Articles were subdivided into four categories: (1) quantitative/psychometric evaluations, (2) preliminary studies, (3) studies of SBP and PBLI, and (4) surveys.

Results
Quantitative/psychometric studies of specific evaluation tools failed to develop measures reflecting the six competencies in a reliable or valid way. Few preliminary studies led to published quantitative data regarding reliability or validity. Only two published surveys met quality criteria. Studies of SBP and PBLI generally operationalized these competencies as properties of systems, not of individual trainees.

Conclusions
The peer-reviewed literature provides no evidence that current measurement tools can assess the competencies independently of one another. Because further efforts are unlikely to be successful, the authors recommend using the competencies to guide and coordinate specific evaluation efforts, rather than attempting to develop preliminary instruments to measure the competencies directly.

Acad Med. 2009;84:301–309.

Dr. Lurie is director of assessment, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York.
Mr. Mooney is information analyst, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York.
Dr. Lyness is director of curriculum, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York.
Correspondence should be addressed to Dr. Lurie, University of Rochester School of Medicine and Dentistry, 601 Elmwood Ave, Box 601, Rochester, NY 14642; telephone: (585) 273-4323; e-mail: Stephen_Lurie@urmc.rochester.edu.

In February 1999, the Accreditation Council for Graduate Medical Education (ACGME), which is responsible for accrediting all U.S. clinical residency and fellowship programs, unveiled its Outcome Project.1 This 10-year plan began with a consensus process that defined six general competencies (patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice) thought to be common to physicians training in all specialties.

This new model was, at least in part, a reaction to a widespread feeling that “medical education seemed to be mired in legions of new requirements,” resulting in “a geometric increase in the number of ‘musts’ and ‘shoulds’ facing the director of a GME [graduate medical education] program.”2 By contrast, an accreditation model based on general competencies was predicted to “invite creative responses to a challenge rather than prescribing a narrow set of particular responses.”2

The long-term goal of the Outcome Project is to develop a new model of accreditation based on defining outcomes linked to the six general competencies. Furthermore, because the Outcome Project was created in conjunction with the American Board of Medical Specialties, there is the potential for this model of certification to be extended to ongoing accreditation of U.S. physicians throughout their careers.

Having defined the six general competencies in a series of discussions with representatives of its constituent organizations, the ACGME then invited program directors to define specific behaviors that would reflect the general competencies in their own specialties. One goal of this project was that appropriate measures of the general competencies would be derived from the needs and insights of those most directly involved in GME, rather than imposed from above by centralized ACGME leadership. Ultimately, it was hoped that such appropriate specification of the general competencies would lead to more rigorous assessment methods: “Program and Institutional Requirements (will) . . . require programs to use increasingly more useful, reliable, and valid methods of assessing residents’ attainment of these competency-based objectives.”1

As a part of this process, the ACGME expected that the Outcome Project would provide a “new challenge for program directors” to “encourag(e) the use of evidence and measurement in the redesign of GME [graduate medical education].” Furthermore, this would help program directors in that “heretofore their work was viewed as administrative rather than academic, and they were often unsuccessful when they appeared before promotion and tenure committees.” The authors concluded that this “legitimate knowledge-building agenda” would “ultimately result . . . in peer-reviewed publications.”2

According to the Outcome Project timeline,3 the goal of Phase Two of the project (which was to have occurred between July 2002 and June 2006) was to have involved “sharpening the focus and definition of the [core] competencies and assessment tools.” This would then set the stage for Phase Three (July 2006 through June 2011), the goal of which is to achieve “full integration of the [core] competencies and their assessment with learning and clinical care.”3

The Outcome Project has led to vast changes in evaluative strategy affecting every U.S. postgraduate medical training program (and potentially every practicing U.S. physician). Yet, it remains unclear to what degree the Outcome Project has achieved its stated Phase Two goals of measuring the six general competencies. This is a timely issue, not only because sufficient time has elapsed since the end of Phase Two for resulting literature to appear in print but also because the success of Phase Three seems to depend, at least in part, on the project having reached its Phase Two goals.

Although the ACGME has published an online toolbox of assessment methodologies,4 including general psychometric properties of these tools, the document does not comment on how the tools relate to the core competencies. Thus, we sought to evaluate the evidence about whether the six general competencies can currently be measured independently of one another in a valid and reliable way. Indeed, if the six core competencies cannot be measured independently of one another, there would be little practical utility in specifying them as independent criteria of competence. In their description of assessment of the core competencies, the ACGME requires “use of dependable measures to assess residents’ competence in patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice.” In addition to assessing individual residents, programs are also expected to “use resident performance and outcome assessment results in their evaluation of the educational effectiveness of the residency program.”5 This language implies that the competencies can be measured, at least to some degree, independently of one another for purposes of evaluation.

Assessment of the core competencies has become an immediately pressing issue for residency directors, who must demonstrate attainment of these competencies by their trainees. The concept of competency-based assessment has also been gaining ground both in undergraduate medical education and as a central aspect of ongoing board certification of practicing physicians.6 Thus, we felt that it was timely to address the question of the reliability and validity with which these competencies can be directly assessed by current measurement tools.

As a secondary question, we sought to evaluate the literature on the two newly defined competencies—systems-based practice (SBP) and practice-based learning and improvement (PBLI). Because these latter two competencies were a particularly innovative aspect of the Outcome Project and did not exist in a formally stated way before 1999, we felt that a complete review of these specific competencies would be achievable within the scope of our study and would further shed light on the new achievements of the Outcome Project. By contrast, the other four competencies have been discussed by medical educators for decades, and each has its own respective and vast literature.

Finally, we sought to assess the nature of the peer-reviewed literature that has addressed the general competencies. This question addresses the ACGME’s goal that the Outcome Project would lead to new intellectual activity and publications.

Method

We searched Medline and ERIC using combinations of the search terms “ACGME,” “Accreditation Council for Graduate Medical Education,” “general competencies,” and “core competencies.” We also searched on the terms “systems-based practice” and “practice based learning and improvement” for publications appearing from 1999 until March 2008.
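For readers who want to see how such term combinations translate into an executable query, the sketch below shows one way the Medline portion of this strategy could be expressed against the public NCBI E-utilities interface. It is illustrative only and is not the search code used in this study; the date limits follow the text, and the ERIC search (which uses a separate interface) is omitted.

```python
# Minimal sketch of the boolean term combinations described above, issued
# against PubMed/Medline via the public NCBI E-utilities esearch endpoint.
import json
import urllib.parse
import urllib.request

CONCEPT_TERMS = [
    '"ACGME"',
    '"Accreditation Council for Graduate Medical Education"',
    '"core competencies"',
    '"general competencies"',
    '"systems-based practice"',
    '"practice based learning and improvement"',
]

def pubmed_count(term: str) -> int:
    """Return the number of PubMed records matching a query, limited to 1999 through March 2008."""
    query = f'({term}) AND ("1999/01/01"[PDAT] : "2008/03/31"[PDAT])'
    url = (
        "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
        + urllib.parse.urlencode({"db": "pubmed", "term": query, "retmode": "json", "retmax": 0})
    )
    with urllib.request.urlopen(url) as response:
        return int(json.load(response)["esearchresult"]["count"])

if __name__ == "__main__":
    combined = " OR ".join(CONCEPT_TERMS)
    print("Combined query hits:", pubmed_count(combined))
```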
We then reviewed reference lists of initially identified studies for any studies that were missed by our search. We included publications that presented descriptions of assessment modalities that had been used in specified samples. Because of the very diverse nature of this literature, we felt it would have been inappropriate to have been overly restrictive in this criterion. In addition to including studies that presented psychometric data, we included studies that simply provided narrative accounts. Studies were included if the authors explicitly stated that their aim was to develop and test an assessment tool as it related to the ACGME core competencies and if the article presented any kind of result based on previously unpublished experience in a specific sample. Thus, our inclusion criterion allowed us to exclude opinion pieces, review articles, and reports of consensus conferences because none of these are based on new data relating to the performance of specific tests. We also excluded studies that were not published in peer-reviewed journals.

Results

Our search yielded 127 articles, of which 56 met our inclusion criteria. Because of the exploratory nature of this study, we did not have preconceived ideas of how to organize these studies. Based on our review of the content of these articles, the following four categories emerged as most reflective of the articles that we found: (1) quantitative/psychometric evaluation of the six general competencies, (2) preliminary studies of the general competencies, (3) studies specifically about SBP and PBLI, and (4) surveys or grids about the general competencies.

Because we had no a priori sense of the kinds of studies we would encounter in the review, we were unable to develop a prespecified quality index. In the cases of survey studies, we judged the quality of work according to the following three traditional criteria: (1) Was there a clear description of the sampling strategy? (2) Did the sample represent a nationally representative sample of the population of interest rather than a local or convenience sample? and (3) Was the response rate at least 60%?
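As a purely illustrative aid, the following sketch encodes these three criteria as a simple check applied to hypothetical survey records. The field names are invented; only the 60% response-rate threshold and the example counts (287 of 444 program directors responding) come from this article.

```python
# Illustrative only: applying the three survey-quality criteria to a record.
from dataclasses import dataclass

@dataclass
class SurveyStudy:
    describes_sampling_strategy: bool   # criterion 1
    nationally_representative: bool     # criterion 2 (vs. a local or convenience sample)
    responses: int
    invited: int

    def meets_quality_criteria(self) -> bool:
        response_rate = self.responses / self.invited
        return (
            self.describes_sampling_strategy
            and self.nationally_representative
            and response_rate >= 0.60       # criterion 3
        )

# Example: 287 of 444 respondents (65%) satisfies the response-rate criterion.
print(SurveyStudy(True, True, 287, 444).meets_quality_criteria())  # True
```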

Finally, we discovered that a number of publications have provided grids describing, in various specialties, which assessment methods would in principle be expected to reflect which of the six general competencies. We examined these grids to assess whether their conclusions were similar to the findings of the studies we reviewed.

Quantitative/psychometric studies of evaluation tools of the six general competencies

Global rating forms. Summary rating forms, which allow faculty to assess trainees’ abilities over multiple occasions, are probably the most ubiquitous assessment tools in residency programs. We identified five studies that specifically evaluated the ability of global rating forms to assess the six general competencies (Table 1). These studies have relatively large numbers of participants, which perhaps reflects the widespread nature of these assessment tools. In the largest of these, Silber et al7 derived items on a global rating scale directly from the language of the general competencies. These authors then determined the scale’s structure based on a sample of nearly 1,300 residents. They found that the 23 items on the scale clustered into the two dimensions of medical knowledge and interpersonal skill rather than the six general competencies on which the items were based.

In general, the other four studies also support the conclusion that evaluators cannot distinguish trainees’ levels of attainment of the six general competencies in a global rating scale. When individual core competency scores were computed from a global rating form, all six of these scores were significantly related to a written exam,8 suggesting that the scores were also significantly correlated with one another. Another study9 found that derived scores (which were related to some but not all of the general competencies) were all significantly correlated with one another. Finally, Reisdorff et al10 found that all six core-competency scores improved with level of training. In a follow-up analysis,11 these authors reported that each of the six subscales seemed to be unidimensional in their factor structures, although there was considerable variability across the six competencies. The authors did not analyze the factor structure of all the items considered together, and thus they did not address the extent to which the six scales shared common variance.
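The following minimal sketch illustrates, on simulated data, the two analyses this discussion refers to: inspecting the eigenvalues of the pooled inter-item correlation matrix and examining correlations among the six derived subscale scores. The data, item names, and competency labels are invented for illustration; the simulation deliberately builds in a single common factor, so it reproduces the pattern the reviewed studies report (one dominant eigenvalue and highly correlated subscales).

```python
import numpy as np
import pandas as pd

# Hypothetical example: 'ratings' holds one row per resident and one column per
# rating-form item; 'item_map' assigns each item to one of the six competencies.
rng = np.random.default_rng(0)
n_residents, n_items = 300, 24
ratings = pd.DataFrame(
    rng.normal(size=(n_residents, 1)) + rng.normal(scale=0.5, size=(n_residents, n_items)),
    columns=[f"item_{i}" for i in range(n_items)],
)
competencies = ["PC", "MK", "PBLI", "ICS", "PROF", "SBP"]
item_map = {col: competencies[i % 6] for i, col in enumerate(ratings.columns)}

# 1. Eigenvalues of the pooled inter-item correlation matrix: if one eigenvalue
#    dominates, the items behave as a single dimension rather than six.
eigenvalues = np.linalg.eigvalsh(ratings.corr().to_numpy())[::-1]
print("Leading eigenvalues:", np.round(eigenvalues[:6], 2))

# 2. Correlations among the six derived competency subscale scores: high values
#    indicate that the subscales share most of their variance.
subscales = pd.DataFrame(
    {c: ratings[[i for i, m in item_map.items() if m == c]].mean(axis=1) for c in competencies}
)
print(subscales.corr().round(2))
```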
360-degree evaluations. In principle, evaluation by colleagues and coworkers provides feedback from persons who may directly observe one another’s actual daily behaviors. The method may be further refined by framing the questions in terms of the six core competencies. We identified six studies that provided statistical analysis of such assessment tools (Table 1). These relatively small studies do not provide support for the idea that 360-degree evaluations can be used to distinguish individuals’ levels of attainment of the six general competencies. Two of the studies found that all the items clustered on a single factor,12,13 whereas another14 found that the items separated into three factors that were not related in a simple way to the six general competencies. One study15 found that residents and attending physicians had little agreement on ratings of residents’ competencies. The other two studies16,17 did not explicitly look at the degree of concordance between the items and the general competencies.

Direct observation. We found only two studies that directly assessed how well faculty are able to rate learners’ general competencies by observing them in specific situations (Table 1). Neither provides compelling evidence that this sort of instrument can be used to assess the general competencies in a valid way. The first18 found that faculty were able directly to observe fewer than 7% of residents’ behaviors in a naturalistic setting. In the second study,19 raters observed a standardized video of a resident, whom they then rated on the general competencies. No data were presented on the degree to which the derived competency scores were related to one another.

Portfolios. The ACGME has recently launched a project to introduce portfolios into assessment of residents.20 A portfolio comprises a series of documents that chronicle a learner’s evolving competence. Thus, portfolios are appealing not only as summative evaluation tools but also because of the ways that they might guide learners to seek out experiences to help them to develop specific competencies. We were not able to identify any studies of portfolios that specifically sought to measure the ACGME general competencies. Nonetheless, the relatively small literature on portfolios suggests that portfolio scores will not be straightforward to interpret. In their systematic review of studies of portfolios, Carraccio and Englander21 concluded that “Evidence to date, in studying unstructured portfolios, has demonstrated the difficulty in achieving what is typically considered acceptable standards of reliability and validity in educational measurement.” In a subsequent psychometrically rigorous study of a structured portfolio, O’Sullivan et al22 found that raters had good reliability for judging the overall quality of a portfolio but poor agreement on specific topics (which were derived specifically for psychiatry residents and were, thus, more specific than the ACGME general competencies).

Preliminary studies

We identified 18 peer-reviewed publications that described development or pilot studies of specific assessment tools but that did not provide any quantitative data relating to the tool’s reliability or validity.23–40 Although these articles do not address the reliability or validity of their respective measurement tools, we included them to fully characterize the current state of the literature. All are narrative studies with substantial methodological limitations, including very small sample sizes, lack of quantitative data, or atypical populations.

We discovered that three of these articles resulted in later quantitative follow-up studies.27,35,36 We then contacted each corresponding author of the remaining studies and enquired whether he or she had any plans to further study the tool in terms of reliability or validity. We received responses from 11 of 15 authors, all of whom told us that they had no plans to further study the instrument they had described.

SBP and PBLI

We identified 14 studies that specifically addressed initiatives to assess the ACGME-defined competencies of SBP and PBLI (Table 2).26,41–53 Because many of these studies stated that they aimed to assess both competencies, we did not further subdivide the studies according to the two competencies. As shown in Table 2, eight of these26,41–47 involved author-defined quality improvement projects. For each of these studies, the dependent measure was a relevant clinical outcome rather than assessment of participants. The other six studies48–53 presented a curriculum or elective opportunity and then measured participants’ self-reported confidence or knowledge.

Table 1
Published, Peer-Reviewed Quantitative/Psychometric Evaluation Studies of the Accreditation Council for Graduate Medical Education General Competencies, 1999–2008*

Authors | Subjects | Institution | Measures and results

Global rating forms
Silber et al (2004)7 | 1,295 residents at a single institution | Thomas Jefferson/Albert Einstein | Items derived from the language of the six competencies clustered into two dimensions—medical knowledge and interpersonal skills
Reisdorff et al (2003)10 | 150 emergency medicine residents | Several programs in Michigan | Scores for all six competencies improved between year one and year three. No analysis of relationships among competencies
Reisdorff et al (2004)11 | 150 emergency medicine residents | Several programs in Michigan | Within each of the six competency areas, each group of items had a single major eigenvalue. Items were not combined for an overall analysis
Brasel et al (2004)9 | 36 surgical residents | University of Wisconsin | Factor analysis of preexisting evaluation tool yielded four factors that correspond to four of the six competencies. The four derived scores had correlations ranging from 0.64 to 0.75
Tabuenca (2007)8 | 332 general surgery residents | Multiple institutions | Scores for all six competencies increased with increasing year of training. All six scores correlated significantly with USMLE and in-training exams. There was no analysis of relationship among competency scores

360-degree evaluations
Musick et al (2003)17 | 18 PM & R† residents | University of Pennsylvania | Descriptive statistics about item means; no analysis of how items relate to competencies
Higgins et al (2004)16 | 6 cardiothoracic surgery residents | Rush Presbyterian–St Luke’s | No statistical comparison between the six competency scores. Residents improved on all competencies over time
Massagli et al (2007)12 | 56 PM & R† residents | University of Washington | All items were clustered on a single factor, rather than six
Weigelt et al (2004)13 | 10 residents on trauma service | Medical College of Wisconsin | Average scores across competencies were highly similar. Different raters were unable to distinguish competencies
Roark et al (2006)15 | 26 otolaryngology residents | Consortium of four New York City hospitals | Compared faculty versus peer ratings of six competencies. There were significant correlations for three of the six competencies, but none were significant if corrected for multiple comparisons
Rosenbaum et al (2005)14 | 21 family medicine faculty | University of Iowa | Items derived from the six ACGME competencies yielded acceptable subscale reliability. Factor analysis revealed only three subscales, which were unrelated to the six competencies

Direct observation
Chisholm et al (2004)18 | 106 emergency medicine residents | Indiana University | In a natural setting, between 3.6% and 6% of resident behaviors were directly observed by faculty
Shayne et al (2006)19 | 82 emergency medicine faculty | 16 academic emergency medicine programs | Faculty observed two simulated videos and rated them on five of the six competencies. Raters were internally consistent for all five scales. There was no analysis of relationships among competencies

* See the Method section for a description of inclusion criteria.
† PM & R, physical medicine and rehabilitation.


Table 2
Published, Peer-Reviewed Studies of the Accreditation Council for Graduate Medical Education Competencies of Systems-Based Practice and Practice-Based Learning and Improvement, 1999–2008*

Authors | Subjects | Institution | Intervention | Outcome

Coleman et al (2003)26 | 3 teams of residents and attending physicians in family medicine | University of Louisville | Team-specific documentation projects: completion of medication lists, microalbumin screening, and completion of summary sheets | Outcome measure improved on all three projects
Canal et al (2007)41 | 15 surgical residents | Indiana University | Curriculum focusing on deriving a quality-improvement project | Self-reported knowledge, creation of four quality improvement projects
Englander et al (2006)42 | Numbers of patients or residents not reported | Connecticut Children’s Medical Center | Residents identified a barrier to use of a standardized lab-ordering machine | Increased use of machine
Frey et al (2003)43 | 12 family medicine residents | Mayo Clinic Scottsdale | Individual projects in senior year | Self-reported knowledge
Miller et al (2006)44 | 110 patients with ventilator-associated pneumonia | Wake Forest University | Derivation of an institution-specific treatment algorithm to improve initial empiric treatment | Rate of appropriate prescribing increased
Mohr et al (2003)45 | Improvement teams (including 8 residents) in pediatrics | University of Chicago | Identification of five changes in clinic process | Significantly increased immunization rates
Palonen et al (2006)46 | 70 residents in internal medicine and med-peds | University of Alabama | Comparison of chart review versus patient surveys to estimate rates of five clinical behaviors | Both methods yielded similar estimates
Paukert et al (2003)47 | 26 residents and 3 faculty in family practice | University of Texas | Residents and faculty audited 1005 charts | Documentation of preventive health services increased during the study
Rivo et al (2004)48 | Third- and fourth-year medical students (total number not reported) | Consortium of eight medical schools† | Various curricula | Self-reported behaviors relating to systems-based practice
Siri et al (2007)49 | 4 groups of residents | University of Florida | Residents completed recommendations for four aspects of preoperative care | Self-reported satisfaction
Staton et al (2007)50 | 347 patients with diabetes in a general internal medicine outpatient clinic | East Carolina University | Chart review by residents to improve adherence with foot examination | Improved adherence
Thomas et al (2005)51 | 46 internal medicine residents | Mayo Clinic Rochester | Nonrandom assignment to conference versus small-group discussion of EBM versus no intervention | Small-group participants scored higher on a skills test and self-assessed knowledge
Tomolo et al (2005)52 | 45 internal medicine residents | Cleveland Veterans Affairs Hospital | Residents completed an outcomes card documenting medical errors | Acceptable interrater reliability for identifying types of errors
Weingart et al (2004)53 | 26 internal medicine residents | Beth Israel Deaconess | 3-week elective in quality improvement | Self-reported knowledge

* See the Method section for a description of inclusion criteria.
† Names of schools not explicitly described in article.

Other studies about the general competencies

Surveys. We identified 11 published studies that described surveys about the ACGME competencies with varying samples and response rates (Table 3).54–64 These studies are difficult to summarize because of differences in methodology and populations studied. Only two of these studies met all three of our quality criteria for surveys.56,60 In the first of these,56 family medicine program directors consistently rated SBP and PBLI as their lowest educational priorities. Similar rankings of self-rated competency were found among physicians who had completed an allergy and immunology fellowship in the United States between 1995 and 2000.60

Grids. We identified seven publications in which authors developed grids that cross-referenced available assessment tools with the six competencies.31,38,65–69 In general, the purpose of these publications is to develop a checklist of which general competencies can reasonably be assessed with which methodologies. In every case, multiple assessment methods mapped onto multiple general competencies. Thus, at a conceptual level, it did not seem that experts were able to define measurement tools that uniquely capture the general competencies, or general competencies that are unique to assessment methods.


Table 3
Published, Peer-Reviewed Survey Studies of the Accreditation Council for Graduate Medical Education General Competencies, 1999–2008*

Authors | Sample | Response rate, no. (%)

Cogbill et al (2005)54 | Psychiatry residents at University of Arkansas | 16 of 23 (70)
Collins et al (2004)55 | All U.S. radiology program directors | 99 of 192 (52)
Delzell et al (2005)56 | All U.S. family medicine program directors | 287 of 444 (65)
Heard et al (2002)57 | Program directors at the University of Arkansas | 24 of 47 (51)
Johnson and Barratt (2005)58 | Pediatric continuity clinic preceptors | 336 of 2,378 (14)
Joyner et al (2005)59 | Urology program directors | 105 of 119 (88)
Li et al (2003)60 | Physicians completing U.S. allergy fellowship 1995–2001 | 253 of 373 (68)
Lynch et al (2003)61 | National sample of family physicians | 1,228 of 2,363 (54)
Michels et al (2007)62 | Ophthalmologists in the Pacific Northwest | 147 of 676 (22)
Stiles et al (2006)63 | Surgical residents at a single institution | 25 of 25 (100)
Wald et al (2007)64 | Undergraduate emergency medicine clerkship directors | 92 of 132 (70)

* See the Method section for a description of inclusion criteria.

Discussion

We find that the literature to date has not yielded any method that can assess the six ACGME general competencies as independent constructs. Rather, all currently available measurement tools generally yield a single dimension of overall measured competency or, sometimes, several measured dimensions that do not relate to the competencies in a simple manner. This lack of simple correspondence between the general competencies and measurement is mirrored in the several published attempts to conceptually map the general competencies onto observable behaviors—such attempts consistently yield grids in which all possible measurable behaviors map onto three or more of the general competencies. Scores obtained by any of the currently available assessment tools represent various admixtures of the underlying hypothetical general competencies. That is, it currently does not seem possible to “measure the competencies” independently of one another in any psychometrically meaningful way.

In terms of our goal of characterizing the existing literature that has grown up around the ACGME competencies, we find that only 13 of the 127 (10%) published studies presented any psychometric data on assessment tools. Another 14 of these (11%) presented descriptions of interventions to assess PBLI or SBP, although not all of these were psychometrically rigorous. Of the 127 studies we identified, 18 presented preliminary data, although 15 of them (12%) did not have any subsequent follow-up publications. Finally, 11 of these studies (9%) presented survey data, although few of them met rigorous standards. The remaining 71 publications (56%) represent consensus conferences, editorials, thought pieces, etc.

The exception to this challenge of measuring competencies seems to be “medical knowledge.” This competency is generally measured with written examinations in which the examinee answers a series of standardized questions that assess factual knowledge. Recently, this approach has been expanded with the use of script-concordance tests, which offer examinees a series of choices that attempt to mirror real-world decision making, and in which examinees’ scores are determined by their degree of concordance with the responses of a reference panel of medical experts.70–73 Because this technique assesses application of knowledge in typical clinical conditions of uncertainty, these tests seem to fulfill the ACGME’s requirement that trainees demonstrate “application of knowledge to patient care.” Furthermore, it has been shown that, in a large sample of physicians, paper-and-pencil tests of knowledge have significant relationships to later markers of quality of clinical care.74,75 Thus, these measures, which reliably assess medical knowledge, also seem to be valid predictors of important later clinical behaviors. Much of this success seems to be a reflection of the way that “medical knowledge” is composed of a very large series of identifiable facts and relationships among facts, the veracity of which can be independently assessed.
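As an illustration of the scoring principle described above, the sketch below implements the aggregate (panel-concordance) scoring commonly used for script concordance items, in which credit for a response is proportional to the number of reference-panel members who chose it. The panel data are invented; this is a generic sketch of the approach, not code from the cited studies.

```python
# Illustrative sketch of aggregate ("concordance") scoring for a script
# concordance item: an examinee earns partial credit in proportion to how many
# reference-panel experts chose the same response, relative to the modal response.
from collections import Counter

def item_score(examinee_choice: int, panel_choices: list[int]) -> float:
    """Credit for one item: panel votes for the examinee's choice / modal vote count."""
    votes = Counter(panel_choices)
    return votes.get(examinee_choice, 0) / max(votes.values())

# Example item: responses on a -2..+2 Likert scale; 10 invented panelists.
panel = [1, 1, 1, 1, 1, 0, 0, 2, 2, -1]
print(item_score(1, panel))   # 1.0  (matches the modal panel response)
print(item_score(2, panel))   # 0.4  (2 of 10 panelists chose +2; modal count is 5)
print(item_score(-2, panel))  # 0.0  (no panelist chose -2)
```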


By contrast, the other five competencies reflect, in varying degrees, personal attributes of trainees rather than knowledge of objectively derived information. Furthermore, the relative values of these attributes are more socially and culturally determined than are the abilities comprising “medical knowledge.” Thus, to date, these competencies have proven considerably more challenging to quantify in a reliable and valid way. Although we did not systematically survey the literature on these additional competencies, each has been the subject of several prior review articles, which we believe are helpful for providing additional context. For instance, the construct of “professionalism,” which predated the ACGME general competencies, has continued to defy a clear operational definition despite several decades of attempts to derive one. In addition to deep philosophical differences over the various possible meanings of the term “professionalism,” the inherent challenges of measurement and psychometric analyses add additional layers of uncertainty. In her systematic review of measurement of professionalism, Arnold76 concluded that “interrater agreement on humanistic terms can be particularly low.” Even if raters could agree on how to judge particular items relating to such a high-order construct as “professionalism,” relationships among items seem unstable; depending on the measurement tool chosen, a purely empirical definition of “professionalism” may contain as few as three subscales77,78 or as many as seven79 or eight.80 Thus, at a measurement level, the meaning of “professionalism” becomes mired in the technical minutiae of psychometric analysis, irrespective of any philosophical beliefs about the nature of the construct itself.

On the basis of our results, we suspect that such concerns will likely continue to thwart attempts at measurement of the other general competencies as well. This is not because the general competencies are, in any sense, “incorrect”; rather, it is a reflection of the Outcome Project’s assumption that the general competencies, once defined, would reveal themselves in a straightforward fashion through measurement. It will remain a challenge to develop objective measures that correspond neatly to these generalized educational constructs. In addition to disagreements over theoretical issues, measurement of human behaviors is subject to a host of nontheoretical biases and technical challenges, including the well-known psychometric problems of method variance, observer biases, expectation and contextual effects, logistical constraints, and random error.

It would be unfortunate, however, if these failures of quantification were to lead to cynicism about the general competencies or to the conclusion that such principles are of no practical value. As initially conceived by the leadership of the ACGME, the general competencies were meant as a response to “overspecification” of training and assessment requirements. Although we agree with this concern in principle, we feel that the problem was not so much overspecification (because measurement requirements must always be stated with some specificity) but, rather, a lack of coherent specification. Without an overarching set of principles, a list of detailed requirements runs the risk of seeming random and arbitrary. Thus, the general competencies could have an invaluable role in guiding assessment strategy as long as it is clear that the six general competencies themselves exist in a realm outside of measurement. What remains missing from the Outcome Project, in our view, is an explicitly stated set of expectations that would link the ideals of the general competencies to the realities of measurement. Thus, a next step in development of an overall theory of assessment would not be to abandon the general competencies but, rather, to explicitly develop a more fully elaborated model to rationalize and prioritize various assessment tools in light of the general competencies. Although it is possible that such a measurement model could arise from the kind of grassroots effort proposed in the Outcome Project, we suspect that this will need to come from further consensus and deliberation by the ACGME and its constituent organizations.

As one contribution to the development of such a model, we find that the two newer ACGME competencies—SBP and PBLI—are viewed by many authors as representing aspects of health systems and teams rather than those of particular individuals. Thus, it is possible that environmental variables may exert significant influence on trainees’ actual behaviors surrounding these competencies. It is possible, for instance, that a trainee with relatively good understanding of systems-based issues may nonetheless seem to perform poorly when placed in a practice environment that hinders good communication among caregivers. Further refinements of the operational definitions of these competencies should include measures of health systems in addition to any measures of individuals.

Our study has several limitations. First, we did not assess conference presentations, posters, or other unpublished material. We recognize that much communication among program directors, as well as between program directors and the ACGME, occurs on this informal, face-to-face level. It is possible that we may have missed important additional information that was communicated in this way. Nonetheless, we deliberately chose not to examine such material in light of the ACGME’s stated intent that the competencies would result in enhanced scientific activity, which implies publication and peer review. Second, because of the ongoing nature of the Outcome Project, it is possible that our review failed to reflect studies that may be currently ongoing. We suspect, however, that we did not miss a significant number of these because we contacted all authors who had previously published preliminary or pilot descriptions of assessment projects. Finally, we did not consult officials of the ACGME in preparing our review.

Despite these difficulties, we recognize that attention to the six ACGME competencies has already led to some of their intended benefits. For example, many residency programs now have additional curricular time and effort devoted to areas such as interpersonal and communications skills, which were previously perceived to be lacking in many training programs. Future assessment methodologies should incorporate these beneficial attributes while striving to define assessments that can be measured reliably and, thus, to provide empirical benchmarks for further educational reform.

Acknowledgments
The authors thank Diane M. Hartmann, MD, and David R. Lambert, MD, for their many thoughtful comments on this work.

References
1 Accreditation Council for Graduate Medical Education. The ACGME Outcome Project: An Introduction. Available at: (http://www.acgme.org/outcome/project/OPintrorev1_7-05.ppt). Accessed November 13, 2008.
2 Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21:103–111.
3 Accreditation Council for Graduate Medical Education. ACGME Outcomes Project. Timeline—Working Guidelines. Available at: (http://www.acgme.org/outcome/project/timeline/TIMELINE_index_frame.htm). Accessed November 13, 2008.
4 Accreditation Council for Graduate Medical Education. Toolbox of Assessment Methods. Available at: (http://www.acgme.org/Outcome/assess/Toolbox.pdf). Accessed November 17, 2008.
5 Accreditation Council for Graduate Medical Education. ACGME Outcome Project. Available at: (http://www.acgme.org/outcome/comp/compMin.asp). Accessed November 13, 2008.
6 American Board of Medical Specialties. MOC competencies and criteria. Available at: (http://www.abms.org/Maintenance_of_Certification/MOC_competencies.aspx). Accessed November 13, 2008.
7 Silber CG, Nasca TJ, Paskin DL, Eiger G, Robeson M, Veloski JJ. Do global rating forms enable program directors to assess the ACGME competencies? Acad Med. 2004;79:549–556.
8 Tabuenca A, Welling R, Sachdeva AK, et al. Multi-institutional validation of a Web-based core competency assessment system. J Surg Educ. 2007;64:390–394.
9 Brasel KJ, Bragg D, Simpson DE, Weigelt JA. Meeting the accreditation council for graduate medical education competencies using established residency training program assessment tools. Am J Surg. 2004;188:9–12.
10 Reisdorff EJ, Hayes OW, Reynolds B, et al. General competencies are intrinsic to emergency medicine training: A multicenter study. Acad Emerg Med. 2003;10:1049–1053.

11 Reisdorff EJ, Carlson DJ, Reeves M, Walker G, Hayes OW, Reynolds B. Quantitative validation of a general competency composite assessment evaluation. Acad Emerg Med. 2004;11:881–884.
12 Massagli TL, Carline JD. Reliability of a 360-degree evaluation to assess resident competence. Am J Phys Med Rehabil. 2007;86:845–852.
13 Weigelt JA, Brasel KJ, Bragg D, Simpson D. The 360-degree evaluation: Increased work with little return? Curr Surg. 2004;61:616–626.
14 Rosenbaum ME, Ferguson KJ, Kreiter CD, Johnson CA. Using a peer evaluation system to assess faculty performance and competence. Fam Med. 2005;37:429–433.
15 Roark RM, Schaefer SD, Yu GP, Branovan DI, Peterson SJ, Lee WN. Assessing and documenting general competencies in otolaryngology resident training programs. Laryngoscope. 2006;116:682–695.
16 Higgins RS, Bridges J, Burke JM, O’Donnell MA, Cohen NM, Wilkes SB. Implementing the ACGME general competencies in a cardiothoracic surgery residency program using 360-degree feedback. Ann Thorac Surg. 2004;77:12–17.
17 Musick DW, McDowell SM, Clark N, Salcido R. Pilot study of a 360-degree assessment instrument for physical medicine & rehabilitation residency programs. Am J Phys Med Rehabil. 2003;82:394–402.
18 Chisholm CD, Whenmouth LF, Daly EA, Cordell WH, Giles BK, Brizendine EJ. An evaluation of emergency medicine resident interaction time with faculty in different teaching venues. Acad Emerg Med. 2004;11:149–155.
19 Shayne P, Gallahue F, Rinnert S, et al. Reliability of a core competency checklist assessment in the emergency department: The standardized direct observation assessment tool. Acad Emerg Med. 2006;13:727–732.
20 Accreditation Council for Graduate Medical Education. ACGME Learning Portfolio: A Professional Development Tool. Available at: (http://www.acgme.org/acWebsite/portfolio/cbpac_faq.pdf). Accessed November 13, 2008.
21 Carraccio C, Englander R. Evaluating competence using a portfolio: A literature review and Web-based application to the ACGME competencies. Teach Learn Med. 2004;16:381–387.
22 O’Sullivan PS, Reckase MD, McClain T, Savidge MA, Clardy JA. Demonstration of portfolios to assess competency of residents. Adv Health Sci Educ Theory Pract. 2004;9:309–323.
23 Alexander M, Pavlov A, Lenahan P. Lights, camera, action: Using film to teach the ACGME competencies. Fam Med. 2007;39:20–23.
24 Carraccio C, Englander R, Wolfsthal S, Martin C, Ferentz K. Educating the pediatrician of the 21st century: Defining and implementing a competency-based system. Pediatrics. 2004;113:252–258.
25 Clay AS, Petrusa E, Harker M, Andolsek K. Development of a Web-based, specialty specific portfolio. Med Teach. 2007;29:311–316.
26 Coleman MT, Nasraty S, Ostapchuk M, Wheeler S, Looney S, Rhodes S. Introducing practice-based learning and improvement ACGME core competencies into a family medicine residency curriculum. Jt Comm J Qual Saf. 2003;29:238–247.
27 Dickey J, Girard DE, Geheb MA, Cassel CK. Using systems-based practice to integrate education and clinical services. Med Teach. 2004;26:428–434.
28 Dorotta I, Staszak J, Takla A, Tetzlaff JE. Teaching and evaluating professionalism for anesthesiology residents. J Clin Anesth. 2006;18:148–160.
29 Greenberg JA, Irani JL, Greenberg CC, et al. The ACGME competencies in the operating room. Surgery. 2007;142:180–184.
30 Hayes OW, Reisdorff EJ, Walker GL, Carlson DJ, Reinoehl B. Using standardized oral examinations to evaluate general competencies. Acad Emerg Med. 2002;9:1334–1337.
31 Johnston KC. Responding to the ACGME’s competency requirements: An innovative instrument from the University of Virginia’s neurology residency. Acad Med. 2003;78:1217–1220.
32 Lyman J, Schorling J, May N, et al. Customizing a clinical data warehouse for housestaff education in practice-based learning and improvement. AMIA Annu Symp Proc. 2006:1017.
33 Oetting TA, Lee AG, Beaver HA, et al. Teaching and assessing surgical competency in ophthalmology training programs. Ophthalmic Surg Lasers Imaging. 2006;37:384–393.
34 O’Sullivan PS, Cogbill KK, McClain T, Reckase MD, Clardy JA. Portfolios as a novel approach for residency evaluation. Acad Psychiatry. 2002;26:173–179.
35 Reisdorff EJ, Hayes OW, Carlson DJ, Walker GL. Assessing the new general competencies for resident education: A model from an emergency medicine program. Acad Med. 2001;76:753–757.
36 Shayne P, Heilpern K, Ander D, Palmer-Smith V; Emory University Department of Emergency Medicine Education Committee. Protected clinical teaching time and a bedside clinical evaluation instrument in an emergency medicine training program. Acad Emerg Med. 2002;9:1342–1349.
37 Simpson D, Helm R, Drewniak T, et al. Objective structured video examinations (OSVEs) for geriatrics education. Gerontol Geriatr Educ. 2006;26:7–24.
38 Torbeck L, Wrightson AS. A method for defining competency-based promotion criteria for family medicine residents. Acad Med. 2005;80:832–839.
39 Triola MM, Feldman HJ, Pearlman EB, Kalet AL. Meeting requirements and changing culture. The development of a Web-based clinical skills evaluation system. J Gen Intern Med. 2004;19:492–495.
40 Webb TP, Aprahamian C, Weigelt JA, Brasel KJ. The surgical learning and instructional portfolio (SLIP) as a self-assessment educational tool demonstrating practice-based learning. Curr Surg. 2006;63:444–447.
41 Canal DF, Torbeck L, Djuricich AM. Practice-based learning and improvement: A curriculum in continuous quality improvement for surgery residents. Arch Surg. 2007;142:479–482.
42 Englander R, Agostinucci W, Zalneraiti E, Carraccio CL. Teaching residents systems-based practice through a hospital cost-reduction program: A “win-win” situation. Teach Learn Med. 2006;18:150–152.
43 Frey K, Edwards F, Altman K, Spahr N, Gorman RS. The “collaborative care” curriculum: An educational model addressing key ACGME core competencies in primary care residency training. Med Educ. 2003;37:786–789.
44 Miller PR, Partrick MS, Hoth JJ, Meredith JW, Chang MC. A practical application of practice-based learning: Development of an algorithm for empiric antibiotic coverage in ventilator-associated pneumonia. J Trauma. 2006;60:725–729.
45 Mohr JJ, Randolph GD, Laughon MM, Schaff E. Integrating improvement competencies into residency education: A pilot project from a pediatric continuity clinic. Ambul Pediatr. 2003;3:131–136.
46 Palonen KP, Allison JJ, Heudebert GR, et al. Measuring resident physicians’ performance of preventive care: Comparing chart review with patient survey. J Gen Intern Med. 2006;21:226–230.
47 Paukert JL, Chumley-Jones HS, Littlefield JH. Do peer chart audits improve residents’ performance in providing preventive care? Acad Med. 2003;78(10 suppl):S39–S41.
48 Rivo ML, Keller DR, Teherani A, O’Connell MT, Weiss BA, Rubenstein SA. Practicing effectively in today’s health system: Teaching systems-based care. Fam Med. 2004;36(suppl):S63–S67.
49 Siri J, Reed AI, Flynn TC, Silver M, Behrns KE. A multidisciplinary systems-based practice learning experience and its impact on surgical residency education. J Surg Educ. 2007;64:328–332.
50 Staton LJ, Kraemer SM, Patel S, Talente GM, Estrada CA. “Correction” peer chart audits: A tool to meet accreditation council on graduate medical education (ACGME) competency in practice-based learning and improvement. Implement Sci. 2007;2:24.
51 Thomas KG, Thomas MR, York EB, Dupras DM, Schultz HJ, Kolars JC. Teaching evidence-based medicine to internal medicine residents: The efficacy of conferences versus small-group discussion. Teach Learn Med. 2005;17:130–135.
52 Tomolo A, Caron A, Perz ML, Fultz T, Aron DC. The outcomes card. Development of a systems-based practice educational tool. J Gen Intern Med. 2005;20:769–771.
53 Weingart SN, Tess A, Driver J, Aronson MD, Sands K. Creating a quality improvement elective for medical house officers. J Gen Intern Med. 2004;19:861–867.
54 Cogbill KK, O’Sullivan PS, Clardy J. Residents’ perception of effectiveness of twelve evaluation methods for measuring competency. Acad Psychiatry. 2005;29:76–81.
55 Collins J, Herring W, Kwakwa F, et al. Current practices in evaluating radiology residents, faculty, and programs: Results of a survey of radiology residency program directors. Acad Radiol. 2004;11:787–794.
56 Delzell JE Jr, Ringdahl EN, Kruse RL. The ACGME core competencies: A national survey of family medicine program directors. Fam Med. 2005;37:576–580.


57 Heard JK, Allen RM, Clardy J. Assessing the needs of residency program directors to meet the ACGME general competencies. Acad Med. 2002;77:750.
58 Johnson CE, Barratt MS. Continuity clinic preceptors and ACGME competencies. Med Teach. 2005;27:463–467.
59 Joyner BD, Siedel K, Stoll D, Mitchell M. Report of the national survey of urology program directors: Attitudes and actions regarding the accreditation council for graduate medical education regulations. J Urol. 2005;174:1961–1968.
60 Li JT, Stoll DA, Smith JE, Lin JJ, Swing SR. Graduates’ perceptions of their clinical competencies in allergy and immunology: Results of a survey. Acad Med. 2003;78:933–938.
61 Lynch DC, Pugno P, Beebe DK, Cullison SW, Lin JJ. Family practice graduate preparedness in the six ACGME competency areas: Prequel. Fam Med. 2003;35:324–329.
62 Michels KS, Hansel TE, Choi D, Lauer AK. A survey of desired skills to acquire in ophthalmology training: A descriptive statistical analysis. Ophthalmic Surg Lasers Imaging. 2007;38:107–114.
63 Stiles BM, Reece TB, Hedrick TL, et al. General surgery morning report: A competency-based conference that enhances patient care and resident education. Curr Surg. 2006;63:385–390.
64 Wald DA, Manthey DE, Kruus L, Tripp M, Barrett J, Amoroso B. The state of the clerkship: A survey of emergency medicine clerkship directors. Acad Emerg Med. 2007;14:629–634.
65 Bingham JW, Quinn DC, Richardson MG, Miles PV, Gabbe SG. Using a healthcare matrix to assess patient care in terms of aims for improvement and core competencies. Jt Comm J Qual Patient Saf. 2005;31:98–105.
66 Chapman DM, Hayden S, Sanders AB, et al. Integrating the Accreditation Council for Graduate Medical Education core competencies into the model of the clinical practice of emergency medicine. Ann Emerg Med. 2004;43:756–769.
67 Jarvis RM, O’Sullivan PS, McClain T, Clardy JA. Can one portfolio measure the six ACGME general competencies? Acad Psychiatry. 2004;28:190–196.
68 Singh R, Naughton B, Taylor JS, et al. A comprehensive collaborative patient safety residency curriculum to address the ACGME core competencies. Med Educ. 2005;39:1195–1204.
69 Wang EE, Vozenilek JA. Addressing the systems-based practice core competency: A simulation-based curriculum. Acad Emerg Med. 2005;12:1191–1194.
70 Charlin B, Gagnon R, Pelletier J, et al. Assessment of clinical reasoning in the context of uncertainty: The effect of variability within the reference panel. Med Educ. 2006;40:848–854.
71 Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The script concordance test: A tool to assess the reflective clinician. Teach Learn Med. 2000;12:189–195.
72 Charlin B, van der Vleuten C. Standardized assessment of reasoning in contexts of uncertainty: The script concordance approach. Eval Health Prof. 2004;27:304–319.
73 Sibert L, Darmoni SJ, Dahamna B, Weber J, Charlin B. Online clinical reasoning assessment with the script concordance test: A feasibility study. BMC Med Inform Decis Mak. 2005;5:18.
74 Tamblyn R, Abrahamowicz M, Dauphinee WD, et al. Association between licensure examination scores and practice in primary care. JAMA. 2002;288:3019–3026.
75 Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA. 2007;298:993–1001.
76 Arnold L. Assessing professional behavior: Yesterday, today, and tomorrow. Acad Med. 2002;77:502–515.
77 Arnold EL, Blank LL, Race KE, Cipparrone N. Can professionalism be measured? The development of a scale for use in the medical environment. Acad Med. 1998;73:1119–1121.
78 DeLisa JA, Foye PM, Jain SS, Kirshblum S, Christodoulou C. Measuring professionalism in a physiatry residency training program. Am J Phys Med Rehabil. 2001;80:225–229.
79 Blackall GF, Melnick SA, Shoop GH, et al. Professionalism in medical education: The development and validation of a survey instrument to assess attitudes toward professionalism. Med Teach. 2007;29:e58–e62.
80 Tsai TC, Lin CH, Harasym PH, Violato C. Students’ perception on medical professionalism: The psychometric perspective. Med Teach. 2007;29:128–134.

