
Defining and Assessing Competence in Regional Anesthesiology Training

Aug 1, 2023, 04:30 AM by Olga E. Paniagua, MD, and Danielle Ludwin, MD

Cite as: Paniagua OE, Ludwin D. Defining and assessing competence in regional anesthesiology training. ASRA Pain Medicine News 2023;48. https://doi.org/10.52211/asra080123.006.


Residency and fellowship education are rigorous developmental processes that require the right combination of assessments to guide growth. Assessment is an indispensable means of ensuring that learners are truly competent for unsupervised, self-guided practice upon completion of their training. Moreover, in rapidly evolving subspecialties such as regional anesthesia, staying up to date on the best ways to evaluate competency in the field is fundamental. But how can we measure competence in regional anesthesia in a standardized fashion when trainee exposure and experience vary not only by institution but also, at a more granular level, from trainee to trainee?

According to Miller’s pyramid of assessing competence, the chain of expertise begins with knowledge, the lowest level of the pyramid. Novice trainees must first demonstrate that they understand the information; from there, they can move up the pyramid – from knowing how (application), to showing how (demonstration), and eventually to performing in practice (doing).1 This pyramid can be combined with the Dreyfus model of skill acquisition to classify, or rank, trainees as they ascend from novice to advanced beginner, to competent, then proficient, and ultimately to expert. Analogously, the Accreditation Council for Graduate Medical Education (ACGME) levels of supervision are designed to tailor each trainee’s growth such that less supervision is needed as the trainee rises in rank.

In 2013, fueled by a need for better standardization, the ACGME implemented the Milestones-Based Assessment as a complex educational intervention intended to take the resident or fellow from beginner to proficient practitioner by the time of graduation.2 The milestones the ACGME defined to assess competence as trainees pursue academic and career achievement can be applied to regional anesthesia, giving program directors and ancillary staff context for the qualities and observable behaviors they expect to see demonstrated at each level. The ACGME, however, also specifies a subset of competencies that the regional anesthesia trainee must demonstrate prior to graduation: proficiency in patient care, procedural skills, medical knowledge and its application, interpersonal and communication skills, practice-based learning and improvement, and professionalism. These specialty-specific milestones must be used as one of the tools to ensure that fellows are able to practice core professional activities without supervision upon completion of the program. The question is, how can these competencies be uniformly evaluated?

While there does not appear to be one standardized method of evaluation, a number of tools have been described in the literature for assessing practical performance in regional anesthesia, such as task-specific checklists, global rating scales (GRS), quality-compromising behaviors, key performance indicators, psychometric tests, multiple choice questions, hand-motion analysis, visuospatial and psychomotor ability screening, and cumulative sum analysis, among others.3,4 Of these, the combination of task-specific checklists and GRS appears to be a particularly comprehensive and reliable approach to evaluating competency, as it allows for evaluation of technical, non-technical, and professional skills. Task-specific (or procedure-specific) checklists provide an item-based rubric that can readily assess the technical strengths and weaknesses of the trainee, whereas GRS use Likert scales to assess non-technical aspects of a trainee’s performance, such as communication and teamwork.
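
To make one of these tools concrete, the brief sketch below illustrates how cumulative sum (CUSUM) analysis is commonly applied to a trainee’s consecutive block attempts, scored simply as success or failure. The acceptable and unacceptable failure rates and the error thresholds in the example are illustrative assumptions only, not values endorsed by this article or by any particular curriculum.

```python
import math

def cusum_chart(outcomes, p0=0.20, p1=0.40, alpha=0.10, beta=0.10):
    """Illustrative CUSUM learning curve for binary block outcomes.

    outcomes: list of 0 (successful block) / 1 (failed block), in order performed
    p0, p1: assumed acceptable and unacceptable failure rates
    alpha, beta: assumed type I and type II error rates
    """
    # Constants from the sequential test for dichotomous outcomes
    P = math.log(p1 / p0)
    Q = math.log((1 - p0) / (1 - p1))
    s = Q / (P + Q)                                # decrement applied after each success
    h0 = math.log((1 - alpha) / beta) / (P + Q)    # lower decision interval
    h1 = math.log((1 - beta) / alpha) / (P + Q)    # upper decision interval

    score, curve = 0.0, []
    for failed in outcomes:
        score += (1 - s) if failed else -s
        curve.append(round(score, 2))
    return curve, round(h0, 2), round(h1, 2)

# Toy example: a trainee's first 15 supervised blocks (1 = failure)
curve, h0, h1 = cusum_chart([1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0])
print("decision intervals (lower/upper):", h0, h1)
print("CUSUM curve:", curve)
# A curve that trends downward, repeatedly crossing lower boundaries spaced h0
# apart, suggests the observed failure rate is at or below the acceptable rate;
# crossing the upper boundary suggests performance worse than acceptable.
```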

In 2020, Woodworth et al.5 summarized the development of a comprehensive, competency-based regional anesthesiology and acute pain medicine curriculum for anesthesiology residency training, which aimed to create and deliver a universal resource for accessing pertinent regional anesthesia materials as well as a means of assessing “the achievement of competency.” This resource, termed “The Anesthesia Toolbox,” was originally centered around regional anesthesia. The platform’s biggest strength is that its ease of use lends itself to high faculty compliance with trainee feedback. It serves as both a quantitative and a qualitative measure of competence, as it requires the attending to record the specific procedure performed, its complexity, and whether the attending feels the trainee could perform it unsupervised. Nevertheless, even though this online platform is currently used by more than 100 residency programs, we need a larger-scale solution to address the challenges of defining competency.

The decision to incorporate any one of the aforementioned assessment tools into a competency-based regional anesthesia curriculum may vary from institution to institution. However, according to Chuan et al, “research in regional anesthesia education would be greatly advanced if we used accepted, common, and reliable tools. This would allow pooling of results in meta-analysis, comparisons between institutions, and comparing the relative effectiveness of different educational interventions.”3 Whichever tools are chosen, the goal should be for programs to adopt a holistic approach to assessment, thereby ensuring graduates demonstrate the key proficiencies essential to the field of regional anesthesia.

Additionally, as part of this holistic approach, it is vitally important that trainees supplement their growth and gauge their own competency through honest, ongoing self-assessment and self-reflection. At our own institution (New York Presbyterian/Columbia University Irving Medical Center), we conduct semi-annual evaluations in which trainees are asked to reflect on their progress, assess their learning goals, and identify areas of challenge in their training. Furthermore, as part of their formative assessment, trainees are required to devise a strategy alongside their department leadership and/or mentors to help attain these goals.

With regional anesthesia continuing to grow in popularity, it is being incorporated into a wide variety of practice settings. To sustain this momentum, we need to ensure that the future of our specialty is held to the highest standard. The quality of a peripheral nerve block is almost entirely dependent on its operator, and the proficiency of the operator is largely dependent on those who took the time to help cultivate the craft. It is the duty of our faculty and staff to train current and future generations of aspiring anesthesiologists and to provide the feedback necessary to foster growth and ensure competence. Despite the large body of literature on tools to measure competency, there does not appear to be a one-size-fits-all methodology. Further research is likely needed to determine the most effective and uniform way to judge aptitude among regional anesthesia trainees.


Olga E. Paniagua, MD, is a fellow in regional anesthesiology and acute pain medicine at the New York Presbyterian/Columbia University Irving Medical Center.

Danielle Ludwin, MD, is a professor of anesthesiology and program director, regional anesthesiology and acute pain medicine fellowship, at the New York Presbyterian/Columbia University Irving Medical Center.

References

  1. Allen B, McEvoy M. Competency assessment in regional anesthesia: quantity today, quality tomorrow. Reg Anesth Pain Med 2017;42(4):429-31. https://doi.org/10.1097/AAP.0000000000000626
  2. Warm E, Rosenblum M, et al. A Guidebook for Implementing and Changing Assessment in the Milestones Era. Chicago, IL: Accreditation Council for Graduate Medical Education; 2020.
  3. Chuan A, Forrest K, et al. Competency-based assessment tools in regional anesthesia: a narrative review. Br J Anaesth 2018;120(2):264-73. https://doi.org/10.1016/j.bja.2017.09.007
  4. Chuan A, McLeod G. Tools to assess regional skills within a competency-based curriculum. ASRA Pain Medicine News 2022;47. https://doi.org/10.52211/asra080122.034
  5. Woodworth G, Barrington M, et al. Anesthesia residency training in regional anesthesiology and acute pain medicine: a competency-based model curriculum. Reg Anesth Pain Med 2020;45(8):660-7. https://doi.org/10.1136/rapm-2020-101480