
Regional Anesthesia Assessment Tools

Aug 7, 2019

Brian F.S. Allen, M.D.

Tags: competency assessment, regional anesthesia, assessment tools, checklist, rating scale, quality compromising behavior



Brian F.S. Allen, MD
Program Director, Regional Anesthesiology and Acute Pain Medicine Fellowship
Director, Acute Pain Service
Assistant Professor of Anesthesiology
Vanderbilt University Medical Center
Nashville, TN


Multiple tools have been created over the years to assess trainee procedural performance during regional anesthesia tasks. Published studies on these tools appear in a number of journals, and both the tool formats and the study designs in which they are used vary. Some studies seek to create a tool, others to validate one, and still others to describe patient or training outcomes in the context of an assessment.

The references below showcase a number of regional assessment tools reported in the anesthesia literature and provide brief summaries of the associated study. The tools have been grouped into neuraxial and peripheral block categories.

Tools generally take one of four forms: 1) task-specific checklists, 2) global rating scales, 3) quality-compromising behavior lists, or, most commonly, 4) a combination of two of these.

Task-specific checklists list steps of a procedure or task that an observer expects to see performed in the course of an encounter. For example, “asking for initial aspiration to rule out intravascular injection” is a step common to all peripheral nerve blocks and present in all related checklist tools. The rater expects this step to occur (at the proper time) and marks yes or no to whether it was performed. Some checklists allow a 3-point rating scale with specification of whether a task was performed 1) well, 2) poorly, or 3) not at all. Checklists allow more granular assessment of whether procedural steps occurred, but are, as a result, more procedure-specific and tend to be longer than global rating scales. Each different procedure may therefore need its own checklist assessment.

Global rating scales (GRS) allow assessment of overall performance or performance on sub-tasks, typically on a 5- or 7-point scale. These scales can be given “anchors” — descriptions of performance corresponding to different scores. For example, in the category “Instrument Handling,” a score of 1 may correspond to repeated awkward movements, whereas 3 means occasional awkward movements and a perfect 5 denotes fluid movements without awkwardness. GRS domains may be more generalizable between different procedures, with the same instrument potentially used for different types of assessment. Fewer categories are usually present than with checklist tools, but a GRS does not provide the definitive yes/no answer about a procedural step that a checklist does.

Quality-compromising behaviors (QCBs) are used in a minority of performance assessment tools. Instead of categorizing each step of a procedure, QCBs highlight any errors that occur. Whereas checklists are written in the affirmative, requiring a mark for each procedural step that happens, QCBs are written negatively, with errors or “novice behaviors” flagged by the assessor.

A few studies listed below do not contain specific assessment tools but include checklists or guides that could be helpful in assessing regional anesthesia performance or building your own assessments for clinical practice, simulation, or preparation for Objective Structured Clinical Exams (OSCEs).

The purpose of this post is to highlight some available assessment tools that readers may incorporate into their practice. The intent is not to rank one tool over any other. Yet, as you review summaries and read the actual papers, it is worth asking several questions about each tool. What is the assessment tool designed to evaluate? Is the tool broadly applicable, or limited to a single type of procedure? What validity evidence is presented to suggest the assessment does what it is intended to do? Is it intended for summative use or formative use? How many items are included in each assessment? How feasible is incorporation into clinical practice? What are the relative merits of one assessment type over another?

Peripheral nerve block assessment tools

Sites B, Spence B, Gallagher J, Wiley C, Bertrand M, Blike G. Characterizing Novice Behavior Associated With Learning Ultrasound-Guided Peripheral Regional Anesthesia. Reg Anesth Pain Med. 2007;32(2):107-115.

Sites et al. (2007) sought to characterize mistakes made by trainees during ultrasound-guided single-injection peripheral nerve blocks. Six residents were videotaped performing 520 blocks. Their mistakes, referred to as novice behaviors or quality-compromising behaviors (QCBs), were categorized by type and tracked over time. Seven error types were identified before the study, and five more were identified during it. These 12 types of quality-compromising behaviors were not designed as an assessment tool but have the potential to be used as one. The most common novice behaviors were 1) needle advancement without visualization and 2) unintentional probe movement.

Number of items: 12; Assessment type: QCBs; Clinical use: UGRA single-injection peripheral block

Cheung JJH, Chen EW, Darani R, McCartney CJL, Dubrowski A, Awad IT. The Creation of an Objective Assessment Tool for Ultrasound-Guided Regional Anesthesia Using the Delphi Method. Reg Anesth Pain Med. 2012;37(3):329-333.

Cheung et al. (2012) report the creation of a 22-item checklist and 9-item global rating scale for ultrasound-guided single-injection peripheral nerve block, developed using a modified Delphi method. The modified Delphi method is well described in the study.

Number of items: 31; Assessment type: Checklist and GRS; Clinical use: UGRA single-injection peripheral block with or without nerve-stimulation

Burckett-St Laurent DA, Niazi AU, Cunningham MS, et al. A Valid and Reliable Assessment Tool for Remote Simulation-Based Ultrasound-Guided Regional Anesthesia. Reg Anesth Pain Med. 2014;39(6):496-501.

Burckett-St Laurent et al. (2014) tested the checklist developed by Cheung et al. for ultrasound-guided single-injection peripheral nerve blocks. The study provided validity evidence for the tool's use based on good intraclass correlation, an ability to distinguish novices from experienced proceduralists, and similar scores in simulated and patient-care scenarios.

Number of items: 31; Assessment type: Checklist & GRS; Clinical use: UGRA single-injection peripheral block

Wong DM, Watson MJ, Kluger R, et al. Evaluation of a Task-Specific Checklist and Global Rating Scale for Ultrasound-Guided Regional Anesthesia. Reg Anesth Pain Med. 2014;39(5):399-408.

Wong et al. (2014) adapted the tool used by Cheung et al. into a 31-item checklist and 10-item global rating scale for ultrasound-guided single-injection peripheral nerve blocks. In contrast to previous checklists, which were written in the affirmative (check if an action DID occur), Wong et al. worded items as quality-compromising behaviors (QCBs); points were given for the absence of an error or QCB, a use of QCBs reminiscent of the Sites et al. study. The study evaluated 30 blocks with the tool and found an intraclass correlation coefficient of 0.44 for total score. Completing an assessment took a median of about 4.5 minutes.

Number of items: 41; Assessment type: QCBs and GRS; Clinical use: UGRA single-injection peripheral block

Watson MJ, Wong DM, Kluger R, et al. Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesthesia. 2014;69(6):604-612.

Watson et al. (2014) adapted the Direct Observation of Procedural Skills (DOPS) scoring system into an 11 + 2 item tool for ultrasound-guided single-injection peripheral nerve blocks. Eleven items were graded on a 9-point scale with behavioral anchors to guide the rating; two items were dichotomous. Assessments were performed on trainees videotaped while performing nerve blocks. An extensive instruction sheet prompted supervisors to question trainees before and during the procedure, and the responses informed scoring. Appendix 1 of the paper could be very useful to anyone instructing trainees in regional anesthesia. Intraclass correlation coefficients were low, at 0.1-0.49.

Number of items: 13; Assessment type: 9 point behaviorally anchored scale with 2 dichotomous items; Clinical use: UGRA single-injection peripheral block

Naik V, Chandra D, Chung D, Chan V. An Assessment Tool for Brachial Plexus Regional Anesthesia Performance: Establishing Construct Validity and Reliability. Reg Anesth Pain Med. 2007;32(1):41-45.

Naik et al. (2007) sought to validate a 20-item checklist and 8-item global rating scale specific to nerve-stimulator-guided interscalene brachial plexus blockade. They used a pre-existing global rating scale with a checklist created by the modified Delphi method. Scores of senior trainees were compared to those of junior trainees, showing the tool could discriminate between the groups.

Number of items: 28; Assessment type: Checklist and GRS; Clinical use: Stimulator-guided single-injection interscalene block

Sultan SF, Iohom G, Saunders J, Shorten G. A clinical assessment tool for ultrasound-guided axillary brachial plexus block. Acta Anaesthesiol Scand. 2012;56(5):616-623.

Sultan et al. (2012) developed a 63-item checklist and 9-item global rating scale for use in ultrasound-guided axillary brachial plexus blocks. They used expert opinion to develop the checklist and then studied inter-rater reliability and the ability of the tool to discriminate between providers of different skill levels.

Number of items: 72; Assessment type: Checklist and GRS; Clinical use: UGRA single-injection axillary block

Ahmed OM, O'Donnell BD, Gallagher AG, Shorten GD. Development of performance and error metrics for ultrasound-guided axillary brachial plexus block. Adv Med Educ Pract. 2017;8:257-263.

Ahmed et al. (2017) developed a 54-item checklist and 32-item quality-compromising behavior (QCB) tool for ultrasound-guided single-injection axillary brachial plexus blocks. Using an intensive Delphi method, 54 requisite steps in axillary block were identified, as were 32 QCBs or errors. Errors were categorized as critical or non-critical for scoring. The study described only tool development; no testing was performed with the tool.

Number of items: 86; Assessment type: Checklist & QCBs; Clinical use: UGRA single-injection axillary block

Ahmed OMA, O'Donnell BD, Gallagher AG, Breslin DS, Nix CM, Shorten GD. Construct validity of a novel assessment tool for ultrasound-guided axillary brachial plexus block. Anaesthesia. 2016;71(11):1324-1331.

Ahmed et al. (2016) applied their ultrasound-guided single-injection axillary brachial plexus block tool to a very small number of recorded blocks performed by experts and novices. Experts were shown to commit fewer errors overall and slightly fewer critical errors (0.8 vs. 1.3) than novices. Inter-rater reliability between 2 unidentified reviewers was very good (0.88). The time required to complete an assessment was not recorded.

Number of items: 86; Assessment type: Checklist & QCBs; Clinical use: UGRA single-injection axillary block

Neuraxial anesthesia assessment tools

Friedman Z, Devito I, Siddiqui M, Chan V. Objective Assessment of Manual Skills and Proficiency in Performing Epidural Anesthesia—Video-Assisted Validation. Reg Anesth Pain Med. 2006;31(4):304-310.

Friedman et al. (2006) developed a 27-item checklist and 7-item global rating scale for use in labor epidural placement. A panel of OB anesthesiologists developed the assessment tool, though development details are not disclosed. Epidural placements by residents were videotaped when they had performed 0-30, 31-90, and >90 epidural placements. Validity of the tool was suggested by higher scores correlating with increased trainee experience. Good correlation in total scores between raters also supported the tool's validity and reliability.

Number of items: 34; Assessment type: Checklist and GRS; Clinical use: Lumbar Epidural Placement

Friedman Z, Siddiqui N, Devito I. Experience is not enough: repeated breaches in epidural anesthesia aseptic technique by novice operators despite improved skill. Anesthesiology. 2008.

Friedman et al. (2008) developed a 15-item checklist for aseptic technique during lumbar epidural placement to complement their previously developed manual skills checklist and GRS. This sterile technique assessment tool was used to assess residents with varying levels of lumbar epidural experience. Manual and technical skills were shown to progress faster than sterile technique.

Number of items: 15; Assessment type: Checklist; Clinical use: Sterile technique during epidural placement

Birnbach DJ, Santos AC, Bourlier RA, et al. The effectiveness of video technology as an adjunct to teach and evaluate epidural anesthesia performance skills. Anesthesiology. 2002;96(1):5-9.

Birnbach et al. (2002) developed a 33-item checklist for labor epidural placement. Each item was assigned to one of 13 skills and judged on whether a major error, minor error, or no error occurred. Residents were videotaped performing blocks at the start, middle, and end of a month of OB anesthesia. Those allowed to review their performance tapes showed greater skill gains on the assessment tool than those who did not view their performances. The tool was developed by 4 OB anesthesiologists from a larger set of criteria, but the method of development was not further detailed.

Number of items: 33; Assessment type: Checklist; Clinical use: Lumbar Epidural Placement

Weed J, Finkel K, Beach ML, Granger CB, Gallagher JD, Sites BD. Spinal Anesthesia for Orthopedic Surgery: A Detailed Video Assessment of Quality. Reg Anesth Pain Med. 2011;36(1):51-55.

Weed et al. (2011) recorded spinal anesthetics in 60 patients to categorize the novice behaviors or quality-compromising behaviors (QCBs) that occur during spinal placement. Twelve different QCBs were identified. The study assessed placement times, provider training level, patient BMI, number of needle passes, and QCBs to draw meaningful correlations. Though not designed as an assessment tool, the 12 QCBs observed during spinal placement are useful to know and watch for.

Number of items: 12; Assessment type: QCBs; Clinical use: Spinal

Breen D, Bogar L, Heigl P, Rittberger J, Shorten GD. Validation of a clinical assessment tool for spinal anaesthesia. Acta Anaesthesiol Scand. 2011;55(6):653-657.

Breen et al. (2011) developed a tool that included 11 errors, 2 timed intervals, and a 6-item global rating scale for use in spinal placement. Items were developed by consensus of a focus group that included anesthesiologists, trainees, and a psychologist. The errors differed from the quality-compromising behaviors (QCBs) of other studies (e.g., Sites et al.): Breen et al. included a mixture of behaviors (e.g., failure to palpate the iliac crests) and outcomes (more than 3 interspaces attempted, supervisor takes over the procedure). The timed intervals were T1 (positioning to CSF appearance) and T2 (palpation to skin local anesthetic). The only validity evidence for the tool was an ability to distinguish novice, intermediate, and expert proceduralists from one another. No ratings of inter-rater reliability were performed, and time intervals did not differ significantly between groups.

Number of items: 19; Assessment type: QCBs, GRS, and time measures; Clinical use: Spinal

Table 1. Selected procedural assessment tools for peripheral and neuraxial regional anesthesia techniques.

GRS = global rating scale; QCBs = quality-compromising behaviors; UGRA = ultrasound-guided regional anesthesia

Assessment Author      | Type of Assessment                                        | # of Items | Clinical Use
Peripheral Blocks
Naik et al. (2007)     | Checklist + GRS                                           | 20 + 8     | Interscalene, nerve stimulator
Sultan et al. (2012)   | Checklist + GRS                                           | 63 + 9     | Axillary, UGRA
Cheung et al. (2012)   | Checklist + GRS                                           | 22 + 9     | Single-injection peripheral block, UGRA
Wong et al. (2014)     | QCBs + GRS                                                | 31 + 10    | Single-injection peripheral block, UGRA
Watson et al. (2014)   | 9-point behaviorally anchored scale + 2 dichotomous items | 11 + 2     | Single-injection peripheral block, UGRA
Ahmed et al. (2017)    | Checklist + QCBs                                          | 54 + 32    | Axillary, UGRA
Neuraxial Blocks
Friedman et al. (2006) | Checklist + GRS                                           | 27 + 7     | Lumbar epidural
Friedman et al. (2008) | Checklist                                                 | 15         | Asepsis during epidural
Breen et al. (2011)    | QCBs, GRS, time measures                                  | 11 + 6 + 2 | Spinal

Other useful materials for guiding assessment and training

Chuan A, Wan AS, Royse CF, Forrest K. Competency-based assessment tools for regional anaesthesia: a narrative review. Br J Anaesth. 2018;120(2):264-273.

Chuan et al. (2018) provide an in-depth appraisal of assessment tools in this narrative review. Different modalities of assessment are described, as are the principles underlying them. The properties of an optimal tool, including validity and reliability, receive emphasis and thorough description. Supplemental tables provide in-depth information about the tools highlighted in the review.

Ben-Menachem E, Ezri T, Ziv A, Sidi A, Brill S, Berkenstadt H. Objective Structured Clinical Examination-based assessment of regional anesthesia skills: the Israeli National Board Examination in Anesthesiology experience. Anesth Analg. 2011;112(1):242-245.

Ben-Menachem et al. (2011) describe Objective Structured Clinical Exams for the Israeli anesthesiology boards. Assessments and tasks were developed via the Delphi method. The paper presents the scoring checklist used in an axillary block OSCE scenario. This assessment tool was designed for a testing environment rather than clinical practice, but it can provide insight into OSCE scoring.

Sites BD, Chan VW, Neal JM, et al. The American Society of Regional Anesthesia and Pain Medicine and the European Society of Regional Anaesthesia and Pain Therapy Joint Committee Recommendations for Education and Training in Ultrasound-Guided Regional Anesthesia. Reg Anesth Pain Med. 2009;34(1):40-46.

Sites et al. (2009) present ASRA/ESRA guidelines for UGRA training. Though this guideline did not create or report a specific assessment tool, it contains numerous helpful lists and tables that can be adapted to support procedural assessments, including: 1) ten key tasks for UGRA, 2) skills associated with block proficiency, 3) UGRA core competencies, 4) ultrasound curriculum maps, and 5) stepwise ultrasound scanning procedures.

Updated 03/02/2018

