
How many is enough? Measuring the number of FAST exams needed by emergency medicine trainees to reach competence

Abstract

Background

For patients with blunt abdominal trauma, the Focused Assessment with Sonography in Trauma (FAST) exam is the initial imaging modality used for diagnosis and risk stratification. A positive FAST exam in this patient population denotes intraperitoneal hemorrhage and, in a hemodynamically unstable patient, necessitates rapid surgical intervention. Ultrasound is highly dependent on the operator’s ability to obtain quality images for interpretation; failure to obtain adequate images prevents accurate interpretation and reduces diagnostic accuracy. Previous studies evaluating whether FAST exam performance can be improved solely by experience reported conflicting results, and none used an objective method to evaluate the FAST exam’s quality. Our study aimed to objectively determine the number of FAST exams required by an emergency medicine (EM) resident to reach sufficient quality for independent scanning.

Methods

Fifty-nine first-year EM residents from a single site were included in this study. All FAST exams saved in the Qpath archival system by these 59 EM residents, whether performed for educational or clinical purposes, were reviewed and scored using a Task-Specific Checklist (TSC) score, an objective way to assess the proficiency and quality of the FAST scan. The TSC was based on whether the imaging of 24 specific anatomic landmarks, split into four anatomic regions, was completed successfully or not. The Advanced EM Ultrasonography (AEMUS) faculty provided feedback to trainees either electronically via Qpath or at the bedside. Based on the quality of ultrasound imaging and competence (QUICK) score, a resident was considered an expert if the average TSC score for their first 10 exams was 18 or higher. If the resident failed to achieve that score, the first exam was skipped and the average score for the second through eleventh exams was calculated. If the resident still did not achieve the desired result, the first and second exams were skipped and the average score for the next 10 exams was determined. This sequence was repeated until the resident achieved an average TSC score of 18 or higher.

Results

In total, 663 FAST scans performed by EM residents were scored. The average number of FAST exams needed for independent scanning was 11.23 (95% CI, 10.6–11.85). 66.1% of enrolled residents achieved an average score of 18 or higher on their first 10 FAST exams, and 33.8% of residents required more than 10 scans. The average scores for the right upper quadrant (RUQ), left upper quadrant (LUQ), pelvic, and subxiphoid views were 5 (95% CI, 4.88–5.1), 4.7 (95% CI, 4.59–4.8), 5.1 (95% CI, 4.96–5.24), and 3.7 (95% CI, 3.6–3.8), respectively.

Conclusion

This study demonstrated that, when constructive feedback was given on each FAST exam, the average first-year emergency medicine resident achieved competency to perform FAST exams independently after completing 10–12 (average 11.23) FAST exams. Further research is required to validate these findings.

Introduction

Focused assessment with sonography for trauma (FAST) is the initial diagnostic imaging modality for blunt abdominal trauma [1, 2]. For the hemodynamically unstable patient, a positive FAST is a sign of peritoneal hemorrhage that necessitates emergent surgical intervention [3, 4]. Because ultrasound is highly operator dependent [5], failure to obtain adequate images has been shown to prevent accurate interpretation of the examination and reduce its value [3, 6]. With such important decisions at stake, having a trustworthy method to assess image quality is essential.

The Accreditation Council for Graduate Medical Education (ACGME) classified point-of-care ultrasound (POCUS) as one of the key skills for graduating emergency medicine residents. Moreover, the American College of Emergency Physicians (ACEP) recommends that emergency medicine residents complete 150–300 bedside ultrasound exams before graduating, without specifying the type of exam [6,7,8]. Changes in medical education have prompted increased calls for a transition toward a competency-based training curriculum, necessitating the creation of objective competency assessment methods [9, 10]. ACEP POCUS guidelines recommend 25–50 FAST exams prior to graduation in order to achieve competency; however, these numbers were based on expert consensus only [7]. Previous research attempted to evaluate clinicians’ performance of the FAST exam, but none used objective methods to analyze the quality of the acquired images [11, 12].

In 2015, the quality of ultrasound imaging and competence (QUICK) score was created as the first validated objective measure of FAST exam quality. The QUICK score combines two scoring checklists: the Task-Specific Checklist (TSC) and the Global Rating Scale (GRS). In the original validation study, the ultrasound images and a video recording of the participants’ hands were recorded in split-screen, time-synchronized fashion, and the recordings were independently scored by two experts who were blinded to group assignment and were not themselves participants in the study. The TSC gives a binary assessment of the performance of different parts of a challenging task, scoring a “1” for task completion and a “0” for task failure. Participants do not need to receive perfect scores to finish a task, but higher scores reflect greater competency in validated models. The GRS measures the quality of task performance without evaluating task completion and is designed as a Likert scale. There is no predefined passing score for the GRS; instead, mean scores are compared between groups.

The QUICK study enrolled 12 novice and 12 expert sonographers who performed the FAST exam on a healthy volunteer. The recorded FAST exams were independently scored by two experts blinded to group assignment, and the experts achieved significantly higher total scores than the novice participants [13]. The TSC alone is a reliable and valid score for assessing and evaluating trainee performance [14]. The combined TSC and GRS scores have also been used in other POCUS and non-POCUS studies to evaluate trainee performance [13, 15]. In this study, the TSC was used to assess the quality of FAST exams completed by emergency medicine residents during their first year of training.

Previous attempts to develop a learning curve for the FAST exam failed because no systematically validated tools were employed to measure exam quality [16]. Despite multiple large-cohort studies, there is controversy in the literature as to whether FAST performance improves with experience alone [16,17,18]. To our knowledge, no one has used a validated scoring system to assess the quality of FAST exams completed by residents. The goal of this study is to determine how many FAST exams a resident needs to perform to achieve competency for independent FAST scanning.

Methods

Study setting and design

This was a retrospective single-center study. The Institutional Review Board at Emory University approved this study. FAST exams were performed at Grady Memorial Hospital in Atlanta, GA between July 2018 and June 2022. Grady Memorial Hospital is a Level I trauma center accredited by the American College of Surgeons, with an established emergency medicine residency program and an advanced emergency ultrasound fellowship (AEMUS) program. Grady Memorial Hospital uses the Telexy Qpath software program (Maple Ridge, BC, Canada) for image archival, documentation, quality assurance, and educational feedback of POCUS exams performed in the emergency department (ED).

Residents from the graduating classes of 2021, 2022, and 2023 were enrolled in the study. All residents attended an ultrasound “bootcamp” during the first month of their residency program, where they were taught basic ultrasound physics and received 4 h of hands-on POCUS training, including the FAST exam. As is standard practice in the ED, residents archive all FAST exams they perform in Qpath. All exams, whether performed for educational or clinical purposes, were reviewed and assessed by AEMUS fellowship-trained faculty members. Residents received constructive feedback about the quality of their images and their interpretation of the exam within the same week, or at the bedside from an emergency medicine attending while performing the exam, if the FAST exam was clinically indicated.

Outcome measure

The primary outcome was the number of FAST exams that an emergency medicine (EM) resident needs to perform to achieve competency for independent scanning.

Data collection and processing

The resident-performed FAST exams were saved and documented on the “FAST” worksheet in Qpath. This allowed the research team to filter FAST exams from all other POCUS exams performed in the ED. The research team assessed the quality of each FAST exam using the TSC score created by Ziesmann et al. in 2015 [13]. The TSC was based on whether the imaging of 24 specific anatomic landmarks, split into four anatomic regions (the right and left upper quadrants, pelvis, and pericardium), was completed successfully or not. The criteria for awarding a point for each anatomic landmark are shown in Table 1. The maximum score for the exam is 24, and the minimum score is 0. In the Ziesmann study, a score of 18 represented an 87% probability of expert status, with a sensitivity of 85.7% and a specificity of 75.0% for prediction of expertise, and an area under the receiver operating characteristic curve (AUROC) of 89.9% (95% CI, 0.782–1.000). Based on these data, we used a score of 18 as the cutoff for expert status.
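As a concrete illustration, the binary landmark scoring described above can be sketched in a few lines of Python. This is a minimal sketch only: the region names and the even six-landmarks-per-region split are our assumptions for illustration, consistent with the 24-landmark, four-region structure; the actual landmark definitions are those in Table 1.

```python
def tsc_score(checklist):
    """Compute a TSC total from binary landmark results.

    checklist maps each anatomic region to a list of booleans, one per
    landmark: True (scored 1) if the landmark was imaged successfully,
    False (scored 0) otherwise. Returns the total score (0-24) and the
    per-region subtotals.
    """
    per_region = {region: sum(marks) for region, marks in checklist.items()}
    return sum(per_region.values()), per_region


# Hypothetical example exam: the resident images every pelvic landmark
# but misses one landmark in each upper quadrant and two pericardial
# (subxiphoid) landmarks.
exam = {
    "right_upper_quadrant": [True] * 5 + [False],
    "left_upper_quadrant": [True] * 5 + [False],
    "pelvis": [True] * 6,
    "pericardium": [True] * 4 + [False] * 2,
}
total, subtotals = tsc_score(exam)  # total is 20 out of 24
```

Note that the cutoff of 18 is applied to a resident's average score over 10 consecutive exams, not to any single exam's total.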

Data were collected and image quality scored by two Emory University AEMUS fellows. The first 10 FAST exams for each resident were reviewed; if the average score for the first 10 exams was 18 or higher, the resident was considered an expert. If the resident failed to achieve that score, the first exam was skipped and the average score was calculated for the next 10 exams (exams two through eleven). If the resident still failed to achieve the desired score, the first and second exams were skipped and the average score was calculated for the next 10 exams. This sequence was repeated until the resident achieved an average score of 18 or higher. After the AEMUS fellows finished scoring, 25% of the scored exams were reviewed for accuracy by an AEMUS fellowship-trained and focused practice designation (FPD) certified faculty member.
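The sliding-window rule above amounts to finding the first window of 10 consecutive exams whose mean TSC score reaches 18; the number of exams "needed" is then the position of the last exam in that window. A minimal sketch of this procedure, assuming scores are listed in the order the exams were performed (function and variable names are ours, not from the paper):

```python
def exams_to_competency(scores, window=10, threshold=18):
    """Return how many total exams a resident needed before a run of
    `window` consecutive exams averaged >= threshold, or None if no
    such run exists among the archived exams.

    scores: TSC scores (each 0-24) in chronological order.
    """
    for start in range(len(scores) - window + 1):
        if sum(scores[start:start + window]) / window >= threshold:
            # Competency is credited after the last exam in this window.
            return start + window
    return None
```

For example, a resident scoring 5 on the first exam and 19 on each of the next ten would fail on the first window (mean 17.6) but pass on the window covering exams two through eleven, for a total of 11 exams, matching the skip-the-first-exam step described above.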

Table 1 Task specific checklist hepatorenal space

Results

A total of 59 EM residents from the graduating classes of 2021, 2022, and 2023 were enrolled in the study, and 663 of their scans were assessed for quality. The average number of FAST exams required to qualify for independent FAST performance was 11.23 (95% confidence interval [95% CI], 10.6–11.85), ranging from 10 to 21 (Fig. 1). 66.1% of enrolled residents achieved an average score of 18 or higher on their first 10 FAST exams, and 33.8% of residents required more than 10 scans, with a maximum of 21 scans to attain the desired score. The average scores for the RUQ, LUQ, pelvic, and subxiphoid views were 5 (95% CI, 4.88–5.1), 4.7 (95% CI, 4.59–4.8), 5.1 (95% CI, 4.96–5.24), and 3.7 (95% CI, 3.6–3.8), respectively (Fig. 2).

Fig. 1

Number of focused assessment with sonography in trauma (FAST) exams needed for emergency medicine residents to achieve a TSC score of 18 or higher

Fig. 2

Task-specific checklist (TSC) scores. TSC scores for each view of the FAST exam stratified by year of residency training

There were no significant differences in the average scores across the three classes. The 2022 EM residents reached a score of 18 or higher after an average of 10.8 scans, compared to 11.7 for the 2021 residents and 11.2 for the 2023 residents. The average scores for the RUQ, LUQ, pelvic, and subxiphoid views were 4.9, 4.8, 5.2, and 3.8, respectively, for the 2021 class; 5, 4.5, 5.3, and 3.7, respectively, for the 2022 class; and 5, 4.7, 4.6, and 3.5, respectively, for the 2023 class.

In the anatomic subset analysis, the pelvic view had the highest average score (5.1), whereas the subxiphoid view had the lowest (3.7).

Discussion

POCUS differs from other advanced imaging techniques in that imaging is performed and interpreted by the clinician at the point of care, rather than being performed by a technologist and interpreted by a radiologist or other physician trained to interpret images. POCUS is highly operator dependent, and its accuracy varies according to practitioner skill [19, 20]. Obtaining and effectively interpreting the images are the responsibility of the clinician sonologist [20]. The lack of high-quality images has a negative impact on diagnostic accuracy [16]. Using a validated assessment tool like the QUICK score to assess the quality of the FAST exam is an efficient way to assess the competence of the residents in performing the exam [6]. The study achieved its primary goal by calculating the number of FAST examinations required for each resident to achieve competency for independent exam performance and interpretation.

Our study showed that the average resident becomes competent in performing the FAST exam after attending the POCUS “bootcamp” and receiving constructive feedback on 11.23 FAST exams. This result is consistent with the study by Shackford et al., which showed that the error rate fell from 17% to 5% after clinicians had performed 10 FAST exams [17].

Leavitt RM et al. [21] found that a 15-minute digital training module followed by a 2-hour ultrasound scanning session was an effective way for participants to acquire POCUS skills. The authors used a questionnaire to compare participants’ confidence levels before and after the exam. Gracias et al. [18] showed that the learning curve for FAST starts to flatten at 30 to 100 examinations; in that study, the learner’s ability to recognize free fluid in the abdomen served as the indicator of competency. However, neither study used validated objective measures to assess image quality. Our study improved on this by utilizing objective measures and a scoring system to assess the quality of the images.

The subxiphoid view was the most difficult for residents to obtain, with a mean score of 3.7 out of 6. This finding is in line with Ziesmann et al. [13], who reported a novice mean score of 1.88 out of 6 and an expert mean score of 4.3 out of 6 for the subxiphoid view. Based on these findings, we suggest additional supervision and vigilance when a new learner is obtaining the subxiphoid view of the FAST exam. ACEP recommends that learners complete a minimum of 25–50 quality-reviewed FAST exams before graduation [7]. Based on our findings, ACEP could lower the minimum number of FAST exams from the current requirement to 21, since the highest number of FAST exams required by any resident in our study to achieve a score of 18 or higher was 21. Alternatively, rather than relying on a specific number of exams, residencies wishing to employ competency-based evaluation methods can utilize the TSC and QUICK scores to determine competency.

The study has a few limitations. First, the study assessed the competency of EM residents performing FAST exams at a Level I trauma center with a well-structured emergency medicine residency program and a well-established emergency ultrasound section; the results may not translate to EM residency programs without these resources and expert AEMUS faculty. Second, EM residents perform a variety of POCUS exams throughout their intern year, which strengthens their overall POCUS skills; trainees without this POCUS exposure may require a higher number of FAST exams to qualify for independent scanning. Finally, in this study the trainee either received bedside feedback or the AEMUS faculty sent electronic feedback via Qpath about the quality of the FAST exam; more research is needed to determine which method is more effective for the learner.

Conclusion

This study found that the average first-year EM resident can become competent to perform FAST exams independently after completing 10–12 (average 11.23) FAST exams with constructive feedback during the first year of training. However, additional research across a variety of residency programs is required to externally validate these findings. Furthermore, future research could include the creation of checklists for other POCUS exam types to evaluate competency in those exams.

Data availability

No datasets were generated or analysed during the current study.

References

  1. Bloom BA, Gibbons RC. Focused assessment with sonography for trauma. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2022. [Updated 2021 Jul 31].

  2. Rossaint R, Bouillon B, Cerny V, et al. Management of bleeding following major trauma: an updated European guideline. Crit Care. 2010;14(2):R52. https://doi.org/10.1186/cc8943.

  3. Blivaiss C, Grewal NP, Steiner A, Puya M. How accurate is our FAST: 1-year review of FAST exam concordance with CT findings in patients with solid organ injury at a Level 2 community hospital trauma center. J Sci Innov Med. 2019;2(2):20. https://doi.org/10.29024/jsim.35.

  4. Rossaint R, Bouillon B, Cerny V, Coats TJ, Duranteau J, Fernández-Mondéjar E, Hunt BJ, Komadina R, Nardi G, Neugebauer E, Ozier Y, Riddez L, Schultz A, Stahel PF, Vincent JL, Spahn DR; Task Force for Advanced Bleeding Care in Trauma. Management of bleeding following major trauma: an updated European guideline. Crit Care. 2010;14(2):R52. https://doi.org/10.1186/cc8943.

  5. Pinto A, Pinto F, Faggian A, Rubini G, Caranci F, Macarini L, Genovese EA, Brunese L. Sources of error in emergency ultrasonography. Crit Ultrasound J. 2013;5(Suppl 1):S1. https://doi.org/10.1186/2036-7902-5-S1-S1.

  6. Pinto A, et al. Sources of error in emergency ultrasonography. Crit Ultrasound J. 2013;5(Suppl 1):S1. https://doi.org/10.1186/2036-7902-5-S1-S1.

  7. ACEP Policy Statement. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. 2016. https://www.acep.org/globalassets/sites/acep/media/ultrasound/pointofcareultrasound-guidelines.pdf

  8. ACGME program requirements for graduate medical education in emergency medicine. https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/110_EmergencyMedicine_2020.pdf?ver=2020-06-26-125701-320&ver=2020-06-26-125701-320

  9. Royal College of Physicians and Surgeons of Canada. Competency-based medical education. Ottawa, Ontario, Canada; 2010.

  10. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648–654.

  11. Buaprasert P, et al. Diagnostic accuracy of extended focused assessment with sonography for trauma performed by paramedic students: a simulation-based pilot study. Open Access Emerg Med. 2021;13:249–256. https://doi.org/10.2147/OAEM.S311376.

  12. Basnet S, Shrestha SK, Pradhan A, et al. Diagnostic performance of the extended focused assessment with sonography for trauma (EFAST) patients in a tertiary care hospital of Nepal. Trauma Surg Acute Care Open. 2020;5:e000438. https://doi.org/10.1136/tsaco-2020-000438.

  13. Ziesmann MT, et al. Validation of the quality of ultrasound imaging and competence (QUICK) score as an objective assessment tool for the FAST examination. J Trauma Acute Care Surg. 2015;78(5):1008–13.

  14. Campos MEC, de Oliveira MMR, Reis AB, de Assis LB, Iremashvili V. Development and validation of a task-specific checklist for a microsurgical varicocelectomy simulation model. Int Braz J Urol. 2020;46(5):796–802. PMID: 32539251; PMCID: PMC7822372.

  15. Wilson CA, Chahine S, Davidson J, Dave S, Sener A, Rasmussen A, Saklofske DH, Wang PZT. Working towards competence: a novel application of borderline regression to a task-specific checklist for technical skills in novices. J Surg Educ. 2021;78(6):2052–2062. https://doi.org/10.1016/j.jsurg.2021.05.004. PMID: 34092532.

  16. Ma OJ, Gaddis G, Norvell JG, Subramanian S. How fast is the focused assessment with sonography for trauma examination learning curve? Emerg Med Australas. 2008;20(1):32–7. https://doi.org/10.1111/j.1742-6723.2007.01039.x. PMID: 18062785.

  17. Shackford SR, Rogers FB, Osler TM, Trabulsy ME, Clauss DW, Vane DW. Focused abdominal sonogram for trauma: the learning curve of nonradiologist clinicians in detecting hemoperitoneum. J Trauma. 1999;46(4):553–562.

  18. Gracias VH, Frankel HL, Gupta R, Malcynski J, Gandhi R, Collazzo L, Nisenbaum H, Schwab CW. Defining the learning curve for the focused abdominal sonogram for trauma (FAST) examination: implications for credentialing. Am Surg. 2001;67(4):364–368.

  19. Shokoohi H, Duggan NM, Adhikari S, Selame LA, Amini R, Blaivas M. Point-of-care ultrasound stewardship. J Am Coll Emerg Physicians Open. 2020;1(6):1326–31. https://doi.org/10.1002/emp2.12279. PMID: 33392540; PMCID: PMC7771754.

  20. Tsou PYC, Chen KP, Wang YH, et al. Diagnostic accuracy of lung ultrasound performed by novice versus advanced sonographers for pneumonia in children: a systematic review and meta-analysis. Acad Emerg Med. 2019;26(9):1074–88. https://doi.org/10.1111/acem.13818.

  21. Leavitt RM, Arpin PA, Nielsen BM, Mason NL. Independent learning of the sonographic FAST exam technique using a tablet-based training module. Am J Disaster Med. 2021;16(2):95–104. https://doi.org/10.5055/ajdm.2021.0392. PMID: 34392522.


Funding

None.

Author information


Contributions

AHB and AAB contributed to the discussion and interpretation of the results. JL, GH and JG developed the methodology, performed the data analysis, and collected data. YA and RA wrote the introduction, and prepared the figures. All the authors reviewed the manuscript.

Corresponding author

Correspondence to Abdullah Bakhsh.

Ethics declarations

Ethical approval

The study was conducted in accordance with the Declaration of Helsinki and was approved by the Ethics Committee (Institutional Review Board (IRB) ID: STUDY00003768, approved on 09/02/2022) with an exemption from informed consent. No specific consent is needed for statistical analyses of aggregated de-identified data. For this study, patients’ identities, including names, screening IDs, patient IDs, and mobile phone numbers, were de-identified.

Consent to participate

The patient consent form was waived. For this study, patients’ identities, including names, screening IDs, patient IDs, and mobile phone numbers, were de-identified.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Bakhribah, A., Leumas, J., Helland, G. et al. How many is enough? Measuring the number of FAST exams needed by emergency medicine trainees to reach competence. Int J Emerg Med 17, 168 (2024). https://doi.org/10.1186/s12245-024-00742-x
