Background: The latest guidelines on parapneumonic effusion (PPE) management [1] recommend immediate evaluation of PPE with ultrasound. Recognizing that bedside procedure services (BPSs) staffed with procedural hospitalists (proceduralists) are becoming the first point of ultrasound contact for many patients admitted with PPE, our study sought to demonstrate that proceduralists can reliably agree with radiologists (viewed as the gold standard) in determining the presence of loculations in PPEs via ultrasound. To our knowledge, there is no literature demonstrating that proceduralists can reliably diagnose sonographic loculations.

Methods: This was a retrospective study using de-identified images. Inclusion criteria were any patient admitted to Froedtert Hospital with concern for pneumonia who had sonographic pleural imaging and pleural fluid analysis. After applying exclusion criteria, our sample included 82 image sets. These image sets were independently evaluated by four physicians: two radiologists and two proceduralists. Statistical analysis included observer agreement, kappa, and a proportions test. Agreement was defined as all physicians in the comparison group agreeing on the presence or absence of loculations.
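As a rough illustration of the statistics used, pairwise observer agreement and Cohen's kappa for two raters making binary loculation calls can be computed as below. The ratings shown are hypothetical, not the study data:

```python
# Illustrative sketch of pairwise agreement and Cohen's kappa for two
# raters scoring loculations as present (1) or absent (0).
# The rating lists are hypothetical examples, not study data.

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over binary ratings."""
    n = len(r1)
    # Observed agreement: fraction of cases where the raters match.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal positive rate.
    p1 = sum(r1) / n
    p2 = sum(r2) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)
    return (po - pe) / (1 - pe)

radiologist = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
proceduralist = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]

agreement = sum(a == b for a, b in zip(radiologist, proceduralist)) / len(radiologist)
kappa = cohens_kappa(radiologist, proceduralist)
print(f"agreement={agreement:.2f}, kappa={kappa:.2f}")  # agreement=0.80, kappa=0.60
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why a high raw agreement can coexist with a more modest kappa.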

Results: Cross-disciplinary agreement among all four physicians was 76.29%, with a kappa of 0.63. Individual cross-disciplinary agreement ranged from 85.37% to 97.56%, with kappa values ranging from 0.55 to 0.90. Our highest agreement and kappa (97.56% and 0.90, respectively) occurred between a radiologist and a proceduralist. Our lowest kappa (0.55) was also between a radiologist and a proceduralist; this low kappa coincided with the only significant difference in the proportion of perceived loculations (p<0.05). Intradisciplinary agreement was 87.80% for the proceduralists and 90.24% for the radiologists, both with a kappa of 0.64.

Conclusions: This project provides an initial indication that recognition of sonographic loculations is similar across disciplines, given a cross-disciplinary kappa of 0.63 across all proceduralists and radiologists. Additionally, our highest individual agreement and kappa were between a radiologist and a proceduralist. Our study did have several limitations. First, our definition of agreement was strict, which likely accounts for the lower agreement across all four physicians despite otherwise similar kappa values in the individual comparisons. Additionally, the study was retrospective, included images from different departments (radiology, intensive care, and emergency), and did not contain video, all of which limited the physicians' ability to interpret some image sets. We believe a prospective, real-time study would improve both the cross-disciplinary and intradisciplinary agreement and reliability between radiologists and proceduralists, enhancing the role of proceduralists in the management of PPEs.