In response to skyrocketing healthcare costs, providing high value care (HVC) has become an increasing priority for healthcare systems. Physicians are responsible for translating HVC to the bedside; however, there is a paucity of instruments designed to measure observable markers of HVC at the bedside. This hampers efforts to develop and evaluate curricular interventions addressing this important educational gap. Using Messick’s validity framework, our aim was to create an observational tool to capture HVC topics discussed during bedside rounds.


We developed the HVC Rounding Tool in 4 iterative phases. Phase 1 identified observable behaviors of HVC within the literature. Phase 2 utilized a modified Delphi approach with a panel of 19 national experts on HVC; through 2 cycles, the panel narrowed and defined HVC topics. We organized these topics into the HVC Rounding Tool using dichotomous coding, in which an observer identifies each topic as discussed or not discussed during real-time observation of bedside rounds. In Phase 3, we trained raters in the use of the Tool and developed a codebook explaining each item. Lastly, in Phase 4, we piloted the instrument to evaluate its reliability.
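The dichotomous coding scheme described above can be sketched as a simple data structure in which each observed encounter maps every topic to a discussed/not-discussed code. This is a minimal illustration only: the domain names come from the study, but the topic identifiers below are hypothetical placeholders, since the actual 10 topics are defined in Table 1 and the codebook.

```python
# Sketch of a dichotomous observation record for one bedside encounter.
# Domain names come from the study; topic IDs here are hypothetical
# placeholders -- the real 10 topics appear in Table 1 and the codebook.
from dataclasses import dataclass, field

TOOL_TOPICS = {
    "Quality": ["topic_q1", "topic_q2"],          # placeholder topic IDs
    "Cost": ["topic_c1", "topic_c2"],
    "Patient Values": ["topic_p1", "topic_p2"],
}

@dataclass
class EncounterObservation:
    encounter_id: str
    rater: str
    # Each topic is coded True (discussed) or False (not discussed).
    codes: dict = field(default_factory=dict)

    def code_topic(self, topic: str, discussed: bool) -> None:
        self.codes[topic] = discussed

    def domain_counts(self) -> dict:
        """Number of topics coded as discussed, per domain."""
        return {
            domain: sum(self.codes.get(t, False) for t in topics)
            for domain, topics in TOOL_TOPICS.items()
        }

# Example: one rater's real-time observation of a single encounter.
obs = EncounterObservation(encounter_id="enc-001", rater="rater-A")
obs.code_topic("topic_q1", True)
obs.code_topic("topic_c1", False)
obs.code_topic("topic_p1", True)
```

Topics left uncoded default to "not discussed," which mirrors the observational design: only topics actually raised on rounds are marked.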


Through the Delphi process, the panel narrowed an originally proposed 16 HVC items to 10 HVC topics (Table 1) divided into 3 domains (Quality, Cost, Patient Values). Six raters, working in pairs, then observed bedside rounds during 97 patient encounters, and inter-rater reliability was assessed using weighted kappa estimates. Over 3 rounds of rater training, with content analysis and codebook revision, weighted kappa estimates for each domain improved from an initial range of 0.25 (95% CI: 0.08-0.44) to 0.41 (95% CI: 0.15-0.67) to a final range of 0.96 (95% CI: 0.8-1.0) to 1.0. Positive agreement also improved, from 50%-78% to 91.7%-100%.
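For dichotomous codes such as these, weighted kappa reduces to the unweighted Cohen's kappa. A minimal sketch of how kappa and positive agreement can be computed for one rater pair follows; the rating sequences are illustrative examples, not the study's data.

```python
# Minimal sketch: Cohen's kappa and positive agreement for one rater pair
# on dichotomous codes (1 = topic discussed, 0 = not discussed).
# The ratings below are illustrative, not the study's data.

def cohens_kappa(a, b):
    """Unweighted Cohen's kappa for two binary rating sequences."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    p1a, p1b = sum(a) / n, sum(b) / n                # marginal P(code = 1)
    pe = p1a * p1b + (1 - p1a) * (1 - p1b)           # chance agreement
    return (po - pe) / (1 - pe)

def positive_agreement(a, b):
    """Proportion of specific agreement on the positive (discussed) code."""
    both_pos = sum(x == 1 and y == 1 for x, y in zip(a, b))
    disagree = sum(x != y for x, y in zip(a, b))
    return 2 * both_pos / (2 * both_pos + disagree)

rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

print(round(cohens_kappa(rater_a, rater_b), 2))        # 0.58
print(round(positive_agreement(rater_a, rater_b), 2))  # 0.83
```

Kappa corrects raw agreement for the agreement expected by chance given each rater's marginal coding rates, which is why both kappa and positive agreement are worth reporting together.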


Using Messick’s framework for validity evidence, we designed and tested the novel HVC Rounding Tool and showed that it reliably measures observable markers of HVC at the bedside. We demonstrated content validity through the Delphi panel, internal structure through rigorous rounds of instrument testing, and response process through our rater training and codebook revision. The HVC Rounding Tool may be adapted for use as a peer feedback instrument to help physicians integrate HVC topics during bedside rounds. In addition, our Tool could serve as a metric to assess the educational efficacy of future curricula designed to educate both faculty and trainees about HVC.