All 7 quality designation programs included outcomes (ie, readmission rates and/or mortality rates) and publicly reported the hospitals receiving their quality designation. In contrast, only Aetna explicitly included the presence of multidisciplinary clinical care pathways among its quality designation criteria, and only UnitedHealth included surgeon specialization in joint arthroplasty as a criterion for its quality designation program. BCBS Distinction+ and Aetna Institutes of Quality for Orthopedic Surgery were the only 2 quality designations that included at least 1 variable in each of the 7 categories considered (accreditation, volume, structural, process, outcomes, patient experience, and cost/efficiency).
DISCUSSION
As healthcare continues to shift toward value-based delivery and payment models, quality becomes a critical factor in reimbursement and provider rankings. However, quality is a vague term, and many providers likely do not know what is required to be designated as high quality by a particular rating agency. Moreover, there are multiple quality designation programs, each using distinct criteria to determine “quality,” which further complicates the matter. Our objective was to identify the key stakeholders that provide quality designations in TJA and the criteria each organization uses to assess quality.
Our idea of comprehensive quality is based on Avedis Donabedian’s enduring framework for healthcare quality, which focuses on structure, process, and outcome.16 We expanded on these 3 areas and analyzed quality designations based on variables fitting into the following categories: accreditation, volume, structural, process, outcomes, patient experience, and cost/efficiency. We believe that these categories encompass a comprehensive rating system that addresses the key elements of patient care. However, our results suggest that only 2 major quality designations (BCBS Distinction+ and Aetna Institutes of Quality for Orthopedic Surgery) take all 7 categories into account.
All quality designation programs that we analyzed required outcome data (ie, readmission and/or mortality rates within 30 days); however, only 2 programs (BCBS Distinction+ and Aetna Institutes of Quality for Orthopedic Surgery) incorporated cost into their quality designation criteria. Aetna Institutes of Quality for Orthopedic Surgery risk-adjusted its cost-effectiveness calculations for age, sex, and other unspecified conditions using a product known as Symmetry Episode Risk Groups; however, the organization also noted that although it risk-adjusted for inpatient mortality, it did not do so for pulmonary embolism or deep vein thrombosis. BCBS Distinction+ also applied risk adjustment to its cost-efficiency measure, and its step-by-step methodology is available online. Consumer Reports risk-adjusts using logistic regression models in its quality analysis, but the description provided is minimal; the organization notes that these risk adjustments are completed by CMS before Consumer Reports acquires the data. Information on the CMS Compare models is available on the CMS website, and the data used by several organizations and presented on CMS Compare are already risk-adjusted using CMS’ approach. In contrast, UnitedHealth Premium TJR Specialty Center gathers its own data from providers and does not describe a risk adjustment methodology. Risk adjustment matters because, without it, physicians may “cherry-pick” easy cases to boost positive outcomes, yielding greater financial benefits and higher quality ratings. A consistent risk adjustment formula would help ensure accurate comparisons across the outcome and cost-effectiveness measures used by quality designation programs.
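To make the role of risk adjustment concrete, the minimal sketch below illustrates one common approach, indirect standardization with a patient-level logistic regression: a risk model estimates each patient’s expected probability of readmission from case-mix variables, and each hospital’s observed readmissions are then compared with the number expected given its patients. The simulated data, variable names, and use of scikit-learn are illustrative assumptions only; none of the programs discussed publishes its code, and production models such as CMS’ include far more clinical covariates and hierarchical adjustments.

```python
# Illustrative sketch of indirect standardization for risk-adjusted
# readmission comparison. All data are simulated; the covariates below
# (age, sex, comorbidity count) merely stand in for a real case-mix model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated patient-level data for 5000 patients across 20 hospitals.
n = 5000
X = np.column_stack([
    rng.normal(68, 10, n),          # age
    rng.integers(0, 2, n),          # sex (0/1)
    rng.poisson(1.5, n),            # comorbidity count
])
hospital = rng.integers(0, 20, n)   # hospital identifier
true_logit = -3.0 + 0.03 * (X[:, 0] - 68) + 0.4 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))  # 30-day readmission (0/1)

# Fit a patient-level risk model on the full population.
model = LogisticRegression(max_iter=1000).fit(X, y)
expected = model.predict_proba(X)[:, 1]

# For each hospital, compare observed readmissions with the number
# expected given its case mix (observed-to-expected ratio).
for h in range(3):  # first few hospitals for brevity
    idx = hospital == h
    oe = y[idx].sum() / expected[idx].sum()
    print(f"Hospital {h}: O/E readmission ratio = {oe:.2f}")
```

An O/E ratio above 1 indicates more readmissions than the case mix predicts; without this adjustment, a hospital treating healthier patients would appear better on raw rates alone.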
Factors considered for quality designation varied greatly from one organization to another. The number of categories represented ranged from 1 (Consumer Reports considered only outcome data) to all 7 (BCBS Distinction+ and Aetna Institutes of Quality for Orthopedic Surgery). Our findings are consistent with the work by Keswani and colleagues,8 which suggested similar variation in the factors considered when rating hospital quality more broadly. Our work suggests that quality designation formulas become no more consistent when focused on TJA.
We found that all organizations in our analysis published the providers earning their quality designation. However, TJC does not publicly provide a detailed methodology describing how to qualify for its quality designation; the manual containing this information must be purchased for $146.00 by accredited organizations and $186.00 by all others.17 For large healthcare providers, this is not a large sum of money. Nonetheless, it adds a hurdle for stakeholders seeking a full understanding of the requirements for the TJC Gold Seal for Orthopedics.
Previous work has evaluated the consistency and variety of approaches to gauging healthcare quality. Rothberg and colleagues,18 comparing hospital rankings across 5 common consumer-oriented websites, found disagreement on hospital rankings within any given diagnosis and even on individual metrics such as mortality. Halasyamani and Davis19 found that CMS Compare and USNWR rankings were dissimilar, a discrepancy they attributed to differing methodologies. In addition, Krumholz and colleagues20 examined Internet report cards that used appropriate use of select medications and mortality rates for acute myocardial infarction as quality metrics; they found that, in aggregate, there was a clear difference in quality of care and outcomes but that comparisons between any 2 hospitals provided poor discrimination.20 Other work has analyzed the increasing trend of online ratings of orthopedic surgeons by patients.21 However, there remains no agreed-upon definition of quality, so the use of the term “quality” in several studies may be misleading.
Our results must be interpreted with the limitations of our work in mind. First, we used expert knowledge and a public search engine to develop our list of organizations that provide TJA quality designations, so it is possible that we did not include all relevant organizations. Second, although all authors reviewed the final data, human error in the analysis of each organization’s quality designation criteria remains possible.
CONCLUSION
As healthcare progresses further toward a system that rewards providers for delivering value to patients, accurately defining and measuring quality becomes critical because it can serve as an indicator of value for patients, payers, and providers. Furthermore, it gives providers a goal to focus on as they strive to improve the value of the care they deliver. Measuring healthcare quality is currently a novel, imperfect science,22 and there continues to be debate about which factors should be included in a quality designation formula. Nonetheless, more and more quality designations and performance measurements are being created for orthopedic care, including total hip and total knee arthroplasty; in 2016, for example, The Leapfrog Group added readmission for patients undergoing TJA to its survey.23 Consensus on a definition of quality may facilitate the movement toward a value-based healthcare system. Future research should evaluate strategies for gaining consensus among stakeholders on a universal quality metric in TJA. Surgeons, hospitals, payers, and, most importantly, patients should play critical roles in defining quality.