Modified endoscopy documentation software can automatically generate endoscopic retrograde cholangiopancreatography (ERCP) quality metrics, based on a trial at two referral centers.
Providers were prompted during procedures, and entering any missed data took less than 30 additional seconds per patient. The approach led to highly accurate quality reports, lead author Gregory A. Coté, MD, MS, of the Medical University of South Carolina, Charleston, and colleagues wrote in Techniques and Innovations in Gastrointestinal Endoscopy.
The investigators suggested that these findings may lead to the kind of quality reports already used for colonoscopy, which are easier to produce. Such reports are important, they wrote, as the U.S. health care system shifts to value-based reimbursement models, which in turn puts greater scrutiny on the quality of endoscopic procedures. However, doing so with ERCP isn’t entirely straightforward.
“Measuring adherence to ERCP quality indicators is especially challenging given: variance in indications, intraprocedural maneuvers, potential outcomes of a complex procedure, and variability in physician report documentation,” Dr. Coté and colleagues wrote. “In order to operationalize robust tracking of clinically relevant adherence to ERCP quality indicators in clinical practice – that is, to provide real-time feedback to providers, health systems, payors, and patients – an automated system of measurement must be developed.”
The quality indicators used in the study were largely drawn from an American Society for Gastrointestinal Endoscopy/American College of Gastroenterology task force document, excluding those that were subjective or required systematic follow-up. The investigators modified existing endoscopy documentation software at two referral centers to include mandatory, structured data fields, principally incorporating quality indicators deemed high priority by the society consensus document, the study authors, or both. For instance, providers were required to select a specific indication rather than choosing among various synonymous terms (for example, “biliary stricture” vs. “common bile duct stricture”). Examples of quality indicators included successful cannulation of the desired duct, successful retrieval of stones smaller than 10 mm, and successful placement of a bile duct stent when indicated. Endoscopists were also required to note the presence of postoperative foregut anatomy or of an existing sphincterotomy, variables that serve to stratify the quality indicator outcomes by degree of difficulty and allow appropriate comparisons of data. In addition, the study authors included inquiries about use of rectal indomethacin, use of a prophylactic pancreatic duct stent, and documentation of the need for repeat ERCP, follow-up x-ray, or both.
After 9 months, the system recorded 1,376 ERCP procedures conducted by eight providers, with a median annualized volume of 237 procedures (range, 37-336). Almost one-third (29%) of the patients had not had prior sphincterotomy.
Automated ERCP reports were compared with manual record review, which confirmed high (98%-100%) accuracy. This high level of accuracy “obviates the need for manual adjudication of medical records,” the investigators wrote.
They used data from one provider to create a template report card. Although exact comparisons across providers and institutions were not reported, the example report card published with the study showed how such comparisons could be generated in the real world.
“The tool presented in this study allows for an objective assessment of ERCP performance which can provide explicit feedback to providers and allow transparent assessment of quality outcomes; it has the potential to improve the quality of ERCP akin to what has been demonstrated using colonoscopy report cards,” the investigators wrote. “Importantly, this can be achieved with minimal alteration to providers’ routine procedure documentation.”
Dr. Coté and colleagues also noted that the software modifications “can be implemented in other endoscopy units using the same or similar software.”
Taking the project to the next level would require widespread collaboration, according to the investigators.
“A key next step is to operationalize the transfer of data across multiple institutions, allowing for the creation of interim, standard-quality indicator reports that could be disseminated to providers, health systems, and payors,” they wrote. “If applied to a national cohort, this tool could accurately assess the current landscape of ERCP quality and provide tremendous opportunities for systematic improvement.”
One author disclosed a relationship with Provation Medical, but the remaining authors declared no relevant conflicts.