A bridge to nowhere
As Biden would tell you, the original concept was a smart one. The wave of digitization had swept up virtually every industry, bringing both disruption and, in most cases, greater efficiency. And perhaps none of these industries was more deserving of digital liberation than medicine, where life-measuring and potentially lifesaving data was locked away in paper crypts – stack upon stack of file folders at doctors’ offices across the country.
Stowed in steel cabinets, the records were next to useless. Nobody – particularly at the dawn of the age of the iPhone – thought it was a good idea to leave them that way. The problem, say critics, lay in the way policymakers set about transforming them.
“Every single idea was well-meaning and potentially of societal benefit, but the combined burden of all of them hitting clinicians simultaneously made office practice basically impossible,” said John Halamka, chief information officer at Beth Israel Deaconess Medical Center, who served on the EHR standards committees under both President George W. Bush and President Obama. “In America, we have 11 minutes to see a patient, and, you know, you’re going to be empathetic, make eye contact, enter about 100 pieces of data, and never commit malpractice. It’s not possible!”
KHN and Fortune examined more than two dozen medical negligence cases that have alleged that EHRs either contributed to injuries, had been improperly altered, or were withheld from patients to conceal substandard care. In such cases, the suits typically settle prior to trial with strict confidentiality pledges, so it’s often not possible to determine the merits of the allegations. EHR vendors also frequently have contract stipulations, known as “hold harmless clauses,” that protect them from liability if hospitals are later sued for medical errors – even if they relate to an issue with the technology.
But the lawsuits that do emerge from behind this veil, like the one filed by Fabian Ronisky, are quite telling.
Ronisky, according to his complaint, arrived by ambulance at Providence Saint John’s Health Center in Santa Monica on the afternoon of March 2, 2015. For two days, the young lawyer had been suffering from severe headaches while a disorienting fever left him struggling to tell the 911 operator his address.
Suspecting meningitis, a doctor at the hospital performed a spinal tap, and the next day an infectious disease specialist typed an order for a critical lab test – a check of the spinal fluid for viruses, including herpes simplex – into the hospital’s EHR.
The multimillion-dollar system, manufactured by Epic Systems Corp. and considered by some to be the Cadillac of medical software, had been installed at the hospital about four months earlier. Although the order appeared on Epic’s screen, it was not sent to the lab. It turned out that Epic’s software didn’t fully “interface” with the lab’s software, according to a lawsuit Ronisky filed in February 2017 in Los Angeles County Superior Court. His results and diagnosis were delayed – by days, he claimed – during which time he suffered irreversible brain damage from herpes encephalitis. The suit alleged the mishap delayed doctors from giving Ronisky a drug called acyclovir that might have minimized damage to his brain.
Epic denied any liability or defects in its software; the company said the doctor failed to push the right button to send the order and that the hospital, not Epic, had configured the interface with the lab. Epic, among the nation’s largest manufacturers of computerized health records and the leading provider to most of America’s most elite medical centers, quietly paid $1 million to settle the suit in July 2018, according to court records. The hospital and two doctors paid a total of $7.5 million, and a case against a third doctor is pending trial. Ronisky, 34, who is fighting to rebuild his life, declined to comment.
Incidents like those that happened to Ronisky – or to Annette Monachelli, for that matter – are surprisingly common, data show. And the back-and-forth about where the fault lies in such cases is actually part of the problem: The systems are often so confusing (and training on them seldom sufficient) that errors frequently fall into a nether zone of responsibility. It can be hard to tell where human error begins and the technological shortcomings end.
EHRs promised to put all of a patient’s records in one place, but often that’s the problem. Critical or time-sensitive information routinely gets buried in an endless scroll of data, where in the rush of medical decision-making – and amid the maze of pulldown menus – it can be missed.
Thirteen-year-old Brooke Dilliplaine, who was severely allergic to dairy, was given a probiotic containing milk. The two doses sent her into “complete respiratory distress” and resulted in a collapsed lung, according to a lawsuit filed by her mother. Rory Staunton, 12, scraped his arm in gym class and then died of sepsis after ER doctors discharged the boy on the basis of lab results in the EHR that weren’t complete. And then there’s the case of Thomas Eric Duncan. The 42-year-old man was sent home in 2014 from a Dallas hospital infected with Ebola virus. Though a nurse had entered in the EHR his recent travel to Liberia, where an Ebola epidemic was then in full swing, the doctor never saw it. Duncan died a week later.
Many such cases end up in court. Typically, doctors and nurses blame faulty technology in the medical-records systems. The EHR vendors blame human error. And meanwhile, the cases mount.
Quantros, a private health care analytics firm, said it has logged 18,000 EHR-related safety events from 2007 through 2018, 3 percent of which resulted in patient harm, including seven deaths – a figure that a Quantros director said is “drastically underreported.”
A 2016 study by The Leapfrog Group, a patient-safety watchdog based in Washington, D.C., found that the medication-ordering function of hospital EHRs – a feature required by the government for certification but often configured differently in each system – failed to flag potentially harmful drug orders in 39 percent of cases in a test simulation. In 13 percent of those cases, the mistake could have been fatal.
The Pew Charitable Trusts has, for the past few years, run an EHR safety project, taking aim at issues like usability and patient matching – the process of linking the correct medical record to the correct patient – a seemingly basic task at which the systems, even when made by the same EHR vendor, often fail. At some institutions, according to Pew, such matching was accurate only 50 percent of the time. Patients have discovered mistakes as well: A January survey by the Kaiser Family Foundation found that 1 in 5 patients spotted an error in their electronic medical records. (Kaiser Health News is an editorially independent program of the foundation.)
The Joint Commission, which certifies hospitals, has sounded alarms about a number of issues, including false alarms – which account for between 85 and 99 percent of EHR and medical device alerts. (One study by researchers at Oregon Health & Science University estimated that the average clinician working in the intensive care unit may be exposed to up to 7,000 passive alerts per day.) Such over-warning can be dangerous. From 2014 to 2018, the commission tallied 170 mostly voluntary reports of patient harm related to alarm management and alert fatigue – the phenomenon in which health workers, so overloaded with unnecessary warnings, ignore the occasional meaningful one. Of those 170 incidents, 101 resulted in patient deaths.
The Pennsylvania Patient Safety Authority, an independent state agency that collects information about adverse events and incidents, counted 775 “laboratory-test problems” related to health IT from January 2016 to December 2017.
To be sure, medical errors happened en masse in the age of paper medicine, when hospital staffers misinterpreted a physician’s scrawl, for instance, or read the wrong chart, with deadly consequences. But what is perhaps telling is how many doctors today opt for manual workarounds to their EHRs. Aaron Zachary Hettinger, an emergency medicine physician with MedStar Health in Washington, D.C., said that when he and fellow clinicians need to share critical patient information, they write it on a whiteboard or on a paper towel and leave it on their colleagues’ computer keyboards.
While the Food and Drug Administration doesn’t mandate reporting of EHR safety events – as it does for regulated medical devices – concerned posts have nonetheless proliferated in the FDA MAUDE database of adverse events, which now serves as an ad hoc bulletin board of warnings about the various systems.
Further complicating the picture is that health providers nearly always tailor their one-size-fits-all EHR systems to their own specifications. Such customization makes every one unique and often hard to compare with others – which, in turn, makes the source of mistakes difficult to determine.
Dr. Martin Makary, a surgical oncologist at Johns Hopkins and the co-author of a much-cited 2016 study that identified medical errors as the third-leading cause of death in America, credits EHRs for some safety improvements – including recent changes that have helped put electronic brakes on the opioid epidemic. But, he said, “we’ve swapped one set of problems for another. We used to struggle with handwriting and missing information. We now struggle with a lack of visual cues to know we’re writing and ordering on the correct patient.”
Dr. Joseph Schneider, a pediatrician at UT Southwestern Medical Center, compares the transition we’ve made, from paper records to electronic ones, to moving from horses to automobiles. But in this analogy, he added, “our cars have advanced to about the 1960s. They still don’t have seat belts or air bags.”
Schneider recalled one episode when his colleagues couldn’t understand why chunks of their notes would inexplicably disappear. They figured out the problem weeks later after intense study: Physicians had been inputting squiggly brackets – {} – the use of which, unbeknownst to even vendor representatives, deleted the text between them. (The EHR maker initially blamed the doctors, said Schneider.)
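The account doesn’t detail the underlying mechanism, but EHR note editors commonly treat curly braces as template or macro placeholders. The sketch below is a hypothetical illustration, not Epic’s actual code – the macro names and renderer are invented – of how an expansion step like that can silently swallow whatever a clinician types between stray braces.

```python
import re

# Hypothetical note renderer: {...} is treated as a template placeholder.
# Known placeholders expand; unrecognized ones are dropped entirely,
# taking the clinician's text between the braces with them.
KNOWN_MACROS = {"{PATIENT_NAME}": "Jane Doe", "{TODAY}": "2019-03-18"}

def render_note(raw_note: str) -> str:
    def expand(match: re.Match) -> str:
        token = match.group(0)
        return KNOWN_MACROS.get(token, "")  # unknown placeholder -> empty string
    return re.sub(r"\{[^{}]*\}", expand, raw_note)

print(render_note("Seen today, {PATIENT_NAME}. Plan: {start antibiotics, recheck labs}."))
# Output: "Seen today, Jane Doe. Plan: ."  The bracketed plan vanishes without warning.
```

In a design like this, the safer choices are to escape literal braces or warn the author about an unrecognized placeholder rather than silently discarding the text.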
A broad coalition of actors, from National Nurses United to the Texas Medical Association to leaders within the FDA, has long called for oversight of electronic-record safety issues. Among the most outspoken is Ratwani, who directs MedStar Health’s National Center on Human Factors in Healthcare, a 30-person institute focused on optimizing the safety and usability of medical technology. Ratwani spent his early career in the defense industry, studying things like the intuitiveness of information displays. When he got to MedStar in 2012, he was stunned by “the types of [digital] interfaces being used” in health care, he said.
In a study published last year in the journal Health Affairs, Ratwani and colleagues studied medication errors at three pediatric hospitals from 2012 to 2017. They discovered that 3,243 of them were due in part to EHR “usability issues.” Roughly 1 in 5 of these could have resulted in patient harm, the researchers found. “Poor interface design and poor implementations can lead to errors and sometimes death, and that is just unbelievably bad as well as completely fixable,” he said. “We should not have patients harmed this way.”
Using eye-tracking technology, Ratwani has demonstrated on video just how easy it is to make mistakes when performing basic tasks on the nation’s two leading EHR systems. When emergency room doctors went to order Tylenol, for example, they saw a drop-down menu listing 86 options, many of which were irrelevant for the specified patient. They had to read the list carefully, so as not to click the wrong dosage or form – though many do that too: In roughly 1 out of 1,000 orders, physicians accidentally select the suppository (designated “PR”) rather than the oral tablet (“PO”), according to one estimate. That’s not an error that will harm a patient – though other medication mix-ups can and do.
Earlier this year, MedStar’s human-factors center launched a website and public awareness campaign with the American Medical Association to draw attention to such rampant mistakes – they use the letters “EHR” as an initialism for “Errors Happen Regularly” – and to petition Congress for action. Ratwani is pushing for a central database to track such errors and adverse events.
Others have turned to social media to vent. Dr. Mark Friedberg, a health-policy researcher with the Rand Corp. who is also a practicing primary care physician, champions the Twitter hashtag #EHRbuglist to encourage fellow health care workers to air their pain points. And last month, a scathing Epic parody account cropped up on Twitter, earning more than 8,000 followers in its first five days. Its maiden tweet, written in the mock voice of an Epic overlord, read: “I once saw a doctor make eye contact with a patient. This horror must stop.”
As much as EHR systems are blamed for sins of commission, it is often the sins of omission that trip up users even more.
Consider the case of Lynne Chauvin, who worked as a medical assistant at Ochsner Health System, in Louisiana. In a still-pending 2015 lawsuit, Chauvin alleges that Epic’s software failed to fire a critical medication warning; Chauvin suffered from conditions that heightened her risk for blood clots, and though that history was documented in her records, she was treated with drugs that restricted blood flow after a heart procedure at the hospital. She developed gangrene, which led to the amputation of her lower legs and forearm. (Ochsner Health System said that while it cannot comment on ongoing litigation, it “remains committed to patient safety which we strongly believe is optimized through the use of electronic health record technology.” Epic declined to comment.)
Echoing the complaints of many doctors, the suit argues that Epic software “is extremely complicated to view and understand,” owing to “significant repetition of data.” Chauvin said that her medical bills have topped $1 million and that she is permanently disabled. Her husband, Richard, has become her primary caregiver and had to retire early from his job with the city of Kenner to care for his wife, according to the suit. Each party declined to comment.