Original Research

Using Quality Indicators to Assess and Improve Human Research Protection Programs at the VA

The main purpose of collecting these data is to promote quality improvement. Each year VAORO provides feedback to VA research facilities by giving each facility its QI data along with the national and network averages, so that each facility knows where it stands at the national and VISN levels. It is hoped that, with this information, facilities can identify strengths and weaknesses and carry out quality improvement measures accordingly.

Several potential explanations exist for the observed improvements. The improvements could be due to reporting errors, for example, if facilities were underreporting noncompliance. However, underreporting is unlikely, because the data were collected from independent RCO audits of ICDs and regulatory protocol audits. At VA, RCOs report directly to institutional officials and function independently of the Research Service.

Some facilities also may have systematically “gamed the system” to make their programs look better; for example, an IRB might become less likely to suspend a protocol that should be suspended. Although these possibilities cannot be ruled out completely, the authors believe they are unlikely. First, not all QIs improved; in particular, the rate of lapses in IRB continuing reviews remained high and unchanged from 2010 to 2012. In addition, routine on-site reviews of facility HRPPs have independently verified some of the improvements observed in these QI data.

Two areas in need of improvement have been identified: lapses in IRB continuing reviews and studies requiring CRADO approval. Both areas can be easily improved if facilities are willing to devote effort and resources to improving IRB procedures and practices. In a previous study based on 2011 QI data, the authors reported that VA facilities with small human research programs (< 50 active human research protocols) had a 3.2% rate of lapses in IRB continuing reviews; facilities with medium research programs (50-200 active protocols) had a rate of 5.5%; and facilities with large research programs (> 200 active protocols) had a rate of 8.6%.14 Thus, facilities with large research programs particularly need to improve their IRB continuing review processes.

In addition to promoting quality improvement, these data provide opportunities to answer a number of important questions regarding HRPPs. For example, based on 2011 QI data, the authors previously showed that HRPPs of facilities using their own VA IRBs and those using affiliated university IRBs as their IRBs of record performed equally well. This finding provided scientific data for the first time to support the long-standing VA policy that it is acceptable for a VA facility to use either its own IRB or the affiliated university IRB as the IRB of record.4,13 Likewise, there has been concern that facilities with small research programs may not have sufficient resources to support a vigorous HRPP.

In a previous study based on analysis of 2011 QI data, the authors showed that HRPPs of facilities with small research programs performed at least as well as those of facilities with medium and large research programs.14 Facilities with large research programs seemed to perform less well than facilities with small and medium research programs, suggesting that facilities with large research programs may need to allocate additional resources to support their HRPPs.

Two fundamental questions remain unanswered. First, are these QIs optimal for evaluating HRPPs? Second, do high-quality HRPPs, as measured by these QIs, actually provide better protection for human research subjects? Although no clear answers to these important questions exist at this time, there is a clear need to measure the quality of HRPPs. Undoubtedly, current QIs will need to be modified or new ones added. In the meantime, the authors are sharing their experience with academic and other non-VA research institutions as they develop their own QIs for assessing the quality of their HRPPs.

Acknowledgement
The authors wish to thank J. Thomas Puglisi, PhD, chief officer, Office of Research Oversight, for his support and critical review of the manuscript and thank all VA research compliance officers for their contributions in conducting audits and collecting the data presented in this report.

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the U.S. Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations (including indications, contraindications, warnings, and adverse effects) before administering pharmacologic therapy to patients.
