METHODS: We used 5 data collection tools to evaluate the implementation of the intervention, and a combination of descriptive, quantitative, and qualitative analyses. Triangulation was used to attain a complete understanding of the quality of implementation. Twenty-two intervention practices with a total of 54 physicians participated in a randomized controlled trial that took place in Southwestern Ontario, Canada. The key process measures were the frequency of and time required to deliver intervention components, the scope of delivery and the utility of the components, and physician satisfaction with the intervention.
RESULTS: Of the 7 components in the intervention model, prevention facilitators (PFs) visited the practices most often to deliver the audit and feedback, consensus building, and reminder system components. All of the study practices received preventive performance audit and feedback, achieved consensus on a plan for improvement, and implemented a reminder system. Ninety percent of the practices implemented a customized flow sheet, and 10% used a computerized reminder system. Ninety-five percent of the intervention practices wanted critically appraised evidence for prevention, 82% participated in a workshop, and 100% received patient education materials in a binder. Content analysis of the physician interviews and bivariate analysis of physician self-reported changes between intervention and control group physicians revealed that the audit and feedback, consensus building, and development of reminder systems were the key intervention components. Ninety-five percent of the physicians were either satisfied or very satisfied with the intervention, and 90% would have been willing to have the PF continue working with their practice.
CONCLUSIONS: Primary care practices in Ontario can implement significant changes in their practice environments that will improve preventive care activity with the assistance of a facilitator. The main components for creating change are audit and feedback of preventive performance, achieving consensus on a plan for improvement, and implementing a reminder system.
A randomized controlled field trial of a multifaceted intervention to improve preventive care, tailored to the needs of participating family practices and delivered by nurses trained in the facilitation of prevention, was conducted in Southwestern Ontario. We focus here on the process evaluation and complement the outcome evaluation1 by describing how the program was implemented in the intervention practices.
Improving preventive performance is both important and necessary. There is substantial room to improve the rates of appropriate preventive practice.2 The Canadian Task Force on the Periodic Health Examination3,4 has established guidelines for the delivery of preventive care that are supported by clinical evidence as effective in decreasing the impact of disease. However, evidence-based guidelines are not self-implementing.5-7 Changing physicians’ long-held patterns of behavior and the environments in which they work is complex and difficult. Unless the barriers to change can be overcome and actions taken to put preventive care guidelines into practice, evidence-based guideline development efforts will be wasted, and the quality of preventive care will not improve.8
Several reviews have focused on the effectiveness of different interventions for implementing guidelines and improving care.6,7,9-13 Multifaceted interventions employing trained individuals who meet with providers in their practice settings to provide information and assist the practice in implementing evidence-based guidelines have been shown to be more effective than single interventions.11-14 Tailoring interventions to the requirements of the practice has also been proposed as important for supporting practice changes and for attaining better preventive care performance than fixed interventions that lack this flexibility.15-17
As important as knowing what interventions work to improve preventive care performance is understanding why they work. The techniques of process evaluation allow the investigator to determine the extent to which the intervention designed to change practice patterns was actually implemented as planned. Adequate documentation of process facilitates replication and fine-tuning of the intervention.
Intervention Description
Our study built on the work of Fullard and colleagues18 and used a tailored, multifaceted approach to getting evidence into action by addressing the educational, attitudinal, and organizational barriers to change and adapting interventions to the needs of the practice.17-24 The intervention employed 3 prevention facilitators (PFs), each with a master’s degree in community nursing and with skills and previous experience in facilitation. Each PF had primary responsibility for up to 8 primary care practices with up to 6 physicians per practice.
The PFs underwent 30 weeks of intensive training before being assigned to randomly selected intervention practices. The training covered an orientation session, medical office computer systems, medical practice management, prevention in primary care, evidence-based medicine, and facilitation and audit skills development. Approximately 28 hours per week were spent in training and 7 hours per week in preparation and planning. Six of the 30 weeks of training were spent applying skills in a primary care office setting. Once in the field, they were instructed to offer 7 intervention strategies designed to change practice patterns and improve preventive care performance. The strategies were identified from reviews of the literature and constituted the multifaceted component of the intervention.10,11 The PFs were permitted to tailor these strategies to the needs and unique circumstances of the practice. The strategies were: (1) audit and ongoing feedback, (2) consensus building, (3) opinion leaders and networking, (4) academic detailing and education materials, (5) reminder systems, (6) patient-mediated activities, and (7) patient education materials.