
Improving Immunization Rates in Private Pediatric Practices Through Physician Leadership

Jeffrey S. Sinn, PhD; Ardythe L. Morrow, PhD; Albert B. Finch, MD
Author Affiliations

From the Department of Psychology, Winthrop University, Rock Hill, SC (Dr Sinn); Center for Pediatric Research, Eastern Virginia Medical School and Children's Hospital of the King's Daughters (Drs Sinn and Morrow); and Department of Pediatrics, Children's Hospital of the King's Daughters and Its Physician Partners and Eastern Virginia Medical School (Dr Finch), Norfolk.


Arch Pediatr Adolesc Med. 1999;153(6):597-603. doi:10.1001/archpedi.153.6.597.

Objective  To determine whether a physician-led quality improvement initiative can improve immunization rates in participating private practices.

Design  Surveys of private pediatric practices at 6-month intervals over an 18-month period.

Setting  Ten private pediatric practices in Norfolk and Virginia Beach, Va.

Patients  Children aged 9 to 30 months attending the private practices.

Interventions  Practice immunization rates were assessed and presented to practices on 4 occasions at 6-month intervals. A physician leader convened an immunization task force meeting following the first 3 assessments to review practice guidelines, examine data, and discuss practice changes.

Main Outcome Measures  Practice immunization rates for patients at age 24 months, with 3- and 12-month immunization rates as secondary outcomes.

Results  The mean practice immunization rate at age 24 months increased significantly (P<.05) from 50.9% at baseline to 69.7%. Rates also increased at age 3 months, from 75.5% to 88.9%, and at age 12 months, from 72.9% to 84.6%. The median age at administration of the fourth dose of diphtheria toxoid, tetanus toxoid, and pertussis vaccine decreased (P<.05) from 17.6 to 16.8 months. Physicians also reported making additional changes, including improved record keeping and screening for immunizations at every visit.

Conclusion  A quality improvement initiative enabling physician leadership can improve preschool immunization practices and coverage levels in pediatric practices.


The immunization rate of 2-year-old children serves as a key national indicator for the quality of pediatric care.1 The 1996 National Immunization Survey indicates that at least 22% of US children failed to receive all recommended doses of diphtheria toxoid, tetanus toxoid, and pertussis vaccine (DTP), poliovirus vaccine, measles, mumps, and rubella vaccine (MMR), and Haemophilus influenzae type b vaccine by age 2 years.2 Improving immunization rates requires a multifaceted strategy with physicians playing a critical role in the process of improvement.3-5 Although evidence suggests that implementing the Standards for Pediatric Immunization Practices improves immunization rates,6-8 physicians often fail to implement such preventive care guidelines,9 and physician education strategies alone often fail to prompt practice change.10

In contrast to provider education strategies, continuous quality improvement (CQI) efforts have produced significant change in preventive care practices.11-15 For example, the Centers for Disease Control and Prevention (CDC) advocate a strategy known as assessment, feedback, incentives, and information exchange (AFIX), which has improved immunization rates in public clinics by as much as 20% to 40%.16,17 Such strategies also have succeeded in selected private practices when health maintenance organizations have influenced physicians through peer reviews and financial incentives.18

Most private practices, however, have not adopted CQI strategies to improve immunization outcomes, and typically neither managed care organizations nor public health officials can compel participation in immunization assessment. Private physicians may resist participation because they often overestimate their rates, underestimate the need for objective measurement of their practices,3,19,20 and view the assessment process as invasive and demeaning. Furthermore, when they are required to participate, physicians typically have only minimal involvement in the process.18 If not engaged as active participants, physicians may not develop a sense of ownership of the process and, consequently, lack motivation to use assessment data or to consider changes in their practices.

To actively engage private pediatricians in a CQI initiative for immunizations, we developed and tested a strategy we refer to as the physician leadership model. In this model, a physician facilitator convenes an immunization task force composed of physician peers who represent different practices. These representatives meet biannually to review assessment data and consider strategies for practice improvement. A technical team facilitates the assessment and organizational change processes. Grounded in an action research model of intervention, this strategy uses influence tactics developed from the study of organizational change, preventive care practices, and innovation adoption. To determine the effectiveness of the intervention, we assessed improvement in pediatric immunization rates over time.

STUDY DESIGN

The study involved 10 pediatric group practices located in the Norfolk and Virginia Beach, Va, area and affiliated with the children's hospital. All 10 practices participated in the intervention. Evaluation of the intervention outcomes was based on a preintervention immunization assessment followed by 3 subsequent assessments at 6-month intervals.

INTERVENTION: THE PHYSICIAN LEADERSHIP MODEL
Components

The physician leadership model incorporates several different influence tactics for changing physician behavior,21,22 including (1) an opinion leader, (2) academic detailing, (3) goal setting with feedback, (4) peer review, and (5) peer influence. To encourage change in immunization practices, a well-respected local physician functions as an opinion leader by convening the task force and encouraging the adoption of practice innovations.23 In this role, the physician facilitator provides academic detailing24 by highlighting specific practice innovations for producing higher immunization rates. The process enables participants to set goals for coverage levels, adopt innovations to reach those goals (eg, immunizing during visits for acute conditions), and monitor progress by reviewing assessment data.25,26 Motivation to adopt innovations is enhanced by an implicit peer review process in which physicians challenge themselves to become the top performers27,28 and by a peer influence process in which physicians learn of innovations adopted by other practices.29

Process

Researchers collaborated with the medical director of the local physician-hospital organization to initiate a Community Pediatrician Immunization Task Force. The physician leader (A.B.F.) invited 10 of the largest pediatric practices in the region to participate in assessments of their immunization practices. Trained assessors from the academic medical center conducted standardized assessments in the participating practices. The practices received reports providing practice immunization rates and diagnostic data (eg, use of opportunities for simultaneous administration). At the same time, the physician leader invited practices to send a representative to the first task force meeting.

The physician leader began the first meeting by reviewing several key points from the Standards for Pediatric Immunization Practices regarding screening, contraindications, simultaneous administration, record keeping, tracking systems, and practice assessments.6 The group then reviewed the blinded assessment data, identified problems, and discussed ways to improve immunization rates. Participants agreed to reconvene every 6 months to examine new assessment data, review their immunization practices, and consider changes.

After the initial meeting, the task force met 3 more times. The second and third meetings followed a format similar to that of the first meeting. During the second meeting, physician participants expressed several concerns. First, some physicians were concerned that the standard assessment methods produced biased results by including in the sample patients who should be considered inactive. To address this concern, the technical team consulted with the participating physicians to generate alternative assessment criteria that might better delineate active patients. Second, some physicians expressed a desire to see individual patient histories to adequately interpret the practice immunization rates. To meet this need, the technical team developed practice reports that profiled individual patients who were behind schedule.

During the third meeting, immunization rates of practices were presented using the standard CDC criteria and the proposed alternative criteria. In addition, practices received data that profiled the immunization histories of patients who had been 2 or more months overdue for an immunization. These patient profiles indicated the age of the patient, age when specific immunizations had been received, and time elapsed since last seen. The profile also indicated whether the patient had started receiving immunizations at the practice in question and whether a missed opportunity might have occurred. Following each meeting, participants were asked to indicate briefly in writing what changes they were planning or had already made and their reactions to various aspects of the task force process.

Practices were assessed a fourth time, although these data were not reviewed at the fourth meeting of the task force. Instead, during the fourth meeting, participants took part in a focus group to help guide development of the health department's immunization tracking system.

IMMUNIZATION ASSESSMENT METHOD

Practice assessments were conducted using standard CDC methodology and the Clinic Assessment Software Application (CASA).30 Records were systematically selected for all children who met the criteria for an active patient and who were between the ages of 9 and 30 months. This age range was designed to maximize sensitivity to improvements in practice immunization rates: (1) the sample size was largest in the first year of age, when improvements in rates were expected to be smallest; and (2) since assessments were conducted every 6 months, a new cohort of children aged 24 to 30 months was included each round. A patient was deemed active unless documentation indicated the patient had moved or gone elsewhere for care, or had never received an immunization from the practice. Documented vaccine doses were entered into CASA. For all practices and rounds combined, a total of 7269 patient records were assessed (mean, 182 per practice assessment), of which 2904 patient records pertained to children aged 24 months or older (mean, 72 per practice assessment).

Based on the entered data, CASA calculated the proportion of patients who were up-to-date on recommended immunizations at age 3, 5, 7, 12, 19, and 24 months, how much the rate would increase if immunizations were always administered simultaneously, and a dropout rate determined by the proportion of patients who had their first DTP dose by age 12 months but lacked their fourth DTP dose at age 24 months. We report up-to-date rates based only on receiving recommended doses of DTP (including acellular pertussis-containing and combination vaccines), poliovirus vaccine, and MMR.
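To make the coverage arithmetic concrete, the sketch below shows in Python how an up-to-date rate at age 24 months and the DTP dropout rate described above could be computed from abstracted immunization histories. The record layout, field names, and required dose counts are illustrative assumptions for this sketch; they do not reproduce the CASA file format or its exact algorithms.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative record layout; CASA's actual data format differs.
@dataclass
class PatientRecord:
    age_months: float
    # antigen name -> ages (in months) at which documented doses were given
    doses: Dict[str, List[float]] = field(default_factory=dict)

# Illustrative required doses by the 24-month milestone; set to the schedule in use.
REQUIRED_BY_24_MONTHS = {"DTP": 4, "polio": 3, "MMR": 1}

def up_to_date(rec: PatientRecord, milestone: float, required: Dict[str, int]) -> bool:
    """True if every required dose was documented by the milestone age."""
    return all(
        sum(1 for age in rec.doses.get(antigen, []) if age <= milestone) >= n
        for antigen, n in required.items()
    )

def coverage_rate(records: List[PatientRecord], milestone: float,
                  required: Dict[str, int]) -> float:
    """Proportion of patients old enough for the milestone who are up-to-date."""
    eligible = [r for r in records if r.age_months >= milestone]
    return sum(up_to_date(r, milestone, required) for r in eligible) / len(eligible)

def dtp_dropout_rate(records: List[PatientRecord]) -> float:
    """Patients with a first DTP dose by 12 months who lack a fourth dose at 24 months."""
    started = [r for r in records
               if r.age_months >= 24 and any(a <= 12 for a in r.doses.get("DTP", []))]
    dropped = sum(1 for r in started
                  if sum(1 for a in r.doses["DTP"] if a <= 24) < 4)
    return dropped / len(started)

# Example: one child meeting the illustrative 4 DTP / 3 polio / 1 MMR requirement.
records = [PatientRecord(26, {"DTP": [2, 4, 6, 17], "polio": [2, 4, 6], "MMR": [13]})]
print(coverage_rate(records, 24, REQUIRED_BY_24_MONTHS))  # 1.0
print(dtp_dropout_rate(records))                          # 0.0
```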

CRITERIA FOR ACTIVE PATIENTS

At the third assessment, immunization rates were computed using CDC criteria and the physicians' alternative criteria for defining active patients. The alternative criteria identified patients as active if the patients had started receiving immunizations and had at least received the second dose of DTP or poliovirus vaccine at the practice in question, and were seen within the last 6 months. These criteria were designed to exclude patients who entered the practice already behind schedule, did not regularly receive immunizations at the practice, or had gone elsewhere for care.
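As a rough sketch of how these alternative criteria might be applied during record review, the function below flags a record as active only when the immunization series was started at the practice, at least a second DTP or poliovirus dose was given there, and the child was seen within the prior 6 months. The field names are assumptions made for illustration, not part of the study's abstraction instrument.

```python
def is_active_alternative(record: dict, months_since_last_visit: float) -> bool:
    """Alternative active-patient screen (illustrative field names)."""
    second_dose_given_here = (
        record.get("dtp_doses_here", 0) >= 2
        or record.get("polio_doses_here", 0) >= 2
    )
    return (
        record.get("started_series_here", False)  # began immunizations at this practice
        and second_dose_given_here                # at least the second DTP or polio dose here
        and months_since_last_visit <= 6          # seen within the last 6 months
    )

# A child who entered the practice mid-series and has not been seen recently is excluded.
print(is_active_alternative(
    {"started_series_here": False, "dtp_doses_here": 1, "polio_doses_here": 1},
    months_since_last_visit=9,
))  # False
```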

STATISTICAL METHODS

The effectiveness of the intervention strategy on practice immunization rates was assessed using a 1-way, within-practice analysis of variance with assessment time (1, 2, 3, or 4) as the independent variable. Similar analyses were computed on median age at the administration of the fourth dose of DTP and missed opportunities. The difference in immunization rate as a function of what criteria were used to define an active patient was assessed with a dependent Student t test, with assessment method (CDC criteria vs alternative criteria) as the independent variable and immunization rate as the dependent variable.
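A minimal sketch of the 2 analyses follows, assuming a long-format table with one immunization rate per practice per assessment round. The synthetic data, variable names, and the use of statsmodels' AnovaRM for the within-practice analysis of variance and SciPy's paired t test are our assumptions about how such analyses could be run, not the authors' actual code.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)

# Synthetic long-format data: one row per practice per assessment round (1-4).
rows = [
    {"practice": f"P{i}", "time": t, "rate_24mo": rng.normal(50 + 6 * t, 8)}
    for i in range(1, 11)
    for t in range(1, 5)
]
df = pd.DataFrame(rows)

# One-way, within-practice (repeated-measures) ANOVA with assessment time as the factor.
anova = AnovaRM(data=df, depvar="rate_24mo", subject="practice", within=["time"]).fit()
print(anova)

# Dependent (paired) t test: alternative vs CDC criteria rates at the third round.
cdc_rates = rng.normal(65, 10, size=10)
alt_rates = cdc_rates + rng.normal(18, 5, size=10)  # alternative criteria run higher
t_stat, p_value = stats.ttest_rel(alt_rates, cdc_rates)
print(f"t(9) = {t_stat:.2f}, P = {p_value:.3f}")
```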

PRACTICE CHARACTERISTICS

Table 1 presents the characteristics of participating practices. From these data and previous research,31 we estimate that one third of patients in the practices were insured by Medicaid or Medicaid managed care, 15% were insured by the military, 5% to 7% were noninsured, and 45% to 47% were privately insured. One practice attended no meetings but received comparative feedback on immunization rates and a summary of each meeting and, thus, may have been indirectly influenced by the peer review process.

Table 1. Characteristics of Pediatric Practices and Meeting Attendance
ASSESSMENT OF IMMUNIZATION RATES OVER TIME

Practice immunization rates were computed using standard CDC criteria for defining active patients. Comparing times 1 and 4, children were significantly more likely to be up-to-date at age 3 months (75.5% vs 88.9%; F3,27=23.05), 12 months (72.9% vs 84.6%; F3,27=10.72), and 24 months (50.9% vs 69.7%; F3,27=12.14) (P<.001 for all ages). Figure 1 shows the practice rates for children at age 24 months for assessments at times 1 through 4.

Figure 1. Immunization rates by practice for children at age 24 months. Practice coding differs from that in Table 1 to preserve practice confidentiality.

Additional analyses tested for specific changes in immunization practices that could explain the improved rates. To examine changes in the standard practice for the administration of the fourth dose of DTP, we analyzed age at the administration of the fourth dose of DTP among patients who were up-to-date at age 24 months. The policies of most practices appear to have changed, as the median age at the administration of the fourth dose of DTP for these patients averaged across practices decreased from 17.6 months at time 1 to 16.8 months at time 4 (F3,27=6.17, P<.05). However, the data do not suggest more simultaneous administration of vaccines. The potential gain in rates at age 24 months that practices could have achieved by using missed opportunities for simultaneous administration did not decrease from time 1 (3.8%) to time 4 (3.6%).

STANDARD VS ALTERNATIVE ASSESSMENT

During the third meeting, participants were provided with rates computed using the standard CDC criteria for active patients and alternative criteria endorsed by most of the physician participants. The alternative criteria generated higher rates relative to the CDC criteria for up-to-date status at age 3 months (92.5% vs 87.9%; t9=4.65), 12 months (91.4% vs 82.6%; t9=5.36), and 24 months (82.9% vs 65.2%; t9=4.65) (P<.001 for all ages). Figure 2 presents the practice rates calculated at time 3 using the 2 criteria for patients aged 24 months. In addition, exploratory analyses suggested that the percentage of practice patients with Medicaid insurance correlates with the CDC rate (r=−0.61, P=.06) but not with the alternative rate (r=0.18, P=.62).

Figure 2. Immunization rates by practice, comparing Centers for Disease Control and Prevention (CDC) and alternative criteria at time 3. Practice coding differs from that in Table 1 to preserve practice confidentiality.
QUALITATIVE OUTCOMES

Qualitative data collected from participants after each task force meeting suggest that physicians were influenced by the process and made specific changes in immunization practices (Table 2). Furthermore, the task force agreed on 2 strategies that required collaboration among practices and with an external agency. On noting that many medical records indicated patients had not been seen in the past 6 months or year, participants agreed to help the state health department develop an immunization information system (tracking system) and initiate a postcard reminder system. Finally, physicians indicated that they valued their participation and the exchange of ideas made possible by the process. Respondents reported that the data "opened our eyes" and that the process "brought diverse [pediatric] practices to the table for discussion." One participant noted, "Only by reviewing the data as a group can we begin to make significant changes."

Table 2. Practice Changes Reported by Physicians After the Third Assessment Round*

This study examined the effect of a physician leadership strategy on immunization coverage levels within private pediatric practices at 4 biannual assessments. The data suggest that the intervention improved immunization coverage levels for children aged 24 months by 19 percentage points. Although the absence of a control group prevents a definite conclusion, knowledge of the community and discussion with participating physicians have suggested no alternative hypotheses. A managed care organization in the region instituted a financial incentive program just before the third assessment round, but the participating physicians did not consider it to be an influence and the program occurred late in this study. In addition, Norfolk was the site of a CDC-funded immunization coalition demonstration project that involved community interventions in 1994 and 1995, including physician and public education, linkage with the Women, Infants, and Children program, and other initiatives.32,33 However, the baseline assessment reported herein, conducted in June 1996, reflects the immunization rates of the pediatric practices after community interventions were instituted. Thus, none of the initiatives we are aware of explain the rate increase reported herein.

Our findings are consistent with previous studies that have examined efforts to influence the immunization practices of private physicians. In one study,18 a managed care organization used peer review, feedback, and financial incentives to achieve an 18–percentage point increase in MMR vaccinations of young children during a 3-year period. In another study,34 small groups of physicians examined their rates for vaccinating older adults against influenza, and subsequently improved their rates the next influenza season by 17 percentage points relative to a control condition.

The present study complements these previous studies in several ways. First, the present study achieved an increase in rates comparable with that of the MMR study18 without the use of financial incentives. Second, the present study demonstrated continual improvements on multiple follow-up assessments rather than on only a single subsequent assessment, as used in the influenza study.34 Third, and most important, the present study extends previous findings by using a more stringent criterion of behavior change. The 2 studies18,34 previously mentioned attempted only to increase the provision of a single immunization (MMR or influenza vaccine). The criterion of up-to-date status at age 24 months (ie, 4 DTP vaccines, 2 poliovirus vaccines, and 1 MMR vaccine) is a more stringent measure of behavior change because it requires physicians to provide series of immunizations on schedule. The present study demonstrates that a quality improvement process in the private sector can be effective even when judged by this more difficult standard. Fourth, the present study demonstrates that the process can result in physician endorsement of effective strategies, such as immunization tracking and parent reminder messages.

Given the nature of the data captured by a CASA assessment, the present study is limited in its ability to identify the specific practice changes that increased overall rates. Statistical analyses did suggest, however, that practices began administering the fourth dose of DTP earlier, and revealed that improvements over time were not attributable to greater simultaneous administration of vaccines. Physician self-report suggests increased attention to using all clinical encounters to screen for immunization status, increased willingness to administer immunizations to children with mild illnesses, improved record keeping, and, to a lesser extent, attention to parental education status.

KEY ELEMENTS OF THE PHYSICIAN LEADERSHIP MODEL

Given the apparent success of the intervention, what specific elements likely account for the behavior change? Compared with other CQI strategies applied to pediatric immunization practices (eg, the AFIX strategy), the physician leadership strategy has several advantages. First, physicians are more likely to be influenced by a process initiated and facilitated by a respected peer rather than by an external auditor.23 Second, the action research model involves physicians as collaborators, which makes physicians less defensive and more willing to consider making changes in their practice. Third, because the process involves a group of physicians, they can influence one another. For example, one pediatrician who doubted the practicality of providing immunizations during office visits for acute conditions committed to adopting this practice after discussing the issue with other pediatricians. Fourth, the presentation of blinded, practice-specific data fosters informal peer review, challenging participants to become top performers.29

ADDITIONAL BENEFITS

The physician leadership model provides exciting additional benefits. First, this model can foster greater diffusion of a CQI strategy within a given geographic region. By starting with a group of prominent practices within a given community, the task force attracts the attention of other physicians. Success with the initial group can then serve as a springboard for involving the remaining practices in a community. For example, other pediatric practices in our region have asked to participate. Second, the collaborative nature of the physician leadership model can enable physicians and assessors to develop better communication and tools for quality of care. In the present study, for example, discussions between the physicians and the technical team addressed methodological issues such as how to define the active patient population, and the physicians learned to value population outcomes data while the assessors learned to provide patient profiles (ie, data organized as a historical picture of individual patients) to complement the aggregate data.

Third, physicians can use this model as a tool for exercising positive leadership in their communities. Arguing that physicians can lead improvements in health care systems, Reinertsen35 offers several principles for physician leadership. The physician leadership model incorporates many of these principles, including working for change, leading through action, defining reality with data, and examining practice processes in detail. Although Reinertsen conceptualizes leadership as a solitary effort, behavioral scientists argue that the function of leadership can be shared among a set of individuals.36 The physician leadership model thus offers the opportunity for physicians to pool their time and talent to provide the collective leadership necessary for addressing critical issues.

LIMITATIONS AND METHODOLOGICAL CONSIDERATIONS

Several limitations of the physician leadership strategy should be noted. The standard CASA assessment methodology is labor intensive, requiring nearly a week of work for one assessor per practice. Less labor-intensive alternatives need to be studied, although they may provide less reliable data. Also, the expertise provided by local academic institutions served as a catalyst for this process; initiating the process without such expertise could prove difficult. Although the process strives to create physician leadership, it may not be initiated by such leadership. However, clinicians will likely become more receptive as they become more familiar with outcomes research and CQI methods.

The present study also raises concerns about using the standard assessment method in a private practice setting. Developed in the public health setting, the standard method makes assumptions that may not be appropriate for private practices. Previous research has demonstrated that assessments based on medical record reviews alone are biased by the inclusion of inactive patients and missing immunization histories.37 Prompted by physician concern, we assisted participants in developing an alternative definition of their active patient population. For the third assessment round, the alternative criteria yielded rates that were 18 percentage points higher than those based on the CDC standard criteria. Furthermore, the rates obtained using CDC standard criteria appear to correlate with the proportion of patients with Medicaid insurance, whereas the rates obtained using the alternative criteria do not. Thus, it may be misleading to use the CDC standard method for comparing quality of care in practices that serve different populations. The discrepancy in rates between the 2 sets of criteria highlights the need for public health agencies and private physicians to reach consensus on an operational definition of active patients, one that challenges physicians to proactively manage their patient population yet acknowledges that physicians are not responsible for patients who choose to leave their practice.

Based on the success of this intervention model, further research is needed to systematically study the approach in other settings, using a controlled experimental design. Although this method has been developed in the context of immunization, it is generalizable to other primary health care outcomes.

Accepted for publication October 27, 1998.

This study was supported in part by the Division of Immunization, Virginia Department of Health, Richmond.

We gratefully acknowledge R. Clinton Crews, MPH, and J. Andy McCraw, MPH, for data collection; James B. Farrell, Virginia Department of Health, for his support and encouragement; Carolyn Moneymaker, MD, for strategy consultation; Jonathan Turner, MD, for data analysis; and the participating physicians from the Children's Hospital of The King's Daughters and Its Physician Partners, Norfolk, Va, for their active collaboration.

Reprints: Ardythe L. Morrow, PhD, Center for Pediatric Research, 855 W Brambleton Ave, Norfolk, VA 23510 (e-mail: amorrow@chkd.com).

Editor's Note: This study shows that continuous quality improvement is not only academic.—Catherine D. DeAngelis, MD


References

1. Corrigan JM, Griffith H. NCQA external reporting and monitoring activities for health plans: preventive services programs. Am J Prev Med. 1995;11:393-396.
2. Centers for Disease Control and Prevention. State and national vaccination coverage levels among children aged 19-35 months—United States, April-December 1994. MMWR CDC Surveill Summ. 1995;44:613-623.
3. Grabowsky M, Orenstein WA, Marcus EK. The critical role of provider practices in undervaccination. Pediatrics. 1996;97:735-737.
4. Santoli JM, Szilagyi PG, Rodewald LE. Barriers to immunizations and missed opportunities. Pediatr Ann. 1998;27:366-374.
5. Udovic SL, Lieu TA. Evidence on office-based interventions to improve childhood immunization delivery. Pediatr Ann. 1998;27:354-361.
6. Ad Hoc Working Group for the Development of Standards for Pediatric Immunization Practices. Standards for Pediatric Immunization Practices. JAMA. 1993;269:1817-1822.
7. Lieu TA, Black SB, Sorel ME, Ray P, Shinefield HR. Would better adherence to guidelines improve childhood immunization rates? Pediatrics. 1996;98:1062-1068.
8. Pierce C, Goldstein M, Suozzi K, et al. The impact of Standards for Pediatric Immunization Practices on vaccination coverage levels. JAMA. 1996;276:626-630.
9. Lomas J, Anderson GM, Domnick-Pierre K, et al. Do practice guidelines guide practice? The effect of a consensus statement on the practice of physicians. N Engl J Med. 1989;321:1306-1311.
10. Haynes RB, Davis DA, McKibbon KA, Tugwell P. A critical appraisal of the efficacy of continuing medical education. JAMA. 1984;251:61-64.
11. Leininger LS, Finn L, Dickey L, et al. An office system for organizing preventive services: a report by the American Cancer Society Advisory Group on Preventive Health Care Reminder Systems. Arch Fam Med. 1996;5:108-115.
12. Carney PA, Dietrich AJ, Keller A, Landgraf J, O'Connor GT. Tools, teamwork, and tenacity: an office system for cancer prevention. J Fam Pract. 1992;35:388-394.
13. Palmer RH, Peterson LE. Development and testing of performance measures for quality improvement in clinical preventive services. Am J Prev Med. 1995;11:402-416.
14. Centers for Disease Control and Prevention. Prevention and managed care: opportunities for managed care organizations, purchasers of health care, and public health agencies. MMWR CDC Surveill Summ. 1995;44(RR-14):6.
15. Berwick DM. Continuous improvement as an ideal in health care. N Engl J Med. 1989;320:53-56.
16. Centers for Disease Control and Prevention. Recommendations of the Advisory Committee on Immunization Practices: programmatic strategies to increase vaccination rates—assessment and feedback of provider-based vaccination coverage information. MMWR CDC Surveill Summ. 1996;45:219-220.
17. LeBaron CW, Chaney M, Baughman AL, et al. The impact of measurement and feedback on vaccination coverage in public clinics, 1988-1994. JAMA. 1997;277:631-635.
18. Morrow RW, Gooding AD, Clark C. Improving physicians' preventive health care behavior through peer review and financial incentives. Arch Fam Med. 1995;4:165-169.
19. Leaf DA, Neighbor WE, Schaad D, Scott CS. A comparison of self-report and chart audit in studying resident physician assessment of cardiac risk factors. J Gen Intern Med. 1995;10:194-198.
20. Woo B, Woo B, Cook EF, Weisberg M, Goldman L. Screening procedures in the asymptomatic adult: comparison of physician's recommendations, patient's desires, published guidelines, and actual practice. JAMA. 1985;254:1480-1484.
21. Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations: from "official" to "individual" clinical policy. Am J Prev Med. 1988;4:77-97.
22. Lawrence RS. Diffusion of task force recommendations: diffusion of the US Preventive Services Task Force recommendations into practice. J Gen Intern Med. 1990;5(suppl):S99-S103.
23. Lomas J, Enkin M, Anderson GM, et al. Opinion leaders vs audit and feedback to implement practice guidelines: delivery after previous cesarean section. JAMA. 1991;265:2202-2207.
24. Soumerai SB, Avorn J. Principles of educational outreach ("academic detailing") to improve clinical decision making. JAMA. 1990;263:549-556.
25. Locke EA, Latham GP. A Theory of Goal Setting and Task Performance. Englewood Cliffs, NJ: Prentice Hall International Inc; 1990:1-62.
26. Mugford M, Banfield P, O'Hanlon M. Effects of feedback of information on clinical practice: a review. BMJ. 1991;303:398-402.
27. Winickoff RN, Coltin KL, Morgan MM, Buxbaum RC, Barnett GO. Improving physician performance through peer comparison feedback. Med Care. 1984;22:527-534.
28. Barton MB, Schoenbaum SC. Improving influenza vaccination performance in an HMO setting: the use of computer-generated remedies and peer comparison feedback. Am J Public Health. 1990;80:534-536.
29. Mittman BS, Tonesk X, Jacobson PD. Implementing clinical practice guidelines: social influence strategies and practitioner behavior change. Qual Rev Bull. 1992;18:413-422.
30. Clinic Assessment Software Application: CASA User's Guide, Version 3.2. Atlanta, Ga: Centers for Disease Control and Prevention; 1997.
31. Morrow AL, Rosenthal J, Lakkis HD, et al. A population-based study of access to immunization among urban Virginia children served by public, private, and military health care systems. Pediatrics [serial online]. 1998;101:e5. Available at: http://www.pediatrics.org/cgi/content/full/101/2/e5.
32. Butterfoss FD, Morrow AL, Rosenthal J, et al. CINCH: an urban coalition for empowerment and action. Health Educ Behav. 1998;25:212-225.
33. Rosenthal J, Morrow AL, Butterfoss FD, Stallings V. Design and baseline results of an immunization community intervention trial in Norfolk, Virginia. Pediatr Ann. 1998;27:418-423.
34. Karuza J, Calkins E, Feather J, Hershey CO, Katz L, Majeroni B. Enhancing physician adoption of practice guidelines: dissemination of influenza vaccination guideline using a small-group consensus process. Arch Intern Med. 1995;155:625-632.
35. Reinertsen JL. Physicians as leaders in the improvement of health care systems. Ann Intern Med. 1998;128:833-838.
36. Kerr S, Jermier JM. Substitutes for leadership: their meaning and measurement. Organ Behav Hum Performance. 1978;22:375-403.
37. Darden PM, Taylor JA, Slora EJ, et al. Methodological issues in determining rates of childhood immunization in office practice: a study from Pediatric Research in Office Settings (PROS). Arch Pediatr Adolesc Med. 1996;150:1027-1031.
