
Utilization of a Pediatric Emergency Department Education Computer

Martin V. Pusic, MD; Kevin B. Johnson, MD; Anne K. Duggan, ScD

From the Division of General Pediatrics, The Johns Hopkins School of Medicine, Baltimore, Md. Dr Pusic is now with the Department of Medical Informatics, Columbia University, New York, NY.


Arch Pediatr Adolesc Med. 2001;155(2):129-134. doi:10.1001/archpedi.155.2.129.

Objective  To describe the use of a computer education station placed within a pediatric emergency department.

Design  Prospective tracking of computer tutorial use.

Setting  A tertiary care pediatric emergency department.

Methods  A computer loaded with two 30-minute multimedia tutorials was installed in the emergency department. The tutorials were designed for residents to use on a voluntary basis but were also available to medical students and allied health professionals. Software tracked the time, date, and duration of use and the user's path through each tutorial. Data were collected from July 15, 1996, through April 30, 1997.

Results  Twenty-eight residents interacted 71 times with the computer during the study. The mean duration of interactions was 22 minutes (SD, 18 minutes; range, 0-75 minutes), but many interactions (15 [21%] of 71) lasted less than 5 minutes. Twenty-four (34%) of the interactions led to tutorial completion. Residents were more likely to complete a tutorial during the day shift (22 [40%] of 55) than during the evening shift (1 [7%] of 14) (P = .02). A third of the interactions occurred during evenings and weekends. In total, the education station delivered 26.1 hours of instruction. Of 32 first-year pediatric and emergency medicine residents, 22 attempted the tutorials; 4 completed both, and 10 completed one. Allied health professionals accounted for 29% of the total interactions. They were significantly more likely than medical trainees to have brief interactions but were no less likely to complete the tutorials (10 [22%] of 46 vs 31 [27%] of 115; P = .44).

Conclusions  Pediatric residents are willing to use an educational computer placed in the emergency department. Choice of form and content should take into account the likelihood of short interactions and the demonstrated interest of allied health professionals.


COMPUTER-AIDED instruction (CAI), delivered largely by CD-ROM and the Internet, has greatly increased the range of options available to residents for self-directed learning. It offers many potential advantages, including scheduling flexibility; variable pace of learning; increased efficiency of learning; greater interactivity; and the capacity for sound, animations, and video.1,2 Computer-aided instruction has been shown to be generally more efficacious than traditional methods in many randomized, controlled trials.3 Commercial development of CAI has increased greatly, and authoring software, which allows in-house development, has become easier to use.2 Educational World Wide Web sites have proliferated as well.4

The best ways to use these new teaching methods have yet to be determined.5 Residents' attitudes toward CAI are generally positive; even so, they rarely purchase educational software, and few use medical education World Wide Web sites.6 Several medical educators have called on program directors to take a more active role in promoting CAI in postgraduate medical education.1,7 Unfortunately, most such efforts have been piecemeal rather than an integral part of an overall curriculum designed to take full advantage of both computer and traditional teaching formats.8

The pediatric emergency department (PED) is a notoriously difficult environment in which to teach residents and medical students. The shift-work nature of clinical coverage decreases faculty and student availability, so a given topic must be presented several times during a typical rotation to ensure that every trainee encounters it. Unpredictable patient flow makes it difficult to schedule teaching sessions during clinical hours. The variable, often seasonal, nature of patient presentation can leave some trainees' experiences incomplete; this is especially true for diseases whose serious presentations are infrequently seen (eg, dysrhythmias, airway problems, and cervical spine fractures). Computer-aided instruction offers solutions to many of these problems. Its advantages in this setting are both logistic and educational. Logistic advantages include continuous availability, portability, and the capacity to track use. Educational advantages include adaptability to the trainee's level and rate of learning, interactivity, immediacy, and consistency.9

One of the most common methods of presenting CAI is the computer education station, where several computer-based learning materials are loaded into a conveniently located computer for use on a voluntary basis.10 We are unaware of prior reports describing how trainees will use such a computer station. Using tracking software, we prospectively monitored use of an educational computer placed in a busy clinical setting. Our objective was to determine whether trainees would successfully use such a station on a voluntary basis.

SETTING

The Johns Hopkins Children's Center, Baltimore, Md, is a tertiary care academic institution affiliated with The Johns Hopkins School of Medicine. It supports a pediatric residency program of 66 residents. Internet access is provided throughout the hospital and at 3 workstations within the PED. The hospital has an extensive World Wide Web site11 that lists many educational resources. The PED has an annual census of 26 000 visits and is staffed by 6 full-time pediatric emergentologists.

SUBJECTS

We studied the station's use by pediatric and emergency medicine residents, medical students, and allied health professionals. Pediatric residents typically spend 6 to 8 weeks on rotation in the PED in each of their 3 years of training. First- and third-year residents have day (8 AM to 5 PM) and evening (5 PM to midnight) shifts. Second-year residents have only evening (noon to midnight) and night (10:30 PM to 8 AM) shifts. Emergency medicine residents complete a mandatory 1-month rotation in their first year and may return in subsequent years. One to two medical students per month rotate through the PED, doing day and evening shifts. Allied health professionals on staff include 27 nurses, 5 emergency medicine technicians, and 2 physician assistants.

TUTORIAL CREATION

Two tutorials, entitled "Fever Without Source" (FWS) and "An Approach to the Interpretation of Cervical Spine X-rays" (CSXR), were written by one of us (M.V.P.) using authoring software (Multimedia Toolbook 4.0; Asymetrix Corporation, Bellevue, Wash). These topics were chosen because they are frequently encountered in pediatric emergency medicine and because they lend themselves to 2 different instructional strategies, which use the computer's advantages. The FWS tutorial allows the user to navigate easily within a large amount of textual material (Figure 1). The CSXR tutorial uses computer graphics to demonstrate visual concepts (Figure 2). The tutorial content is aimed primarily at first-year pediatric residents but would also be appropriate for more senior and more junior trainees. Each tutorial was designed to take a first-year pediatric resident approximately 30 minutes to complete. Users can save their place in either tutorial for resumption at a later time.

Figure 1. Two screen captures from the "Fever Without Source" tutorial. A, The text of the practice guideline and a text box that opens after the user has clicked on a reference number (in this case, 20). B, The MEDLINE abstract of the same reference, accessed by clicking on a "hot word."

Figure 2. Three screen captures from the "Cervical Spine X-ray" tutorial. A, An "unknown" film. B, After the user has clicked over an area of suspected pathologic features in A, the user is given either positive or negative feedback. C, The user can have the findings demonstrated schematically and on the x-ray films.

TUTORIAL CONTENT

The FWS tutorial is based on the guideline by Baraff et al.12 The entire text of the guideline is presented in the tutorial. In addition, an embedded minitutorial describes the fate of a cohort of children presenting with FWS using the probability assumptions underlying the expert panel's management recommendations. Also available to the user are the MEDLINE abstracts of 91 of the 113 references in the article. In all, 258 screens of information are available to the user: 101 in the main body of the tutorial, 66 in the minitutorial, and 91 reference abstract screens. There are 62 context-sensitive help screens and 26 navigation screens. The material is presented to the users in the same order as in the guideline, but users can navigate through the material in any order they choose. References can be accessed by clicking on the reference number in the text (Figure 1A). "Hot words" are used to embed definitions and further information within the text of the tutorial. Multiple-choice questions are also used to engage the user. The tutorial was inspected for face validity by 2 attending pediatricians. It was pilot tested on 10 medical students for comprehensibility and ease of use. Student questions and comments were used to improve the content and form of the tutorial.

The CSXR tutorial differs from the FWS tutorial in that it relies much more extensively on graphic images. X-ray films were obtained by searching medical records for a discharge diagnosis of "cervical spine fracture" and were digitized if they were thought to be of teaching value. We were able to find examples of all major fracture patterns. A tutorial was then created that presents a well-recognized approach to the interpretation of CSXRs13 using the digitized films as examples. The tutorial presents 101 screens of information, including 20 different x-ray films. The user can navigate to any part of the tutorial using a map function. Wherever possible, the tutorial has been made interactive through the presentation of undiagnosed films (Figure 2). The student is asked to use the mouse to indicate on a radiograph where the pathologic features might be and is then given feedback that illustrates the teaching point. An emergency physician and a pediatric radiologist inspected the tutorial for face validity. The tutorial was pilot tested on 5 medical students and 2 pediatric residents who were not subjects of this study.

IMPLEMENTATION

A computer (IBM-compatible 486DX, 66 MHz) with 16 megabytes of RAM and standard sound and video boards was installed in a "low-traffic" area suitable for quiet study. The computer did not offer access to either the hospital information system or the Internet. The 2 tutorials were made available on July 15, 1996, and use was tracked from then through April 30, 1997.

UTILIZATION TRACKING

To track tutorial use, we used the course management system that is part of the computer-based training edition of the authoring software (Multimedia Toolbook 4.0). Users were required to "log on" under 1 of 5 categories: attending physician, nurse, resident, medical student, or other. Emergency medicine technicians and physician assistants used the "other" category. Once the user had logged on, the tutorial of his or her choice was presented. The software logged the time and date of each interaction, the user's category, the title of each tutorial screen, and the time each screen was accessed, allowing us to recreate the user's path through the tutorial. Residents were additionally required to enter their hospital physician identification codes at log-on, which allowed us to track the number of times a given resident interacted with each tutorial.
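
The log format itself is not published. As a minimal sketch, assuming the course management system wrote a flat event log with one row per screen view (the column names below are illustrative, not Toolbook's actual output), reconstructing a user's path is a simple grouping operation:

```python
import csv
from collections import defaultdict

# Hypothetical flat event log. The paper lists the captured fields
# (time and date, user category, screen title, resident ID) but not
# the file format; these column names are assumptions.
# Assumed columns: timestamp, user_category, user_id, tutorial, screen_title

def load_paths(log_file):
    """Group logged screen views into per-user, per-tutorial paths,
    ordered by timestamp (ISO 8601 timestamps assumed)."""
    paths = defaultdict(list)
    with open(log_file, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["user_id"], row["tutorial"])
            paths[key].append((row["timestamp"], row["screen_title"]))
    return {key: sorted(views) for key, views in paths.items()}
```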

Completion of a tutorial was defined as an interaction in which (1) the user completed 75% of the screens that made up the main body of the tutorial and (2) the user reached the summary screen at the end of the tutorial. Interaction duration was defined as the time that elapsed between logging on and logging off. To account for interruptions, if any screen was visited for more than 5 minutes, those minutes were subtracted from the calculated duration of the interaction.
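
Applied to such a log, the two definitions above reduce to a few lines. The sketch below is ours, not the authors' code; the interruption rule is ambiguous as to whether the whole dwell time or only the excess over 5 minutes is subtracted, and this sketch implements the first reading:

```python
from datetime import datetime

IDLE_CUTOFF_MIN = 5  # dwell longer than this is treated as an interruption

def completed(screens_seen, main_body_screens, summary_screen):
    """Completion as defined in the paper: the user viewed >=75% of the
    screens making up the main body AND reached the final summary screen."""
    coverage = len(screens_seen & main_body_screens) / len(main_body_screens)
    return coverage >= 0.75 and summary_screen in screens_seen

def duration_minutes(view_times, logoff):
    """Minutes from log-on to log-off, minus the dwell on any screen
    viewed for more than IDLE_CUTOFF_MIN minutes."""
    boundaries = sorted(view_times) + [logoff]  # datetimes of each screen view
    total = (boundaries[-1] - boundaries[0]).total_seconds() / 60
    for start, end in zip(boundaries, boundaries[1:]):
        dwell = (end - start).total_seconds() / 60
        if dwell > IDLE_CUTOFF_MIN:
            total -= dwell  # subtract the interrupted screen's full dwell
    return total

# Example: a 3-screen visit with a 20-minute interruption on screen 2.
t = lambda h, m: datetime(1996, 7, 15, h, m)
views = [t(9, 0), t(9, 4), t(9, 24)]
print(duration_minutes(views, logoff=t(9, 28)))  # 28 - 20 = 8.0
```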

RESIDENT COMPUTER EXPERIENCE

We compared residents who completed at least one tutorial with those who did not on 3 characteristics: computer ownership, Internet access at home, and exposure to medical education software in medical school. This information was collected by survey just before the start of the study.6

ANALYSES

Group differences for continuous variables that were not normally distributed were compared using the Mann-Whitney test. Group differences for categorical variables were compared using the χ2 test. Statistical significance was defined as P<.05. The Institutional Review Board approved the study.
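
For illustration only, a minimal sketch of these two tests in Python with SciPy; the duration data are made up, and the 2 × 2 table uses the day- vs evening-shift completion counts quoted in the abstract. The paper does not state whether a continuity correction was used, so SciPy's default Yates correction may not reproduce the published P values exactly:

```python
from scipy import stats

# Mann-Whitney test for a non-normally distributed continuous variable,
# eg, interaction durations in minutes (hypothetical values).
group_a = [22, 35, 4, 41, 18, 30]
group_b = [3, 12, 25, 8, 15]
u_stat, p_mw = stats.mannwhitneyu(group_a, group_b)

# Chi-square test on a 2x2 table of completions vs noncompletions by
# shift: day 22 of 55, evening 1 of 14 (counts from the abstract).
table = [[22, 55 - 22],
         [1, 14 - 1]]
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

print(f"Mann-Whitney P = {p_mw:.3f}; chi-square P = {p_chi:.3f}")
```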

RESIDENT USE

Twenty-eight emergency medicine and pediatric residents interacted 71 times with the computer tutorials during the 10-month study (Table 1). The tutorials were most popular with the first-year residents. Ten completed 1 tutorial, and 4 completed both. Only 6 of 64 second- and third-year residents attempted the tutorials. Excluding interactions in which the resident did not advance beyond the instruction screens, the station delivered 26.1 hours of instruction to the residents.

Table 1. Resident Use of Computer Tutorials*

The number and duration of individual interactions by the residents with each tutorial are shown in Figure 3. The mean duration of the interactions was 22 minutes (SD, 18 minutes) and did not differ between the 2 tutorials. Of the 71 interactions, 15 (21%) lasted less than 5 minutes, although the tutorials were designed to last 30 minutes. In 11 (15%) of the 71 interactions, the resident did not progress beyond the instruction screens. Although the completion rate of the CSXR tutorial (15 [43%] of 35) was higher than that of the FWS tutorial (9 [25%] of 36), the difference was not statistically significant (P = .12). Residents who completed a tutorial required an average of 1.5 interactions to do so (range, 1-4 interactions). Second interactions were usually less than a week after the first (16 [80%] of 20). Three residents completed a given tutorial twice.

Figure 3. The cumulative number and duration of interactions for the 28 residents (eg, resident 23 did the "Cervical Spine X-ray" tutorial twice and the "Fever Without Source" tutorial twice, but only 1 interaction led to tutorial completion). A, "Fever Without Source" tutorial. B, "Cervical Spine X-ray" tutorial. Asterisks indicate completed interactions.

Tutorial use varied by time of day (Table 2). Most interactions occurred during the day shift (55 [77%] of 71), with 31 of the 71 occurring between 8 and 10:59 AM. However, a substantial proportion (24 [34%] of 71) occurred during evenings and weekends, when didactic instruction would not normally be available. Only 2 interactions occurred during the night shift. The residents were off duty during 11 (17%) of the 64 interactions for which schedule information was available. Residents who started a tutorial during the day shift were far more likely to complete it than were those who started during the evening, even though the mean duration of their interactions was similar.

Table 2. Resident Interactions With Computer Tutorials by Time of Day*

Survey information on computer access and experience was available for 21 of the 28 residents who attempted the tutorials. Residents who completed at least one tutorial (12/21) did not differ from those who did not with respect to computer ownership, Internet access at home, or use of medical education software in medical school.

Few residents took advantage of the 91 hyperlinked reference abstracts available in the FWS tutorial: 8 abstracts were accessed, in 4 of the 36 FWS interactions. The minitutorial embedded within the FWS tutorial was completed in half of the interactions.

USE BY OTHERS

Computer station use was not limited to residents (Table 3). As a group, medical students, nurses, and others interacted with the station more often than the residents did (90 vs 71 interactions). When contrasted with the medical trainees (medical students and residents), the allied health professionals (nurses and others) were more likely to have an interaction that lasted less than 5 minutes but were no less likely to complete a tutorial during a given interaction (Table 4).

Table 3. Number and Duration of Interactions With Computer Tutorials by User Category*
Table 4. Comparison of Completion Rates and Duration of Interactions for Allied Health Professionals and Medical Trainees

COMMENT

Many head-to-head comparisons have shown computer-based learning materials to be as good as or better than traditional methods.3 However, they are rarely integrated into the teaching curricula of undergraduate and postgraduate medical programs.8 Part of the reason may be that they are usually presented in a voluntary, ad hoc fashion that does little to encourage trainees to take ownership of the content.8,10 Our study examined whether instructional materials can be delivered despite these concerns.

We determined the use of a computer education station placed in a PED. The station presented 2 tutorials, developed by one of us (M.V.P.), that would be of interest to residents on rotation in the emergency department. The tutorials were presented to trainees as optional learning materials rather than as an integrated part of the curriculum. By tracking tutorial use, we showed that instructional materials were indeed delivered to a significant proportion of the target population; however, many of the interactions failed, and only half of the first-year pediatric residents completed 1 of the 2 tutorials.

There were several limitations to our study. Because the tutorials were developed by one of us, it may be difficult to generalize our results to education stations that others may organize. However, the tutorials were based on content familiar to most pediatric emergency medicine practitioners: one was based on a standard pediatric emergency medicine textbook,13 while the other was based on a widely recognized practice guideline.12 Trainees might have been more willing to use the station had a greater variety of materials been available; for example, senior residents may have perceived that they were already familiar with the topics presented. Having residents log on under their own names made it possible for us to track their interactions with the station but may have altered their willingness to use the tutorials. Finally, the sample was relatively small, so we may have missed statistically significant differences in our between-group comparisons.

We expected our pediatric residents to be receptive to CAI because they have been shown to have a high rate of computer ownership (>60%) and a positive attitude toward CAI.6 The culture of the institution is favorable to information technology, and the residents are required to use an electronic patient record. Most of the target (first-year) residents did attempt the tutorials. We were, however, disappointed that only a handful of second- and third-year residents did so. This may be because the second-year residents' shifts did not include the morning, which proved to be the most popular time for use of the station, and because the more senior residents had greater responsibility while in the PED. Integrating the computer tutorials into the curriculum of the trainees could increase participation because presenting the tutorials as a voluntary activity sends an implicit message that they are not valued as highly as other more traditional learning activities.8,10

A pleasant surprise was the willingness of allied health professionals to use the education station. They accounted for 29% (46/161) of the interactions and were as likely as medical trainees to complete a tutorial. A fixed education station within the PED could be an excellent vehicle for delivering instruction to allied health professionals.

We hoped that the computer education station would help overcome some of the scheduling difficulties associated with teaching within the PED. While weekday mornings were the most popular time, a third of the interactions (24/71 [34%]) occurred on evenings or weekends, times when didactic teaching is difficult to schedule. While the computer station may have increased off-hours learning, it would be important to explore why the tutorial completion rate was lower during evening shifts than during the day shift.

Many of the interactions were short: 37% (60/161) lasted less than 5 minutes. It may be that time pressures did not allow users to interact meaningfully with tutorials designed to last 30 minutes. Most residents who completed the tutorials required more than one session to do so. Educational interventions designed to accommodate short interaction times might therefore be more effective for the pediatric resident on duty in the PED. The total amount of instruction delivered by the station (44.3 hours) during the 10-month study is low compared with the time required to develop the materials, which should be considered when allocating resources for such an undertaking. Use of off-the-shelf instructional CD-ROMs or Internet medical education sites may be more efficient. However, much of our development time was spent mastering the authoring software, so development of future tutorials would be expected to take less time. In addition, this software and other authoring platforms (eg, PowerPoint 97) have become easier to use and more powerful even since the start of this study.14

While our results suggest that pediatric residents and others are willing to use a computer education station, further studies are required to fully demonstrate the effectiveness of CAI in the PED. Explicit measurement of knowledge gain would be helpful as would assessment, in this setting, of the attitudes of learners and educators.

In summary, computer tutorials presented on a dedicated computer in the emergency department can deliver a significant amount of educational material. Optimization of the form, duration, and content of the tutorials might increase their appeal. In particular, the relatively short interactions observed in this setting suggest that content should be presented in discrete, brief modules. The choice of content and its presentation should also take into account the demonstrated interest of allied health professionals.

Accepted for publication August 14, 2000.

This study was funded in part by a special projects grant from the Ambulatory Pediatrics Association, McLean, Va.

Presented at the annual meeting of the Ambulatory Pediatrics Association, New Orleans, La, May 5, 1998.

We thank Allen Walker, MD, for donating the computer hardware used in this study and reviewing the manuscript; David McGillivray, MD, for reviewing the manuscript; and Pauline Kerr, MD, and Andrew Grant, MD, for helping to write the tutorials.

Corresponding author: Martin V. Pusic, MD, Department of Medical Informatics, Columbia University, 622 W 168 St, Vanderbilt Clinic Bldg, Fifth Floor, New York, NY 10032 (e-mail: martin.pusic@dmi.columbia.edu).

REFERENCES

1. Chodorow S. Educators must take the electronic revolution seriously. Acad Med. 1996;71:221-226.
2. Santer DM, D'Alessandro MP, Huntley JS, Erkonen WE, Galvin JR. The multimedia textbook: a revolutionary tool for pediatric education. Arch Pediatr Adolesc Med. 1994;148:711-715.
3. Jelovsek FR, Adebonojo L. Learning principles as applied to computer-assisted instruction. MD Comput. 1993;10:165-172.
4. Spooner SA. On-line resources for pediatricians. Arch Pediatr Adolesc Med. 1995;149:1160-1168.
5. Friedman CP. The research we should be doing. Acad Med. 1994;69:455-457.
6. Pusic MV. Pediatric residents: are they ready to use computer-aided instruction? Arch Pediatr Adolesc Med. 1998;152:494-498.
7. Koschmann T. Medical education and computer literacy: learning about, through, and with computers. Acad Med. 1995;70:818-821.
8. Friedman RB. Top ten reasons the World Wide Web may fail to change medical education. Acad Med. 1996;71:979-981.
9. Hardin PC, Reis J. Interactive multimedia software design: concepts, process and evaluation. Health Educ Behav. 1997;24:35-53.
10. Barnett GO. Information technology and medical education. J Am Med Inform Assoc. 1995;2:285-291.
11. Johns Hopkins University Department of Pediatrics home page. Available at: http://www.med.jhu.edu/peds/pedspage.html. Accessed October 26, 2000.
12. Baraff LJ, Bass JW, Fleisher GR, et al. Practice guideline for the management of infants and children 0 to 36 months of age with fever without source. Ann Emerg Med. 1993;22:1198-1210.
13. Woodward GA. Neck trauma. In: Fleisher GR, Ludwig S, eds. Textbook of Pediatric Emergency Medicine. 3rd ed. Baltimore, Md: Williams & Wilkins; 1993:1124-1142.
14. Santer DM, Michaelsen VE, Erkonen WE, et al. A comparison of educational interventions: multimedia textbook, standard lecture, and printed textbook. Arch Pediatr Adolesc Med. 1995;149:297-302.
