Educational Intervention

Use of E-mail to Teach Residents Pediatric Emergency Medicine

Eva M. Komoroski, MD

From the University of Arkansas for Medical Sciences, Department of Pediatrics, Emergency Medicine Section, Arkansas Children's Hospital, Little Rock.


Arch Pediatr Adolesc Med. 1998;152(11):1141-1146. doi:10.1001/archpedi.152.11.1141.

Objectives  To develop a computer-based teaching program using a hospital health care system to instruct pediatric and medicine-pediatric residents (MPR) in pediatric emergency medicine, and to determine residents' participation, interest, and benefit from the project over 3 years' time.

Design  Prospective, descriptive.

Setting  University-affiliated pediatric hospital.

Participants  Pediatric housestaff.

Methods  One multiple-choice question about pediatric emergency medicine was sent daily to pediatric residents and MPR via the hospital health care system's internal electronic mail (e-mail) system. Residents were asked to reply electronically with the correct answer. The next day, the correct answer, discussion, and a new question were sent to the residents via e-mail.

Main Outcome Measures  Tabulated electronic participation for 3 years; self-report surveys after 1 and 3 years of participation; pretest and posttest scores before and after 1 year of participation.

Results  From October 3, 1994, to June 14, 1995, 52 of 64 pediatric residents and MPR (81%) elected to receive the e-mail questions, but only 31 (48%) sent electronic replies. The average number of e-mail replies received per resident that year was 38 (22%) of 171 (range, 1-164 e-mail replies; median, 33). In academic years 1995-1996 and 1996-1997, although averages and ranges were similar, regular e-mail participation declined. Residents preferred to participate by reading e-mail only. Pediatric residents and MPR judged e-mail questions to be as educationally valuable as or better than Grand Rounds (92%) or our Resident Lecture Series of basic pediatric topics (87%).

Conclusion  Pediatric residents and MPR do participate in a daily e-mailed question/answer format of teaching, but prefer to do so passively, by reading daily questions only, rather than actively, by sending answers to an e-mail box. This format provides medical education that is uniform, accommodates residents' varying schedules, and is a useful adjunct to other teaching methods.


Although there are specific learning objectives that pediatric emergency medicine faculty would like housestaff to attain, the emergency department (ED) is a difficult area in which to teach residents uniformly. Both the exposure of residents to representative cases and the management styles of attending staff vary. In addition, teaching time may be limited by patient volume, the severity of the patients' conditions, and documentation requirements.

Computer-assisted instruction (CAI) has been used in many areas of medical education to improve teaching.1 Computer-assisted instruction has compared favorably with lectures or reading.

Electronic mail (e-mail) has been used successfully in nursing education to teach specific topics.2 Previous attempts at computer-based teaching of obstetrics-gynecology residents using a daily question format or lectures have been well received and, in the short term, were associated with an increase in the residents' knowledge base.3,4

We used an already-existing network of computers distributed throughout our pediatric hospital to send a daily pediatric emergency medicine–based question electronically to pediatric residents. We hoped to provide flexibility in learning for residents, and sought to determine resident use of, benefit from, and interest in electronic teaching.

Arkansas Children's Hospital, Little Rock, uses a health care information system (Medical Information Technology Inc, Westwood, Mass), with terminals available throughout all hospital, clinic, and office areas. This system has an internal e-mail feature, which pediatric residents (PR) and medicine-pediatric residents (MPR) began using as a method of communication 2 years prior to the start of our program.

In October 1994, our Pediatric Emergency Medicine division began e-mailing a daily (Monday-Friday) multiple-choice question about a topic in pediatric emergency medicine to residents (Figure 1). The correct answer and a discussion paragraph were sent the next weekday, in addition to a new question (Figure 2). We called this learning initiative "Question of the Day" (QD). Residents "subscribed" to QD by taking a 25-item, multiple-choice pretest before receiving daily questions. The test was composed of some of the daily questions that residents would see during the academic year if they participated in QD. The same test was administered again at the end of the academic year to residents who subscribed to QD to determine the effect of this program on their education. The QD pretest was also administered to new interns as a group during their initial Pediatric Advanced Life Support course in late June 1994. Upper-level subscribers were asked to take the same pretest that the interns had taken, usually during a shift in the ED. Most upper-level residents completed the pretest in September 1994; a few completed it after QD had begun and started receiving questions shortly after completing their pretest rather than on the first day of the learning initiative. Posttests were administered individually to interns and upper-level residents in the ED in June 1995. With posttesting, PR and MPR were asked to estimate the percentage of time during the academic year they read QD. Subscribers and nonsubscribers were asked to fill out an opinion survey regarding the usefulness of QD at the end of the first academic year of operation. Some residents did not participate in QD and did not take a pretest or posttest.
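
The daily cycle described above (send a question, collect replies to a central mailbox, then send the answer and discussion with the next question) is simple to automate. Below is a minimal, hypothetical sketch in Python of such a mailer; the SMTP host, addresses, and message wording are illustrative assumptions, since the actual program ran on the hospital information system's internal e-mail and material was typed in by hand.

    # Hypothetical Question-of-the-Day mailer (illustrative only; the
    # study used the hospital system's internal e-mail, not SMTP).
    import smtplib
    from email.message import EmailMessage

    SMTP_HOST = "mail.example.org"   # assumed relay, not the study's system
    SENDER = "qd@example.org"        # assumed central QD mailbox

    def send_qd(recipients, todays_question, yesterdays_answer=None):
        """E-mail yesterday's answer and discussion along with today's question."""
        body = ""
        if yesterdays_answer:
            body += "Yesterday's answer and discussion:\n" + yesterdays_answer + "\n\n"
        body += "Question of the Day:\n" + todays_question + "\n"
        body += "Reply to this address with the letter of your answer."
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = ", ".join(recipients)
        msg["Subject"] = "Question of the Day"
        msg.set_content(body)
        with smtplib.SMTP(SMTP_HOST) as server:
            server.send_message(msg)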

In academic years 1995-1996 and 1996-1997, we no longer administered pretests or posttests. Instead, interns were automatically sent QD; upper-level participants were added by request. Subscribers were automatically continued from year to year; they did not have to resubscribe. During all 3 years, electronic participation, in the form of residents returning their answer to a central e-mail address, was monitored.

An announcement was made at a pediatric housestaff meeting in September 1994 to introduce and explain QD and to encourage housestaff subscription and participation. In addition, we sent an announcement via e-mail to all PR and MPR 2 weeks prior to the beginning of our endeavor, which explained the purpose of the pretesting and posttesting and the operation of the program, and encouraged participation. We began sending questions on October 3, 1994. Throughout the remainder of the academic year, we discussed QD with residents who rotated in the ED, encouraging their participation and comments regarding the teaching exercise. We allowed residents to subscribe to QD throughout the remainder of the academic year.

Our participatory guidelines regarding QD emphasized flexibility and collegiality. We encouraged open-book participation and discussion of the questions with other residents and faculty. Subscribers were able to send their answers and review questions from any hospital health care system terminal at any time. Home computer access to QD was unavailable. The answers could be sent daily, weekly, or in batches. Residents could open their e-mail, read a question, stop to read reference material regarding the question, and send a researched answer; or read the question and immediately send a nonresearched answer. Reading QD takes less than 2 minutes per day on average, although responding with a correct, researched answer takes longer. Residents were able to save questions from day to day without responding, so that they might have seen an answer to a question prior to responding. Because of this, we did not grade correct or incorrect answers, but simply recorded that an answer was returned electronically. We did not tabulate the length of time it took a resident to respond to a question. There was no time limit for answering questions other than the close of the academic year. Residents were assured that their answers and extent of participation would remain confidential. Residents could unsubscribe by requesting removal from the e-mail distribution list. They could also delete e-mail messages without reading them. We were unable to record the amount of time a resident spent looking at QD e-mail.

The computer terminals used by the residents to access e-mail were located at nurses' stations, physicians' workrooms, and other nonsecluded areas throughout the hospital where housestaff congregate. It is possible that nonsubscribers may have been exposed on occasion to the daily questions, reading them along with a subscriber.

Questions about a single topic appeared for a period of 1 to 2 weeks. A notation appeared at the beginning of a series of questions indicating the topic to be covered and a reference source to which subscribers could turn for additional reading and the correct answers (Figure 1). Examples of these references included specific pages in pediatric emergency medicine textbooks, the Red Book, or pertinent articles.

Most of the questions were written by me. Additional contributors, members of the Pediatric Emergency Medicine Section, are noted in the acknowledgment section. Questions were not borrowed from any question source, such as the Pediatrics Review and Education Program by the American Academy of Pediatrics or board review texts. The questions were double-checked for content and readability by another member of our section before use on the network. We did not validate any of our multiple-choice questions using formal questionnaire development techniques. We did not perform a pilot test on our pretest or posttest.

There was no structured curriculum for QD. Some of the topics we covered were considered essential material for a pediatric emergency medicine curriculum (eg, management of the septic infant, wound management and repair, basic fracture and burn management, ingestion management, resuscitation basics, and medicolegal issues). New questions were also prompted by questions frequently posed by residents (such as which type of tetanus booster to give after laceration repair, based on age and immunization status), new articles in the pediatric literature, illnesses seen in the ED in large numbers (rotavirus and respiratory syncytial virus infections), and seasonal conditions such as Rocky Mountain spotted fever. In addition, interesting or mismanaged cases seen in the ED provided potential subject matter. Some topics were repeated yearly to provide teaching emphasis. In the first year of operation, 171 questions were sent to subscribers; in the second and third years, 219 and 185 questions were sent, respectively. The variation in the number of questions sent was due to the late start of academic year 1994-1995 and to changes in faculty academic commitments.

At the end of the third year of operation, academic year 1996-1997, because of dwindling participation, we distributed a second opinion survey to all housestaff to determine how residents used QD, their perception of the usefulness of the program, and their suggestions for improvement.

Electronic responses were recorded to describe participation according to residency level and year, as well as the percentage of questions answered by a given resident. Median scores for pretests and posttests were calculated for postgraduate year 1 (PGY-1) residents and for PGY-2, PGY-3, and PGY-4 residents combined (upper-level residents). Median differences in paired pretests and posttests were evaluated statistically with the nonparametric Wilcoxon signed rank test.
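
As an illustration of this analysis, the short Python sketch below applies the Wilcoxon signed rank test to paired pretest and posttest scores; the eight scores and the use of the scipy library are assumptions made for the example, not the study's data or software.

    # Paired pretest/posttest comparison with the Wilcoxon signed rank
    # test. The scores are invented for illustration (out of 25 items).
    from scipy.stats import wilcoxon

    pretest = [10, 12, 9, 14, 11, 13, 8, 15]
    posttest = [15, 16, 12, 18, 14, 17, 11, 19]

    statistic, p_value = wilcoxon(pretest, posttest)
    print("Wilcoxon statistic = %s, P = %.3f" % (statistic, p_value))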

The project was approved by the University of Arkansas for Medical Sciences Human Research Advisory Committee.

In academic year 1994-1995, 52 (81%) of 64 PR and MPR subscribed to QD, as given in Table 1. Twenty-one residents (33%) subscribed but never sent an e-mail answer. The average number of e-mail answers received per resident who sent at least one answer (n=31, 48%) was 38 (22%) of 171 questions sent (range, 1-164; 0.6%-96%; median, 33 e-mail replies).
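
The participation summaries reported here and for the later years can be recomputed directly from per-resident reply counts; a minimal sketch follows, with hypothetical counts standing in for the unpublished raw data.

    # Summary statistics for per-resident e-mail reply counts. The
    # counts below are hypothetical placeholders, not the study's data.
    from statistics import mean, median

    replies = [1, 5, 12, 33, 40, 55, 164]   # one count per responding resident
    total_questions = 171                   # questions sent in 1994-1995

    avg = mean(replies)
    print("mean = %.0f (%.0f%% of %d questions)" % (avg, 100 * avg / total_questions, total_questions))
    print("median = %s; range = %d-%d" % (median(replies), min(replies), max(replies)))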

Table 1. Participation in Academic Year 1994-1995 for Question of the Day*

At the end of the 1994-1995 academic year, subscribers were given posttests, asked how often they read QD, and asked to fill out a survey regarding their opinion of QD. Interns (n=17) estimated that they read QD an average of 71% of the time (range, 0%-100%; median, 90%). Upper-level residents (n=12) estimated that they read QD an average of 56% of the time (range, 0%-100%; median, 80%). Pretests and posttests were completed by the same groups (Table 2). Interns demonstrated significant improvement in their test scores (P<.01). The scores of the upper-level residents did not improve, but their scores were higher initially and thus had less room for improvement.

Table 2. Pretest and Posttest Results*

In academic year 1995-1996, 24 (35%) of 68 residents sent e-mail answers for QD. The average number of e-mail answers received per participating resident was 54 (25%) of 219 questions sent (range, 1-178; 0.5%-81%; median, 27 answers). In academic year 1996-1997, 21 (30%) of 70 residents participated in QD. The average number of e-mail answers was 44 (24%) of 185 questions sent (range, 1-180; 0.5%-97%; median, 19 answers); only 3 residents (2 PGY-2 and 1 PGY-3) sent e-mail answers consistently (at least 1 answer per week). Table 3 lists e-mail participation by academic year for all 3 years of participation. Table 4 lists the percentage of e-mail answers returned by a given resident per academic year.

Table 3. Electronic Participation by Postgraduate Year (PGY) and Academic Year*
Table 4. Electronic Participation by Percentage of Questions Answered per Year

Both participating and nonparticipating PR and MPR were asked to complete an opinion survey in June 1995 regarding the usefulness of QD. Forty-nine residents, 39 QD participants and 10 nonparticipants, returned the surveys. Those who did not participate in QD indicated that they had missed the original explanation (n=5), lacked initiative (n=4), or had problems with their e-mail and did not receive the QD (n=1). We asked additional questions in a yes/no format. Participants (n=39) indicated that QD inspired them to read further about the featured topics (79%) and to discuss the topics with their peers (87%), with ED faculty (51%), and with other faculty (23%).

In the same questionnaire, we asked participating residents to rate, on a 5-point Likert scale, the value of QD relative to other learning mechanisms available to them: 1 indicates much better; 2, better; 3, same; 4, worse; and 5, much worse. Participants believed that QD was as educationally valuable as or better than (score ≤ 3) Grand Rounds (92%), our Resident Lecture Series of basic pediatric topics (87%), our ED syllabus, a collection of chapters about topics in pediatric emergency medicine written by our faculty (85%), or Pediatrics in Review (67%). Negative comments included the amount of time it took to read the questions and the fact that time away from Arkansas Children's Hospital or intense rotations in the pediatric intensive care unit interfered with daily answering.
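
Concretely, each percentage above is the share of ratings at 3 or below on the 5-point scale; a minimal sketch, with invented ratings, is shown below.

    # Share of participants rating QD the same as or better than a
    # comparison format (1 = much better, 5 = much worse; score <= 3
    # counts as "same or better"). The ratings are invented.
    ratings = [1, 2, 3, 3, 2, 4, 1, 3, 2, 5, 3, 2]
    same_or_better = sum(1 for r in ratings if r <= 3)
    print("%.0f%% rated QD the same or better" % (100.0 * same_or_better / len(ratings)))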

As noted previously, the number of residents who answered QD electronically decreased significantly over the first 3 years of operation, as given in Table 4. Therefore, a new, anonymous questionnaire was sent to all pediatric residents, subscribers and nonsubscribers alike, in the third year of operation, academic year 1996-1997, to determine their current participation and any dissatisfaction with QD. Fifty-six (79%) of 68 PR and MPR responded to our survey. When residents were asked "Do you electronically answer QD?" 3 answered yes, 21 answered sometimes, and 32 answered no. When asked "Do you read QD?" 39 answered yes, 15 answered sometimes, and 1 answered no (1 missing answer). No dissatisfaction with the program was noted, but suggestions were given as to how to expand the teaching program (more daily questions, a conference using the questions in quiz-show format). Reasons given for lack of e-mail participation included reading the questions and answers at the same time (n=13), lack of time (n=5), not wanting anyone to know if they answered incorrectly (n=3), lack of interest (n=3), and computer use problems (n=2). No suggestions were made as to how to improve e-mail participation in QD. As a result of this survey, we abandoned the e-mail response feature of QD in academic year 1997-1998 and now send daily questions, answers, and discussion to housestaff with no e-mail answer expected.

In reviewing logs of e-mail replies, we found various patterns of participation. Some residents began the academic year with daily answers for a period of 2 weeks to 2 months; a few continued this pattern throughout the academic year. Some residents answered sporadically, sending 1 or 2 answers every month, and some sent many answers at a time, followed by long gaps of nonparticipation. We could not discern a pattern in which some topics were more interesting than others, as no particular questions were answered more than others. Occasionally, we saw residents working on QD during their ED rotation, but overall, we did not find that residents were more likely to read or answer questions while working in the ED. We were unable to determine when or where PR and MPR were participating in QD, although anecdotal comments indicated that residents took advantage of quiet times during call nights or less busy rotations to look at their e-mail. No resident asked to have his or her name deleted from the QD e-mail distribution list, but one resident did anonymously comment that he/she never looked at the questions, but rather immediately deleted them.

Computer-assisted instruction has been used in a variety of ways to teach students and practitioners in medically related fields.1 Two studies have involved emergency medicine. Paramedics participated in a continuing education project that compared lectures, videotapes, and CAI.5 Students taught by CAI scored better on posttests and retained information better, although they preferred lectures to CAI. Medical students on a 1-month emergency medicine rotation used an interactive software program to develop their knowledge of an acute chest pain protocol.6 Students who used the program had better diagnostic skills than those who did not, and their knowledge of the chest pain protocol after using the program was significantly better than their knowledge before using it.

Electronic mail has been used by nurses at Biloxi Medical Center, Biloxi, Miss, to teach their peers, in a uniform manner, an hour-long curriculum on the effects of cocaine and crack.2 The nurses who initiated this project wrote their own teaching modules, a time-consuming task that was justified by the repetitive use of this learning tool. The module included text and self-assessment questions. Pretests and posttests were used to assess the success of the learning experience. An average improvement of 28 to 30 percentage points was noted when the e-mail method was used.

Two different types of e-mail teaching initiatives have been used by obstetrics-gynecology departments. One involved daily questions sent via e-mail to obstetrics-gynecology residents for 60 days.3 Residents were required to respond within 24 hours and to take pretests and posttests. Scores on the posttests were significantly higher than on the pretests. Ninety percent of residents wanted to continue learning in this way, and 85% participated. The second initiative involved an electronic bulletin board for instruction in reproductive endocrinology.4 The authors' motivation for this latter endeavor was that only poor educational programs were available for what is considered an essential teaching section in the specialty. The lectures were sent to residents monthly via e-mail for a 10-month period, and a chat feature allowed residents to send questions regarding lecture content and related cases. The questions were answered by faculty members. Ninety-five percent of target residents accessed the bulletin board and found it convenient. Sixty percent reviewed the lectures and stored them for later use; 40% printed the lectures. The average overall satisfaction expressed by residents was 4.5 (on a 5-point scale). No pretesting or posttesting was done.

Question of the Day was an effort to begin electronic teaching using an e-mail system that was already in place and that was used throughout our hospital. It combined e-mail with topics in pediatric emergency medicine to provide a "round" of questions and answers for PR and MPR. We found that our residents considered QD at least as valuable as standard teaching formats such as Grand Rounds, the Resident Lecture Series of basic pediatric topics, or Pediatrics in Review. We were surprised that residents were reluctant to send e-mail answers in response to the QD. In the first year of operation, residents reported that they read the QD far more often than they answered electronically; in the following 2 years, we found that this pattern continued. Residents still stated that they enjoyed QD, and some stored their questions for later review for in-service examinations and pediatric boards. It is possible that residents felt that they would be graded, tested, or otherwise scrutinized by sending their answers to a mailbox, and therefore preferred to answer anonymously, using QD as a passive learning exercise.

Pediatric interns have a variety of learning experiences in their first year of training. It is difficult to determine which are the most valuable, and we cannot ascertain the extent to which QD questions are responsible for the large increase in knowledge that was observed in intern pretesting and posttesting. We were unable to draw conclusions regarding the benefit of QD to upper-level residents. They had relatively high initial scores, and posttest scores did not improve much, possibly because there was not much room for improvement.

MPR participation in our program was lower than PR participation. In academic year 1994-1995, of 15 MPR, 5 were nonsubscribers and 9 others answered fewer than 10 questions by e-mail. Medicine-pediatric residents spend months away from Arkansas Children's Hospital while on required medicine rotations, and they attend housestaff meetings irregularly. It is likely that many MPR missed initial opportunities to subscribe to QD. The other hospitals at which they rotate are not linked to our hospital health care system's e-mail. Once they became subscribers, MPR may have found it difficult to keep up with the accumulation of questions to be answered in their e-mail boxes. Despite all these obstacles, one MPR (PGY-3) did answer 40% of QD questions in academic year 1994-1995.

Stanley Chodorow, PhD, provost at the University of Pennsylvania, commented on the use of the computer in undergraduate liberal arts education in his address to attendees of the Association of American Medical Colleges meeting in 1995.7 He noted changes in the usual teacher-student relationship with increased computer-assisted education. The role of the teacher with regard to QD is a nontraditional one, but is still essential, as the content is planned and written by faculty. Question of the Day, or any electronic teaching program, is an efficient method of teaching when academicians are called on to supervise the visits of increasing numbers of patients, document their care meticulously, and still teach in an exemplary way. There are aspects of pediatric emergency medicine that cannot be taught via a series of questions on the computer, and for these a good teacher is essential. When a medical curriculum is taught by computer, students do not receive the immediate feedback or teaching by example that a physician-teacher can provide.

Developing computer-assisted education programs is time consuming. On average, 8 to 12 hours of faculty time per month is required to develop new questions for QD. Sharing of educational materials between institutions may help with this problem.8

The software capabilities of our current hospital health care e-mail system limit the type of subject matter that can be taught via QD. Without multimedia capabilities, rashes or other visual abnormalities noted during the physical examination are difficult to present. Santer et al9 teach various aspects of pediatric airway disease in their multimedia textbook and include videotapes as part of the teaching material. Multimedia computer presentations would serve QD well by combining videotapes or still photographs with a series of questions. Visual material may have made concepts more memorable to housestaff and may have improved participation in our project.

We may have been able to improve participation by making QD available over the internet. This would have allowed PR and MPR to participate while on off-campus rotations or at home.

The information that residents received as a result of reading QD was the equivalent of a lecture series, especially if residents supplemented reading the daily e-mails with suggested reading. It was not a substitute for a month's rotation in the ED, however. The advantage of QD was its asynchronous provision of uniform information to a group of residents.

Residents enjoyed QD, but preferred to participate passively, by reading daily questions, rather than actively, by sending answers via e-mail. The QD format provides an ongoing educational endeavor that accommodates varying schedules and teaches residents in a uniform manner. It is a useful adjunct to bedside teaching and other methods of learning.

Accepted for publication May 22, 1998.

Presented at the Ambulatory Pediatric Association Annual Meeting, Washington, DC, May 8, 1996.

Henry C. Farrar III, MD, Stephen M. Schexnayder, MD, Rhonda M. Dick, MD, Donald L. Foster, MD, C. James Graham, MD, Laura P. James, MD, Meredith S. Krebel, MD, Steven R. Krebel, MD, and W. Rebecca Wood, MD, of the Emergency Medicine section at Arkansas Children's Hospital wrote and/or proofread the Questions of the Day. Denise Joyce provided technical assistance with the hospital health care system. Shelly Lensing provided statistical assistance and Stephen Kemp, MD, PhD, reviewed the manuscript and provided helpful comments. Michael Nash and Kim Clements provided secretarial assistance by typing material onto the hospital health care system e-mail system and tabulating results. Gwynda Hobby provided secretarial assistance with the manuscript.

Editor's Note: Another study showing that we love to learn, but hate to be tested and scored. I wonder if the residents in this study actually did learn; it sure is an interesting way to teach.—Catherine D. DeAngelis, MD

Corresponding author: Eva M. Komoroski, MD, Arkansas Children's Hospital, 800 Marshall St, Little Rock, AR 72202 (e-mail: evak@exchange.uams.edu).


References

1. Jelovsek MD, Adebonojo L. Learning principles as applied to computer-assisted instruction. MD Computing. 1993;10:165-172.
2. Sheridan M, LeGros E. Computer-assisted instruction using electronic mail. J Nurs Staff Dev. 1995;11:100-103.
3. Letterie GS, Morgenstern LL, Johnson L. The role of an electronic mail system in the educational strategies of a residency in obstetrics and gynecology. Obstet Gynecol. 1994;84:137-139.
4. Letterie GS, Salminen ER, McClure GB. An electronic bulletin board for instruction in reproductive endocrinology in a residency in obstetrics and gynecology. Fertil Steril. 1996;65:883-885.
5. Porter RS. Efficacy of computer-assisted instruction in the continuing education of paramedics. Ann Emerg Med. 1991;20:380-384.
6. Papa JF, Meyer S. A computer-assisted learning tool designed to improve clinical problem-solving skills. Ann Emerg Med. 1989;18:269-273.
7. Chodorow S. Educators must take the electronic revolution seriously. Acad Med. 1996;71:221-226.
8. Desch LW. Pediatric computer-assisted instruction. Arch Pediatr Adolesc Med. 1995;149:303-304.
9. Santer DM, D'Alessandro MP, Huntley JS, Erkonen WE, Galvin JR. The multimedia textbook: a revolutionary tool for pediatric education. Arch Pediatr Adolesc Med. 1994;148:711-715.
