
The problem with evaluation forms

Derek makes some interesting points in this post, The Value (Or Lack Thereof) Of Self-Reported Outcomes In CME. I know he's right about people just ripping through the checkoffs like (pick a metaphor) and not providing meaningful data other than a baseline from which you can find outliers, but I'm wondering if maybe he's not asking the right question. He proposes three ways to get better data, including shortening the form to just four questions. Which I'm sure would help, especially since two of them require write-in answers.

As he says, "My hope is, by simplifying and reducing the form to these four basic questions, more participants would be willing to take the time to give thoughtful, articulate answers that would provide meaningful insight into the achievement of the desired outcomes of the activity."

The question I think we need to be asking is why learners aren't motivated to give those thoughtful answers to begin with. As always, in my mind, anyway, it comes down to the old "what's in it for me?" Obviously, they don't see enough value in filling out the form to make it worth their while to do more than the bare minimum. Other than shortening the form, which just makes it less painful, not more valuable, what can you as a CME provider do to engage learners in the outcomes-gathering process so that they actually want to do a good job with those forms?

Update: I just did a quick archive search and found this article on outcomes from back in 2003 that touches on these issues. Some of the tips people gave me then were:

* Keep it short

* Use open-ended questions

* Ask about each objective (so much for keeping it short!), not just whether the objectives were met

* Offer incentives (free registration/hotel/airfare to participate in another activity)

* Tell attendees that you'll be following up with a survey in three or six months (check the article for tips on how to get them to participate in follow-up surveys and questions to ask)

Re-reading it now, it's actually holding up pretty well for something written eight years ago. While that makes me feel pretty good in one way, it's kind of sad that we haven't made much progress in figuring this out over that time.

Here's another kind of handy sidebar from the article that I think still holds up:

Measurement Tools

Here's a look at just a few of the many ways to measure CME outcomes:

SELF-REPORT THROUGH EVALUATIONS: Asking attendees what they learned and how they planned to (or actually did) use it, through immediate post-meeting evaluation forms and follow-up mailings, e-mailings, faxes, and telephone interviews

Pros: Easy to implement, relatively inexpensive

Cons: Not very reliable, can be difficult to get a significant number of responses

CASE STUDIES: Presenting attendees with a case study related to a specific practice area, both as a pre-test and as a post-meeting evaluation. Can be done via telephone, fax, e-mail, or mail

Pros: While still a form of self-report, studies have found it to be reliable in terms of predicting physician behavior; can be as cost-effective as evaluations

Cons: Need to have expertise to design an effective case study; can be difficult to get responses

CHARTS/PATIENT CARE RECORDS: Measuring baseline performance and post-meeting behavioral improvement by looking at attendees' patient care records

Pros: Highly effective form of evaluation, especially when the records are available in database form

Cons: Privacy issues can be an impediment; can be difficult to obtain outside of hospitals and large healthcare systems

STANDARDIZED PATIENTS: Objective, structured clinical exams where physicians visit stations and examine patients presenting with a particular disease. Docs have to come up with the right answer before they can move on to the next station.

Pros: Highly effective form of evaluation; allows CME provider to observe physician interacting with actors posing as patients

Cons: Requires a lot of time and resources to develop and implement
