Alliance for CME Webinar Tuesday, September 25, 2007 Beyond Theory: Practical Tools to Tackle Educational Outcomes Evaluation Wendy Turell, DrPH Senior Director Educational Design and Accreditation Services BCME
Today's Agenda: I. Measurement and CME  II. Educational Outcomes Evaluation: Methodology  III. Sample Approaches to Study Design  IV. Questions and Answers
Rationale for Measurement Return on Education (ROE) Educational dollars should be spent on  effective  education Guide for future programming What formats and strategies work best? Protection within CME guidelines Proof of educational effectiveness
Evaluation and ACCME Compliance. Criterion 11: The provider analyzes changes in learners (competence, performance, or patient outcomes) achieved as a result of the overall program's activities/educational interventions. Level 1: Provisional Accreditation; Level 2: Full Accreditation; Level 3: Accreditation with Commendation
An Outcomes-Based Educational Model
Outcomes-Based Educational Model Design Program for Maximum Impact Needs Assessment Identify Optimal Outcomes  (based on educational gap) Outcomes Assessment & Program Evaluation Outcomes-Based Learning Objectives Needs Assessment
Educational Evaluation: Tells us whether the planned goals match what learners take from the activity. Judges, describes, defines, values, shapes opinion, directs attention. Determines the value or quality of educational activities in an effort to provide feedback for improvement. Value can relate to satisfaction, achievement, improved performance, benefits to others, return on investment, etc. Bennett NL. The voices of evaluation. J Contin Educ Health Prof 1997;17:198-206. ACME. Evaluating CME Using Outcomes, in Evaluating Educational Outcomes, http://www.acme-assn.org, accessed 1/2006. Green J, Eckstein J (2006). A Practical Guide to Integrating an Outcomes-Based Learning Model Into Your Planning Process. Alliance for CME Almanac, 28(1), 1-5.
Outcomes Evaluation: A More Specific Type of Educational Evaluation. Evaluation that strives to demonstrate the relative effectiveness of various approaches to education on learning, behavior change, or patient health outcomes (a causes b). Bennett NL. The voices of evaluation. J Contin Educ Health Prof 1997;17:198-206. ACME. Evaluating CME Using Outcomes, in Evaluating Educational Outcomes, http://www.acme-assn.org, accessed 1/2006. Green J, Eckstein J (2006). A Practical Guide to Integrating an Outcomes-Based Learning Model Into Your Planning Process. Alliance for CME Almanac, 28(1), 1-5.
Reach for Greater Heights in Measurement: LEVEL 1 PARTICIPATION, LEVEL 2 SATISFACTION, LEVEL 3 LEARNING, LEVEL 4 PERFORMANCE, LEVEL 5 PATIENT HEALTH, LEVEL 6 POPULATION HEALTH. Moore DE. A framework for outcomes evaluation. In: Davis D, Barnes BE, Fox R, eds. The Continuing Professional Development of Physicians: From Research to Practice. Chicago: AMA Press; 2003.
Levels 1, 2, 3. Level 1: Participation: not a very valid assessment of educational outcome; objective. Level 2: Satisfaction: does not document learning; subjective. Level 3: Learning: an assessment of educational outcome; can be measured with survey questionnaires; may or may not lead to behavior change
Levels 4, 5 and 6. Level 4: Performance: various ways to measure; can document impact on practice behavior using follow-up assessments; may not capture all new behaviors; typically self-reported. Level 5: Patient Health: objective measure and a desirable outcome variable; difficult to determine whether change is due to the intervention, and clouded by patient co-morbidities; cost ($$$$$), patient privacy laws (HIPAA), and Institutional Review Board barriers. Level 6: Population Health: the most desirable outcome variable! Do most CME activities deliver such a reach? Rarely measured in CME due to financial and logistical barriers
Outcomes Evaluation: Methodology
Sample Study Plan. Study Subjects: Experimental Group; Control Group (optional). Survey Design: Case-Based Questions; Knowledge-Based Questions. Survey Administration: Pre-Test; Staggered Post-Test. Different options to assess learning and behavior change
Validity & Statistical Significance. Validity: the degree of confidence one can have in an observed result, such as an improvement in knowledge; the degree to which the observed result can be attributed to the studied cause (i.e., effectiveness of the CME course) and not to random error in sampling and measurement. Statistical Significance: quantifies the degree of confidence you can have in a specific result. Ex: statistically significant at the .05 alpha level = 95% confidence that the result is real and not due to chance
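To make the significance idea concrete, here is a minimal sketch (hypothetical scores, plain Python standard library) of a paired t-test on matched pre/post knowledge scores; the cutoff 2.262 is the two-tailed critical t for 9 degrees of freedom at the .05 alpha level.

```python
import math

def paired_t(pre, post):
    """Paired t-statistic and degrees of freedom for matched scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean / se, n - 1

# Hypothetical matched pre/post knowledge scores (percent correct)
pre  = [55, 60, 48, 70, 62, 58, 65, 50, 72, 61]
post = [68, 75, 60, 78, 70, 66, 80, 58, 85, 72]
t, df = paired_t(pre, post)
# Two-tailed critical t for df = 9 at alpha = .05 is 2.262;
# |t| above that means the knowledge gain is statistically significant.
print(f"t = {t:.2f}, df = {df}, significant: {abs(t) > 2.262}")
```

In practice a statistics package would report an exact p-value, but the logic is the same: the observed gain is compared against what sampling error alone could plausibly produce.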
Some Ways to Increase Validity Choose participants fairly (everyone has an equal chance of completing surveys) Ensure robust sample size Write a clear survey (participants should understand questions as you understand them) Use a control group (placebo or comparison subjects)
Pre-Tests: "Walking-in-the-door" knowledge. Pre-tests establish a baseline of subjects' knowledge/behavior before they are exposed to our educational intervention (EI). Ex: What do registrants for our activity know about the medical topic before they begin the course? How are they treating patients with this condition?
Post-Tests: "Walking-out-the-door" knowledge. Post-tests collect data on participants' knowledge and clinical practice patterns regarding the specific subject matter after they are exposed to the EI. Ex: What do CME participants know about the medical topic after the course? How are they treating patients with this condition now? Staggered post-tests: administer the post-test several weeks or months later, for better capture of participants' true retention of knowledge (or behavior change)
Experimental and Control Groups. Experimental group: participants who are exposed to the educational activities; we are interested in their learning and behavior change outcomes. Control group: a comparison group of similar individuals who are not exposed to the learning activities; we are interested in how the experimental subjects' outcomes differ from the control subjects' outcomes. Match on demographic variables
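The basic control-group comparison can be sketched as follows, with hypothetical gain scores: the excess mean gain in the experimental group over the control group is the effect attributed to the activity, assuming the groups are well matched.

```python
# Hypothetical knowledge gains (post-test minus pre-test) per subject
experimental = [13, 15, 12, 8, 10, 14, 9, 11]   # exposed to the activity
control      = [2, -1, 3, 0, 4, 1, 2, 0]        # matched, not exposed

mean_exp = sum(experimental) / len(experimental)
mean_ctl = sum(control) / len(control)

# The excess gain in the experimental group is the effect we attribute
# to the education (assuming the groups are matched on demographics).
print(f"experimental gain {mean_exp:.1f} vs control gain {mean_ctl:.1f}")
print(f"excess gain attributable to activity: {mean_exp - mean_ctl:.1f}")
```

A control group also picks up secular trends (e.g., a new guideline publicized mid-study), which a pre/post comparison alone would wrongly credit to the activity.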
Survey Question Types: Knowledge-Based; Practice-Based; Case Studies
Sample Practice-Based Case Study Question: GT is a 73-year-old, non-smoking woman with no history of urinary complaints, pelvic organ prolapse, or comorbid conditions. GT presents with primary complaints of urgency and urinary leakage. She states that she cannot travel beyond her immediate neighborhood, or attend social events, for fear of experiencing episodes of urinary incontinence. Her primary goal is to control her symptoms and regain a normal lifestyle.
Sample Case Study Questions. 1. What steps would you initiate in the evaluation of GT? A. Recommend initiation of a bladder diary; B. Utilize a questionnaire with GT to help distinguish between urge and/or stress incontinence; C. Perform a urinalysis; D. All of the above. 2. Which of the following steps would you take in the management of GT's condition? A. Teach the patient to initiate pelvic floor muscle exercises; B. Educate the patient to avoid bladder irritants; C. Both A & B; D. Sacral nerve modulation. 3. Which of the following is LEAST likely to be a diagnosis for GT? A. Overactive bladder; B. Bladder outlet obstruction; C. Urinary tract infection; D. Stress urinary incontinence
Sample Size. You don't have to poll everyone! Effect size. Power calculation. How to estimate without having a statistics degree. Rule of thumb: if you have <30 participants, try to sample all of them
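As a rough sketch of the power calculation without a statistics degree: under a normal approximation, the matched-pairs sample size for a given effect size (Cohen's d), alpha, and power takes a few lines of Python (effect sizes here are illustrative).

```python
from math import ceil
from statistics import NormalDist

def sample_size(effect_size, alpha=0.05, power=0.80):
    """Matched-pairs n needed to detect a given effect size (Cohen's d)
    at the given alpha and power, using a normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-tailed critical z
    z_beta = z.inv_cdf(power)           # z for the desired power
    return ceil(((z_alpha + z_beta) / effect_size) ** 2)

print(sample_size(0.5))  # medium effect -> 32 matched respondents
print(sample_size(0.2))  # a small effect needs far more respondents
```

Note how sharply the required n grows as the expected effect shrinks, which is why modest educational effects demand large samples (or all participants, per the rule of thumb above).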
Incentives. Encourage participation: we want to reach our target n. Should be medically relevant. They are permitted in CME; AMA guidelines: no more than $100.00. To avoid the perception of coercion, you may wish to stay lower. Resources: www.medicalbooks.com, www.medbookstore.com, www.allheart.com, http://solutions.medsite.com/medsite_rewards.asp
Challenges to Methodology. Obtaining contact information for pre-test contact: live activities; enduring materials. Funding limitations. Grantor concerns: pre-tests; incentives; objectivity
Challenges to Methodology ?
Sample Approaches: Live Meeting (e.g., symposium). 250 attendees anticipated. Obtained e-mails via the pre-registration website and linked the pre-test to this site. E-mailed the post-test to pre-registrants 4 weeks after the live meeting, with a screener question to assure they attended the activity. Match pre-test and post-test results
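The matching step can be sketched as a simple join on the registration e-mail address (records below are hypothetical); only respondents who completed both tests enter the analysis.

```python
# Hypothetical score records keyed by registration e-mail address
pre_scores  = {"a@example.org": 60, "b@example.org": 55, "c@example.org": 70}
post_scores = {"a@example.org": 80, "c@example.org": 85, "d@example.org": 90}

# Keep only respondents who completed BOTH tests; everyone else drops out
matched = {email: (pre_scores[email], post_scores[email])
           for email in pre_scores.keys() & post_scores.keys()}
print(matched)  # b@ (no post-test) and d@ (no pre-test) are excluded
```

Dropouts between pre- and post-test are normal; the sample-size target should anticipate them so the matched set stays large enough for a valid comparison.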
Sample Approaches: Journal Supplement. Distribution to 7,000 physicians. Publisher could not provide e-mail addresses of recipients, and readers could not be anticipated from the population of 7,000. Administered the pre-test to a control group of similarly specialized physicians. Administered the post-test to readers and the control group. Recruited via mention of the post-test and incentive in the rear of the printed supplement
Sample Approaches Online CME Activity Challenge = no funding for incentives 1,000 anticipated participants Present with optional (opt-out) pre-test during log-in/registration Present with opt-out post-test at close of activity
Sample Approaches: Podcast. 500 downloads anticipated. Obtained e-mails at the point of download/registration. Opt-out pre-test offered at the point of download. Mention a URL (an easy-to-recall address) at the close of the podcast, and mention the incentive
Example 1 You wish to evaluate the educational outcomes of a live meeting regarding Diabetes that is to be held next month in your hospital.  The meeting is targeting physicians and nurses, and you anticipate a turnout of 100 participants.  Although your funding is limited, you do have unlimited access to the hospital information technology department should you need computer programming assistance. How would you approach assessing outcomes for this CME Activity?
Example 2: Your organization has received funding for a series of 6 podcasts on the topic of Heart Failure. Each of the 6 podcasts will feature different thought leader interviews regarding hot topics in the therapeutic area. You are not sure how many participants will listen to your podcasts, since this is the first time your company has administered this type of activity. You have a budget of $10,000 for outcomes for the entire series. How would you approach assessing outcomes for this CME Activity?
Example 3: Your company has received funding for a monograph publication that is based on the proceedings of a live event on the topic of Alzheimer's Disease. The publishing company informs you that the monograph will be distributed to 9,000 physicians and allied health care professionals. You have $15,000 to perform an outcomes assessment of the enduring material. How would you approach assessing outcomes for this CME Activity?
Questions and Answers
Contact Information Wendy Turell, DrPH Associate Vice President, Educational Design and Accreditation Services BCME [email_address] www.bcmeonline.com

Editor's Notes

  • #3: I. Measurement and CME: An Outcomes-Based Educational Model; Rationale; Overview. II. Educational Outcomes Evaluation: Methodology: Pre- and Post-Testing; Experimental and Control Groups. III. Sample Approaches to Study Design: Symposium, Journal Supplement, Online Activity, Podcast. IV. Questions and Answers
  • #4: Return on Education (ROE) Educational dollars should be spent on effective education Guide for future programming What formats and strategies work best? Protection within CME guidelines Proof of educational effectiveness
  • #7: Integrated program planning process
  • #11: Level 1: Participant satisfaction and program quality measures: no longer enough per ACCME and many grantors. Level 2: Change (or intent to change) in knowledge, attitudes, or skills; intent to change has been shown to correlate with actual behavior change. Level 3: Self-reported behavior change; can be captured via staggered post-tests administered weeks to months after activity completion. Note: just because something is subjective does not mean it's wrong; we simply have less confidence in these results than we would with objective variables
  • #12: Level 4: Change in practice. Ex: chart reviews may not capture every behavior; record keeping varies by institution & practice. Level 5: Change in treatment outcomes or health status of patients. Best matched with a CME activity that has strong impact (multiple exposures, major intervention); you won't likely find a change in patient outcomes via a one-hour live symposium with 100 participants. Best matched for a series, a larger sweeping initiative in a health-care system, etc.
  • #14: I would fix the visuals on this and put in a build: different colors for each box, to make it a bit more interactive. Make sure you say that when you run recurring medical education initiatives, you must use the same evaluation and outcomes analysis plan so that you can compare success across educational initiatives
  • #18: (e-mail, online link, snail mail)
  • #23: You don't have to poll everyone! Effect size: the magnitude of the effect under study. Power calculation: how to estimate without having a statistics degree. Rule of thumb: if you have <30 participants, try to sample all of them
  • #24: Encourage participation: we want to reach our target n. Recommended if you plan to use a control group. Incentives should be medically relevant; they are permitted in CME. AMA guidelines: incentives not to exceed $100.00. Unwritten guideline: do not exceed $25.00 per survey. Avoid the perception of coercion. SAME INCENTIVES TO BOTH GROUPS
  • #25: Obtaining contact information for pre-test contact. Live: can offer pre-tests as participants arrive in the room, but this may be awkward and logistically difficult; ARS. Enduring materials: may be more practical to skip the pre-test (validity may suffer). Funding limitations and grantor concerns: pre-tests; incentives (can always skip incentives, though validity may suffer due to lower enrollment); objectivity