
Year: 2021 | Volume: 4 | Issue: 1 | Page: 4-10

Integrating an evidence-based medicine curriculum into physician assistant education: Teaching skills for lifelong decision-making!

Department of Physician Assistant Studies, Grand Valley State University, Grand Rapids, Michigan, USA

Date of Submission: 12-Jan-2021
Date of Acceptance: 20-Jan-2021
Date of Web Publication: 07-May-2021

Correspondence Address:
Dr. Martina Ingeborg Reinhold
301 Michigan Street NE, Grand Valley State University, Grand Rapids, Michigan 49503

Source of Support: None, Conflict of Interest: None

DOI: 10.4103/ehp.ehp_1_21


Background: Medical knowledge evolves continuously, and the evidence-based medicine (EBM) model has emerged to help health-care providers stay up to date. The practice of EBM requires new skills from the health-care provider, including directed literature searches, critical evaluation of research studies, and direct application of the findings to patient care. Methods: This paper describes the integration and evaluation of an EBM course sequence into a physician assistant curriculum, utilizing the expertise of faculty trained in basic science, nursing, and primary care practice. Results: Collaboration of faculty with different educational backgrounds resulted in a course series that was equally strong in all aspects of EBM. The new course sequence teaches students to manage and use the best clinical research evidence to competently practice medicine. To assess the effectiveness of the EBM sequence, a survey was developed and administered at the beginning and end of the sequence. Conclusion: The EBM knowledge gained is essential to effective clinical decision-making, and the newly developed tool helps identify student competencies within the defined course objectives. Case-based questions requiring students to integrate EBM knowledge contribute to the uniqueness of the tool.

Keywords: Assessment, curriculum, evidence-based medicine, physician assistant

How to cite this article:
Reinhold MI, Bacon-Baguley TA. Integrating an evidence-based medicine curriculum into physician assistant education: Teaching skills for lifelong decision-making!. Educ Health Prof 2021;4:4-10

How to cite this URL:
Reinhold MI, Bacon-Baguley TA. Integrating an evidence-based medicine curriculum into physician assistant education: Teaching skills for lifelong decision-making!. Educ Health Prof [serial online] 2021 [cited 2022 Aug 12];4:4-10. Available from: https://www.ehpjournal.com/text.asp?2021/4/1/4/315622

  Introduction

The rapid pace of scientific discovery and technological innovations leads to a continuous advancement of current medical knowledge. To facilitate health-care providers' application of up-to-date information, evidence-based medicine (EBM) has emerged as a paradigm. EBM provides a framework for the incorporation of research evidence, clinical expertise, and patient values/preferences into the delivery of health care.[1] Studies have shown that implementation of EBM principles improves patient outcomes and the quality of care delivered by reducing the gap between best evidence and best practice.[2],[3] Many national registration agencies, accreditation councils, and health professional bodies, including the National Academy of Medicine (formerly the Institute of Medicine), consider EBM a core competency needed for health-care professionals.[4],[5],[6] Consequently, EBM has become a well-accepted model that is being taught across a wide variety of medical and allied health sciences curricula within undergraduate, postgraduate, and continuing health education.

Practicing EBM requires that users are skilled across a five-step process that includes (1) translation of clinical uncertainty into an answerable clinical question; (2) systematic retrieval of the best available evidence; (3) critical appraisal of the evidence for validity, clinical relevance, and applicability; (4) application of results; and (5) evaluation of performance.[7] Review of the literature highlights a disproportionate focus on critical appraisal (Step 3) as compared to the other four steps of practicing EBM.[8],[9],[10] Specifically, a review of twenty EBM educational interventions for medical students revealed a focus on some steps of the process (asking a clinical question, acquiring evidence, and critically appraising it) but less concentration on application, assessment, and reflection.[11] Furthermore, effective implementation of EBM educational interventions is hampered by a lack of high-quality validated instruments to establish the success of educational activities.[12]

Physician assistant (PA) programs are accredited through the Accreditation Review Commission on Education for the PA (ARC-PA). The ARC-PA standard B2.10 states “The program curriculum must include instruction to prepare students to search, interpret, and evaluate the medical literature, including its application to individualized patient care.”[13] Accredited PA programs are required to comply with the accreditation standard; however, the amount of curricular time devoted to this instruction can vary 10-fold among surveyed programs.[14] This variation in time leads to a wide variation in EBM/evidence-based practice (EBP) skills of graduates.

This paper describes the integration and evaluation of an EBM course sequence into a PA curriculum utilizing the expertise of faculty trained in basic science, nursing, and primary care practice. The newly developed course sequence teaches students to gain, assess, apply, and integrate knowledge and to use the best clinical research evidence to competently practice medicine. The sequence consists of three courses spanning an entire academic year (January to December). The course content introduces clinical research design and medical literature critique while encouraging critical appraisal of the literature and identification of best evidence. Importantly, students learn how to apply the skills developed to clinical case scenarios. The curriculum was developed to provide every student in the program with the most clinically applicable skills to identify, manage, and apply the best clinical research evidence to improve the health of their patients and competently practice medicine. To evaluate the effectiveness of the course sequence, a survey was developed to assess the key principles of evidence-based practice (EBP).

  Methods

Curriculum development

In 2014, the PA program at a Midwestern university replaced a research project requirement with an EBM course series to fulfill accreditation standard B2.10 mandated by the accreditation body (ARC-PA). The EBM curriculum starts in the second semester of the four-semester didactic curriculum, and all students enrolled in the PA program are required to take the course series [Figure 1]. The course sequence was developed by faculty who attended the Duke University EBM workshop “Teaching and Leading EBM: A Workshop for Educators and Champions of Evidence-Based Medicine.” Evidence-Based Medicine: How to Practice and Teach It (4th edition) and the primary literature were used as references for developing the course content.[15] The cornerstone of the courses is the interactive small-group discussions, which are designed to help students learn how to apply the skills developed in clinical practice.
Figure 1: Timeline of curriculum within the program identifying the evidence-based medicine course series. The evidence-based medicine course series spans three semesters of the didactic phase of the curriculum (Winter-evidence-based medicine 1, Spring/Summer-evidence-based medicine 2, and Fall-evidence-based medicine 3). The evidence-based medicine survey was administered at the two time points indicated by the black arrows. Administration occurred prior to statistics/evidence-based medicine courses and after the evidence-based medicine course series


The first course in the series includes a 1-h lecture that introduces EBM topics [Table 1]. Following the lecture, students work with individual course instructors in small discussion groups (8–10 students) to review and practice/apply the skills introduced in the lecture. Discussion sessions include critical appraisal of journal articles and the medical literature. Lecture content is supplemented with reading assignments for the lecture as well as specific assignments for the discussion sections. Learning is assessed by quizzes as well as student participation and presentations in the discussion sessions.
Table 1: Evidence-based medicine lecture topics covered in the first course of the evidence-based medicine course series


The second and third courses in the EBM series do not include the 1-h lecture but maintain the weekly 2-h discussion sessions. These courses continue to develop the skills and knowledge obtained in the first course of the series, allowing the student to evaluate the strengths and flaws of medical research documents. During the courses, students demonstrate competency by developing sound clinical questions from case scenarios and presenting clinical answers/solutions based on valid evidence through discussion and presentations. The final project requires students to identify and use the best clinical evidence to answer select clinical cases. Students present their findings, including the supporting literature and a discussion of the relevant EBM concepts that guided their clinical decision-making. This requirement assesses the students' application of the concepts taught throughout the course series. Clinical questions and articles discussed in the weekly discussion sessions align with the topics covered in the clinical curriculum [Table 2]. This connects the interpretation of the medical literature with the content covered in the remaining didactic courses, which focus on clinical medicine. The course instructors met weekly to review and discuss the course content.
Table 2: Alignment of content covered in the discussion portion of the evidence-based medicine course series with clinical medicine topics covered in curriculum


Development and evaluation of the instrument

Survey development occurred in several steps. First, concepts were identified that met one of the eight EBM course series objectives [Table 3] and [Table 4], and questions were then developed to assess those concepts. Second, a draft was distributed to six faculty members with EBM knowledge (content experts) who were asked to take the survey. We removed controversial items and revised others in response to their suggestions.
Table 3: Alignment of evidence-based medicine concepts covered in the course series with course objectives

Table 4: Survey questions distribution as they address one of the eight objectives of the evidence-based medicine course series


Last, to test for reliability, 15 students took the survey twice, 3 weeks apart (intrarater reliability analysis). The Shrout–Fleiss reliability test was used to calculate the intraclass correlation coefficient (ICC), a widely used index of intrarater reliability that reflects the variation in data measured by the same rater across two or more trials. The ICC value for the survey is 0.89845, indicating excellent reliability.[16]
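For readers unfamiliar with the statistic, the consistency form of the Shrout–Fleiss ICC (often labeled ICC(3,1)) can be computed from a subjects-by-trials score matrix via a two-way ANOVA decomposition. The sketch below is a minimal pure-Python illustration, not the authors' analysis code, and the student scores in it are hypothetical:

```python
def icc_consistency(scores):
    """ICC(3,1): two-way mixed-effects model, consistency, single
    measurement (one of the Shrout-Fleiss forms). `scores` is a list of
    rows, one row of trial scores per subject."""
    n = len(scores)       # number of subjects
    k = len(scores[0])    # number of trials per subject
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]

    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between trials
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_rows - ss_cols                  # residual

    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# hypothetical test-retest scores (trial 1, trial 2) for five students
scores = [[40, 41], [35, 34], [48, 47], [30, 32], [44, 44]]
icc = icc_consistency(scores)  # close to 1.0 when retest scores track test scores
```

An ICC near 1 indicates that students' scores were stable across the two administrations; values above roughly 0.75 are conventionally read as excellent reliability.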

The final survey consists of 52 Likert scale questions that address the eight overarching objectives for the course series. The survey is available upon request and can be used with the permission of the authors.

Participants and assessment

Prior to the start of the didactic phase of the PA curriculum, students from the Class of 2016 (n = 52), Class of 2017 (n = 46), Class of 2018 (n = 55), Class of 2019 (n = 48), and Class of 2020 (n = 48) were asked to complete the survey, which served as a baseline measure of EBM knowledge prior to the course series [Figure 1]. This assessment is placed a semester before the start of the EBM course series because students are required to take a graduate-level statistics course in the first didactic semester. That course, taught by a statistician, is an overview of the statistical techniques commonly encountered in the health professions. These topics are then applied during the EBM courses, where students are required to verbally demonstrate understanding and application of the statistical tests encountered in real studies, which models statistical literacy. At the end of the three-semester course series, the same survey was given to all students in the program, and the results from before and after the sequence of EBM courses were compared, with specific attention paid to improvement in the overall performance of students on the eight course objectives.

Analysis of data

Survey results were analyzed to provide summary descriptive statistics. Percentages of average responses by PA students were calculated in SPSS Statistics version 22 (IBM Corp., Armonk, NY; released 2013). Percent change and trends were analyzed.

Differences in the change in performance on the respective objectives between the pre-EBM and post-EBM time points were assessed using an independent t-test. The assumption of normality was verified by visual inspection of histograms of the survey score distributions for each cohort. Means were compared within each class: pre- and post-EBM mean scores for each class were compared for the overall EBM survey and for each course objective assessed by distinct questions within the survey. The survey contained 52 distinct questions in total, with the number of questions per objective varying by objective. Both raw (unadjusted) and false discovery rate (FDR)-adjusted two-sided P values are presented for each comparison.[17] An FDR-adjusted P < 0.05 was considered statistically significant. All statistical analyses were performed in R version 4.0.2 using RStudio (RStudio Team, 2020; RStudio, PBC, Boston, MA).
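The FDR adjustment referenced above is the Benjamini–Hochberg step-up procedure of reference [17]. As a hedged illustration (the authors used R; this is not their code), the sketch below adjusts a set of raw two-sided P values so that each can be compared directly against the 0.05 threshold; the raw P values shown are hypothetical:

```python
def fdr_adjust(pvals):
    """Benjamini-Hochberg step-up adjustment: returns FDR-adjusted
    p-values in the original order of `pvals`."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices sorted by raw p
    adjusted = [0.0] * m
    running_min = 1.0
    # walk from the largest raw p-value down, enforcing monotonicity
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = min(running_min, 1.0)
    return adjusted

# hypothetical raw p-values from t-tests on several objectives
raw = [0.005, 0.03, 0.5]
adj = fdr_adjust(raw)  # [0.015, 0.045, 0.5]: first two remain < 0.05
```

Note that the adjustment can only increase P values, so a comparison significant after FDR adjustment is also significant on the raw scale; the reverse need not hold when many objectives are tested at once.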

  Results

Results from this study are presented in [Table 5]. Nearly 93% (232/249) of the eligible students contributed data at baseline (pre-EBM) and at the end of the didactic curriculum (post-EBM). The eligible student population consisted of 55 male students (24%) and 177 female students (76%). This distribution between male and female students is generally observed in PA cohorts. The average age at the start of the courses was 24.18 years.
Table 5: Increase in correct answers (as percentage change from initial score) using the 52-question survey


Students from the Classes of 2016 to 2020 consistently improved (as measured by the percent increase in correct responses on the survey tool) after the EBM course series (Class of 2016: 16.0%; Class of 2017: 12.5%; Class of 2018: 36.2%; Class of 2019: 21.7%; Class of 2020: 22.8%) [Table 5]. The aggregate average increase in correct answers for the classes tested is 20.5%. This improvement is statistically significant (P < 0.05) for each individual class as well as for the aggregate of all classes [Table 5]. We did not identify a statistically significant difference in performance between female and male students (data not shown). These results show that students performed significantly better on the EBM survey after receiving didactic training in EBM principles.
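The percent change reported in [Table 5] is the post-course gain expressed relative to the pre-course score. The one-liner below shows the arithmetic; the pre/post mean scores used are hypothetical and chosen only for illustration:

```python
def percent_change(pre_mean, post_mean):
    """Percent increase in correct answers relative to the initial score."""
    return 100.0 * (post_mean - pre_mean) / pre_mean

# hypothetical cohort means: 50 correct answers pre-course, 58 post-course
gain = percent_change(50.0, 58.0)  # 16.0 (percent)
```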

The biggest increase in knowledge across all classes was observed in the areas of finding and evaluating the evidence, with searching the medical database (Objective 3; 87.0%) and critical evaluation of the medical literature (Objective 4; 59.8%) showing the largest gains over baseline scores. Questions requiring students to ask concise clinical questions (Objective 2; 22.9%), present arguments supporting the validity of EBM (Objective 7; 25.3%), and analyze, evaluate, and report on the available clinical evidence regarding diagnosis (Objective 8; 17.2%) showed improvement, but to a lesser extent. Responses to questions asking students to define EBM (Objective 1; 13.5%), demonstrate knowledge of relevant research methodologies (Objective 5; 5.7%), and evaluate the application of research designs (Objective 6; 8.9%) showed only slight improvement over the initial pre-EBM score.

Statistical analysis identified a significant improvement for four objectives in at least 4 of the 5 years analyzed. These objectives relate to asking concise clinical questions (Objective 2), concepts of validity and reliability (Objective 3), evaluation of medical resources (Objective 4), and translation of the available clinical evidence into clinical decision-making (Objective 8) [Table 6], coinciding with the objectives on which students' scores improved most notably after participation in the EBM course series.
Table 6: Objectives that showed significant improvement after completion of the evidence-based medicine course series


Students consistently scored higher on some questions of the survey prior to participation in the EBM course series. We therefore evaluated student responses using item response theory (IRT), which accounts for both the number of questions answered correctly and the difficulty of each question, to identify ways to improve survey accuracy and to reduce the number of survey questions.[18] We identified 13 questions that scored as easier on the difficulty rating (negative values) [Table 7], indicating that these questions are not helpful in determining knowledge gained as a result of participation in the EBM course series. All 13 questions were answered correctly by 73% or more of students on the presurvey, suggesting that questions answered correctly by ~75% or more of students on the presurvey may not be effective in determining knowledge gained.
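The sign convention in [Table 7] can be made concrete with a one-parameter (Rasch) IRT model, used here only as a simplified stand-in for the analysis cited in reference [18]: an item's difficulty is the ability level at which a student has a 50% chance of answering correctly, so items most students already answer correctly get negative difficulty values. The functions below are an illustrative sketch, not the authors' fitted model:

```python
import math

def rasch_prob(theta, b):
    """Rasch (1PL) model: probability that a student of ability `theta`
    answers an item of difficulty `b` correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def naive_difficulty(p_correct):
    """Crude difficulty estimate: the b at which a student of average
    ability (theta = 0) answers correctly with probability `p_correct`.
    Items most students answer correctly (p_correct > 0.5) come out with
    negative b, flagging them as 'easy'."""
    return -math.log(p_correct / (1.0 - p_correct))

easy_b = naive_difficulty(0.73)  # negative: 73% answered correctly at baseline
```

Full IRT software estimates difficulty (and, in richer models, discrimination) jointly with student ability from the whole response matrix, but the qualitative reading is the same: negative difficulty marks items that do not separate pre-course from post-course knowledge.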
Table 7: Questions identified by item response theory as not being helpful in determining knowledge gained in evidence-based medicine course series


Next, the 13 questions identified by IRT were removed from the survey results, and the data were re-analyzed for each of the five PA classes individually as well as in aggregate. The scores improved by a greater percentage when these 13 questions were excluded from the analysis [Table 8]. This suggests that the remaining survey questions better assess the actual learning of EBM principles taught in the course series by limiting the measurement of information learned in courses taken prior to joining the PA program.
Table 8: Increase in correct answers (as percentage change from initial score) using the original 52-question survey and the modified 39-question survey (after removal of 13 questions identified using item response theory)


  Discussion

This study reports findings on a PA program EBM curriculum that is effective in improving measured EBM knowledge. Key elements of this curriculum include the integration of lectures with small discussion sessions to develop a solid foundation in EBM knowledge. Additional semesters with only discussion sessions build on this knowledge through practice of critical appraisal and application in clinical scenarios. Our results suggest that the integration of discussion groups with lectures contributed most to the learning of EBM in these PA students. Qualitative comments from students, such as “group discussions were always super helpful” and “EBM is a good course and I believe an important part of our education,” underscore the value of the course design and highlight students' approval of the EBM course series. Existing EBM curricula in other professional programs focus predominantly on teaching critical appraisal skills and spend less time on clinical application.[8],[19] Our curriculum teaches students the important skills of asking a clinical question, finding the best evidence to answer that question, and applying valid evidence to clinical practice. In addition, our curriculum teaches students how to critically evaluate resources provided in point-of-care references such as DynaMed® or Lexicomp®. Incorporating critical appraisal skills into the evaluation of resources found in point-of-care references is critical: these databases are updated frequently and are designed to provide the most useful and up-to-date point-of-care health information available, making them attractive to the health-care provider who is short on time.

The predominant improvement was seen in knowledge questions related to searching and critical evaluation of the medical literature. Questions on integration were still more difficult for students. These findings support that students are comfortable and effective in using the available technology to identify the relevant clinical literature. However, applying the identified information to the clinical question remains challenging.

Instructors of the courses utilized the information from the survey to focus on areas showing less improvement. Additional practice with appraisal of the clinical literature was added to the discussion session in 2019, allowing students to apply the concepts of study design, as well as evaluation of strength and weakness of the study. In 2020, practice sessions on measures of effect were added, focusing on the interpretation and use of these measures. Furthermore, the clinical faculty developed case scenarios that were incorporated into the lectures and discussion sessions, allowing students to better connect the EBM principles with patient care. Students commented that “class discussion and case examples” contributed most to their learning and that “applying it to clinical decision-making helped bring the point home.”

Analysis of the survey tool using IRT resulted in the identification of 13 questions (about 25%) that are not helpful in determining knowledge gained. Removal of these questions in a secondary analysis resulted in an improvement in measured learning. Prior exposure to research methods and analysis in the prerequisite undergraduate statistics course or through undergraduate research and/or courses likely accounts for the increased knowledge of students in some of the principles of EBM. Faculty will evaluate the 13 questions in detail and determine if these questions either need to be removed from the survey or rewritten. The latter option will be required for objectives that are addressed by fewer questions [Table 4]. Removal of questions in this targeted way also reduces the length of the survey. Currently, students take an average of 35 min to complete the survey. It would be advantageous to reduce the time to 20 min as the average attention span for individuals in higher education has been determined to be between 15 and 20 min.[20]

A strength of this study is that it includes data from five consecutive PA classes (Class of 2016 to Class of 2020), with about 250 students participating. This improved the ability to perform statistical analysis of the results as random differences are minimized using the aggregate data. As shown in [Table 5], the aggregate data show improvement in all objectives, with some objectives showing statistical significance for a majority of the classes analyzed [Objectives 2, 3, 4, 7, and 8; [Table 6]].

Our study has several limitations. The multiple-choice format makes implementation simple but does not allow students to demonstrate some performance-based components of EBM, such as developing a focused clinical question from a scenario or searching the literature for good answers. These are more effectively evaluated by the Fresno or Berlin tests, which consist of open-ended questions formulated as scenarios.[21],[22] Grading of either of these tests is, however, more subjective and time consuming. Furthermore, each student served as their own control in the pre-post design of this study, without a concurrent control group of students who did not participate in the EBM course series. This allows for the possibility that factors outside of the EBM curriculum could have influenced students' performance; however, no other specific EBM content is part of the PA curriculum.

In summary, a PA student curriculum linking traditional lectures with group discussions resulted in consistent increases in perceived and measured EBM knowledge. The developed curriculum aims to graduate PA students that have learned the skills to be self-directed and critical consumers of the clinical literature, with the ability to apply this information to clinical practice and decision-making.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

  References

1. Sackett DL, Richardson SW, Rosenberg W, Haynes RB. Evidence-based medicine: How to practice and teach EBM. BMJ 1996;313:1410.
2. Samir FM, Trask AL, Waller MA, Watts DD. Management of brain-injured patients by an evidence-based medicine protocol improves outcomes and decreases hospital charges. J Trauma 2004;56:492-9.
3. Evidence Based Medicine Matters. London; 2013. Available from: http://en.testingtreatments.org/wp-content/uploads/2016/11/Evidence-Based-Medicine-Matters.pdf. [Last accessed on 2021 Jan 17].
4. Accreditation Council for Graduate Medical Education. Common Program Requirements; 2020. Available from: https://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements. [Last accessed on 2020 Mar 11].
5. Goto M, Schweizer ML, Vaughn-Sarrazin MS, Perencevich EN, Livorsi DJ, Diekema DJ, et al. Association of evidence-based care processes with mortality in Staphylococcus aureus bacteremia at Veterans Health Administration hospitals. JAMA Intern Med 2017;177:1489-97.
6. Porter ME. Evidence-Based Medicine and the Changing Nature of Healthcare: 2007 IOM Annual Meeting Summary. Washington, DC; 2008.
7. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Sicily statement on evidence-based practice. BMC Med Educ 2005;5:1.
8. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA 2002;288:1110-2.
9. Meats E, Heneghan C, Crilly M, Glasziou P. Evidence-based medicine teaching in UK medical schools. Med Teach 2009;31:332-7.
10. Blanco MA, Capellow CF, Dorsch JL, Perry G, Zanetti ML. A survey study of evidence-based medicine training in US and Canadian medical schools. J Med Libr Assoc 2014;102:160-8.
11. Maggio LA, Tannery NH, Chen HC, ten Cate O, O'Brien B. Evidence-based medicine training in undergraduate medical education: A review and critique of the literature published 2006-2011. Acad Med 2013;88:1022-8.
12. Albarqouni L, Hoffmann T, Glasziou P. Evidence-based practice educational intervention studies: A systematic review of what is taught and how it is measured. BMC Med Educ 2018;18:177.
13. Accreditation Review Commission on Education for the Physician Assistant (ARC-PA). Accreditation Standards; 2019.
14. White DM, Stephens P. State of evidence-based practice in physician assistant education. J Physician Assist Educ 2018;29:12-8.
15. Straus SE, Glasziou P, Richardson SW, Haynes BR. Evidence-Based Medicine: How to Practice and Teach It. 4th ed. London, UK: Churchill Livingstone; 2010.
16. Portney LG, Watkins MP. Foundations of Clinical Research: Applications to Practice. 3rd ed. London, UK: Prentice Hall; 2000.
17. Benjamini Y, Hochberg Y. Controlling the false discovery rate: A practical and powerful approach to multiple testing. J R Stat Soc 1995;57:289-300.
18. Reise SP, Ainsworth AT, Haviland MG. Item response theory: Fundamentals, applications, and promise in psychological research. Curr Dir Psychol Sci 2005;14:95-101.
19. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: A critical review of curricula. Acad Med 1999;74:686-94.
20. Stuart J, Rutherford R. Medical student concentration during lectures. Lancet 1978;312:514-6.
21. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ 2003;326:319-21.
22. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer HH, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ 2002;325:1338-41.

