Trust in Action: Best Practices, Pitfalls, and the Future of EPAs

Entrustable Professional Activities (EPAs) have significantly shaped health professions education since their introduction in the mid-2000s (ten Cate, 2005). At their core, EPAs are observable, measurable professional tasks used to assess a learner's level of entrustability. Entrustability reflects a progression from direct supervision, where the instructor closely observes the learner, to indirect supervision, where oversight takes place from a distance, and eventually to full independence, where the learner performs the task without direct observation.
For example, an instructor might initially watch a medical learner closely while the learner performs a physical exam, but as the learner demonstrates proficiency, the instructor may trust the learner to conduct exams more independently, verifying only the final diagnosis.
EPAs have been increasingly adopted across many health professions, including medicine, pharmacy, and nursing (Liu et al., 2021; Abeyaratne & Galbraith, 2023). The use of EPAs in these clinical settings provides educators and supervisors with a structured framework to assess and report on learners' clinical competence using authentic assessment strategies. When implemented effectively, EPAs clarify expectations for learners and enhance patient safety and educational outcomes by ensuring that clinical responsibilities align closely with learners' readiness, knowledge, and skill.
However, despite their growing popularity and clear benefits, the use of EPAs is not without its challenges. Misunderstanding their purpose, inconsistent implementation, and the conflation of EPAs with competencies and skills can all compromise the effectiveness of EPAs in clinical environments (ten Cate & Schumacher, 2022; Wu et al., 2022). To maximize the educational and clinical impact of EPAs, it is important to address these issues proactively and promote best practices. It can also be helpful to understand how EPAs are currently utilized in health professions education, be aware of common implementation pitfalls, and consider future-oriented applications of EPA-based assessment.
Current Practice
EPAs have become central to competency-based education (CBE) at both undergraduate and graduate medical education levels (Pinilla et al., 2021; Gummesson et al., 2023). In addition to utilizing EPAs developed by standards-based, professional, or accrediting organizations, institutions may also develop their own or modify existing EPAs through structured processes that involve a wide array of participants, including pre-clinical professors, clinicians, and learners.
Assessment of EPAs often involves methods such as direct observation by instructors and self-assessment by learners. Mobile applications and digital tools have increasingly been adopted to streamline these assessments, making the feedback process more efficient and timely (Wu et al., 2022; Liu et al., 2021). These methods facilitate formative feedback, allowing learners to continuously improve their understanding and practice through clear, actionable input from instructors.
In practice, EPAs align clinical responsibilities with learners’ development and skill level, providing structured opportunities for growth. By clearly outlining tasks and expectations, EPAs aid in assessment and help learners build confidence and autonomy as they progress through their professional education.
Common Misuses and Challenges
Despite their clear benefits, EPAs are often misunderstood or even misused, limiting their educational potential. One common misstep is mistaking EPAs for competencies. Competencies represent the knowledge, skills, and attitudes necessary to perform professional tasks, whereas EPAs are the actual professional activities or tasks themselves. When these two are confused, assessments can simply become "checklists" that fail to capture the true nature of professional practice (ten Cate & Schumacher, 2022).
Another significant challenge is the inconsistent implementation of EPAs across institutions and specialties. EPAs need some degree of standardization to ensure consistency, but overly rigid frameworks can reduce their practical utility and adoption in local settings. Variation in faculty training and inconsistent understanding among instructors further magnify these issues, undermining the reliability and validity of EPA assessment tools (Pinilla et al., 2021).
Finally, EPAs can suffer from inadequate integration into existing curricula, leading to them being perceived as additional administrative burdens rather than beneficial tools. For EPAs to be effective, they must be seamlessly integrated into regular educational and clinical routines, ensuring they are both meaningful and sustainable for all involved parties. Digital tools and platforms, such as those offered by Elentra, make it easy to initiate and complete assessments, significantly ease administrative burden, and can soften the perception among participants who may view EPAs as simply “another hurdle.”
Best Practices
Effective EPA implementation requires maintaining a clear differentiation between EPAs and competencies. This clarity ensures that both can be assessed accurately and meaningfully. Utilizing standardized, validated tools such as the EQual rubric or the QUEPA tool during EPA development can significantly enhance validity and reliability (Abeyaratne & Galbraith, 2023).
Structured training for faculty and supervisors is crucial to successful EPA implementation. Such training helps educators understand how to effectively observe, assess, and provide feedback on EPAs, ensuring a high level of fidelity in implementation. Faculty must see EPAs as valuable educational tools rather than administrative burdens to ensure active engagement and consistent usage (ten Cate & Schumacher, 2022).
Finally, EPAs should primarily serve formative assessment purposes. Continuous formative feedback encourages learners to recognize and address skill gaps in real time. This promotes ongoing professional development, and when effectively integrated, EPAs can significantly enhance both learner development and patient care quality.
Future Directions
Looking ahead, EPAs are likely to undergo further refinement toward increased authenticity and longitudinal integration within curricula. Future developments may include more sophisticated digital tools and mobile technologies that streamline assessment and feedback processes, making real-time formative feedback even more accessible and actionable and improving all stakeholders' understanding of gaps in learning (Wu et al., 2022).
Additionally, expanding EPAs across a broader range of healthcare specialties and contexts will enhance their applicability and relevance. This expansion requires ongoing research, regular stakeholder consultations, and continuous iteration based on evidence and educational outcomes (Pinilla et al., 2021).
Ultimately, the success of EPAs hinges on their continuous refinement, guided by evidence-based research and collaborative input from educators, clinicians, and learners. This proactive approach will ensure EPAs remain impactful educational tools within health professions education and continue to strengthen learning outcomes and patient care.
Elentra’s comprehensive health professions platform includes numerous tools relevant to the assessment of EPAs, including online evaluation forms, a mobile app for clinical site assessments and procedure logs, and a dedicated module for CBE framework management, tracking, and reporting. To learn more about how Elentra can help you integrate EPA-based assessment practice into your curriculum, contact us today.
References
- Abeyaratne, C., & Galbraith, K. (2023). A review of entrustable professional activities in pharmacy education. American Journal of Pharmaceutical Education, 87(3).
- Gummesson, C., Alm, S., Cederborg, A., Ekstedt, M., Hellman, J., Hjelmqvist, H., ... & Tejera, A. (2023). Entrustable professional activities (EPAs) for undergraduate medical education – development and exploration of social validity. BMC Medical Education, 23, 635.
- Liu, L., Jiang, Z., Qi, X., Xie, A., Wu, H., Cheng, H., ... & Li, H. (2021). An update on current EPAs in graduate medical education: A scoping review. Medical Education Online, 26(1), 1981198.
- Pinilla, S., Lenouvel, E., Cantisani, A., Klöppel, S., Strik, W., Huwendiek, S., & Nissen, C. (2021). Working with entrustable professional activities in clinical education in undergraduate medical education: A scoping review. BMC Medical Education, 21, 172.
- ten Cate, O. (2005). Entrustability of professional activities and competency-based training. Medical Education, 39(12), 1176–1177.
- ten Cate, O., & Schumacher, D. J. (2022). Entrustable professional activities versus competencies and skills: Exploring why different concepts are often conflated. Advances in Health Sciences Education, 27(2), 491–499.
- Wu, S.‑J., Fan, Y.‑F., Chien, C.‑Y., & Wu, Y.‑J. (2022). Merits and demerits of entrustable professional activities for assessments of surgical residents: A systematic review. Health Sciences Review, 5, 100069.