The Evidence

Why ePortfolios?

The evidence-based case for portfolio-based assessment in competency-based medical education.

From time-based training to demonstrated competency

Medical education has, for over a century, measured competence by time. Complete three years of residency, attend a prescribed number of lectures, and rotate through specified departments — and you are deemed competent. The implicit assumption is that exposure equals ability. But decades of educational research have shown that this assumption does not hold: two residents who complete identical rotations can emerge with vastly different levels of clinical capability.

Competency-based medical education (CBME) challenges this paradigm. Instead of asking "how long did you train?", it asks "what can you do?" The focus shifts from inputs (time, attendance) to outcomes (observable, measurable clinical activities). This is not merely a philosophical distinction — it fundamentally changes what assessment tools are required. When the goal is to document growth over time, you need a tool that can capture longitudinal evidence, not just snapshots.

Traditional assessment methods — MCQs, vivas, end-of-rotation examinations — measure knowledge at a single point in time. They are useful, but insufficient. They cannot capture how a resident's clinical reasoning evolves, how their procedural skills improve with practice, or how their professional identity matures through reflection. An ePortfolio is the only tool that can capture this trajectory: a living, growing repository of evidence that demonstrates competency development over months and years.

What CBME requires that paper cannot deliver

1. Longitudinal tracking

Competency develops over months and years. A single examination cannot capture whether a resident who struggled with central line insertion in July can now perform it independently in March. Paper logbooks record events but cannot reveal trajectories.

2. Multiple data points

Workplace-based assessments — mini-CEX, CbD, DOPS — accumulate over time. Each individual assessment is a data point; the pattern across dozens of assessments is what reveals competency. Aggregating and visualising these patterns requires a digital system.

3. Reflective practice

Reflection is not navel-gazing — it is a documented process of self-awareness and growth. When a resident writes about a difficult clinical encounter and their mentor responds, both parties learn. This asynchronous, documented dialogue is impossible with paper.

4. Mentor feedback loops

Effective mentorship requires asynchronous, documented guidance. A mentor who can review a resident's portfolio entry, add annotations, and suggest improvements — without needing to schedule a meeting — provides more frequent and actionable feedback.

5. Evidence aggregation

Each clinical experience, reflection, assessment, and feedback item is a piece of evidence. Linking these to specific Entrustable Professional Activities (EPAs) and competencies creates a comprehensive picture that no paper system can assemble.
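To make the "pattern across data points" idea concrete, here is a minimal sketch of how a digital system might group workplace-based assessments by EPA and order them over time. The record fields, EPA names, dates, and entrustment levels are all illustrative assumptions, not drawn from any specific platform:

```python
from collections import defaultdict
from datetime import date

# Each workplace-based assessment is one data point: an EPA, a date,
# and an entrustment level (1 = observe only ... 5 = supervise others).
# All names and values here are hypothetical examples.
assessments = [
    {"epa": "central line insertion", "date": date(2024, 7, 10), "level": 2},
    {"epa": "central line insertion", "date": date(2024, 11, 3), "level": 3},
    {"epa": "central line insertion", "date": date(2025, 3, 21), "level": 4},
    {"epa": "obtain informed consent", "date": date(2024, 9, 1), "level": 3},
]

def trajectories(records):
    """Group assessments by EPA and sort each group by date, so the
    longitudinal pattern — not any single snapshot — becomes visible."""
    by_epa = defaultdict(list)
    for r in records:
        by_epa[r["epa"]].append(r)
    return {
        epa: [r["level"] for r in sorted(rs, key=lambda r: r["date"])]
        for epa, rs in by_epa.items()
    }

print(trajectories(assessments))
# {'central line insertion': [2, 3, 4], 'obtain informed consent': [3]}
```

The rising sequence [2, 3, 4] is exactly the trajectory a paper logbook records but never reveals.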

EPAs: the core unit of modern assessment

Entrustable Professional Activities (EPAs) were introduced by Prof. Olle ten Cate in 2005 as units of professional practice that can be entrusted to a trainee once they have demonstrated sufficient competence. [1] Unlike competencies, which describe individual qualities (medical knowledge, communication, professionalism), EPAs describe real work — "manage a patient with acute chest pain" or "obtain informed consent for a procedure." Each EPA draws on multiple competencies simultaneously, making them a natural unit for workplace-based assessment.

The concept of entrustment is what makes EPAs powerful. Rather than a binary pass/fail, entrustment is graduated: a trainee progresses from observation to independent practice and eventually to supervising others.

Level  Description
1      Observe only
2      Perform with direct supervision
3      Perform with indirect supervision
4      Perform independently
5      Supervise others

The EPA-competency matrix is a critical conceptual tool. EPAs sit on one axis (work units), competencies on the other (individual qualities). Each EPA draws on a specific combination of competencies. Tracking a trainee's progress across this matrix requires a system that can link evidence to both EPAs and competencies — precisely what an ePortfolio provides.
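One way to picture the EPA-competency matrix is as a simple mapping from work units to the qualities they draw on, which evidence items can then be counted against. This is a hedged sketch under assumed names — the EPAs, competencies, and evidence kinds below are illustrative, and real frameworks define their own:

```python
# EPA-competency matrix: EPAs (units of work) on one axis, competencies
# (individual qualities) on the other. Each EPA maps to the set of
# competencies it draws on. All names here are hypothetical examples.
competencies = ["medical knowledge", "communication", "professionalism"]

epa_matrix = {
    "manage acute chest pain": {"medical knowledge", "communication"},
    "obtain informed consent": {"communication", "professionalism"},
}

def evidence_coverage(evidence, matrix):
    """Count how many evidence items touch each competency, by following
    each item's EPA through the matrix."""
    counts = {c: 0 for c in competencies}
    for item in evidence:
        for comp in matrix.get(item["epa"], set()):
            counts[comp] += 1
    return counts

evidence = [
    {"kind": "mini-CEX", "epa": "manage acute chest pain"},
    {"kind": "reflection", "epa": "obtain informed consent"},
]
print(evidence_coverage(evidence, epa_matrix))
```

Because a single evidence item linked to one EPA can count toward several competencies at once, the matrix is what lets an ePortfolio assemble a competency picture from ordinary workplace evidence.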

In 2014, the AAMC defined 13 Core EPAs for entering residency, establishing a reference framework that has since been adopted and adapted worldwide. [2] India's NMC has incorporated EPA-based assessment into its CBME curriculum, making EPA tracking a practical requirement for every medical college.

What the evidence says

The case for portfolios in medical education is not theoretical — it is grounded in a substantial and growing body of evidence. A 2023 scoping review by Lim and colleagues examined 76 studies on portfolio use in medical education and found that portfolios serve a dual purpose: they nurture professional identity formation while simultaneously capturing context-specific micro-competencies that traditional assessments miss entirely. [3] The review highlighted that portfolios are particularly effective at documenting the soft, longitudinal aspects of medical training — the very aspects that CBME prioritises but that examinations cannot measure.

Reflection, often dismissed as a "soft" educational intervention, has robust evidence behind it. Winkel and colleagues conducted a systematic review in 2017, analysing 16 studies covering 477 residents and fellows. They found that structured reflection increased the learning of complex subjects and deepened professional values — outcomes that directly align with the goals of competency-based training. [4] Crucially, the evidence showed that reflection was most effective when it was documented and received mentor feedback — precisely the workflow an ePortfolio supports.

The breadth of ePortfolio applications in healthcare education was mapped by Janssens and colleagues in 2022, who identified eight distinct objectives of ePortfolio use: competency documentation, reflection, feedback, interprofessional collaboration, continuing professional development, bridging theory and practice, employment readiness, and certification. [5] This taxonomy demonstrates that ePortfolios are not single-purpose tools but comprehensive platforms that support the full spectrum of professional development.

Evidence from India is emerging and encouraging. At Geetanjali Medical College in Rajasthan, a 2024 study compared ePortfolio-based learning with traditional seminar-based teaching among 48 MBBS students. The ePortfolio group showed a 29% improvement in knowledge scores compared to 19% through seminars. Faculty satisfaction was 88%, and 92% of faculty recommended expanding ePortfolio use to other subjects. [6] These findings suggest that ePortfolios are not only effective in Western medical education systems but are equally applicable — and perhaps more needed — in the Indian context.

The most sustained Indian evidence comes from Sri Balaji Vidyapeeth, Pondicherry, which introduced the CoBALT (Competency-Based Learning and Training) model in 2015 — the first institution in India to formally implement competency-based training for PG residents. The model uses ACGME's six competency domains, 30–40 EPAs per department with Dreyfus-based entrustment levels, multisource feedback adapted for Indian cultural context, and continuous e-portfolio monitoring with weekly mentor feedback. [8] A subsequent paper advanced the framework further by introducing descriptive rubrics for EPA assessment, improving objectivity in entrustment decisions beyond the generic Dreyfus scale. [9] The CoBALT model demonstrates that comprehensive competency-based training with e-portfolio monitoring is not only feasible within Indian regulatory constraints but can function as a model for other institutions.

Portfolio-based vs traditional assessment

Dimension                    Traditional                    Portfolio-Based
What it measures             Knowledge at a point in time   Competency development over time
Attitudes & professionalism  Difficult to assess            Natural fit
Feedback                     End-of-rotation, summative     Continuous, formative
Learner agency               Passive                        Active — learner curates evidence
Mentor role                  Examiner                       Coach and guide

The Indian context

India's National Medical Commission (NMC) mandated competency-based medical education from 2019 onwards, and the Postgraduate Medical Education Regulations (PGMER-2023) explicitly require e-logbooks for postgraduate trainees. This is not a suggestion — it is a regulatory mandate. Yet the current reality across India's 706+ medical colleges is that the vast majority still rely on paper logbooks, manual attendance registers, and end-of-posting assessments.

The gap between regulatory expectation and ground reality is significant. NMC expects documented EPA tracking, competency mapping, and reflective practice. Most institutions have none of the infrastructure to deliver this. The challenge is compounded by frequent regulatory changes: a 2024 study found that 81% of faculty were unhappy with the pace and frequency of regulatory changes in Indian medical education. [7] But this finding actually strengthens the case for ePortfolios — a digital platform can adapt to regulatory changes far faster than paper systems. When NMC revises competency frameworks or assessment requirements, an ePortfolio platform can update its templates and frameworks within days. A paper logbook redesign takes months of printing and distribution.

India currently has no dominant ePortfolio platform for medical education. Global platforms like MedHub and New Innovations are designed for ACGME requirements and priced for American institutions — typically USD 15-50 per user per year. For a large Indian medical university with thousands of trainees, these costs are prohibitive. This creates an opportunity: an open-source, regulation-aware ePortfolio platform built specifically for Indian medical colleges can fill a gap that no existing solution addresses.
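The licensing arithmetic is easy to make concrete. The USD 15-50 per-user range comes from the text above; the trainee count is a hypothetical figure for a large university, chosen only for illustration:

```python
# Back-of-envelope annual licensing cost for a commercial per-user
# platform. Per-user range (USD 15-50/year) is from the text; the
# trainee count is an assumed example, not a real institution's figure.
trainees = 3000
low, high = 15, 50  # USD per user per year

annual_low = trainees * low
annual_high = trainees * high
print(f"USD {annual_low:,} to {annual_high:,} per year")
# USD 45,000 to 150,000 per year
```

At that scale, recurring per-user fees quickly dwarf what most Indian medical colleges budget for education technology, which is the gap an open-source platform targets.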
