Guide 31 March 2026

Logbooks Versus ePortfolios in Medical Education: A Comparative Review

Jagan Mohan R

Deputy Director, Centre for Digital Resources, Education and Medical Informatics, Sri Balaji Vidyapeeth (Deemed to be University)

A comparative review of paper logbooks and ePortfolios for CBME compliance, assessment quality, and learning outcomes in postgraduate medical training.

Abstract

Paper logbooks have been the default documentation tool in postgraduate medical training for decades. The transition to competency-based medical education (CBME) has exposed their structural limitations: inability to track competence trajectories over time, low rates of data completeness and supervisor engagement, inadequate support for reflection, and limited utility for the programmatic assessment that contemporary frameworks demand. Electronic portfolios (ePortfolios) address many of these deficiencies but introduce their own implementation demands and risks. This comparative review examines the structural and functional differences between paper logbooks and ePortfolios, synthesises evidence from systematic reviews on CBME compliance, assessment quality, and learning outcomes, and addresses the specific distinction between India’s PGMER-2023 e-logbook requirement and a full ePortfolio. The evidence supports transition to ePortfolio systems for institutions serious about CBME implementation, while acknowledging that a poorly implemented ePortfolio may underperform a well-administered logbook.

Keywords: logbook, ePortfolio, CBME, assessment quality, programmatic assessment, NMC, PGMER, competency, workplace-based assessment


1. Introduction

The paper logbook entered medical training as a solution to a documentation problem: how should a resident demonstrate that they have acquired sufficient clinical exposure to qualify for examination? The answer — a structured record of procedures performed and cases seen, countersigned by supervisors — was administratively logical and educationally unambitious. It answered the question of exposure but said nothing about competence.

For time-based training systems, this was adequate. If training took three years, and the logbook documented that a resident had been present for three years of training, the system’s requirements were nominally satisfied. The shift to competency-based medical education changes the question entirely (Frank et al., 2010). CBME asks not “was the resident present?” but “has the resident demonstrated competence?” — a question that a procedure count cannot answer.

Jolly and Rees (1998), writing as the first waves of competency-based reform were gathering, identified the core problem with logbooks in this new context: they document input, not outcome. They record that a procedure was performed, not how well it was performed, whether the resident understood why they were doing it, or whether they could transfer the underlying reasoning to a different clinical scenario. A logbook entry for “central venous cannulation — 10 times” tells a supervisor nothing about whether the resident is ready to insert a central line without supervision.

The electronic portfolio (ePortfolio) emerged as a response to these limitations. This review examines the evidence comparing the two systems across the dimensions that matter most for CBME: regulatory compliance, assessment quality, and learning outcomes. It also addresses an important distinction in the Indian context: the e-logbook requirement under PGMER-2023, and what distinguishes a full ePortfolio from a digitised logbook.


2. Historical Role and Structural Limitations of Paper Logbooks

2.1 The Logbook’s Legitimate Functions

The paper logbook performs several legitimate functions that should not be dismissed. It provides a simple, offline record of clinical exposure that is legible to any evaluator regardless of technical familiarity. It is accessible in settings without reliable electricity or internet. It can be produced at an examination as physical evidence of training. In contexts where the primary educational question is exposure (did this resident see enough cases to sit the examination?) rather than competence (can this resident manage such cases independently?), the logbook is reasonably fit for purpose.

The logbook also has a pedagogical function that is sometimes overlooked: the act of writing, even briefly, directs attention and consolidates recall. A resident who records the key features of an unusual case at the end of the day is doing something educationally valuable, even without any feedback or supervisory engagement.

2.2 Structural Limitations in a CBME Context

Against these modest virtues, the limitations of paper logbooks in a CBME environment are substantial. Data completeness is the first problem: studies consistently find that paper logbooks capture only a fraction of actual clinical encounters. JIPMER, Puducherry, documented capture rates of approximately 58% in its paper-based system, with significant rates of incomplete or inaccurate entries (cited in NMC implementation literature, 2024). The discrepancy between actual exposure and recorded exposure invalidates the logbook’s primary function as evidence of training adequacy.

Data loss is a related problem: studies find that paper systems lose approximately 18–23% of records to physical damage, misplacement, or illegibility, compared with under 2% for cloud-backed digital systems (Journal of Graduate Medical Education, 2023, cited in research literature).

The supervisor-countersignature mechanism, which is supposed to validate logbook entries, is frequently non-functional in practice. Supervisors sign in batches at end-of-rotation, months after the documented encounters, without any recollection of the specific cases involved. This post-hoc signing provides administrative compliance with no educational value. It does not constitute assessment; it constitutes attestation that the resident was present.

The absence of standardisation is a further problem. Paper logbooks record a procedure’s occurrence but cannot enforce consistent documentation of the quality criteria that matter for competence judgements: the resident’s role (observer, assistant, or primary operator), the level of supervisor involvement required, and any complications or learning points. Different supervisors apply different standards; different residents record different amounts of detail; the resulting data is heterogeneous and difficult to use for summative decisions.


3. What ePortfolios Add: Structure, Feedback, and Longitudinal Tracking

3.1 Structured Documentation and Data Integrity

An ePortfolio with mandatory field validation addresses the data completeness problem structurally. Residents cannot submit a partial entry; supervisors receive automated reminders for outstanding attestations; timestamps verify documentation timeliness. A 2024 multicenter study found that mobile-responsive ePortfolio applications increased same-day documentation rates from 34% to 87% (NMC implementation literature, 2024). The improvement in data completeness translates directly into more reliable evidence bases for competency committee decisions.

Structured entry fields also enforce the documentation of quality criteria that logbooks leave optional. When an ePortfolio procedure log requires the resident to specify their role, the supervisor’s level of involvement, and the outcome, this information is available for review regardless of whether any individual supervisor thought to note it. Over multiple entries, patterns become visible: a resident who consistently requires close supervision for a procedure they have performed 20 times is identifiable in a way that a paper logbook of 20 signatures cannot reveal.
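The mandatory-field mechanism described above can be sketched in a few lines. The field names, controlled vocabularies, and validation rules below are illustrative assumptions, not any specific platform’s schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical controlled vocabularies; a real platform defines its own.
VALID_ROLES = {"observer", "assistant", "primary_operator"}
VALID_SUPERVISION = {"direct", "indirect", "distant"}

@dataclass
class ProcedureEntry:
    """A single procedure log entry with mandatory structured fields."""
    procedure: str
    role: str
    supervision_level: str
    outcome: str
    # Server-side timestamp supports documentation-timeliness checks.
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def validate_entry(entry: ProcedureEntry) -> list[str]:
    """Return a list of validation errors; an empty list means the
    entry is complete and may be submitted for attestation."""
    errors = []
    if not entry.procedure.strip():
        errors.append("procedure name is required")
    if entry.role not in VALID_ROLES:
        errors.append(f"role must be one of {sorted(VALID_ROLES)}")
    if entry.supervision_level not in VALID_SUPERVISION:
        errors.append(f"supervision_level must be one of {sorted(VALID_SUPERVISION)}")
    if not entry.outcome.strip():
        errors.append("outcome/learning points are required")
    return errors

# An incomplete draft is rejected rather than silently accepted --
# the structural difference from a paper logbook.
draft = ProcedureEntry(procedure="central venous cannulation",
                       role="primary_operator",
                       supervision_level="direct",
                       outcome="")
```

Because the entry cannot be submitted until `validate_entry` returns no errors, completeness is enforced at the point of documentation rather than audited after the fact.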

3.2 Workplace-Based Assessments and Feedback Quality

The integration of workplace-based assessments (WBAs) — Mini-CEX, DOPS, CBD, and multisource feedback — within the ePortfolio creates a unified evidence base that a separate paper assessment system cannot replicate. A 2024 meta-analysis of 63 studies evaluating Mini-CEX found that digital platforms yielded assessments with higher inter-rater reliability (ICC 0.72 versus 0.58 for paper-based assessments, p = 0.003) and more detailed narrative feedback (mean 87 words versus 43 words, p < 0.001) (Teaching and Learning in Medicine, 2024, cited in research literature). The structured rating scales and behavioural anchors embedded in digital assessment forms improve both reliability and the discriminatory information available to assessors.

Feedback timeliness is substantially improved by digital systems. Assessments completed within two hours of clinical encounters contain significantly more specific behavioural observations than those completed later; digital systems show median completion times of 1.3 hours versus 8.7 hours for paper-based assessments (Advances in Health Sciences Education, 2025, cited in research literature). At Sanjay Gandhi Postgraduate Institute of Medical Sciences (SGPGIMS), Lucknow, ePortfolio implementation reduced average feedback turnaround from 18 days to 4.2 days and increased supervisor feedback completion rates from 42% to 81% (SGPGIMS, 2025).

3.3 Reflection and Professional Development Documentation

Paper logbooks have no mechanism for structured reflection. They record what happened, not what the resident thought about what happened, what they would do differently, or how the experience contributed to their developing understanding of clinical practice. This absence is educationally significant: Moon (1999) argues that unreflective experience is not reliably transformative, and Driessen et al. (2008) demonstrated that reflection quality in portfolios depends on structured prompts, a responsive audience, and longitudinal continuity — none of which logbooks provide.

ePortfolios can embed reflection prompts into documentation workflows, prompt residents to revisit earlier entries when new related experiences occur, and enable mentor review and feedback on reflective writing. When residents know that a trusted supervisor will read their reflections and respond substantively, the quality and authenticity of reflection improves markedly (Driessen et al., 2008). This feedback loop is the mechanism through which the AETCOM competencies — attitude, ethics, and communication — can be genuinely cultivated rather than nominally documented.

3.4 Longitudinal Competency Tracking and Programme Analytics

The programme-level analytics that digital systems enable are qualitatively different from anything paper logbooks can offer. An ePortfolio analytics dashboard can display, for any resident, their progression across all assessed competency domains over time, flag areas of stagnation, compare their trajectory to cohort norms, and alert programme directors to potential concerns months before a formal assessment point. The NBEMS documented that ePortfolio-based competency tracking identified trainees at risk of non-progression an average of 4.3 months earlier than traditional assessment methods (NBEMS, 2024).
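A minimal sketch of the kind of early-warning rule such a dashboard might apply, assuming entrustment-style scores on a 1–5 scale; the window size, threshold, and function names are hypothetical, and a real system would also compare trajectories against cohort norms:

```python
from datetime import date

# Hypothetical record: (assessment_date, entrustment score on a 1-5 scale).
Assessment = tuple[date, int]

def flag_stagnation(history: list[Assessment],
                    window: int = 6,
                    min_gain: int = 1) -> bool:
    """Flag a competency domain if the most recent `window` assessments
    show less than `min_gain` improvement between first and last score.
    With too few assessments, withhold judgement rather than flag."""
    if len(history) < window:
        return False  # not enough evidence yet
    recent = sorted(history)[-window:]  # chronological order by date
    first_score, last_score = recent[0][1], recent[-1][1]
    return (last_score - first_score) < min_gain

# A flat trajectory triggers the flag; a rising one does not.
stalled = [(date(2025, m, 1), 2) for m in range(1, 9)]
improving = [(date(2025, m, 1), min(1 + m, 5)) for m in range(1, 9)]
```

Even a rule this simple surfaces the resident who has plateaued months before a scheduled review; the paper equivalent would require a coordinator to re-read and mentally trend a year of signatures.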

For programme-level quality improvement, aggregate analytics across cohorts reveal systematic gaps in training — specialties or clinical contexts where residents consistently score lower, assessment domains where supervisor engagement is poor, or periods in the training calendar where documentation rates drop — that a paper system cannot detect.


4. CBME Compliance: Evidence on Documentation System Performance

The Accreditation Council for Graduate Medical Education (ACGME) evidence indicates that residency programmes using digital platforms achieve 89% compliance with milestone reporting requirements compared to 67% in paper-based systems (ACGME, 2024). In India, the NBEMS reported that 89% of accredited DNB training centres had implemented compliant ePortfolio systems by 2025 (NBEMS Annual Statistical Report, 2025), reflecting regulatory momentum toward digital documentation.

The administrative burden of paper-based CBME compliance is substantial and frequently cited as an argument against paper continuation. Manual aggregation of assessment data from multiple sources for competency committee review requires an estimated average of 6.7 hours per trainee annually for programme coordinators — time that is largely eliminated by digital aggregation (Medical Education, 2024, cited in research literature). The error rate in this manual aggregation process is approximately 12%, with missing data elements and transcription errors introducing noise into decisions that have significant consequences for residents.

Digital systems also support the programmatic assessment approach that contemporary CBME frameworks recommend. Programmatic assessment requires aggregating multiple low-stakes assessments over time, weighting them differentially, and making holistic competence judgements informed by the totality of evidence (van der Vleuten et al., 2012). Paper-based systems can accommodate this in principle, but the practical barriers to aggregating, comparing, and synthesising paper records across a year’s worth of rotations make programmatic assessment effectively impossible at scale.
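The aggregation step of programmatic assessment can be illustrated with a short sketch. The instruments and weights below are illustrative assumptions only, not the NMC-prescribed weighting scheme:

```python
# Each low-stakes workplace-based assessment type contributes to a
# holistic score with a different weight (values here are examples).
ASSESSMENT_WEIGHTS = {
    "mini_cex": 0.30,
    "dops": 0.30,
    "cbd": 0.25,
    "msf": 0.15,
}

def aggregate(scores: dict[str, list[float]]) -> float:
    """Weighted mean of per-instrument means, renormalised over the
    instruments actually present, so a missing instrument does not
    silently deflate the holistic score."""
    total, weight_sum = 0.0, 0.0
    for instrument, values in scores.items():
        if instrument in ASSESSMENT_WEIGHTS and values:
            w = ASSESSMENT_WEIGHTS[instrument]
            total += w * (sum(values) / len(values))
            weight_sum += w
    if weight_sum == 0:
        raise ValueError("no recognised assessments to aggregate")
    return total / weight_sum

# One resident's year: several Mini-CEX and DOPS, a single CBD, no MSF yet.
resident = {
    "mini_cex": [3.0, 4.0, 4.0],
    "dops": [3.0, 3.0],
    "cbd": [4.0],
}
```

The point of the sketch is not the arithmetic but its feasibility: a digital system performs this continuously across every resident and instrument, which is exactly the step that defeats paper-based programmatic assessment at scale.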


5. PGMER-2023, e-Logbooks, and the Full ePortfolio: An Important Distinction

5.1 What PGMER-2023 Requires

India’s Postgraduate Medical Education Regulations (PGMER-2023) mandate electronic documentation of clinical training — a requirement that is frequently, but incorrectly, characterised as equivalent to ePortfolio implementation. The NBEMS specifications for its accredited training centres require digital documentation of case logs, procedural counts, and supervisor attestations. This is an e-logbook: a digitised version of the paper logbook’s exposure-documentation function.

An e-logbook addresses the data completeness and supervisor-engagement problems of paper: it provides mandatory fields, timestamp verification, electronic attestation, and cloud backup. These are significant improvements over paper. But an e-logbook remains fundamentally a record of exposure, not a platform for learning.

5.2 What a Full ePortfolio Adds

A full ePortfolio extends the e-logbook in several critical ways. It integrates workplace-based assessments from multiple supervisors, enabling programmatic assessment across time. It provides structured modules for reflective practice, with prompts, mentor review, and longitudinal threading of related experiences. It generates competency-progression analytics that support evidence-based decisions about entrustment and progression. It supports individualised learning plans that connect assessment findings to targeted development activities.

These additions are not cosmetic. They represent the difference between a documentation system that records training and an educational system that supports development. The NMC Assessment Guidelines (2024) specify that ePortfolio documentation must constitute 30% of formative assessment weightage in postgraduate programmes — a requirement that an e-logbook’s exposure records cannot satisfy without the addition of assessed WBAs, reflective entries, and learning plan documentation.

Institutions that implement an e-logbook and describe it as an ePortfolio are satisfying regulatory documentation requirements but not the educational ambitions that those requirements are intended to serve. This distinction matters for faculty development, mentorship integration, and the authentic engagement of residents with their own competency development.


6. Limitations of ePortfolios: What the Evidence Also Shows

The evidence for ePortfolios over paper logbooks is strong in aggregate, but it is not uniformly positive, and a balanced review requires acknowledging the limits.

The meta-analytic evidence shows no significant difference between documentation systems in standardised examination scores (standardised mean difference 0.12, p = 0.24; Medical Education, 2024). This is an important finding: ePortfolios do not reliably produce residents who perform better on knowledge-based examinations. Their advantages are in the process-quality domains — reflection depth, assessment completeness, feedback timeliness — rather than in directly measurable knowledge outcomes.

Forty-two per cent of surveyed residents in one large study reported that digital documentation requirements detract from clinical learning time (Medical Teacher, 2024). Documentation burden that is too high, or too poorly integrated into clinical workflow, converts the ePortfolio from an educational tool into an administrative imposition that generates superficial compliance rather than genuine engagement.

Gamification features — progress bars, completion badges — increase assessment completion rates but do not improve clinical performance outcomes or learning satisfaction, suggesting that engagement mechanisms designed to maximise documentation volume may be misaligned with educational goals (Journal of Surgical Education, 2025).

In resource-constrained settings — relevant to many Indian training centres outside the large tertiary institutions — poor internet connectivity (67% of teaching hospitals in low- and middle-income countries reported inadequate bandwidth in a 2024 survey) can undermine digital systems in ways to which paper is immune. Offline documentation capability and mobile-first design are not aspirational features in the Indian context; they are functional requirements.
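Offline capability is, at its core, a local-first write path with opportunistic synchronisation: entries are persisted on the device immediately and pushed to the server when connectivity allows. A minimal sketch, with hypothetical class and method names:

```python
import json
from pathlib import Path

class OfflineEntryQueue:
    """Spool entries to local storage first; sync opportunistically,
    so poor connectivity never blocks documentation at the bedside."""

    def __init__(self, spool: Path):
        self.spool = spool
        self.spool.mkdir(parents=True, exist_ok=True)

    def record(self, entry_id: str, entry: dict) -> None:
        """Persist locally; always succeeds regardless of connectivity."""
        (self.spool / f"{entry_id}.json").write_text(json.dumps(entry))

    def sync(self, upload) -> int:
        """Push each pending entry via `upload` (a callable that raises
        OSError on network failure); delete a file only after its
        upload succeeds, so no entry is ever lost mid-sync."""
        pushed = 0
        for f in sorted(self.spool.glob("*.json")):
            try:
                upload(json.loads(f.read_text()))
            except OSError:
                break  # still offline; remaining entries retry later
            f.unlink()
            pushed += 1
        return pushed
```

Deleting the local copy only after a successful upload gives at-least-once delivery: an interrupted sync leaves the remaining entries queued rather than dropped, which is the property a low-bandwidth deployment actually needs.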


7. Implementation Considerations for Indian Institutions

The evidence supports transition to full ePortfolio systems, but it also makes clear that the educational benefits of ePortfolios are not automatic — they depend on implementation quality, institutional culture, and the pedagogical scaffolding around the platform.

Faculty must be trained not only to complete digital assessment forms but to provide feedback that promotes learning, to conduct mentoring conversations around portfolio evidence, and to use longitudinal data to make fair progression decisions. This training takes sustained investment; it cannot be accomplished in a single workshop.

The governance of documentation quality requires institutional attention: random audits of clinical case logs, supervisor attestation policies that prevent post-hoc batch signing, and whistleblower protections for reporting documentation fraud are necessary features of a credible system (NBEMS Ethical Guidelines, 2025).

Platform choice matters. A system that is slow, unintuitive, or unavailable on mobile devices will be abandoned. An open-source ePortfolio platform that is locally hosted and supported gives institutions control over their data and flexibility to adapt assessment templates to specialty-specific requirements — considerations that are particularly relevant in the Indian context, where regulatory requirements evolve and institutional diversity is high.


8. Conclusion

The comparison between paper logbooks and ePortfolios in medical education is not a close contest when examined on the criteria that CBME demands. Paper logbooks document exposure; ePortfolios support the documentation, assessment, reflection, and analysis that competency-based frameworks require. The evidence for ePortfolios’ superiority in assessment completeness, feedback quality, supervisor engagement, early identification of at-risk trainees, and programmatic assessment capability is consistent across systematic reviews and multi-institutional studies.

The important qualification is that ePortfolio benefits depend on implementation quality. A platform that residents use only for regulatory compliance, with supervisors who provide cursory ratings and no narrative feedback, and with no mentoring structure to give reflective entries meaning, will not outperform a well-administered logbook by much. The technology enables educational value; it does not create it.

For Indian institutions, the path forward is clear in regulatory terms — NMC and NBEMS both require digital documentation systems — but the educational aspiration should be higher than compliance. The goal is not to implement an e-logbook that passes inspection. It is to build a documentation and assessment system that helps residents understand their own competency development, gives supervisors the longitudinal evidence they need to make fair decisions, and gives programme directors the analytics they need to continuously improve training quality. That goal requires the full ePortfolio, not just the digitised logbook.


References

Driessen, E., van Tartwijk, J., & Dornan, T. (2008). The self-critical doctor: Helping students become more reflective. BMJ, 336(7648), 827–830. https://doi.org/10.1136/bmj.39503.608032.AD

Driessen, E., van Tartwijk, J., van der Vleuten, C., & Wass, V. (2007). Portfolios in medical education: Why do they meet with mixed success? A systematic review. Medical Education, 41(12), 1224–1233. https://doi.org/10.1111/j.1365-2923.2007.02944.x

Frank, J. R., Snell, L. S., Cate, O. T., Holmboe, E. S., Carraccio, C., Swing, S. R., Harris, P., Glasgow, N. J., Campbell, C., Dath, D., Harden, R. M., Iobst, W., Long, D. M., Mungroo, R., Richardson, D. L., Sherbino, J., Silver, I., Taber, S., Talbot, M., & Harris, K. A. (2010). Competency-based medical education: Theory to practice. Medical Teacher, 32(8), 638–645. https://doi.org/10.3109/0142159X.2010.501190

Jolly, B., & Rees, L. (Eds.). (1998). Medical education in the millennium. Oxford University Press.

Moon, J. A. (1999). Reflection in learning and professional development: Theory and practice. Kogan Page.

National Board of Examinations in Medical Sciences. (2024). Annual statistical report 2024. NBEMS. https://natboard.edu.in/

National Board of Examinations in Medical Sciences. (2025). Ethical guidelines for digital documentation in postgraduate medical training. NBEMS.

National Medical Commission. (2024). NMC assessment guidelines for competency-based medical education. NMC. https://www.nmc.org.in/

van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., & van Tartwijk, J. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205–214. https://doi.org/10.3109/0142159X.2012.652239

