M&E Studio
© 2026 Logic Lab LLC. All rights reserved.


Process Evaluation

Assessment of how a programme is implemented: whether activities are delivered as planned and to the intended quality standards.

Definition

A process evaluation (also called an implementation or formative evaluation) assesses how a programme is being delivered. It examines whether activities are implemented as planned, reach the intended beneficiaries, meet the intended quality standards, and occur at the intended frequency. Process evaluation answers "Is the programme working as designed?" rather than "Is the programme producing results?" It focuses on assessing implementation fidelity, identifying barriers and facilitators, and generating learning to improve delivery. Process evaluations are typically conducted during implementation, often at multiple points over time.

Why It Matters

Many programmes fail to achieve results not because their theory is wrong, but because implementation is weak or inconsistent. Process evaluation detects these gaps early, while adjustments can still be made. It builds team understanding of what is and is not working, enabling responsive management and problem-solving. Process evaluation is particularly valuable in complex contexts, with new programme models, or when implementation partners are still learning to deliver activities well. It bridges monitoring data (which tracks outputs delivered) and impact data (which measures outcomes achieved), helping explain why results fell short of or exceeded expectations.

In Practice

A financial literacy programme might conduct a process evaluation after six months that examines: Are training sessions being held at the planned frequency and duration? Are the right facilitators delivering the content? Are the intended beneficiaries participating? Are materials being taught in the intended sequence and depth? Are participants engaged? A process evaluation might find that classes are occurring but attendance is irregular, or that facilitators are skipping certain modules under time pressure. This information would lead to adjustments: perhaps shorter modules, support to address attendance barriers, or revised scheduling. Process evaluation draws on qualitative methods (interviews, observation), quantitative monitoring data (attendance records, fidelity checklists), and programme theory to examine implementation quality.
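The quantitative side of the example above can be sketched as a small script that rolls monitoring data into simple fidelity indicators. This is a minimal illustration, not a standard M&E tool; all field names, site names, and figures (sessions_planned, modules_covered, the "Site A" data) are hypothetical assumptions for the sketch.

```python
# Hypothetical sketch: rolling training-programme monitoring data into
# simple implementation-fidelity indicators. All names and numbers are
# illustrative, not drawn from any real programme or toolkit.

def fidelity_summary(records):
    """Compute delivery rate, module coverage, and mean attendance per site."""
    summary = {}
    for site, r in records.items():
        summary[site] = {
            # Share of planned sessions actually held
            "delivery_rate": r["sessions_held"] / r["sessions_planned"],
            # Share of planned curriculum modules actually taught
            "module_coverage": len(r["modules_covered"]) / len(r["modules_planned"]),
            # Average participants per observed session
            "mean_attendance": sum(r["attendance"]) / len(r["attendance"]),
        }
    return summary

records = {
    "Site A": {
        "sessions_planned": 24,
        "sessions_held": 21,
        "modules_planned": ["budgeting", "saving", "credit", "insurance"],
        "modules_covered": ["budgeting", "saving", "credit"],
        "attendance": [18, 15, 9, 12],  # participants per observed session
    },
}

print(fidelity_summary(records))
```

Indicators like these flag where delivery diverges from design (here, one module skipped and uneven attendance); qualitative follow-up would then probe why.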

Related Topics

  • Formative Evaluation: evaluation conducted during programme development to refine approaches
  • Adaptive Management: using evaluation evidence to adjust strategies
  • Theory of Change: programme logic used to assess whether activities are delivered as designed
  • DAC Evaluation Criteria: international standards for evaluation assessment
  • Contribution Analysis: method for understanding pathways from activities to results

At a Glance

Understand how a programme is functioning and whether implementation aligns with design

Best For

  • New or complex programmes
  • Implementation challenges
  • Learning during programme delivery
  • Supporting adaptive management

Complexity

Medium

Timeframe

Conducted during implementation, often repeatedly

Related Topics

Core Concept
Adaptive Management
A management approach that uses continuous learning from monitoring and evaluation data to adjust programme strategies and activities in response to changing evidence or context.
Pillar
Theory of Change
A structured explanation of how and why a set of activities is expected to lead to desired outcomes, mapping the causal logic from inputs to impact.
Core Concept
Evaluation Criteria (DAC)
The OECD-DAC framework provides five standard criteria (relevance, efficiency, effectiveness, impact, and sustainability) for systematically assessing the merit and value of development interventions.
Pillar
Contribution Analysis
A structured approach to building a credible case for how and why a programme contributed to observed outcomes, without requiring experimental attribution.