M&E Studio
© 2026 Logic Lab LLC. All rights reserved.


Process Evaluation: What It Is and How to Conduct One

Process evaluation examines how a program is being implemented: whether activities are delivered as designed, to the right people, and at the right quality. It answers "How is the program being delivered?" before asking "Did it work?"

Definition

A process evaluation (also called implementation or formative evaluation) assesses how a program is being delivered. It examines whether activities are being implemented as planned, reaching the intended beneficiaries, at the intended quality, and with intended frequency. Process evaluation answers "Is the program working as designed?" rather than "Is the program creating results?" It focuses on understanding implementation fidelity, identifying barriers and facilitators, and generating learning to improve delivery. Process evaluations are typically conducted during implementation, often repeatedly at multiple points.

Why It Matters

Many programs fail to achieve results not because their theory is wrong, but because implementation is weak or inconsistent. Process evaluation detects these gaps early, when adjustments can still be made. It builds team understanding of what is working and what is not, enabling responsive management and problem-solving. Process evaluation is particularly valuable in complex contexts, with new program models, or when implementation partners are learning to deliver quality activities. It bridges monitoring data (which tracks outputs delivered) and impact data (which measures outcomes achieved), explaining why results fell short or exceeded expectations.

In Practice

A financial literacy program might conduct a process evaluation after six months that examines: Are training sessions being held at the planned frequency and duration? Are the right facilitators delivering the content? Are the intended beneficiaries participating? Are materials being taught in the intended sequence and depth? Are participants engaged? A process evaluation might find that classes are occurring but attendance is irregular, or that facilitators are skipping certain modules under time pressure. This information would lead to adjustments: perhaps shorter modules, support to address attendance barriers, or revised scheduling. Process evaluation draws on qualitative methods (interviews, observation), quantitative monitoring data (attendance records, fidelity checklists), and program theory to examine implementation quality.
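As a sketch, the fidelity checks described above could be tallied directly from routine monitoring data. The record structure and field names below are illustrative assumptions, not from any specific M&E toolkit:

```python
# Hypothetical session-level monitoring records for a financial literacy
# program: modules planned vs. delivered, and enrollment vs. attendance.
sessions = [
    {"session": 1, "planned_modules": 4, "delivered_modules": 4, "enrolled": 30, "attended": 26},
    {"session": 2, "planned_modules": 4, "delivered_modules": 3, "enrolled": 30, "attended": 19},
    {"session": 3, "planned_modules": 4, "delivered_modules": 2, "enrolled": 30, "attended": 22},
]

def fidelity_summary(records):
    """Average module-coverage fidelity and attendance rate across sessions."""
    coverage = sum(r["delivered_modules"] / r["planned_modules"] for r in records) / len(records)
    attendance = sum(r["attended"] / r["enrolled"] for r in records) / len(records)
    return {"module_coverage": round(coverage, 2), "attendance_rate": round(attendance, 2)}

print(fidelity_summary(sessions))
# → {'module_coverage': 0.75, 'attendance_rate': 0.74}
```

Here 75% module coverage would flag exactly the pattern described above (facilitators skipping modules), prompting qualitative follow-up on why, rather than serving as a verdict on its own.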

Related Topics

  • Formative Evaluation: Evaluation conducted during program development to refine approaches
  • Adaptive Management: Using evaluation evidence to adjust strategies
  • Theory of Change: Program logic used to assess whether activities are delivered as designed
  • DAC Evaluation Criteria: International standards for evaluation assessment
  • Contribution Analysis: Method for understanding pathways from activities to results

At a Glance

Understand how a program is functioning and whether implementation aligns with design

Best For

  • New or complex programs
  • Implementation challenges
  • Learning during program delivery
  • Supporting adaptive management

Related Topics

  • Adaptive Management: A management approach that uses continuous learning from monitoring and evaluation data to adjust program strategies and activities in response to changing evidence or context.
  • Theory of Change: A structured explanation of how and why a set of activities is expected to lead to desired outcomes, mapping the causal logic from inputs to impact.
  • Evaluation Criteria (DAC): The OECD-DAC framework provides standard criteria (relevance, coherence, effectiveness, efficiency, impact, and sustainability) for systematically assessing the merit and value of development interventions.
  • Contribution Analysis: A structured approach to building a credible case for how and why a program contributed to observed outcomes, without requiring experimental attribution.