Process indicators measure implementation quality, not just activity counts. They answer "how well is the program running?" which is a different question from "how much is the program delivering?"
What Process Indicators Measure
Process indicators track the delivery quality dimension of a program. The usual categories:
- Fidelity. How closely implementation matches the documented protocol or curriculum. Example: percentage of training sessions delivered with all required modules covered.
- Coverage. What share of the eligible population is actually being reached. Example: proportion of target households receiving at least one home visit per quarter.
- Dose. The amount or duration of exposure participants receive. Example: average number of counseling sessions completed per enrolled client.
- Adherence. Completion of the planned protocol end-to-end. Example: percentage of participants completing the full 12-week curriculum.
- Quality markers. Participant-side and supervision-side signals of delivery quality. Example: supervisor observation scores, participant satisfaction ratings.
These are operational measures. They look at the pipework of program delivery, not the results downstream.
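The five categories above are straightforward ratios over delivery records. As a minimal sketch, assuming a hypothetical record layout (field names like `modules_covered` and `sessions_completed` are illustrative, not a standard MEL schema):

```python
# Hypothetical delivery records; in practice these come from session logs,
# attendance rosters, and observation forms collected in real time.
sessions = [
    # one record per delivered session: modules covered vs. required
    {"modules_covered": 6, "modules_required": 6, "observer_score": 4.5},
    {"modules_covered": 5, "modules_required": 6, "observer_score": 3.8},
]
participants = [
    # one record per enrolled client, out of a planned 12-session protocol
    {"sessions_completed": 12},
    {"sessions_completed": 9},
    {"sessions_completed": 12},
]
eligible_households = 400
households_visited = 310

# Fidelity: share of sessions delivered with all required modules covered
fidelity = sum(
    s["modules_covered"] == s["modules_required"] for s in sessions
) / len(sessions)

# Coverage: share of the eligible population actually reached
coverage = households_visited / eligible_households

# Dose: average exposure per enrolled client
dose = sum(p["sessions_completed"] for p in participants) / len(participants)

# Adherence: share of participants completing the full 12-session protocol
adherence = sum(
    p["sessions_completed"] == 12 for p in participants
) / len(participants)

# Quality marker: mean supervisor observation score
quality = sum(s["observer_score"] for s in sessions) / len(sessions)

print(f"fidelity={fidelity:.0%} coverage={coverage:.0%} "
      f"dose={dose:.1f} adherence={adherence:.0%} quality={quality:.1f}")
```

The point of the sketch is that each category has its own denominator: sessions for fidelity and quality, the eligible population for coverage, enrolled participants for dose and adherence.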
Process vs Output
The clearest way to see the distinction is to pair them.
An output indicator counts what was delivered: "5 trainings held, 120 participants trained."
A process indicator describes how well delivery went: "5 trainings held with 94% curriculum fidelity, 87% of participants attended all sessions, supervisor fidelity rating 4.2 out of 5."
Process indicators build on outputs but add the quality dimension a raw count misses. A program can hit every output target while delivering those outputs badly, and only a process layer will surface it.
Design Rules
Three rules make process indicators usable:
- Tie to a documented protocol or delivery standard. A process indicator without a protocol to measure against is unmeasurable. If you cannot point to a curriculum, SOP, or fidelity checklist, you are measuring something else.
- Collect during the activity, not after. Fidelity, attendance, and dose have to be captured in real time, through session logs, observation forms, or attendance rosters. Reconstructing them later produces garbage data.
- Pair each process indicator with the output it describes. Process indicators are not standalone. They qualify an output. Design them as pairs.
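The pairing rule can be made concrete in how indicator data is stored: the process indicators live as qualifiers on the output record rather than as standalone entries. A sketch under assumed names (the `OutputIndicator` shape and qualifier labels are illustrative, not a standard MEL format):

```python
from dataclasses import dataclass, field


@dataclass
class OutputIndicator:
    """An output count plus the process qualifiers that describe it."""
    description: str
    target: int
    actual: int
    # process qualifiers attached to this output, never stored on their own
    process_qualifiers: dict[str, float] = field(default_factory=dict)


trainings = OutputIndicator(
    description="Trainings held",
    target=5,
    actual=5,
    process_qualifiers={
        "curriculum fidelity": 0.94,
        "full-attendance rate": 0.87,
        "supervisor fidelity rating (of 5)": 4.2,
    },
)


def report(ind: OutputIndicator) -> str:
    """Render the output count together with its quality layer."""
    quals = ", ".join(f"{k}: {v}" for k, v in ind.process_qualifiers.items())
    return f"{ind.description}: {ind.actual}/{ind.target} ({quals})"
```

Storing the pair together means a report can never surface the raw count without its quality layer, which is exactly the failure mode the rule guards against.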
Proposal Context
Process indicators strengthen a MEL plan by letting the program distinguish implementation failure from design failure when outcomes disappoint. If outcome indicators move less than expected, process data tells you whether the theory was wrong or whether delivery never happened the way it was supposed to. Donor reviewers increasingly look for process indicators in high-fidelity programs: evidence-based interventions in health, pedagogically specific education programs, protocol-heavy protection casework. A common proposal pitfall is listing only output counts without the fidelity qualifier, producing a MEL plan that cannot diagnose whether delivery is driving outcome results. Balance matters: 15-30% of indicators in a strong MEL plan are process indicators feeding into outcome measurement. Overloading process at the expense of outcome is the opposite error.
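The 15-30% balance rule is easy to check mechanically once each indicator in the plan carries a type label. A tiny sketch, assuming a hypothetical three-way taxonomy (real MEL plans may label types differently):

```python
# Type label per indicator in a draft MEL plan (illustrative labels).
indicators = ["output", "output", "output", "process", "process",
              "outcome", "outcome", "outcome", "outcome", "outcome"]

# Share of the indicator mix devoted to process measurement.
process_share = indicators.count("process") / len(indicators)

# The balance rule described above: roughly 15-30% process indicators.
balanced = 0.15 <= process_share <= 0.30
```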
Common Mistakes
Writing a process indicator without a documented protocol. "Percentage of sessions delivered with fidelity" means nothing if no one has written down what fidelity looks like. Protocol first, indicator second.
Treating process as a substitute for outcome. High fidelity and full attendance are not evidence the program worked. They are evidence the program ran as designed. Pair process indicators with outcome indicators; do not replace outcome measurement with process measurement.
Related Topics
- Indicator: The parent concept and core definition
- Output Indicator: What was delivered (process indicators qualify these)
- Outcome Indicator: What changed as a result of delivery
- Indicator Selection: Choosing the right indicator mix across types
- Adaptive Management: Using process data to adjust delivery in real time