Definition
A confounding variable (or confounder) is an extraneous factor that influences both exposure to the intervention being evaluated and the outcome of interest, creating a spurious association that can lead to incorrect causal conclusions. Confounders threaten the internal validity of an evaluation by making it appear that the intervention caused an outcome when the observed effect may in fact be due to the confounding variable.
For example, in evaluating a job training programme's impact on employment, socioeconomic status could be a confounder: individuals from higher socioeconomic backgrounds may be more likely to enroll in the programme AND more likely to find employment regardless of training. Without accounting for this confounder, the evaluation would overestimate the programme's true impact.
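The SES example can be made concrete with a small simulation (all numbers here are hypothetical, chosen only to illustrate the mechanism): SES drives both enrolment and employment, the programme has no true effect at all, yet a naive comparison of participants and non-participants shows a sizeable "impact" that largely disappears once we stratify by the confounder.

```python
import random
from statistics import mean

random.seed(0)
n = 100_000

# Simulated world where the training programme has NO true effect:
# socioeconomic status (SES) drives both enrolment and employment.
ses = [random.random() < 0.5 for _ in range(n)]                 # True = higher SES
enrolled = [random.random() < (0.7 if s else 0.3) for s in ses]  # higher SES -> enrols more
employed = [random.random() < (0.8 if s else 0.4) for s in ses]  # higher SES -> employed more

def rate(outcome, group):
    """Employment rate among individuals for whom `group` is True."""
    return mean([o for o, g in zip(outcome, group) if g])

# Naive comparison: enrolled vs not enrolled -- absorbs the SES difference.
naive = rate(employed, enrolled) - rate(employed, [not e for e in enrolled])

# Stratify by SES and average the within-stratum differences.
diffs = []
for level in (True, False):
    treated = [e and (s == level) for e, s in zip(enrolled, ses)]
    control = [(not e) and (s == level) for e, s in zip(enrolled, ses)]
    diffs.append(rate(employed, treated) - rate(employed, control))
adjusted = mean(diffs)

print(f"naive estimate:    {naive:+.3f}")     # spuriously positive
print(f"adjusted estimate: {adjusted:+.3f}")  # near the true effect of zero
```

With these parameters the naive estimate lands around +0.16 while the stratified estimate is close to zero, which is exactly the overestimation the paragraph above describes.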
Identifying and controlling for confounders is essential for credible causal inference and accurate attribution of outcomes to interventions rather than to other factors.
Why It Matters
Confounding variables are the primary obstacle to establishing causal claims in M&E. Without addressing confounders, evaluations risk:
- Overestimating impact — attributing outcomes to the intervention that were actually caused by pre-existing differences between participants and non-participants
- Underestimating impact — masking a real effect because a confounder worked in the opposite direction
- Drawing incorrect conclusions — leading to decisions about scaling, modifying, or terminating programmes based on flawed evidence
This is why quasi-experimental designs and impact evaluations dedicate substantial attention to confounder identification and control. The threat of confounding is what distinguishes rigorous causal analysis from simple before-after comparisons or participant-only outcome reporting.
Understanding confounders is also critical for interpreting any evaluation that claims causal effects. When reading an impact evaluation, the first question should be: "What confounders did the evaluators consider, and how did they control for them?"
In Practice
Confounders appear in programmes across sectors. Common examples include:
- Health interventions: Age, baseline health status, and access to healthcare confound the relationship between a nutrition programme and child health outcomes
- Education programmes: Prior academic achievement and parental education confound the relationship between tutoring and test scores
- Economic development: Market access and infrastructure quality confound the relationship between business training and revenue growth
Addressing confounders requires either design-based or analysis-based strategies:
- Randomized designs eliminate confounding in expectation through random assignment (though attrition and non-compliance can reintroduce it)
- Quasi-experimental designs use techniques like propensity score matching, regression discontinuity, or difference-in-differences to approximate randomization
- Statistical controls include regression adjustment, stratification, or matching on observed confounders
- Sensitivity analysis assesses how robust findings are to unobserved confounders
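One of the listed techniques, difference-in-differences, can be sketched in a few lines of arithmetic (the employment rates below are hypothetical): the treated and comparison groups start at different levels, so both an after-only comparison and a simple before-after comparison mislead, while differencing out each group's own trend recovers the effect under the parallel-trends assumption.

```python
# Hypothetical before/after employment rates for a treated district and a
# comparison district. The groups differ at baseline (a confounded level),
# but share a common background trend by assumption.
before = {"treated": 0.40, "comparison": 0.55}
after  = {"treated": 0.58, "comparison": 0.63}

naive_after_only = after["treated"] - after["comparison"]    # negative: misleading
simple_before_after = after["treated"] - before["treated"]   # +0.18: includes the trend

# Net out the comparison group's trend (+0.08) from the treated group's change.
did = ((after["treated"] - before["treated"])
       - (after["comparison"] - before["comparison"]))
print(f"difference-in-differences estimate: {did:+.2f}")     # +0.10
```

The same subtraction generalises to regression form with group, period, and interaction terms; the interaction coefficient is the difference-in-differences estimate.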
The key is to identify potential confounders during evaluation design (through theory and context analysis) and select appropriate control strategies before data collection begins.
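For the sensitivity-analysis step mentioned above, one widely used summary is the E-value (VanderWeele and Ding): the minimum strength of association, on the risk-ratio scale, that an unmeasured confounder would need with both the intervention and the outcome to fully explain away an observed effect. A minimal sketch:

```python
import math

def e_value(rr: float) -> float:
    """E-value for an observed risk ratio rr >= 1: the minimum risk-ratio
    association an unmeasured confounder would need with BOTH intervention
    and outcome to fully account for the observed effect."""
    return rr + math.sqrt(rr * (rr - 1.0))

# An observed risk ratio of 2.0 would require a confounder associated with
# both intervention and outcome at a risk ratio of about 3.41 to explain away.
print(round(e_value(2.0), 2))  # 3.41
```

A large E-value suggests findings are robust to plausible unobserved confounders; a value barely above the observed risk ratio suggests fragility.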
Related Topics
- Bias — broader category of systematic errors including confounding
- Causal inference — the framework for establishing cause-effect relationships
- Selection bias — systematic error arising from non-random assignment; a common source of confounding
- Quasi-experimental design — methods for controlling confounders without randomization
- Impact evaluation — evaluations specifically designed to establish causal effects
- Counterfactual — the comparison needed to isolate intervention effects from confounders
- Attribution vs Contribution — distinguishing causal claims from contribution stories
Further Reading
- Causal Inference in Statistics: A Primer — Pearl, Glymour, and Jewell. Accessible introduction to confounding and causal reasoning.
- Designing Quasi-Experimental Impact Evaluations — International Initiative for Impact Evaluation (3ie). Practical guidance on controlling confounders.
- What Is a Confounder? — BMJ. Concise clinical epidemiology explanation with examples.
Data References (populated during production)
- Indicators: 8 indicators across 3 donor frameworks relate to confounding control in evaluation design
- MEAL Rules: Best practices from EX132_F3_R015, EX45_R023, EX109_R018; Common mistakes from EX132_F2_R008, EX59_R012, EX109_W015
Last updated: 2026-02-27