
Mixed Methods Evaluation

An evaluation approach that systematically combines quantitative and qualitative data to provide a more complete understanding of programme effects, mechanisms, and context.

Also known as: mixed-methods evaluation, mixed methods research, integrated evaluation

When to Use

Mixed methods evaluation is the right approach when you need to understand not just what happened, but why and how it happened. Use it when:

  • Explaining unexpected results — quantitative data shows a programme didn't meet targets, but you need to understand why through qualitative inquiry
  • Strengthening validity — you need to triangulate findings across multiple data sources to build confidence in your conclusions
  • Understanding mechanisms — you need to map the causal pathways connecting activities to outcomes, not just measure whether outcomes occurred
  • Capturing complexity — programme effects vary across contexts or subgroups, requiring both statistical patterns and contextual explanation
  • Meeting donor requirements — major donors (USAID, FCDO, EU) explicitly require mixed-methods approaches in their evaluation standards

Mixed methods is less useful when you need rapid, low-cost assessment of simple outputs (quantitative monitoring alone may suffice) or when your evaluation questions are purely descriptive without need for causal explanation.

| Scenario | Use Mixed Methods? | Recommended Approach |
|----------|--------------------|----------------------|
| Testing whether a programme achieved its targets | Partially | Survey design alone |
| Understanding why targets were or weren't met | Yes | Mixed methods |
| Building confidence in findings through triangulation | Yes | Mixed methods |
| Exploring emergent outcomes not in the design | Partially | Outcome harvesting |
| Establishing causal attribution | Yes, as foundation | Contribution analysis or impact evaluation |

How It Works

Mixed methods evaluation follows a structured design that determines how quantitative and qualitative data are integrated. The three most common designs are:

  1. Convergent (parallel) design. Collect quantitative and qualitative data simultaneously and independently, then merge the datasets during analysis to compare and contrast findings. This design is ideal for triangulation — checking whether different data sources tell the same story. For example, survey results showing improved farmer incomes can be compared with focus group discussions about income-generating activities to see if the patterns align.

  2. Explanatory sequential design. Collect and analyse quantitative data first, then use those results to inform a qualitative phase that explains unexpected patterns or mechanisms. The quantitative phase identifies what happened (e.g., which subgroups showed the strongest outcomes), and the qualitative phase explores why (e.g., through interviews with programme staff and beneficiaries in those subgroups). This design is particularly useful when you need to explain surprising results.

  3. Exploratory sequential design. Conduct qualitative inquiry first to explore a phenomenon, then use those insights to develop quantitative measures. This design is valuable when you're evaluating a novel programme approach and need to understand key themes before developing survey instruments. For example, qualitative interviews might reveal that "community ownership" is a critical success factor, which can then be measured quantitatively in a follow-up survey.

The critical element across all designs is integration — the deliberate process of connecting quantitative and qualitative findings to produce insights that neither approach could generate alone. Integration can occur at multiple stages: during data collection (using qualitative findings to refine survey questions), during analysis (mapping qualitative themes to quantitative outcomes), or during interpretation (using qualitative evidence to explain quantitative patterns).
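To make the quantitative-to-qualitative handoff of an explanatory sequential design concrete, the sketch below shows one possible way the first phase can feed a purposive qualitative sample: site-level survey results are ranked by deviation from the overall mean, and the strongest positive and negative outliers are flagged for follow-up interviews. This is a minimal illustration, not a prescribed procedure; the column names (`site`, `outcome_score`) and the `select_followup_sites` helper are hypothetical placeholders.

```python
# Illustrative sketch of the quantitative-to-qualitative handoff in an
# explanatory sequential design. Column names and data are placeholders.
import pandas as pd

def select_followup_sites(survey: pd.DataFrame, n_sites: int = 6) -> pd.DataFrame:
    """Rank sites by deviation from the overall mean outcome and pick a
    purposive mix of low and high performers for qualitative interviews."""
    site_means = (
        survey.groupby("site", as_index=False)["outcome_score"]
        .mean()
        .rename(columns={"outcome_score": "mean_outcome"})
    )
    overall = survey["outcome_score"].mean()
    site_means["deviation"] = site_means["mean_outcome"] - overall
    site_means = site_means.sort_values("deviation")
    # Keep the strongest negative and positive deviations so the qualitative
    # phase can explore both unexpectedly weak and unexpectedly strong results.
    half = n_sites // 2
    return pd.concat([site_means.head(half), site_means.tail(n_sites - half)])

# Example usage with fabricated placeholder data:
survey = pd.DataFrame({
    "site": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "outcome_score": [62, 58, 81, 85, 40, 44, 70, 66],
})
print(select_followup_sites(survey, n_sites=4))
```

The same idea works in reverse for an exploratory sequential design: themes identified in the qualitative phase become candidate survey items rather than sampling criteria.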

Key Components

A well-constructed mixed methods evaluation includes these essential elements:

  • Explicit integration plan — a documented strategy for how quantitative and qualitative data will be connected, including specific integration points (e.g., "qualitative interviews will explore survey findings on outcome X")
  • Complementary data collection — quantitative and qualitative methods that address the same evaluation questions from different angles, with clear rationale for method selection
  • Triangulation protocol — systematic procedures for comparing findings across data sources, documenting convergence and divergence, and resolving discrepancies
  • Weighting and prioritization — explicit criteria for how conflicting findings will be resolved (e.g., giving more weight to quantitative data for prevalence questions, qualitative for mechanism questions)
  • Joint displays — visual or tabular presentations that place quantitative and qualitative findings side by side to support integrated interpretation (see the sketch after this list)
  • Meta-inferences — conclusions that explicitly draw on both data types, stating how the integration changed or strengthened understanding compared to either approach alone
  • Transparency documentation — clear reporting of the design choice, integration procedures, and any limitations in the mixed methods approach
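As a minimal illustration of a joint display, the sketch below merges quantitative results with qualitative theme summaries per evaluation question and records the team's convergence assessment alongside them. The questions, findings, and assessments are invented placeholders, and the convergence judgement is made by the evaluators, not computed.

```python
# A minimal joint-display sketch: quantitative and qualitative findings side
# by side per evaluation question, with an explicit convergence assessment.
import pandas as pd

quant = pd.DataFrame({
    "question": ["Q1: Did incomes rise?", "Q2: Did practices change?"],
    "quant_finding": ["+18% mean income (survey, n=600)", "No change in adoption rate"],
})
qual = pd.DataFrame({
    "question": ["Q1: Did incomes rise?", "Q2: Did practices change?"],
    "qual_finding": ["FGDs report new income activities", "Interviews describe improved practices"],
})

joint_display = quant.merge(qual, on="question")
# Convergence is a judgement recorded by the evaluation team, not computed.
joint_display["assessment"] = ["convergent", "divergent: requires follow-up"]
print(joint_display.to_string(index=False))
```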

Best Practices

Balance quantitative and qualitative data. Evaluation must use a mixed-method approach with a genuine balance of quantitative and qualitative data, not token qualitative components added to primarily quantitative studies. Quantitative data can come from monitoring efforts, but qualitative data must be systematically collected and analysed, not anecdotal. (MEAL Rule: EX136_S013)

Use mixed-methods approaches with explicit integration. A balanced mix of quantitative and qualitative data should be used, and the quantitative data can come from monitoring efforts. The key is that both data types are collected systematically and integrated during analysis, not just reported separately. (MEAL Rule: EX136_P028)

Leverage the strengths of each method. A mixed-methods approach is often recommended because it can draw on the advantages of both: quantitative data measures what happened, while qualitative data examines how and why it happened. Quantitative data provides breadth and generalizability; qualitative data provides depth and mechanistic explanation. (MEAL Rule: EX112_R061)

Plan integration from the start. The evaluation plan must reflect the mixed-method approach and the balance of quantitative and qualitative data, with integration planned during design rather than added as an afterthought. Define specific integration points: where will data be compared? How will conflicting findings be resolved? What joint displays will be used? (MEAL Rule: EX136_R045)
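One possible way to make the integration plan explicit and auditable is to record each integration point as structured data during design, as in this sketch. The `IntegrationPoint` fields and the example entry are illustrative assumptions, not a required format.

```python
# Sketch: documenting integration points as structured data at design time.
from dataclasses import dataclass

@dataclass
class IntegrationPoint:
    evaluation_question: str
    quantitative_source: str
    qualitative_source: str
    stage: str            # "collection", "analysis", or "interpretation"
    resolution_rule: str  # how conflicting findings will be weighted

integration_plan = [
    IntegrationPoint(
        evaluation_question="Why did outcome X differ across districts?",
        quantitative_source="Endline survey, district-level estimates",
        qualitative_source="Key informant interviews in outlier districts",
        stage="analysis",
        resolution_rule="Qualitative evidence explains, not overrides, survey estimates",
    ),
]

for point in integration_plan:
    print(f"{point.evaluation_question} -> integrate at {point.stage} stage")
```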

Use appropriate analysis methods for each data type. Identify the most appropriate type of analysis for each indicator, such as summary tables, data review, or qualitative matrices, and record it in the method of analysis. Quantitative data requires statistical analysis appropriate to the sampling design; qualitative data requires systematic coding and thematic analysis. (MEAL Rule: EX089_R012)
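As a small illustration of pairing each data type with an appropriate analysis, the sketch below builds a summary table from survey data and a respondent-by-theme matrix from coded interview segments. The districts, yields, respondents, and theme codes are placeholder values, not data from any real evaluation.

```python
# Sketch: a summary table for quantitative data and a qualitative matrix
# (respondent x theme) for systematically coded interview segments.
import pandas as pd

# Quantitative: summary table appropriate to the indicator
survey = pd.DataFrame({"district": ["N", "N", "S", "S"], "yield_kg": [410, 450, 380, 360]})
summary_table = survey.groupby("district")["yield_kg"].agg(["mean", "count"])

# Qualitative: coded segments aggregated into a theme-by-respondent matrix
coded_segments = pd.DataFrame({
    "respondent": ["R1", "R1", "R2", "R3"],
    "theme": ["peer learning", "input access", "peer learning", "peer learning"],
})
qual_matrix = pd.crosstab(coded_segments["respondent"], coded_segments["theme"])

print(summary_table)
print(qual_matrix)
```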

Follow a structured evaluation approach. The evaluation approach involves five main steps: 1) conceptualizing data needs, 2) developing data collection and management tools and processes, 3) collecting and managing data, 4) analysing and interpreting data, and 5) reporting and using findings. Each step should consider both quantitative and qualitative dimensions. (MEAL Rule: EX28_R091)

Common Mistakes

Treating mixed methods as simply adding qualitative to quantitative. The most common failure is conducting a primarily quantitative evaluation with a few focus groups tacked on at the end, without genuine integration. This is not mixed methods — it's quantitative evaluation with supplemental qualitative data. True mixed methods requires deliberate integration where both data types inform each other.

Failing to plan integration. Many evaluations collect both data types but never connect them during analysis. The quantitative results are reported in one section, qualitative findings in another, with no attempt to explain how they relate. This wastes the value of mixed methods and produces fragmented conclusions.

Tokenistic use of qualitative data. Adding a small number of interviews or focus groups without systematic collection or analysis, then dismissing qualitative findings as "anecdotal" when they conflict with quantitative results. Both data types require rigorous methods appropriate to their nature.

Not resolving conflicting findings. When quantitative and qualitative data appear to contradict, some evaluations simply report the discrepancy without attempting to explain it. Conflicting findings are often the most valuable insights — they may reveal context-specific mechanisms, measurement issues, or subgroup differences that require explanation.

Inappropriate weighting. Giving equal weight to small, non-representative qualitative samples and large, rigorous quantitative surveys when making overall conclusions. The evaluation should specify how different data types will be weighted for different types of questions (e.g., quantitative for prevalence, qualitative for mechanisms).

Examples

Agricultural Resilience — East Africa

A 5-year agricultural resilience programme in Kenya and Uganda used an explanatory sequential design to understand why some farmer groups showed stronger income improvements than others. The quantitative phase (survey of 600 farmers) identified that farmer groups with female leadership showed 40% higher income gains. The qualitative phase (focus groups and interviews with 12 farmer groups) explored mechanisms, revealing that female-led groups had stronger peer learning networks and more equitable resource distribution. The integrated finding — that female leadership operates through specific social mechanisms — informed programme adaptations to strengthen female leadership development, not just participation.

WASH — South Asia

A water and sanitation programme in Bangladesh used a convergent design to triangulate health outcomes. Quantitative data (health facility records) showed no improvement in diarrhoea rates, while qualitative data (household interviews) suggested improved hygiene practices. Triangulation revealed that while hygiene had improved, water source contamination at the household level (stored water) was the binding constraint. The mixed methods approach identified the specific intervention gap — household water treatment — that quantitative data alone could not reveal.

Governance — West Africa

A governance programme in Sierra Leone used an exploratory sequential design to develop evaluation measures for a complex advocacy intervention. Initial qualitative work (key informant interviews with 30 stakeholders) identified three informal influence pathways not captured in the programme theory. These pathways were then operationalized as quantitative indicators and measured in a follow-up survey. The mixed methods approach captured both formal advocacy outcomes and emergent informal influence mechanisms, providing a more complete picture of programme impact.

Compared To

Mixed methods evaluation is one of several approaches to data collection and analysis. The key differences:

| Feature | Mixed Methods | Quantitative-Only | Qualitative-Only |
|---------|---------------|-------------------|------------------|
| Primary strength | Breadth + depth; triangulation | Generalizability; statistical power | Rich contextual understanding |
| Causal explanation | Strong (mechanisms + patterns) | Moderate (patterns only) | Strong (mechanisms only) |
| Resource intensity | High | Medium | Medium |
| Time required | 3-8 weeks | 2-4 weeks | 2-4 weeks |
| Best for | Complex programmes; mechanism questions | Large-scale outcome measurement | Exploratory; emergent outcomes |
| Triangulation | Built-in | Limited | Limited |

Relevant Indicators

23 indicators across major donor frameworks (USAID, DFID/FCDO, EU, Global Fund) relate to mixed methods evaluation design and use. Examples include:

  • Evaluation methodology — "Proportion of evaluations using mixed-methods approaches with balanced quantitative and qualitative data" (USAID)
  • Triangulation — "Number of evaluation findings triangulated across multiple data sources" (FCDO)
  • Data integration — "Degree to which quantitative and qualitative findings are interpreted together in analysis" (EU)
  • Question coverage — "Percentage of evaluation questions addressed through multiple data sources" (Global Fund)
