M&E Studio
© 2026 Logic Lab LLC. All rights reserved.

Core Concept · Evaluation · 10 min read

Mixed Methods Evaluation

An evaluation approach that systematically combines quantitative and qualitative data to provide a more complete understanding of programme effects, mechanisms, and context.

When to Use

Mixed methods evaluation is the right approach when you need to understand not just what happened, but why and how it happened. Use it when:

  • Explaining unexpected results: quantitative data shows a programme didn't meet targets, but you need to understand why through qualitative inquiry
  • Strengthening validity: you need to triangulate findings across multiple data sources to build confidence in your conclusions
  • Understanding mechanisms: you need to map the causal pathways connecting activities to outcomes, not just measure whether outcomes occurred
  • Capturing complexity: programme effects vary across contexts or subgroups, requiring both statistical patterns and contextual explanation
  • Meeting donor requirements: major donors (USAID, FCDO, EU) explicitly require mixed-methods approaches in their evaluation standards

Mixed methods is less useful when you need rapid, low-cost assessment of simple outputs (quantitative monitoring alone may suffice) or when your evaluation questions are purely descriptive without need for causal explanation.

Scenario                                              | Use Mixed Methods? | Better Alternative
------------------------------------------------------|--------------------|-------------------------------------------
Testing whether a programme achieved its targets      | Partially          | Survey design alone
Understanding why targets were or weren't met         | Yes                | Mixed methods
Building confidence in findings through triangulation | Yes                | Mixed methods
Exploring emergent outcomes not in the design         | Partially          | Outcome harvesting
Establishing causal attribution                       | Yes, as foundation | Contribution analysis or impact evaluation

How It Works

Mixed methods evaluation follows a structured design that determines how quantitative and qualitative data are integrated. The three most common designs are:

  1. Convergent (parallel) design. Collect quantitative and qualitative data simultaneously and independently, then merge the datasets during analysis to compare and contrast findings. This design is ideal for triangulation: checking whether different data sources tell the same story. For example, survey results showing improved farmer incomes can be compared with focus group discussions about income-generating activities to see if the patterns align.

  2. Explanatory sequential design. Collect and analyse quantitative data first, then use those results to inform a qualitative phase that explains unexpected patterns or mechanisms. The quantitative phase identifies what happened (e.g., which subgroups showed the strongest outcomes), and the qualitative phase explores why (e.g., through interviews with programme staff and beneficiaries in those subgroups). This design is particularly useful when you need to explain surprising results.

  3. Exploratory sequential design. Conduct qualitative inquiry first to explore a phenomenon, then use those insights to develop quantitative measures. This design is valuable when you're evaluating a novel programme approach and need to understand key themes before developing survey instruments. For example, qualitative interviews might reveal that "community ownership" is a critical success factor, which can then be measured quantitatively in a follow-up survey.

The critical element across all designs is integration: the deliberate process of connecting quantitative and qualitative findings to produce insights that neither approach could generate alone. Integration can occur at multiple stages: during data collection (using qualitative findings to refine survey questions), during analysis (mapping qualitative themes to quantitative outcomes), or during interpretation (using qualitative evidence to explain quantitative patterns).
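The convergent logic described above, comparing what each data stream says about the same evaluation question and treating divergence as a finding in its own right, can be sketched as a small script. All programme findings and question names here are hypothetical illustrations, not data from any real evaluation.

```python
# Sketch of convergent-design triangulation: for each shared evaluation
# question, compare the direction of the quantitative finding with the
# dominant qualitative theme and flag convergence or divergence.
# All findings below are hypothetical.

quant_findings = {
    "farmer_income": "improved",      # e.g., from survey point estimates
    "hygiene_practice": "no_change",  # e.g., from facility records
}

qual_findings = {
    "farmer_income": "improved",      # dominant theme from focus groups
    "hygiene_practice": "improved",   # dominant theme from interviews
}

def triangulate(quant, qual):
    """Return a convergence verdict for each shared evaluation question."""
    verdicts = {}
    for question in quant.keys() & qual.keys():
        if quant[question] == qual[question]:
            verdicts[question] = "convergent"
        else:
            # Divergence is a finding in itself: investigate mechanisms,
            # measurement issues, or subgroup differences.
            verdicts[question] = "divergent"
    return verdicts

verdicts = triangulate(quant_findings, qual_findings)
print(verdicts)  # farmer_income is convergent; hygiene_practice is divergent
```

A divergent verdict, as in the hygiene example, is the cue to open a qualitative inquiry into why the sources disagree rather than discarding either one.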

Key Components

A well-constructed mixed methods evaluation includes these essential elements:

  • Explicit integration plan: a documented strategy for how quantitative and qualitative data will be connected, including specific integration points (e.g., "qualitative interviews will explore survey findings on outcome X")
  • Complementary data collection: quantitative and qualitative methods that address the same evaluation questions from different angles, with clear rationale for method selection
  • Triangulation protocol: systematic procedures for comparing findings across data sources, documenting convergence and divergence, and resolving discrepancies
  • Weighting and prioritization: explicit criteria for how conflicting findings will be resolved (e.g., giving more weight to quantitative data for prevalence questions, qualitative for mechanism questions)
  • Joint displays: visual or tabular presentations that place quantitative and qualitative findings side by side to facilitate integrated interpretation
  • Meta-inferences: conclusions that explicitly draw on both data types, stating how the integration changed or strengthened understanding compared to either approach alone
  • Transparency documentation: clear reporting of the design choice, integration procedures, and any limitations in the mixed methods approach
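A joint display, one of the components listed above, can be as simple as a table pairing each evaluation question with its quantitative result and qualitative evidence. The sketch below uses hypothetical findings loosely echoing the examples in this article; the row contents are illustrative only.

```python
# Minimal joint display: pair quantitative results with qualitative
# evidence for each evaluation question in one side-by-side table.
# All row contents are hypothetical.

joint_rows = [
    {"question": "Did incomes rise?",
     "quant": "+18% mean income (survey, n=600)",
     "qual": "FGDs describe new market access via group sales"},
    {"question": "Why did female-led groups do better?",
     "quant": "40% higher gains in female-led groups",
     "qual": "Interviews cite stronger peer learning networks"},
]

def render_joint_display(rows):
    """Format rows as a plain-text joint display table."""
    header = f"{'Question':<38}| {'Quantitative':<38}| Qualitative"
    lines = [header, "-" * len(header)]
    for r in rows:
        lines.append(f"{r['question']:<38}| {r['quant']:<38}| {r['qual']}")
    return "\n".join(lines)

print(render_joint_display(joint_rows))
```

In practice the same structure is usually built in a spreadsheet or report table; the point is that each row forces an explicit pairing of the two evidence streams for one question.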

Best Practices

Balance quantitative and qualitative data. A genuine mixed methods evaluation balances the two data types; token qualitative components bolted onto a primarily quantitative study do not qualify. Quantitative data can come from routine monitoring, but qualitative data must be systematically collected and analysed, not treated as anecdote.

Integrate explicitly. Collecting both data types is only the starting point: they must be brought together during analysis, not merely reported in separate sections. State in the evaluation design where and how this integration will happen.

Leverage the strengths of each method. A mixed-methods approach draws on the advantages of both: quantitative data measures what happened, while qualitative data examines how and why it happened. Quantitative data provides breadth and generalizability; qualitative data provides depth and mechanistic explanation.

Plan integration from the start. The evaluation plan should specify the mixed-method design and the intended balance of quantitative and qualitative data, with integration built in at the design stage, not added as an afterthought. Define specific integration points: where will data be compared? How will conflicting findings be resolved? What joint displays will be used?

Use appropriate analysis methods for each data type. Identify the most appropriate type of analysis for each indicator, such as summary tables, data review, or qualitative matrices. Quantitative data requires statistical analysis appropriate to the sampling design; qualitative data requires systematic coding and thematic analysis.
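One workhorse qualitative analysis tool mentioned above is the qualitative matrix: a cross-tabulation of coded themes by respondent group, showing which themes recur where. A minimal sketch, with hypothetical codes and groups:

```python
# Qualitative matrix sketch: cross-tabulate coded interview segments
# (respondent group, theme) to see which themes recur in which groups.
# Codes and groups are hypothetical.
from collections import Counter

coded_segments = [
    ("farmers", "peer_learning"), ("farmers", "market_access"),
    ("staff", "peer_learning"), ("farmers", "peer_learning"),
    ("staff", "resource_sharing"), ("local_leaders", "market_access"),
]

# (group, theme) -> number of coded segments
matrix = Counter(coded_segments)

groups = sorted({g for g, _ in coded_segments})
themes = sorted({t for _, t in coded_segments})
for g in groups:
    row = {t: matrix[(g, t)] for t in themes}
    print(g, row)
```

Dedicated qualitative analysis software builds the same matrix at scale; the counts then sit naturally alongside survey statistics in a joint display.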

Follow a structured evaluation approach. The evaluation approach involves five main steps: 1) conceptualizing data needs, 2) developing data collection and management tools and processes, 3) collecting and managing data, 4) analysing and interpreting data, and 5) reporting and using findings. Each step should consider both quantitative and qualitative dimensions.

Common Mistakes

Treating mixed methods as simply adding qualitative to quantitative. The most common failure is conducting a primarily quantitative evaluation with a few focus groups tacked on at the end, without genuine integration. That is not mixed methods; it is a quantitative evaluation with supplemental qualitative data. True mixed methods requires deliberate integration in which both data types inform each other.

Failing to plan integration. Many evaluations collect both data types but never connect them during analysis. The quantitative results are reported in one section, qualitative findings in another, with no attempt to explain how they relate. This wastes the value of mixed methods and produces fragmented conclusions.

Using qualitative data as tokenism. Adding a small number of interviews or focus groups without systematic collection or analysis, then dismissing qualitative findings as "anecdotal" when they conflict with quantitative results. Both data types require rigorous methods appropriate to their nature.

Not resolving conflicting findings. When quantitative and qualitative data appear to contradict each other, some evaluations simply report the discrepancy without attempting to explain it. Conflicting findings are often the most valuable insights: they may reveal context-specific mechanisms, measurement issues, or subgroup differences that require explanation.

Inappropriate weighting. Giving equal weight to small, non-representative qualitative samples and large, rigorous quantitative surveys when making overall conclusions. The evaluation should specify how different data types will be weighted for different types of questions (e.g., quantitative for prevalence, qualitative for mechanisms).
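The weighting problem above is easiest to avoid when the rules are agreed before analysis begins. A minimal sketch of such a rule table, with hypothetical question types, assigning primacy to one evidence stream per type so conflicts are resolved by pre-agreed criteria rather than ad hoc judgement:

```python
# Sketch of explicit weighting rules: decide in advance which evidence
# stream leads for each type of evaluation question. The rule table and
# question types are hypothetical.

WEIGHTING_RULES = {
    "prevalence": "quantitative",   # how widespread is the outcome?
    "mechanism": "qualitative",     # how and why did it occur?
    "attribution": "integrated",    # requires both data types together
}

def primary_source(question_type):
    """Return which evidence stream leads for a given question type."""
    # Unknown question types default to integrated interpretation.
    return WEIGHTING_RULES.get(question_type, "integrated")

assert primary_source("prevalence") == "quantitative"
assert primary_source("mechanism") == "qualitative"
```

The specific rules will differ by evaluation; what matters is that they are documented in the evaluation plan before the data arrive.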

Examples

Agricultural Resilience, East Africa

A 5-year agricultural resilience programme in Kenya and Uganda used an explanatory sequential design to understand why some farmer groups showed stronger income improvements than others. The quantitative phase (survey of 600 farmers) identified that farmer groups with female leadership showed 40% higher income gains. The qualitative phase (focus groups and interviews with 12 farmer groups) explored mechanisms, revealing that female-led groups had stronger peer learning networks and more equitable resource distribution. The integrated finding (that female leadership operates through specific social mechanisms) informed programme adaptations to strengthen female leadership development, not just participation.

WASH, South Asia

A water and sanitation programme in Bangladesh used a convergent design to triangulate health outcomes. Quantitative data (health facility records) showed no improvement in diarrhoea rates, while qualitative data (household interviews) suggested improved hygiene practices. Triangulation revealed that although hygiene had improved, water source contamination at the household level (stored water) was the binding constraint. The mixed methods approach identified the specific intervention gap (household water treatment) that quantitative data alone could not reveal.

Governance, West Africa

A governance programme in Sierra Leone used an exploratory sequential design to develop evaluation measures for a complex advocacy intervention. Initial qualitative work (key informant interviews with 30 stakeholders) identified three informal influence pathways not captured in the programme theory. These pathways were then operationalized as quantitative indicators and measured in a follow-up survey. The mixed methods approach captured both formal advocacy outcomes and emergent informal influence mechanisms, providing a more complete picture of programme impact.

Compared To

Mixed methods evaluation is one of several approaches to data collection and analysis. The key differences:

Feature            | Mixed Methods                            | Quantitative-Only                   | Qualitative-Only
-------------------|------------------------------------------|-------------------------------------|-------------------------------
Primary strength   | Breadth + depth; triangulation           | Generalizability; statistical power | Rich contextual understanding
Causal explanation | Strong (mechanisms + patterns)           | Moderate (patterns only)            | Strong (mechanisms only)
Resource intensity | High                                     | Medium                              | Medium
Time required      | 3-8 weeks                                | 2-4 weeks                           | 2-4 weeks
Best for           | Complex programmes; mechanism questions  | Large-scale outcome measurement     | Exploratory; emergent outcomes
Triangulation      | Built-in                                 | Limited                             | Limited

Relevant Indicators

23 indicators across 5 major donor frameworks (USAID, DFID, FCDO, EU, Global Fund) relate to mixed methods evaluation design and use:

  • Evaluation methodology: "Proportion of evaluations using mixed-methods approaches with balanced quantitative and qualitative data" (USAID)
  • Triangulation: "Number of evaluation findings triangulated across multiple data sources" (FCDO)
  • Data integration: "Degree to which quantitative and qualitative findings are interpreted together in analysis" (EU)
  • Question coverage: "Percentage of evaluation questions addressed through multiple data sources" (Global Fund)

Related Tools

  • Data Visualization for M&E: essential for creating joint displays that integrate quantitative and qualitative findings
  • Sampling Methods: mixed methods often requires multi-stage sampling strategies to support both quantitative and qualitative phases
  • Survey Design: the quantitative component of mixed methods evaluations
  • Focus Group Discussions: a common qualitative method in mixed methods designs
  • Key Informant Interviews: a common qualitative method for mechanism exploration

Related Topics

  • Impact Evaluation: mixed methods often serves as the foundation for rigorous impact attribution
  • Contribution Analysis: mixed methods provides the evidence base for contribution claims
  • Data Quality Assurance: critical for both quantitative and qualitative data in mixed methods
  • Triangulation: the core process of comparing findings across data sources
  • Quantitative Data and Qualitative Data: the two data types integrated in mixed methods
  • Thematic Analysis: a core qualitative analysis method in mixed methods evaluations

Further Reading

  • Creswell, J.W. & Plano Clark, V.L. (2017). Designing and Conducting Mixed Methods Research. The definitive textbook on mixed methods design and integration.
  • BetterEvaluation: Mixed Methods. A comprehensive resource on mixed methods approaches in evaluation.
  • USAID Evaluation Policy (2019). Requires mixed-methods approaches in major evaluations.
  • FCDO Evaluation Standards (2020). Explicitly requires mixed-methods approaches with balanced quantitative and qualitative data.
  • IOM Mixed Methods Evaluation Toolkit. Practical guidance for implementing mixed methods in development contexts.

At a Glance

Combines quantitative and qualitative data to answer what happened and why it happened

Best For

  • Understanding programme mechanisms and causal pathways
  • Triangulating findings to strengthen validity and credibility
  • Explaining unexpected quantitative results through qualitative inquiry
  • Capturing both breadth and depth of programme effects

Complexity

Medium

Timeframe

3-8 weeks depending on design complexity

Linked Indicators

23 indicators across 5 donor frameworks

USAID · DFID · FCDO · EU · Global Fund

Examples

  • Proportion of evaluations using mixed-methods approaches with balanced quantitative and qualitative data
  • Number of evaluation findings triangulated across multiple data sources
  • Degree to which quantitative and qualitative findings are interpreted together in analysis

Related Topics

Pillar
Impact Evaluation
A rigorous evaluation approach that measures the causal effect of a programme on outcomes by comparing what happened with what would have happened in its absence.
Pillar
Contribution Analysis
A structured approach to building a credible case for how and why a programme contributed to observed outcomes, without requiring experimental attribution.
Core Concept
Data Quality Assurance
A systematic process for verifying that collected data meets five quality dimensions (Validity, Integrity, Precision, Reliability, and Timeliness), ensuring data is fit for decision-making.
Core Concept
Survey Design
The process of designing structured questionnaires and survey protocols to collect reliable, valid, and actionable data from a defined population.
Core Concept
Focus Group Discussions
A qualitative data collection method that brings together 6-10 participants to discuss a specific topic, generating rich insights through group interaction and shared experiences.
Core Concept
Key Informant Interviews
In-depth, semi-structured interviews with individuals selected for their specific knowledge, experience, or perspectives relevant to the evaluation questions.
Term
Thematic Analysis
A systematic method for identifying, analyzing, and reporting patterns (themes) in qualitative data through coding and categorization.