M&E Studio

AI for M&E, Built for Practitioners

© 2026 Logic Lab LLC. All rights reserved.


Qualitative vs Quantitative vs Mixed Methods

Qualitative, quantitative, and mixed methods are not a quality ranking. They answer different questions. Here's when to use each, how to combine them, and what integration actually looks like.

At a Glance

| | Quantitative | Qualitative | Mixed Methods |
| --- | --- | --- | --- |
| Answers | How much? How many? How widespread? | Why? How? What is the experience? | Both breadth and depth |
| Data type | Numbers, statistics, surveys | Words, narratives, observations | Both |
| Sample | Large, representative (statistical) | Small, purposive (depth) | Both types |
| Analysis | Statistical (frequencies, regression, tests) | Thematic, narrative, content analysis | Integrated |
| Strength | Generalizability, precision, comparability | Context, nuance, explanation | Completeness |
| Weakness | Cannot explain why; misses context | Cannot generalize; smaller reach | More expensive, harder to integrate |
| Cost | Medium-high (large sample) | Medium (skilled researchers needed) | High (both) |

Matching Method to Question

The choice between qualitative and quantitative is not about quality or rigor. It is about matching your method to what you need to know. If you are unsure which approach fits, the Method Selector can help narrow it down.

Use quantitative methods when you need to:

  • Measure the scale or prevalence of something ("What percentage of households have access to clean water?")
  • Compare groups statistically ("Did the treatment group improve more than the comparison group?")
  • Track changes over time with numerical indicators ("Did attendance increase from baseline to endline?")
  • Generalize findings to a larger population
  • Satisfy donor reporting requirements for numerical indicators

Quantitative methods are strong on precision and comparability. They tell you what happened and how much. They do not tell you why. A well-constructed survey design can capture the "what," but explaining the "why" behind the numbers requires qualitative depth. If your evaluation is primarily about reporting against numerical targets, quantitative methods are usually sufficient on their own.
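As a concrete sketch of the "compare groups statistically" case, a two-sample test of proportions needs nothing beyond the standard library. The counts below are hypothetical, and in practice you would use a statistics package (R, Stata, SPSS) rather than hand-rolling the test:

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sample z-test for a difference in proportions (normal approximation)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # pooled standard error
    return (p_a - p_b) / se

# Hypothetical endline data: 130/200 treatment vs 95/200 comparison households
z = two_proportion_z(130, 200, 95, 200)
print(round(z, 2))  # prints 3.53; |z| above ~1.96 suggests a difference at the 5% level
```

Note what the test does and does not tell you: it supports "the treatment group improved more," but says nothing about why.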

Use qualitative methods when you need to:

  • Understand why something is or is not working ("Why are farmers not adopting the new techniques?")
  • Explore experiences, perceptions, and motivations ("How do beneficiaries perceive the program?")
  • Investigate complex processes or relationships ("How does the community health worker system function in practice?")
  • Capture unintended consequences that surveys would not detect
  • Understand context that shapes program outcomes

Qualitative methods give you depth and nuance. They surface patterns that structured instruments miss entirely. Techniques like thematic analysis and focus group discussions let you understand how people experience a program in their own words. The tradeoff: smaller samples, no statistical generalizability. If your evaluation question is about understanding process, perception, or context, qualitative methods are the right starting point.
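Systematic qualitative analysis starts with applying a codebook and tallying how often each code appears across excerpts. A minimal sketch, assuming a hypothetical codebook and interview IDs (tallying is only a first step toward themes, not a substitute for interpreting what participants actually said):

```python
from collections import Counter

# Hypothetical coded excerpts: each interview segment tagged with one or more codes
coded_excerpts = [
    {"id": "INT01", "codes": ["input_cost", "market_access"]},
    {"id": "INT02", "codes": ["input_cost"]},
    {"id": "INT03", "codes": ["labour_time", "input_cost"]},
    {"id": "INT04", "codes": ["market_access"]},
]

# Count how many excerpts each code appears in
tally = Counter(code for ex in coded_excerpts for code in ex["codes"])
for code, n in tally.most_common():
    print(f"{code}: {n}")
```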

Use mixed methods when you need to:

  • Answer both "how much" and "why" questions
  • Explain quantitative findings with qualitative depth ("The survey shows 60% adoption; interviews reveal the 40% who did not adopt lacked market access")
  • Explore a topic qualitatively before designing a survey (exploratory sequential)
  • Validate findings from one method with another (triangulation)
  • Understand how a program works for different subgroups

For guidance on choosing between specific qualitative instruments, see Surveys vs Interviews vs Focus Groups.

Mixed Methods Integration Designs

Most M&E practitioners who say they "use mixed methods" actually use multi-method approaches: they collect both quantitative and qualitative data but analyze and present them separately. True mixed methods involves deliberate integration.

Sequential Explanatory (Quant first, then Qual)

  1. Conduct your survey and analyze quantitative results
  2. Use the quantitative findings to identify areas needing deeper understanding
  3. Design qualitative data collection to explain the "why" behind the numbers
  4. Integrate findings in analysis and reporting

Example: A household survey shows that food security improved in coastal villages but not inland villages. Follow-up interviews and focus groups in both areas reveal that inland villages lost access to a key market due to road flooding.

Best for: Final evaluations, endline studies, answering "why did this work/not work?"
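Step 2 above, using quantitative findings to identify where qualitative follow-up is needed, can be as simple as flagging the weakest-performing sites. A sketch with hypothetical village-level changes in a food-security score:

```python
# Hypothetical survey summary: change in a food-security score by village
village_change = {
    "Coastal A": +0.42, "Coastal B": +0.38,
    "Inland A": -0.05, "Inland B": +0.02,
}

# Flag sites below a chosen improvement threshold for interviews and focus groups
threshold = 0.10
follow_up = sorted(v for v, delta in village_change.items() if delta < threshold)
print(follow_up)  # the sites where qualitative work should probe the "why"
```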

Sequential Exploratory (Qual first, then Quant)

  1. Conduct qualitative research to explore the topic
  2. Use the qualitative findings to design a survey instrument
  3. Conduct the survey to measure prevalence and scale
  4. Integrate findings in analysis and reporting

Example: Interviews with teachers reveal five distinct ways they use new curriculum materials. A subsequent survey measures how prevalent each usage pattern is across all project schools.

Best for: Baseline studies, needs assessments, designing surveys for unfamiliar contexts

Concurrent Triangulation (Both at the same time)

  1. Collect quantitative and qualitative data simultaneously
  2. Analyze each dataset independently
  3. Compare and contrast findings
  4. Identify convergence (both agree), complementarity (each adds different insight), or divergence (they disagree, requiring further investigation)

Example: While enumerators conduct household surveys, another team runs focus groups in the same communities. Survey data shows 75% of women report decision-making power over household income. Focus group discussions reveal that "decision-making power" means different things in different contexts, and some women report influence but not authority.

Best for: Mid-term reviews, rapid assessments, when time is limited

Embedded Design (Qual nested within Quant)

Qualitative data collection embedded within a larger quantitative study to provide depth on specific components.

Example: An RCT evaluating a nutrition program embeds case studies with 20 households (10 treatment, 10 control) to understand the mechanisms through which the program affects dietary diversity.

Best for: Impact evaluations, large-scale studies where context matters

How to Actually Integrate Findings

Integration is where most mixed methods studies fail. Avoid the "two chapters" problem: a quantitative results chapter followed by a qualitative results chapter with no connection between them.

When you skip integration, you get two separate stories that never speak to each other. Stakeholders read the quantitative section and the qualitative section, notice contradictions, and lose confidence in both. Worse, you miss the insight that only emerges when the two data streams interact. A survey might show that training attendance was high. Interviews might reveal that attendees found the content irrelevant. Without integration, neither finding is actionable. Together, they explain the gap between activity completion and outcome achievement.

Joint display tables are the most practical integration technique. For each finding or evaluation question, show quantitative evidence and qualitative evidence side by side, then state the integrated conclusion.

| Evaluation Question | Quantitative Finding | Qualitative Finding | Integrated Conclusion |
| --- | --- | --- | --- |
| Did adoption increase? | Adoption rate rose from 25% to 62% | Farmers report techniques are "easy to learn but hard to maintain without inputs" | Adoption increased substantially, but sustainability depends on input supply chains |

Build one row for each evaluation question. Force yourself to write the "Integrated Conclusion" column. If you cannot, your data collection did not actually address the same questions from both angles, and you have a multi-method study, not a mixed methods study.
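The "force yourself to write the Integrated Conclusion column" rule can even be checked mechanically. A sketch, assuming the joint display is stored as a list of dicts (the field names here are hypothetical):

```python
# Hypothetical joint display: one row per evaluation question
rows = [
    {
        "question": "Did adoption increase?",
        "quant": "Adoption rose from 25% to 62%",
        "qual": "Techniques 'easy to learn, hard to maintain without inputs'",
        "integrated": "Adoption up substantially; sustainability depends on inputs",
    },
]

def check_joint_display(rows):
    """Return the questions whose 'integrated' cell is empty.
    Any hits signal a multi-method study, not a mixed methods one."""
    return [r["question"] for r in rows if not r.get("integrated", "").strip()]

print(check_joint_display(rows))  # an empty list means every row is integrated
```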

When NOT to Use Mixed Methods

Mixed methods is not always the right choice. Do not default to it because it sounds more thorough.

Do not use mixed methods if you cannot afford to do both well. Running a strong survey plus a strong qualitative study costs significantly more than either alone. If your budget forces you to cut corners on one component, the weak component drags down the credibility of the entire study. A rigorous quantitative-only or qualitative-only design is better than a mixed methods design where one side is underpowered.

Do not use mixed methods if your team lacks qualitative capacity. Qualitative research requires trained analysts who can code systematically, manage large text datasets, and interpret patterns. If no one on your team has done qualitative analysis before, adding interview transcripts to a survey-based evaluation will produce anecdotes, not findings. Either invest in training, hire a qualitative specialist, or stick with quantitative methods you can execute well.

Do not use mixed methods if you have no plan for integration. Collecting both types of data without a strategy for combining them wastes resources. Before you begin, decide which integration design you will use, how you will structure your analysis, and what your joint display will look like. If you cannot answer those questions at the design stage, you are not ready for mixed methods.

A Note on Analysis Tools

Quantitative analysis typically uses Excel, SPSS, Stata, or R. These tools handle frequencies, cross-tabulations, regression, and statistical testing. Most M&E teams already have some quantitative capacity.

Qualitative analysis requires a different skill set. Systematic coding, theme development, and pattern interpretation take training and practice. Software like NVivo, Dedoose, or Atlas.ti supports the process but does not replace analytical skill. If your team does not have qualitative analysis experience, plan for training or outsource the analysis to a specialist. Skipping systematic analysis and jumping straight to "pulling quotes" is the most common way qualitative data gets wasted in M&E evaluations.

Common Mistakes

Mistake 1: Treating qualitative as "less rigorous." A well-designed qualitative study with purposive sampling, systematic coding, and triangulation is rigorous. A poorly designed survey with leading questions is not. Rigor is about design quality, not method type.

Mistake 2: Using qualitative methods only for "illustrative quotes." If you collect 40 interviews and use them only to pull a few quotes to decorate your quantitative report, you wasted time and money. Qualitative data should be systematically analyzed (thematic analysis, content analysis) and should contribute to findings in its own right.

Mistake 3: Collecting both types but not integrating. If your report has a "quantitative results" section and a "qualitative results" section with no connection between them, you have a multi-method study, not a mixed methods study. Integration must be deliberate.

Mistake 4: Using quantitative sample sizes for qualitative work. You do not need 400 interviews. Qualitative research follows saturation logic: collect data until you stop hearing new themes. This typically occurs at 12-20 interviews for a homogeneous group.
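Saturation logic can be made concrete: track the cumulative set of themes and stop once several consecutive interviews add nothing new. A sketch with hypothetical theme lists and an arbitrary three-interview window:

```python
def interviews_to_saturation(theme_lists, window=3):
    """Return the interview number at which `window` consecutive interviews
    have added no new theme -- one simple operationalisation of saturation."""
    seen, quiet = set(), 0
    for i, themes in enumerate(theme_lists, start=1):
        new = set(themes) - seen
        seen |= set(themes)
        quiet = 0 if new else quiet + 1
        if quiet >= window:
            return i
    return None  # saturation not reached in this dataset

# Hypothetical themes raised in each successive interview
themes = [["a", "b"], ["b", "c"], ["a"], ["c"], ["b"], ["a", "c"]]
print(interviews_to_saturation(themes))  # prints 5
```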

Mistake 5: Convenience sampling disguised as purposive sampling. Purposive sampling means deliberately selecting participants for a reason (maximum variation, typical case, extreme case). Interviewing whoever is available is convenience sampling, and it limits the value of your findings.

Decision Guide

Use these rules of thumb to match your method to your evaluation question. Start with what you need to know, not with what data is easiest to collect.

  • "How much?" or "How many?" Use quantitative methods. You need numbers, statistical comparisons, and representative samples.
  • "Why?" or "How does it work?" Use qualitative methods. You need depth, context, and participants' own explanations.
  • "How much, and why?" Use mixed methods with a sequential explanatory design. Start with the survey, then investigate the patterns qualitatively.
  • "We don't know enough to write a good survey yet." Use qualitative methods first (sequential exploratory). Let interviews and focus groups surface the right questions, then build the instrument.
  • "We need to confirm findings from multiple angles." Use concurrent triangulation. Collect both data types at the same time and compare results for convergence or divergence.
