© 2026 Logic Lab LLC. All rights reserved.

Core Concept · Learning · 12 min read

Learning Agendas

A structured set of priority learning questions that guide systematic inquiry throughout programme implementation, turning monitoring data into actionable knowledge for decision-making.

When to Use

A learning agenda is the right tool when you need to focus your programme's learning efforts on strategic questions that will drive better decisions. Use it when:

  • Managing complex programmes with limited learning capacity: You have more potential learning questions than resources to investigate. A learning agenda forces prioritisation around what matters most for decision-making.
  • Meeting donor CLA requirements: USAID, FCDO, Global Fund, and USDA all require learning agendas as part of their Collaborating, Learning, and Adapting expectations.
  • Aligning MEAL and programme teams: You need a shared framework that ensures MEAL staff and programme staff are investigating the same priority questions rather than working in silos.
  • Creating structured reflection opportunities: You want to move beyond ad-hoc lessons learned toward systematic inquiry that produces actionable knowledge.
  • Driving adaptive management: You need a documented set of questions that programme leadership regularly reviews when making adaptation decisions.

A learning agenda is less useful when you're in emergency response mode requiring rapid decision-making without time for structured learning, or when your programme is so stable that there are genuinely no uncertainties requiring investigation. In those cases, a simpler knowledge management approach may suffice.

| Scenario | Use Learning Agenda? | Better Alternative |
|---|---|---|
| Complex multi-year programme | Yes | — |
| Emergency response (rapid deployment) | No | After Action Review |
| Donor requires CLA documentation | Yes | — |
| Stable programme with no uncertainties | No | Knowledge Management |
| Need to prioritise learning questions | Yes | — |

How It Works

A learning agenda functions as both a planning document and an operational tool. The process follows these key stages:

  1. Identify strategic uncertainties. Begin by mapping what you genuinely don't know about your programme's design, implementation, or context. These are questions where uncertainty exists and where answers could change programme decisions. Ask: "What assumptions are we making that, if wrong, would require us to change course?" This stage draws heavily from your theory of change, particularly the weak-evidence links in your causal pathways.

  2. Categorise learning questions. Group your uncertainties into thematic categories that reflect your programme's key decision points. Common categories include: programme design effectiveness, implementation approaches, context and external factors, beneficiary responses, and partnership dynamics. Each category should map to a specific decision or adaptation opportunity.

  3. Draft specific, answerable questions. Transform each uncertainty into a clear learning question. Good learning questions are specific enough to guide data collection but open enough to allow for unexpected findings. Avoid questions that can be answered with a simple yes/no. Instead of "Did the training work?" ask "What aspects of the training approach contributed to behaviour change among participants?"

  4. Prioritise by decision relevance and feasibility. Not all learning questions are equally important. Score each question on two dimensions: (a) decision relevance (how likely is an answer to influence programme decisions?) and (b) feasibility (can we reasonably answer this with available resources and data?). Focus your learning agenda on questions that score high on both dimensions.

  5. Assign ownership and timelines. For each priority question, identify who is responsible for investigating it and when results should be available. This creates accountability and ensures learning doesn't remain abstract.

  6. Integrate with reflection events. Schedule regular reflection sessions where the learning agenda is the central discussion point. These events should bring together programme staff, partners, and sometimes beneficiaries to review findings and discuss adaptations.

  7. Document and share findings. Learning that stays in reports doesn't drive change. Create written memos that capture key findings, the decisions they influenced, and any resulting programme adaptations. Share these broadly with stakeholders.

  8. Review and update quarterly. A learning agenda is not a static document. As your programme evolves and new uncertainties emerge, update the agenda accordingly. Schedule formal reviews at least quarterly.
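The prioritisation in step 4 can be sketched in code. This is a minimal illustration, not a prescribed method: the 1-5 scale, the threshold of 4, and all field names and example questions are assumptions chosen for the sketch.

```python
# Illustrative sketch of step 4: score each candidate learning question on
# decision relevance and feasibility (1-5 scales are an assumption), then
# keep only questions scoring high on both dimensions.
from dataclasses import dataclass


@dataclass
class LearningQuestion:
    text: str
    decision_relevance: int  # 1-5: how likely is an answer to change a decision?
    feasibility: int         # 1-5: can we answer it with available data and resources?


def prioritise(questions, threshold=4):
    """Keep questions at or above the threshold on BOTH dimensions."""
    return [
        q for q in questions
        if q.decision_relevance >= threshold and q.feasibility >= threshold
    ]


candidates = [
    LearningQuestion("What aspects of the training approach drive behaviour change?", 5, 4),
    LearningQuestion("Did participants enjoy the workshops?", 2, 5),       # low decision relevance
    LearningQuestion("How does national policy reform affect uptake?", 5, 2),  # low feasibility
]

shortlist = prioritise(candidates)
print([q.text for q in shortlist])  # only the first question passes both tests
```

In practice the scoring would happen in a workshop rather than a script, but the two-dimensional filter is the same: a question that is highly relevant but unanswerable, or answerable but irrelevant to any decision, drops off the agenda.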

Key Components

A well-constructed learning agenda includes these essential elements:

  • Strategic learning questions: A prioritised list of 5-10 specific questions that address key uncertainties in your programme design and implementation. Each question should be answerable through available or collectable data.
  • Decision linkages: For each learning question, a clear statement of what decision or adaptation opportunity this inquiry informs. This connects learning directly to programme action.
  • Data sources: Identification of where answers can be found: routine monitoring data, dedicated evaluation activities, partner reports, or external research. This shows how learning will be generated without excessive additional data collection.
  • Ownership and timelines: Clear assignment of who is responsible for investigating each question and when results should be available. This creates accountability for learning production.
  • Reflection event schedule: A calendar of planned reflection sessions where learning agenda findings will be discussed. These should be regular (quarterly minimum) and include relevant stakeholders.
  • Documentation approach: A defined process for capturing learning findings, decisions influenced, and adaptations made. This ensures learning is recorded and shared rather than remaining in individual memories.
  • Stakeholder engagement plan: Identification of which stakeholders should be involved in investigating each question and which should be included in reflection events. This ensures diverse perspectives inform learning.
  • Donor alignment: Where applicable, mapping of your learning questions to donor-specific CLA requirements. This ensures you're meeting external expectations while pursuing internally relevant learning.
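The per-question components above can be treated as a simple completeness check when drafting an agenda. The sketch below is an assumption-laden illustration: the dict-based record, the field names, and the example entry are all invented for this example, not a required format.

```python
# Illustrative sketch: represent each agenda entry as a record carrying the
# essential components (question, decision linkage, data sources, owner,
# timeline), and flag entries with missing or empty elements.
REQUIRED_FIELDS = ["question", "decision_linkage", "data_sources", "owner", "due"]


def missing_components(entry):
    """Return the names of essential components absent or empty in an entry."""
    return [f for f in REQUIRED_FIELDS if not entry.get(f)]


entry = {
    "question": "What barriers prevent female teachers from remaining in rural postings?",
    "decision_linkage": "HR policy review (rural posting incentives)",
    "data_sources": ["staff exit interviews", "routine HR monitoring data"],
    "owner": "HR officer",
    "due": "Q2 review",
}

print(missing_components(entry))                    # [] (entry is complete)
print(missing_components({"question": "Did it work?"}))  # lists the missing components
```

A check like this makes the "decision linkages" discipline concrete: an entry with a question but no linkage, owner, or data source is flagged before it reaches the agenda.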

Best Practices

Anchor learning questions to decision points. The most effective learning agendas explicitly connect each question to a specific decision or adaptation opportunity. This ensures learning is never an abstract exercise; it always serves programme improvement. When drafting questions, ask: "If we answer this, what might we do differently?" If there's no clear answer, the question may not be decision-relevant.

Prioritise questions that address weak-evidence assumptions. Your theory of change likely contains causal links where you have limited evidence that the connection is real. These are prime candidates for learning questions. For example, if your ToC assumes "training farmers leads to adoption of new varieties" but you have no evidence this holds in your context, make that a priority learning question.

Include partners in learning agenda development. Learning agendas are more useful when programme partners help shape them. Their on-the-ground perspective often reveals learning questions that headquarters staff miss. Schedule collaborative sessions where partners can contribute to question development and prioritisation.

Assign clear ownership for each learning question. Learning doesn't happen automatically; it requires someone to own the investigation. Assign each learning question to a specific person or team, ideally someone with programme authority who can act on findings.

Schedule regular reflection events with structured facilitation. Don't rely on ad-hoc learning discussions. Plan reflection events in advance with facilitation guides that include: review of monitoring data, discussion of learning agenda questions, identification of emerging patterns, and documentation of decisions.

Document learning outcomes systematically. Create written memos that capture what was learned, what decisions were influenced, and what adaptations were made. This documentation becomes your programme's institutional memory and demonstrates learning to donors.

Make reflection events participatory and data-focused. Effective reflection events bring together programme and partner staff to analyse monitoring and accountability data together. They should focus on what is going well and why, not just problems. Partners should be actively involved, not passive observers.

Encourage stakeholder reflection on assumptions and themes. Build in time for individual and key stakeholder reflection on key issues, assumptions, and themes. This personal engagement increases ownership of learning and surfaces insights that structured discussions miss.

Common Mistakes

Treating learning agendas as compliance documents. The most common failure is creating a learning agenda solely to satisfy donor requirements, then filing it away. When a learning agenda isn't actively used to drive reflection and adaptation, it's wasted effort. The document must be living and regularly referenced in programme management.

Focusing on activities rather than learning. A learning agenda that tracks "held 5 reflection sessions" is measuring activities, not learning. The focus must be on what was learned, what decisions were influenced, and what adaptations were made. Missing a planned target is not a failure; the real failure is neglecting to capture this information, draw conclusions, and act on them. Don't punish bad news; treat it as a learning opportunity.

Asking questions that can't be answered. Learning questions that are too broad, lack clear data sources, or require resources you don't have will never produce actionable findings. Before finalising questions, verify that you have (or can reasonably obtain) the data needed to answer them.

Never revisiting the agenda. A learning agenda created at programme start becomes obsolete as the programme evolves. New uncertainties emerge, old ones are resolved, and context shifts. Without regular review and update, the agenda loses relevance and practitioners stop using it.

Separating learning from decision-making. When learning findings are documented but never connected to actual programme decisions, learning becomes an academic exercise. Every learning memo should explicitly state what decision it informed and what action resulted.

Not recording learning objectives before analysis. If you haven't already recorded your learning objectives, outcomes, or goals, write them down before analysing responses. Without clear objectives, you cannot assess whether learning has occurred or what it means for programme decisions.

Examples

Education Quality Program, East Africa

A 5-year education quality programme in Kenya developed a learning agenda focused on three priority questions: (1) What teaching practices most effectively improve learning outcomes in multi-grade classrooms? (2) How do community engagement approaches influence school governance effectiveness? (3) What barriers prevent female teachers from remaining in rural postings? Each question was linked to specific decision points: curriculum adaptation, partnership strategy, and HR policy review. The programme scheduled quarterly reflection events where teachers, school boards, and county education officials reviewed monitoring data and discussed findings. Within 18 months, learning from question 1 led to a revised teacher training approach; findings from question 3 influenced a new rural posting incentive policy. The learning agenda functioned as a decision-support tool, not just a reporting requirement.

Health Systems Strengthening, West Africa

A health systems programme in Sierra Leone created a learning agenda that explicitly mapped questions to donor CLA requirements. The agenda included questions about supply chain reliability, community health worker motivation, and referral system effectiveness. Each question had assigned ownership (supply chain manager, HR officer, clinical lead) and was reviewed at monthly management meetings. The programme documented learning outcomes in written memos that were shared with Ministry of Health partners. When learning about supply chain failures led to a partnership adjustment, the memo explicitly documented this decision and its rationale. This approach satisfied donor CLA requirements while producing genuine programme improvement.

Governance Programme, South Asia

A governance programme initially designed a linear learning agenda focused on policy change outcomes. After the first year, outcome harvesting revealed unplanned pathways of influence through informal networks. The programme revised its learning agenda to include questions about informal influence mechanisms and adjusted its data collection accordingly. This demonstrated that learning agenda revision is not a sign of failure but of responsive programme management. The updated agenda produced insights that the original design would have missed entirely.

Compared To

A learning agenda is one of several tools used to structure programme learning. The key differences:

| Feature | Learning Agenda | MEL Plan | Reflection Events | Knowledge Management |
|---|---|---|---|---|
| Primary purpose | Prioritise strategic learning questions | Document all M&E activities and indicators | Structured sessions for discussing findings | Capture and share programme knowledge |
| Level of detail | 5-10 high-priority questions | Comprehensive activity schedule | Session-specific discussion guide | Broad knowledge repository |
| Timeframe | Ongoing, quarterly review | Programme start to end | Regular (monthly/quarterly) | Continuous |
| Best for | Focusing learning on decisions | Overall M&E planning | Generating insights from data | Institutional memory |
| Flexibility | Designed to evolve | Relatively fixed | Event-based | Evolving repository |

Relevant Indicators

12 indicators across 4 major donor frameworks (USAID, FCDO, Global Fund, USDA) relate to learning agenda design and use. Examples include:

  • Learning agenda quality: "Proportion of programmes with documented learning agendas that link questions to decision points" (USAID)
  • Learning utilisation: "Number of learning agenda findings that led to programme adaptations" (FCDO)
  • Staff awareness: "Percentage of programme staff who can articulate the programme's key learning questions" (Global Fund)
  • Agenda currency: "Frequency of learning agenda review and update during programme implementation" (USDA)

Related Tools

  • Learning Questions Template: Guided template for developing specific, answerable learning questions linked to decision points
  • Reflection Event Facilitation Guide: Structured facilitation guide for running data-focused reflection sessions

Related Topics

  • Adaptive Management: The management approach that uses learning agenda findings to inform programme adaptations
  • MEL Plans: The operational document that includes the learning agenda as a key component
  • Knowledge Management: The broader system for capturing and sharing programme learning
  • Reflection Sessions: The structured events where learning agenda findings are discussed
  • Evaluation Use: Ensuring learning agenda findings influence decisions
  • Stakeholder Engagement: Involving partners and beneficiaries in learning agenda development

Further Reading

  • USAID CLA Guidance: Official USAID guidance on Collaborating, Learning, and Adapting requirements and best practices.
  • FCDO CLA Framework: FCDO's approach to CLA in international development programmes.
  • CDA Collaborative Learning Projects: Research and resources on organisational learning in humanitarian and development contexts.
  • BetterEvaluation: Learning: Comprehensive resource on evaluation and programme learning approaches.

At a Glance

Focuses programme learning on strategic questions that drive better decisions, rather than just documenting activities.

Best For

  • Prioritising learning questions when resources for inquiry are limited
  • Creating structured opportunities for reflection and adaptation
  • Meeting donor CLA requirements (USAID, FCDO, Global Fund)
  • Aligning MEAL staff and programme teams around key learning priorities

Complexity

Medium

Timeframe

2-4 weeks for initial development; quarterly review cycles

Linked Indicators

12 indicators across 4 donor frameworks

USAID · FCDO · Global Fund · USDA

Examples

  • Proportion of learning agenda questions addressed through routine monitoring data
  • Number of learning agenda findings that led to programme adaptations
  • Percentage of programme staff who can articulate the programme's key learning questions

Related Topics

Core Concept
Adaptive Management
A management approach that uses continuous learning from monitoring and evaluation data to adjust programme strategies and activities in response to changing evidence or context.
Core Concept
M&E Plans
A detailed operational document that translates your logframe and theory of change into actionable M&E requirements, specifying what data to collect, when, from whom, and how it will be used.
Core Concept
Knowledge Management for M&E
The systematic process of capturing, organising, and applying lessons, evidence, and insights from M&E across programmes and over time to improve organisational decision-making.
Term
Reflection Sessions
Structured gatherings where programme teams and stakeholders pause to examine what happened, why it happened, and what should change as a result.
Term
Learning Cycles
Structured, recurring periods of reflection and adaptation where programme teams review data, draw lessons, and adjust implementation accordingly.