When to Use
Results-Based Management (RBM) is the right framework when an organisation needs a structured, consistent approach to planning, monitoring, and being accountable for what it achieves, not just what it does. Use it when:
- Donors require results evidence: USAID, UNDP, the World Bank, and most bilateral agencies mandate RBM-aligned reporting as a condition of funding
- Managing across multiple projects: RBM provides a common language and structure for aggregating results from diverse initiatives under one accountability framework
- Institutional or capacity-building programmes: where outputs are visible (training delivered, systems installed) but the management challenge is demonstrating that those outputs led to meaningful change
- Annual planning and budgeting cycles: RBM connects resource decisions to evidence about what worked, preventing activity-driven budgeting
- Organisational performance management: staff, teams, and partners can be held accountable for specific results within the RBM hierarchy
RBM becomes less useful as a standalone tool when programmes operate in highly complex or emergent environments where pre-set results targets are too rigid (consider developmental evaluation or adaptive management to complement it). It also requires functioning M&E data systems: without reliable data, RBM becomes a compliance exercise rather than a management tool.
| Scenario | Use RBM? | Better Complement |
|---|---|---|
| Organisational-level accountability | Yes | — |
| Single-project monitoring | Yes, alongside | MEL Plans |
| Highly emergent, complex programme | Partially | Developmental Evaluation |
| Proving causal impact | No | Impact Evaluation |
| Understanding why results occurred | No | Contribution Analysis |
How It Works
RBM is more a management philosophy than a single tool; it shapes how an organisation plans, monitors, and makes decisions across an entire programme cycle.
Step 1: Define the results hierarchy
RBM starts with a clear hierarchy: inputs → activities → outputs → outcomes → impact. This hierarchy (sometimes called a results chain or results framework) is the map that tells managers what they are trying to achieve and at what level. Each level should have corresponding indicators.
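The hierarchy can be modelled as a simple data structure. This is an illustrative sketch only, not part of any RBM standard; the class and field names are assumptions, and the example figures echo the yield target used in Step 2:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    baseline: float
    target: float

@dataclass
class Result:
    level: str        # "input", "activity", "output", "outcome", or "impact"
    statement: str
    indicators: list[Indicator] = field(default_factory=list)

# A minimal results chain: each level carries its own indicators
chain = [
    Result("output", "Smallholders trained in improved techniques",
           [Indicator("Farmers trained", baseline=0, target=10_000)]),
    Result("outcome", "Agricultural yield increased among smallholders",
           [Indicator("Average yield (t/ha)", baseline=2.0, target=2.4)]),
]

for result in chain:
    for ind in result.indicators:
        print(f"{result.level}: {ind.name}: {ind.baseline} -> {ind.target}")
```

The point of the structure is the one-to-many link from each result to its indicators: a result without at least one indicator cannot be monitored.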
Step 2: Set measurable targets
For each result in the hierarchy, set a time-bound target against a baseline. "Increase agricultural yield by 20% among 10,000 smallholders by Year 3" is an RBM target. "Support smallholders" is not.
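One common way to track such a target is to express current performance as the share of the baseline-to-target distance covered. A sketch with hypothetical Year 2 data for the yield example (baseline 2.0 t/ha, a 20% increase meaning a target of 2.4 t/ha):

```python
def percent_of_target(baseline: float, target: float, current: float) -> float:
    """Share of the baseline-to-target distance covered so far, in percent."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return 100 * (current - baseline) / (target - baseline)

# Hypothetical mid-programme reading: 2.3 t/ha against a 2.4 t/ha target
progress = percent_of_target(baseline=2.0, target=2.4, current=2.3)
print(f"{progress:.0f}% of target achieved")  # prints "75% of target achieved"
```

Without the baseline, the same reading of 2.3 t/ha is uninterpretable, which is why RBM targets are always stated against a baseline.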
Step 3: Build a performance monitoring system
Select indicators for each level, define data sources and collection methods, assign responsibility for data collection, and establish reporting frequency. This is the MEL plan within an RBM framework.
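The monitoring plan described here is often laid out as one row per indicator. An illustrative sketch of such a MEL-plan table (the field names and example entries are assumptions, not a prescribed format):

```python
# One record per indicator, capturing the metadata Step 3 requires:
# data source, collection method, assigned responsibility, and frequency.
mel_plan = [
    {
        "indicator": "Average yield (t/ha) among supported smallholders",
        "level": "outcome",
        "data_source": "annual farm survey",
        "method": "structured household questionnaire",
        "responsible": "M&E officer",
        "frequency": "annual",
    },
    {
        "indicator": "Farmers trained in improved techniques",
        "level": "output",
        "data_source": "training attendance records",
        "method": "administrative data",
        "responsible": "field coordinators",
        "frequency": "quarterly",
    },
]

# A simple completeness check: every indicator must have an assigned
# owner and a reporting frequency before the plan is considered usable.
for row in mel_plan:
    assert row["responsible"] and row["frequency"], row["indicator"]
print(f"{len(mel_plan)} indicators, all with owners and frequencies")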
Step 4: Use data in management decisions
This is the step most organisations skip. RBM requires that monitoring data be actively reviewed and used to make decisions: adjusting activities, reallocating resources, revising targets, or stopping what is not working.
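One way to force data into decisions is a simple status rule that flags off-track indicators for the review meeting. The rule below is a hypothetical illustration: it assumes straight-line progress toward the target and an arbitrary 80% tolerance band, both of which a real programme would set deliberately:

```python
def status(baseline: float, target: float, current: float,
           period_fraction: float) -> str:
    """Traffic-light status against a straight-line trajectory (an assumption)."""
    expected = baseline + period_fraction * (target - baseline)
    if current >= expected:
        return "on track"
    if current >= 0.8 * expected:     # arbitrary tolerance band
        return "watch"
    return "off track: documented decision required"

# Year 2 of a 3-year programme (period_fraction = 2/3), yield example:
# expected value is 2.0 + (2/3) * 0.4 ≈ 2.27 t/ha, actual is 2.1 t/ha
print(status(baseline=2.0, target=2.4, current=2.1, period_fraction=2 / 3))
```

The "documented decision required" outcome is the point: the rule does not decide anything itself, it only prevents an underperforming result from passing through a review meeting unremarked.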
Step 5: Report results, not activities
RBM reporting focuses on what changed, not what was done. Instead of "trained 200 health workers," the report says "200 health workers demonstrated competency improvement post-training, contributing to a 15% reduction in case management errors." This shift requires both better data and a different reporting culture.
Step 6: Feed learning back into the next planning cycle
At the close of each cycle (annual, mid-term, final), synthesise what the results data shows, draw conclusions about programme effectiveness, and integrate those conclusions into the next planning period.
Key Components
A functional RBM system requires:
- Results framework or results hierarchy: a documented map of the results chain from inputs to impact
- SMART indicators at each level: with baselines, targets, data sources, and collection responsibility assigned
- Performance monitoring system: data collection tools, collection schedule, data quality procedures, and storage
- Regular data review process: management meetings or review cycles where data is examined and decisions documented
- Results-oriented reporting: reporting templates and practices that foreground what was achieved, not just what was done
- Accountability assignments: clear ownership of each result, typically mapped in a responsibility matrix
- Learning integration mechanism: a formal process for feeding M&E findings into planning and design
Best Practices
Align M&E, accountability, and learning from the start. RBM works best when monitoring, evaluation, accountability, and learning are designed together, not sequenced.
Use results frameworks as operational anchors. The results framework should drive programme decisions, not just reporting. Use it to structure resource agreements, review partner performance, and prioritise activities.
Don't rely solely on routine monitoring for learning. Routine data tells you whether results are on track; it rarely explains why. Build periodic evaluations into the RBM system to generate deeper understanding.
Commit to using data, not just collecting it. The most common RBM failure is collecting good data and not changing behaviour based on it. Create explicit management protocols that require documented decisions linked to data findings.
Set fewer, better results. RBM frameworks often suffer from results inflation: too many indicators, too many targets, too little focus. A framework with 5 well-defined outcome indicators that are actively tracked outperforms one with 30 indicators that exist only for donor reports.
Common Mistakes
Confusing RBM with results reporting. RBM is a management approach, not a reporting format. Producing results-formatted reports without using data to manage is compliance theatre, not RBM.
Setting output-level targets and calling them outcomes. "10,000 farmers trained" is an output. "10,000 farmers applying improved techniques" is an outcome. Organisations often set output targets and present them as outcome achievement.
Building RBM frameworks without baseline data. Targets without baselines are arbitrary. The framework must include a realistic baseline establishment phase before targets are set.
Treating the results framework as static. Results frameworks should be living documents, updated as context changes and evidence accumulates. A Year 1 framework that has not been reviewed by Year 3 is a historical document, not a management tool.
Disconnecting RBM from the ToC. Results hierarchies should flow directly from the theory of change. When they do not, the monitoring system measures the wrong things.
Examples
UNDP Country Programme, East Africa. A five-year UNDP governance programme in Tanzania adopted an RBM framework with four outcome-level results covering democratic participation, rule of law, public administration reform, and local governance. Each outcome had 3-5 SMART indicators with national baselines drawn from citizen surveys and government administrative data. A twice-yearly data review process fed findings into annual work planning. The RBM framework was credited in the mid-term review with enabling a resource reallocation away from underperforming local governance activities toward rule-of-law initiatives that showed stronger uptake.
Multi-donor programme, South Asia. A DFID-funded health programme in Bangladesh operating through six implementing partners needed a common results accountability structure. The RBM framework served as the shared performance agreement: all partners reported against the same results hierarchy, enabling aggregate reporting across the programme. Partners used their own monitoring systems but reported to the shared framework quarterly. The structure allowed the programme manager to identify one underperforming partner early and provide targeted support.
Organisational performance management, West Africa. A regional NGO network in Francophone West Africa adopted RBM as its organisational management approach across 12 country offices. Each country developed its own results framework aligned to the regional framework. Annual reviews compared country results against targets and against each other. The process created internal accountability between country directors and surfaced good practices from high-performing offices for peer learning.
Compared To
| Framework | Focus | Accountability | Attribution |
|---|---|---|---|
| Results-Based Management | Management and accountability | Organisational | No |
| Results Framework | Portfolio tracking | Donor reporting | No |
| Logframe | Project design and M&E | Project-level | No |
| Theory of Change | Causal logic | Programme logic | Implicit |
| Impact Evaluation | Causal attribution | Evidence | Yes |
Relevant Indicators
44 donor-aligned indicators across USAID, UNDP, World Bank, and OECD-DAC frameworks. Key examples:
- Percentage of programme results achieved against targets at each reporting period
- Quality rating of M&E data used in management reviews (1-5 scale)
- Number of management decisions formally documented as evidence-informed
- Proportion of annual work plan changes traceable to prior-period results data
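The first example indicator above can be computed mechanically once targets and actuals are recorded. A sketch with entirely hypothetical data, counting a result as achieved when its actual meets or exceeds its target:

```python
# Hypothetical period-end data for four outcome-level results
results = [
    {"name": "Outcome 1", "target": 100, "actual": 110},
    {"name": "Outcome 2", "target": 50,  "actual": 40},
    {"name": "Outcome 3", "target": 20,  "actual": 20},
    {"name": "Outcome 4", "target": 10,  "actual": 6},
]

achieved = sum(1 for r in results if r["actual"] >= r["target"])
pct = 100 * achieved / len(results)
print(f"{achieved}/{len(results)} results achieved ({pct:.0f}%)")
# prints "2/4 results achieved (50%)"
```

A binary met/not-met count is the simplest form; many organisations instead report partial achievement per result, which changes the arithmetic but not the principle of reporting against pre-set targets.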
Related Tools
- Results Framework Builder: structure your results hierarchy with indicators and targets
- Indicator Library: search 2,923 donor-aligned indicators to populate your results framework
Related Topics
- Results Framework: the core planning tool within an RBM system
- MEL Plans: the operational M&E plan that feeds data into RBM
- Adaptive Management: how to use RBM data to adapt in real-time
- Logframe: the project-level equivalent of an RBM framework
- Performance Evaluation: periodic assessment of results achievement within an RBM system
Further Reading
- OECD-DAC (2002). Glossary of Key Terms in Evaluation and Results Based Management. Paris: OECD. The foundational terminology reference.
- UNDP (2009). Handbook on Planning, Monitoring and Evaluating for Development Results. The most widely used RBM handbook in the UN system.
- World Bank Independent Evaluation Group (2012). Designing a Results Framework for Achieving Results: A How-To Guide. Practical guidance for results framework design.
- USAID (2016). ADS Chapter 201: Program Cycle Operational Policy. Defines USAID's RBM requirements for all funded programmes.