M&E Studio



© 2026 Logic Lab LLC. All rights reserved.


How to Write Donor Reports That Actually Get Read

How to write donor reports that get read. Indicator tables, narrative structure, explaining underperformance, and what donors actually look for.

Who This Page Is For

You are writing a donor report. Maybe it is a quarterly progress report, maybe an annual review. You have data, you have activities to describe, and you have a deadline. This page tells you what to write, how to structure it, and what to stop wasting time on.

Most donor reports fail for the same reason: they describe what the team did instead of what changed. Program officers do not need a summary of activities they already approved. They need to know what is on track, what is off track, and what you are doing about the gaps. Follow reporting best practices and you will write reports that get read instead of filed.

What Donors Actually Read

A program officer managing 15 grants does not read every word of every report. Here is what they look at, in order:

  1. Executive summary. If you write nothing else well, write this well. One page. Key achievements, key challenges, headline indicator results, any deviations from the workplan. This is the only section every reader finishes.
  2. Indicator performance table. They scan for red: indicators below target, downward trends, missing data. If everything is green, they move on. If something is off, they read the explanation.
  3. Challenges and adaptive management. They want to see that problems are identified early and addressed, not hidden.
  4. Financial summary. Burn rate versus timeline. Underspending worries them as much as overspending.

What they skip: activity descriptions that repeat the workplan, success stories without data behind them, and any section that reads like it was written to fill space.

Write for the reader who has 20 minutes and 15 reports on their desk. Front-load the information that matters.
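
The burn-rate comparison the financial-summary reader makes can be sketched as simple arithmetic. This is an illustrative sketch only: the figures, the function name `burn_vs_timeline`, and the 10-point gap threshold are assumptions for the example, not values from any real award or donor rule.

```python
# Sketch of the burn-rate-versus-timeline comparison described above.
# All figures, the function name, and the 10-point gap threshold are
# illustrative assumptions, not values from any real award.
def burn_vs_timeline(spent, total_budget, months_elapsed, total_months):
    """Return (share of budget spent, share of timeline elapsed)."""
    return spent / total_budget, months_elapsed / total_months

burn, elapsed = burn_vs_timeline(spent=180_000, total_budget=500_000,
                                 months_elapsed=6, total_months=12)
print(f"Burn rate: {burn:.0%} of budget at {elapsed:.0%} of the timeline")

# A gap of more than 10 points in either direction is worth explaining:
# underspending worries program officers as much as overspending.
if abs(burn - elapsed) > 0.10:
    print("Flag: spending is out of step with the timeline; explain it")
```

Here 36% of the budget spent at 50% of the timeline is exactly the underspend pattern a program officer will ask about.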

The Indicator Performance Table

This is the most important section in your report. Build it before you write a single sentence of narrative. The table is the skeleton; the narrative explains the skeleton.

A strong indicator performance table looks like this:

| Result Level | Indicator | Baseline | Annual Target | Actual (This Period) | % Achieved | Variance Explanation |
|---|---|---|---|---|---|---|
| Outcome 1 | % of target households with improved dietary diversity score | 34% | 50% | 47% | 94% | On track. Slight shortfall due to delayed seed distribution in 2 districts. |
| Outcome 2 | % of community health workers demonstrating correct case management | 22% | 60% | 41% | 68% | Below target. Training rollout delayed by 3 months due to security restrictions. Catch-up plan in place for Q3. |
| Output 1.1 | # of farmer field schools established | 0 | 45 | 48 | 107% | Exceeded target. Community demand led to 3 additional schools opened with local contributions. |
| Output 2.1 | # of health workers trained in integrated case management | 0 | 200 | 126 | 63% | Below target. See Outcome 2 explanation. Remaining 74 scheduled for Q3. |

Rules for indicator tables:

  • Include every indicator in your logframe. Do not hide underperforming indicators by leaving them out.
  • Report cumulative progress against the life-of-project target AND progress for this reporting period. Reviewers need both.
  • Use a consistent threshold for variance explanations. If achievement is below 85% or above 115%, explain why. Some donors set their own thresholds; check your agreement.
  • Keep variance explanations to one or two sentences in the table. Put detailed analysis in the narrative.
  • Use indicator reporting standards: report the exact numerator and denominator for percentage indicators, not just the percentage.
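
The variance-threshold rule above can be expressed as a quick check. A minimal sketch, assuming the 85%/115% defaults named in the rules (your grant agreement may set different thresholds); the function names and sample rows are illustrative, drawn from the example table:

```python
# Sketch of the variance rule above: flag any indicator whose achievement
# falls below 85% or above 115% of target. Thresholds come from the rule
# in the text; names and sample data are illustrative.
LOW, HIGH = 0.85, 1.15

def pct_achieved(actual, target):
    """Achievement as a fraction of the target."""
    return actual / target

def needs_explanation(actual, target, low=LOW, high=HIGH):
    """True when the variance rule requires a written explanation."""
    ratio = pct_achieved(actual, target)
    return ratio < low or ratio > high

indicators = [
    ("Output 1.1: farmer field schools established", 48, 45),
    ("Output 2.1: health workers trained", 126, 200),
]
for name, actual, target in indicators:
    flag = "EXPLAIN" if needs_explanation(actual, target) else "ok"
    print(f"{name}: {pct_achieved(actual, target):.0%} [{flag}]")
```

Output 1.1 sits at 107%, inside the band; Output 2.1 at 63% gets flagged for a written explanation.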

Run your indicators through the SMART Indicator Checker if you are not confident they are well-defined. Weak indicators produce weak tables.

Writing the Narrative

The narrative section is not a diary of activities. It is an analytical account of progress, problems, and decisions. Structure it around your results framework, not around your workplan.

Write by result, not by activity. Organize sections by outcome or output, not by "what we did this month." The reader wants to know whether Outcome 1 is on track, not whether you held 14 meetings.

Lead with findings, not activities. Compare these two approaches:

| Write This | Not This |
|---|---|
| "Dietary diversity scores improved by 13 percentage points across 3 of 5 target districts. The two lagging districts experienced drought-related crop failure, reducing food availability." | "The team conducted 48 farmer field school sessions across 5 districts during the reporting period. Sessions covered nutrition-sensitive agriculture practices." |
| "Training completion is at 63% of the annual target, driven by a 3-month delay caused by access restrictions. A revised training schedule will reach the remaining 74 health workers by September." | "Security conditions prevented the team from accessing several communities. The team is working to reschedule training activities." |
| "The community feedback mechanism received 312 complaints this quarter. 89% were resolved within 14 days, up from 71% last quarter." | "The community feedback mechanism continues to function well and the team is responding to complaints in a timely manner." |

Notice the pattern. Good narrative reporting states the finding (with numbers), explains the cause, and describes the response. Weak reporting describes activities and uses vague adjectives.

Use data in every paragraph. If a paragraph contains no numbers, no percentages, and no comparison to a target, ask yourself what it is actually communicating. If the answer is "we did the thing we said we would do," delete it or compress it to a single line.

Link to visuals. A well-placed chart can replace an entire page of narrative. Trend lines showing indicator progress over time, bar charts comparing districts, maps showing geographic coverage. Use data visualization to communicate patterns that prose handles poorly. Put complex visuals in annexes with clear labels and reference them from the narrative.

Explaining Underperformance

This is where most reports fall apart. Teams either hide bad news or drown it in excuses. Neither works. Program officers expect some indicators to fall short. What they care about is whether you noticed, understand why, and have a plan.

Use this structure for every underperforming indicator:

  1. State the finding. "Output 2.1 reached 63% of the annual target."
  2. Explain the cause with evidence. "Access restrictions in the northern region prevented training activities for 3 months (March through May)."
  3. Describe your response. "The team has compressed the remaining training schedule into Q3, partnering with a local organization that maintains access."
  4. Set a recovery timeline. "We expect to reach the annual target by the end of Q3."
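
The four-step structure above can be treated as a fill-in template so no step gets skipped. A minimal sketch: the `VarianceNote` name is hypothetical, and the sample text comes from the Output 2.1 example on this page.

```python
# Sketch: the four-step variance structure as a fill-in template.
# VarianceNote is a hypothetical name; sample text is the Output 2.1
# example from this page.
from dataclasses import dataclass

@dataclass
class VarianceNote:
    finding: str    # 1. state the finding
    cause: str      # 2. explain the cause with evidence
    response: str   # 3. describe your response
    recovery: str   # 4. set a recovery timeline

    def render(self) -> str:
        return " ".join([self.finding, self.cause, self.response, self.recovery])

note = VarianceNote(
    finding="Output 2.1 reached 63% of the annual target.",
    cause=("Access restrictions in the northern region prevented training "
           "activities for 3 months (March through May)."),
    response=("The team has compressed the remaining training schedule into "
              "Q3, partnering with a local organization that maintains access."),
    recovery="We expect to reach the annual target by the end of Q3.",
)
print(note.render())
```

Filling all four fields for every underperforming indicator is the discipline; the prose can then be polished freely.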

What this looks like when you compare approaches:

| Write This | Not This |
|---|---|
| "Enrollment is 22% below target due to competing harvest-season labor demands. We shifted registration to early morning hours in 3 communities and saw a 15% uptick. Expanding this to all sites in Q3." | "Enrollment has been challenging due to contextual factors." |
| "The latrine construction target will not be met this year. Material costs increased 40% due to supply chain disruptions. We have requested a no-cost extension to complete the remaining 35 units." | "Some delays have been experienced in the WASH component." |

Do not blame the context without showing what you did about it. "Security prevented access" is a cause. "Security prevented access, so we partnered with local health volunteers who could still reach communities, resulting in 60% coverage in restricted areas" is adaptive management. The second version is what program officers want to see.

For guidance on how indicators, targets, and milestones relate in your reporting, see Indicator vs Target vs Milestone.

Report Types: What Changes

The core principles stay the same across report types. What changes is depth, scope, and emphasis.

| Report Type | Typical Length | Focus | Key Differences |
|---|---|---|---|
| Quarterly | 5-10 pages + indicator annex | Activity progress, early warning flags, short-term adaptive actions | Lighter analysis. Focus on what happened, what is off track, and what you will do next quarter. |
| Semi-annual | 10-15 pages + annexes | Outcome-level progress, mid-course corrections | Deeper analysis. Include outcome data if available. Compare first-half performance to annual targets. |
| Annual | 15-25 pages + annexes | Full indicator performance, annual reflection, workplan adjustments | Comprehensive indicator table with year-over-year trends. Include lessons learned section. |
| Final/Completion | 20-40 pages + annexes | Life-of-project achievement, sustainability, lessons | Cumulative indicator performance against all targets. Honest assessment of what worked and what did not. Include handover status and sustainability plan. |

Quarterly reports are progress snapshots. Keep them tight. Report activities completed, flag risks, and propose adjustments. Do not write a mini-annual report every three months.

Annual reports require reflection. What patterns do you see across quarters? What did you learn? What will you do differently next year? This is where you demonstrate that your M&E system generates learning, not just numbers.

Final reports are your legacy document. Be honest about what the program achieved and what it did not. Program officers respect candor. They do not respect final reports that claim every indicator was met when the data says otherwise.

Common Mistakes

Mistake 1: Reporting activities instead of results. "We conducted 48 training sessions" tells the reader what you did. "126 of 200 target health workers completed integrated case management training, with post-test scores averaging 78% (up from 34% at baseline)" tells the reader what changed. Every section should answer "so what?"

Mistake 2: Hiding underperformance. Leaving struggling indicators out of the table, burying bad news on page 18, or using vague language ("some challenges were experienced") does not fool program officers. It erodes trust. Report all indicators honestly. Acknowledge shortfalls. Show your response.

Mistake 3: Writing the executive summary last and rushing it. The executive summary is the most-read section. It deserves the most editing, not the least. Write it last if you want, but then revise it at least twice. It should stand alone as a one-page briefing for someone who will read nothing else.

Mistake 4: No connection between data and narrative. The indicator table says 68% achieved. The narrative says "good progress was made." These do not match. Every claim in the narrative should be traceable to a number in the indicator table or a data source in the annex. Use the Review Studio to check whether your narrative and data tell the same story.

Mistake 5: Treating every report type the same way. A quarterly report written at annual-report depth wastes your time and the reader's. A final report written at quarterly depth fails the program's legacy. Match the depth to the report type. Quarterly reports are 5-10 pages for a reason.

Before You Submit

Use this as a final check:

  • Executive summary is one page and stands alone as a briefing
  • Indicator performance table includes every logframe indicator
  • Variance explanations provided for all indicators below 85% or above 115% achievement
  • Narrative is organized by results, not by activities
  • Every paragraph includes data, a comparison, or a specific finding
  • Underperformance is stated directly with cause, response, and timeline
  • Financial summary matches the narrative (no underspending without explanation)
  • Annexes are labeled clearly and referenced from the body text
  • Report length matches the type (quarterly is not 25 pages)
  • No copy-paste artifacts from a previous reporting period (check dates, quarter references, indicator values)
