Evaluation Inception Report Quality


Copy the prompt below into Claude, ChatGPT, or Gemini, paste your inception report at the bottom, and run it. You will get a scored quality assessment with evidence and revision priorities.

You are an expert M&E evaluation specialist. Score the evaluation inception report I will provide using the rubric below.

SCORING RUBRIC - Evaluation Inception Report Quality
Score each dimension 1-5 using these criteria:

DIMENSION 1: Refined Evaluation Questions and Matrix
- Score 5: All four elements present. Each evaluation question from the ToR is reproduced or explicitly refined, sub-questions identified for every main question (typically 2-4 each), evaluation matrix maps each question and sub-question to data sources, methods, and respondents, and any deviation from ToR EQs is justified.
- Score 4: At least three of four elements present. Matrix exists; sub-questions partially present or deviation rationale weak.
- Score 3: At least two of four elements present. Questions reproduced and matrix begun, but sub-questions or method-mapping incomplete.
- Score 2: Questions reproduced from ToR without refinement or sub-questions. No evaluation matrix.
- Score 1: No evaluation questions or matrix. Inception report cannot guide data collection.

DIMENSION 2: Detailed Methodology and Sampling
- Score 5: All five elements present. Design type named with rationale tied to evaluation questions, sample size calculated with documented assumptions, sampling/selection method specified, recruitment plan addresses access and consent, mixed-methods integration plan explains how qual and quant findings will combine.
- Score 4: At least four of five elements present. Design and sample described; recruitment or integration plan partial.
- Score 3: At least three of five elements present. Design and sampling named but rationale or calculation generic.
- Score 2: Methodology and sampling stated but at the level of the ToR with no further detail.
- Score 1: No methodology or sampling specification beyond what was in the ToR.

DIMENSION 3: Data Collection Tools and Field Plan
- Score 5: All four elements present. Every data source has a named tool (questionnaire, FGD guide, KII guide, observation form), a pretesting plan with sample size and timing, an enumerator training plan with curriculum and at least one mock field exercise, and a field schedule with dated activities and locations.
- Score 4: At least three of four elements present. Tools and pretesting plan present; training plan or schedule partial.
- Score 3: At least two of four elements present. Tools listed but pretesting plan or training plan generic.
- Score 2: Tools mentioned but not listed per source. No pretesting or training detail.
- Score 1: No data collection tools or field plan.

DIMENSION 4: Analysis Plan
- Score 5: All five elements present. Quantitative analysis steps specified (variables, transformations, statistical tests or descriptive approach), qualitative coding plan specified (codebook approach, inter-coder reliability strategy), triangulation rules state how multiple sources will converge or diverge, sensitivity or limitations analysis planned, software/tools named (e.g., SPSS, R, NVivo, Atlas.ti).
- Score 4: At least four of five elements present. Quantitative and qualitative steps present; triangulation rules or sensitivity analysis partial.
- Score 3: At least three of five elements present. Analysis types named but steps or rules generic.
- Score 2: Analysis described as "thematic analysis" or "descriptive statistics" without detail. No triangulation rules.
- Score 1: No analysis plan beyond high-level mention.

DIMENSION 5: Risk, Ethics, and Quality Assurance
- Score 5: All five elements present. Risk register lists program-specific risks with named mitigations and owners, ethics approval status documented (IRB, donor ethics review, or equivalent), DQA plan describes verification at collection and entry, deliverable quality control process names reviewers and timing, contingency plans address scenarios that would alter the design (e.g., access restrictions, security incidents).
- Score 4: At least four of five elements present. Risk register and ethics status present; DQA, QC, or contingency plans partial.
- Score 3: At least three of five elements present. Risks identified and ethics referenced; DQA or QC generic.
- Score 2: Risks listed without mitigations. Ethics referenced as a checkbox. No DQA or contingency plans.
- Score 1: No risk register, ethics provisions, DQA plan, or contingency planning.

OUTPUT FORMAT:
Return your assessment as a table followed by a summary:

| Dimension | Score (1-5) | Evidence from Inception Report | Priority Revision |
|-----------|-------------|-------------------------------|-------------------|
| Refined Evaluation Questions and Matrix | | | |
| Detailed Methodology and Sampling | | | |
| Data Collection Tools and Field Plan | | | |
| Analysis Plan | | | |
| Risk, Ethics, and Quality Assurance | | | |

**Total: X/25**
**Band:** Strong (22-25) / Adequate (17-21) / Needs Revision (11-16) / Substantial Revision (5-10)
**Single Most Important Revision:** [One specific sentence]
**Approval Recommendation:** [Approve to proceed / Approve with conditions / Return for revision / Do not approve]

For any dimension scored 1 or 2, add a brief explanation and a concrete revision example.

INCEPTION REPORT TO SCORE:
[Paste your evaluation inception report here]


Score Interpretation

| Total (out of 25) | Band | Next Step |
|-------------------|------|-----------|
| 22-25 | Strong | Approve inception. Proceed to fieldwork. |
| 17-21 | Adequate | Approve with conditions on flagged dimensions. Address before fielding. |
| 11-16 | Needs Revision | Return for revision. Use the Revise prompt with the AI output as a revision brief. Do not approve fielding. |
| 5-10 | Substantial Revision | Inception report does not meet quality standards. Major revisions required before fielding. |
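
If you track scores across many inception reports in a spreadsheet or script, the band thresholds in the table reduce to a simple lookup. The sketch below is illustrative only: the thresholds come from the table above, while the function name and error message are our own.

```python
# Map a rubric total (five dimensions, each scored 1-5) to its quality band.
# Thresholds mirror the Score Interpretation table; the function name is ours.

def band(total: int) -> str:
    """Return the quality band for a 5-25 rubric total."""
    if not 5 <= total <= 25:
        raise ValueError("Five dimensions scored 1-5 give totals of 5-25.")
    if total >= 22:
        return "Strong"
    if total >= 17:
        return "Adequate"
    if total >= 11:
        return "Needs Revision"
    return "Substantial Revision"

print(band(19))  # Adequate: approve with conditions on flagged dimensions
```

Because the bands are contiguous, checking the lower bound of each band from the top down is enough; no upper-bound checks are needed after the initial range validation.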