Create a Data Quality Audit Protocol
Create a data quality audit protocol with verification procedures, sampling strategy, scoring rubric, and corrective action framework for assessing M&E data reliability.
||
You are a senior MEAL specialist with expertise in data quality assurance. Your task is to create a comprehensive Data Quality Audit (DQA) Protocol for a program.
**Program Context:**
- Program name: the program requiring data quality assessment
- Number of implementation sites: the program's geographic footprint
- Key indicators to audit: the priority indicators for verification
- Data collection tools: how data is currently collected
- Reporting frequency: how often data is reported
- Donor: the primary funding agency
**Deliverables:**
**1. Audit Objectives and Scope**
- Purpose of the DQA (verification, improvement, compliance, or all three)
- Specific data quality dimensions to assess: Validity, Reliability, Completeness, Timeliness, Precision, and Integrity (define each in the program context)
- Time period under review
- Data sources and levels to audit (facility, district, national)
**2. Sampling Strategy**
- Sampling frame: list of all sites and reporting units
- Sample size calculation: recommend auditing at least 10-15% of sites, with justification
- Selection method: random sampling with stratification by performance level, geography, or facility type
- Indicator sampling: which indicators to audit in full versus spot-check
- Document sampling: how many records to verify per site (recommend minimum 20 records or 10% of total, whichever is larger)
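The site and record sampling rules above can be sketched in code. This is a minimal illustration, not a prescribed implementation: the 12.5% site fraction (midpoint of the 10-15% range), the `stratum` field, and the dictionary shapes are all assumptions to make the sketch concrete.

```python
import math
import random

def plan_sample(sites, site_fraction=0.125, min_records=20, record_fraction=0.10):
    """Sketch a DQA sample plan: select ~10-15% of sites (12.5% midpoint
    assumed here), stratified by each site's 'stratum' key, then size the
    record sample per site as max(20 records, 10% of the site's records),
    per the protocol's rule of thumb."""
    n_sites = max(1, math.ceil(len(sites) * site_fraction))
    # Group sites by stratum so the draw is stratified, not purely random.
    strata = {}
    for s in sites:
        strata.setdefault(s["stratum"], []).append(s)
    selected = []
    for members in strata.values():
        # Allocate the site quota proportionally to stratum size.
        k = max(1, round(n_sites * len(members) / len(sites)))
        selected.extend(random.sample(members, min(k, len(members))))
    plan = []
    for s in selected:
        n_records = max(min_records, math.ceil(s["total_records"] * record_fraction))
        plan.append({"site": s["name"],
                     "records_to_verify": min(n_records, s["total_records"])})
    return plan
```

Proportional allocation keeps small strata represented (each gets at least one site), which matches the intent of stratifying by performance level, geography, or facility type.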
**3. Verification Procedures**
For each data quality dimension, provide specific verification steps:
| Dimension | Verification Method | Data Source Cross-Reference | Pass Criteria |
|---|---|---|---|
Include:
- **Source document verification:** Compare reported figures against original source documents (registers, forms)
- **Recounting:** Independently recount a sample of records and compare with reported totals
- **Cross-system checks:** Compare figures across parallel systems (e.g., facility register vs. digital form vs. monthly report)
- **Timeliness check:** Percentage of reports submitted on time
- **Completeness check:** Percentage of required fields completed in source documents
- **Logic checks:** Identify impossible or implausible values
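The recounting, timeliness, and completeness checks above all reduce to simple ratios. A minimal sketch, with illustrative field names and an assumed 0.90-1.10 pass band for the verification factor (confirm the band against the donor's own DQA guidance):

```python
def verification_factor(reported, recounted):
    """Recounted / reported totals: 1.0 means exact agreement, >1 suggests
    under-reporting, <1 over-reporting. A 0.90-1.10 pass band is a common
    convention (assumption here, not a universal standard)."""
    if reported == 0:
        return None  # ratio undefined against a zero denominator
    return recounted / reported

def timeliness_rate(reports):
    """Share of reports submitted on or before their due date.
    Each report is a dict with 'submitted' and 'due' date values."""
    if not reports:
        return None
    on_time = sum(1 for r in reports if r["submitted"] <= r["due"])
    return on_time / len(reports)

def completeness_rate(records, required_fields):
    """Share of source records in which every required field is filled."""
    if not records:
        return None
    complete = sum(1 for rec in records
                   if all(rec.get(f) not in (None, "") for f in required_fields))
    return complete / len(records)
```

In practice these would be computed per indicator and per site on the recounting worksheet, then rolled up into the findings summary.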
**4. Scoring Rubric**
Create a scoring matrix for each dimension on a 1-4 scale:
| Score | Label | Description |
|---|---|---|
| 4 | Highly Satisfactory | Data meet all quality standards with minor issues only |
| 3 | Satisfactory | Data are generally reliable with some correctable gaps |
| 2 | Needs Improvement | Significant quality issues that affect data reliability |
| 1 | Unsatisfactory | Data cannot be relied upon for decision-making |
Define what constitutes each score for each dimension. Provide an overall composite scoring method with weighting.
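One way the composite scoring might work is a weighted average of per-dimension scores, mapped back to the rubric labels. The equal default weights and the label cutoffs (3.5 / 2.5 / 1.5) below are assumptions for illustration; the protocol itself should define the program-specific weighting.

```python
def composite_score(dimension_scores, weights=None):
    """Weighted average of 1-4 dimension scores, mapped to a rubric label.
    Equal weights by default; pass a weights dict to prioritise dimensions
    (e.g. weight Validity higher for outcome indicators)."""
    dims = list(dimension_scores)
    if weights is None:
        weights = {d: 1.0 for d in dims}
    total_w = sum(weights[d] for d in dims)
    score = sum(dimension_scores[d] * weights[d] for d in dims) / total_w
    # Cutoffs are illustrative assumptions, not a donor standard.
    bands = [(3.5, "Highly Satisfactory"), (2.5, "Satisfactory"),
             (1.5, "Needs Improvement"), (0.0, "Unsatisfactory")]
    label = next(name for cutoff, name in bands if score >= cutoff)
    return round(score, 2), label
```

For example, scores of 4, 3, 3, 2, 4, 4 across the six dimensions yield a composite of 3.33, i.e. "Satisfactory" under these cutoffs.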
**5. Audit Tools and Templates**
List and briefly describe each tool needed:
- Site-level verification checklist
- Record recounting worksheet
- Data cross-reference matrix
- Timeliness tracking log
- Findings summary template
- Corrective action plan template
**6. Corrective Action Framework**
For each score level (1-4), specify:
- Required corrective actions
- Responsible party
- Timeline for remediation
- Follow-up verification date
- Escalation protocol if issues persist
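The framework above is essentially a lookup keyed by rubric score. A sketch of that shape, where every action, responsible role, and timeline is an illustrative assumption to be replaced with the program's own policy:

```python
# Corrective-action lookup keyed by rubric score (1-4). All values below
# are illustrative placeholders, not donor policy.
CORRECTIVE_ACTIONS = {
    4: {"action": "Document good practice; no remediation required",
        "responsible": "Site M&E focal point",
        "remediate_days": None, "follow_up_days": None},
    3: {"action": "Address noted gaps through routine supervision",
        "responsible": "Site M&E focal point",
        "remediate_days": 60, "follow_up_days": 90},
    2: {"action": "Written corrective action plan with targeted retraining",
        "responsible": "District M&E officer",
        "remediate_days": 30, "follow_up_days": 60},
    1: {"action": "Freeze affected indicators pending full re-verification",
        "responsible": "Program MEAL lead",
        "remediate_days": 14, "follow_up_days": 30},
}

def corrective_action(score):
    """Return the planned response for a rubric score of 1-4."""
    return CORRECTIVE_ACTIONS[score]
```

Encoding the framework as data rather than prose makes it easy to generate the corrective action plan template directly from audit scores.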
**7. Audit Schedule and Team**
- Recommended audit frequency (quarterly for high-risk indicators, semi-annually for routine indicators)
- Team composition and roles
- Estimated level of effort per site (person-days)
- Budget considerations
Align with USAID DQA guidelines, Global Fund LFA verification standards, and MEASURE Evaluation data quality frameworks.
data-quality, audit, verification, data-management, quality-assurance