Data Quality Assessments

Data Quality Assessment & Audit Services:

  • Indicator development
  • Performance Management Plans
  • Performance Appraisals
  • Peer Reviews
  • Monitoring & Evaluation Guidelines
  • Monitoring & Evaluation Matrices and Worksheets
  • Raw data reviews
  • Data standardization
  • Data harmonization
  • Data comparison reviews
  • Building DQA into Monitoring & Evaluation Plans
  • DQA implementation and reporting
  • Data verification
  • Practical tips for conducting DQAs


Data quality assessments can be tailored to donor and organizational needs so that decision-makers have trustworthy data for program design, improvements, and funding allocations.


For example, USAID requires a Data Quality Assessment (DQA) for each project or activity indicator documented in its Performance Plan and Report (PPR) to determine the strengths and vulnerabilities of the data from various perspectives, such as:


  • Indicator development, integrity, and strength
  • Data collection standardization
  • Data validity, completeness, consistency, accuracy, reliability, timeliness, precision, and integrity
  • Outcome reporting that compiles data in a presentable, understandable, and readable format for accurate interpretation
  • Verifiable determination of missing data ("do not know," "did not answer," "refused to answer," etc.), as sketched after this list
  • Clear and transparent documentation of data challenges and limitations

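As an illustration, a simple check like the one sketched below can separate explicitly coded non-responses from truly blank cells during a raw data review; the file name responses.csv and the non-response codes are illustrative assumptions, not part of any particular dataset or donor requirement.

```python
# Minimal sketch (illustrative only): tally coded non-responses vs. blank
# cells per column of a survey extract. The file name and codes are assumed.
import pandas as pd

NON_RESPONSE_CODES = {"do not know", "did not answer", "refused to answer"}

df = pd.read_csv("responses.csv")  # hypothetical raw data extract

for column in df.columns:
    values = df[column].astype(str).str.strip().str.lower()
    coded = int(values.isin(NON_RESPONSE_CODES).sum())  # explicit non-response codes
    blank = int(df[column].isna().sum())                 # cells left empty altogether
    print(f"{column}: {coded} coded non-responses, {blank} blank cells")
```

Separating the two matters because coded non-responses are verifiable (the respondent was reached but declined or did not know), whereas blank cells may indicate gaps in collection or entry.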

Some common terminology includes:


  • Validity: Do data clearly and directly measure what is intended?
  • Reliability: Can the same results be obtained repeatedly using the same measurement procedures?
  • Timeliness: Are data sufficiently current and available frequently enough to inform decisions?
  • Precision: What is the acceptable margin of error for the appropriate level of management decision-making?
  • Integrity: Are mechanisms in place to reduce the possibility that data are manipulated?

No data collection is perfect, but limitations need to be clearly documented to support performance planning and assessments, decision-making, and reporting to the donor government, to host governments, and to the public.
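
For illustration only, the sketch below shows how a few of these dimensions (completeness, timeliness, and duplicate records as a basic integrity flag) might be spot-checked programmatically during a DQA; the column names, file name, and 90-day timeliness threshold are assumptions for the example, not requirements of any donor.

```python
# Minimal sketch (illustrative only): spot-check a few data quality dimensions
# on an indicator dataset with assumed "report_date" and "value" columns.
from datetime import datetime, timedelta

import pandas as pd

def quality_flags(df: pd.DataFrame) -> dict:
    completeness = 1 - df["value"].isna().mean()                 # share of non-missing values
    latest = pd.to_datetime(df["report_date"]).max()             # most recent reporting date
    timely = (datetime.today() - latest) <= timedelta(days=90)   # reported within ~a quarter
    duplicates = int(df.duplicated().sum())                      # duplicate rows as an integrity flag
    return {
        "completeness": round(float(completeness), 2),
        "timely": bool(timely),
        "duplicate_rows": duplicates,
    }

# Hypothetical usage on an indicator extract:
# print(quality_flags(pd.read_csv("indicator_data.csv", parse_dates=["report_date"])))
```

Checks like these do not replace a DQA, but they make it easier to document data challenges and limitations in a clear, transparent way.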


Conducted five DQAs: four for USAID in Afghanistan, Iraq, Pakistan, and Georgia, and one for MCC in Morocco, to determine how well M&E and EMIS systems capture, collect, and report on results. This involved a comprehensive review of indicators, data sources, data collation methods, interpretation of terminology, and the frequency, relevance, reliability, and stability of the data. The audits gave USAID staff (and host governments) useful information for reporting, identified improvements to data collection to ensure accurate reporting, and flagged unsubstantiated data, so that anomalies in M&E systems could be addressed.