Table 3: Descriptions of various quality management strategies
Strategy: Description
Protocol review and approval: Includes stipulating how you will protect the rights and welfare of research participants. Protocols may also be established to ensure consistency and diligence in data collection procedures (standardized instruments, consistent interview protocols), as well as checklists and established protocols to ensure consistency and rigour of data analysis across sites and among researchers.
Standard operating procedures: A principal investigator must put protocols in place to establish rigour and consistency between and among researchers and research sites. This may include standardized data collection procedures (established through a protocol or checklist); standardized instruments and interview protocols used across sites and by all researchers; regular checks that procedures are diligently followed; and training sessions for researchers and research assistants.
Validation of research instruments: Indicate whether research instruments are standardized and whether previous studies and reports have shown them to have strong reliability and validity (with respect to content, criterion and construct validity).
Project team training: Adequate training is essential to research subject/participant safety, protocol implementation, and quality assurance and improvement. Researchers and assistants must be trained in data collection procedures both to protect participants and to ensure consistency and rigour between and across sites.
Quality control and monitoring: Quality control is important to ensure reliable and consistent findings. What procedures will be incorporated into the research design to ensure that consistent data collection methods are implemented across research sites and among different researchers? The proposed methodology should help investigators identify data quality problems that can be corrected while data are still being collected, and also identify biases in data collection that can be adjusted for later.
Evaluation of services provided by the project: Monitoring and evaluation of service provision is essential for analyzing and, where possible, improving the effectiveness of service regimes. Establish 'critical limits' to measure the effectiveness and quality of the services provided to participants/clients/patients. Establish appropriate record-keeping and documentation systems. Make regular site visits to monitor progress and assess impact. Establish corrective actions. Evaluate, with relevant health care workers, achievements made and lessons learnt, and apply any lessons learnt to existing and new arrangements.
Evaluation of service provider performance: Generating and using information on the performance of service providers can substantially enhance transparency and accountability, which in turn fosters adherence to higher quality standards in service delivery. Assessment tools rely on external experts measuring quality and performance against a pre-determined set of indicators. Participatory monitoring and evaluation tools seek to engage service users beyond the provision of feedback, so that they also take an active role in planning and implementing the assessment; this helps build the capacity of the local community to analyze, reflect and take action. Community scorecards envisage the active involvement of the group and allow participants themselves to identify indicators of quality and performance.
Review of reports: Reports should be drafted and shared in a timely manner so that all researchers and appropriate stakeholders have sufficient opportunity to read, respond to, edit, revise, and provide input into relevant reports. Different review platforms will require different formats (e.g. PowerPoint presentations and/or narratives).