SAMJ: South African Medical Journal

On-line version ISSN 2078-5135
Print version ISSN 0256-9574

SAMJ, S. Afr. med. j. vol.98 n.7 Pretoria Jul. 2008

 

ORIGINAL ARTICLES

 

An evaluation of the District Health Information System in rural South Africa

 

 

A Garrib (I); K Herbst (II); L Dlamini (III); A McKenzie (IV); N Stoops (V); T Govender (VI); J Rohde (VII)

(I) MB ChB, DFM, MSc, FCPHM (SA); Africa Centre for Health and Population Studies, Somkhele, University of KwaZulu-Natal
(II) MB ChB, MSc, DGA, FFCH (SA); Africa Centre for Health and Population Studies, Somkhele, University of KwaZulu-Natal
(III) Africa Centre for Health and Population Studies, Somkhele, University of KwaZulu-Natal
(IV) BSc, MB ChB, MPH, DPHCE; Health Information Systems Programme, Beacon Bay, E Cape
(V) BA, MPH; Health Information Systems Programme, Beacon Bay, E Cape
(VI) MB ChB, FCPHM (SA), DOH, DHSM, DCH; KwaZulu-Natal Provincial Department of Health, Pietermaritzburg, KwaZulu-Natal
(VII) MD, FACPM, FAAP; Management Sciences for Health, East London, E Cape


 

 


ABSTRACT

BACKGROUND: Since reliable health information is essential for the planning and management of health services, we investigated the functioning of the District Health Information System (DHIS) in 10 rural clinics.
DESIGN AND SUBJECTS: Semi-structured key informant interviews were conducted with clinic managers, supervisors and district information staff. Data collected over a 12-month period for each clinic were assessed for missing data, data out of minimum and maximum ranges, and validation rule violations.
SETTING: Our investigation was part of a larger study on improving information systems for primary care in rural KwaZulu-Natal.
OUTCOMES: We assessed data quality, utilisation of data for facility management, perceptions of work burden, and the usefulness of the system to clinic staff.
RESULTS: A high perceived work burden associated with data collection and collation was found. Some data collation tools were not used as intended. There was good understanding of the data collection and collation process but little analysis, interpretation or utilisation of data. Feedback to clinics occurred rarely. In the 10 clinics, 2.5% of data values were missing, and 25% of data were outside expected ranges without an explanation provided.
CONCLUSIONS: The culture of information use essential for an information system to have an impact at the local level is weak in these clinics and at the sub-district level. Further training and support are required for the DHIS to function as intended.


 

 

There is increasing demand for health information to inform policies, priority setting, resource allocation, monitoring of the impact of health programmes, and progress towards goals.1,2 Managing a health system requires various types of information from a variety of sources. Data collection within the health system includes disease surveillance, facility surveys, and routine reporting of health service statistics.1

Health information systems (HISs) aim to ensure the appropriate and effective use of resources to improve health service performance and the health of the community. These systems therefore collect, analyse and convert data into information that can be used to determine health system actions.3 Such data must be reliable, accurate and timely. However, few developing countries are able to implement such procedures effectively.2,4

HISs are rarely evaluated, in either developed or developing countries, despite the large resources allocated to them.5 There has been little evaluation of primary care information systems, particularly in developing countries.6 Evaluation is fundamental to ensuring that information systems are efficient, collect high-quality relevant information, and are used by caregivers, managers and policy makers.7 As larger amounts of money are spent on HISs, the emphasis on cost-effectiveness in health care creates new pressure to evaluate their impact and to determine whether they are achieving their putative benefits and justifying their costs.

 

Methods

The District Health Information System (DHIS)

The DHIS, developed to collect aggregated routine data from all public health facilities in a country, is intended to support decentralised decision making and health service management.8 Introduced in South Africa in 1996, it was extended to the entire country by 2001. It is used in several other developing countries in Africa and Asia.9 The DHIS allows health care workers to analyse their levels of service provision, predict service needs, and assess performance in meeting health service targets.10

Improvements in the completeness and quality of data collected through the DHIS have been reported. However, problems have also been reported: delays in data submission due to non-delivery of forms, poor understanding of indicators, unreliable data quality, failure of facility managers to maintain data summaries, and poor feedback. Importantly, there was little indication that managers were using the information for facility-level decision making.11 The system has not been systematically evaluated to assess its impact on health service delivery.

The DHIS consists broadly of two parts: data collection, collation and analysis, which occur in the facility; and the DHIS software, which processes the data.

Data are collected routinely on all services provided by a facility, as well as periodically on infrastructure and human resources as part of clinic surveys.11 They are collected by means of a paper-based system of registers, tally sheets and monthly data collation forms. The collated data are sent monthly to the sub-district or district level, where they are entered into a computer using the DHIS software and analysed, and a report is submitted to district, provincial and national health departments.8 This separation of data generation from data entry into the DHIS software is important for the application of data validation and for data analysis and utilisation at the clinic. Regular feedback should be provided to facility staff by supervisors to assist in data interpretation and use, and clinic staff are encouraged to discuss their collected data and to graph and display selected indicators.
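
To make the collation step concrete, the following is a minimal sketch, in Python, of how daily tally-sheet counts roll up into the monthly totals reported on the collation form. The element names and figures are hypothetical (the real essential dataset elements are defined nationally), and the actual process described above is paper based.

```python
from collections import Counter

# Hypothetical tally-sheet counts, one dict per clinic day; real essential
# dataset element names differ and are defined nationally.
daily_tallies = [
    {"headcount_under5": 14, "immunisation_measles": 3, "anc_first_visit": 2},
    {"headcount_under5": 9, "immunisation_measles": 1, "anc_first_visit": 4},
    # ... one entry per working day in the month
]

def collate_month(tallies):
    """Sum daily tally-sheet counts into monthly collation-form totals."""
    totals = Counter()
    for day in tallies:
        totals.update(day)
    return dict(totals)

# These totals are what would be transcribed onto the monthly submission form
# and later captured into the DHIS software at sub-district level.
print(collate_month(daily_tallies))
```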

Data quality is addressed through mechanisms incorporated into the data collection process and through functions within the DHIS software. These include checking of the data for inaccuracies by clinic managers and supervisors, minimum and maximum expected values for each data element collected, and validation rules applied in the DHIS software. The minimum and maximum values for a data element are calculated on the basis of previous experience at the facility, and any outliers should be acknowledged and explained by the clinic staff. Validation rules detect impossible or improbable relationships between data elements; expert or statistical validation rules encode a statistical relationship between two data elements. These mechanisms help to identify inconsistencies and data errors, and highlight areas where clinical problems may be occurring.12
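
As an illustration, the sketch below shows the two kinds of check in simplified form: facility-specific min/max ranges and an absolute validation rule. The ranges, rule and element names are invented for the example; the actual checks are configured within the DHIS software from each facility's reporting history.

```python
# Hypothetical facility-specific expected ranges; in the DHIS these are
# derived from each facility's previous reporting experience.
EXPECTED_RANGES = {"anc_first_visit": (0, 40), "deliveries": (0, 25)}

def out_of_range(month_values):
    """Return the data elements falling outside their expected min/max range."""
    flagged = []
    for name, value in month_values.items():
        lo, hi = EXPECTED_RANGES.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flagged.append(name)
    return flagged

def absolute_rule_ok(month_values):
    """An absolute rule: first antenatal visits cannot exceed total ANC visits."""
    return month_values["anc_first_visit"] <= month_values["anc_total_visits"]

month = {"anc_first_visit": 55, "anc_total_visits": 40, "deliveries": 12}
print(out_of_range(month))      # ['anc_first_visit'] - outlier needing an explanation
print(absolute_rule_ok(month))  # False - a validation rule violation
```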

 

Study site

We evaluated the implementation of the DHIS in 10 primary health care clinics in rural northern KwaZulu-Natal, within a health sub-district that had 15 fixed clinics and several mobile service points. The clinics were a convenience sample comprising 6 intended intervention clinics for the larger study and 4 chosen at random.

The evaluation was designed around the information cycle framework (Fig. 1) and was structured to assess how well each step within the cycle was working. Interviews were conducted with key informants in each clinic, clinic supervisors, district information officers, and other primary health care and district management staff. Interviewees were also shown graphs of data collected by their facilities and asked to explain and interpret them, to assess their understanding of the data collected and of the indicators calculated from them. Additional information and feedback based on their own indicators were provided to interviewees as a training exercise.

[Fig. 1. The information cycle framework]

Raw data extracted from the DHIS software for each clinic over a 12-month period were analysed for correctness, completeness and consistency, to assess the quality of the data collected. This was done separately from the interviews to provide a more objective measure of data quality. Ethical approval for the larger study, of which this study forms part, was given by the Research Ethics Committee of the Nelson R Mandela School of Medicine, University of KwaZulu-Natal.
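
A simplified sketch of the completeness check follows; the data element names and values are fabricated for illustration, with None standing in for a value that should have been reported but was not.

```python
# Twelve monthly values per data element for one clinic; None marks a value
# that should have been reported but was not (names and figures invented).
clinic_year = {
    "immunisation_measles": [3, 5, None, 4, 2, 6, 3, None, 5, 4, 3, 2],
    "anc_first_visit":      [8, 7, 9, None, 6, 8, 7, 9, 8, 6, 7, 8],
}

def percent_missing(data):
    """Share of expected data values that were never reported."""
    values = [v for series in data.values() for v in series]
    return 100 * sum(v is None for v in values) / len(values)

print(f"{percent_missing(clinic_year):.1f}% of expected values missing")  # 12.5%
```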

 

Results

The DHIS had been implemented in all 10 clinics, and the supporting organisational infrastructure was in place. The district had developed its own mission and vision statement for information, dedicated information staff had been appointed, and there was a clearly described pathway for information flow, although this was out of date.

Data collection and collation

Each clinic reported a high perceived work burden for data collection and collation. Data were collected by health care workers during each consultation, using paper-based record systems, and later collated. Duplication of data collection was found in all clinics: several separate registers existed for collecting data on patients with chronic illnesses, tuberculosis care, HIV care (including PMTCT), and immunisation, and the format and availability of these registers differed from clinic to clinic. The tools (such as tally sheets) supplied to assist with data collation were poorly used in every clinic, because of poor design or lack of time. Seven of the 10 clinics reported that collation took 1 staff member about 2 days per month; in the other 3 it took between half a day and 1 day. Some of this work had to be done after hours. Only 1 clinic had a clerk.

The data that the clinics were required to collect were based on a national essential dataset, with additional data added by the district. An updated monthly data submission form had been introduced that contained several duplicated data elements. A manual containing definitions of the data elements was provided, but it made no mention of indicators to be calculated at facility level or of how to calculate and interpret them. Clinic supervisors conducted training on the use of the form, although usually with only 1 staff member per clinic. For the 12-month period under study, all facilities had submitted data monthly and had copies available in the clinic. There was no computerisation of data collection and no facility for electronic submission of data in any clinic.

Data processing - validation and analysis

In each clinic, data validation was limited to ensuring that the data submitted were complete, with only occasional checks that they were correct. Clinic staff and supervisors reported that even when the data did not look correct, they were rarely checked, owing to lack of time. Little analysis of data occurred at the clinic or by clinic supervisors, and data were neither discussed nor analysed in staff meetings. When clinic staff were shown graphs of indicators calculated from their facility data, none could calculate the indicators presented, although most were able to interpret the graphs.

Data presentation

Graphical displays of data were seen in 8 of the clinics. However, these were usually of raw data rather than indicators, and in all but one were out of date.

Data use

The data were occasionally used to inform health education sessions run at the clinics, and as a reflection of the clinics' work burden. There was, however, little understanding of the usefulness of the data or of their applicability to facility or programme management. Several clinics had developed operational plans, but clinic data were not used to set targets or to monitor the plans.

Data feedback

There was no feedback on the data from the district office to clinic supervisors, or from clinic supervisors to clinic staff. Most staff were not aware of their clinic's performance in relation to national targets or to other clinics.

Analysis of clinic data

Data from each of the clinics visited were extracted from the DHIS software for the 12-month period January to December 2004. Of the 81 data elements routinely collected during this period, 11 were discontinued from August 2004. Of the total data values that should have been collected, a mean of 2.5% were missing (range 0.2 - 6.5% per clinic). For 75% of these missing values, a comment had been inserted noting that the data were missing, but no reason was given. The comments appeared to have been inserted at the point of entry of the data into the DHIS software rather than at the clinic.

Twenty-five per cent of data were outside the minimum and maximum values specified for the facilities. No explanations were offered, and any comments appeared to have been added at the point of entry of the data into the DHIS software.

There were 37 (0.4%) validation rule violations, involving 5 of the 18 absolute validation rules and 4 of the 7 expert validation rules. Corrections had been made to a few data elements that had been poorly interpreted, where obvious errors had occurred, or where errors had been highlighted by a validation rule. In total, 35 data values (0.4%) were changed over the 12-month period. Most changes (23/35; 65%) occurred in 1 data element for which a statistical validation rule existed. Comments were included in the programme when such changes were made.

 

Discussion

We assessed the implementation of the DHIS (South Africa's public sector health facility information collection system) in a rural area of KwaZulu-Natal Province. In South Africa's district-based system of health care delivery, primary health care has been provided free at primary care clinics since 1996. The DHIS collects crucial health service delivery data at this level, and supports decentralised management of health services by enabling district and facility managers to make decisions about their service delivery based on local data. The DHIS was found to have strong district management support and was well integrated into the clinic routine. There were encouraging gains in information collection at the primary care level; standard data items were collected with generally high reliability and timeliness. However, data quality was poor, and staff were unable to make effective use of the data.

Despite training on the DHIS in the area, health care workers and managers were not putting the data collected to best use. This is not unique to South Africa, and has been described as a culture of reporting rather than a culture of using the information.13 There is little tradition of information use for decision making at the facility level in most developing countries, even among health managers.4,10,14,15 Several factors contribute to this lack of data utilisation: poor skills transfer within clinics due to high staff turnover; poor communication of new knowledge within facilities; lack of understanding of indicators; lack of feedback to clinics; lack of access to the denominator data needed for calculating indicators; and poor numeracy skills among health care workers and managers.10 Renewed efforts are required to ensure that data are immediately transformed into standard indicators and used to make rational decisions about service delivery and quality of primary care. Clinic staff should be encouraged to routinely calculate a few key indicators from their data and to monitor their performance against targets related to those indicators.
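
To illustrate what such a routine calculation involves, here is a minimal sketch of one coverage-style indicator. The names and figures are invented; in practice the numerator comes from the clinic's monthly returns, while the denominator requires the population estimates that clinics often lack.

```python
def coverage(events_counted, eligible_population):
    """A coverage indicator: service outputs as a % of the eligible population."""
    return 100 * events_counted / eligible_population

# Invented figures: measles doses summed from 12 monthly returns, and the
# estimated number of eligible infants in the catchment area (denominator).
measles_doses_given = 410
eligible_infants = 520
print(f"Measles coverage: {coverage(measles_doses_given, eligible_infants):.0f}%")  # 79%
```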

There was a severe shortage of the health informatics skills needed to provide the necessary support, feedback and training in information utilisation. There was almost no feedback to clinic staff on the data they submitted. Feedback is a form of training: it directly addresses the causes of poor-quality data and enhances awareness of the importance of data. One factor in the absence of feedback was a lack of human resources, with only 2 dedicated information personnel in the sub-district. As the main providers of feedback, clinic supervisors should be trained in the interpretation and use of clinic data, focusing on practical indicators of performance. If these indicators were discussed monthly with clinic staff, they could drive improved service provision.

The information requirements demanded independently by some health programmes have resulted in duplication of data collection, which adds to the work burden and negatively affects data quality. Duplication can be avoided through the essential dataset concept, whereby the information requirements of many programmes, including vertical and donor-funded programmes, are integrated into one set of routinely reported data.2 Essential datasets should be updated regularly to ensure that the collected information remains relevant and useful to managers. The duplication that was found reflects a lack of scrutiny of the essential dataset by authorities at the district level; this problem in information system design can be avoided by thoughtful review of form design. High-level support is also needed for a single integrated health management information system that does not allow data requirements to bypass the essential dataset. Any data item collected should be linked to an indicator for which there is a clear and actionable response.

The high work burden reported for data collection and collation suggests that a large amount of scarce health care worker time was devoted to these tasks. Information-related duties were therefore often allocated to junior staff members who might not have the skills or insight to recognise and correct problems, and who lacked the authority to take the necessary actions. A dedicated information clerk in each clinic, with responsibility for data collection and validation, would improve data quality and free clinical staff to discuss, interpret and act on the improved data, properly presented as indicators.

Computerisation of data collection, analysis and data transfer is often offered as the answer to health information problems, and in many public health care facilities (particularly hospitals) data collection and management are already computerised. While computerisation could reduce the burden of data collation and make data more accessible and easier to analyse, data utilisation will not improve if staff lack skills in analysis and interpretation and do not understand how and why data should be used. Computerisation is also resource intensive, requiring significant infrastructure investment and training. However, recent reductions in cost and the development of newer technologies that allow passive recording of activity have created new opportunities for its use in rural settings.

The clinics included in this study are typical of rural clinics in the area, and probably of the province, although perhaps not of urban clinics. The consistency between our results and those of other reports on the DHIS also supports the external validity of the results.11 This study of how well the DHIS works within primary care clinics cannot indicate what happens at district level or higher.

 

References

1. Rommelmann V, Setel PW, Hemed Y, et al. Cost and results of information systems for health and poverty indicators in the United Republic of Tanzania. Bull World Health Organ 2005; 83(8): 569-577.

2. Shaw V. Health information system reform in South Africa: developing an essential data set. Bull World Health Organ 2005; 83(8): 632-636.

3. Lippeveld T. National and subnational health information systems. In: RHINO Conference, Potomac, Maryland, USA, 2001. Geneva: World Health Organization, Health Metrics Network, 2001.

4. Odhiambo-Otieno GW. Evaluation of existing district health management information systems: a case study of the district health systems in Kenya. Int J Med Inform 2005; 74(9): 733-744.

5. Herbst K, Littlejohns P, Rawlinson J, et al. Evaluating computerized health information systems: hardware, software and humanware: experiences from the Northern Province, South Africa. J Public Health Med 1999; 21(3): 305-310.

6. Mitchell E, Sullivan F. A descriptive feast but an evaluative famine: systematic review of published articles on primary care computing during 1980-97. BMJ 2001; 322(7281): 279-282.

7. Hanmer L. Criteria for the evaluation of district health information systems. Int J Med Inform 1999; 56(1-3): 161-168.

8. Heywood A, Rohde J. Using information for action. A manual for health workers at facility level. Arcadia, Pretoria: The Equity Project, 2001.

9. Health Information Systems Programme. East London: HISP SA. www.hisp.org (accessed 25 June 2007).

10. Williamson L, Stoops N. Using information for health. In: Ijumba P (ed). South African Health Review 2001. Durban: Health Systems Trust, 2001: 101-116.

11. Day C, Hedberg C. Health indicators. In: Ijumba P, Day C, Ntuli A (eds). South African Health Review 2003/2004. Durban: Health Systems Trust, 2004.

12. Clinic Supervisors Manual. Pretoria: Department of Health, 2003.

13. Byskov J, Olsen OE. The data set must focus on service quality. Bull World Health Organ 2005; 83(8): 639.

14. AbouZahr C, Boerma T. Health information systems: the foundations of public health. Bull World Health Organ 2005; 83(8): 578-583.

15. Cibulskis RE. Information is not only for managers. Bull World Health Organ 2005; 83(8): 637.

 

 

Correspondence:
A Garrib
(anu_g@yahoo.com)

Accepted 29 October 2007
