Water SA

On-line version ISSN 1816-7950
Print version ISSN 0378-4738

Water SA vol.41 no.2 Pretoria  2015

http://dx.doi.org/10.4314/wsa.v41i2.01 

How well do our measurements measure up? An overview of South Africa's first proficiency testing scheme for organochlorine pesticides in water

 

 

M Fernandes-Whaley*; D Prevoo-Franzsen; L Quinn; N Nhlapo

National Metrology Institute of South Africa (NMISA), Organic Analysis Section, Lynnwood Ridge x 34, Pretoria, 0040, South Africa

 

 


ABSTRACT

Access to safe drinking water is a basic human right in South Africa. The accurate measurement of water quality is therefore critical in ensuring the safety of water prior to its intended use. Proficiency testing schemes (PTSs) are a recognised means of assessing the technical competence of laboratories performing these analyses. There are over 200 water testing laboratories in South Africa, of which only 51 are accredited for testing some or all of the parameters (physical, chemical and microbiological content) prescribed in SANS 241. Only a limited number of laboratories test for organic contaminants, as this requires advanced, costly analytical instrumentation, such as GC-FID/ECD/MS and LC-UV/MS, as well as skilled staff. These laboratories either analyse selected organic contaminants listed in the World Health Organisation (WHO) drinking water guidelines or perform the minimum tests stipulated in SANS 241, for phenols, atrazine, trihalomethanes and total dissolved organic content. Whereas several local PTS providers address the assessment of competence in microbiological, physical and inorganic chemical testing of water, a clear need for a South African PTS provider for organic contaminant analysis in water was identified by NMISA (National Metrology Institute of South Africa) in 2012. The key drivers for the coordination of a local PTS stem mainly from the limited stability of the analytes in the samples for analysis and the high cost and logistics of international PTS participation. During 2012 and 2013, NMISA conducted a PTS trial round, a workshop and 2 additional PTS rounds for organochlorine pesticides in water, for South African laboratories as well as several international participants from other African countries. This paper highlights some of the challenges faced by laboratories when analysing organochlorine pesticides at the ng/ℓ concentration level. Issues surrounding the comparability of measurement results, traceability, method validation and measurement uncertainty are also discussed.

Keywords: Proficiency testing schemes, PTS, organochlorine pesticides, drinking water


 

 

INTRODUCTION

According to the South African constitution, South Africans have the right to an environment that is not harmful to their health or well-being. Organic contaminants are recognised as toxic substances that negatively impact the environment as well as human health (Cane, 2006). This group of chemicals includes persistent organic pollutants (POPs), such as chlorinated pesticides, dioxins, halogenated flame retardants and polyaromatic hydrocarbons (PAHs). Organic contaminants are found in almost all environmental compartments due to their widespread use and formation during many anthropogenic activities. Sources known to affect the wastewater systems (through runoff as well as treated and untreated wastewaters enriching natural water resources) include industrial and agricultural activities, and sewage. These wastes contain personal care products and pharmaceuticals, which are major contributors to the burden of organic contaminants in water. Lifelong exposure to organic contaminants such as organochlorine pesticides and PAHs is associated with a myriad of negative health effects. These chemicals have been found in South African water systems (Das, 2008; Nieuwoudt, 2011; Moja, 2013). Therefore, research and monitoring of environmental toxicants in South African waters is essential.

The list of potentially hazardous chemicals is growing, and stricter legislation and environmental programmes are being initiated globally. Steps taken include regulations such as REACH, South African initiatives such as the direct estimation of the ecological effect potential (DEEEP), and global initiatives such as the Stockholm Convention. The routine monitoring of pesticides and other harmful organic contaminants/pollutants in drinking, natural and treated waters will soon be strictly regulated in South Africa. Furthermore, the quality of water in the environment directly impacts the quality, and consequently the safety, of food as well.

The implementation of a good quality assurance (QA) and quality control (QC) measurement system is required to ensure the comparability of measurement data over time. ISO 17025, the standard for the competence of testing laboratories (ISO/IEC17025, 2005), is an internationally recognised system fostering the international acceptance of measurement data.

ISO 17025 incorporates management and technical requirements. Technical requirements specify staff competencies, method validation, measurement traceability and estimation of measurement uncertainty. A key requirement for a laboratory to obtain accreditation is the ability to demonstrate the continued competency of the measurement procedure and of the staff performing the measurements, through participation in proficiency testing schemes (PTSs). However, laboratories struggle to obtain accreditation due to PTS costs or the lack of appropriate PTSs addressing their specific analytical needs. To date, only a limited number of PTSs are organised and coordinated in South Africa for local laboratories and for those in the African region.

The South African national standard for drinking water provides the specifications for water that is safe for consumption over a lifetime (SANS241-1, 2011; SANS241-2, 2011). As a consequence, the minimum testing requirements for microbiological, physical and inorganic contaminant testing to ensure basic water quality, as prescribed in SANS 241, are typically the predominant measurements performed by water testing laboratories.

The serious consequences of microbial water contamination (diarrhoea, and viral and bacterial infections and diseases) make microbiological testing and controls the most critical requirement that must be met to ensure water is safe for use. In addition, the presence of inorganic analytes, e.g. excess fluoride, can affect the dental health of the population (WHO, 2011). The consequences of organic contaminants in water have been more difficult to assess, as the harmful effects are usually the result of long-term exposure to a combination of man-made chemicals used in agriculture, manufacturing, incineration, and the pharmaceutical industries (WHO, 2011).

A review of the South African National Accreditation System (SANAS) directory for ISO 17025-accredited facilities (SANAS, 2012) in South Africa yields the summary depicted in Fig. 1 for the distribution of water quality testing of the various major parameters. The measurement parameters are detailed in Table 1. The bulk of the measurements performed are microbiological (24.8%), inorganic (28.8%) and physical (28.8%).

 

 

The determination of organic contaminants has not received the same sense of urgency (London et al., 2005); currently SANS 241 recommends the analysis of total (dissolved) organic content, phenols, trihalomethanes, odour volatiles such as geosmin, 2-methylisoborneol (MIB) and methyl isobutyl ketone (MIBK), and only a single pesticide, atrazine. Only 9.6% of accredited laboratories provide this service, with an even smaller portion (8%) of laboratories analysing organic contaminants such as pesticides and benzene, toluene, ethylbenzene and xylenes (BTEX).

There are over 200 water-testing laboratories in South Africa (Balfour et al., 2011), with only 51 being accredited for testing some or all parameters prescribed by SANS 241. A limited number of laboratories test for organic contaminants, as these analyses require advanced and costly analytical instrumentation, such as GC-FID, GC-ECD, GC-MS, LC-UV, LC-FLD and LC-MS, as well as skilled staff (London et al., 2005).

Of these 51 laboratories, some test water quality as it directly impacts on the quality of manufactured products for human consumption. These measurements are typically performed in the beverage and canning industries.

Internationally, a strategy for dealing with pollution of water from chemicals is set out in Article 16 of the European Union Water Framework Directive 2000/60/EC (EU WFD) (Lepoma, 2009). The World Health Organisation (WHO) provides drinking water guidelines (WHO, 2011), and the United States Environmental Protection Agency provides guidance levels for organic contaminants in water based on risk assessment of aquatic ecotoxicity, human toxicity and environmental contamination data. The EU WFD priority list currently includes organic compounds such as pesticides, polycyclic aromatic hydrocarbons/polyaromatic hydrocarbons (PAHs), benzene, halogenated hydrocarbons (solvents), flame retardants, a plasticiser, surfactants and antifouling agents, as well as some heavy metals. The aim is to reduce the occurrence of these pollutants and terminate the use of some persistent organic pollutants that bio-accumulate in the environment.

Grey boxes (Table 1) indicate the minimum required parameters for domestic use; additional parameters are required for ground and wastewaters. The last column lists organic contaminants not prescribed in SANS 241, but which occur as a subset in the South African Department of Water Affairs (DWA) guidelines for aquatic ecosystems.

In the development of South African drinking water standards, based on WHO guidelines, the environmental, social, cultural, economic, dietary and other conditions affecting potential exposure must be taken into account (WHO, 2011).

For example, SA still uses DDT for malaria control in malaria endemic regions, potentially resulting in significantly higher levels present in natural waters, compared to those found in Europe or the United States. Monitoring programmes have been implemented by the Department of Water Affairs (DWA) in South Africa to determine, amongst others, baseline levels of organic contaminants of concern in SA waters (London et al., 2005).

A significant amount of competent testing by DWA, and by the water testing laboratories that DWA outsources to, is required to obtain meaningful data. Therefore, PTSs that focus on South African organic contaminants of concern could prove valuable.

As custodians of water quality in South Africa, DWA will be implementing a system that will see reference laboratories required to be ISO 17025-accredited, with smaller water quality testing laboratories being registered with DWA and regularly audited by DWA, specifically for analytical competence based on ISO 17025 guidelines (Balfour et al., 2011). The smaller testing laboratories, typically further removed from the commercial centres of SA, will also be required to participate in PTSs at least 3 times a year, with problems identified through PTS participation to be reported to DWA. Data from water testing laboratories not meeting these requirements will not be accepted in future (Balfour et al., 2011).

The National Metrology Institute of South Africa (NMISA) conducted a survey in 2012 of South African water testing laboratories involved in organic contaminant analysis of water (Fernandes-Whaley, 2012). Figure 2 summarises the main organic contaminant classes being tested. Testing for PAHs, BTEXs, and organochlorine and organophosphorus pesticides is predominant.

 

ESTABLISHING A PROFICIENCY TESTING SCHEME FOR ORGANICS

There are several local PTS providers in South Africa for water testing that consider physical, inorganic and microbiological parameters. There are none currently for organic contaminants in water.

The National Laboratory Association (NLA) coordinates a microbiology PTS for water quality testing, covering heterotrophic plate count, total coliforms, faecal coliforms and E. coli (NLA, 2012).

The South African Bureau of Standards (SABS) Water Check PTS caters for inorganic chemical testing. It has been operating since 1994, offering PTSs on a quarterly basis with flexible participation in any of the 3 chemical groups, comprising:

Group 1: metals testing (Al, As, Ba, Be, B, Cd, Cr, Co, Cu, Fe, Hg, Mn, Mo, Ni, Pb, Se, Si, Sr, V, Zn)

Group 2: nutrients testing (ammonia, chemical oxygen demand, dissolved organic carbon, Kjeldahl nitrogen, nitrate, O-phosphate, oxygen demand, suspended solids, total organic carbon, total phosphate)

Group 3: mineral testing (alkalinity, conductivity, calcium, colour, chloride, dissolved solids, fluoride, magnesium, nitrate, pH, potassium, sodium, sulphate, turbidity)

The SABS currently has 230 laboratories participating in the SABS Water Check PTS (Fouché, 2011).

The SABS has already indicated that it will be expanding its scope to include the following stable tests: oil and grease, uranium, surfactants, cyanide and bromate; and volatile tests: nitrite, chromium VI, chlorine, chloramine, bromate, phenol and trihalomethanes (Fouché, 2011).

Thistle QA predominantly offers PTSs for steroids and pharmaceuticals in biological matrices, but also includes a microbiological PTS for foods and beverages (SANAS, 2012). The Agricultural Laboratory Association of South Africa (AGRILASA) also coordinates a PTS for agricultural testing laboratories. Organic contaminants are not included in their offering (AGRILASA, 2014). Several more international PTSs can be searched for on the website www.eptis.bam.de.

NMISA received several requests to assist with a PTS for organochlorine pesticides (OCPs) in water. NMISA-PT-ORG10 was consequently developed as a trial PTS for the determination of OCPs in water. To develop certain aspects of the scheme, preliminary participant data were needed in order to best define quality parameters. The parameters included the following:

The best sample format and the associated implications for performance and stability

Transportation requirements

Storage requirements

Fit-for-purpose PTS reference value assignment and selection of an appropriate standard deviation of proficiency assessment

As it was a trial, the cost of the scheme was reduced to encourage maximum participation from laboratories. Conclusions reached from the trial round assisted with implementation of the official OCP PTS distributed towards the end of 2012.

The aim of the PTS for OCPs in water is specifically to assist laboratories that routinely analyse OCPs in water to monitor their laboratory performance. Aspects such as the identification of unknown OCPs in the sample, the accuracy and comparability of the measurement results produced, the continued competency of analytical staff, and the maintenance and effectiveness of the current quality assurance systems within the laboratory can all be assessed through careful evaluation of the laboratory's PTS results. These results could also be used to provide accreditation bodies and clients with objective evidence of laboratory performance.

In addition to z-scores, En-scores are included in the report to assist laboratories with assessing the suitability of their estimated uncertainty of measurement.

Most PTSs provide an analyte reference value based on the participants' 'consensus value', i.e. the mean of participant laboratory results with all outliers removed (Linsinger et al., 1998). In contrast, NMISA provides an International System of Units (SI)-traceable reference value for the analytes in the sample through the use of primary, and primary ratio, methods (ISO/IEC17043, 2010).

The use of consensus values requires a minimum data set of 12 measurement results. Reference values and performance data are thus dependent on the number of participants (ISO/IEC13528, 2005). Although a consensus value PTS allows laboratories to compare their performance against each other and the methods employed, the consensus value does not ensure accuracy or traceability of the reported results to internationally agreed measurement standards such as the SI.

The consensus value may not always be an accurate reflection of the 'true' value. This may occur when laboratories do not apply metrologically traceable calibration standards for quantification. Traceable calibration is critical in ensuring the accuracy and comparability of measurement results (Heydorn and Anglov, 2002).

 

METHODOLOGY

The NMISA PTS was conducted in accordance with ISO/IEC17043 (2010), Conformity assessment - General requirements for proficiency testing (ISO/IEC17043, 2010). The data were processed according to ISO 13528 (ISO/IEC13528, 2005) and technical specification ISO/TS20612, Water quality - Interlaboratory comparisons for proficiency testing of analytical chemistry laboratories (ISO/TS20612, 2007).

Expected pesticide analytes and concentration ranges

The OCPs listed in Table 2 are those that are currently being tested by laboratories in South Africa (Fernandes-Whaley, 2012). The listed concentration ranges encompass the recommended WHO concentration limits for these analytes in drinking water (WHO, 2011) and/or the South African water standard concentration limits for protection of aquatic ecosystems (DWAF, 1996).

Detection at these concentration levels should be achievable using the analytical methods typically applied (GC-MS or GC-ECD) for quantification of OCPs. In order for analyte levels to exceed the limits of detection of these instruments, attention had to be given to achieving above 80% analyte recovery and sufficient analyte pre-concentration prior to GC analysis.

PTS samples

The NMISA-PT-ORG10 Trial PTS samples were distributed at the end of May 2012. Each participant received the PTS samples in 2 formats, namely:

2 x 2 mℓ methanol OCP spike solutions for dilution by the laboratory prior to analysis (Samples 1A and 1B)

1 x 500 mℓ diluted water sample previously spiked with OCPs (Sample 2)

Based on performance this would allow NMISA to identify the best sample format for the PTS.

During the trial, significant problems were encountered with the transportation of the 2 mℓ methanol spike solutions. As a hazardous freight item, few couriers were prepared to transport this item at a reasonable cost. In addition, problems were experienced with these samples clearing international customs. As it is recommended that the PTS sample be a reflection of samples typically received in the laboratory (ISO/TS20612, 2007), participants agreed that 1 ℓ sample volumes would be more appropriate.

The NMISA-PT-ORG12 Round 1 PTS samples were distributed in February 2013 and the NMISA-PT-ORG12 Round 2 samples in August 2013.

Participants either collected samples from NMISA or samples were couriered. Each participant received:

2 x 1 ℓ water samples

Gravimetrically diluted analytical standard (if requested by participant)

All results had to be submitted within a 3-week period.

PTS sample preparation

For all the NMISA OCP PTSs, the purity of the OCP reference materials (RMs), obtained through commercial ISO Guide 34 RM producers, was verified through chromatographic separation on 2 different stationary phases, and detection by gas chromatography with flame ionisation detector (GC-FID) and gas chromatography with time-of-flight mass spectrometry (GC-TOFMS).

For the NMISA-PT-ORG10 trial PTS, stock solutions and samples were prepared gravimetrically, and density corrected where applicable. Individual stocks of the selected OCPs were prepared from high-purity RMs at concentrations between 500 and 1 000 μg/mℓ (Sample 1). Aliquots from each of the stock solutions were combined to prepare a composite dilution. All vials were pre-cleaned by washing 3 times with hexane, acetone and methanol, and dried before use.

Sample 2 was prepared by diluting a 20 mℓ aliquot of Sample 1 in 5 ℓ de-ionised water. The 5 ℓ solution was thoroughly mixed by inversion, before being transferred into 10 pre-cleaned 500 mℓ Schott bottles. The caps of the 500 mℓ Schott bottles were covered with pre-cleaned aluminium foil to prevent any possible contamination from the plastic caps. Shrink-sleeves were applied to all bottles and vials as tamper evidence, and the bottles and vials were subsequently packaged for distribution within 24 hours, or stored at 4°C until analysis. The bottling repeatability, based on weighing after dispensing into the bottles, was 0.3% RSD for Sample 1 and 0.1% RSD for Sample 2.

The expanded uncertainty of each assigned value (AV) was estimated using the following contributors: gravimetric operations during preparation and bottling, the purity of the reference materials used and the homogeneity of the samples.

For NMISA-PT-ORG12 rounds 1 and 2, samples were prepared gravimetrically, and density corrected where applicable. Individual stocks of the 5 selected OCPs were prepared from high-purity, certified reference material solutions at concentrations of 1 000 μg/mℓ, and verified against stocks prepared from high-purity, solid reference materials.

Aliquots from each of the stock solutions were combined to prepare a composite dilution at an appropriate spiking concentration. The PTS samples were prepared by diluting a 200 mℓ aliquot of the composite dilution to 1 ℓ with de-ionised water in pre-cleaned 1 ℓ Schott bottles. The solution was thoroughly mixed by inversion. Shrink-sleeves were applied to all bottles as tamper evidence, and the bottles were subsequently packaged for distribution within 24 hours, or stored at 4°C until analysis.

The bottling repeatability, based on weighing after dispensing into the bottle, was 0.2% RSD for the aliquot transfer and 0.5% RSD for the 1 ℓ dilution process.
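The gravimetric calculation behind these preparations can be sketched in Python as follows. All numerical values (stock concentration, purity, masses and densities) are hypothetical placeholders rather than NMISA's actual preparation records, and only a single dilution step is shown.

# Minimal sketch of a purity- and density-corrected gravimetric dilution.
# All numbers are hypothetical placeholders, not NMISA's records.

def spike_concentration_ng_per_l(stock_ug_per_ml: float, purity: float,
                                 aliquot_mass_g: float,
                                 aliquot_density_g_per_ml: float,
                                 final_mass_g: float,
                                 final_density_g_per_ml: float = 1.0) -> float:
    """Concentration (ng/l) after diluting one aliquot of a stock into water."""
    aliquot_volume_ml = aliquot_mass_g / aliquot_density_g_per_ml
    analyte_ng = stock_ug_per_ml * purity * aliquot_volume_ml * 1e3  # ug -> ng
    final_volume_l = final_mass_g / final_density_g_per_ml / 1e3     # ml -> l
    return analyte_ng / final_volume_l

# 0.5 ug/ml composite dilution (99.5 % purity), 0.2 g aliquot taken in
# methanol (0.79 g/ml), made up to ~1 000 g of de-ionised water:
print(spike_concentration_ng_per_l(0.5, 0.995, 0.2, 0.79, 1000.0))  # ~126 ng/l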

The expanded uncertainty of each AV was estimated using the following contributors:

The gravimetric operations during preparation and bottling

The purity of the reference materials used and the homogeneity of the samples

NMISA analysis method

Samples were allowed to reach room temperature and spiked with carbon-13-labelled isotope standards. Samples were thoroughly mixed by inversion. NMISA-PT-ORG10 Trial Sample 2 was quantitatively transferred into 500 mℓ de-ionised water before analysis. The full volume of each of the samples was loaded onto preconditioned RP C18 SPE disks and the analytes were eluted with dichloromethane and ethyl acetate. The eluate was dried down under a stream of nitrogen and re-suspended in 100 μℓ isooctane.

The samples were injected and separated on a Restek Rxi-XLB (30 m, 0.25 mm ID, 0.25 μm df) gas chromatography (GC) column and detected by LECO GC-TOFMS.

Bracketing isotope dilution mass spectrometry was employed for quantification. The calibration solutions prepared were matrix-matched by spiking standards and isotopes into 500 mℓ de-ionised water and extracting the analytes by SPE. The matrix contribution from the SPE disks resulted in a signal enhancement for aldrin.

Test for sufficient analytical precision

In order to adequately estimate the homogeneity and stability of the analytes in the PTS samples according to ISO 13528 (ISO/IEC13528, 2005), the analytical precision of the method should be such that, when the between-sample standard deviation (in this case the standard deviation of replicate analyses, σan) is compared with the standard deviation for proficiency assessment (σp), the following is true:

σan ≤ 0.5σp

The NMISA analytical method met this requirement, where σp = σR, obtained using the Horwitz prediction model.

Homogeneity testing

For the homogeneity assessment it was not possible to perform 2 independent assessments of each sub-unit, because the measurement method uses the entire sample (1 ℓ or 500 mℓ) for analysis. The use of ANOVA for homogeneity assessment, as recommended in the ISO 13528 (ISO/IEC13528, 2005) statistical methods for PTSs, is therefore not applicable. In such instances the standard deviation of replicate analyses can be used as an indicator of homogeneity (Bercaru et al., 2009).

It should also be noted that certain homogeneity values appeared quite high, since they also incorporated the error introduced through the analysis (Bercaru et al., 2009). These values therefore represent the maximum heterogeneity that can be expected, influenced as they are by the method repeatability. This error is not included in the case where duplicate sub-units are reported and analysed with ANOVA (ISO/TS20612, 2007).

The homogeneity of the NMISA PTS samples has to meet the following requirement, based on 8 repeat analyses of the PTS samples performed immediately following sample preparation and distribution (ISO/TS20612, 2007):

σH ≤ 0.5σp

where:

σH = standard deviation of 8 repeat analyses for analyte x

σp = standard deviation of proficiency assessment for analyte x

All analytes in the NMISA-PT-ORG12 rounds 1 and 2 samples met this homogeneity requirement.
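A minimal sketch of this replicate-based check, assuming a criterion of the form σH ≤ 0.5σp; the acceptance factor and the replicate values below are illustrative placeholders, not PTS data.

import statistics

def passes_homogeneity(replicates: list, sigma_p: float,
                       factor: float = 0.5) -> bool:
    """True if the SD of the repeat analyses is within factor * sigma_p."""
    sigma_h = statistics.stdev(replicates)  # SD of the 8 repeat analyses
    return sigma_h <= factor * sigma_p

# Eight hypothetical repeat results (ng/l) and a Horwitz-type sigma_p:
replicates = [102.1, 98.7, 100.4, 97.9, 101.5, 99.2, 100.8, 98.3]
print(passes_homogeneity(replicates, sigma_p=22.0))  # True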

Stability of analytes

An isochronous study was conducted for the analyte stability assessment. Five randomly selected bottles from the sample batch were stored at 4°C, 20°C and 40°C for a period of 5, 14 and 21 days respectively. Samples were stored at 4°C, after the respective storage periods were reached, until analysis under repeatability conditions at the end of the 21-day period. Results confirmed that the analytes are stable in the sample within a 5-day period at 4°C and 20°C. When stored at 4°C, the samples are stable within the 3-week PTS period.

Sample storage, distribution and receipt

All samples were stored in the dark at 4 ± 2°C until distribution or analysis. Participants were requested to store all sample solutions in the dark at 4 ± 2°C immediately upon receipt. No precaution was taken during transportation of the samples in terms of temperature control. With the exception of one participant, all international participants received samples within 96 hours (4 days) of preparation.

Instructions to participants

Participants were encouraged to perform the analysis using the laboratory's routine methods for the determination of these analytes. Results should have been corrected for recovery and blank controls applied if this was standard practice in the laboratory. All normal quality control procedures should have been applied. An electronic results-submission form was sent to participants when samples were delivered.

The water samples had to be equilibrated to room temperature (20 ± 5°C) prior to performing analyses.

Participant laboratory information

Each registered participant was assigned a unique confidential code known only to NMISA and the participating laboratory.

Performance statistics

The terms and equations used are described below. The PTS data were presented in 3 formats, namely:

Graphically, where participants' measurement results and associated uncertainties were plotted relative to the assigned value and the assigned value's expanded uncertainty (at the 95% level of confidence, k = 2), together with the standard deviation for proficiency assessment (σp) predicted using both Horwitz models; this is equivalent to 1 standard deviation (z = 1)

z-scores, where both the Horwitz model and the alternative Horwitz model (for concentrations below 10 μg/ℓ) were used for estimating the reproducibility standard deviation (σR), and consequently the standard deviation for proficiency assessment (σp)

En-scores

Assigned value (AV)

The assigned value (AV) for the NMISA-PT-ORG12 R2 PTS is the purity and density-corrected gravimetric preparation value of the solutions. This assigned value is considered to be the best reflection of the 'true value' of the analyte concentration in the PTS samples.

The uncertainty associated with the PTS AVs was determined using the following uncertainty contributors, combined in quadrature as described in Eq. (1):

uAV = √(uCRM² + umass² + ubottling² + uhomog²)    (1)

where:

uAV: assigned value standard uncertainty

uCRM: standard uncertainty of the certified reference material

umass: combined standard uncertainty of gravimetric preparation operations involved in the PTS sample preparation

ubottling: standard uncertainty from the PTS sample bottling procedure

uhomog: standard uncertainty due to PTS sample homogeneity as determined by NMISA
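The quadrature combination in Eq. (1) can be sketched as follows; the four inputs are relative standard uncertainties (fractions of the assigned value), and the example budget is hypothetical rather than NMISA's actual one.

from math import sqrt

def assigned_value_expanded_uncertainty(u_crm: float, u_mass: float,
                                        u_bottling: float, u_homog: float,
                                        k: float = 2.0) -> float:
    """Expanded uncertainty (coverage factor k) of the assigned value."""
    u_av = sqrt(u_crm**2 + u_mass**2 + u_bottling**2 + u_homog**2)
    return k * u_av

# Example budget: CRM 0.5 %, weighing 0.2 %, bottling 0.3 %, homogeneity 1 %:
print(assigned_value_expanded_uncertainty(0.005, 0.002, 0.003, 0.010))  # ~0.024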

Standard deviation for proficiency assessment (σp)

The standard deviation of proficiency assessment is a measure of the spread of participants' results, i.e. where the participants' measurement results can be expected to lie relative to the AV.

According to the statistical guidelines in the ISO 13528 (ISO/IEC13528, 2005) and ISO/TS20612 (ISO/TS20612, 2007) standards, there are several ways to determine this expected spread of results. In order to use the standard deviation of participants' results, a minimum of 12 results is required for meaningful statistical evaluation of the data. Due to limited participation in South Africa, i.e. the limited number of results received, neither the assigned value nor the standard deviation of the PTS can be determined by consensus and/or statistical techniques.

It is, however, possible to use a general model, such as the Horwitz model (Thompson, 2006), which predicts the reproducibility standard deviation between laboratories participating in inter-laboratory studies using 'strictly defined' analytical methods (Thompson, 2004). The Horwitz model is described by Eq. (2), or alternatively Eq. (3). A disadvantage of this model is that only the analyte concentration is taken into account, and not challenges associated with the sample size, analyte type and the analysis thereof (ISO/IEC17043, 2010; ISO/IEC13528, 2005; ISO/TS20612, 2007).

σR = 0.02c^0.8495    (2)

Alternatively:

%RSD = 2c^(-0.1505)    (3)

where:

σR is the reproducibility standard deviation

c is the analyte concentration, expressed as a dimensionless mass fraction

%RSD is percentage relative standard deviation

The Horwitz model predicts a reproducibility standard deviation which increases exponentially as the concentration of the analyte decreases. At low ng/ℓ levels, however, this results in an acceptance range of 60-100% for all analytes, which raises serious doubts as to whether the analyte is present at all.

In 2000, Horwitz reported that inter-laboratory performance for analyte concentrations below 10 μg/ℓ (ppb) shows invariance at around 20-25% RSD (Thompson, 2000; 2004; Rivera and Rodriguez, 2011). It is therefore recommended that inter-laboratory performance at these levels be described using the alternative Horwitz model given by Eq. (4) (Thompson, 2000; Thompson, 2006).

σR = 0.22c    (4)
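Both predictions can be sketched as below, assuming the usual dimensionless mass-fraction form of the Horwitz function (10 μg/ℓ in water corresponds to a mass fraction of roughly 1e-8); the crossover logic and example values are illustrative.

def horwitz_rsd_percent(c: float) -> float:
    """Original Horwitz prediction, Eq. (3): %RSD = 2 * c**(-0.1505)."""
    return 2.0 * c ** -0.1505

def sigma_p(concentration_ng_per_l: float) -> float:
    """sigma_p = sigma_R for a water sample, switching models at 10 ug/l."""
    mass_fraction = concentration_ng_per_l * 1e-12  # ng/l in water ~ ng/kg
    if mass_fraction < 1e-8:                        # below 10 ug/l: Eq. (4)
        rsd = 22.0
    else:                                           # otherwise: Eq. (3)
        rsd = horwitz_rsd_percent(mass_fraction)
    return concentration_ng_per_l * rsd / 100.0

print(round(horwitz_rsd_percent(1e-8), 1))  # 32.0 %RSD at the 10 ug/l crossover
print(sigma_p(100.0))                       # 22.0 ng/l for a 100 ng/l level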

At the NMISA PTS workshop in August 2012, participants agreed to also consider the standard deviation of the results submitted in the round as an alternative estimate for the standard deviation of proficiency assessment. According to ISO 13528 (ISO/IEC13528, 2005), the robust standard deviation (RSC, 2013) should be used when using data from a single round of the PTS. A disadvantage is that this value may vary considerably from one round to another. This would also make it difficult to compare trends in a laboratory's performance over several rounds using the z-score.
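For reference, a compact sketch of the iterative robust mean and standard deviation ('Algorithm A' in ISO 13528) that tools such as the RSC/AMC robust statistics software implement; the data are invented and this is an illustrative sketch, not the software itself.

import statistics

def algorithm_a(values, tol=1e-6, max_iter=100):
    """Robust mean and SD via iterated winsorisation (ISO 13528 Algorithm A)."""
    x = statistics.median(values)
    s = 1.483 * statistics.median([abs(v - x) for v in values])  # MAD start
    for _ in range(max_iter):
        delta = 1.5 * s
        w = [min(max(v, x - delta), x + delta) for v in values]  # winsorise
        x_new, s_new = statistics.fmean(w), 1.134 * statistics.stdev(w)
        if abs(x_new - x) < tol and abs(s_new - s) < tol:
            break
        x, s = x_new, s_new
    return x, s

# One gross outlier (250) barely moves the robust estimates:
print(algorithm_a([98.0, 103.0, 100.0, 97.0, 250.0, 101.0, 99.0, 102.0]))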

z-score

A z-score was calculated for each participant using Eq. (5) (ISO/IEC13528, 2005).

z = (y - xa)/σp    (5)

where:

z is the z-score

y is participant laboratory result

xa is the assigned value

σp is the standard deviation for proficiency assessment, where the coordinator has proposed that σp = σR (calculated using Eq. (3)), where σR is the reproducibility standard deviation

How to interpret the z-score: a z-score with absolute value (|z|):

|z| ≤ 2 is satisfactory

2 < |z| < 3 is questionable

|z| ≥ 3 is unsatisfactory
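A minimal sketch of Eq. (5) and the scoring bands above; the participant result, assigned value and σp are invented for illustration.

def z_score(y: float, x_a: float, sigma_p: float) -> float:
    """Eq. (5): z = (y - x_a) / sigma_p."""
    return (y - x_a) / sigma_p

def classify_z(z: float) -> str:
    if abs(z) <= 2.0:
        return "satisfactory"
    return "questionable" if abs(z) < 3.0 else "unsatisfactory"

z = z_score(y=131.0, x_a=100.0, sigma_p=22.0)  # concentrations in ng/l
print(round(z, 2), classify_z(z))              # 1.41 satisfactory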

En-score

An En-score was calculated using Eq. (8) for PTS participants that reported an uncertainty of measurement. The En-score is complementary to the z-score and includes the uncertainty of the measurements in evaluating the performance of the laboratory. En numbers should be used with caution when participants may have a poor understanding of their uncertainty and may not be reporting it in a uniform way (ISO/IEC13528, 2005; ISO/TS20612, 2007).

En = (x - X)/√(Ulab² + UAV²)    (8)

where:

En is the En-score

x is participant laboratory result

X is assigned value

Ulab is participant laboratory expanded uncertainty of measurement result x

UAV is the expanded uncertainty of measurement of the assigned value X

How to interpret the En-score: an En-score with absolute value (|En|):

|En| ≤ 1 is satisfactory

|En| > 1 is unsatisfactory
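Eq. (8) in the same style; note that Ulab and UAV are already expanded (k = 2) uncertainties, so they are combined directly. All values are invented.

from math import sqrt

def en_score(x: float, big_x: float, u_lab: float, u_av: float) -> float:
    """Eq. (8): En = (x - X) / sqrt(Ulab**2 + UAV**2)."""
    return (x - big_x) / sqrt(u_lab**2 + u_av**2)

en = en_score(x=131.0, big_x=100.0, u_lab=30.0, u_av=3.4)  # ng/l
print(round(en, 2), "satisfactory" if abs(en) <= 1 else "unsatisfactory")
# -> 1.03 unsatisfactory: the result just fails to overlap the AV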

Traceability and measurement uncertainty

Establishing measurement traceability and estimating uncertainties for measurement results produced are key requirements for laboratories adhering to ISO 17025 (ISO/IEC17025, 2005). Participants were requested to include a measurement uncertainty together with the uncertainty budgets used to estimate the uncertainty.

 

RESULTS AND DISCUSSION

The NMISA-PT-ORG10 Trial PTS was conducted during June 2012. Of the 7 laboratories that were invited to participate, 6 laboratories registered to participate, and 4 submitted results. One set of results was qualitative only. The NMISA-PT-ORG12 Round 1 PTS was conducted during February-March 2013. Of the 9 laboratories that registered to participate, 7 submitted results. The NMISA-PT-ORG12 Round 2 PTS was conducted during September 2013. Of the 12 laboratories that registered to participate, 8 submitted results.

The z-score results are summarised in Table 3. The OCP concentrations in Samples 1 and 2 were identical. The z-scores were calculated using the Horwitz model. For the NMISA-PT-ORG10 Trial PTS, all participants that identified and quantified the spiked OCPs achieved z-scores below 2, except for the determination of p,p'-DDE. The same was true for NMISA-PT-ORG12 R1, except for cis-chlordane, and for NMISA-PT-ORG12 R2, except for aldrin, p,p'-DDT, beta-endosulphan and alpha-HCH. This implies that the laboratories' measurement results largely fall within the variation predicted by Horwitz.

Results from NMISA-PT-ORG12 R2 will be used for further discussion.

Figure 3 graphically depicts the participants' results for the determination of p,p'-DDT in the PTS sample, relative to the AV, the AV uncertainty, and the 1 standard deviation of proficiency assessment (z=1) predicted using both Horwitz models.

Figure 3 is very effective in conveying the participants' performance in terms of accuracy and uncertainty of measurement. It also allows for easy comparison between participant results. From this figure it is evident that certain laboratories are not reporting any uncertainties with their measurement results, while others are underestimating their uncertainty, as in the case of Lab 07, where the reported uncertainty is almost as small as that of the gravimetric preparation of the solutions.

Table 4 summarises participant results for NMISA-PT-ORG12 R2 samples containing 5 gravimetrically spiked OCPs. Also listed at the end of the table are the AVs, the (robust) standard deviation estimations using participants' mean results (RSC, 2013), the Horwitz model (Thompson, 2006) and alternative Horwitz model (Thompson, 2000).

Assigned value (AV)

With the exception of alpha-HCH (at 10.5% difference), the NMISA-assayed values of the PTS samples are all within 8% of the gravimetrically prepared values. This is fit-for-purpose when considering the standard deviations at the expected trace concentration levels of the prepared solutions (ng/ℓ) (Bercaru et al., 2009). The expanded uncertainty associated with the gravimetric preparation of the PTS samples is in all cases within 3.4% (k=2), excluding endosulfan II at a relative expanded uncertainty (Urel) of 9.7%. In all cases this is significantly less than the individual analyte standard deviation of proficiency assessment values (σp), and is thus also fit-for-purpose (ISO/IEC13528, 2005).

Table 4 shows the mean and the robust mean of participants' results. The percentage difference in the mean of participant results from the AV is listed in the last row of the table. The o,p'-DDT concentration percentage difference from the AV is the smallest at 1.8%, but only 3 measurement results were submitted for this analyte. The percentage difference for the other analytes ranges from -18.8% for aldrin to 48.2% for endosulfan II.

Standard deviation for proficiency assessment (σp)

The standard deviation for proficiency assessment was determined for each analyte using both the original Horwitz prediction model and, for reproducibility standard deviations (σR) at concentrations below 10 μg/ℓ, the alternative Horwitz prediction model.

For the current analyte assigned values, the Horwitz model predicts an average relative standard deviation (RSD) of 33%, while the alternative Horwitz model predicts 22% RSD. This is comparable to the target standard deviation of 20-25% set for pesticides in water PTSs conducted in the EU (Bercaru et al., 2009).

At the bottom of Table 4, the participant and robust standard deviations are listed together with the standard deviations predicted by both Horwitz models. In the case of aldrin, the standard deviation (SD) achieved by the participants (SD = 15) compares well to the predicted Horwitz value, σp = 15; the alternative Horwitz prediction, σp = 9, is slightly lower. For p,p'-DDT, endosulfan II and alpha-HCH, the standard deviations achieved by the participants all exceed those predicted by Horwitz. This may be attributed to differences in laboratory competence, sample size and the analytical methods used for the analysis. Only the o,p'-DDT participant SD is less than the Horwitz-predicted values.

z-score

The standard deviation of proficiency assessment (σp), determined using both Horwitz prediction models, was used to calculate z-scores according to Eq. (5). The main difference between the two σR approaches is the width of the accepted concentration range for calculating the z-scores, which differs by approximately 10%. The number of participant results with a z-score greater than 2 increases from 11 using the Horwitz model to 14 using the alternative Horwitz model.

With additional data from more PTS rounds, NMISA will be able to establish a statistical model that can predict the standard deviations achievable by participants with a higher degree of confidence. Until then, both approaches will be used and monitored for calculating z-scores.

En-score

The En-score, although complementary to the z-score, is a more objective means by which individual participant results can be compared to the assigned value, as no estimate of the standard deviation of proficiency assessment is required.

Table 5 summarises the calculated En-scores for the participants. An En-score of < 1 can be visualised where the uncertainty bars of the measurement result overlap with the AV or with the assigned value's uncertainty bars (dashed red lines; refer to Fig. 3).

The En-scores reported were predominantly less than 1, except for Lab 06, which obtained En > 1 for p,p'-DDT, as shown in Table 5. Ideally, since both PTS samples were identical, the reported uncertainties should overlap for the independent measurement results reported for each analyte. This was not always the case: for Lab 06, for example, the uncertainties reported for Samples 1 and 2 did not overlap, although the samples were identical.

Estimation of uncertainty of measurement (UoM)

Of the 8 laboratories that submitted results, only 4 currently report an uncertainty of measurement (UoM). It was therefore impractical to compare the performance of laboratories 03, 04 and 05, which did not report a UoM, with that of the laboratories which did.

 

RECOMMENDATIONS

Based on participant analytical methodologies, participants would benefit from due consideration of the sample amount and pre-concentration required for trace water analysis (ng/ℓ). The lowest concentration that can be expected is 10 ng/ℓ (Table 2). Using classical extraction and clean-up approaches, the final mass on-column (assuming 100% recovery) is 10 pg. This is, generally, very close to the limit of detection for GC detectors such as ECD and MSD (SIM mode). Table 6 shows the effect of the sample preparation approach taken on the final amount loaded onto the column.

As indicated, a classical approach with a limited sample volume (200 mℓ) yields only 2 pg on column. Similarly, using only 20 mℓ of sample results in 0.2 pg on column, which is below the LOD of most commercial mass spectrometers. Laboratory 08 used a 20 mℓ sample size; several analytes were detected that differed between the two samples, with a variation of >50%. This implies that the detected analytes originated from background contamination and not from the sample.

In reality, this 'mass-on-column' could be much lower (<100% recovery), causing analytes to be easily 'missed' by the detector. Large-volume injection (LVI) and reducing the final reconstitution volume will both improve the final mass on column. The use of specialised equipment, such as thermal desorption, allows limited sample volumes to be used for extraction, as the entire extracted sample is desorbed into the instrument (LVI).
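The 'mass on column' arithmetic behind Table 6 can be sketched as follows, assuming a classical work-up (extraction, reconstitution in 1 mℓ, 1 μℓ splitless injection); the default volumes reproduce the 10 pg, 2 pg and 0.2 pg figures quoted above.

def mass_on_column_pg(conc_ng_per_l: float, sample_volume_l: float,
                      final_volume_ul: float = 1000.0,
                      injection_volume_ul: float = 1.0,
                      recovery: float = 1.0) -> float:
    """Picograms reaching the GC column for a given sample preparation."""
    extracted_ng = conc_ng_per_l * sample_volume_l * recovery
    return extracted_ng * 1e3 * injection_volume_ul / final_volume_ul  # ng -> pg

print(mass_on_column_pg(10.0, 1.0))   # 10.0 pg: 1 l sample, near GC LODs
print(mass_on_column_pg(10.0, 0.2))   # 2.0 pg: only 200 ml of sample
print(mass_on_column_pg(10.0, 0.02))  # 0.2 pg: 20 ml, below most MS LODs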

 

CONCLUSIONS

Regular participation in PTSs is an important tool for demonstrating the competence of a laboratory's measurement procedures and analysts. Careful selection of the standard deviation of proficiency assessment is needed to ensure a fair reflection of laboratory performance. The proposed graphical representation of laboratories' results, including measurement uncertainties, provides a better reflection of laboratories' accuracy and measurement uncertainty, relative to the traceable AV and to other laboratories, than performance described by a z-score alone.

NMISA is proposing the use of the alternative Horwitz model for predicting the standard deviation of proficiency assessment, as the target concentrations for the organochlorine pesticides in water are below 10 μg/ℓ.

Based on the data presented in this report, the acceptable range for results is thereby reduced by approximately 10%, which should still allow for meaningful comparison with performance in previous rounds of the NMISA-PT-ORG12 and NMISA-PT-ORG10 PTSs. Future PTS rounds will allow for better monitoring of improvements in the participants' performance.

 

ACKNOWLEDGEMENTS

NMISA PTSs are funded by the South African Department of Trade and Industry. Additional support for this study from the International Technical Cooperation of the Physikalisch-Technische Bundesanstalt is gratefully acknowledged. NMISA expresses its gratitude to the various laboratories that participated in the PTS and looks forward to their continued participation in the future.

 

REFERENCES

AGRILASA (AGRI LABORATORY ASSOCIATION OF SOUTH AFRICA) (2014) Laboratory proficiency scheme. URL: http://www.agrilasa.co.za/LaboratoryProficiencyScheme.aspx (Accessed 12 September 2014).

BALFOUR F, BADENHORST H and TROLLIP D (2011) A gap analysis of water testing laboratories in South Africa. WRC Report No. TT 488/11. Water Research Commission, Pretoria.

BERCARU O, RICCI M, ULBERTH F, BRUNORI C, MORABITO R, IPOLYI I, SAHUQUILLO A and ROSENBERG E (2009) Challenges in preparing water-matrix reference materials for PAHs and pesticides: examples from SWIFT-WFD proficiency-testing schemes. Trends Anal. Chem. 28 (9) 1073-1081.

CANE MW (2006) Chronic aquatic environmental risks from exposure to human pharmaceuticals. Sci. Total Environ. 367 23-41.

DAS SR (2008) Sources and historic changes in polycyclic aromatic hydrocarbon input in a shallow lake, Zeekoevlei, South Africa. Org. Geochem. 39 (8) 1109-1112.

DWAF (DEPARTMENT OF WATER AFFAIRS AND FORESTRY, SOUTH AFRICA) (1996) Aquatic Ecosystems (1st edn). South African Water Quality Guidelines Vol. 7. Department of Water Affairs and Forestry, Pretoria.

FERNANDES-WHALEY M (2012) PTS requirements of South African water testing laboratories - organic contaminants. National Metrology Institute of South Africa (NMISA), Pretoria.

FERNANDES-WHALEY M (2013) NMISA-PT-ORG12 R1 OCPs in water PTS report. National Metrology Institute of South Africa (NMISA), Pretoria.

FOUCHÉ C (2011) SABS Water Check - proficiency testing provider for water laboratories. NLA PTS workshop presentation, Roodevallei Hotel, Pretoria, 20 September 2011.

GARDNER M and TAYLOR I (1999) Calibration bias in the determination of polynuclear aromatic hydrocarbons in water. Accredit. Qual. Assur. 4 (1-2) 33-36.

HEYDORN K and ANGLOV T (2002) Calibration uncertainty. Accredit. Qual. Assur. 7 (4) 153-158.

ISO/IEC13528 (2005) Statistical methods for use in proficiency testing by interlaboratory comparisons. ISO, Geneva, Switzerland.

ISO/IEC17025 (2005) General requirements for the competence of testing and calibration laboratories. ISO, Geneva, Switzerland.

ISO/IEC17043 (2010) Conformity assessment - General requirements for proficiency testing. ISO, Geneva, Switzerland.

ISO/TS20612 (2007) Water quality - Interlaboratory comparisons for proficiency testing of analytical chemistry laboratories. ISO, Geneva, Switzerland.

LEPOMA PB (2009) Review: Need for reliable analytical methods for monitoring chemical pollutants in water. J. Chromatogr. A 1216 302-315.

LINSINGER TPJ, KANDLER W, KRSKA R and GRASSERBAUER M (1998) The influence of different evaluation techniques on the results of interlaboratory comparisons. Accredit. Qual. Assur. 3 (8) 322-327.

LONDON L, DALVIE MA, NOWICKI A and CAIRNCROSS E (2005) Approaches for regulating water in South Africa for the presence of pesticides. Water SA 31 (1) 53-59.

MOJA SM (2013) Determination of polycyclic aromatic hydrocarbons (PAHs) in river water samples from the Vaal Triangle area in South Africa. J. Environ. Sci. Health, Part A: Environ. Sci. Eng. Toxic Hazard. Subst. Control 48 (8) 847-854.

NIEUWOUDT CP (2011) Polycyclic aromatic hydrocarbons (PAHs) in soil and sediment from industrial, residential, and agricultural areas in Central South Africa: An initial assessment. Soil Sediment Contam. 20 (2) 188-204.

NLA (NATIONAL LABORATORY ASSOCIATION) (2012) URL: http://www.home.nla.org.za/ (Accessed 20 March 2012).

RICCI M, BERCARU O, MORABITO R, BRUNORI C, IPOLYI I, PELLEGRINO C, SAHUQUILLO A and ULBERTH F (2007) Critical evaluation of interlaboratory comparisons for PAHs and pesticides in organic standard solutions in support of the implementation of the Water Framework Directive. Trends Anal. Chem. 26 (8) 818-827.

RIVERA C and RODRIGUEZ R (2011) Horwitz equation as quality benchmark in ISO/IEC 17025. URL: www.bii.mx/documentos/hor-witzCf11.pdf (Accessed 1 October 2013).

RSC (ROYAL SOCIETY OF CHEMISTRY) (2013) Advancing the chemical sciences. URL: http://www.rsc.org/Membership/Networking/InterestGroups/Analytical/AMC/Software/RobustStatistics.asp (Accessed 20 March 2013).

SANAS (SOUTH AFRICAN NATIONAL ACCREDITATION SYSTEM) (2012) Directory of accredited facilities. URL: http://www.sanas.co.za/directory.php (Accessed 20 March 2013).

SANS 241-1 (SOUTH AFRICAN NATIONAL STANDARDS) (2011) Drinking water Part 1: Microbiological, physical, aesthetic and chemical determinands (1st edn). South African Bureau of Standards, Pretoria. ISBN 978-0-626-26115-3.

SANS 241-2 (SOUTH AFRICAN NATIONAL STANDARDS) (2011) Drinking water Part 2: Application of SANS 241-1 (1st edn). South African Bureau of Standards, Pretoria. ISBN 978-0-626-26116-0.

THOMPSON M (2000) Recent trends in inter-laboratory precision at ppb and sub-ppb concentrations in relation to fitness for purpose criteria in proficiency testing. Analyst 125 385-386.

THOMPSON M (2004) The amazing Horwitz function. Analytical Methods Committee (AMC), Royal Society of Chemistry. URL: www.rsc.org/pdf/amc/brief17.pdf (Accessed 7 October 2013).

THOMPSON M (2006) The international harmonized protocol for the proficiency testing of analytical chemistry laboratories. Pure Appl. Chem. 78 (1) 145-196.

WHO (WORLD HEALTH ORGANISATION) (2011) Guidelines for Drinking Water Quality (4th edn). WHO, Geneva. ISBN 978 92 4 154815 1.

 

 

* To whom all correspondence should be addressed: e-mail: mfwhaley@nmisa.org
