
South African Journal of Science

On-line version ISSN 1996-7489
Print version ISSN 0038-2353

S. Afr. j. sci. vol.110 no.1-2 Pretoria Jan. 2014

 

COMMENTARY

 

Mathematical errors, smoke and mirrors in pursuit of an illusion: Comments on Govinder et al. (2013)

 

 

Tim Dunne

Department of Statistical Sciences, University of Cape Town, Cape Town, South Africa


 

 

The 'Equity Index' (EI) as gratuitously labelled by Govinder and Makgoba in a recent paper1 is not an equity index. It is actually simply a demographic divergence index (DDI), one of many possible mathematical alternatives which warrant the name DDI. The invocation of the word 'equity' in the original name is a deliberate but implicit claim of moral and ethical authority for the construct. This claim needs to be tested before the label 'equity' is admitted as a meaningful description. A thorn by any other name is not a rose, and proximity is not provenance.

The DDI is a simple case of a long-known mathematical device to attribute numerical distances between pairs of points in a multidimensional space (dimension = n). The index is not new in itself. Its mathematical structure is well known. However, its applicability to the setting described in the paper of Govinder and Makgoba1 is both logically incorrect for the intended purpose and morally dubious. The error is compounded in a second paper by Govinder, Zondo and Makgoba2.

This critique addresses the mathematical adequacy of the DDI for its intended purpose. At the heart of the critique is the fact that some numbers do not admit arithmetic, essentially because they are only labels (e.g. digits on a motor licence plate or in a cell number). Other numbers may admit addition and subtraction under appropriate conditions, and perhaps multiplication and division under further conditions. Applying arithmetic where it is not valid will yield meaningless numbers as outcomes.

At its heart, the argument of Govinder and Makgoba1 invokes a single mathematical formula or structure. The gravamen of a mathematical formula is the implied source of unquestionable rational authority. Subrational application of the formula is then assumed to be objectivity, rather than error. The objectivity is inferred from the mathematical replicability of the error across all contexts. This objectivity is then applied to representations of South African universities,2 but its wider application to other social institutions and conundrums is extravagantly but explicitly envisaged by the authors.

By a further assumption of a universal reference demographic profile, postulated as an exclusive and complete notion of equity, the mathematical structure of the DDI is invoked in the first paper to make value judgements about the states of institutions. In the second paper the extreme simplicity of single-criterion decision-making is explicitly advocated, and the notion of institutional punishment for demographic divergence is sketched as a means of steering social policy outcomes. The whole artifice is then predicated as a model for general application, in all nations, and described as an unprecedented first mathematical engagement with inequity.

What is not explicitly stated is the intended range of institutional types to which this conceptual device is to be applied in South Africa or elsewhere. There are hints from the authors which might suggest applicability to the staff and the beneficiaries and the services of schools, hospitals, welfare institutions, businesses and perhaps also government departments and non-profit organisations. However, the imperative of the authors, namely conformity with their sublimely narrow notion of equity, is the core rationale for the apparent innovation. Their particular urgency is exasperation with some universities with larger DDI values than their counterparts. On this basis these universities are perceived and asserted to be intransigent on the issue of transformation.

The danger of erroneous thinking rooted in a putative exclusive concern for moral purpose and social accountability is that any underlying logical or mathematical errors are too easily excused by the imputation of vested interest and mala fides to those who contest the dubious mathematics. Contrary voices can easily be caricatured as at least impervious or at worst opposed to the claimed moral purpose. Indeed, one of the hypotheses offered by the authors is that several universities (other than their own university - the University of KwaZulu-Natal, UKZN) are currently impervious to equity objectives.

There is a need therefore to clarify upfront that there are indisputably terrible and consequential residues of the apartheid past and all its evil consequences, in every aspect of South African society. Some of this residue of persistent inequality and suffering is in part a consequence of preserved privilege, unjust advantage, obdurate structural inequalities, culpable indifference, wilful ignorance, lack of compassion, hypocrisy, greed and plain incompetence. Some suffering has more recent origins of a similar kind. Inescapably, suffering in South Africa has a racial and gendered face.

Universities cannot and should not be immune from the probing and critique that exposes the current extent and the likely progress of their own transformation within the society. Holding universities to account for their internal structures and their external impacts is both a legitimate and necessary act of citizenship. But social phenomena and processes are inherently more complex in their causal and contextual relationships than their counterparts in the natural and physical sciences, precisely because of the inherent agency of every human participant and stakeholder. We cannot afford pseudoscience posturing itself as relevance and objectivity in social science domains, by virtue of a single mathematical device and the numbers which a formula generates.

The resort to the achievement of measurement for evidence in the physical sciences has great power, but is limited in extent to the particular contexts in which measurement is possible. Nonetheless measurement is an engine of technological progress, within the simplicities and regularities that order our experience of the physical world. Measurement is a worthy pursuit and a magnificent achievement. This achievement arises from three key elements. Firstly, the definition of a replicable unit of extent of a characteristic common to many objects in every salient context must be clarified and exhibited. Secondly, a replicable mechanism has to be discovered or constructed, by which the extent of the characteristic can be compared with the chosen unit to elicit a ratio outcome that is reliable to some explicitly chosen degree of accuracy, in specified contexts. Thirdly, the concatenation of the extent of objects should elicit ratios that are consistent with the properties of arithmetic, to the same degree of accuracy. Thus, to borrow a term favoured by Govinder et al., we may assert there is no cheap or mahala measurement of any characteristic, least of all from mere invoking of a formula.

In the human sciences there is no analogue of scientific measurement. There may be stochastic rather than deterministic analogues of measurement instruments, but such instruments are fiendishly difficult to develop or achieve or validate or verify.

Measurement instruments in the natural sciences have to be accurate and reliable under environmental conditions. In the human sciences the instruments have to transcend the observer, the observed, their complex interactions and the entire set of all relevant milieux. Before any quantitative approach is ventured in these humanities domains, a sound and plausible qualitative conceptual and methodological framework of understanding has to be postulated and critically examined.

In the realm of human sciences, we are not simply concerned with natural phenomena, which would be difficult enough. We have also to deal with perception, motive, choice, belief, conscience, mutuality and relations of power, agency and efficacy. Thus any proposal for a mathematical panacea in the social sciences should properly evoke deep and vigorous scepticism, and robust debate. We have an obligation to dignify postulated nonsense by rigorously exploring its implicit and explicit foundations, so as to expose its seductive weaknesses for what they are.

The mere assignation of a number by a conceptual or arithmetic device, even if such a formula is centuries old, does not of itself offer any objectivity, coherence or relevance. Further, the structure and pertinence of any rule for assigning numbers is open to scrutiny.

Especially in this matter of equity, we have to contest the hidden assumptions imposed on the method and context of enquiry, when the root sum of squared differences (RSSD) is engineered and purported as a final arbiter of the state and fate of universities. This caveat will also apply to any rankings derived from flawed numbers (with or without decimals) within any sphere of application.

When there is clarity about what the notion of number can and cannot offer in this debate, we still have to contend with contrasting appearances, compositions and outputs of institutions. There we will have to address the cry of the poor and yearnings of those who may be victims of our own ongoing privileges of every kind in all walks of life.

 

Mathematical considerations

In mathematics a multidimensional space of interest may often involve dimensions for comparable measurements in a single common measurement unit for each dimension, such as length, breadth and height (e.g. in metres), of points in three-dimensional space. The distance measure is in the same units (metres). There are several other natural mathematical distances between pairs of points (with coordinates all in the same units). These various distances would all be admissible as alternatives to the specific RSSD. The distance measures all have different utilities.

Extensions of mathematical distance measures are well known throughout science. These measures have origins deeply embedded in the history of science. One variety involves giving different weightings rather than equal weightings to the separate dimensions of the space. The measures are all applicable in contexts where each dimension is essentially unconstrained, so that technically infinite differences and distances may arise, but need not.

Every such mathematical measure would be usable as a plausible distance measure for any context involving units of the same kind on every dimension. However, a declared common specific measurement unit would be required on each dimension, before distance is meaningfully invoked in that unit.

The so-called EI (hereafter just DDI) offered in the paper is different from the mathematical distances, although it borrows one of the formulae. The DDI discards any dimensions of infinite extent. It discards continuous measurement and is simply a function of counts, not measurements. These properties are not necessarily faults but mathematical limitations, which render measurement impossible.

Although measurements invariably involve decimal fractions as multiples of a defined physical unit, the mere appearance of decimals in numbers does not constitute evidence that measurement has occurred. It is seductive, but misleading, to impute the authority of scientific measurements to numbers derived from pure counts, just because the counts have proportional or percentage forms which include decimal components.

The DDI involves subcounts of some finite countable number of persons, in precisely n defined categories. After defining the n categories and all the associated subcounts, all inferences are drawn upon the basis that every person within any nominated category is fully described by that category. For all intents and purposes related to the counts, the persons within a category are equivalent and mutually exchangeable. This fact is a consequence of the act of reducing the persons to objects in categories that are subject to particular counting arrangements. Any act of counting is not inherently wrong, but that very act has limiting consequences. The issues of exchangeability and equivalence of persons within a count will be discussed further later.

The DDI involves the category counts but first reduces them to proportions (summing to 1.00) or percentages (summing to 100.00) with some minor rounding of decimals. The purpose of using only unit-free proportions or percentages is to introduce a constructed comparability between category counts from several distinct sources (e.g. 23 separate universities). This construction assumes that the total sizes of the sources have no relevance for the nature of the intended comparisons.

Next, each institution is located in an n-dimensional space. The number of relevant dimensions (n) may vary, depending upon the choice by the observer about the number of categories to be used as a means of partitioning the observations. For the authors of the DDI paper, this dimension has been restricted to n=2 (gender), n=4 (race) or n=8 (gender within race), by an appeal to the authority of their particular interpretation of the South African constitution. Other additional categories would be admissible, such as age, location, competences, experience and qualifications, but are deliberately excluded.

Each institution is then allocated n coordinate values that reflect its profile of category counts. The sum of the values within location coordinates must be 1.00 for proportions, or 100% for percentages, whatever the choice of n.
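
For concreteness, a minimal Python sketch of this construction follows, using invented category counts and an assumed national reference profile (neither drawn from the papers under discussion) to show how counts become simplex coordinates and how the RSSD between a profile and a reference is computed.

```python
import math

def rssd(profile, reference):
    """Root sum of squared differences between two proportion profiles."""
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(profile, reference)))

# Invented staff counts for one institution over n = 4 categories
# (purely illustrative; not taken from the papers under discussion).
counts = [120, 40, 15, 225]
profile = [c / sum(counts) for c in counts]      # coordinates summing to 1.00

# Assumed national reference profile, renormalised to sum to 1.00.
reference = [0.792, 0.089, 0.025, 0.089]
reference = [r / sum(reference) for r in reference]

print(round(rssd(profile, reference), 3))
```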

Thus it might be coherent, but not necessarily useful, to record pseudodistances between profiles using the underlying RSSD, as in the DDI. But no inference about the pseudodistances in any hyperspace carries through into any reduced or extended set of dimensions.

The DDI by construction seeks to operate only on a surface, called the simplex plane of non-negative numbers summing to 1.00, in a particular n-dimensional space. These spaces are nested within one another in the same way that many two-dimensional surfaces are nested within our familiar three-dimensional space. Thus, these DDI measures are not comparable across distinct values of n, but possibly only within a fixed value of n. The authors of the DDI paper have apparently acknowledged that fact, but ignored its consequences.

The geometry of these simplex hyperspaces is peculiar, or at least unfamiliar in our usual ways of thinking. Firstly, these hyperspaces of dimension n have all possible subspaces of dimension m nested within them, provided m < n.

Each n-dimensional hyperspace has a central point whose n coordinate values are all equal, namely 1/n. This central point has a common pseudodistance, RSSD=sqrt[(n-1)/n], from each of the extremal points in its hyperspace. For n=3, this hyperspace would have the appearance of an equilateral triangle, joining points at (1; 0; 0), (0; 1; 0), and (0; 0; 1). The central point is at pseudodistance sqrt(2/3) = 0.816497 from the extremal points.

All pairs of extremal points have a common pseudodistance RSSD=sqrt(2)=1.414214 between them. This extremal pseudodistance applies unchanged across all n-dimensional spaces. No pair of points in any n-dimensional simplex hyperspace can be further apart in RSSD than the common RSSD between all extremal points. This notion of maximal RSSD is discussed again later.
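
Both geometric facts are easy to verify numerically. The short sketch below, for several values of n, checks that the centre-to-vertex pseudodistance equals sqrt[(n-1)/n] and that the vertex-to-vertex pseudodistance is always sqrt(2).

```python
import math

def rssd(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

for n in (2, 3, 4, 8):
    centre = [1 / n] * n
    vertex_i = [1.0] + [0.0] * (n - 1)          # all persons in category 1
    vertex_j = [0.0, 1.0] + [0.0] * (n - 2)     # all persons in category 2
    print(n,
          round(rssd(centre, vertex_i), 6),      # equals sqrt((n-1)/n)
          round(math.sqrt((n - 1) / n), 6),
          round(rssd(vertex_i, vertex_j), 6))    # always sqrt(2) = 1.414214
```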

The DDI notion involves the assumption that a single specifiable point on the hyperplane has both a mathematically and a contextually significant position. It is legitimate to nominate a reference point mathematically, but the intended meaningfulness of the reference point must be argued from beyond mathematics (e.g. arguments from equity or other criteria), along with the meaningfulness of the number of dimensions. Choices of n and of reference points are contestable. In particular, national demographics may be too narrow a set of n categories to address the complexity of any issue in question.

The notion of RSSD pseudodistance is not a notion of inequity unless some reference point is hypothesised on the simplex hyperplane. That reference point is, by assumption, an ideal point that is relevant in and of itself, but also completely adequate for a purpose at hand. Hence the reference point can only be ideal in the particular n-dimensional space if no other space of smaller or larger dimension is deemed to matter at all.

This limitation implies that any use of the DDI in n dimensions necessarily discards the intrusion of any other source of information of any kind about the persons involved. The notion of equity is reduced in this context to a notion of deliberate and sustained ignorance about all other possible contributions to the choice of a reference other than those embodied in the chosen reference point. Rather than being a strength of the DDI method, as viewed by the authors' agenda, this feature constitutes a severe fragility for the DDI in all applications, including their applications.

In the applications cited by the first paper, we would have to infer that only the race and gender issues mattered as selection outcomes at each level of application, e.g. the senior administrative level at UKZN or Rhodes University. Moreover, the reference point is next subjectively defined as a fixed set of national demographic proportions. The DDI calculates a pseudodistance from that reference.

At the moment of definition of the reference, the profile (of any university) does not correspond to the ideal. Thus one is faced with a choice, either to discard the current cohort of leadership in senior positions and immediately replace them by new selections, or to permit passage over time towards the reference point, through controlled demographic selection.

The RSSD pseudodistance might be useful if our process for selecting these incoming university administrators was to randomly select such new appointments from a suitable pool. The preferred pool of the authors must be constituted precisely and only by the reference race and gender proportions, with no regard to any other characteristic that might be specific to the human resource requirements of an incumbent, in a prospective senior administrative appointee. The function of the reference point is to penalise all other considerations for appointment.

The DDI might then serve as an indicator of the randomness of the process of selection, if randomness of selection from the desired demographic profile were the only criterion required. Any leeway to select on criteria other than demographics alone will necessarily permit, and even perhaps require, deviations from the defined target.

The same objective of randomness of selection can again be assured by the use of the DDI reference point at every level of aggregation (academic staff, technical staff, service staff, students, etc.). Use of randomness as the single selection criterion for new appointments from a pool of candidates already satisfying the reference profile will generate, over time, a series of appointments which will eventually satisfy the same intended profile of incumbents, at every level of aggregation. If one exchanges the incumbents often enough, then random selection from the reference pool will steadily approximate the chosen reference profile.
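
A toy simulation can make this convergence visible. The sketch below uses an invented starting composition of 100 incumbents and assumed reference proportions (neither taken from the papers): repeatedly replacing a randomly chosen incumbent with a draw from the reference pool steadily drives the RSSD towards zero, precisely because nothing other than the reference profile informs the replacements.

```python
import math
import random

random.seed(0)

# Assumed reference proportions over n = 4 categories (sum to 1.0) and an
# invented starting composition of 100 incumbents; illustrative only.
reference = [0.796, 0.089, 0.025, 0.090]
counts = [10, 5, 2, 83]

def rssd(counts, reference):
    total = sum(counts)
    return math.sqrt(sum((c / total - r) ** 2 for c, r in zip(counts, reference)))

print(0, round(rssd(counts, reference), 3))
for step in range(1, 2001):
    # One incumbent, chosen at random, leaves ...
    leaver = random.choices(range(4), weights=counts)[0]
    counts[leaver] -= 1
    # ... and is replaced by a draw from the reference pool.
    joiner = random.choices(range(4), weights=reference)[0]
    counts[joiner] += 1
    if step % 500 == 0:
        print(step, round(rssd(counts, reference), 3))
```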

In the sense that all selections from the pool will be random, the process and its replications will be fair (free of any selection bias). The utility of these selections would still require demonstration. The question may arise as to whether or not the use of any national demographic profile can be legitimately characterised as random selection. The legitimacy of this description is motivated later.

The RSSD pseudodistance might conceivably be adopted as a confirmatory criterion of the appointments processes over the period, beginning from the first moment when randomness of selection from the idealised pool is deemed appropriate. UKZN would conceivably congratulate itself on this assured journey to achievement of a reference point by the innovative device of iteratively ignorant blind choice from the entire population.

The utility of the reference point is moot for another reason. Unless the idealised point is rendered mathematically tractable, by rounding conventions, all configurations in all positions will be short of the reference (they will be at some pseudodistance, even in the putatively salient UKZN environment). The authors adopt a notion of tolerance to address this issue.

It is quite another matter whether such a pseudodistance from randomness is ever meaningful on instantaneous states (e.g. current occupants of the positions) rather than only on the process changes (e.g. new appointments) at each specific level. The authors have noted this limitation.

What the provision of a formula hides is the misconception that counts can be handled mathematically as if they are interchangeable with measures. The fact that we may count people does not make them equivalent and exchangeable. A principal, a registrar and a dean will count as three people in leadership positions, but we do not believe we can switch them arbitrarily, not even at UKZN. A person is not a unit of measurement. On the other hand, the metre in terms of which we measure height is equivalent to the metre by which we measure length. The fact that proportions and percentages can be written to some degree of accuracy as decimal numbers does not make either the proportions or percentages measurements.

If one wishes to ascertain how much the actual count profile of changes at staff selection differs from a desired set of random probabilities, then a formal randomisation test can be invoked. An approximate but correct method for checking compliance with the idealised profile is a chi-square goodness of fit test. This test is available in first-year texts and is easily calculated using software such as Microsoft Excel.
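
As a hedged illustration only, with invented appointment counts and assumed reference proportions, such a goodness-of-fit check might look as follows in Python, using the chi-square routine from scipy.

```python
from scipy.stats import chisquare

# Invented counts of new appointments by category, and assumed reference
# proportions (sum to 1.0); both are illustrative only.
observed = [155, 20, 7, 18]
reference = [0.796, 0.089, 0.025, 0.090]
expected = [sum(observed) * r for r in reference]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(round(stat, 2), round(p_value, 3))
```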

As in all statistical analysis and evidence collection, the use of any formula, such as chi-square, may elicit a signal from data. The signal indicates that at least one of the underlying assumptions we have made does not fit with the message from the data. The subject domain expert then has to take a view on whether or not the discernible signal constitutes evidence of some consequential violations of assumptions, possibly followed by decisions and actions. No statistic can replace the role of the thinking scientist in either the natural or human sciences.

The paper of Govinder and Makgoba1 in the South African Journal of Science is remarkable. It will in time become a frequently cited paper. The citations will not be to celebrate its elegance, simplicity or profundity - it has none of these characteristics to warrant citation. Instead it will gradually become cited for its errors and less scholarly characteristics. One such infelicity is its implicit argument for randomness as the principal criterion to distinguish one candidate from another, as the long-term strategy of a university to reach and maintain an ill-conceived idealised profile.

The issues of equity and redress are too important to be trivialised by allowing ourselves to be intimidated by the sequestered word 'equity' and the torrid outcomes of mathematical orchestration.

 

Confusion thrice confounded

In a follow-up paper, Govinder et al.2 claim to extend the original DDI apparatus into 'an important policy tool in steering the system towards a notion of transformation that connects, rather than disconnects, equity, development and differentiation'. They further aver 'The index may also become a useful universal measurement of equity in higher education (and other) systems globally.'

In support of this set of claims, they report 10 sets of applications of the formula to 23 universities in South Africa. These 10 sets cover seven employment categories, enrolment, graduation and a new indicator - their equity-weighted research output. These 10 sets of 23 indices give rise to 10 rankings in their analysis.

They proceed to consider arithmetic on some of these indices using subtraction and ratios (reported as percentages). They claim to explore relationships between equity and quality by the device of partitioned scatter plots involving DDI values and publication counts.

What appears to be unstated is that the entire set of 10 analyses reported is based upon four race categories alone, although there is a bracketed comment: '(ignoring gender imbalances)'. The consequence of this offhand remark is that the entire analysis appears to take n=4 rather than the claimed constitutional imperatives of race and gender, with n=8. All the analyses appear to be implicitly referenced to Table 2 in the paper.

The effect of choosing an n as small as 4 is likely to be a much more exaggerated range of plausible DDI values than would be the case for n=8 or larger. Given the deep concern about equity that presumably motivates the paper, the analyses with gender included may well have been conducted, but are not reported. The applications for n=8 may give rise to an artefact: reduced DDIs in regions close to reference points.

Clarity on this matter of the reportage was sought by a request for the data and spreadsheet calculations on which the reported analyses were based. The advice received from the authors was to consult the sources specified in the references for the necessary data. Further, through the editor, a clarification was received: 'As far as our personal spreadsheets are concerned, we do not believe that it is appropriate to release them as this was obtained as a result of considerable work on our part.'

Such a position in a matter as consequential as this debate has severe ethical and scientific weaknesses. It is also open to several unfortunate interpretations. The reader is denied the opportunity to assess the data and the claimed calculations. Such an attitude contrasts with values of openness and transparency, and of the replicability of allegedly scientific methods and processes.

The StatsSA source of data reported in the references gives only the aggregate percentages for race (n=4), as determined by the 2011 census. The census outcomes have been announced and publicly contested. The 10% sample from the census has yet to be released, despite the controversy about the post-enumeration survey allegedly being resolved by a disciplinary process that has not yet been heard. Nonetheless, we may currently regard one part of Table 2 (labelled 'Overall') as being sufficiently coherent with official figures for an exploration of the DDI.

The authors have indicated that foreign visitors and permanent residents in South Africa apparently constituted 0.5% of the census population and they and their constitutional rights are ignored in the analyses. This approach might be constitutionally awkward, but unwelcome foreigners can be mathematically eliminated by a minor upward correction of the population percentages.

However, Table 2 and subsequent discussions introduce further errors. Briefly, these errors involve the maximal RSSD, problems with acceptable RSSD levels, and misunderstanding of the notion of quintiles.

For n=4, and its associated overall population percentages embodied in an ideal, a maximally contrasting profile for an institution would arise from an only-Indian composition, and yield a DDI value of 126.5 with minor rounding approximation permitted, using four percentages summing to 100. An only-foreigner institution (n=5) has a corresponding approximate value of 127.8, using five percentages summing to 100. As previously noted, the maximum RSSD between any two extremal (single population group) institutions is 141.4 for any value of n. This maximal value of around 141.4 for extremal RSSDs contrasts with the repeated error in all four columns of Table 2 which report impossible maximal values of RSSD for the South African data.
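
These maxima can be reconstructed from approximate 2011 census percentages, assumed here for illustration and rescaled to sum to 100 after excluding the 0.5% 'other/foreign' category; the sketch below reproduces values of roughly 126.5 and 141.4.

```python
import math

def rssd(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Approximate 2011 census percentages, assumed here for illustration
# (Black African, Coloured, Indian/Asian, White), rescaled to sum to 100
# after excluding the 0.5% 'other/foreign' category.
census = [79.2, 8.9, 2.5, 8.9]
census = [100 * c / sum(census) for c in census]

only_indian = [0.0, 0.0, 100.0, 0.0]        # maximally divergent single-group profile
print(round(rssd(only_indian, census), 1))   # roughly 126.5

# Any two extremal (single-group) profiles are sqrt(2) * 100, about 141.4, apart.
print(round(rssd([100, 0, 0, 0], [0, 100, 0, 0]), 1))
```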

The erroneous maxima are next partitioned into intervals of common width, equal to one-fifth of the reported maximal values. The one-fifth segments are further erroneously labelled as quintiles.

The fifths of an interval do not correspond to quintiles of a distribution except in one circumstance (uniform density of RSSD values over the entire correct permissible range). That necessary circumstance cannot possibly apply under the conditions of percentages summing to 100 as required here. This emperor has no clothes.

The language of the paper describes a tolerance of 5% of each target value. If we presume this tolerance, we still have to take into account that the permitted variations have to balance each other out. Thus 5% of the target for each of the three smallest of the four racial categories in use will maximally combine to 5% of their 19.3% total, about 1%. This maximal combined tolerance then also applies as the maximal tolerance for the complementary single largest racial category.

After adjusting for the eliminated foreigners, this calculation will permit a deviation of, at most, about 1% from the 79.2% recorded alongside the category Black African. The subsequent RSSD value is 1.20%. This number is very different from the reported value 5.3%. An Excel spreadsheet is available for the curious.
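
A sketch of one such calculation follows, under the assumed census percentages above and reading the 5% tolerance as each of the three smaller categories drifting by 5% of its own target, balanced by the largest category; this reading yields an RSSD of about 1.2, nowhere near the reported 5.3.

```python
import math

# Approximate census percentages assumed as targets (Black African, Coloured,
# Indian/Asian, White); illustrative only.
targets = [79.2, 8.9, 2.5, 8.9]

# Read 'tolerance of 5% of each target value' as each of the three smaller
# categories drifting upward by 5% of its own target, balanced by the largest.
drift_up = [0.05 * t for t in targets[1:]]
deviations = [-sum(drift_up)] + drift_up

print(round(math.sqrt(sum(d ** 2 for d in deviations)), 2))   # about 1.2, not 5.3
```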

Other interpretations of the wording used for tolerances were explored. None of these gave rise to the tolerance quaintly labelled 'quintile zero' in Table 2, noted as 5.3%.

Despite all these difficulties, the paper goes on to claim the utility of being able to report both the pseudoquintile, and even changes in pseudoquintiles, as evidence of achievement and progress.

A further source of mathematical astonishment is the use of subtraction in Table 3. This operation generates the new and allegedly profound efficiency DDI by subtraction of graduation DDI from enrolment DDI across 23 institutions. The hidden assumption is that the RSSD functions behave additively or linearly for any fixed n. This assumption is false. For example, two persons both 3 units distant from their destination may be anywhere between 0 and 6 units distant from one another.
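
A small numerical example with two invented simplex profiles makes the failure of additivity explicit: both profiles lie at the same RSSD from a common reference, yet the RSSD between them is neither zero nor the sum of the two distances.

```python
import math

def rssd(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

reference = [0.25, 0.25, 0.25, 0.25]
# Two invented profiles, both the same RSSD (about 0.346) from the reference ...
a = [0.55, 0.15, 0.15, 0.15]
b = [0.15, 0.55, 0.15, 0.15]
print(round(rssd(a, reference), 3), round(rssd(b, reference), 3))
# ... yet their mutual RSSD is neither 0 nor the sum of the two distances.
print(round(rssd(a, b), 3))    # about 0.566
```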

The same false assertion of additivity is again applied in the construction of Figure 2, in which the various DDIs for overall staff and the seven staff component categories are aggregated by concatenation, against a vertical axis for cumulative DDI values. Indeed, this introduces a new mathematical faux pas: summing both the whole and some of its parts.

The RSSD is not a quantity of measurement in terms of a reliable unit of any kind. It is not even a count. The RSSD values cannot therefore be claimed to admit a valid arithmetic of addition or subtraction. They also do not admit ratio comparisons within or across institutions.

It is correct to treat RSSD as an ordinal feature, and hence we can admit rankings as offered extensively in the paper. We would be able to infer that on some rankings one university has a higher RSSD than another, but we would have difficulty in explaining what such difference in ranking meant per se for any decision-making. Further criteria from beyond mathematics would have to be argued and debated, and their fitness for purpose examined.

There is a lurking hint that RSSD values should be tracked over time, and that universities should be able to exhibit trajectories towards lower values. Again, we can make such comparative judgements over time within single institutions, on the basis of ordinality, but the judgements do not have the power to inform decision-making, except as self-fulfilments.

Some final paragraphs of Govinder et al. impute intransigence in the higher education sector on the grounds that, some 23 years after the visible fall of apartheid, the universities have not yet reached national profiles adequate for these authors. A litany of allegations is neatly composed: passive resistance, denial of failure, abuse of autonomy, abhorrence of accountability, failure of government to steer or monitor, the state cowed by the privileged and impervious to the voice of the disadvantaged, conservatism.

All these allegations are worthy of debate, but it is a form of intellectual bullying to hide behind a mathematical formula as the justification for unspecified 'extraordinary measures'. The intended punitive actions assume that all playing fields prior to the imposition of the reference profile are level, and that the location of the problem of inequity lies singly and only in the universities themselves.

The imperative to adopt national reference profiles does not ameliorate in any way the profile of school-leavers apparently eligible for university entrance and technically capable of graduation. The rationale for urgent measures purports that the obstacles to better profiles are solely the fault of universities, and that no other constraints or preconditions or simultaneous imperatives apply.

The RSSD does not address the notions of real distance from home to institution, of term-time accommodation, of local travel costs and constraints, of access to books and technology, of adequate preparation, of emotional support, of scholarships, or of differential living costs across rural and urban settings.

In respect of prospective employees, the RSSD does not take into account the composition of pools of available candidates; the effects of competing positions in commerce, business and industry; or varying forms of family responsibility and cultural preferences of the candidates themselves.

These issues too are worthy of debate. It is not possible by mere fiat for universities to set aright the suffering of this society, by admission, graduation, research and employment profiles that match a national reference profile. Indeed, it has not been possible for democratic government in South Africa to achieve corresponding laudable goals in housing, education, nutrition, health, transport and employment in 20 years. It is legitimate to argue that some of the outcomes reflect difficult initial conditions rather than dereliction, fault or animosity within universities.

All societal change is contextual and inherently unpredictable. What is necessary is debate about mechanisms that work and the necessary conditions for their success. In such debates we may hold all role players mutually accountable for processes that eliminate or moderate suffering and injustice. It is a dubious principle to rule out regional objectives on the grounds that they reflect imbalances and injustices of the past. Contextualisation is not ipso facto a reneging on justice.

If we may not contextualise and if only DDI conformity matters, we can only comply by ensuring that no criteria other than the national demographics alone intrude into our decisions. There is only one way of verifying that conformity: by being able to demonstrate that only random selection from the national profile (and nothing else) is exercised at the level of every decision concerning individuals at universities. We require demonstrable random selection from the reference race and gender groups for admission, selection, passing, graduation, employment and promotion.

This argument is not a trite parody of the arguments of the DDI authors; it is unfortunately the essence of their position. It is also the basis upon which they diagnose culpable indifference, or worse, at the universities.

The issues of equity, development and democracy need robust engagement. They require open minds and open hearts. The DDI should be left in the Euclidean cupboard. There are too many flaws to warrant prolonged discussion. Let us rather debate the injustices and the needs authentically, and clarify the nature of processes and resourcing which will have some chance of offering a better future for all.

The great flaw in the DDI as a stand-alone methodology is that it permits only a partial view of outcomes of complex processes. The method focuses upon one set of outputs - demographics - but ignores all inputs and all process characteristics that precede and lead to those limited outcomes.

Such an approach cannot claim equity as a hallmark of its achievement. Yet the approach of these authors also predicates a whole white box of cause-and-effect relations dominated entirely by the leadership of institutions, as if no other role players exercised either effect or judgement.

 

Dancing with other divergence demons

In the latter segment of the second paper, the authors seek to expose recourse to quality (and extent) of scholarly output as an apparent disguise for intransigence, often invoked in their view by several target universities. The methods of the paper seek to correct for advantageous effects arising through retaining privileged DDI profiles, within various aggregate and per capita indices of research output. Partitioned scatter plots contrast the locations of the universities.

Scatter plots and their partitions may be meaningful as depictors of relationships between characteristics but only to the extent that the underlying coordinate systems are meaningful. Even then the plots have an inherent limitation. When we reduce, say 23 universities, to only the two characteristics in use within the scatter plot, all emerging graphical insights are filtered through the AOTBE (all other things being equal) lens.

In science, especially in human sciences, we have to take into account the distortions of this lens. We seldom mean that all other factors have been eliminated or effectively controlled by suitable balancing for equivalences. We usually mean that all other factors are ignored because the task of observing them and taking them into account is too difficult or too costly in time or money, or perhaps impossible.

Our lens and inferences must in almost every case be modified from AOTBE to AAOTBEU (almost all other things being equally unknown). This term describes a qualitatively different set of conditions, and alerts us to the practice that distinguishes scholarship from slippery reasoning and sleight of hand. That practice is to declare explicitly any ignorance or unknowability or limitations.

In the scatter plots (Figures 3 and 4) of the second paper, no such caveat is offered. All 23 points for 2011 data are plotted in each case. The choices to partition the scatter plots are admissible, but both the relevance and the adequacy of two sets of four groupings are open to challenges based upon other information.

We note that the weighted research output is a counting device. This count aggregates all papers published and all degrees awarded. The count does so in a manner that notes the existence but does not distinguish between any levels of quality of the publications and theses. All these elements are regarded as interchangeable in their weight classes.

The weighted count numbers are a bureaucrat's attempt to quantify scholarship, and remain subjects of debate, even as they are also sources of funding. These numbers appear in column 2 of Table 6.

Again a false assumption of admissible arithmetic is imposed on these numbers. The DDI overall numbers of column 2 in Table 5 are divided into the bureaucrat counts of Table 6. The results enter Table 6 at column 8, labelled 'equity-weighted research output'.

The inadmissibility of these various arithmetics is papered over implicitly by the loose use of the word index. What valid and honest scholarship requires is a contestation around the observable phenomena, not mathematical smokescreens.

Several grave dangers of the DDI as methodology have now been made further apparent in a press release from UKZN.3 Reported recommendations, apparently accepted and approved already by a Ministerial Transformation Oversight Committee, chaired by one of the authors, are drawn from the second paper. These elements include 'realistic targets for high-level knowledge production linked to equity', in respect of which Table 6 column 8 of the second paper conveniently asserts UKZN in the first rank.

The press report also makes various claims for time periods to attainment of demographic profiles by named institutions. Their source is allegedly a seminal study report published in the South African Journal of Science, for which neither the first nor the second paper provides any formal evidence. Thus we note a new confusion has been introduced into public life.

This confusion is an assertion that rigorous estimation of the passage of time from some current profiles to the attainment of DDI = 0 can be offered. The estimates of the periods specified range from 40 years for academic staff to 43 years for overall staff of the institutions generally. For particular institutions, the estimated periods include 261 years and 382 years for Stellenbosch University and the University of Cape Town staff numbers, respectively. No estimated standard errors accompany these estimates - an interesting omission.

In the latent scatter plots for 23 institutions over time, there will be fewer points than 23 in earlier scatter plots. There will be 23 distinct scatter plots of two (perhaps more) time points each. Private correspondence indicates there are precisely two time points, but ongoing data collection is expected to produce DDI values for more retrospective time points. Thus we currently have 23 time series analyses, one for each institution, based on exactly two observed values and one observed difference over time!

Whether the seminal study or the author of the unexpurgated press release is responsible for the time series analysis is as yet unclear. But a ministerial committee apparently believes in the AOTBE approach applied to two consecutive data points. They buy into inferences of periods spanning between at least 40 and at most 382 years before the required demographic profiles are reached, and without indications of imprecision. This type of pervasive foresight can only be matched by the prognostications of astrology, but unfortunately not by the application of scholarly methods.

The danger is that such perverse conclusions will determine policy, predicated on an assumption that scholarship has driven these inferences.

A further recommendation apparently specifies '20% of each institution's block grant must be reprioritised to address equity transformation [because] there is no cheap or mahala [free] transformation'.

 

Conclusions

The various DDI manifestations thus far offered in pursuit of an illusion speciously labelled as equity should be rejected outright as invalid and misleading in name and content and implied authority. The DDI may be more fully debated. However, no DDI will yield measurement in a scientific sense. Thus, for any specified set of counts (students, staff, etc.) the choice of DDI applied may be used as an ordinal variable, and can support rankings only. DDI values cannot support arithmetic, either within or across indices.

The contrasts between notions of divergence and notions of equity need to be clarified. The debate about equity, including its meaning and attainment, has to embrace the reality of suffering and injustice in South Africa. This debate may include the universities, but the other institutions also warrant attention, preferably of a rational rather than pejorative kind. The universities have a dual part in this debate, as objects of enquiry and voices of observers.

Many processes may be required to eliminate injustice and promote more rapid access to better life circumstances. Elimination of injustice cannot be adjudicated by evidence only from a mere calculation. Both the legitimacy and role of any arithmetic have to be firmly clarified. Otherwise the invocation of one or more indices becomes a vehicle of bureaucratic self-gratification, rather than a series of ordinal indicators, each indicative of only one possible objective at a time.

This position does not exonerate universities from accountability. It affirms a collective obligation of an examination of conscience in robust debate. However it also claims that true transformation is a matter of the heart and an issue of complexity, which warrants authentic scholarship rather than fumbling mathematical conjuring.

 

References

1. Govinder KS, Makgoba MW. An Equity Index for South Africa. S Afr J Sci. 2013;109(5/6), Art. #a0020, 2 pages. http://dx.doi.org/10.1590/sajs.2013/a0020

2. Govinder KS, Zondo NP, Makgoba MW. A new look at demographic transformation for universities in South Africa. S Afr J Sci. 2013;109(11/12), Art. #2013-0163, 11 pages. http://dx.doi.org/10.1590/sajs.2013/20130163

3. Seminal study devises Equity Index to measure the pace of transformation in South African universities [UKZN press release]. 2013 Oct 24 [cited 2013 Dec 10]. Available from: http://www.ukzn.ac.za/news/2013/10/24/seminal-study-devises-equity-index-to-measure-the-pace-of-transformation-in-south-african-universities

 

 

Correspondence:
Tim Dunne
Department of Statistical Sciences
PD Hahn Building, Upper Campus
University of Cape Town, Rhodes Gift 7707, South Africa
Email: tim.dunne@uct.ac.za

 

 

© 2014. The Authors. Published under a Creative Commons Attribution Licence.
