
Acta Academica

On-line version ISSN 2415-0479
Print version ISSN 0587-2405

Acta acad. (Bloemfontein, Online) vol.53 n.1 Bloemfontein  2021

http://dx.doi.org/10.18820/24150479/aa53i1.3 


 

Surveillance capitalism as white world-making

 

 

Yvonne Jooste

Dr Y Jooste, Department of Jurisprudence, University of Pretoria. E-mail: yvonne.jooste@up.ac.za

 

 


ABSTRACT

The era of 'surveillance capitalism' as a new logic of accumulation that claims human experience as free raw material necessitates an understanding of how corporate-controlled digital communication technologies govern and structure how we come to know the world. This article investigates surveillance capitalist operations and argues that they enable (1) algorithmic colonisation, (2) oppressive digital practices that reify bias along racial lines, and (3) the turning of bodies into objects in the creation and maintenance of whiteness. Through presenting these different arguments, a larger point emerges, namely, that surveillance capitalist operations must be understood as intimately tied to the project of white world-making.

Keywords: surveillance capitalism, digital surveillance, racial surveillance capitalism, algorithmic bias, algorithmic colonisation


 

 

Introduction

In this article, I reflect on surveillance capitalism (Zuboff 2019) from the viewpoint of what George Yancy (2008: xvi) terms 'white world-making'. The article considers a number of perspectives: firstly, the notion of algorithmic colonisation as investigated by Abeba Birhane (2020) is explored in order to highlight the colonialist rhetoric that forms part of surveillance capitalist operations. Secondly, I discuss the inherent bias embedded in algorithms and argue that surveillance capitalism, through algorithmic operations, entrenches social injustice and racial discrimination by enabling sexist and racist cataloguing and profiling. Thirdly, I consider Mirzoeff's (2020) understanding of racial surveillance capitalism and its relation to conquest patterns and the logic of colonialism to turn bodies into objects in the creation and maintenance of whiteness. As part of elaborating on Mirzoeff's arguments, I also shed light on the idea of "racial capitalism" (Robinson 1983) as well as on Simone Browne's (2015) examination of the surveillance of blackness. Through these arguments, I, firstly, hope to contribute to the call to critically interrogate corporate-controlled digital communication and data-driven technologies, which increasingly structure knowledge and govern how we come to know the world. The aim is to highlight the harmful consequences of these technologies and locate them within the architecture of racial capitalism. And secondly, each of the lines of thinking investigated in this article serves to ultimately contribute to the contention that the project of surveillance capitalism (as a project of technological racialisation that employs colonialist rhetoric and that seeks to turn bodies into raw material free for the taking) must be understood as intimately tied to the project of white world-making.

Before discussing these perspectives, I firstly elaborate on Yancy's notion of 'epistemic white world-making'. In the section that follows, I discuss Shoshana Zuboff's formulation of 'surveillance capitalism' (2019). These notions frame the enquiry and serve as a point of departure from which the different perspectives mentioned above are considered.

 

Racial world-making

Yancy builds on a number of anti-black racism theorists (most notably, Frederick Douglass, Frantz Fanon, Lewis Gordon and WEB Du Bois) to explore the subjectivity of black bodies under a white hegemonic gaze. He (2008: xvi) argues that the black body is fundamentally linked to the history of whiteness, which is expressed in a number of ways (through policing, politics, denial, and brutality). For Yancy (2008: xvi), from the perspective of whiteness, the black body is criminality itself and is deemed the quintessential object of the ethnographic gaze or the object of anthropology. The black body is constructed as antithetical within a binary logic that points to the white body's "own signifying and material forces to call attention to itself as normative" (2008: xvi). A few of these contentions call for further elaboration.

Firstly, Yancy's arguments above are in keeping with the general contention in whiteness studies that "whiteness" is to be understood not merely as a generic skin colour or ethno-racial identity but rather as connoting a social location of structural racial privilege, economic advantage and cultural dominance (Taylor 2004: 229).1 According to Modiri (2017: 9), this location in turn then produces an epistemic standpoint in which the world is seen "whitely" - the way whites see the world becomes the way the world is. Yancy (2008: 3) recalls Fanon's (1967: 110) observation that "not only must the black man be black; he must be black in relation to the white man". This relational dimension means that "blackness" is constituted and configured vis-à-vis the construction of whiteness as transcendental norm, synonymous with humanity and civilisation. As Yancy (2008: 3) states:

To say that whiteness is deemed the transcendental norm is to say that whiteness takes itself to be that which remains the same across a field of difference. Indeed, it determines what is deemed different without itself being defined by that system of difference. Whiteness is that according to which what is nonwhite is rendered other, marginal [...] inferior, uncivilised.

Therefore, although the term "whiteness" invokes ideas related to skin colour, it, more importantly, refers to a structural position - to a racialised social identity positioned as superior relative to other "races" within a system of racial hierarchy. Cancelmo and Mueller (2019) explain that in this sense whiteness embodies both a material reality as well as a symbolic reality. The material reality of whiteness is connected to the disproportionate economic and political power wielded by those racialised as white, and the symbolic reality is the cultural meanings attached to whiteness as a form of inflated value, morality, aesthetics, and civilisation (Cancelmo & Mueller 2019).

Yancy (2008: xvi), therefore, regards whiteness as transcendental norm and, as such, argues that the black body has the "twisted fate" to be subjected to "white forms of disciplinary control, processes of racist embodied habituation, and the epistemic of white world-making". I rely on Yancy as this phrasing emphasises the fact that whiteness operates as a symbolic structure around which meanings and values are organised (rather than a representation of how individual whites might feel about their level of social empowerment) (Alcoff 2008: x), or, put differently, that the validity of knowledge is structured around whiteness. The idea of white world-making, in this article, will serve to suggest that surveillance capitalism is connected to a racialised way of seeing the world that draws on a history of colonial exploitation. Indeed, whiteness can be understood as the result of "social and cultural processes, rooted in a global history of European colonialism, imperialism, and transatlantic slavery" (Cancelmo & Mueller 2019). Today, the legacy of colonialism continues and is maintained through various institutions, ideologies, and everyday social practices.

Along the lines of Yancy's thinking, I understand the term white world-making to refer to ways of feeling, thinking, and seeing the world that reproduce whiteness. Further, I understand white world-making as intimately tied to the project of what some scholars have referred to as "global coloniality" - that which describes the amalgamation of colonialism, racism, capitalism, imperialism, and Eurocentrism into the dominant social, cultural and political order and episteme of the world (Modiri 2017: 119).2

White world-making therefore embraces practices, ideologies, logics and the legacies of European colonialism in social orders and forms of knowledge. European colonialism imposed racial, political and social hierarchical orders that ascribed value to some people while disenfranchising others. My interest lies specifically in some of the practices of white world-making as rooted in colonial exploitation, namely, the logic of conquest in the declaring and imposing of new social realities and facts; the drive to colonise in the pursuit of power and economic interest; the creation of hierarchies of difference in order to extract value; and the turning of bodies into objects or sources of raw material. I elaborate on each of these practices below in order to argue that the project of surveillance capitalism was and is shaped by a specific racialised way of seeing the world.

Before elaborating on this contention, in the next section, I discuss the idea of surveillance capitalism.

 

Surveillance capitalism

Shoshana Zuboff's (2019) formulation of surveillance capitalism has been much analysed and discussed. Individuals have become aware of the fact that contemporary digital platforms of the Web, retail and e-commerce, smart infrastructure systems, and mobile telecommunications produce vast amounts of detailed data about users - our preferences as consumers, spatial and temporal patterns, online behaviour, "hopes, beliefs, desires" (Cinnamon 2017: 609) are all recorded and tracked in order to become 'known entities' (Lawrence 2018) toward economic ends. Thus, our relationships and networks, our physical infrastructures, as well as our devices are all being repurposed for data extraction and profit (Lawrence 2018). Zuboff (2019: 2) describes surveillance capitalism as a "new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales" and she locates these practices within a market-driven process where the commodity for sale is individuals' behavioural data. The capture and production of behavioural data relies on mass surveillance of the internet (Lawrence 2018).3 The most notable firms employing these practices provide free online services but make profits by collecting and scrutinising online behaviour and activity to produce products that further their commercial objectives. Although some of the data is applied to product or service improvement, the rest is declared as "behavioural surplus" and ultimately fabricated into "prediction products" traded in "behavioural future markets" (Zuboff 2019: 8). Google, for example, has arguably become the most powerful corporation in the history of the world through these trading operations (Zuboff 2018: 15-17). Zuboff (2019: 8) asserts that these prediction products, created through and refined by algorithmic processing, anticipate what consumers will do "now, soon, and later". Prediction products are sold mainly for the purposes of advertising - the more sophisticated and effective the targeted advertising, the more valuable the prediction product. Targeted advertising is a form of online advertising that focuses on the specific traits, interests, and preferences of an individual consumer (Lawrence 2018). It becomes necessary for firms to acquire ever-more-predictive sources of behavioural surplus to stay competitive (Zuboff 2019: 8).

Zuboff (2019: 8), in tracing the history of surveillance capitalism, therefore, describes an instrumentarian power that constantly seeks to shape and know human behaviour through the automated medium of increasingly "ubiquitous computational architectures of smart networked devices, things, and spaces". This instrumentarian power has far-reaching implications for democracy, human dignity, freedom, and the right to privacy. Zuboff, as such, sketches a rapidly accelerating phase of capitalism based on asymmetrical personal data accumulation (Cinnamon 2017: 609), where unprecedented economic value is generated for the corporations that control these digital architectures. To be sure, the use of personal data in advertising, strategic marketing and client management is not new (Cinnamon 2017: 609). However, for Zuboff (2019: 504-512), this is a new era of personal data analytics defined by a new logic of accumulation because surveillance capitalism sharply diverges from neoliberal ideas about the market as inherently unknowable.4 As Cinnamon (2017: 610) asserts, "perhaps the foremost principle of big data analytics is that every actor, event, and transaction can be made visible and calculable" and "knowability and visibility in surveillance capitalism is wildly asymmetrical; power is sharply concentrated in the hands of a small number of Web companies, data brokers, and retailers". Although the extent of the acceleration of personal data analytics as a core economic strategy has been largely unanticipated, surveillance scholars have long pointed to the increasing processes of dataveillance as a mechanism for social manipulation and control in the information age (Cinnamon 2017: 610).

For Lawrence (2018), this drive of dominant platforms is the apotheosis of neoliberal rationality and he suggests that the "marriage of neoliberal rationality and geolocational technologies that track and trace everyday life" has led to an explosion of capitalist power that drives spiralling inequality. By referring to Wendy Brown's (2015: 17, 31-32) contention that neoliberalism is the encoding of all fields of activity into an economic register, he argues that neoliberalism has found its most complete form so far in the fact that all digital activities are transformed into acts of profit under surveillance capitalism. Transforming every action into a market action is "the economisation of society and the neutering of the uncertain, hopeful natality of political life" (Lawrence 2018; Arendt 1958: 9). Zuboff (2019: 94) demonstrates precisely how surveillance capitalism seeks to turn our entire lives into behavioural data, eroding the privacies of life and amassing power that, as mentioned, threatens democracy and democratic contestation, and reduces our ability to shape our own lives. Lawrence (2018; Brown 2015: 17) argues that if neoliberalism is the "disenchantment of politics by economics", then, under surveillance capitalism, all acts are market acts, and all of society is turned into a site of digital labour and an engine of accumulation. Ultimately, Zuboff's analysis asks pressing questions about how we want to live, and she demonstrates the harmful implications of living within increasingly ubiquitous digital architectures that seek to make us knowable in order to predict our behaviour. She highlights, in detail, the conditions of unfreedom within societies of techno-capitalist control.

As mentioned above, in the sections that follow I consider a number of perspectives in order to argue that surveillance capitalism must be understood as connected to and enabled by the project of white world-making. As such, I attempt to argue that surveillance capitalism as an economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales can also be understood as part of the long history of colonial exploitation. Indeed, I argue that it is only within the contours of colonial logics that surveillance capitalism can flourish. Birhane's (2020) exploration of algorithmic colonisation, discussed below, demonstrates that surveillance capitalist operations employ a colonial rhetoric in their efforts to organise knowledge and to create what is to be perceived as legitimate knowledge. In this way, Birhane's (2020) arguments also point to the contention that the validity of knowledge is structured around whiteness; in the case of algorithmic colonisation, the knowledge structures of digital and technological architectures are explored. As part of the discussion of algorithmic colonisation, I also highlight the idea that conquest patterns depend on the making of declarations that impose new facts on social reality. This discussion deepens Birhane's (2020) argument and also relates to Mirzoeff's (2020) contentions, discussed in the last section, that surveillance capitalism should be understood as racial surveillance capitalism.

 

Algorithmic colonisation

In her work on the algorithmic colonisation of Africa, Birhane (2020: 391) points to the rampant tendency toward technological solutions for social, political and economic problems. The author (2020: 389) traces this tendency on the African continent and argues more broadly that in the Global South, technology that is developed from Western perspectives, values and interests is imported with little regulation or critical scrutiny. Her work examines how Western tech monopolies "with their desire to control and influence social, political and cultural discourse share common characteristics with traditional colonialism" (2020: 391). Whereas traditional colonialism is historically driven by governmental and political forces, algorithmic colonialism is driven by corporate agendas of wealth accumulation and takes the form of state-of-the-art algorithms and AI-driven solutions to a number of social problems (Birhane 2020: 391). Traditional colonial powers sought unilateral power and domination of the colonised, declaring control of social, economic and political spheres in a manner that benefited themselves. Algorithmic colonialism is marked by domination and control that occurs through invisible and nuanced mechanisms, specifically, the control of digital ecosystems and infrastructure. Common to both traditional and algorithmic colonisation, therefore, is control of core communication and infrastructure mediums (Birhane 2020: 391).

Birhane (2020: 391), like Zuboff (2019), points to the fact that tech monopolies, driven by profit maximisation at any cost, assume that human behaviour and actions are raw material free for the taking. Birhane (2020: 391) states: "Knowledge, authority and power to sort, categorise, and order human activity rests with the technologist, for which we are merely data producing 'human natural resources'". Birhane (2020) relies to an extent on the assertions made by Zuboff regarding 'surveillance capitalism' explained above. She (2020: 391-392) specifically refers to Zuboff's analysis of the unfolding of conquest patterns in three phases: firstly, colonial powers invent legal measures to provide justification for invasion. Secondly, declarations of territorial claims are legitimised and institutionalised as tools for conquering in order to impose a new social reality. Lastly, the building of ecosystems of commerce, politics, and culture further entrenches legitimacy and inevitability. Although Birhane (2020) doesn't discuss Zuboff's assertions in further detail, some of Zuboff's remarks regarding conquest declarations deserve further elaboration and also tie into the discussion of 'racial surveillance capitalism' below.

In referring to the work of John Searle (2010: 85-86), Zuboff (2019: 99, 177) explains that a declaration is a particular way of speaking and acting that establishes facts "out of thin air". In the process of colonial conquest, declarations assert a new reality by describing the world as if a desired change were already true - "we make something the case by representing it as being the case" (Searle 2010: 85-86). Declarations are inherently invasive as they impose new facts on the social world, devising ways for others to agree with those facts.

Zuboff (2019: 99) further contends that the history of capitalism is marked by "taking things that live outside the market sphere and declaring their new life as market commodities". In this assertion, she relies on the work of historian Karl Polanyi (2001) and specifically his 1944 grand narrative regarding the "great transformation" to a self-regulating market economy. Polanyi (2001: 75-76) describes the origins of this translation process in three crucial mental inventions that he terms "commodity fictions". The first revolves around the idea that human life could be subordinated to market dynamics and reborn as "labour" (Zuboff 2019: 99; Polanyi 2001: 75-76). The second fiction refers to the fact that nature could be translated into the market and reborn as "property", "land" or "real estate" (Zuboff 2019: 99; Polanyi 2001: 75-76). And the third fiction was that exchange could be reborn as "money" (Zuboff 2019: 99; Polanyi 2001: 75-76). Marx had previously described the taking of lands and natural resources as the original "big bang" that ignited modern capital formation, a process that he termed "primitive accumulation" (Marx 1992; Zuboff 2019: 99). Hannah Arendt (2004: 198) later complicated both Marx's and Polanyi's formulations, rightly observing that primitive accumulation cannot be described as a once-off primal explosion birthing capitalism. Rather, it is a "recurring phase in a repeating cycle as more aspects of the social and natural world are subordinated to the market dynamic", and further, "the primitive accumulation primal explosion or Marx's 'original sin' of simple robbery" had to "eventually be repeated lest the motor of capital accumulation suddenly die down" (Arendt 2004: 198). Of course, as Zuboff (2019: 99) points out, in our time, this cycle has become so pervasive that we fail to notice its claims. What Zuboff ultimately shows, and as explained above, is that surveillance capitalism continuously seeks to translate ever more of our lives into behavioural data, turning our activities, actions, habits, communications and spatial and temporal patterns into market commodities. This drive, as mentioned, is described by Zuboff (2019: 2) as a new economic logic that depends on internet surveillance for profit and economic competitiveness.

In her argument on algorithmic colonisation, Birhane (2020: 392) uses an obvious but important example of conquest declaration: in 2016, Facebook attempted to create a population density map of the African continent using population data, computer vision techniques and high-resolution satellite imagery. According to Birhane (2020: 392), in this instance, Facebook assigned itself the authority for mapping, controlling and creating population knowledge of the African continent:

In doing so, not only does Facebook assume that the continent (its people, movement, and activities) are up for grabs for the purpose of data extraction and profit maximisation [...] it also assumed authority over what is perceived as legitimate knowledge of the continent's population.

Colonialist rhetoric is echoed in a number of statements made by Facebook regarding the project: 'creating knowledge about Africa's population distribution', 'connecting the unconnected', and 'providing humanitarian aid', entrenching the notion that the African continent was inevitably in line to receive these "solutions" (Birhane 2020: 392).

Further, as much of the continent's digital infrastructure and ecosystem is controlled and managed by Western tech monopoly powers, exploitation efforts are characterised by declarations to 'liberate the bottom billion', helping 'the unbanked bank' and 'connecting Africa' - these declarations are therefore dressed in technological help for the developing world (Birhane 2020: 393). Exploitative practices are further marked by an 'evangelical advocacy' (Birhane 2020: 394) for anything that mentions innovation or Artificial Intelligence (AI).5

For Birhane (2020: 393-394), there seems to be blind trust and no critical engagement regarding the motives of invested parties that seek to monetise, quantify and capitalise on every aspect of human life. She (2020: 396) pointedly asks about the relevance and appropriateness of AI software developed with the values, norms and interests of Western societies for the African continent. Further, attempts to solve complex social problems - complex cultural, political and moral issues embedded within specific histories and contexts - are reduced to problems that can be measured and quantified.

Birhane (2020: 397) argues that the idea that all problems can effectively be solved through the application of the right technology transforms humans into passive objects instead of "active meaning seekers embedded in dynamic social, cultural, and historical backgrounds". Ultimately, in the technologist framework, moral questions are to be dictated by corporate interests. To be clear, technology can, in many contexts, deepen and extend human freedom. Birhane's arguments do not involve any type of contention that technology is inherently damaging; rather, her arguments call for a critical interrogation regarding digital and algorithmic expansion to the African continent, and specifically, the shaping and sustaining of asymmetrical power relations between the Global North and the Global South (Ndlovu-Gatsheni 2013, 2014). I read Birhane as locating the 'algorithmic colonisation of Africa' within a longer history of colonial matrices of power and technologies of subjection that "produced African subjectivity as that which is constituted by a catalogue of deficits and a series of 'lacks'" (Ndlovu-Gatsheni 2013: 3). Further, the discourses of 'data mining', 'data rich continent', and 'data abundance' disregard the actual individual behind each data point and are reminiscent of the coloniser attitude that declares humans as material (Birhane 2020: 397-398).

In the section below, I discuss the idea of algorithmic bias and its implications for racial social justice. Apart from employing colonialist logic in its operations, surveillance capitalism, viewed from the perspective of algorithmic bias, can also be seen as entrenching existing racial bias and oppression. As Birhane highlights, one of the most erroneous and harmful conceptions regarding automated systems is that they are objective, value-free and unbiased.

 

Algorithms of oppression

In her influential work, Weapons of math destruction: how big data increases inequality and threatens democracy, Cathy O'Neil (2016: 21) remarks that "algorithms are opinions embedded in code". This refers to the fact that, just like any other technological artefact, code is not neutral but inherently political and therefore has important societal implications insofar as it might support certain political structures and certain actions and behaviours (Hassan & De Filippi 2017: 88). Moreover, data-driven decision making has been shown to be implicitly biased (Hardt 2014). Allegedly neutral algorithms systematically discriminate against minority groups by employing generalisations and producing results that may be catalogued, for instance, as sexist or racist (Guarino 2016).

Along the same lines, Safiya Umoja Noble (2018) has explored the discriminatory effects of algorithmic classification. Noble (2018) focuses specifically on Google's ubiquitous search engine algorithms as integral to the socio-technical production of digital and 'real world' inequalities. Algorithms "reinforce oppressive social relationships and enact new models of racial profiling" (Noble 2018: 1). Noble's (2018: 171) work can be located within the emergence of the field of critical algorithm studies as well as along the lines of a cogent intersectional framework of "black feminist technology studies". She (2018: 1-5) demonstrates how a simple search for "black girls" delivers dehumanising pornographic references to black women as well as racist vitriol. A number of searches using Google's autosuggest and image functions reveal contrasting representations of black and white women, reflecting Google's hegemonic narratives and frameworks. Noble, therefore, interrogates the corporate-controlled digital communication technologies which increasingly structure knowledge. As Noble (2018: 148) states:

[...] the search results retrieved in a commercial search engine create their own particular material reality. Ranking is itself information that also reflects the political, social and cultural values of society that search engine companies operate within.

What is more, not only do Google search rankings reflect what people desire and, as such, the cultural and social environment within which Google operates; there is also a remarkable lack of diversity in Google's workforce (in 2016, 96% of US employees were white) (Sharma 2019: 592). Noble (2018: 69) argues that this technological racialisation has evolved from ideologies foundational to the web's construction: individualism, militarism, and consumption, which take whiteness and maleness as norms. Therefore, bias is not merely the result of coding errors but is part of the very architecture and language of certain technologies and, as such, systemic and entangled with the operations of digital racial capitalism (Noble 2018: 9). Further, hierarchical listings of Google search engine results are influenced by a plethora of factors, including user queries, incoming links, and advertising revenue - "not only does search ranking promote and reify dominant ideologies of racism and misogyny, but it also effectively marginalises alternative voices and representations from surfacing" (Sharma 2019: 594). Search results are, therefore, corrupted by a potent combination of advertising interests, search engine optimisation and neoliberal values (Galliah 2019: 1).

Noble (2018) calls for a re-evaluation of the implications of our information resources being governed by corporate-controlled advertising companies while offering a defence of the internet as a public resource - "democratising technology in the pursuit of racial equality and justice". Noble's (2018) assertions create a sense of urgency for ubiquitously networked societies in the era of surveillance capitalism, especially when considering the fact that Google now owns an 89.95% share of the global search engine market as well as the fact that many societies have grown dependent on Google and its associated products such as Gmail, Chrome, Calendar, Cloud, Google Scholar, Maps, and Chromebooks.

The privileging of certain biases, interests and groups in algorithmic operations comes to the fore in a variety of sectors. Cinnamon (2017: 616), in his analysis of the practices of corporate dataveillance as threats to social justice, refers to the emergence of a 'scored society' as a powerful example of how data 'maldistribution' and its algorithmic processing can lead to 'misrecognition' and injustice. Cinnamon (2017: 611-612) relies on the theory of 'abnormal justice' as conceptualised by Nancy Fraser (2008: 393-422) to explain how the practices of surveillance capitalism enable injustices of data maldistribution (the issue of class inequality and distributive injustice that deny some people the ability to participate equally in social life due to lack of resources) and further enable significant injustices of socio-cultural misrecognition (which occurs when institutionalised hierarchies afford some citizens or groups a higher status in society at the expense of others, which results in inequality and an inability to shape one's own identity). Data maldistribution and misrecognition occur through algorithmic data processing, classification and predictive analysis. By relying on Fraser (2008), Cinnamon (2017) makes a strong argument regarding the threats to social justice and the entrenching of racial injustice through and by algorithmic operations and dataveillance.

Take for example how credit scores, determined by algorithms, shape our identity and status, thereby determining our financial and material circumstances. One study has shown that roughly 25% of credit scores contain serious inaccuracies arising from the maldistribution of debts as well as weak matching criteria (Cinnamon 2017: 616). Further, the industry of "alternative data" uses new sources of potential data gleaned from online activities and social media to calculate the credit scores of the 'underbanked'. As Cinnamon (2017: 616) asserts, "relying on the vagaries of one's online identity can produce serious inaccuracies that wrongly shape a person's creditworthiness".

Cinnamon further explains that automated approaches are increasingly being developed and deployed in entirely new forms, categorising people according to their predicted future behaviours and outcomes. The data and calculation parameters used in these scores are opaque and largely unregulated and unknown to the public. The author (2017: 616) states:

Inability to secure a loan, mortgage, job, or health insurance due to inaccurate placement in a 'risk' category is clearly unfair - however, the accuracy of the classification is perhaps unimportant in the context of social justice - accurate or not, personal scoring systems 'make up people'; they produce new social categories of difference and restrict our ability to shape our own sense of self, a clear threat to the parity of participation in social life [emphasis added].

What is more, some algorithms are purposefully developed to enable companies to engage in illegal forms of discrimination (Cinnamon 2017: 616). These companies use proxy datasets to hide oppressive practices. For example, although it is illegal to discriminate against potential property renters based on race or socio-demographic characteristics, "algorithms can be designed that intentionally avoid advertising on social media platforms to users from deemed undesirable backgrounds or statuses, which can be inferred from analysis of their Web activities, such as their 'likes' on Facebook" (Cinnamon 2017: 616). A range of discriminatory practices are being conducted via algorithmic processing of personal data; this includes differential pricing by retailers, predatory lending to vulnerable groups, racial profiling, and higher life insurance rates for those people suspected of having disease or illness (Cinnamon 2017: 616).

Additionally, as Birhane (2020: 399) points out, the use of technology in social or public spheres often focuses on punitive practices, "whether it is to predict who will commit the next crime or who may fail to repay their loan". Technology designed and applied with the aim of security often results in cruel and inhumane practices. In the South African context, for example, Swart (2021) has investigated the city of Johannesburg's major upgrades to existing CCTV camera systems. These upgrades involved the use of smart technology and facial recognition software in maintaining law and order. By referring to a number of studies, Swart (2021) demonstrates the possibility of false arrests, the targeting of innocent citizens and unfair discrimination against certain groups. Swart (2021) therefore points to the dangers of predictive policing and associated punitive practices as well as the inhumane outcomes of facial recognition systems, usually based on unrepresentative datasets.

For Cinnamon (2017: 610), the injustices of these specific practices are inherent to the current mode of capital accumulation. He further contends that although surveillance capitalism feeds on individuals, its target is automatically generated groupings of people based on behavioural attributes and propensities (Cinnamon 2017: 615). Personal data analytics produces virtual representations of us, described as 'data doubles' - these so-called doubles result in oppressive control over our identities (Cinnamon 2017: 615). In this regard, Cohen (2017: 14) explains that the purpose of algorithms of behaviour is to "make human behaviours and preferences calculable, predictable and profitable in the aggregate". More specifically:

The economic value of personal data is largely realised in the aggregate - patterns and future potential become visible when data points are linked together through data analytics and algorithmic processing (Cohen 2017: 14).

Therefore, our future actions and behaviours are predicted based on our present data doubles, which in themselves are created through algorithmic operations that are designed to serve corporate interests in a number of life spheres and business sectors. Ultimately, surveillance capitalism has serious consequences for social justice. As Cinnamon (2017: 615) argues, the implication of our inability to access data and benefit from data analytics is "the giving up of a degree of control over our lives as we are increasingly subject to classification and profiling". Cinnamon (2017: 615) further states:

Big data algorithms and data mining are fundamentally about discrimination; their purpose is to separate society into groups through identifying patterns of difference and sameness - or norms and deviations from them - in vast data sets through processes of data reduction and categorisation. These processes can simplify a complex world - or a complex data stream - however, in doing so they also enact and constrain the world; they reduce and narrow possibilities for action.

Zwitter (2014: 5) argues that big data makes random connectedness on the basis of random commonalities extremely likely. As such, we are shaped "before we make up our own minds" about who we are. Further, as Cinnamon (2017: 616) contends, "in the rulebook of surveillance capitalism, inaccuracies that might affect an individual are only problematic insofar as they are ineffective at accurately predicting patterns and behaviours" and as Cohen (2017: 14) argues, "[a]s long as [the] project is effective on its own terms - an outcome that can be measured in hit rates or revenue increments - partial (or even complete) misalignments at the individual level are irrelevant".

To be clear, the practices of dataveillance and big data operations under the economic model of surveillance capitalism hold damaging consequences for all who are subjected to digital communication technologies. However, what scholars like Cinnamon and Noble demonstrate is the capacity of these operations to entrench social injustice and racism. Scholars in the field of critical algorithm studies have demonstrated, in a number of contexts, that algorithmic operations and other related technologies are not objective and neutral.6 Rather, they frequently reflect the political, social, and economic environments in which they operate. As such, we cannot only ask about these operations' tendency to produce new categories of social difference; we also have to ask about the implications of their operating along existing lines of difference and institutional hierarchies. In essence we have to ask, in Browne's (2015: 8) wording, about the discrimination against those "who are negatively racialised" by surveillance capitalism. In this way, the operations that Noble and Cinnamon explain (under surveillance capitalism) should be interrogated for their potential to produce new forms of racist habituation and subjugation. Browne (2015: 8) describes racialised surveillance as those instances "when enactments of surveillance reify boundaries along racial lines, thereby reifying race, and where the outcome of this is often discriminatory and violent in treatment". To put it another way, "although [digital] surveillance is penetrating deeply throughout society, its penetration is differential" (Fiske 1998: 85). Mirzoeff's discussion below should make this point clearer. The author draws attention to what he terms 'racial surveillance capitalism'. Mirzoeff's thinking, as with algorithmic colonisation and racialised digital architectures, points to the fact that surveillance capitalism involves a familiar racialised way of seeing the world. As such, Mirzoeff's contentions form an integral part of presenting an understanding of surveillance capitalism as part of the project of white world-making.

 

Artificial vision and white space

According to Mirzoeff (2020: preface), before words comes seeing, and "before seeing comes the space in which to see". Mirzoeff (2020: preface) explains that the way of seeing that arises in the space in which to see erases "so as to produce white space, which came to be claimed for absolute ownership". White space, in this context, is the product of the systemic erasure of colonised territories. Mirzoeff (2020: preface) describes 'whiteness' in line with the descriptions explained above, namely, as the "place of organising". For Mirzoeff (2020), then, coloniality operates by erasing the cultures and ways of life that were there before. This erasure then enables forcefully imposing a new way of seeing the world (a white space) and new knowledge systems on to colonised peoples, specifically knowledge systems and ways of seeing and living structured around whiteness - the norms, values and symbolic structures of colonisers as well as the implied inferiority of the colonised and superiority of the coloniser. My interest in Mirzoeff's (2020) contention lies in the relationship that the author draws between modern digital surveillance technologies and the colonial surveillance of black bodies. Both of these projects depend on creating a space in which to see, or simply put, they depend on creating a specific way of seeing the world for their justification. For Mirzoeff (2020), this way of seeing is structured around whiteness as the place of organising. Two specific paragraphs are of importance (Mirzoeff uses the plantation in depictions of transatlantic slavery as a site through which to consider racial surveillance capitalism):

Assertions that 'surveillance capitalism is young' fail to account for its long role in generating and sustaining racial surveillance capitalism on stolen land in the plantation and the factory. Sustaining racialised hierarchy is and was co-dependent with the extraction of value by means of persistent surveillance of those excluded from [humanity] (Mirzoeff 2020: preface).

And

'[T]he plantation uncovers a logic that emerges in the present and folds over to repeat itself anew'. In this case, that logic is the means by which plantation oversight continues to structure the automated systems of racial surveillance capitalism. White space [whiteness as rooted in colonial exploitation] subsequently metamorphoses a person into a commodity. It transubstantiates life into value or renders life into data... This logic rendered life into property, the process of enslavement by which a body becomes an object according to colonial law. (Mirzoeff 2020: 1)

Mirzoeff (2020: preface), therefore, reads surveillance capitalist operations as the continuance of coloniality deploying different technologies - whether ships, trains, or drones. Firstly, it is necessary to explain that Mirzoeff's reading relies on acknowledging the well-documented relationship between racism and capitalism, or rather racial capitalism. A detailed explanation is not possible here, but broadly, this idea was first expounded on by Cedric J Robinson in his text Black Marxism: The Making of the Black Radical Tradition (1983). Robinson critically interrogated the Eurocentrism of Marxism and argued that it did not account for the racial nature of capitalism. In this regard, Robinson criticised Marx for his failure to consider the importance and revolutionary potential of radical social movements outside Europe. As Ashe (2020) explains, for Robinson "the idea of race and the processes of racial differentiation were key components of how class identities were imagined". Robinson highlighted capitalism's tendency towards fragmentation rather than homogenisation and insisted that pre-capitalist society lived on in the capitalist system (Ashe 2020). For Robinson, racism is a means by which the relationship between the modes of production, and the associated differentiated labour, is recorded, managed and legitimated (Kundnani 2020). Relying on the work of Oliver Cromwell Cox (a major figure in the intellectual tradition of black Marxism), Robinson rejected ideas that capitalism was a radical break with the feudal system.7 Rather, Cox pointed to the idea that capitalism emerged from within feudalism and in concurrence with pre-existing forms of racialism already present in Western feudal society (Ashe 2020). According to Kelley (2017), "capitalism and racism ... did not break with the old order but rather evolved from it to produce a modern world system of racial capitalism dependent on slavery, violence, imperialism and genocide". Importantly, for Robinson, "capitalism was 'racial' not because of some conspiracy to divide workers or justify slavery and dispossession, but because racialisation had already permeated the Western feudal society" (Kelley 2017). Racialisation within Europe was very much a colonial process involving invasion, settlement, expropriation, and racial hierarchy (Kelley 2017). As explained by Hudson (2017: 13), "racial capitalism suggests both the simultaneous historical emergence of racism and capitalism in the modern world and their mutual dependence". Importantly, as Bhattacharyya (2018: 101) explains, racial capitalism "is not a way of understanding capitalism as a racist conspiracy or racism as a capitalist conspiracy". Rather, capitalism is inextricably linked with the histories of racist expropriation. Robinson, therefore, encouraged thinking about capitalism as a world system intimately tied to the history of racism.

Mirzoeff poses the question of the constitutive genealogies of surveillance capitalism and also draws attention to the long history of oversight and control of black bodies in colonial exploitation. Within the project of coloniality, 'surveillance' has a long history. Indeed, as Browne (2015: 6) notes, "to cue surveillance in and of black life [i]s a fact of blackness" and "surveillance is nothing new to black folks. It is the fact of antiblackness." (2015: 10) Wynter (1999) similarly draws this connection in her formulation of the "sociogenic principle" as that which determines the category of human and that frames blackness as an object of surveillance. Frederick Douglass (2004: 87) also notes how surveillance functioned as a comprehensive and regulating practice of slave life: "at every gate through which we were to pass, we saw a watchman - at every ferry a guard - on every bridge a sentinel - and in every wood a patrol. We were hemmed in upon every side." Browne (2015: 8), especially, has explored the surveillance of blackness, arguing that within the discipline of surveillance studies, race has remained undertheorised and no serious consideration has been given to the role of surveillance in the archive of slavery and the transatlantic slave trade. Browne (2015: 6) contends that by placing race at the centre of surveillance studies, "another mode of reading surveillance can be had" and we can discover "new ways of understanding surveillance in contemporary life". Importantly, Browne (2015: 8) contends that the historical formation of surveillance is not outside the historical formation of slavery. In a detailed examination of slave surveillance practices, Browne (2015: 17) asserts that rather than seeing surveillance as something inaugurated by new technologies, we should see it as ongoing, which "is to insist that we factor in how racism and antiblackness undergird and sustain the intersecting of surveillance in the present order." Browne (2015: 16-17) states:

To say that racialising surveillance is a technology of social control is not to take this form of surveillance as involving a fixed set of practices that maintain a racial order of things. Instead, it suggests that how things get ordered racially by way of surveillance depends on space and time and is subject to change, but most often upholds the negating strategies that first accompanied European colonial expansion and transatlantic slavery that sought to structure social relations [around] whiteness.

Thus, when "surveillance capitalism" becomes surveillance and capitalism, Mirzoeff's analysis leads us to see surveillance capitalism as interlocked with racial capitalism and racialised surveillance.

 

Conclusion

An initial criticism of the thinking that I have presented in this article might be that surveillance capitalism tracks and controls all individuals, who are essentially required to make use of surveillance capitalist technologies in the modern world. The operations of surveillance capitalism that declare humans as free raw material for the purposes of data extraction, prediction and sales essentially turn all bodies into property. My contention is not that surveillance capitalism only targets specific groupings of people or only presents potential damage for some. Indeed, Zuboff (2019: 8) essentially argues that if we thought that our "voices, personalities, emotions" could not be possessed in capitalist terms, the era of surveillance capitalism has made it so for all individuals subjected to the digital communication technologies that we have become dependent on. Rather, my contention is that surveillance capitalism draws on a history of colonial exploitation and its concomitant racialised ways of seeing the world. As mentioned above, today, the legacy of colonialism continues and is maintained through various institutions, ideologies, and everyday social practices. The question that arises and that precedes the argument that surveillance capitalism is connected to a racialised way of seeing the world can be formulated as follows: where or how can we detect in surveillance capitalist operations the residue and traces of the colonial project? This question can be answered in a number of ways and the arguments above in no way provide a thorough theoretical account. Rather, my focus above was on drawing some connections and tracing some of the logics at play in surveillance capitalism vis-à-vis colonial expressions.

Firstly, Birhane's (2020) reasoning around the algorithmic colonisation of Africa highlights the colonial logic present in the expansion of surveillance capitalism into new territories. The continent is seen as inevitably in line to receive technological "solutions", and, as such, Birhane locates this rhetoric within the colonial matrices of power. The author invokes conquest patterns that depend on imposing new facts and claims to legitimate knowledge. With regards to Google's vast economic power and expansion, Zuboff (2019: 338) notes that "Google imposed the logic of conquest, defining human experience as free for the taking, available to be rendered as data and claimed as surveillance assets".

Secondly, although surveillance capitalism penetrates deeply throughout society, its penetration is differential. Some groups of people are negatively racialised by its operations, reifying boundaries along racial lines and as such reifying practices of white world-making. We, therefore, have to ask about the implications regarding surveillance capitalism's enabling of new forms of racist habituation and subjugation. Browne (2015) draws attention not only to differential penetration, but also to the fact that the historical formation of surveillance is not outside the historical formation of racial oppression. According to Browne (2015: 8), racialised surveillance is "of course... not the entire story of surveillance", however, her analysis leads us to ask about the historical and central role of surveillance practices in the creation of racial hierarchy.

Thirdly, Mirzoeff's understanding of racial surveillance capitalism insists on the racialised nature of capitalism. Robinson's insight that capitalism is inextricably linked with the histories of racist expropriation, on its own, necessitates an understanding of the constitutive genealogies of surveillance capitalism.

Further, surveillance capitalism employs the logic of turning bodies into raw material free for the taking, turning bodies into objects, transubstantiating life into value, into data. The logic of rendering life as property, of turning bodies into objects from which value can be extracted, is not new. Rather, this strategy accompanied European colonial expansion, which sought to extend the project of white world-making. Fanon's (1967: 109) description of the experience of epidermalisation is worth noting here. This reasoning points to the white hegemonic gaze as constructing the black body into "an object in the midst of other objects". Under the matrices of surveillance capitalism, Zuboff (2019: 12) notes:

The body is simply a set of coordinates in time and space where sensation and action are translated as data. All things animate and inanimate share the same existential status in this blended confection, each reborn as an objective and measurable, indexable, browsable, searchable "it".

These methods reduce individuals to the lowest common denominator of sameness - an organism among organisms - despite all the vital ways in which we are not the same. In this new regime, objectification is the moral milieu in which our lives unfold. [Surveillance capitalism] poaches our behaviour for surplus and leaves behind all the meaning lodged in our bodies, our brains, and our beating hearts [...] (2019: 337).

Although there are important contextual differences here in the sense that Fanon points to the imposition of race in black life, while Zuboff's analysis interrogates the extent of data extraction throughout modern society, I read Zuboff's paragraphs in relation to Fanon's formulation of epidermalisation as both pointing to a process of objectification in which bodies become legible as property. As mentioned above, I understand the notion of white world-making to mean ways of feeling, thinking, and seeing that reproduce whiteness and as intimately tied to the project of global coloniality. My interests, therefore, lie in the logics that make possible strategies of conquest declarations rooted in colonial matrices of power, the turning of bodies into objects, and racialised surveillance. These logics point to interwoven histories that set the structural conditions within which surveillance capitalism could and can continue to occur. To say that surveillance capitalism is part of the project of white world-making is to say that it draws on an episteme constructed through interlocking forces of oppression, including racial capitalism, colonialism, imperialism and conquest.

Locating surveillance capitalism within the project of white world-making in no way attempts to ascribe new or different qualities to it. Rather, tracing some of the longer histories and oppressive knowledge structures in the workings of surveillance capitalism might help us to disentangle the logics and conditions that allow it to thrive.

 

Bibliography

Alcoff LM. 2008. Foreword. In: Yancy G. Black bodies, white gazes: the continuing significance of race. Maryland: Rowman and Littlefield Publishers.

Arendt H. 2004. The origins of totalitarianism. New York: Schocken.

Ashe SD. 2020. Racial capitalism. Global Social Theory. Available at: https://globalsocialtheory.org/topics/racial-capitalism/ [accessed on 4 June 2021].

Berger J. 1972. Ways of seeing. Harmondsworth: Pelican.

Berry P. 2020. Troubleshooting algorithms: a book review of Weapons of math destruction. McMaster Journal of Communication 12(2): 91-96. doi: 10.15173/mjc.v12i2.2450.

Bhattacharyya G. 2018. Rethinking racial capitalism: questions of reproduction and survival. Maryland: Rowman and Littlefield Publishers.

Birhane A. 2020. The algorithmic colonization of Africa. SCRIPTed 17(2): 389-402. doi: 10.2966/scrip.170220.389.

Brown W. 2015. Undoing the demos: neoliberalism's stealth revolution. New York: Zone Books. doi: 10.2307/j.ctt17kk9p8.

Browne S. 2015. Dark matters: on the surveillance of blackness. Durham: Duke University Press. doi: 10.1515/9780822375302.

Cancelmo C and Mueller JC. 2019. Whiteness. Oxford Bibliographies. doi: 10.1093/OBO/9780199756384-0231.

Cinnamon J. 2017. Social injustice in surveillance capitalism. Surveillance and Society 15(5): 609-625. doi: 10.24908/ss.v15i5.6433.

Cohen JE. 2017. The biopolitical public domain: the legal construction of the surveillance economy. Philosophy & Technology 31: 1-33. doi: 10.1007/s13347-017-0258-2.

Fanon F. 1967. Black skin, white masks. Transl Charles Lam Markmann. New York: Grove Press.

Fiske J. 1998. Surveilling the city: whiteness, the black man and democratic totalitarianism. Theory, Culture and Society 15(2): 67-88. doi: 10.1177/026327698015002003.

Fraser N. 2008. Abnormal justice. Critical Inquiry 34(3): 393-422. doi: 10.1086/589478.

Galic M, Timan T and Koops B. 2017. Bentham, Deleuze and beyond: an overview of surveillance theories from the panopticon to participation. Philosophy & Technology 30: 9-37. doi: 10.1007/s13347-016-0219-1.

Galliah SA. 2019. Algorithms of oppression: Safiya Umoja Noble's powerful exploration of search engines' underlying hegemony and their racist, sexist practices. The Liminal: Interdisciplinary Journal of Technology in Education 1(1): 2-5.

Gilroy P. 1993. The black Atlantic: modernity and double consciousness. Cambridge: Harvard University Press.

Guarino B. 2016. Google faulted for racial bias in image search results for black teenagers. Washington Post. 10 June. Available at: https://www.washingtonpost.com/news/morning-mix/wp/2016/06/10/google-faulted-for-racial-bias-in-imagesearch-results-for-black-teenagers/ [accessed on 4 June 2021].

Hardt M. 2014. How big data is unfair: understanding unintended sources of unfairness in data driven decision making. Medium. Available at: https://medium.com/@mrtz/how-big-data-is-unfair-9aa544d739de.

Hassan S and De Filippi P. 2017. The expansion of algorithmic governance: from code is law to law is code. Field Actions Science Reports 17: 88-90. Available at: http://journals.openedition.org/factsreports/4518.

Hudson PJ. 2017. Bankers and empire: how bankers colonized the Caribbean. London: University of Chicago Press. doi: 10.7208/chicago/9780226459257.001.0001.

Kelley RDG. 2017. What did Cedric Robinson mean by racial capitalism? Boston Review. Available at: http://bostonreview.net/race/robin-d-g-kelley-what-did-cedric-robinson-mean-racial-capitalism.

Kundnani A. 2020. What is racial capitalism? Available at: https://www.kundnani.org/what-is-racial-capitalism/ [accessed on 4 June 2021].

Lawrence M. 2018. Control under surveillance capitalism: from Bentham's panopticon to Zuckerberg's 'Like'. Political Economy Research Centre Blog. Available at: https://www.perc.org.uk/project_posts/control-surveillance-capitalism-benthams-panopticon-zuckerbergs-like/ [accessed on 4 June 2021].

Marx K. 1992. Capital. Transl D Fernbach. Harmondsworth: Penguin.

Mirzoeff N. 2020. Artificial vision, white space and racial surveillance capitalism. AI and Society. Published online. doi: 10.1007/s00146-020-01095-8.

Modiri J. 2017. The jurisprudence of Steve Biko: a study in race, law and power in the 'afterlife' of colonial-apartheid. PhD thesis. Pretoria: University of Pretoria.

Ndlovu-Gatsheni SJ. 2013. Global coloniality and the challenge of creating African futures. Strategic Review for Southern Africa 36(2): 181-202. doi: 10.35293/srsa.v36i2.189.

Noble SU. 2018. Algorithms of oppression: how search engines reinforce racism. New York: New York University Press. doi: 10.2307/j.ctt1pwt9w5.

O'Neil C. 2016. Weapons of math destruction: how big data increases inequality and threatens democracy. New York: Crown Publishing Group.

Polanyi K. 2001. The great transformation: the political and economic origins of our time. Boston: Beacon.

Quijano A. 2000. Coloniality of power, eurocentrism and Latin America. Nepantla 1(3): 215-232. doi: 10.1177/0268580900015002005.

Robinson CJ. 1983. Black Marxism: the making of the black radical tradition. Chapel Hill: University of North Carolina Press.

Rose J. 1986. Sexuality in the field of vision. London: Verso.

Searle JR. 2010. Making the social world: the structure of human civilization. Oxford: Oxford University Press. doi: 10.1093/acprof:osobl/9780195396171.001.0001.

Sharma S. 2019. Book review: Noble SU, Algorithms of oppression. Ethnic and Racial Studies 43(3): 592-594. doi: 10.1080/01419870.2019.1635260.

Swart H. 2021. Face-off: South Africa's population register is on course to becoming a criminal database - with your mugshot. Daily Maverick. 3 March. Available at: https://www.dailymaverick.co.za/article/2021-03-03-face-off-south-africas-population-register-is-on-course-to-becoming-a-criminal-database-with-your-mugshot/ [accessed on 4 June 2021].

Taylor P. 2004. Silence and sympathy: Dewey's whiteness. In: Yancy G (ed). What white looks like: African-American philosophers on the whiteness question. New York: Routledge.

Wynter S. 2015. The ceremony found: towards the autopoetic turn/overturn, its autonomy of human agency and extraterritoriality of (self-)cognition. In: Broeck S and Ambroise JR (eds). Black knowledges/black struggles: essays in critical epistemology. Liverpool: Liverpool University Press. doi: 10.5949/liverpool/9781781381724.003.0008.

Yancy G. 2008. Black bodies, white gazes: the continuing significance of race. Maryland: Rowman and Littlefield Publishers.

Zuboff S. 2019. The age of surveillance capitalism: the fight for a human future and the new frontier of power. London: Profile Books.

Zwitter A. 2014. Big data ethics. Big Data & Society 1(2): 1-6. doi: 10.1177/2053951714559253.

 

 

First submission: 23 April 2021
Accepted: 7 June 2021
Published: 5 July 2021

 

 

1 Scholars such as Ralph Ellison, James Baldwin, Frantz Fanon, Toni Morrison, WEB Du Bois and bell hooks have maintained that whiteness lies at the heart of racist subjugation. Whiteness studies, as an interdisciplinary field that employs a wide variety of approaches, is committed to disrupting racism through problematising whiteness as reproducing white supremacy and white privilege. The term 'whitely' involves "a commitment to the centrality of white people and their perspectives" (Taylor 2004: 230).
2 As Ndlovu-Gatsheni (2013: 181) explains, "[g]lobal coloniality is a modern global power structure that has been in place since the dawn of Euro-North American-centric modernity. This modernity is genealogically and figuratively traceable to 1492 when Christopher Columbus claimed to have discovered a 'New World'. It commenced with enslavement of black people and culminated in global coloniality. Today global coloniality operates as an invisible power matrix that is shaping and sustaining asymmetrical power relations between the Global North and the Global South. Even the current global power transformations which have enabled the re-emergence of a Sinocentric economic power and de-Westernisation processes including the rise of South-South power blocs such as BRICS, do not mean that the modern world system has now undergone genuine decolonisation and deimperialisation to the extent of being amenable to the creation of other futures. Global coloniality continues to frustrate decolonial initiatives aimed at creating postcolonial futures free from coloniality."
3 Galic et al. (2017: 10-11, 32-34) distinguish between three phases of surveillance theory-building. The first phase involves Bentham and Foucault and revolves around the Panopticon and panopticism. This phase can be categorised as offering architectural theories of surveillance, "where surveillance is largely physical and spatial in character (either in concrete, closed places such as institutional buildings or more widespread in territorially based social structures) and largely involves centralised mechanisms of watching over subjects" (Galic et al. 2017: 32). Panoptic structures are theorised as "architectures of power"; through panoptic technologies, surveillance enables the exercise of power, not only directly but also through the (self-)disciplining of the watched subjects. The second phase shifts the focus from institutions to networks, "from relatively ostensible forms of discipline to relatively opaque forms of control" (2017: 32). According to Galic et al., this phase offers infrastructural theories of surveillance, "where surveillance is networked in character and relies primarily on digital rather than physical technologies" (2017: 32). It, therefore, refers to distributed forms of watching over people, "with increasing distance to the watched and often dealing with data doubles rather than physical persons" (2017: 33). This phase encompasses the theoretical accounts of Deleuze, of Haggerty and Ericson, and of Zuboff, which share the feature of critically questioning "not only the power structures in contemporary network societies and how surveillance reinforces, or sometimes undermines, these, but also how we can conceptualise this power play beyond panoptic effects of self-disciplining" (2017: 9). In the third phase, surveillance theory builds on perspectives from the first two phases and aims to conceptualise surveillance "through concepts or lenses such as dataveillance, access control, social sorting, peer-to-peer surveillance and resistance" (2017: 33). Galic et al. assert that, "with the datafication of society, surveillance combines the monitoring of physical spaces with the monitoring of digital spaces. In these hybrid surveillance spaces, not only government or corporate surveillance is found, but also self-surveillance and complex forms of watching-and-being-watched through social media networks and their paradigm of voluntary data sharing" (2017: 33).
4 For Zuboff (2019), surveillance capitalism differs from the history of market capitalism in the following ways: firstly, surveillance capitalism insists on the privilege of "unfettered freedom and knowledge" (498). Secondly, it abandons long-standing "organic reciprocities with people". And thirdly, "the specter of life in the hive betrays a collectivist societal vision sustained by radical indifference and its material expression by Big Other" (495-496). Put simply, for Zuboff surveillance capitalism represents a new era of personal data analytics defined by a new logic of accumulation, because it sharply diverges from the neoliberal idea of the market as inherently unknowable. For more detail, see Zuboff (2019: 495-512). Surveillance capitalism also rescinds the organic reciprocities with people that have long been the mark of capitalism. It anticipates the behaviour of populations, groups and individuals. The business model of surveillance capitalist operations relies not only on people as consumers but on users as sources of raw material for products aimed at a new business customer. With regard to Zuboff's third point, surveillance capitalism represents a new form of collectivism in which it is the market, not the state, that concentrates both knowledge and freedom within its domain (504). Further, a "radical indifference" is applied whereby content is judged by its volume, range and depth of surplus as measured by anonymous clicks, dwell times and likes, "despite the obvious fact that its profoundly dissimilar meanings originate in distinct human situations" (2019: 505).
5 Birhane (2020: 395-396) explains that the atmosphere at one of the major technology conferences in Tangier, Morocco, embodied this tech-evangelism. CyFyAfrica 2019, The Conference on Technology, Innovation, and Society, is one of Africa's largest annual technology conferences, attended by various policy makers, UN delegates, ministers, governments, diplomats, media, tech corporations, and academics from over 65 (mostly African and Asian) nations. She contends that, although these leaders want to place "the voice of the youth of Africa at the front and centre", the atmosphere was one that can be summed up as a race to get the African continent 'teched-up' (Birhane 2020: 395-396).
6 See for example B Friedman and H Nissenbaum (1996) Bias in computer systems. ACM Transactions on Information Systems 14(3): 330-347. In recent years, a growing number of social scientists, legal theorists and philosophers have critically engaged with the rise of algorithms, machine learning and other techniques used by data scientists. Algorithms have become an issue of social concern as they penetrate numerous spheres of human life. As Berry (2020: 92) explains, "algorithms control our smallest, most miniscule choices, to our largest, life-defining decisions". From home-loan approvals to university rankings, online advertising, law enforcement, human resources, credit lending, insurance, social media, politics and consumer marketing, algorithms operate within these systems, "collecting, segmenting, defining" in all spheres of human life.
7 See Oliver Cromwell Cox, Caste, Class, and Race: A Study in Social Dynamics (1948); The Foundations of Capitalism (1959); Capitalism and American Leadership (1962); and Race Relations: Elements and Social Dynamics (1976).
