Summary of September reading

Capitalist Realism: Is There No Alternative? – Mark Fisher – 2009
This is just an introductory chapter on why it is easier to imagine the end of the world than the end of capitalism. If capitalism is the only coherent economic system, there can be no alternative. Capital therefore persists throughout disaster, which often happens slowly and without being noticed. We have thus come to a moment of disaster in which we believe nothing new can ever emerge. We exist only in the present, which rejects the past and tradition, as these require contestation to be alive. Instead, everything is simply assigned a monetary value; nothing remains other than Capital’s consumer spectrum which commodifies all.

This is portrayed as a virtue in capitalist realism. It allows us to accept that the world is dire but that any alternatives are even worse. As a result, the 'end of history' has been accepted in the cultural unconscious. Today we are all Nietzsche's Last Man, who has seen everything and is enfeebled by this excess of (self) awareness.

It goes on to discuss why capitalist realism is more accurate than postmodernism, the latter having been claimed as the cultural logic of late capitalism by Fredric Jameson. Fisher states that capitalist realism is a more useful term for three reasons: Jameson's writing on postmodernism is from the 1980s, when an alternative to capitalism was still being presented; postmodernism is tied to modernism, while capitalist realism takes the defeat of modernism for granted; and a whole generation has been raised since the fall of the Berlin Wall with nothing but capitalism on the horizon of the thinkable.

When data is capital: Datafication, accumulation, and extraction – Jathan Sadowski – 2019
The accumulation of data is a central tenet of 21st century political economy, and we should analyse data as a form of capital rather than as a commodity. Three broad insights for understanding data as capital are used here: data is valuable and value creating; data collection influences the functioning of companies and governments; and data systems are rife with exploitation. The world is becoming/has become datafied. However, data is not naturally occurring like an oil spring; it is created by people and read by machines (designed by humans for specific purposes). Data therefore cannot be viewed as neutral. Data as a form of capital constantly seeks to expand, resulting in the datafication of previously non-commodified areas of life. It also acts as a new imperialism through the cheap provision of services in order to datafy and monopolise large markets.

Using Capital, Vol. 1, we can discuss the economic form of data. It is viewed here as the raw material required for the production of goods, making it constant capital. It can also be a commodity, produced by the unrecognised labour of users, i.e. the audience commodity. Bourdieu's cultural and social capital may be more effective for analysis, as data capital can be converted into economic capital but not all value derived from data is necessarily economic. It is not a substitute for money but can be elevated to the same level as financial capital, due to data being subject to the logic of capital accumulation.

There are six major ways data can be used to create value. Data is used to: profile and target people; optimise systems (digital Taylorism); manage and control (data provides knowledge, which provides power, including the power to extract more data); model probabilities (and predict behaviours); build things (data platforms which ostensibly provide a service, like Uber); and grow the value of assets by slowing depreciation.

Data mining is positioned as natural, so it is important to highlight its extractive character in serving the purposes above. With data as capital in this way, there is an imperative to collect as much as possible, as often as possible, as fast as possible. Thus data extraction happens without meaningful consent or fair compensation. EULAs are positioned as consent, but they are one-sided and non-negotiable, so they are not meaningful consent. Access to platforms is framed as compensation, but there are many more companies which extract and capitalise on data than platforms people make use of. These extraction-based companies make billions of dollars, and traditional economic sectors are becoming dependent upon data too. Users see little to none of this profit, making it exploitation. It is, of course, framed by a matrix of oppression.

Data Feminism – Catherine D’Ignazio and Lauren F. Klein – 2020

This starts with the story of Christine Darden, a NASA employee who worked her way up through the ranks from the late 1960s onwards. Part of the reason she was able to do this was the data compiled by Gloria Champine, who worked at NASA's Equal Opportunity Office. The point made here is that women's experiences of oppression are often disbelieved until quantitative data is produced.

Feminism in this book is used broadly for projects which challenge sexism in an intersectional manner (how elements of a person's identity intersect, and the resultant meeting of privileges and oppressions), with the aim of creating a more equitable society. Data should be used to do this by highlighting privileges and oppressions within the status quo. This includes challenging the notion that numbers are neutral. It stands in contrast to data collection traditionally being used to categorise people and consolidate existing power relations, e.g. biometrics being used to disproportionately surveil non-white people. This is done by governments and corporations, with nothing being out of the reach of datafication, creating pernicious feedback loops of power structures.

Data feminism suggests we must trace biased data back to its source, as the data is not the root cause of the oppression. This means data can also be used to fight oppression. Data is defined loosely here, as traditional understandings of what counts as 'legitimate' data have been used to exclude people who are not white and not male. Seven core principles drive the book, derived from intersectional feminism: examine power; challenge power; elevate emotion and embodiment; rethink binaries and hierarchies; embrace pluralism; consider context; and make labour visible. These will be used to show that a project can be feminist in content, by challenging power through choice of subject matter; in form, through shifting registers of data communication; or in process, by creating inclusive processes of knowledge production.

Chapter five: Unicorns, Janitors, Ninjas, Wizards, and Rock Stars – Embrace Pluralism
Data feminism insists that the most complete knowledge comes from synthesising multiple perspectives, with priority given to local, Indigenous, and experiential ways of knowing. One example of this is the Anti-Eviction Mapping Project (AEMP), formed in 2013 to map evictions in San Francisco with antiracist, feminist, and decolonial methodologies. Its maps are multimodal: some are traditional, while others are based on oral history and deliberately unclear, intended to reflect the uncertainty of the interviewees' lives. This rejects the traditional 'cleanliness' of data visualisation and offers counterdata to the dominant viewpoint.

Data scientists are oftentimes seen as people who clean up data, with the NYT calling them the 21st century janitor. The idea here is that there is something wrong with data that needs to be put in the 'correct' order, implying there is an objectively right way for data to be. Data science's history is tied to that of eugenics, and this fixation on cleanliness within data is worth remembering. Data cleaning often just means subjecting data to the dominant ideology, which reflects and reifies the status quo. The AEMP instead sought out people who had never made maps before but were from the areas under assessment. The aim was to seek a plurality of voices without losing the context of the geographical area examined. This rejects one Truth, instead seeking a pooling of viewpoints, and requires transparency from researchers in their data collection and methodologies.

The AEMP's centring of local voices involves these local actors from the design process onwards. This is positioned as a focus on co-liberation, rather than 'data for good'. Data for good acknowledges concerns around data collection and analysis but remains vague in terms of solutions. Co-liberation has specific goals: leadership by minoritized groups in the community; resources managed by these groups; data owned by the community; a community-centred data analysis process; data scientists as community facilitators; data education and knowledge transfer; and the building of social infrastructure. Data for co-liberation is therefore more specific and larger in scope than data for good, and adheres to the principle of embracing pluralism throughout the process of working with data, from collection to analysis to decision making.
