Minutes from June meeting

Kitchin article discussion:

  • The piece was chosen for its attempt to understand what Big Data is, complementing the previous meeting’s discussion on how to understand data broadly. Reading the article retrospectively raised questions about how our understanding of Big Data has developed since 2014 (when the article was written) and whether we would add any new criteria for what makes something Big Data.
  • It was initially suggested that there could be three new additions to what constitutes Big Data: it belies its own infrastructure, requiring infrastructure while presenting itself as seamless; there is a degree of immediacy to Big Data (in its use rather than its collection, which falls under Kitchin’s ‘velocity’ point); and it is not just about creating an archive of data but about using the data gathered to obtain action in the present, i.e. Big Data sets are pragmatic. Alexia provided ten common characteristics of Big Data, which can be found here.
  • A difference between when Kitchin’s article was written and now is that a consensus has developed regarding inductive/deductive/abductive methods. The inductive approach has become more common, but not in a strictly positivist manner; it is used to identify points which necessitate a closer reading.
  • The topic of Big Data as a discursive device was then brought up. This is distinct from Big Data as literal data sets and is instead a way to claim expertise through a huge set of ‘knowledge’ in the form of data. Big Data allows for a ‘frontstage’ projection of expertise to the public by virtue of the size of the data, giving the perception that findings drawn from this data must be ‘correct’. It was felt that Kitchin invokes Big Data as a discursive device here in order to distinguish Big Data as a field of study (this being the first article in Big Data & Society). This ‘setting the stage’ by Kitchin is a narrative power play in itself, an attempt to establish the epistemological stance for studying Big Data. Kitchin takes the discursive topic of Big Data being ‘new’ and gives us a way to understand it. This then raised the question of whether we should see Big Data as a method on one hand and an everyday phenomenon we now experience on the other.
  • The authority of Big Data came up, based on Kitchin’s contrast between the census (which asks around 40 questions per person) and Big Data, which can collect hundreds of data points per online interaction, plus further metadata. This gives the perception that data has become so large that it must be processed computationally and cannot be handled by people. This brought up the idea that Big Data presents itself as a natural step in data processing; that this move towards data being entirely computed is inevitable. In this sense, Big Data reflects the functioning of Ideology in subordinating every previous ‘paradigm’ of data collection and analysis to itself. This notion of Big Data as inevitable and natural was contrasted with Gleick’s (2011) The Information: A History, a Theory, a Flood, which presents the work done in molecular biology as influencing how computers function, rather than the functioning of computers being inevitable.
  • Kitchin’s reference to the ‘end of theory’ Wired article was raised. Big Data was seen here to be more an end of hypothesis than an end of theory, with Big Data making everyone a theorist in how they read data and the results drawn from it. It was also felt that the ‘end of theory’ concept seemed to be talking much more about the natural sciences and the scientific method than the social sciences.
  • The ‘end of theory’ debate was seen to have utility in opening up the conversation about what actually counts as theory and what counts as expertise. Big Data has helped to usher in a new form of governance and regulation superseding Reaganism, as seen in how the pandemic has been handled. People like Dominic Cummings are able to use the authority that comes with Big Data to gain positions of power and make actual changes (whether good or bad, they are still changes). The opening up of what constitutes expertise isn’t necessarily a bad thing and shouldn’t be automatically dismissed. The ‘end of theory’ acts in a similar way to Fukuyama’s ‘end of history’ by hiding the Ideology inherent in the shifting form of governance. However, this raises the question of why Big Data does not always have an impact, with climate change science as the specific example: it has long had scientific backing but was, and is, still regularly ignored by huge numbers of people, so what is the difference between this use of Big Data and someone like Dominic Cummings’s use?
  • The Kitchin article was seen to have practical use too, in raising how Big Data can be put to positive ends through a compassionate, humanistic use of it. This necessitates recognising that Big Data does not have a God’s eye view of the world (which Kitchin points out), and so helps to show what a compassionate data model should look like. This model should be dynamic in its understanding of social processes, providing ‘very detailed models of environments in flux (as opposed to at freeze-points in time and space)’, according to Kitchin.