Good Data book discussion
- While chapter three was seen as somewhat administrative/dry in its presentation, chapters eight and twelve worked together to present alternatives to current data processes, such as the concept of the commons and the question of who owns and governs data in chapter twelve. Part of this concept of the data commons was an examination of how data is currently viewed through a neoliberal/individualistic lens, often justified through ideas of consent and transparency around data. However, it is worth noting that the question of what constitutes ‘good data’ was left for each chapter to define on its own terms, which often resulted in ‘good data’ still being positioned as transactional. Indigenous data processes offer a critique of both this transactional form of data and totally open forms of data, showing that it can be good not to circulate data at all.
- The question of how ‘goodness’ can be attached to data came up. This question addressed how the phrase ‘good data’ may not be accurate as a description (it can suggest that data has some inherent ‘goodness’, rather than being constructed), which can cause foundational issues. The example of Henrietta Lacks was given here. Lacks had cervical cancer and, during her treatment, her cancer cells were taken, later being used ‘to test the effects of radiation and poisons, to study the human genome, to learn more about how viruses work, and played a crucial role in the development of the polio vaccine’ according to Johns Hopkins Medicine (2016). The issue here is that Lacks (and later her estate) had effectively no rights over her body and cells, or over how they were used. While the data generated from the cells may have been used for ‘good’, is the data ‘good’ given where and how it was extracted? (See Rebecca Skloot, The Immortal Life of Henrietta Lacks, New York: Crown, 2010; Maureen Dorney, ‘Moore v. The Regents of the University of California: Balancing the need for biotechnology innovation against the right of informed consent’, High Technology Law Journal 5 (1989): 333; and Jasper Bovenberg, ‘Inalienably yours? The new case for an inalienable property right in human biological material: Empowerment of sample donors or a recipe for a tragic anti-commons’, SCRIPT-ed 1 (2004): 545.)
- The process of data collection was then raised as a topic. Chapter three seemed to suggest that we should know why we’re collecting data, and what it will be used for, before we collect it. However, there may be reasons to gather data that are not for research, and people constantly give off data regardless, making this principle difficult to enact. One suggestion here was to view data not as a privacy issue but as a ‘dignity’ issue. Janice Richardson’s idea of treating data breaches as more like assault than theft (taken from Law and the Philosophy of Privacy) was raised here.
- The intended purpose of the Good Data book itself was also raised. The book was intended to be a useful public tool to be taken up by industry; essentially, it was meant to be useful and applicable. Despite this, it was felt that many of the conceptions of data here came to a dead end in our discussion. The commoning of data, for example, was deemed impractical and unlikely to be taken up by the companies we would most like to see adopt it. This raised the question of how we could act to decentralise data. Rewired State was given as an example of this, as were Google’s Dataset Search, the Victorian government’s open datasets, and Research Data Australia.
- We then came to a discussion of functional understandings of (open) data. For example, many government data stewards were seen as not understanding the importance of, or need for, open data. Conversely, one problem with the data commons suggestion was that the idea that data should only be gathered for a purpose (from chapter three) would limit the ability to collect data. The question was then raised of whether data is latent and, if so, how we govern it. There is an assumption that we inherently recognise data; in contrast, colonisers may not be able to recognise many forms of Indigenous data because they do not have the knowledge base to understand them. Kitchin’s differentiation between capta and data was raised here too (data being the phenomena which can be measured, capta being the measurement and representation of data). D’Ignazio and Klein’s justice-oriented approach to data was also suggested as a different way of approaching data, in light of the concepts of dignity and the data commons here reaching dead ends. This is the suggestion that digital citizenship in a datafied society should move away from a discourse of ethics around data, because the ethics discourse accepts the commercialised account of data and works ‘well’ within it, while a justice-oriented account may be able to challenge this framework (see Monique Mann on ‘Trust and Transparency in Contact Tracing Applications’).