Datafying childhood. Can schools or families resist?

This event brought together scholars working on datafication, childhood, and privacy to address concerns about how the actions of children, both in and out of school, are increasingly monitored and datafied through digital technology.

Several key guiding questions were set out: Can schools and families resist the datafication of childhood? What issues associated with datafication are unique to children and young people? What role can the nation state play in regulating multinational technology companies? How does datafication change the process of schooling? What can or should schools do to ensure the ethical use of data? And how can we ensure young people’s rights are protected without limiting the potential of digital systems and communication?

Further information on the day, including details of the program and speakers, can be found here, and a link to the stream of the event can be found here.

Summary of proceedings

Sonia Livingstone

Download link for Sonia’s PowerPoint

Sonia Livingstone’s lecture was about talking to children about their data and what is actually happening to it. This matters because children’s voices have often been ignored in formulating policy around children’s data, despite the UN Convention on the Rights of the Child stating that children should be consulted in matters which affect them. The research also arrives amid greater public understanding of data privacy in a post-Cambridge Analytica world. Privacy was formulated here as relational, contingent on interconnection with other people in order to exist. Further, privacy was understood as existing in three realms: interpersonal, institutional, and commercial. The first is privacy between people, concerning the inferences people make about one another. The second concerns the analytics that structural bodies develop from our data. The third concerns how our data is used to profile us in order to market to us as consumers.

The research itself interviewed children aged 11-16, as these are the general age boundaries established for the ‘maturity’ to consent to giving data away online in laws such as the GDPR in the EU and COPPA in the US. Children were found to both understand and feel strongly about privacy issues, although this often required first moving past the focus on e-safety that dominates discourses around children’s privacy online. However, understandings of privacy generally stayed at the interpersonal level rather than extending to the commercial uses of data. When it came to data collected in schools, there was a general lack of knowledge about where this data went, including among teachers and parents, and interviews often reached what was deemed a ‘Black Mirror’ moment, in which it was felt we were already too late to do anything about this. Despite this seemingly negative viewpoint, the main takeaway was that children do want to know what is happening to their data and that a specific age of maturity is difficult to define. The older children were, though, the more cynical they appeared about data. This might improve if we could approach the institutions and commercial bodies which make use of our data with less fatalism.


Moira Paterson and Sven Bluemmel 

This discussion focused on data regulation, particularly for minors, and its complications. Sven Bluemmel, Victorian Information Commissioner, identified two broad streams of datafication: seemingly innocent datafication via services such as Netflix or Spotify, and direct surveillance, often by the state, in childhood and beyond. These two streams were presented as coming together in an algorithmically driven manner. Sven’s concern was that this would limit creativity and critical thinking within society, with people being shaped from early on by algorithmic suggestions while being prevented from challenging orthodoxy by constant surveillance. Part of the issue, for Sven, is our transactional understanding of privacy and data exchange, particularly when we largely do not read terms and conditions and so do not really know what data we are giving away. For Sven, in the name of freedom, some things are not worth giving away as a society: just as we are not required to know health and safety law to trust that a building is safe to be in, we should not all have to be data experts in order to feel our data is safe from exploitation.

Moira, a Professor of Law at Monash, focused specifically on regulatory legislation for the commercial sector. The GDPR in the EU, for example, makes specific mention of children through its focus on the digital age of consent, which is determined by each member country. Further, Article 12 states that terms and conditions must be made available in a clear manner, understandable by children where they relate to them. It is not clear, however, how this is being enforced, so its effectiveness is difficult to assess. In the US, COPPA set the age of digital consent for data collection at 13, but given the structure of US law it is difficult to pass across-the-board federal legislation; regulation instead tends to proceed on a case-by-case basis. California’s CCPA has attempted to be less narrow, prohibiting businesses from selling the data of California residents under 16 without their consent; it also deems businesses to know the age of consumers if they wilfully disregard it. Moira concluded that these laws do offer a useful basis for regulating data in regard to children, but that the issue of a specific age of consent persists.


Neil Selwyn

The focus of Neil’s talk was how digital data is enacted within schools in practice. Data has always been collected in schools, largely fixated on grades and attendance, and this remains the case with digitised data. Neil found little evidence of the grand-scale analytics that are often spoken about with great hand-wringing actually being present in most schools. Simply put, schools do not have the time, and, as one teacher pointed out to Neil, the children have to be in school, so concerns about things like retention rates exist primarily in higher education.

It was also pointed out that this datafication of schools is a ‘British disease’: it has not really happened in Australia the way it had in Britain even five years ago. The result is that Australia can see the negatives and attempt to resist them. Concepts such as ‘Google schools’, in which all analytics and resources are provided by Google, are real and concerning, but they are not quite the threat they are sometimes made out to be. Instead, schools are often very localised, and the data they produce and use often reflects this. This does not mean every teacher knows what is happening with the data generated, though; there are internal politics of data, with the appointed ‘data people’ often being white, male maths teachers, displaying the power imbalances within schools. Bigger-scale data such as NAPLAN results is simply analysed by the state governments and spat back at schools in charts and graphs, adding to the confusion often felt about what is actually being done with the data.

What Neil saw as a hopeful light were the ‘street-level bureaucrats’ who control data within schools. Despite the internal politics mentioned above, their existence allows for local resistance to the worst aspects of datafication. This is not perfect, but it allows for the identification of issues such as the extension of work into overwork, which can often happen implicitly through datafication, and thus gives some room for optimism.


Kath Albury and Melissa Kang 

This conversation concerned how young people interact with data in the everyday, particularly in dating apps and health services such as My Health Record. Melissa Kang is an academic and a GP, working as a medical officer in Western Sydney with homeless and marginalised young people. In her research, she has found that confidentiality has become much less of a barrier to young people in Australia seeking medical advice. This is partly understood through the idea that we put so much of ourselves online today that privacy does not seem as much of an issue as it did in the past. For more vulnerable groups, however, such as international students who may be homosexual but not out to their parents, confidentiality was still a major concern. These students often sought workarounds to having their data collected, for example paying out of pocket for treatment rather than using their insurance, for fear of their parents gaining access to their records. The young people Melissa saw in her work often went to different doctors for different reasons, meaning their data was not centralised, and so they did resist the totalising nature of datafication.

Kath Albury is a Professor of Media and Communication at Swinburne University, examining young people’s self-presentation and the role of user-generated media in sexual learning. Her research has shown how people navigate dating apps through interconnection with other apps. For example, when people match on a dating app, they will often then talk on other platforms in order to verify that the person they have matched with is who they claim to be. This concern with safety and authenticity in self-presentation appeared to be a much larger issue for her participants than data exploitation or commodification. Norms of self-presentation to prove one’s authenticity were dominant in these spaces, resulting in a lot of data being given out with little concern for commercial privacy. Because these dating platforms are predicated on interpersonal connections, it is interpersonal privacy that comes to the forefront, rather than the commercial form.


Mark Andrejevic 

Download link for Mark’s PowerPoint

The focus of Mark’s talk was automation, which he presented as a consequence of datafication, with the former powered by the latter. This framing is intended to challenge how we think about privacy issues. Through examples from the ‘hype realm’ of technology patents, i.e. futuristic devices that may never actually be developed, Mark showed how the cascading logics at work can be seen as a path towards automated surveillance: first the data collection becomes automated, then the processing, and finally the end result. While these technologies may never become mainstream, the tendencies they display are the more important thing to examine.

In the patent examples, children come up repeatedly in regard to governance. The behaviour of children and its governance can thus be seen as moving in the direction of automation, with products like Google Home alerting parents to ‘mischief’ or even carrying out automated punishments for this behaviour. This norm-making is only possible through persistent and all-pervasive monitoring, seen here both in the home and in the classroom.

From this, the biases of automation can be seen, meaning the biases built into the decision to automate a process rather than the biases within the process itself. These were split into three elements: environmentality, framelessness, and operationalism. Environmentality refers to changing the environment rather than the individual in order to achieve the desired results. Framelessness refers to the ubiquity of data collection: the desire to collect everything and figure out what is useful post hoc through automated analysis. Operationalism refers to how this becomes automated, how machines no longer deal with the representational, only with what can actually be operationalised. This collapses the ability to reflect on meaning, as all that exists is the operational outcome. While this may be useful for pro-social goals, it can result in a loss of concern for causality, reducing people to simply the expected outcome in potentially black-boxed software.
