Machine Behaviourism: Future Visions of ‘Learnification’ and ‘Datafication’ Across Humans and Digital Technologies – Knox, Williamson, & Bayne – 2020 – longer notes here
This paper recognises and seeks to combat the tendency within educational research to ignore datafication. It takes Biesta’s concept of learnification and suggests we have moved beyond it to datafication. The primary object of investigation is machine learning (ML) and how it influences education, with a particular focus on education being oriented around data collection through ML-driven ‘nudging’ practices.
Learnification is the privileging of the provision of learning experiences over teaching, centring an autonomous learner/consumer of educational commodities. The language of learning carries two problematic assumptions: first, that educational institutions merely respond to learners’ needs (which assumes learners fundamentally understand those needs); and second, that the resulting transactional nature of education means the only important educational questions concern the efficiency and effectiveness of the educational process. This makes education more amenable to data-intensive practices.
Learnification and datafication have worked in tandem to promote learner-centred practices. The learning analytics (LA) literature sees itself as a technical discipline, as data science. Combined with the commodified educational landscape of learnification (and its privileging of the learner), LA is able to penetrate the educational market, positioning itself as unlocking hidden truths through data. However, datafication moves past learnification, replacing the consumerist-driven education market with a data-driven one (i.e. data as a commodity is more valuable than students as a commodity). This is seen in the move to data dashboards instead of VLEs, with data on student behaviours feeding the dashboard’s ML processes. These data-centred environments are not simply passive either, but actively produce data through IoT devices. This marks a shift from the autonomous learner model of learnification to a behaviourist model of nudging.
Biesta suggests learnification moves into all sectors of life, but datafication goes beyond the human, to the machine. This makes knowledge production a human-machine assemblage, with machines accumulating data and setting limits on the scope of knowledge, while depending on people for coding and hardware building. Three forms of ML process are identified here. The first is supervised learning, in which the model is trained on labelled datasets. The second is unsupervised learning, in which the model finds patterns in unlabelled data ‘in the wild’. The third is reinforcement learning, which teaches by rewarding positive actions. AlphaGo is given as an example of reinforcement learning, learning Go from the outcomes of its own actions. Applied to people, this logic would promote a behaviourist psychology of learning, grounded in ‘animal behaviour’ and dopamine release.
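The reward loop at the heart of reinforcement learning — the paradigm the paper compares to behaviourist reward schedules — can be sketched minimally. This is an illustrative toy, not AlphaGo’s actual method: the action set, reward values, and learning parameters below are invented for the example.

```python
# Minimal reinforcement-learning sketch: an agent repeatedly chooses an
# action and nudges its value estimates toward the rewards it receives,
# analogous to the behaviourist 'reward positive actions' loop.
import random

random.seed(0)

actions = ["A", "B"]                 # hypothetical action set
reward = {"A": 0.0, "B": 1.0}        # action B is the 'rewarded' behaviour
values = {a: 0.0 for a in actions}   # the agent's learned value estimates
alpha = 0.5                          # learning rate
epsilon = 0.1                        # exploration probability

for step in range(200):
    # Epsilon-greedy choice: mostly exploit the best-valued action,
    # occasionally explore at random.
    if random.random() < epsilon:
        a = random.choice(actions)
    else:
        a = max(values, key=values.get)
    # Move the estimate for the chosen action toward the observed reward.
    values[a] += alpha * (reward[a] - values[a])

# After training, the rewarded action dominates the agent's preferences.
best = max(values, key=values.get)
```

The point of the analogy is visible in the update rule: behaviour is shaped purely by which actions were rewarded, with no model of why — which is exactly the behaviourist framing the authors find troubling when transferred to learners.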
This kind of behavioural science is prominent in contemporary governmental authority. The idea of the rational utility-maximiser has been replaced by the idea of the emotional, irrational, yet predictable agent. This agent can supposedly be nudged into more normative behaviour (and as such, responsibilised for supposedly negative behaviours). This can happen through environment designs or ‘choice architectures’, which frame certain decisions to make them seem more desirable than others. The UK government has used choice environments in their approach to education. It is also seen in the persistent tracking and predicting of student behaviours. Apps such as ClassDojo use this kind of data to promote a ‘growth mindset’, while Microsoft has used data to pre-emptively predict the need for mental health interventions for students. Overall, this shows a reconceptualisation of learning in terms of psychologically quantifiable, affective characteristics, which are to be nudged into the pre-determined, normative position.
How We Became Our Data – Koopman – 2019 – Chapter Five: Redesign – Data’s Turbulent Pasts and Future Paths – longer notes here
This chapter explores the difference between a theory of information and a theory of communication, examining the historical development of the former as a consequence of the latter. It goes on to then suggest a politics of information, which is not possible through a theory of communication. This seeks to address the rise of infopower, moving beyond the restrictions of deliberative democracy.
To do this, it is first important to recognise that information theory is not the beginning of information practices. Information theory arose in the 1940s and developed into the Cold War, but information practices were happening from the beginning of the 20th century. Koopman suggests we have ignored this early history in favour of a narrative of the military-industrial complex developing information theories and practices. Koopman does not deny that there was a consolidation of information (and thus power) through information theory, but rather seeks to examine when and how this happened, looking at the cusp of this consolidation.
Koopman begins by looking at the change in data in the 18th century, moving from received knowledge to what has been proven; moving data from an assumption to a conclusion. This was reversed in the 20th century, with data again being assumed or given, and Shannon’s information theory removing the need for data to be true or meaningful in order to count as data. Debates arose around this epistemic shift in the status of data during the 20th century, but it was the work of everyday data technicians that really changed the status of data, working with and producing data (rather than just talking about it). This quiet work still helps define how we see data today but gets lost in the story of information. The data formats we still use today were often created out of immediate necessity, without consideration of how they would be used in decades to come.
Koopman then asks: what kind of normative guidance can we provide today for ongoing technological development? This is not about programming the future but about decoding present practices. Current practices of infopower set limits on our capacity for resistance, so this exploration is vital. Koopman primarily critiques the idea that communication is the core of normative democratic theory, as this renders the politics of information invisible. He begins with Wiener’s theory of cybernetics and Shannon’s technical contributions to electrical engineering. Wiener and Shannon first suggest that communication is vital for 20th-century projects, highlighting the obsession with communication before information theory was formulated. They secondly suggest that information precedes communication: without information, communication is pointless. However, this also means communication cannot address information. Thus, we have channels of communication which are not concerned with the content of the information delivered. Shannon and Wiener are better thought of as presupposing information for communication than as contributing to information’s formation, consolidating existing information practices rather than inventing something new.
Koopman then moves to political theories of communication, particularly deliberative democracy. Deliberative democracy cannot see beyond communicativism, but this does not mean it is without merit (communication is important within democracy). Koopman suggests confronting information at the level of formats in order to improve upon deliberative democracy. He offers essentially the same critique of Habermas’ procedural communication paradigm, which holds that all discourse should be included in political institutions, provided it is communicated and processed freely. This approach can only address information formats when they become a problem for communication, rather than as their own form of politics, and it privileges existing forms of communication. It has no means of dealing with communication which is free but politically dangerous, for example. Essentially, to focus purely on the communicative procedure is ineffective; we must look to both the communicative procedure and the information design that precedes it.
This is explicitly not a rejection of information or data outright, nor a rejection of communication and its importance. Instead, Koopman seeks a practical critique of data technologies from within. He points to Dewey’s claim that a bewildered public results from disconnect from government, so that improving communicative media and allowing greater interpersonal and mechanical connection is key to modern politics. However, Dewey still falls victim to seeing all unimpeded information as positive or useful. Lippmann, by contrast, promoted the idea of the technic of experts giving voice to the public and their grievances, as well as education in the critical scrutiny of information use, recognising the potential negatives within these kinds of information practices, as well as the positives.
Koopman’s basic claim, then, is that a politics of communication is not enough; we must also engage with a politics of information, critically interrogating information to uncover the operations of power and, in turn, the possibilities of resistance to this power. This should be done through an internal resistance to technological practices, rather than an external resistance to technology altogether. It is a resistance of occupation, contestation, and transformation: questioning dynamics of information which seem incontestable and then conceptualising the politics of information in a practically different manner. Just as the information formats we have today were designed in an everyday, ad hoc manner, we should look to everyday, mundane forms of resistance from auditors, leakers, hackers, etc. who problematise infopolitics. We should also consider how newer data formats will become entrenched in the future, setting limits on how we think and communicate politically.
‘Where do we resist? In the forms, formats, and information. How do we assemble the competencies to resist these forms and formats? By learning how to reformat them. By understanding how to redesign them. By interrogating the manifold technologies with which they have been designed and redesigned. Thus does resistance for the sake of the forthcoming future require critical engagement with the proximate past.’