Minutes from October meeting

Discussion of Ernst and Hui chapters:

  • The Ernst piece was seen as the less relevant of the two overall, connecting to infrastructures only incidentally: the chapter primarily serves to link his conception of media archaeology to current technologies, resulting in a somewhat dislocated series of commentaries on individual technologies. His book on chronopoetics is likely a better summary of Ernst’s ideas than this chapter. Chronopoetics discusses the microtemporalities that arise from technology use, contrasting how we talk about time (deemed here to be largely artificial) with the microtemporalities of computers, which are often imperceptible to people.
  • The discussion arising from this centred on whether Ernst is saying we should de-centre the human in histories, with historical timelines being detrimental because they measure time as humans can experience it rather than the microtemporalities of technologies. This includes comparing different forms of technology and how they structure time. The comparison of AM radio to current technologies was seen as the most useful for this (or at least the most accessible), though it was noted that the escapement-based clock example is generally an important one for time-studies scholars: the escapement helped create uniform units of time, which then became central to capitalism as it measured value through labour time rather than labour outputs.
  • The Hui piece was seen as more useful, as the idea of tertiary protention was felt to be more directly applicable to technologies (and how they order our experience of time) than the media archaeology approach. Hui gives a good overview of the primary, secondary, and tertiary retentions and protentions found in European philosophies of time, contextualised here through Deleuze and Stiegler in particular. He introduces the tertiary protention to discuss how the world becomes datafied and how this datafication is pushed onto the individual. Hui also discusses how the notion of datafying the world is framed through geopolitical anxieties around a monopolar conception of international (super)power; he suggests that we need to move away from such a zero-sum and Western-centric view of technology, and seeks instead to chart the cosmotechnics of different societies. This links to our consistent aim (as a reading group) to find new ways of thinking about technology that move beyond a platform-oriented paradigm.
  • We then moved on to discussing how we could apply these types of thinking in an educational setting. One point concerned how the ‘value’ of education is measured, often through predictions of future returns in terms of benefits to the students (and, subsequently, the economy). The problem is that these predictions are unlikely to be accurate and do not really measure power as it stands in the current context. This led to an example of investment in education being measured through how much money it put into the building industry via construction costs; this, at least, was a measurement of ‘value’ with a sociomaterial element to it. The focus on future prediction via technology (the core concern of tertiary protention) seems less tangible than this; what does that say about data-driven policies, or policies shaped by the promises of data analytics?
  • This question is worth asking when such predictions are so often wrong and (in economics, for example) are more speculative guesses than an exact science. This led to a discussion of the direct impact such economic predictions have more widely. The example given was the financialisation of water in Australia, drawn from the book Sold Down The River. People betting on water futures has direct impacts on the present (as the tertiary protention concept suggests), but this goes beyond the human and actually reconfigures the earth. It is also the present shaping the future, not just the predicted future shaping the present. Similarly, the automation of increasingly small tasks (autocorrect, predictive text) seeks to minimise the time individuals spend making decisions, centralising those decisions within an app, algorithm, or tech company.
  • In the education context, GitHub Copilot (GitHub being Microsoft-owned) was brought up. This is a tool that generates code suggestions from plain-text inputs from users, drawing on a model trained on publicly available GitHub code; e.g. if users wanted a random number generator, they would describe this in plain text and Copilot would suggest an implementation. This removes one barrier to entry for coding, with the new barrier to entry being an understanding of how the infrastructure behind Copilot works. This is relevant to education in that students (in HE, at least) are often better off developing an understanding of institutional desires than of a specific subject’s goals. Thus, students often act to maximise their measured educational outcomes, based on institutional metrics, rather than pursuing a less quantifiable concern with ‘learning’.
  • Finally, there was a discussion of how all of this connects to classic theories of tool mediation and affordances. This basically asked: are we simply discussing a complicated form of tool mediation? What differentiates platforms from, say, a hammer as a tool, other than being more complicated? One suggestion was that platforms are more direct reflections of social and political power, and harder to combat, as they can make determinations about what is ‘true’ or ‘valid’. Highlighting where power lies is central to the philosophies of technology presented by both Ernst and Hui, something a mediated-affordance paradigm may not provide. This is particularly so when discussing whose predictions about the future are deemed ‘worthwhile’ and thus help to shape the future through present actions.
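To make the Copilot discussion above concrete, here is a minimal sketch in Python of the kind of prompt-to-code interaction described. The prompt, function name, and implementation are hypothetical, invented for illustration; this is not actual Copilot output.

```python
import random

# A Copilot-style interaction: the user writes a plain-text prompt as a
# comment, and the tool suggests an implementation beneath it.
# (Hypothetical example — the names below are invented for this sketch.)

# Prompt: "generate a random integer between low and high, inclusive"
def random_between(low: int, high: int) -> int:
    """Return a uniformly distributed integer in [low, high]."""
    return random.randint(low, high)

# The barrier to entry shifts from writing the code to judging the
# suggestion, e.g. checking that both endpoints can actually occur.
assert all(1 <= random_between(1, 6) <= 6 for _ in range(100))
assert random_between(3, 3) == 3
```

The interesting shift, as noted in the discussion, is that the user no longer needs to know the underlying library call, only how to phrase the request and evaluate the result.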