Notes from September meeting

Discussion of The Unseen Teen:

  • In general, we have been moving in the direction of grey literature, or at least more public-facing literature. The Unseen Teen works in this regard because it comes from Data & Society, essentially a think tank which private companies are willing to work with. The participation of platform company staff was discussed repeatedly, in contrast to academic research, which tends to have a much harder time gaining this kind of access.
  • The handling of the core concepts (digital health and wellbeing for teenagers) was felt to be, at times, a missed opportunity. Some felt that the definitions of these core concepts went effectively untroubled (the concepts were not problematised enough), while others felt the voices of young people were lacking. The lack of a youth voice was seen as somewhat ironic, given the name of the report. Even if youth voices themselves were not included, voices speaking for youth (mental health experts, for example) could have been integrated into the report.
  • Despite this, the foregrounding of the developer voice was still seen as useful. There is a lack of institutional studies of digital platforms, so the data this report was able to generate remains a positive, even if it shifts the focus to how developers see teens rather than how teens see platforms. Ultimately, having developers involved in research is a good thing, even if it is not perfectly executed. It is worth noting that the writing and layout of the report matter as well. The report is written in fairly straightforward language with little in the way of theory, which makes it more immediately accessible to a general audience. Combined with the developer interviews, this means the report can reach a much broader audience than most academic writing can. Being able to bring together developers, a lay audience, and regulators could prove very impactful, even if the content of the report is not quite what we would like.
  • The question then became: what would we do differently, compared with this report? This question was intended to move beyond simply critiquing the report and towards specifying how it could be improved.
  • One suggestion was to examine the concepts of education and rehabilitation, as mentioned in the report. These now often happen within platforms (outside of the formal education system), so outlining what this looks like on a platform would be helpful. Another suggestion was to have a more robust discussion of what is meant by wellbeing (hearing from a range of actors, including teens). If we just work with milquetoast understandings of digital wellbeing (concern about screentime, for example), we are only going to get generic responses from developers. Changing this discourse could thus elicit different outcomes from developers.
  • Somewhat differently, it was suggested that a focus on teen-oriented UX design could be interesting (with more child-centred platforms, like Instagram Kids, currently in development), and with it a focus on how specific platforms operate. In its current form, the report essentially treats all platforms as the same, making it difficult to offer any specific policy or design recommendations. With its recommendations often vague or near-platitudes, the report can generate feelings of ambivalence in readers, as it appears nothing can be done to combat issues on platforms.
  • Related to the idea of teen-oriented UX, there was a discussion of the compartmentalisation of platforms, with different groups of users (such as age ranges) being filtered into different parts of a platform. This was discussed in terms of feasibility for developers but also in terms of governmental regulation: is it easier to regulate a platform with a specifically identified audience, as opposed to one aimed at the ‘average user’? The concept of the average user (and how this idea is leveraged by development companies) was of interest, particularly around the use of strategic ignorance. If platforms are being designed around a policy of strategic ignorance, adhering to the law as much (or as little) as possible, does this shape platform functionalities?
  • How the report dealt with (or did not deal with) the issues driven by the companies’ economic models was raised as well. In particular, business models (and capitalism writ large) were identified as fundamental problems with platforms, but nothing more was really done with this; the topic was dropped shortly after it was brought up. This was apparent when the report discussed wellbeing-oriented design: the report promotes a universal design approach, essentially claiming that everyone would benefit and that companies would make more profit from such design. It simultaneously claims that it is the data-driven, profit-seeking business models which currently prevent a universal design approach. In this way, the report seemed to contain some internal contradictions.
  • It was noted that the report ultimately promotes regulation via civil society pressure, bad PR, and the threat (but not the implementation) of government regulation. Carol Bacchi’s ‘What’s the problem represented to be?’ approach is useful here because it allows us to see how the report conceptualises the issues of wellbeing and health in the first place. For example, the suggestion that developers be made more aware that humans use their platforms implies they are not currently aware of this, which is not true. So examining how the problems are framed (and how they could be reframed) is worth doing.
  • Ultimately, the conversation came back to how we would do things differently. How could we work with these findings (and the data underlying them), and what would we want to see come out of it? Some suggested examples are found below:
    • making it easier to access the tools and interfaces for resolving issues
    • implementing consent frameworks into the service in a dedicated way
    • a committed, internationally implemented mechanism for young people to delete data and metadata at various points during their development
    • separation of platforms between a youth and adult model
    • further, no transmission of content between youth and adult platforms
    • specific recommendations about the terms and conditions and/or privacy policy that are focused on children
    • a tax that is specifically geared towards provisioning youth services 