Transcription, analysis and reflexivity

This section examines methods related to transcription and analysis, as well as the role of reflexivity.

Transcription

Interviews were generally transcribed in their entirety, with the exception of passages containing unrelated or non-prioritized discussions. For the informal conversations, passages related to salient moments were transcribed. As Davies (2007, p. 127) points out, transcribing is an act based on considerations and theoretical assumptions. The selected material was initially transcribed verbatim, although in a way adapted to prevailing spelling norms rather than spoken-language pronunciations. The text was then edited into a "clean verbatim" version, in which unnecessary elements were removed: repeated words, stutters, filler words, and speech sounds such as 'um' and 'er'. Besides the improved readability, my assessment was that this representation better conveyed the responses and reflections of the participants.


Most of the recorded speech was in dialects of Swedish or Norwegian, and was transcribed using the common written standards of those countries. For Norwegian speakers, transcriptions were made into Bokmål – one of the nation's two standard writing systems, the other being Nynorsk – which means that, depending on the speaker's dialect, written words occasionally differ from their spoken counterparts. Selected quotes were then translated into English by me, sometimes with the aid of online translators for isolated phrases. This was particularly the case with idiomatic phrases and expressions that are difficult to translate, as a way of investigating alternatives. The transcribed excerpts – and my translations of these passages – were then sent to the participants when obtaining consent to publish.


Automatic transcription

Automatic transcription was mainly used in conjunction with audio journaling. In these situations, all of the recordings were made via Otter.ai, a smartphone application that instantly transcribes speech to text. An advantage of this way of working was that it provided a quick overview of the recorded data, which also facilitated organizing the recorded material. However, the resulting text mainly functioned as a form of raw material that then needed to be edited manually, since AI-based transcripts generally contain a high number of errors that need to be corrected for accuracy, context and readability (McMullin, 2023). This was especially the case with composer names and technical terms related to music making. Also, as McMullin (2023) points out, there are several subjective decisions that AI is simply unable to make, such as where to place punctuation, which words to include or exclude (such as filler words and hesitations), and how to indicate interruptions and nonverbal cues.


In the third (and final) year of the project, I also used automatic transcription services for some conversations, including the public discussions. This had the obvious advantage of being time-efficient: the auto-generated text provided an overview of the conversations and served as a starting point for manually editing selected passages.


It should be pointed out, however, that automatic transcription can be problematic, since it often relies on cloud-based services; this poses certain risks from a security perspective, making such services less suitable for processing sensitive data. This includes certain categories of personal data, related to topics such as health, religion, race or political opinion. This, however, was not a concern with my audio journals, as they typically revolved around music compositions, approaches to working with the material, and reflections on the processes. The conversations – and excerpts of conversations – were also assessed to be safe in this respect, and were transcribed using a system called Speech to text, an NTNU-developed service for automatic transcription that has undergone a risk assessment and a data protection impact assessment (DPIA).

Reflexive thematic analysis

To analyze the qualitative data, I used reflexive thematic analysis, a method developed by Virginia Braun and Victoria Clarke (2022) in which patterns and themes are generated from the data through a systematic – yet creative – process. One of the core assumptions made by Braun & Clarke is that researcher subjectivity is an asset rather than a problem, since ”knowledge generation is inherently subjective and situated” (p. 8). The role of the researcher is also emphasized in the recognition that themes do not passively ’emerge’ from the data but are actively produced by the researcher through systematic engagement with the dataset. This calls for reflexivity on the part of the researcher, i.e. an ongoing, critical reflection on one’s role in the production of knowledge.


The text documents generated from written journals, fieldnotes and transcriptions – from audio journals, conversations, and interviews – were all gathered in NVivo, a software application for organizing and analyzing qualitative data. The analysis then proceeded through the following stages, based on Braun & Clarke (2022):

  1. Building familiarity with the dataset. This involved immersing myself in the data: reading, re-reading, listening to recorded material, and making preliminary notes. 
  2. Coding. This involved going through the data and assigning labels (codes) to snippets of text – e.g. respect or chord symbols & limitations – which were then organized into categories.
  3. Generating initial themes. From the categories, initial themes were generated around what appeared to be the salient concepts.
  4. Reviewing themes. This stage involved re-arranging codes and ’adjusting the contours’ of the themes. This could be done to further differentiate the themes from each other, but it could also involve merging themes that were based on the same concept. One part of this was to ’listen’ attentively to the data, which also involved a great deal of listening in a literal sense, since so much of the material was audio-based. At several points, I returned to recordings of dialogues, as well as musical recordings, to make sure that my analysis of the material actually reflected our musical processes.
  5. Defining and naming themes. This stage involved defining what core qualities the themes represented, and making sure that these qualities were conveyed through their names.
  6. Writing up. This involved weaving together the analytic narrative with data extracts, as well as balancing researcher and participant perspectives.


Building themes for RQ1 and RQ2 was largely a deductive process, in that categories and themes were clear from the beginning – e.g. categories related to musical media and musical elements. In contrast, themes related to RQ3 were mostly created inductively, from the data, since there was no given literature on how to conceptualize such findings. Similar to the musical workflow, developing themes was very much an iterative and dynamic process, involving a lot of ’back-and-forth’, particularly between steps 4 and 5. An important aspect was trying to achieve critical distance: letting the data rest for periods of time, and then returning with a more open view on how it could be interpreted.

Reflexivity

Reflexivity is about maintaining a continuous inner dialogue and critical self-evaluation of one's positionality as a researcher, in order to see how one's assumptions – epistemological, theoretical, political, etc. – and one's power relations to the participants, as well as the structures of which one is a part, potentially affect the research process (Berger, 2013). For me, it was to a large degree about accessing 'blind spots' that basic assumptions, preconceptions and other elements along my horizon of understanding prevented me from seeing. For instance, rather than concluding that a certain theme had emerged through discussions based on the interests and reasoning of the participants, I kept in mind that it might have been based on a topic that I brought into the dialogue – implicitly or explicitly – as a result of my particular interest in the field, my personal preferences, and my focus at the time.


As Finlay (2002) describes it, there is value in viewing reflexivity as a process that is very much alive, one in which we continuously evaluate subjective responses, the dynamics of intersubjectivity, and the research process itself. For Finlay, it is about shifting from seeing data collection as something objective – something that can be achieved by examining 'what I know and how I know it' in a detached way – to understanding and recognizing how we actively participate in constructing our knowledge. Following this view, a recurrent part of the research process was to revisit interpretations I had previously made, as I often came across new perspectives on how things could be understood.