Been thinking a lot about methods recently, and coming to some interesting points in the PhD in that respect, but that’s for a few other posts. My thought today was a reflection on the times that I’ve had to use NVIVO as part of the process of research.
I’ve used the software on a couple of large-scale qualitative projects: one in which I was a Research Assistant looking at the health literacy of older people with either diabetes or depression, the other looking at Eye Clinic Liaison Officers, who connect eye clinics to other services and help with processes such as Certificates of Vision Impairment. Both projects had very large amounts of interview ‘data’ (easily over 100 interviews each), plus other types of data, and both used framework analysis as a way of systematically coding it. Generally, my experience is that NVIVO does help for logistical reasons with such large amounts of ‘data’, and, especially if you’re involved in collecting that data, is a handy way of organising things. Framework analysis, which I would class as a kind of ‘thematic analysis ++’ approach, is also a very systematic way of getting through such projects.
However, on both these projects (and even smaller ones), though it was undoubtedly useful to be able to compare and cross-reference texts and other data, I always got to a point as a researcher where I seriously questioned what I was doing.
This is because it always felt that, at some point, the software begins to dictate, at some level, the analytical task we have as researchers. Many times you feel that you’re ‘feeding the machine’, or becoming an ‘nvivo monkey’, which is how I felt at certain points in each project after days of obsessive coding. I’m not saying that there isn’t a real use in being able to organise and look at data this way; it’s just that if you don’t also look at it in other ways, maybe you’re missing something. Instead of having a task-driven, spreadsheet-like technology dominating what you do at certain points in a project, you need to grasp the data in other ways. It might even be a generational thing, but I really want to read those accounts on paper, which I can mark and change and spill coffee over without having to worry about software back-ups or compatibility issues (both of these have, at other points, seriously hampered progress and meant nightmares when using NVIVO too).
I believe that, at least from my experience, there is a real danger in having everything manipulated and ‘rendered’ through the use of digital technology – that’s what we do to the data if we only see it through NVIVO, or even only read it exclusively on a screen. Can’t back this up – would love to hear of any research about this – but I know that a lot of my better analytical insights come from scrawling and marking bits of paper, rather than always looking at a screen.
NVIVO, to me, is a bit like going to IKEA: a great idea, and you can’t help but be impressed by the sheer organisation and anticipation of your needs, things you hadn’t dreamed of. But after a while you feel a bit dizzy and suffer ‘IKEA-blindness’, a gnawing feeling that you need to get out while you’re absentmindedly filling your trolley with things you don’t need – a bit like coding the fiftieth interview with a theme that you think is important, but may not be, because you’re unable to look at the bigger task of extending ideas and thinking through implications. You have to ‘get one more example’, feed the machine, keep going, instead of attending to the entirety of the work in hand.
So this is my current book pile, sort of at the start of things: some oldies (The Craft of Research and Neuro in particular), some library books, and some new possibilities at this point. Dementia, biopolitics, Foucault, methods, all jumbled up.