The Roman Stoic poet Persius, writing around 58 CE, advised his readers to “meet the malady on its way.” We have various modern renderings of the same idea: ‘nip it in the bud’, ‘a stitch in time saves nine’, and ‘an ounce of prevention is worth a pound of cure.’
It’s similar to the rationale we use for up-front research in a human-centered design context: problems are more efficiently solved if understood correctly and addressed early.
“Be prepared,” the Scouting motto, takes a more adventurous and aspirational angle: be ready, come what may. In the design research context, this places the emphasis on agile personal skills rather than on the obligations of employers and work environments. This is familiar territory for many researchers, who rarely get to dictate the terms for early or proactive groundwork and who usually need to manage messy, reactive studies as best they can.
Let’s propose an ideal attitude somewhere between these two perspectives. We’ll define ‘preparing well’ as considering things early while remaining flexible to change. We can aim for a big-picture view of our research work, guard against over-thinking and under-resourcing, and hold our opinions about the outcomes lightly. We can embody the values of ‘Semper Gumby’ and adapt to change as we plan our research and approach a state of readiness for analysis itself.
What that preparation and readiness will look like in practice is determined by the data, the methods used to collect it, and the purposes of our respective research efforts. The secret to effectively navigating analysis lies in recognizing the connections between these elements.
The inherent connections between research goals and research outcomes mean that our analysis journey begins the moment we create a plan for our research, not after we’ve finished assembling all our data.
Often, in our attempts to bring rigor and depth to our work, we overly deconstruct our research process into discrete stages and substages of planning, collection, and analysis. This kind of partitioning might align with high-level decisions or help us focus, but it can also deny us the opportunity to adapt mindfully to the inevitable challenges to our research goals.
It also means that we’ll be blind to threats if we fail to keep a clear eye on progress. We need to monitor how our assumptions and hypotheses are shaping up, and how the relevance and quality of our data hold up against our research questions.
This kind of analysis can begin as soon as we have raw information in our hands. We shouldn’t wait until after collection to begin prodding and poking to see what our investigations are about.
The practice of taking small, analytical bites into research as early as reasonably possible is known in formal circles as ‘periodic analysis.’ Most experienced qualitative researchers consider it good practice.
Doing analysis concurrently with data collection can be very rewarding for several reasons.
The biggest advantage of periodic analysis is that it saves us from colliding with a giant wall of research data all at once.
Having familiarized ourselves with the data and drafted some ways of thinking about it, we’ll already be well on our way.
Analysis during collection helps us assess the health of our research. Just as we go to the dentist or mechanic for proactive care, we can incorporate checkpoints that help us assess the quality of our research projects or continuous collection activities. We can do this by jumping into our research data and starting to analyze it in small bites as we continue data collection.
If high-quality data is the ultimate goal of research, how might we evaluate our people, processes, research methods, documentation, and alignment with our organization and target audience?
As we wrestle with what we uncover and learn during research, the interplay between the interpretations in our minds and the evidence in our collected data guides our analysis.
Keeping notes during sessions allows us to capture some of our mind’s fleeting activity. Thoughts fly past our window of focus as we moderate. The moments we spend with participants and interviewees can yield valuable impressions of individuals, an understanding of context, follow-up questions, prompts to sharpen our research focus, and other ideas and feelings that build empathy.
These rich impressions fade quickly and can be difficult to re-conjure if not properly articulated. This is where allowing enough time between sessions is critical. Interviewing four or five participants in a single day can blur our memory. Some solo researchers allow 15 minutes to fill out and tidy up their scribbled field notes while they refocus for the next session. Collaborative teams of researchers might allow 45 minutes for similar reasons, but spend this time collectively discussing and documenting their observations.
Our field notes form the crux of our earliest available data. They capture the highlights and key impressions we consider most compelling, and make an excellent starting point for forming our key learnings.
They become an initial framework for later analysis, ready to be filled in with other evidence from transcriptions and artifacts. Importantly, our notes also reflect our personal interpretations of what we’ve seen, and so keep us accountable as our understanding grows over time.
It’s common for researchers to begin tagging data as they gather notes and conduct post-session reviews. Tags, or shorthand labels, can be jotted in the margins or added as session comments.
Some observers or note-takers use a pre-prepared spreadsheet to record specific or expected responses during sessions. More commonly, tagging is done afterwards by highlighting chunks in a transcription record of the session, or in a specialized research tool that helps to slice and label various forms of qualitative data.
The codes used in our tagging might be known upfront, but we’ll likely find that new tags need to be added as we go, or that our existing tags are skewed by assumptions. The power of tagging as we go lies in indexing the raw data: we make it usable without having to rewrite everything ourselves.
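To make the idea of indexing concrete, here is a minimal sketch in Python (the participant IDs, quotes, and tag names are all hypothetical) of how tagged excerpts can be grouped by code, so that every label traces back to its raw evidence:

```python
from collections import defaultdict

# Hypothetical tagged excerpts: (participant, quote, tags).
# In practice these come from margin notes or a research tool.
excerpts = [
    ("P1", "I gave up and called support instead.", ["frustration", "support"]),
    ("P2", "The search never finds what I type.", ["search", "frustration"]),
    ("P3", "I keep a spreadsheet on the side.", ["workaround"]),
]

# Index the raw data by tag so each code points back to its evidence.
index = defaultdict(list)
for participant, quote, tags in excerpts:
    for tag in tags:
        index[tag].append((participant, quote))

# A quick look at which codes are accumulating evidence.
for tag, quotes in sorted(index.items()):
    print(f"{tag}: {len(quotes)} excerpt(s)")
```

The same structure works whether the tags live in a spreadsheet column or a dedicated tool; the point is that an excerpt is never separated from its label.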
There is no need to wait until data collection is complete before checking how it is shaping up. We can see possible patterns beginning to emerge, and note the effect of particular variables on the themes we’re exploring. If we’re concerned about judgments being made without a full picture, we can assess continually and confirm findings at the end.
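As a rough illustration of watching patterns emerge mid-collection, a running tally of tag counts across sessions (the session names and tags below are hypothetical) can flag recurring themes and one-off outliers worth a closer look:

```python
from collections import Counter

# Hypothetical tags applied after each completed session.
sessions = {
    "session_1": ["onboarding", "frustration", "search"],
    "session_2": ["search", "workaround", "search"],
    "session_3": ["frustration", "search", "onboarding"],
}

# A mid-collection health check: which themes recur across sessions,
# and which appear only once (possible outliers or under-explored areas)?
tally = Counter(tag for tags in sessions.values() for tag in tags)
for tag, count in tally.most_common():
    print(f"{tag}: {count}")
```

A tally like this is only a prompt for judgment, not a finding: it tells us where to look next, while confirmation waits for the full picture.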
The benefits of going back to reconsider previous decisions and iterating on analysis as we gain a better understanding of a topic are well known. Qualitative research, where people are the focus of the study, is especially well suited to a reciprocal process of data collection and data analysis.
This kind of analysis is such a mentally intensive process that we should welcome opportunities that help us spread the load over time. It will give us thinking space for ideas to develop and mature.