20.6 Tools to account for our influence

Learning Objectives

Learners will be able to…

  • Identify key tools for enhancing qualitative rigor at various stages of the research process
  • Begin to critique the quality of existing qualitative studies based on the use of these tools
  • Determine which tools may strengthen the quality of our own qualitative research designs

So I’ve saved the best for last. This is a concrete discussion of tools that you can use to demonstrate qualitative rigor in your study. The previous sections in this chapter suggest topics you need to think about related to rigor, but this one offers strategies to actually accomplish it. Remember, these are also tools you should be looking for as you examine other qualitative research studies. As I previously mentioned, you won’t be looking to use all of these in any one study, but rather determining which tools make the most sense based on your study design.

Some of these tools apply throughout the research process, while others are applied more specifically at one stage of research. For instance, an audit trail is created during your analysis phase, while peer debriefing can take place throughout all stages of your research process. These come to us from the work of Lincoln and Guba (1985).[1] Along with the argument that we need separate criteria for judging the quality of research conducted from the interpretivist paradigm (as opposed to the positivist criteria of reliability and validity), they also proposed a compendium of tools to help meet these criteria. We will review each of these tools, with an example provided after each description.

Observer triangulation

Observer triangulation involves including more than one member of your research team in analyzing the data. Essentially, you will have at least two sets of eyes looking at the data, drawing findings from it, and then comparing those findings, converging on agreement about what the final results should be. This helps to ensure that we aren’t just seeing what we want to see.

Example. You and another member of your research team both review and code the same qualitative data. You meet regularly to compare your coding and the themes that are emerging. You discuss differences of opinion and agree on a strategy for resolving these.

Data triangulation

Data triangulation is a strategy that you build into your research design where you include data from multiple sources to help enhance your understanding of a topic. This might mean that you include a variety of groups of people to represent different perspectives on the issue. This can also mean that you collect different types of data. The main idea here is that by incorporating different sources of data (people or types), you are seeking to get a more well-rounded or comprehensive understanding of the focus of your study.

Example.

People: Instead of just interviewing mental health consumers about their treatment, you also include family members and providers.

Types: I have conducted a case study where we included interviews and the analysis of multiple documents, such as emails, agendas, and meeting minutes.

Peer debriefing

Peer debriefing means that you intentionally plan for and meet with a qualitative researcher outside of your team to discuss your process and findings and to help examine the decisions you are making, the logic behind them, and your potential influence and accountability in the research process. You will often meet with a peer debriefer multiple times during your research process and may do things like: review your reflexive journal; review certain aspects of your project, such as preliminary findings; discuss current decisions you are considering; and review the current status of your project. The main focus here is building in some objectivity to what can become a very subjective process. We can easily become very involved in this research and it can be hard for us to step back and thoughtfully examine the decisions we are making.

Example. You ask one of your social work faculty members not connected to your research project to act as your peer debriefer. You meet every two weeks to discuss the progress of your research and review both your methodological and reflexive research journals, including the decisions you have been making about how to proceed. This includes reviewing your findings and how you are arriving at them. You list this person in your IRB protocol as part of your research team.

Member-checking

Member-checking has to do with incorporating research participants into the data analysis process. This may mean actively including them throughout the analysis, either as co-researchers or as consultants. It can also mean that once you have the findings from your analysis, you take these to your participants (or a subset of your participants) and ask them to review the findings and provide feedback about their accuracy. When I member-check, I will often ask participants, “Can you hear your voice in these findings? Do you recognize what you shared with me in these results?” We often need to preface member-checking by explaining that we are bringing together many people’s ideas, so we are trying to represent multiple perspectives, but we want to make sure that their perspective is included. This can be a very important step in ensuring that we did a reasonable job getting from our raw data to our findings; in other words, did we get it right? It also gives some power back to participants, as we are giving them some say in what our findings look like.

 

Example. After conducting a focus group and analyzing the data, you contact the participants by email, sending them a copy of the preliminary findings. You ask them to give you feedback about how accurately these findings represent the ideas that were presented in the focus group and whether they have anything they want to add or challenge. You built this member-checking step into your informed consent process so that participants knew from the beginning they would be asked to do this.

Thick description

Providing a thick description means that you are giving your audience a rich, detailed description of your findings and the context in which they exist. As you read a thick description, you walk away feeling like you have a very vivid picture of what the research participants felt, thought, or experienced, and that you now have a more complete understanding of the topic being studied. Of course, a thick description can’t just be made up at the end. You can’t hope to produce one if you haven’t done the work early on to collect detailed data and perform a thorough analysis. Our main objective with a thick description is being accountable to our audience by helping them understand what we learned in the most comprehensive way possible.

Example. As you are writing up your findings, you continue to refine them, adding more details about the context of your study, such as the social and political climate of the places where you are collecting data and the circumstances that led to the study itself. You also offer clear explanations of the themes and how they relate to each other (and how they differ). Your explanation is well-supported by quotes that reflect these ideas and how your interpretation of the data was shaped along the way.

Reflexivity

Reflexivity pertains to how we understand and account for our influence, as researchers, on the research process. In social work practice, we talk extensively about our “use of self” as social workers, meaning that we work to understand how our unique personhood (who we are) impacts or influences how we work with our clients. Reflexivity is about applying this to the process of research, rather than practice. It assumes that our values, beliefs, understandings, and experiences all may influence the decisions that we make as we engage in research. By engaging in qualitative research with reflexivity, we are attempting to be transparent about how we are shaping and being shaped by the research we are conducting.

Example. Reflexivity is often captured in a reflexive journal kept throughout the research process, as we have been practicing. You may also choose to include elements of reflexivity in the presentation of your findings. Sometimes qualitative researchers do this by providing a positioning statement. A positioning statement is a description that incorporates reflexive information about how the researcher is personally and professionally connected to the specific research project; it may accompany a research report, manuscript, or presentation, and it links who the researcher is to how the research was conducted.

Prolonged engagement

Prolonged engagement means that we spend extensive time with participants or in the community we are studying, visiting on multiple occasions during the study in an attempt to get the most complete picture or understanding possible. This can be very important as we attempt to analyze and interpret our data. If we haven’t spent enough time getting to know our participants and their community, we may miss the meaning of the data shared with us because we don’t understand the cultural subtext in which that data exists. The main idea here is that we don’t know what we don’t know; furthermore, we can’t know it unless we invest time getting to know it! There’s no shortcut here; you have to put in the time.

Example. For your research project, you spend time getting to know the location where you will be collecting data and the people who frequent it. Once you start data collection, you also spend considerable time in this same location, making detailed observations and really trying to get to know the ‘culture’ of the place. You conduct interviews while you are there, but you also spend about an hour before and after each interview getting to know the area and the people better. You also attend some of the community functions that take place there and even volunteer at a couple of events a few months into the study.

Audit trail

Creating an audit trail is something we do during our data analysis process as qualitative researchers. An audit trail is essentially a map of how you got from your raw data to your research findings. This means that someone should be able to work backwards, starting with the research findings, and trace them back to the raw data. It starts with labeling our data as we begin to break it apart (deconstruction) and then reassemble it (reconstruction). It allows us to determine where ideas came from and how and why we put ideas together to form broader themes. An audit trail offers transparency in our data analysis process. It is the opposite of the “black box” we spoke about in our qualitative analysis chapter, making it clear how we got from point A to point B.

 

Example. During your research study, you engage in detailed note-taking, including a research methods journal where you document your decision-making throughout the research process and memoing as you are analyzing your data. You label your data units clearly so you can trace each individual unit back to its original source. As you move your data units around and determine which ideas belong together and what they mean, you record each of these decisions and how you arrived at them.

External audit

An external audit is when we bring in a qualitative researcher not connected to our project, once the study has been completed, to examine the research project and the findings and to “evaluate the accuracy and evaluate whether or not the findings, interpretations and conclusions are supported by the data” (Robert Wood Johnson Foundation, External Audits). An external auditor will likely look at all of our research materials, but will make extensive use of our audit trail to ensure that an outside observer can establish a clear link between our findings and the raw data we collected. Much like a peer debriefer, an external auditor can offer an outside critique of the study, thereby helping us to reflect on the work we are doing and how we are going about it.

Example. You hire an external qualitative research auditor, someone not connected to your research project but who you know is knowledgeable about qualitative research, to conduct an external audit. You share your data, findings, and audit trail materials with them. They review these materials and, a month later, produce a report documenting their conclusions about the trustworthiness of the connection between your findings and the original data you collected.

Negative case analysis

Negative case analysis involves including data that contrasts with, contradicts, or challenges the majority of evidence that we have found or expect to find. This may come into play in our sampling, meaning that we may seek to recruit or include a specific participant or group of participants because they represent a divergent opinion. Or, as we begin our analysis, we may identify a unique or contrasting idea or opinion that seems to contradict what the majority of our other data point to. In this case, we choose to intentionally analyze and work to understand this unique perspective in our data. As with a thick description, a negative case analysis attempts to offer the most comprehensive and complete understanding of the phenomenon we are studying, including divergent or contradictory ideas that may be held about it.

Example. You are conducting a study that analyzes published personal memoirs on a specific topic. After you have gathered your memoirs and begin reviewing them, you realize that one of them goes in a radically different direction than the others. Rather than discarding it, you analyze it as a separate case and gather additional information about this particular memoir, the author, and its circumstances. Because the findings from this memoir differ so much from the main findings drawn from the other data, you write up an additional negative case report to supplement your findings and highlight this unique experience, contrasting it with your other findings.

Now let’s take some time to think through each of the stages of the design process and consider how we might apply some of these strategies. Again, these tools are to help us, as human instruments, better account for our role in the qualitative research process and also to enhance the trustworthiness of our research when we share it with others. It is unrealistic that you would apply all of these, but attention to some will indicate that you have been thoughtful in your design and concerned about the quality of your work and the confidence in your findings.

First let’s discuss sampling. We have already discussed that qualitative research generally relies on non-probability sampling and have reviewed some specific non-probability strategies you might use. However, along with selecting a strategy, you might also include a couple of the rigor-related tools discussed above. First, you might choose to employ data triangulation. For instance, maybe you are conducting an ethnography studying the culture of a peer-support clubhouse. As you are designing your study, along with extensive observations you plan to make in the clubhouse, you are also going to conduct interviews with staff, board members, and focus groups with members. In this way you are combining different types of data (i.e. observations, focus groups, interviews) and perspectives (i.e. yourself as the researcher, members, staff, board). In addition, you might also consider using negative case analysis. At the planning stage, this could involve you intentionally sampling a case or set of cases that are likely to provide an alternative view or perspective compared to what you might expect to find. Finally, specifically articulating your sampling rationale can also enhance the rigor of your research (Barusch, Gringeri, & George, 2011).[2] While this isn’t listed in our tools table, it is generally a good practice when reporting your research (qualitative or quantitative) to outline your sampling strategy with a brief rationale for the choices you made. This helps to improve the transparency of your study.

Next, we can progress to data gathering. The main rigor-related tool that directly applies to this stage of your design is likely prolonged engagement. Here we build in or plan to spend extensive time with participants gathering data. This might mean that we return for repeated interviews with the same participants or that we go back numerous times to make observations and take field notes. While this can take many forms, the overarching idea is that you build in time to immerse yourself in the context and culture that you are studying. Again, there is no shortcut here; it demands time in the field getting to know people, places, significance, history, and so on. You need to appreciate the context and the culture of the situation you are studying. Something special to consider here is insider/outsider status. If you would consider yourself an “outsider”, that is, someone who does not belong to the same group or community of people you are studying, it may be quite obvious that you will need to spend time getting to know this group and take considerable time observing and reflecting on the significance of what you see. However, if you are a researcher who is a member of the particular community you are studying, an “insider”, I would suggest that you still need to work to take a step back, make observations, and reflect on what you see, what you thought you knew, and what you come to know about the community you belong to. In both cases, prolonged engagement requires good self-reflection and observation skills.

A number of these tools may be applied during the data analysis process. First, if you have a research team, you might use observer triangulation, although this might not be an option for you as a student unless you are developing a proposal as a group. As explained above, observer triangulation means that more than one of you will examine the data that has been collected and draw results from it. You will then compare these results and ultimately converge on your findings.

Example. I’m currently using the following strategy on a project where we are analyzing data collected across a number of focus groups. We have a team of four researchers, and our process involves:

  1. reviewing our initial focus group transcripts
  2. individually identifying important categories that were present
  3. collectively processing these together and identifying specific labels we would use for a second round of coding
  4. individually returning to the transcripts with our codes and coding all the transcripts
  5. collectively meeting again to discuss what subthemes fell under each of the codes and if the codes fit or needed to be changed/merged/expanded

While the process was complex, I do believe this triangulation of observers enriched our analysis process. It helped us to gain a clearer understanding of our results as we collectively discussed and debated what each theme meant based on our individual understandings of the data.

While we discussed negative case analysis above in the sampling phase, it is also worth mentioning here. Contradictory findings may crop up during our analysis. One of our participants may share something, or we may find something in a document, that is seemingly at odds with the majority of the rest of our data. Rather than ignoring this, negative case analysis would seek to understand this perspective and what might be behind the contradiction. In addition, we may choose to construct an audit trail as we move from raw data to research findings during our data analysis. This means that we will institute a strategy for tracking our analysis process. I imagine that most researchers develop their own variation on this tracking process, but at its core, you need a way to label your segments of data so that you know where they came from once you start to break them up. Furthermore, you will be making decisions about which groups of data belong together and what they mean. Your tracking process for your audit trail will also have to provide a way to document how you arrived at these decisions. Often, towards the end of an analysis process, researchers may choose to employ member checking (although you may also implement this throughout your analysis). In the focus group project described above, we plan to take our findings back to some of our focus group participants to see if they feel that we captured the important information based on what they shared with us. As discussed in sampling, it is also good practice to articulate your qualitative analysis process clearly. Unfortunately, I have read a number of qualitative studies in which the researchers provide little detail regarding what their analysis looked like and how they arrived at their results. This often leaves me with questions about the quality of what was done.

Now we need to share our research with others. The most relevant tool specific to this phase is providing a thick description of our results. As indicated in the table, a thick description means that we offer our audience a very detailed, rich narrative to help them interpret and make sense of our results. Remember, the main aim of qualitative research is not necessarily to produce results that generalize to a large group of people. Rather, we are seeking to enhance understanding about a particular experience, issue, or phenomenon by studying it very extensively with a relatively small sample. This produces a deep, as opposed to a broad, understanding. A thick description can be very helpful by offering detailed information about the sample, the context in which the study takes place, and a thorough explanation of the findings and often how they relate to each other. As consumers of research, we can use a thick description to make our own judgments about the implications of these results and about what other situations or populations these findings might apply to.

 

You may have noticed that a few of the tools in our table haven’t yet been discussed in relation to the qualitative research process. This is because some of these rigor-related tools are meant to span the entire research process. To begin with, reflexivity is a tool that is best applied throughout qualitative research. I encourage students in my social work practice classes to find ways to build reflexivity into their professional lives as a way of improving their professional skills. This is no less true of qualitative research students. Throughout our research process, we need to consider how our use-of-self is shaping the decisions we are making and how the research may be transforming us during the process. What led you to choose your research question? Why did you group those ideas together? What caused you to label your theme that? What words do you use to talk about your study at a conference? The qualitative researcher has much influence throughout this process, and self-examination of that influence can be an important piece of rigor. As an example, one step that I sometimes build into qualitative projects is reflexively journaling before and after interviews. I’m often driving to these interviews, so I’ll turn my Bluetooth on in the car and capture my thoughts before and after, transcribing them later. This helps me to check in with myself during data collection and can illuminate insights I might otherwise miss. I have also found this to be helpful in my peer debriefing.

Peer debriefing can likewise be used throughout the research process. Meeting with a peer debriefer regularly can be a good way to consistently reflect on your progress and the decisions you are making throughout a project. A peer debriefer can make connections that we may otherwise miss and question aspects of our project that may be important for us to explore. As I mentioned, combining reflexivity with peer debriefing can be a powerful tool for processing our self-reflection in connection with the progress of our project.

Finally, the use of an external audit really doesn’t come into play until the end of the research process, but an external auditor will look extensively at the whole research process. Again, this is a researcher who is unattached to the project and who seeks to follow the path of the project in hopes of providing an external perspective on the trustworthiness of the research process and its findings. Often, these auditors will begin at the end, starting with the findings, and attempt to trace backwards to the beginning of the project. This is often quite a laborious task, and some qualitative scholars debate whether this strategy’s attention to objectivity may be at odds with the aims of qualitative research in illuminating the uniquely subjective experiences of participants through inherently subjective researchers. However, it can be a powerful tool for demonstrating that a systematic approach was used.

As you are thinking about designing your qualitative research proposal, consider how you might use some of these tools to strengthen the quality of your proposed research. Again, you might be using these throughout the entire research process, or applying them more specifically to one stage of the process (e.g. data collection, data analysis). In addition, as you are reviewing qualitative studies to include in your literature review or just in developing your understanding of the topic, make sure to look out for some of these tools being used. They are general indicators that we can use to assess the attention and care that was given to using a scientific approach to producing the knowledge that is being shared.

Key Takeaways

  • As qualitative researchers there are a number of tools at your disposal to help support quality and rigor. These tools can aid you in assessing the quality of others’ work and in supporting the quality of your own design.
  • Qualitative rigor is not a box we can simply tick off somewhere along our research project’s timeline. It is something that needs to be attended to thoughtfully throughout the research process; it is a commitment we make to our participants and to our potential audience.

Exercises

List out 2-3 tools that seem like they would be a good fit for supporting the rigor of your qualitative proposal. Also, provide a justification as to why they seem relevant to the design of your research and what you are trying to accomplish.

  • Tool:
    • Justification:
  • Tool:
    • Justification:
  • Tool:
    • Justification:

  1. Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage Publications.
  2. Barusch, A., Gringeri, C., & George, M. (2011). Rigor in qualitative social work research: A review of strategies used in published articles. Social Work Research, 35(1), 11-19.
