How to conduct qualitative interviews (tips and best practices)

Last updated: 18 May 2023

Reviewed by: Miroslav Damyanov

Conducting qualitative interviews can be challenging, even for seasoned researchers. Poorly conducted interviews can lead to inaccurate or incomplete data, significantly compromising the validity and reliability of your research findings.

When planning to conduct qualitative interviews, you must adequately prepare yourself to get the most out of your data. Fortunately, there are specific tips and best practices that can help you conduct qualitative interviews effectively.

  • What is a qualitative interview?

A qualitative interview is a research technique used to gather in-depth information about people's experiences, attitudes, beliefs, and perceptions. Unlike a structured questionnaire or survey, a qualitative interview is a flexible, conversational approach that allows the interviewer to delve into the interviewee's responses and explore their insights and experiences.

In a qualitative interview, the researcher typically develops a set of open-ended questions that provide a framework for the conversation. However, the interviewer can also adapt to the interviewee's responses and ask follow-up questions to understand their experiences and views better.

  • How to conduct interviews in qualitative research

Conducting interviews involves a well-planned and deliberate process to collect accurate and valid data. 

Here’s a step-by-step guide on how to conduct interviews in qualitative research, broken down into three stages:

1. Before the interview

The first step in conducting a qualitative interview is determining your research question. This will help you identify the type of participants you need to recruit. Once you have your research question, you can start recruiting participants by identifying potential candidates and contacting them to gauge their interest in participating in the study.

After that, it's time to develop your interview questions. These should be open-ended questions that will elicit detailed responses from participants. You'll also need to get consent from the participants, ideally in writing, to ensure that they understand the purpose of the study and their rights as participants. Finally, choose a comfortable and private location to conduct the interview and prepare the interview guide.

2. During the interview

Start by introducing yourself and explaining the purpose of the study. Establish a rapport by putting the participants at ease and making them feel comfortable. Use the interview guide to ask the questions, but be flexible and ask follow-up questions to gain more insight into the participants' responses. 

Take notes during the interview, and ask permission to record the interview for transcription purposes. Be mindful of the time, and cover all the questions in the interview guide.

3. After the interview

Once the interview is over, transcribe the interview if you recorded it. If you took notes, review and organize them to make sure you capture all the important information. Then, analyze the data you collected by identifying common themes and patterns. Use the findings to answer your research question. 

Finally, debrief with the participants to thank them for their time, provide feedback on the study, and answer any questions they may have.

  • What kinds of questions should you ask in a qualitative interview?

Qualitative interviews involve asking questions that encourage participants to share their experiences, opinions, and perspectives on a particular topic. These questions are designed to elicit detailed and nuanced responses rather than simple yes or no answers.

Effective questions in a qualitative interview are generally open-ended and non-leading. They avoid presuppositions or assumptions about the participant's experience and allow them to share their views in their own words. 

In customer research, you might ask questions such as:

What motivated you to choose our product/service over our competitors?

How did you first learn about our product/service?

Can you walk me through your experience with our product/service?

What improvements or changes would you suggest for our product/service?

Have you recommended our product/service to others, and if so, why?

The key is to ask questions that are relevant to the research topic and that allow participants to share their experiences meaningfully and candidly.

  • How to determine the right qualitative interview participants

Choosing the right participants for a qualitative interview is a crucial step in ensuring the success and validity of the research. You need to consider several factors to determine the right participants for a qualitative interview. These may include:

Relevant experiences: Participants should have experiences related to the research topic that can provide valuable insights.

Diversity: Aim to include diverse participants to ensure the study's findings are representative and inclusive.

Access: Identify participants who are accessible and willing to participate in the study.

Informed consent: Participants should be fully informed about the study's purpose, methods, and potential risks and benefits, and be allowed to provide informed consent.

You can use various recruitment methods, such as posting ads in relevant forums, contacting community organizations or social media groups, or using purposive sampling to identify participants who meet specific criteria.

  • How to make qualitative interview subjects comfortable

Making participants comfortable during a qualitative interview is essential to obtain rich, detailed data. Participants are more likely to share their experiences openly when they feel at ease and not judged. 

Here are some ways to make interview subjects comfortable:

Explain the purpose of the study

Start the interview by explaining the research topic and its importance. The goal is to give participants a sense of what to expect.

Create a comfortable environment

Conduct the interview in a quiet, private space where the participant feels comfortable. Turn off any unnecessary electronics that can create distractions. Ensure your equipment works well ahead of time. Arrive at the interview on time. If you conduct a remote interview, turn on your camera and mute all notetakers and observers.

Build rapport

Greet the participant warmly and introduce yourself. Show interest in their responses and thank them for their time.

Use open-ended questions

Ask questions that encourage participants to elaborate on their thoughts and experiences.

Listen attentively

Resist the urge to multitask. Pay attention to the participant's responses, nod your head, or make supportive comments to show you’re interested in their answers. Avoid interrupting them.

Avoid judgment

Show respect and don't judge the participant's views or experiences. Allow the participant to speak freely without feeling judged or ridiculed.

Offer breaks

If needed, offer breaks during the interview, especially if the topic is sensitive or emotional.

Creating a comfortable environment and establishing rapport with the participant fosters an atmosphere of trust and encourages open communication. This helps participants feel at ease and willing to share their experiences.

  • How to analyze a qualitative interview

Analyzing a qualitative interview involves a systematic process of examining the data collected to identify patterns, themes, and meanings that emerge from the responses. 

Here are some steps on how to analyze a qualitative interview:

1. Transcription

The first step is transcribing the interview into text format to have a written record of the conversation. This step is essential to ensure that you can refer back to the interview data and identify the important aspects of the interview.

2. Data reduction

Once you’ve transcribed the interview, read through it to identify key themes, patterns, and phrases emerging from the data. This process involves reducing the data into more manageable pieces you can easily analyze.

3. Coding

The next step is to code the data by labeling sections of the text with descriptive words or phrases that reflect the data's content. Coding helps identify key themes and patterns from the interview data.

4. Categorization

After coding, you should group the codes into categories based on their similarities. This process helps to identify overarching themes or sub-themes that emerge from the data.
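To make the coding and categorization steps concrete, here is a minimal sketch in Python. The coded excerpts, code labels, and category groupings are all invented for illustration; in practice they emerge from careful reading of the transcripts, often with the help of qualitative analysis software:

```python
from collections import Counter, defaultdict

# Hypothetical coded excerpts: (participant, code) pairs produced during coding.
coded_segments = [
    ("P1", "ease of use"), ("P1", "price concerns"),
    ("P2", "ease of use"), ("P2", "support quality"),
    ("P3", "ease of use"), ("P3", "price concerns"),
]

# Hypothetical grouping of codes into broader categories.
code_to_category = {
    "ease of use": "product experience",
    "support quality": "product experience",
    "price concerns": "cost",
}

# Count how often each code appears across all interviews.
code_counts = Counter(code for _, code in coded_segments)

# Group the codes under their categories and total the counts per category.
category_codes = defaultdict(list)
for code, category in code_to_category.items():
    category_codes[category].append(code)

category_counts = {
    category: sum(code_counts[c] for c in codes)
    for category, codes in category_codes.items()
}
print(category_counts)  # {'product experience': 4, 'cost': 2}
```

Totals like these can hint at which themes dominate the data, but the interpretive work of deciding what the codes and categories mean remains a human task.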

5. Interpretation

You should then interpret the themes and sub-themes by identifying relationships, contradictions, and meanings that emerge from the data. Interpretation involves analyzing the themes in the context of the research question.

6. Comparison

The next step is comparing the data across participants or groups to identify similarities and differences. This step helps to ensure that the findings aren’t specific to just one participant but reflect patterns across your sample.

7. Triangulation

To ensure the findings are valid and reliable, you should use triangulation by comparing the findings with other sources, such as observations or documents.

8. Synthesis

The final step is synthesizing the findings by summarizing the key themes and presenting them clearly and concisely. This step involves writing a report that presents the findings in a way that is easy to understand, using quotes and examples from the interview data to illustrate the themes.

  • Tips for transcribing a qualitative interview

Transcribing a qualitative interview is a crucial step in the research process. It involves converting the audio or video recording of the interview into written text. 

Here are some tips for transcribing a qualitative interview:

Use transcription software

Transcription software can save time and increase accuracy by automatically transcribing audio or video recordings.

Listen carefully

When manually transcribing, listen carefully to the recording to ensure clarity. Pause and rewind the recording as necessary.

Use appropriate formatting

Use a consistent format for transcribing, such as marking pauses, overlaps, and interruptions. Indicate non-verbal cues such as laughter, sighs, or changes in tone.

Edit for clarity

Edit the transcription to ensure clarity and readability. Use standard grammar and punctuation, correct misspellings, and remove filler words like "um" and "ah."

Proofread and edit

Verify the accuracy of the transcription by listening to the recording again and reviewing the notes taken during the interview.

Use timestamps

Add timestamps to the transcription to reference specific interview sections.
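As a small illustration of the timestamp tip, the snippet below renders second offsets as HH:MM:SS markers on transcript lines. The segments here are invented for the example; in practice the offsets come from your recording or transcription software:

```python
def format_timestamp(seconds: float) -> str:
    """Render a time offset in seconds as HH:MM:SS."""
    s = int(seconds)
    return f"{s // 3600:02d}:{(s % 3600) // 60:02d}:{s % 60:02d}"

# Hypothetical transcript segments: (offset in seconds, speaker turn).
segments = [
    (0, "Interviewer: Thanks for making time to talk today."),
    (12, "Participant: Happy to be here."),
    (3725, "Participant: That was the turning point for me."),
]

transcript = "\n".join(f"[{format_timestamp(t)}] {text}" for t, text in segments)
print(transcript)
# [00:00:00] Interviewer: Thanks for making time to talk today.
# [00:00:12] Participant: Happy to be here.
# [01:02:05] Participant: That was the turning point for me.
```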

Transcribing a qualitative interview can be time-consuming, but it’s essential for ensuring the accuracy of the data collected. Following these tips will help you produce high-quality transcriptions that are useful for analysis and reporting.

  • Why are interview techniques in qualitative research effective?

Unlike quantitative research methods, which rely on numerical data, qualitative research seeks to understand the richness and complexity of human experiences and perspectives. 

Interview techniques involve asking open-ended questions that allow participants to express their views and share their stories in their own words. This approach can help researchers to uncover unexpected or surprising insights that may not have been discovered through other research methods.

Interview techniques also allow researchers to establish rapport with participants, creating a comfortable and safe space for them to share their experiences. This can lead to a deeper level of trust and candor, leading to more honest and authentic responses.

  • What are the weaknesses of qualitative interviews?

Qualitative interviews are an excellent research approach when used properly, but they have their drawbacks. 

The weaknesses of qualitative interviews include the following:

Subjectivity and personal biases

Qualitative interviews rely on the researcher's interpretation of the interviewee's responses. The researcher's biases or preconceptions can affect how the questions are framed and how the responses are interpreted, which can influence results.

Small sample size

The sample size in qualitative interviews is often small, which can limit the generalizability of the results to the larger population.

Data quality

The quality of data collected during interviews can be affected by various factors, such as the interviewee's mood, the setting of the interview, and the interviewer's skills and experience.

Socially desirable responses

Interviewees may provide responses that they believe are socially acceptable rather than truthful or genuine.

Cost

Conducting qualitative interviews can be expensive, especially if the researcher must travel to different locations to conduct the interviews.

Time-consuming

The data analysis process can be time-consuming and labor-intensive, as researchers need to transcribe and analyze the data manually.

Despite these weaknesses, qualitative interviews remain a valuable research tool. You can take steps to mitigate the impact of these weaknesses by incorporating the perspectives of other researchers or participants in the analysis process, using multiple data sources, and critically analyzing your biases and assumptions.

Mastering the art of qualitative interviews is an essential skill for businesses looking to gain deep insights into their customers' needs, preferences, and behaviors. By following the tips and best practices outlined in this article, you can conduct interviews that provide you with rich data that you can use to make informed decisions about your products, services, and marketing strategies.

Remember that effective communication, active listening, and proper analysis are critical components of successful qualitative interviews. By incorporating these practices into your customer research, you can gain a competitive edge and build stronger customer relationships.

How to carry out great interviews in qualitative research

11 min read

An interview is one of the most versatile methods used in qualitative research. Here’s what you need to know about conducting great qualitative interviews.

What is a qualitative research interview?

Qualitative research interviews are a mainstay among qualitative research techniques, and have been in use for decades either as a primary data collection method or as an adjunct to a wider research process. A qualitative research interview is a one-to-one data collection session between a researcher and a participant. Interviews may be carried out face-to-face, over the phone, or via video call using a service like Skype or Zoom.

There are three main types of qualitative research interview – structured, unstructured or semi-structured.

  • Structured interviews: Structured interviews are based around a schedule of predetermined questions and talking points that the researcher has developed. At their most rigid, structured interviews may have a precise wording and question order, meaning that they can be replicated across many different interviewers and participants with relatively consistent results.
  • Unstructured interviews: Unstructured interviews have no predetermined format, although that doesn’t mean they’re ad hoc or unplanned. An unstructured interview may outwardly resemble a normal conversation, but the interviewer will in fact be working carefully to make sure the right topics are addressed during the interaction while putting the participant at ease with a natural manner.
  • Semi-structured interviews: Semi-structured interviews are the most common type of qualitative research interview, combining the informality and rapport of an unstructured interview with the consistency and replicability of a structured interview. The researcher will come prepared with questions and topics, but will not need to stick to precise wording. This blended approach can work well for in-depth interviews.

What are the pros and cons of interviews in qualitative research?

As a qualitative research method, interviewing is hard to beat, with applications in social research, market research, and even basic and clinical pharmacy. But like any aspect of the research process, it’s not without its limitations. Before choosing qualitative interviewing as your research method, it’s worth weighing up the pros and cons.

Pros of qualitative interviews:

  • provide in-depth information and context
  • can be used effectively when there are low numbers of participants
  • provide an opportunity to discuss and explain questions
  • useful for complex topics
  • rich in data – in the case of in-person or video interviews, the researcher can observe body language and facial expression as well as the answers to questions

Cons of qualitative interviews:

  • can be time-consuming to carry out
  • costly when compared to some other research methods
  • because of time and cost constraints, they often limit you to a small number of participants
  • difficult to standardize your data across different researchers and participants unless the interviews are very tightly structured
  • as the Open University of Hong Kong notes, qualitative interviews may take an emotional toll on interviewers

Qualitative interview guides

Semi-structured interviews are based on a qualitative interview guide, which acts as a road map for the researcher. While conducting interviews, the researcher can use the interview guide to help them stay focused on their research questions and make sure they cover all the topics they intend to.

An interview guide may include a list of questions written out in full, or it may be a set of bullet points grouped around particular topics. It can prompt the interviewer to dig deeper and ask probing questions during the interview if appropriate.

Consider writing out the project’s research question at the top of your interview guide, ahead of the interview questions. This may help you steer the interview in the right direction if it threatens to head off on a tangent.

Avoid bias in qualitative research interviews

According to Duke University, bias can create significant problems in your qualitative interview.

  • Acquiescence bias is common to many qualitative methods, including focus groups. It occurs when the participant feels obliged to say what they think the researcher wants to hear. This can be especially problematic when there is a perceived power imbalance between participant and interviewer. To counteract this, Duke University’s experts recommend emphasizing the participant’s expertise in the subject being discussed, and the value of their contributions.
  • Interviewer bias is when the interviewer’s own feelings about the topic come to light through hand gestures, facial expressions or turns of phrase. Duke’s recommendation is to stick to scripted phrases where this is an issue, and to make sure researchers become very familiar with the interview guide or script before conducting interviews, so that they can hone their delivery.

What kinds of questions should you ask in a qualitative interview?

The interview questions you ask need to be carefully considered both before and during the data collection process. As well as considering the topics you’ll cover, you will need to think carefully about the way you ask questions.

Open-ended interview questions – which cannot be answered with a ‘yes,’ ‘no,’ or ‘maybe’ – are recommended by many researchers as a way to pursue in-depth information.

An example of an open-ended question is “What made you want to move to the East Coast?” This will prompt the participant to consider different factors and select at least one. Having thought about it carefully, they may give you more detailed information about their reasoning.

A closed-ended question, such as “Would you recommend your neighborhood to a friend?” can be answered without too much deliberation, and without giving much information about personal thoughts, opinions, and feelings.

Follow-up questions can be used to delve deeper into the research topic and to get more detail from open-ended questions. Examples of follow-up questions include:

  • What makes you say that?
  • What do you mean by that?
  • Can you tell me more about X?
  • What did/does that mean to you?

As well as avoiding closed-ended questions, be wary of leading questions. As with other qualitative research techniques such as surveys or focus groups, these can introduce bias in your data. Leading questions presume a certain point of view shared by the interviewer and participant, and may even suggest a foregone conclusion.

An example of a leading question might be: “You moved to New York in 1990, didn’t you?” In answering the question, the participant is much more likely to agree than disagree. This may be down to acquiescence bias or a belief that the interviewer has checked the information and already knows the correct answer.

Other leading questions involve adjectival phrases or other wording that introduces negative or positive connotations about a particular topic. An example of this kind of leading question is: “Many employees dislike wearing masks to work. How do you feel about this?” It presumes a positive opinion and the participant may be swayed by it, or not want to contradict the interviewer.

Harvard University’s guidelines for qualitative interview research add that you shouldn’t be afraid to ask embarrassing questions – “if you don’t ask, they won’t tell.” Bear in mind, though, that too much probing around sensitive topics may cause the interview participant to withdraw. The Harvard guidelines recommend leaving sensitive questions until the later stages of the interview, when a rapport has been established.

More tips for conducting qualitative interviews

Observing a participant’s body language can give you important data about their thoughts and feelings. It can also help you decide when to broach a topic, and whether to use a follow-up question or return to the subject later in the interview.

Be conscious that the participant may regard you as the expert, not themselves. To make sure they express their opinions openly, use active listening skills like verbal encouragement, paraphrasing, and clarifying their meaning to show how much you value what they are saying.

Remember that part of the goal is to leave the interview participant feeling good about volunteering their time and their thought process to your research. Aim to make them feel empowered, respected, and heard.

Unstructured interviews can demand a lot of a researcher, both cognitively and emotionally. Be sure to leave time in between in-depth interviews when scheduling your data collection to make sure you maintain the quality of your data, as well as your own well-being.

Recording and transcribing interviews

Historically, recording qualitative research interviews and then transcribing the conversation manually would have represented a significant part of the cost and time involved in research projects that collect qualitative data.

Fortunately, researchers now have access to digital recording tools, and even speech-to-text technology that can automatically transcribe interview data using AI and machine learning. This type of tool can also be used to capture qualitative data from other qualitative methods (focus groups, etc.), making this kind of social research or market research much less time-consuming.

Data analysis

Qualitative interview data is unstructured, rich in content and difficult to analyze without the appropriate tools. Fortunately, machine learning and AI can once again make things faster and easier when you use qualitative methods like the research interview.

Text analysis tools and natural language processing software can ‘read’ your transcripts and voice data and identify patterns and trends across large volumes of text or speech. They can also perform sentiment analysis (https://www.qualtrics.com/experience-management/research/sentiment-analysis/), which assesses overall trends in opinion and provides an unbiased overall summary of how participants are feeling.
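As a toy illustration of what the simplest lexicon-based sentiment scoring does under the hood, the sketch below sums word-level scores. Real sentiment analysis tools are far more sophisticated, handling negation, context, and sarcasm; the lexicon and quotes here are invented:

```python
# Invented mini-lexicon mapping words to sentiment scores.
lexicon = {"love": 1, "great": 1, "helpful": 1,
           "frustrating": -1, "slow": -1, "confusing": -1}

def sentiment_score(text: str) -> int:
    """Sum the lexicon scores of the words in a quote (0 for unknown words)."""
    return sum(lexicon.get(w.strip(".,!?").lower(), 0) for w in text.split())

quotes = [
    "I love how helpful the onboarding was.",
    "The dashboard is slow and confusing.",
]
scores = [sentiment_score(q) for q in quotes]
print(scores)  # [2, -2]
```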

Another feature of text analysis tools is their ability to categorize information by topic, sorting it into groupings that help you organize your data according to the topic discussed.
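A keyword-matching toy version of this kind of topic categorization looks something like the sketch below. The topic keyword lists and responses are invented, and production tools rely on much richer NLP models than simple word overlap:

```python
# Invented topic keyword lists.
topics = {
    "pricing": {"price", "cost", "expensive", "subscription"},
    "usability": {"easy", "intuitive", "confusing", "navigate"},
}

def categorize(text: str) -> list[str]:
    """Return every topic whose keyword set overlaps the response's words."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(t for t, keywords in topics.items() if words & keywords)

responses = [
    "The app is easy to navigate once you learn it.",
    "The subscription price feels expensive for what you get.",
]
labels = [categorize(r) for r in responses]
print(labels)  # [['usability'], ['pricing']]
```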

All in all, interviews are a valuable technique for qualitative research in business, yielding rich and detailed unstructured data. Historically, they have only been limited by the human capacity to interpret and communicate results and conclusions, which demands considerable time and skill.

When you combine this data with AI tools that can interpret it quickly and automatically, it becomes easy to analyze and structure, dovetailing perfectly with your other business data. An additional benefit of natural language analysis tools is that they are free of subjective biases, and can replicate the same approach across as much data as you choose. By combining human research skills with machine analysis, qualitative research methods such as interviews are more valuable than ever to your business.

Chapter 11. Interviewing

Introduction

Interviewing people is at the heart of qualitative research. It is not merely a way to collect data but an intrinsically rewarding activity—an interaction between two people that holds the potential for greater understanding and interpersonal development. Unlike many of our daily interactions with others that are fairly shallow and mundane, sitting down with a person for an hour or two and really listening to what they have to say is a profound and deep enterprise, one that can provide not only “data” for you, the interviewer, but also self-understanding and a feeling of being heard for the interviewee. I always approach interviewing with a deep appreciation for the opportunity it gives me to understand how other people experience the world. That said, there is not one kind of interview but many, and some of these are shallower than others. This chapter will provide you with an overview of interview techniques but with a special focus on the in-depth semistructured interview guide approach, which is the approach most widely used in social science research.

An interview can be variously defined as “a conversation with a purpose” (Lune and Berg 2018) and an attempt to understand the world from the point of view of the person being interviewed: “to unfold the meaning of peoples’ experiences, to uncover their lived world prior to scientific explanations” (Kvale 2007). It is a form of active listening in which the interviewer steers the conversation to subjects and topics of interest to their research but also manages to leave enough space for those interviewed to say surprising things. Achieving that balance is a tricky thing, which is why most practitioners believe interviewing is both an art and a science. In my experience as a teacher, there are some students who are “natural” interviewers (often they are introverts), but anyone can learn to conduct interviews, and everyone, even those of us who have been doing this for years, can improve their interviewing skills. This might be a good time to highlight the fact that the interview is a product between interviewer and interviewee and that this product is only as good as the rapport established between the two participants. Active listening is the key to establishing this necessary rapport.

Patton (2002) makes the argument that we use interviews because there are certain things that are not observable. In particular, “we cannot observe feelings, thoughts, and intentions. We cannot observe behaviors that took place at some previous point in time. We cannot observe situations that preclude the presence of an observer. We cannot observe how people have organized the world and the meanings they attach to what goes on in the world. We have to ask people questions about those things” (341).

Types of Interviews

There are several distinct types of interviews. Imagine a continuum (figure 11.1). On one side are unstructured conversations—the kind you have with your friends. No one is in control of those conversations, and what you talk about is often random—whatever pops into your head. There is no secret, underlying purpose to your talking—if anything, the purpose is to talk to and engage with each other, and the words you use and the things you talk about are a little beside the point. An unstructured interview is a little like this informal conversation, except that one of the parties to the conversation (you, the researcher) does have an underlying purpose, and that is to understand the other person. You are not friends speaking for no purpose, but it might feel just as unstructured to the “interviewee” in this scenario. That is one side of the continuum. On the other side are fully structured and standardized survey-type questions asked face-to-face. Here it is very clear who is asking the questions and who is answering them. This doesn’t feel like a conversation at all! A lot of people new to interviewing have this (erroneously!) in mind when they think about interviews as data collection. Somewhere in the middle of these two extreme cases is the “semistructured” interview, in which the researcher uses an “interview guide” to gently move the conversation to certain topics and issues. This is the primary form of interviewing for qualitative social scientists and will be what I refer to as interviewing for the rest of this chapter, unless otherwise specified.

Figure 11.1. The interview continuum: unstructured conversations, semistructured interview, structured interview, survey questions.

Informal (unstructured conversations). This is the most “open-ended” approach to interviewing. It is particularly useful in conjunction with observational methods (see chapters 13 and 14). There are no predetermined questions. Each interview will be different. Imagine you are researching the Oregon Country Fair, an annual event in Veneta, Oregon, that includes live music, artisan craft booths, face painting, and a lot of people walking through forest paths. It’s unlikely that you will be able to get a person to sit down with you and talk intensely about a set of questions for an hour and a half. But you might be able to sidle up to several people and engage with them about their experiences at the fair. You might have a general interest in what attracts people to these events, so you could start a conversation by asking strangers why they are here or why they come back every year. That’s it. Then you have a conversation that may lead you anywhere. Maybe one person tells a long story about how their parents brought them here when they were a kid. A second person talks about how this is better than Burning Man. A third person shares their favorite traveling band. And yet another enthuses about the public library in the woods. During your conversations, you also talk about a lot of other things—the weather, the utilikilts for sale, the fact that a favorite food booth has disappeared. It’s all good. You may not be able to record these conversations. Instead, you might jot down notes on the spot and then, when you have the time, write down as much as you can remember about the conversations in long fieldnotes. Later, you will have to sit down with these fieldnotes and try to make sense of all the information (see chapters 18 and 19).

Interview guide (semistructured interview). This is the primary type employed by social science qualitative researchers. The researcher creates an “interview guide” in advance, which she uses in every interview. In theory, every person interviewed is asked the same questions. In practice, every person interviewed is asked about mostly the same topics but not always the same questions, as the whole point of a “guide” is that it guides the direction of the conversation but does not command it. The guide is typically between five and ten questions or question areas, sometimes with suggested follow-ups or prompts. For example, one question might be “What was it like growing up in Eastern Oregon?” with prompts such as “Did you live in a rural area? What kind of high school did you attend?” to help the conversation develop. These interviews generally take place in a quiet place (not a busy walkway during a festival) and are recorded. The recordings are transcribed, and those transcriptions then become the “data” that is analyzed (see chapters 18 and 19). The conventional length of one of these interviews is between one and two hours, optimally ninety minutes. Less than one hour doesn’t allow for much development of questions and thoughts, and two hours (or more) is a lot of time to ask someone to sit still and answer questions. If you have a lot of ground to cover, and the person is willing, I highly recommend two separate interview sessions, with the second session being slightly shorter than the first (e.g., ninety minutes the first day, sixty minutes the second). There are lots of good reasons for this, but the most compelling one is that this allows you to listen to the first day’s recording, catch anything interesting you might have missed in the moment, and develop follow-up questions that can probe further. It also gives the person being interviewed some time to think about the issues raised in the interview and go a little deeper with their answers.

Standardized questionnaire with open responses (structured interview). This is the type of interview a lot of people have in mind when they hear “interview”: a researcher comes to your door with a clipboard and proceeds to ask you a series of questions. These questions are all the same whoever answers the door; they are “standardized.” Both the wording and the exact order are important, as people’s responses may vary depending on how and when a question is asked. These are qualitative only in that the questions allow for “open-ended responses”: people can say whatever they want rather than select from a predetermined menu of responses. For example, a survey I collaborated on included this open-ended response question: “How does class affect one’s career success in sociology?” Some of the answers were simply one word long (e.g., “debt”), and others were long statements with stories and personal anecdotes. It is possible to be surprised by the responses. Although it’s a stretch to call this kind of questioning a conversation, it does allow the person answering the question some degree of freedom in how they answer.

Survey questionnaire with closed responses (not an interview!). Standardized survey questions with specific answer options (e.g., closed responses) are not really interviews at all, and they do not generate qualitative data. For example, if we included five options for the question “How does class affect one’s career success in sociology?”—(1) debt, (2) social networks, (3) alienation, (4) family doesn’t understand, (5) type of grad program—we leave no room for surprises at all. Instead, we would most likely look at patterns around these responses, thinking quantitatively rather than qualitatively (e.g., using regression analysis techniques, we might find that working-class sociologists were twice as likely to bring up alienation). It can sometimes be confusing for new students because the very same survey can include both closed-ended and open-ended questions. The key is to think about how these will be analyzed and to what level surprises are possible. If your plan is to turn all responses into a number and make predictions about correlations and relationships, you are no longer conducting qualitative research. This is true even if you are conducting this survey face-to-face with a real live human. Closed-response questions are not conversations of any kind, purposeful or not.

In summary, the semistructured interview guide approach is the predominant form of interviewing for social science qualitative researchers because it allows a high degree of freedom of responses from those interviewed (thus allowing for novel discoveries) while still maintaining some connection to a research question area or topic of interest. The rest of the chapter assumes the employment of this form.

Creating an Interview Guide

Your interview guide is the instrument used to bridge your research question(s) and what the people you are interviewing want to tell you. Unlike a standardized questionnaire, the questions actually asked do not need to be exactly what you have written down in your guide. The guide is meant to create space for those you are interviewing to talk about the phenomenon of interest, but sometimes you are not even sure what that phenomenon is until you start asking questions. A priority in creating an interview guide is to ensure it offers space. One of the worst mistakes is to create questions that are so specific that the person answering them will not stray. Relatedly, questions that sound “academic” will shut down a lot of respondents. A good interview guide invites respondents to talk about what is important to them, not feel like they are performing or being evaluated by you.

Good interview questions should not sound like your “research question” at all. For example, let’s say your research question is “How do patriarchal assumptions influence men’s understanding of climate change and responses to climate change?” It would be worse than unhelpful to ask a respondent, “How do your assumptions about the role of men affect your understanding of climate change?” You need to unpack this into manageable nuggets that pull your respondent into the area of interest without leading him anywhere. You could start by asking him what he thinks about climate change in general. Or, even better, whether he has any concerns about heatwaves or increased tornadoes or polar icecaps melting. Once he starts talking about that, you can ask follow-up questions that bring in issues around gendered roles, perhaps asking if he is married (to a woman) and whether his wife shares his thoughts and, if not, how they negotiate that difference. The fact is, you won’t really know the right questions to ask until he starts talking.

There are several distinct types of questions that can be used in your interview guide, either as main questions or as follow-up probes. If you remember that the point is to leave space for the respondent, you will craft a much more effective interview guide! You will also want to think about the place of time in both the questions themselves (past, present, future orientations) and the sequencing of the questions.

Researcher Note

Suggestion: As you read the next three sections (types of questions, temporality, question sequence), have in mind a particular research question, and try to draft questions and sequence them in a way that opens space for a discussion that helps you answer your research question.

Type of Questions

Experience and behavior questions ask about what a respondent does regularly (their behavior) or has done (their experience). These are relatively easy questions for people to answer because they appear more “factual” and less subjective. This makes them good opening questions. For the study on climate change above, you might ask, “Have you ever experienced an unusual weather event? What happened?” Or “You said you work outside? What is a typical summer workday like for you? How do you protect yourself from the heat?”

Opinion and values questions, in contrast, get inside the minds of those you are interviewing. “Do you think climate change is real? Who or what is responsible for it?” are two such questions. Note that you don’t have to literally ask, “What is your opinion of X?” but you can find a way to ask the specific question relevant to the conversation you are having. These questions are a bit trickier to ask because the answers you get may depend in part on how your respondent perceives you and whether they want to please you or not. We’ve talked a fair amount about being reflective. Here is another place where this comes into play. You need to be aware of the effect your presence might have on the answers you are receiving and adjust accordingly. If you are a woman who is perceived as liberal asking a man who identifies as conservative about climate change, there is a lot of subtext that can be going on in the interview. There is no one right way to resolve this, but you must at least be aware of it.

Feeling questions are questions that ask respondents to draw on their emotional responses. It’s pretty common for academic researchers to forget that we have bodies and emotions, but people’s understandings of the world often operate at this affective level, sometimes unconsciously or barely consciously. It is a good idea to include questions that leave space for respondents to remember, imagine, or relive emotional responses to particular phenomena. “What was it like when you heard your cousin’s house burned down in that wildfire?” doesn’t explicitly use any emotion words, but it allows your respondent to remember what was probably a pretty emotional day. And if they respond in an emotionally neutral way, that is pretty interesting data too. Note that asking someone “How do you feel about X?” is not always going to evoke an emotional response, as they might simply turn around and respond with “I think that…” It is better to craft a question that actually pushes the respondent into the affective register. This might be a specific follow-up to an experience and behavior question—for example, “You just told me about your daily routine during the summer heat. Do you worry it is going to get worse?” or “Have you ever been afraid it will be too hot to get your work accomplished?”

Knowledge questions ask respondents what they actually know about something factual. We have to be careful when we ask these types of questions so that respondents do not feel like we are evaluating them (which would shut them down), but, for example, it is helpful to know when you are having a conversation about climate change that your respondent does in fact know that unusual weather events have increased and that these have been attributed to climate change! Asking these questions can set the stage for deeper questions and can ensure that the conversation makes the same kind of sense to both participants. For example, a conversation about political polarization can be put back on track once you realize that the respondent doesn’t really have a clear understanding that there are two parties in the US. Instead of asking a series of questions about Republicans and Democrats, you might shift your questions to talk more generally about political disagreements (e.g., “people against abortion”). And sometimes what you do want to know is the level of knowledge about a particular program or event (e.g., “Are you aware you can discharge your student loans through the Public Service Loan Forgiveness program?”).

Sensory questions call on all senses of the respondent to capture deeper responses. These are particularly helpful in sparking memory. “Think back to your childhood in Eastern Oregon. Describe the smells, the sounds…” Or you could use these questions to help a person access the full experience of a setting they customarily inhabit: “When you walk through the doors to your office building, what do you see? Hear? Smell?” As with feeling questions, these questions often supplement experience and behavior questions. They are another way of allowing your respondent to report fully and deeply rather than remain on the surface.

Creative questions employ illustrative examples, suggested scenarios, or simulations to get respondents to think more deeply about an issue, topic, or experience. There are many options here. In The Trouble with Passion , Erin Cech (2021) provides a scenario in which “Joe” is trying to decide whether to stay at his decent but boring computer job or follow his passion by opening a restaurant. She asks respondents, “What should Joe do?” Their answers illuminate the attraction of “passion” in job selection. In my own work, I have used a news story about an upwardly mobile young man who no longer has time to see his mother and sisters to probe respondents’ feelings about the costs of social mobility. Jessi Streib and Betsy Leondar-Wright have used single-page cartoon “scenes” to elicit evaluations of potential racial discrimination, sexual harassment, and classism. Barbara Sutton (2010) has employed lists of words (“strong,” “mother,” “victim”) on notecards she fans out and asks her female respondents to select and discuss.

Background/Demographic Questions

You most definitely will want to know more about the person you are interviewing in terms of conventional demographic information, such as age, race, gender identity, occupation, and educational attainment. These are not questions that normally open up inquiry. [1] For this reason, my practice has been to include a separate “demographic questionnaire” sheet that I ask each respondent to fill out at the conclusion of the interview. Only include those aspects that are relevant to your study. For example, if you are not exploring religion or religious affiliation, do not include questions about a person’s religion on the demographic sheet. See the example provided at the end of this chapter.

Temporality

Any type of question can have a past, present, or future orientation. For example, if you are asking a behavior question about workplace routine, you might ask the respondent to talk about past work, present work, and ideal (future) work. Similarly, if you want to understand how people cope with natural disasters, you might ask your respondent how they felt then during the wildfire and now in retrospect and whether and to what extent they have concerns for future wildfire disasters. It’s a relatively simple suggestion—don’t forget to ask about past, present, and future—but it can have a big impact on the quality of the responses you receive.

Question Sequence

Having a list of good questions or good question areas is not enough to make a good interview guide. You will want to pay attention to the order in which you ask your questions. Even though any one respondent can derail this order (perhaps by jumping to answer a question you haven’t yet asked), a good advance plan is always helpful. When thinking about sequence, remember that your goal is to get your respondent to open up to you and to say things that might surprise you. To establish rapport, it is best to start with nonthreatening questions. The present is often the safest place to begin, followed by the past (they have to know you a little bit to get there), and lastly, the future (talking about hopes and fears requires the most rapport). To allow for surprises, it is best to move from very general questions to more particular questions only later in the interview. This ensures that respondents have the freedom to bring up the topics that are relevant to them rather than feel constrained to answer you narrowly. For example, refrain from asking about particular emotions until these have come up previously—don’t lead with them. Often, your more particular questions will emerge only during the course of the interview, tailored to what is emerging in conversation.

Once you have a set of questions, read through them aloud and imagine you are being asked the same questions. Does the set of questions have a natural flow? Would you be willing to answer the very first question if a total stranger asked it? Does your sequence establish facts and experiences before moving on to opinions and values? Did you include prefatory statements, transitions, and other announcements where necessary? These can be as simple as “Hey, we talked a lot about your experiences as a barista while in college.… Now I am turning to something completely different: how you managed friendships in college.” That is an abrupt transition, but it has been softened by your acknowledgment of it.

Probes and Flexibility

Once you have the interview guide, you will also want to leave room for probes and follow-up questions. As in the sample probe included here, you can write out the obvious probes and follow-up questions in advance. You might not need them, as your respondent might anticipate them and include full responses to the original question. Or you might need to tailor them to how your respondent answered the question. Some common probes and follow-up questions include asking for more details (When did that happen? Who else was there?), asking for elaboration (Could you say more about that?), asking for clarification (Does that mean what I think it means or something else? I understand what you mean, but someone else reading the transcript might not), and asking for contrast or comparison (How did this experience compare with last year’s event?). “Probing is a skill that comes from knowing what to look for in the interview, listening carefully to what is being said and what is not said, and being sensitive to the feedback needs of the person being interviewed” (Patton 2002:374). It takes work! And energy. I and many other interviewers I know report feeling emotionally and even physically drained after conducting an interview. You are tasked with active listening and rearranging your interview guide as needed on the fly. If you only ask the questions written down in your interview guide with no deviations, you are doing it wrong. [2]

The Final Question

Every interview guide should include a very open-ended final question that allows the respondent to say whatever it is they have been dying to tell you but you’ve forgotten to ask. About half the time they are tired too and will tell you they have nothing else to say. But incredibly, some of the most honest and complete responses take place here, at the end of a long interview. You have to realize that the person being interviewed is often discovering things about themselves as they talk to you and that this process of discovery can lead to new insights for them. Making space at the end is therefore crucial. Be sure you convey that you actually do want them to tell you more, so that the offer of “anything else?” is not read as an empty convention where the polite response is no. Here is where you can pull from that active listening and tailor the final question to the particular person. For example, “I’ve asked you a lot of questions about what it was like to live through that wildfire. I’m wondering if there is anything I’ve forgotten to ask, especially because I haven’t had that experience myself” is a much more inviting final question than “Great. Anything you want to add?” It’s also helpful to convey to the person that you have the time to listen to their full answer, even if the allotted time is nearly at an end. After all, there are no more questions to ask, so the respondent knows exactly how much time is left. Do them the courtesy of listening to them!

Conducting the Interview

Once you have your interview guide, you are on your way to conducting your first interview. I always practice my interview guide with a friend or family member. I do this even when the questions don’t make perfect sense for them, as it still helps me realize which questions make no sense, are poorly worded (too academic), or don’t follow sequentially. I also practice the routine I will use for interviewing, which goes something like this:

  • Introduce myself and reintroduce the study
  • Provide consent form and ask them to sign and retain/return copy
  • Ask if they have any questions about the study before we begin
  • Ask if I can begin recording
  • Ask questions (from interview guide)
  • Turn off the recording device
  • Ask if they are willing to fill out my demographic questionnaire
  • Collect questionnaire and, without looking at the answers, place in same folder as signed consent form
  • Thank them and depart

A note on remote interviewing: Interviews have traditionally been conducted face-to-face in a private or quiet public setting. You don’t want a lot of background noise, as this will make transcriptions difficult. During the recent global pandemic, many interviewers, myself included, learned the benefits of interviewing remotely. Although face-to-face is still preferable for many reasons, Zoom interviewing is not a bad alternative, and it does allow more interviews across great distances. Zoom also includes automatic transcription, which significantly cuts down on the time it normally takes to convert our conversations into “data” to be analyzed. These automatic transcriptions are not perfect, however, and you will still need to listen to the recording and clarify and clean up the transcription. Nor do automatic transcriptions include notations of body language or change of tone, which you may want to include. When interviewing remotely, you will want to collect the consent form before you meet: ask them to read, sign, and return it as an email attachment. I think it is better to ask for the demographic questionnaire after the interview, but because some respondents may never return it then, it is probably best to ask for this at the same time as the consent form, in advance of the interview.
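Part of the transcript cleanup can be scripted. Zoom’s automatic transcript typically downloads as a WebVTT (.vtt) file; the following is a minimal sketch, assuming that standard cue format (the function name and sample dialogue are hypothetical), of collapsing such a file into plain speaker-labeled text that you can then correct against the recording:

```python
import re

def vtt_to_text(vtt: str) -> list[str]:
    """Collapse a WebVTT auto-transcript into plain text lines.

    Keeps the speaker-labeled text and drops the WEBVTT header, cue
    numbers, and timestamp lines. The result is a starting point only:
    you still need to check it against the recording and add notations
    of tone or body language by hand.
    """
    timestamp = re.compile(r"\d{2}:\d{2}:\d{2}\.\d{3} --> ")
    lines = []
    for line in vtt.splitlines():
        line = line.strip()
        if not line or line == "WEBVTT":
            continue
        if line.isdigit() or timestamp.match(line):
            continue  # cue number or timing line
        lines.append(line)
    return lines

# Hypothetical fragment of an auto-generated transcript.
sample = """WEBVTT

1
00:00:01.000 --> 00:00:04.000
Interviewer: What was it like growing up in Eastern Oregon?

2
00:00:04.500 --> 00:00:09.000
Respondent: Well, we lived pretty far out of town.
"""
print(vtt_to_text(sample))
```

Note that exact cue formatting varies by tool and export option, so treat this as a template to adapt rather than a universal parser.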

What should you bring to the interview? I would recommend bringing two copies of the consent form (one for you and one for the respondent), a demographic questionnaire, a manila folder in which to place the signed consent form and filled-out demographic questionnaire, a printed copy of your interview guide (I print with three-inch right margins so I can jot down notes on the page next to relevant questions), a pen, a recording device, and water.

After the interview, you will want to secure the signed consent form in a locked filing cabinet (if in print) or a password-protected folder on your computer. Using Excel or a similar program that allows tables/spreadsheets, create an identifying number for your interview that links to the consent form without using the name of your respondent. For example, let’s say that I conduct interviews with US politicians, and the first person I meet with is George W. Bush. I will assign the transcription the number “INT#001” and add it to the signed consent form. [3] The signed consent form goes into a locked filing cabinet, and I never use the name “George W. Bush” again. I take the information from the demographic sheet, open my Excel spreadsheet, and add the relevant information in separate columns for the row INT#001: White, male, Republican. When I interview Bill Clinton as my second interview, I include a second row: INT#002: White, male, Democrat. And so on. The only link to the actual name of the respondent and this information is the fact that the consent form (unavailable to anyone but me) has stamped on it the interview number.
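The same record-keeping scheme can be done with a small script instead of Excel. Here is a minimal sketch, using hypothetical file and column names, that generates sequential interview IDs and appends demographic rows keyed by ID only, so the respondent’s name never enters the data file:

```python
import csv

def next_interview_id(n: int) -> str:
    """Generate sequential IDs like INT#001, INT#002, ..."""
    return f"INT#{n:03d}"

def add_interview(path: str, interview_id: str, race: str, gender: str, party: str) -> None:
    """Append one row of demographic data keyed by interview ID only.

    The respondent's name never appears here; the only link back to a
    name is the interview ID stamped on the locked-away consent form.
    """
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([interview_id, race, gender, party])

# Usage, with the two example interviews from the text.
with open("demographics.csv", "w", newline="") as f:
    csv.writer(f).writerow(["interview_id", "race", "gender", "party"])

add_interview("demographics.csv", next_interview_id(1), "White", "male", "Republican")
add_interview("demographics.csv", next_interview_id(2), "White", "male", "Democrat")
```

Whatever tool you use, the design principle is the same: the spreadsheet holds only the interview number and demographics, and the consent form linking number to name stays locked away separately.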

Many students get very nervous before their first interview. Actually, many of us are always nervous before an interview! But do not worry—this is normal, and it does pass. Chances are, you will be pleasantly surprised at how comfortable it begins to feel. These “purposeful conversations” are often a delight for both participants. This is not to say that things never go wrong. I often have my students practice several “bad scenarios” (e.g., a respondent you cannot get to open up; a respondent who is too talkative and dominates the conversation, steering it away from the topics you are interested in; emotions that completely take over; or shocking disclosures you are ill-prepared to handle), but most of the time, things go quite well. Be prepared for the unexpected, but know that the reason interviews are so popular as a technique of data collection is that they are usually richly rewarding for both participants.

One thing that I stress to my methods students and remind myself about is that interviews are still conversations between people. If there’s something you might feel uncomfortable asking someone about in a “normal” conversation, you will likely also feel a bit of discomfort asking it in an interview. Maybe more importantly, your respondent may feel uncomfortable. Social research—especially about inequality—can be uncomfortable. And it’s easy to slip into an abstract, intellectualized, or removed perspective as an interviewer. This is one reason trying out interview questions is important. Another is that sometimes the question sounds good in your head but doesn’t work as well out loud in practice. I learned this the hard way when a respondent asked me how I would answer the question I had just posed, and I realized that not only did I not really know how I would answer it, but I also wasn’t quite as sure I knew what I was asking as I had thought.

—Elizabeth M. Lee, Associate Professor of Sociology at Saint Joseph’s University, author of Class and Campus Life , and co-author of Geographies of Campus Inequality

How Many Interviews?

Your research design has included a targeted number of interviews and a recruitment plan (see chapter 5). Follow your plan, but remember that “saturation” is your goal. You interview as many people as you can until you reach a point at which you are no longer surprised by what they tell you. This means not that no one after your first twenty interviews will have surprising, interesting stories to tell you but rather that the picture you are forming about the phenomenon of interest to you from a research perspective has come into focus, and none of the interviews are substantially refocusing that picture. That is when you should stop collecting interviews. Note that to know when you have reached this point, you will need to read your transcripts as you go. More about this in chapters 18 and 19.

Your Final Product: The Ideal Interview Transcript

A good interview transcript will demonstrate a subtly controlled conversation by the skillful interviewer. In general, you want to see replies that are about one paragraph long, not short sentences and not running on for several pages. Although it is sometimes necessary to follow respondents down tangents, it is also often necessary to pull them back to the questions that form the basis of your research study. This is not really a free conversation, although it may feel like that to the person you are interviewing.

Final Tips from an Interview Master

Annette Lareau is arguably one of the masters of the trade. In Listening to People , she provides several guidelines for good interviews and then offers a detailed example of an interview gone wrong and how it could be addressed (please see the “Further Readings” at the end of this chapter). Here is an abbreviated version of her set of guidelines (Lareau 2021:93–103):

  1. Interview respondents who are experts on the subjects of most interest to you (as a corollary, don’t ask people about things they don’t know).
  2. Listen carefully and talk as little as possible.
  3. Keep in mind what you want to know and why you want to know it.
  4. Be a proactive interviewer (subtly guide the conversation).
  5. Assure respondents that there aren’t any right or wrong answers.
  6. Use the respondent’s own words to probe further (this both allows you to accurately identify what you heard and pushes the respondent to explain further).
  7. Reuse effective probes (don’t reinvent the wheel as you go—if repeating the words back works, do it again and again).
  8. Focus on learning the subjective meanings that events or experiences have for a respondent.
  9. Don’t be afraid to ask a question that draws on your own knowledge (unlike trial lawyers, who are trained never to ask a question for which they don’t already know the answer, sometimes it’s worth it to ask risky questions based on your hypotheses or just plain hunches).
  10. Keep thinking while you are listening (so difficult…and important).
  11. Return to a theme raised by a respondent if you want further information.
  12. Be mindful of power inequalities (and never ever coerce a respondent to continue the interview if they want out).
  13. Take control with overly talkative respondents.
  14. Expect overly succinct responses, and develop strategies for probing further.
  15. Balance digging deep and moving on.
  16. Develop a plan to deflect questions (e.g., let them know you are happy to answer any questions at the end of the interview, but you don’t want to take time away from them now).
  17. At the end, check to see whether you have asked all your questions. You don’t always have to ask everyone the same set of questions, but if there is a big area you have forgotten to cover, now is the time to recover.

Sample: Demographic Questionnaire

ASA Taskforce on First-Generation and Working-Class Persons in Sociology – Class Effects on Career Success

Supplementary Demographic Questionnaire

Thank you for your participation in this interview project. We would like to collect a few pieces of key demographic information from you to supplement our analyses. Your answers to these questions will be kept confidential and stored by ID number. All of your responses here are entirely voluntary!

What best captures your race/ethnicity? (please check any/all that apply)

  • White (Non Hispanic/Latina/o/x)
  • Black or African American
  • Hispanic, Latino/a/x, or Spanish origin
  • Asian or Asian American
  • American Indian or Alaska Native
  • Middle Eastern or North African
  • Native Hawaiian or Pacific Islander
  • Other: (Please write in: ________________)

What is your current position?

  • Grad Student
  • Full Professor

Please check any and all of the following that apply to you:

  • I identify as a working-class academic
  • I was the first in my family to graduate from college
  • I grew up poor

What best reflects your gender?

  • Transgender female/Transgender woman
  • Transgender male/Transgender man
  • Gender queer/ Gender nonconforming

Anything else you would like us to know about you?

Example: Interview Guide

In this example, follow-up prompts are italicized. Note the sequence of questions. That second question often elicits an entire life history, answering several later questions in advance.

Introduction Script/Question

Thank you for participating in our survey of ASA members who identify as first-generation or working-class.  As you may have heard, ASA has sponsored a taskforce on first-generation and working-class persons in sociology and we are interested in hearing from those who so identify.  Your participation in this interview will help advance our knowledge in this area.

  • The first thing we would like to ask you is why you have volunteered to be part of this study. What does it mean to you to be first-gen or working class? Why were you willing to be interviewed?
  • How did you decide to become a sociologist?
  • Can you tell me a little bit about where you grew up? (prompts: What did your parent(s) do for a living? What kind of high school did you attend?)
  • Has this identity been salient to your experience? (how? How much?)
  • How welcoming was your grad program? Your first academic employer?
  • Why did you decide to pursue sociology at the graduate level?
  • Did you experience culture shock in college? In graduate school?
  • Has your FGWC status shaped how you’ve thought about where you went to school? debt? etc?
  • Were you mentored? How did this work (not work)?  How might it?
  • What did you consider when deciding where to go to grad school? Where to apply for your first position?
  • What, to you, is a mark of career success? Have you achieved that success?  What has helped or hindered your pursuit of success?
  • Do you think sociology, as a field, cares about prestige?
  • Let’s talk a little bit about intersectionality. How does being first-gen/working class work alongside other identities that are important to you?
  • What do your friends and family think about your career? Have you had any difficulty relating to family members or past friends since becoming highly educated?
  • Do you have any debt from college/grad school? Are you concerned about this?  Could you explain more about how you paid for college/grad school?  (here, include assistance from family, fellowships, scholarships, etc.)
  • (You’ve mentioned issues or obstacles you had because of your background.) What could have helped?  Or, who or what did? Can you think of fortuitous moments in your career?
  • Do you have any regrets about the path you took?
  • Is there anything else you would like to add? Anything that the Taskforce should take note of, that we did not ask you about here?

Further Readings

Britten, Nicky. 1995. “Qualitative Interviews in Medical Research.” BMJ: British Medical Journal 311(6999):251–253. A good basic overview of interviewing, particularly useful for students of public health and medical research generally.

Corbin, Juliet, and Janice M. Morse. 2003. “The Unstructured Interactive Interview: Issues of Reciprocity and Risks When Dealing with Sensitive Topics.” Qualitative Inquiry 9(3):335–354. Weighs the potential benefits and harms of conducting interviews on topics that may cause emotional distress. Argues that the researcher’s skills and code of ethics should ensure that the interviewing process provides more of a benefit to both participant and researcher than a harm to the former.

Gerson, Kathleen, and Sarah Damaske. 2020. The Science and Art of Interviewing . New York: Oxford University Press. A useful guidebook/textbook for both undergraduates and graduate students, written by sociologists.

Kvale, Steinar. 2007. Doing Interviews. London: SAGE. An easy-to-follow guide to conducting and analyzing interviews by a psychologist.

Lamont, Michèle, and Ann Swidler. 2014. “Methodological Pluralism and the Possibilities and Limits of Interviewing.” Qualitative Sociology 37(2):153–171. Written as a response to various debates surrounding the relative value of interview-based studies and ethnographic studies, defending the particular strengths of interviewing. This is a must-read article for anyone seriously engaging in qualitative research!

Pugh, Allison J. 2013. “What Good Are Interviews for Thinking about Culture? Demystifying Interpretive Analysis.” American Journal of Cultural Sociology 1(1):42–68. Another defense of interviewing written against those who champion ethnographic methods as superior, particularly in the area of studying culture. A classic.

Rapley, Timothy John. 2001. “The ‘Artfulness’ of Open-Ended Interviewing: Some Considerations in Analyzing Interviews.” Qualitative Research 1(3):303–323. Argues for the importance of the “local context” of data production (the relationship built between interviewer and interviewee, for example) in properly analyzing interview data.

Weiss, Robert S. 1995. Learning from Strangers: The Art and Method of Qualitative Interview Studies . New York: Simon and Schuster. A classic and well-regarded textbook on interviewing. Because Weiss has extensive experience conducting surveys, he contrasts the qualitative interview with the survey questionnaire well; particularly useful for those trained in the latter.

  • I say “normally” because how people understand their various identities can itself be an expansive topic of inquiry. Here, I am merely talking about collecting otherwise unexamined demographic data, similar to how we ask people to check boxes on surveys.
  • Again, this applies to “semistructured in-depth interviewing.” When conducting standardized questionnaires, you will want to ask each question exactly as written, without deviations!
  • I always include “INT” in the number because I sometimes have other kinds of data with their own numbering: FG#001 would mean the first focus group, for example. I also always include three-digit spaces, as this allows for up to 999 interviews (or, more realistically, allows for me to interview up to one hundred persons without having to reset my numbering system).
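The ID-numbering convention described in that last footnote is simple to automate. Here is a minimal Python sketch (the function name is mine, not from the text) that builds IDs in the INT#001 / FG#001 style:

```python
def record_id(kind: str, n: int) -> str:
    """Build a data-record ID such as 'INT#001' or 'FG#001'.

    Zero-padding to three digits keeps IDs sortable and allows
    up to 999 records per data type without renumbering.
    """
    return f"{kind}#{n:03d}"

# First interview, first focus group, twelfth interview:
print(record_id("INT", 1))   # INT#001
print(record_id("FG", 1))    # FG#001
print(record_id("INT", 12))  # INT#012
```

Any scheme works as long as it is applied consistently; the point is to separate identifying information (names) from the data files themselves.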

A method of data collection in which the researcher asks the participant questions; the answers to these questions are often recorded and transcribed verbatim. There are many different kinds of interviews - see also semistructured interview , structured interview , and unstructured interview .

A document listing key questions and question areas for use during an interview.  It is used most often for semi-structured interviews.  A good interview guide may have no more than ten primary questions for two hours of interviewing, but these ten questions will be supplemented by probes and relevant follow-ups throughout the interview.  Most IRBs require the inclusion of the interview guide in applications for review.  See also interview and  semi-structured interview .

A data-collection method that relies on casual, conversational, and informal interviewing.  Despite its apparent conversational nature, the researcher usually has a set of particular questions or question areas in mind but allows the interview to unfold spontaneously.  This is a common data-collection technique among ethnographers.  Compare to the semi-structured or in-depth interview .

A form of interview that follows a standard guide of questions asked, although the order of the questions may change to match the particular needs of each individual interview subject, and probing “follow-up” questions are often added during the course of the interview.  The semi-structured interview is the primary form of interviewing used by qualitative researchers in the social sciences.  It is sometimes referred to as an “in-depth” interview.  See also interview and  interview guide .

The cluster of data-collection tools and techniques that involve observing interactions between people, the behaviors and practices of individuals (sometimes in contrast to what they say about how they act and behave), and cultures in context.  Observational methods are the key tools employed by ethnographers and Grounded Theory researchers.

Follow-up questions used in a semi-structured interview  to elicit further elaboration.  Suggested prompts can be included in the interview guide  to be used/deployed depending on how the initial question was answered or if the topic of the prompt does not emerge spontaneously.

A form of interview that follows a strict set of questions, asked in a particular order, for all interview subjects.  The questions are also the kind that elicits short answers, and the data is more “informative” than probing.  This is often used in mixed-methods studies, accompanying a survey instrument.  Because there is no room for nuance or the exploration of meaning in structured interviews, qualitative researchers tend to employ semi-structured interviews instead.  See also interview.

The point at which you can conclude data collection because every person you are interviewing, the interaction you are observing, or content you are analyzing merely confirms what you have already noted.  Achieving saturation is often used as the justification for the final sample size.

An interview variant in which a person’s life story is elicited in a narrative form.  Turning points and key themes are established by the researcher and used as data points for further analysis.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

Grad Coach

Qualitative Research 101: Interviewing

5 Common Mistakes To Avoid When Undertaking Interviews

By: David Phair (PhD) and Kerryn Warren (PhD) | March 2022

Undertaking interviews is potentially the most important step in the qualitative research process. If you don’t collect useful, useable data in your interviews, you’ll struggle through the rest of your dissertation or thesis.  Having helped numerous students with their research over the years, we’ve noticed some common interviewing mistakes that first-time researchers make. In this post, we’ll discuss five costly interview-related mistakes and outline useful strategies to avoid making these.

Overview: 5 Interviewing Mistakes

  • Not having a clear interview strategy/plan
  • Not having good interview techniques/skills
  • Not securing a suitable location and equipment
  • Not having a basic risk management plan
  • Not keeping your “golden thread” front of mind

1. Not having a clear interview strategy

The first common mistake that we’ll look at is that of starting the interviewing process without having first come up with a clear interview strategy or plan of action. While it’s natural to be keen to get started engaging with your interviewees, a lack of planning can result in a mess of data and inconsistency between interviews.

There are several design choices to decide on and plan for before you start interviewing anyone. Some of the most important questions you need to ask yourself before conducting interviews include:

  • What are the guiding research aims and research questions of my study?
  • Will I use a structured, semi-structured or unstructured interview approach?
  • How will I record the interviews (audio or video)?
  • Who will be interviewed and by whom ?
  • What ethics and data law considerations do I need to adhere to?
  • How will I analyze my data? 

Let’s take a quick look at some of these.

The core objective of the interviewing process is to generate useful data that will help you address your overall research aims. Therefore, your interviews need to be conducted in a way that directly links to your research aims, objectives and research questions (i.e. your “golden thread”). This means that you need to carefully consider the questions you’ll ask to ensure that they align with and feed into your golden thread. If any question doesn’t align with this, you may want to consider scrapping it.

Another important design choice is whether you’ll use an unstructured, semi-structured or structured interview approach . For semi-structured interviews, you will have a list of questions that you plan to ask and these questions will be open-ended in nature. You’ll also allow the discussion to digress from the core question set if something interesting comes up. This means that the type of information generated might differ a fair amount between interviews.

Contrasted to this, a structured approach to interviews is more rigid: a specific set of closed questions is developed and asked of each interviewee in exactly the same order. Closed questions have a limited set of possible answers, often single-word answers. Therefore, you need to think about what you’re trying to achieve with your research project (i.e., your research aims) and decide which approach would be best suited to your case.

It is also important to plan ahead with regards to who will be interviewed and how. You need to think about how you will approach the possible interviewees to get their cooperation, who will conduct the interviews, when to conduct the interviews and how to record the interviews. For each of these decisions, it’s also essential to make sure that all ethical considerations and data protection laws are taken into account.

Finally, you should think through how you plan to analyze the data (i.e., your qualitative analysis method) generated by the interviews. Different types of analysis rely on different types of data, so you need to ensure you’re asking the right types of questions and correctly guiding your respondents.

Simply put, you need to have a plan of action regarding the specifics of your interview approach before you start collecting data. If not, you’ll end up drifting in your approach from interview to interview, which will result in inconsistent, unusable data.

Your interview questions need to directly link to your research aims, objectives and research questions - your “golden thread”.

2. Not having good interview technique

While you’re generally not expected to be an expert interviewer for a dissertation or thesis, it is important to practice good interview technique and develop basic interviewing skills.

Let’s go through some basics that will help the process along.

Firstly, before the interview, make sure you know your interview questions well and have a clear idea of what you want from the interview. Naturally, the specificity of your questions will depend on whether you’re taking a structured, semi-structured or unstructured approach, but you still need a consistent starting point. Ideally, you should develop an interview guide beforehand (more on this later) that details your core questions and links these to the research aims, objectives and research questions.

Before you undertake any interviews, it’s a good idea to do a few mock interviews with friends or family members. This will help you get comfortable with the interviewer role, prepare for potentially unexpected answers and give you a good idea of how long the interview will take to conduct. In the interviewing process, you’re likely to encounter two kinds of challenging interviewees: the two-word respondent and the respondent who meanders and babbles. Therefore, you should prepare yourself for both and come up with a plan to respond to each in a way that will allow the interview to continue productively.

To begin the formal interview , provide the person you are interviewing with an overview of your research. This will help to calm their nerves (and yours) and contextualize the interaction. Ultimately, you want the interviewee to feel comfortable and be willing to be open and honest with you, so it’s useful to start in a more casual, relaxed fashion and allow them to ask any questions they may have. From there, you can ease them into the rest of the questions.

As the interview progresses, avoid asking leading questions (i.e., questions that assume something about the interviewee or their response). Make sure that you speak clearly and slowly, using plain language and being ready to paraphrase questions if the person you are interviewing misunderstands. Be particularly careful when interviewing second-language English speakers to ensure that you’re both on the same page.

Engage with the interviewee by listening to them carefully and acknowledging that you are listening to them by smiling or nodding. Show them that you’re interested in what they’re saying and thank them for their openness as appropriate. This will also encourage your interviewee to respond openly.


3. Not securing a suitable location and quality equipment

Where you conduct your interviews and the equipment you use to record them both play an important role in how the process unfolds. Therefore, you need to think carefully about each of these variables before you start interviewing.

Poor location: A bad location can result in your interviews being compromised, interrupted, or cancelled. If you are conducting physical interviews, you’ll need a location that is quiet, safe, and welcoming. It’s very important that your location of choice is not prone to interruptions (the workplace office is generally problematic, for example) and has suitable facilities (such as water, a bathroom, and snacks).

If you are conducting online interviews, you need to consider a few other factors. Importantly, you need to make sure that both you and your respondent have access to a good, stable internet connection and electricity. Always check in advance that both of you know how to use the relevant software and that it’s accessible (sometimes meeting platforms are blocked by workplace policies or firewalls). It’s also good to have alternatives in place (such as WhatsApp, Zoom, or Teams) to cater for these types of issues.

Poor equipment: Using poor-quality recording equipment or using equipment incorrectly means that you will have trouble transcribing, coding, and analyzing your interviews. This can be a major issue , as some of your interview data may go completely to waste if not recorded well. So, make sure that you use good-quality recording equipment and that you know how to use it correctly.

To avoid issues, you should always conduct test recordings before every interview to ensure that you can use the relevant equipment properly. It’s also a good idea to spot check each recording afterwards, just to make sure it was recorded as planned. If your equipment uses batteries, be sure to always carry a spare set.

Where you conduct your interviews and the equipment you use to record them play an important role in how the process unfolds.

4. Not having a basic risk management plan

Many possible issues can arise during the interview process. Not planning for these issues can mean that you are left with compromised data that might not be useful to you. Therefore, it’s important to map out some sort of risk management plan ahead of time, considering the potential risks, how you’ll minimize their probability and how you’ll manage them if they materialize.

Common potential issues related to the actual interview include cancellations (people pulling out), delays (such as getting stuck in traffic), language and accent differences (especially over poor internet connections), and problems with internet connections and power supply. Other issues can also occur in the interview itself. For example, the interviewee could drift off-topic, or you might encounter an interviewee who does not say much at all.

You can prepare for these potential issues by considering possible worst-case scenarios and preparing a response for each scenario. For instance, it is important to plan a backup date just in case your interviewee cannot make it to the first meeting you scheduled with them. It’s also a good idea to factor in a 30-minute gap between your interviews for the instances where someone might be late, or an interview runs overtime for other reasons. Make sure that you also plan backup questions that could be used to bring a respondent back on topic if they start rambling, or questions to encourage those who are saying too little.

In general, it’s best practice to plan to conduct more interviews than you think you need (this is called oversampling). Doing so will allow you some room for error if there are interviews that don’t go as planned, or if some interviewees withdraw. If you need 10 interviews, it is a good idea to plan for 15; a few will likely cancel, delay, or not produce useful data.

You should consider all the potential risks, how you’ll reduce their probability and how you'll respond if they do indeed materialize.

5. Not keeping your golden thread front of mind

We touched on this a little earlier, but it is a key point that should be central to your entire research process. You don’t want to end up with pages and pages of data after conducting your interviews and realize that it is not useful to your research aims . Your research aims, objectives and research questions – i.e., your golden thread – should influence every design decision and should guide the interview process at all times. 

A useful way to avoid this mistake is by developing an interview guide before you begin interviewing your respondents. An interview guide is a document that contains all of your questions with notes on how each of the interview questions is linked to the research question(s) of your study. You can also include your research aims and objectives here for a more comprehensive linkage. 

You can easily create an interview guide by drawing up a table with one column containing your core interview questions. Then add another column with your research questions, another with expectations that you may have in light of the relevant literature and another with backup or follow-up questions. As mentioned, you can also bring in your research aims and objectives to help you connect them all together. If you’d like, you can download a copy of our free interview guide here.
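The four-column table described above can be drafted in any spreadsheet tool. As an illustration only, here is a minimal Python sketch that writes such a guide out as a CSV file (the file name and the example row are hypothetical, not from the original guide):

```python
import csv

# The four columns described above: core question, linked research
# question, expectation from the literature, and backup/follow-up questions.
columns = [
    "Core interview question",
    "Linked research question",
    "Expectation (from literature)",
    "Backup / follow-up questions",
]

# One illustrative row; a real guide would have one row per core question.
rows = [
    {
        "Core interview question": "How did you decide on your career path?",
        "Linked research question": "RQ1: What factors shape career choice?",
        "Expectation (from literature)": "Family background is often cited",
        "Backup / follow-up questions": "Can you give an example?",
    },
]

# Write the guide so it can be opened in Excel, Google Sheets, etc.
with open("interview_guide.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
```

Keeping the guide in a plain, versionable format like CSV also makes it easy to preserve each revision alongside your transcripts, as recommended later in this piece.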

Recap: Qualitative Interview Mistakes

In this post, we’ve discussed 5 common costly mistakes that are easy to make in the process of planning and conducting qualitative interviews.

To recap, these include:

  • Not having a clear interview strategy/plan
  • Not having good interview techniques/skills
  • Not securing a suitable location and equipment
  • Not having a basic risk management plan
  • Not keeping your “golden thread” front of mind

If you have any questions about these interviewing mistakes, drop a comment below. Alternatively, if you’re interested in getting 1-on-1 help with your thesis or dissertation , check out our dissertation coaching service or book a free initial consultation with one of our friendly Grad Coaches.


How to Conduct Interviews in Qualitative Research: Interview Guidelines for Qualitative Research


Qualitative research interviews are in-depth interviews. They elicit detailed feedback from your leads and customers. Unstructured interviews reveal why people react in a certain way or make certain decisions. According to The Hartford, qualitative research provides an anecdotal look into your business, and that anecdotal evidence is an important form of data.

Why Your Business Should Use a Qualitative Interview Process

Qualitative research helps business owners:

  • Identify customer needs
  • Clarify marketing messages
  • Generate ideas for improvements of a product
  • Decide to extend a line or brand
  • Gain perspective on how a product fits into a customer’s lifestyle

How Is Conducting Qualitative Research & Quantitative Research Different?

Quantitative research concerns measurable quantities and numbers. It involves close-ended questions. Answer possibilities include yes or no, true or false, or various set choices. Qualitative research is descriptive and concerned with understanding behavior. It invites people to tell their stories in their own words.

Examples of Qualitative Research

Qualitative research helps researchers understand the social reality of individuals, groups and cultures. Qualitative research for businesses involves understanding consumer behavior. It can involve ethnographic techniques, including participant observation and field research. It also includes phenomenology, understanding life experiences using written or recorded narratives. Qualitative research also includes in-depth interviews.

What Is a Qualitative Interview?

A qualitative interview is a more personal form of research compared to questionnaires. The interviewer can probe or ask follow-up research questions of the interview participant. In some cases, subjects may start to interview the interviewer. This fosters deep discussion of the interview topic.

Why Are Interview Techniques in Qualitative Research Effective?

Qualitative research interviews help you explain, understand and explore opinions, behavior and experiences. Qualitative research can provide insights into a phenomenon. Qualitative research discoveries can be further researched and analyzed to influence business decisions.

How Are Interviews in Qualitative Research Formatted?

Qualitative research interviews may take place one-on-one or in focus groups. Learn how to run a successful focus group. Interviews typically last 30 to 90 minutes. The interview can take place in person, over the phone or through video chat. The interviewer collects information about opinions, behavior, attitudes, feelings, preferences and knowledge.

How to Conduct Interviews in Qualitative Research

1. Determine your goal.
2. Target people to interview.
3. Design interview questions.
4. Prep the interview.
5. Conduct the interview.
6. Transcribe and analyze the interview.
7. Optimize and evolve your interview guide.

The First Step in Qualitative Research: Determine Your Goal

Determine what you want to study:

  • A current or potential product, service or brand positioning
  • Strengths and weaknesses in products
  • Purchasing decisions
  • Reactions to advertising or marketing campaigns
  • Usability of a website or other interactive services
  • Perceptions about the company, brand or product
  • Reactions to packaging and design

How Can You Decide a Goal for a Qualitative Interview?

Have your business team ask the following questions: 

  • What information do you want to get?
  • Why do you want to pursue in-depth information about this research topic?
  • Why is a qualitative interview process the best solution for this research?
  • How will you use qualitative data to improve your business? 

How to Determine the Right Interview Participants

When looking for people to talk to for a qualitative interview, consider your goal. If you want to expand a product line, interview existing customers about their needs. If you’re researching marketing, ask new customers how they found your business. Match interview subjects with the goal of the interview.

How to Design Interview Questions for Qualitative Research

When you’re creating an interview guide, it’s a good idea to: 

  • Plan structured interviews with open-ended questions.
  • Avoid leading questions.
  • Create interview questions that are clear and easy to understand.
  • Make research questions focused but flexible.
  • Design questions that align with data collection and data analysis goals.

Tips for Preparing a Qualitative Research Interview

Preparation improves interview effectiveness. Tips to prepare include:

  • Create an interview guide. The guide should include questions, question intent and answer-based paths to take.
  • Choose a setting where the subject feels comfortable.
  • Build rapport with interview participants.
  • Have a reliable way to record the interview.
  • Rehearse the interview first.

Environmental Concerns for Qualitative Interviews

The setting of a qualitative interview also affects the quality of the interview. Consider the needs of the subject. For example, if you’re interviewing a teenager, a formal boardroom may not be the best setting. Some cultures may not value direct eye contact; in such cases, an interview that isn’t face-to-face may be better.

How to Make Qualitative Interview Subjects Comfortable

For long interviews, offer water and breaks to participants. Be polite and respectful when interacting with interview subjects. Let interview participants know the purpose of the research. Explain exactly how you’ll use their answers. Address terms of confidentiality if necessary. Thank participants after the interview and let them know what to expect next.

What Are Interview Techniques in Qualitative Research?

Qualitative research techniques include:

  • Start interviews with “get-to-know-you” questions to put the interview participant at ease.
  • Pay attention.
  • Use active listening techniques.
  • Watch for body language cues.
  • Pivot questions as needed.
  • Acknowledge emotions.
  • Avoid interrogation.
  • When ending interviews, ask subjects if they have anything to add.

What Is Active Listening in Interviews in Qualitative Research?

Active listening techniques include: 

  • Make eye contact.
  • Lean in and use body language to show you’re listening.
  • Don’t get distracted by devices.
  • Use verbal affirmation.
  • Paraphrase answers for reflection.
  • Reference earlier answers.
  • Avoid interrupting.
  • Embrace pauses.
  • Ask for clarification.
  • Pay attention in the moment.

Tips for Transcribing a Qualitative Interview

It’s best to transcribe and analyze a qualitative research interview right away. This helps you optimize future interviews. Transcribe the interview word for word. Note non-verbal interactions in your transcription. Interactions like pauses and laughter can provide deeper insights into responses.

How to Analyze a Qualitative Interview

Analyze your qualitative research data early. That way, you can identify emerging themes to shape future interviews. Consider adding these to each interview report:

  • The goal of the interview
  • Details about the interview participant
  • Questions asked, summarized responses, and key findings
  • Recommendations

Relate the analysis to the goal of the qualitative research interview.
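To show how early analysis can surface emerging themes, here is a minimal sketch with made-up themes, keywords, and transcript snippets. Real qualitative coding is interpretive and cannot be reduced to keyword counts; a tally like this only flags passages worth a closer read:

```python
from collections import Counter

# Hypothetical candidate themes and the keywords that hint at them.
theme_keywords = {
    "workload": ["busy", "overtime", "hours"],
    "support": ["manager", "help", "team"],
}

# Made-up snippets from three interview transcripts.
transcripts = [
    "We were so busy, the overtime never stopped.",
    "My manager would help whenever the team struggled.",
    "Long hours, but the team kept me going.",
]

# Count how many transcripts touch each candidate theme.
counts = Counter()
for text in transcripts:
    lowered = text.lower()
    for theme, words in theme_keywords.items():
        if any(word in lowered for word in words):
            counts[theme] += 1

print(dict(counts))  # prints {'workload': 2, 'support': 2}
```

A theme that keeps recurring across early interviews is a candidate for a dedicated question in the next version of the interview guide.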

Optimize the Interview Guide for Qualitative Research

Each interview can help you improve the efficiency and effectiveness of future ones. Adjust your interview guide based on insights from each previous interview. Keep all versions of your transcriptions and interview guides with notes on them. You can reference these for future qualitative research.

Transcribe Qualitative Research Interviews Promptly

As mentioned, you should transcribe qualitative research interviews as soon as possible. There are several reasons for this.

  • You can gain insights that help you shape your interview guide, such as questions to add or questions to clarify.
  • You may find that your interview participants aren’t appropriate for this type of qualitative research and that more targeted interview subjects would be better.
  • Answers may prompt you to refine the goal of the qualitative research or the data analysis.



Types of Interviews in Research | Guide & Examples

Published on March 10, 2022 by Tegan George. Revised on June 22, 2023.

An interview is a qualitative research method that relies on asking questions in order to collect data . Interviews involve two or more people, one of whom is the interviewer asking the questions.

There are several types of interviews, often differentiated by their level of structure.

  • Structured interviews have predetermined questions asked in a predetermined order.
  • Unstructured interviews are more free-flowing.
  • Semi-structured interviews fall in between.

Interviews are commonly used in market research, social science, and ethnographic research .


Structured interviews have predetermined questions in a set order. They are often closed-ended, featuring dichotomous (yes/no) or multiple-choice questions. While open-ended structured interviews exist, they are much less common. The types of questions asked make structured interviews a predominantly quantitative tool.

Asking set questions in a set order can help you see patterns among responses, and it allows you to easily compare responses between participants while keeping other factors constant. This can mitigate research biases and lead to higher reliability and validity. However, structured interviews can be overly formal, as well as limited in scope and flexibility.

A structured interview is a good choice if:

  • You feel very comfortable with your topic. This will help you formulate your questions most effectively.
  • You have limited time or resources. Structured interviews are more straightforward to analyze because of their closed-ended nature and can be a doable undertaking for an individual.
  • Your research question depends on holding environmental conditions constant between participants.
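Because a structured interview puts the same closed-ended question to every participant, responses can be tallied and compared directly. A minimal sketch, using made-up answers to a hypothetical yes/no question:

```python
from collections import Counter

# Made-up answers to the same closed-ended question, one per participant.
responses = ["yes", "no", "yes", "yes", "no"]

# Because every participant answered the identical question, a simple
# tally gives a directly comparable summary across the sample.
tally = Counter(responses)
print(tally["yes"], tally["no"])  # prints 3 2
```

This directness is exactly what is lost in unstructured formats, where answers to differently phrased questions cannot be counted side by side.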


Semi-structured interviews are a blend of structured and unstructured interviews. While the interviewer has a general plan for what they want to ask, the questions do not have to follow a particular phrasing or order.

Semi-structured interviews are often open-ended, allowing for flexibility, but follow a predetermined thematic framework, giving a sense of order. For this reason, they are often considered “the best of both worlds.”

However, if the questions differ substantially between participants, it can be challenging to look for patterns, lessening the generalizability and validity of your results.

A semi-structured interview is a good choice if:

  • You have prior interview experience. It’s easy to accidentally ask a leading question when coming up with questions on the fly; spontaneous questions are much more difficult than they may seem.
  • Your research question is exploratory in nature. The answers you receive can help guide your future research.

An unstructured interview is the most flexible type of interview. The questions and the order in which they are asked are not set. Instead, the interview can proceed more spontaneously, based on the participant’s previous answers.

Unstructured interviews are by definition open-ended. This flexibility can help you gather detailed information on your topic, while still allowing you to observe patterns between participants.

However, so much flexibility means that they can be very challenging to conduct properly. You must be very careful not to ask leading questions, as biased responses can lead to lower reliability or even invalidate your research.

An unstructured interview is a good choice if:

  • You have a solid background in your research topic and have conducted interviews before.
  • Your research question is exploratory in nature, and you are seeking descriptive data that will deepen and contextualize your initial hypotheses.
  • Your research necessitates forming a deeper connection with your participants, encouraging them to feel comfortable revealing their true opinions and emotions.

A focus group brings together a group of participants to answer questions on a topic of interest in a moderated setting. Focus groups are qualitative in nature and often study the group’s dynamic and body language in addition to their answers. Responses can guide future research on consumer products and services, human behavior, or controversial topics.

Focus groups can provide more nuanced and unfiltered feedback than individual interviews and are easier to organize than experiments or large surveys . However, their small size leads to low external validity and the temptation as a researcher to “cherry-pick” responses that fit your hypotheses.

A focus group is a good choice if:

  • Your research focuses on the dynamics of group discussion or real-time responses to your topic.
  • Your questions are complex and rooted in feelings, opinions, and perceptions that cannot be answered with a “yes” or “no.”
  • Your topic is exploratory in nature, and you are seeking information that will help you uncover new questions or future research ideas.


Depending on the type of interview you are conducting, your questions will differ in style, phrasing, and intention. Structured interview questions are set and precise, while the other types of interviews allow for more open-endedness and flexibility.

Here are some examples.

  • Structured: Do you like dogs? (Yes/No) Do you associate dogs with feeling: happy; somewhat happy; neutral; somewhat unhappy; unhappy?
  • Semi-structured: If yes, name one attribute of dogs that you like. If no, name one attribute of dogs that you don’t like.
  • Unstructured: What feelings do dogs bring out in you?
  • Focus group: When you think more deeply about this, what experiences would you say your feelings are rooted in?

Interviews are a great research tool. They allow you to gather rich information and draw more detailed conclusions than other research methods, taking into consideration nonverbal cues, off-the-cuff reactions, and emotional responses.

However, they can also be time-consuming and deceptively challenging to conduct properly. Smaller sample sizes can cause their validity and reliability to suffer, and there is an inherent risk of interviewer effect arising from accidentally leading questions.

Here are some advantages and disadvantages of each type of interview that can help you decide if you’d like to utilize this research method.

Advantages and disadvantages of interviews

[Table comparing the advantages and disadvantages of structured interviews, semi-structured interviews, unstructured interviews, and focus groups; the cell contents did not survive extraction in this copy.]


The four most common types of interviews are:

  • Structured interviews : The questions are predetermined in both topic and order. 
  • Semi-structured interviews : A few questions are predetermined, but other questions aren’t planned.
  • Unstructured interviews : None of the questions are predetermined.
  • Focus group interviews : The questions are presented to a group instead of one individual.

The interviewer effect is a type of bias that emerges when a characteristic of an interviewer (race, age, gender identity, etc.) influences the responses given by the interviewee.

There is a risk of an interviewer effect in all types of interviews, but it can be mitigated by writing high-quality interview questions.

Social desirability bias is the tendency for interview participants to give responses that will be viewed favorably by the interviewer or other participants. It occurs in all types of interviews and surveys , but is most common in semi-structured interviews , unstructured interviews , and focus groups .

Social desirability bias can be mitigated by ensuring participants feel at ease and comfortable sharing their views. Make sure to pay attention to your own body language and any physical or verbal cues, such as nodding or widening your eyes.

This type of bias can also occur in observations if the participants know they’re being observed. They might alter their behavior accordingly.

A focus group is a research method that brings together a small group of people to answer questions in a moderated setting. The group is chosen due to predefined demographic traits, and the questions are designed to shed light on a topic of interest. It is one of the four types of interviews.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.


Library Support for Qualitative Research (Harvard Library Research Guides)


Resources for Research Interviewing


Textbooks, Guidebooks, and Handbooks  

  • The Ethnographic Interview by James P. Spradley  “Spradley wrote this book for the professional and student who have never done ethnographic fieldwork (p. 231) and for the professional ethnographer who is interested in adapting the author’s procedures (p. iv). Part 1 outlines in 3 chapters Spradley’s version of ethnographic research, and it provides the background for Part 2 which consists of 12 guided steps (chapters) ranging from locating and interviewing an informant to writing an ethnography. Most of the examples come from the author’s own fieldwork among U.S. subcultures . . . Steps 6 and 8 explain lucidly how to construct a domain and a taxonomic analysis” (excerpted from book review by James D. Sexton, 1980).  
  • Fundamentals of Qualitative Research by Johnny Saldana (Series edited by Patricia Leavy)  Provides a soup-to-nuts overview of the qualitative data collection process, including interviewing, participant observation, and other methods.  
  • InterViews by Steinar Kvale  Interviewing is an essential tool in qualitative research and this introduction to interviewing outlines both the theoretical underpinnings and the practical aspects of the process. After examining the role of the interview in the research process, Steinar Kvale considers some of the key philosophical issues relating to interviewing: the interview as conversation, hermeneutics, phenomenology, concerns about ethics as well as validity, and postmodernism. Having established this framework, the author then analyzes the seven stages of the interview process - from designing a study to writing it up.  
  • Practical Evaluation by Michael Quinn Patton  Surveys different interviewing strategies, from, a) informal/conversational, to b) interview guide approach, to c) standardized and open-ended, to d) closed/quantitative. Also discusses strategies for wording questions that are open-ended, clear, sensitive, and neutral, while supporting the speaker. Provides suggestions for probing and maintaining control of the interview process, as well as suggestions for recording and transcription.  
  • The SAGE Handbook of Interview Research by Amir B. Marvasti (Editor); James A. Holstein (Editor); Jaber F. Gubrium (Editor); Karyn D. McKinney (Editor)  The new edition of this landmark volume emphasizes the dynamic, interactional, and reflexive dimensions of the research interview. Contributors highlight the myriad dimensions of complexity that are emerging as researchers increasingly frame the interview as a communicative opportunity as much as a data-gathering format. The book begins with the history and conceptual transformations of the interview, which is followed by chapters that discuss the main components of interview practice. Taken together, the contributions to The SAGE Handbook of Interview Research: The Complexity of the Craft encourage readers simultaneously to learn the frameworks and technologies of interviewing and to reflect on the epistemological foundations of the interview craft.  
  • The SAGE Handbook of Online Research Methods by Nigel G. Fielding, Raymond M. Lee and Grant Blank (Editors) Bringing together the leading names in both qualitative and quantitative online research, this new edition is organised into nine sections: 1. Online Research Methods 2. Designing Online Research 3. Online Data Capture and Data Collection 4. The Online Survey 5. Digital Quantitative Analysis 6. Digital Text Analysis 7. Virtual Ethnography 8. Online Secondary Analysis: Resources and Methods 9. The Future of Online Social Research

ONLINE RESOURCES, COMMUNITIES, AND DATABASES  

  • Interviews as a Method for Qualitative Research (video) This short video summarizes why interviews can serve as useful data in qualitative research.  
  • Companion website to Bloomberg and Volpe's  Completing Your Qualitative Dissertation: A Road Map from Beginning to End,  4th ed Provides helpful templates and appendices featured in the book, as well as links to other useful dissertation resources.
  • International Congress of Qualitative Inquiry Annual conference hosted by the International Center for Qualitative Inquiry at the University of Illinois at Urbana-Champaign, which aims to facilitate the development of qualitative research methods across a wide variety of academic disciplines, among other initiatives.  
  • METHODSPACE: An online home of the research methods community, where practicing researchers share how to make research easier.
  • SAGE Research Methods: Researchers can explore methods concepts to help them design research projects, understand particular methods or identify a new method, conduct their research, and write up their findings. A "methods map" facilitates finding content on methods.

The decision to conduct interviews, and the type of interviewing to use, should flow from, or align with, the methodological paradigm chosen for your study, whether that paradigm is interpretivist, critical, positivist, or participative in nature (or a combination of these).

Structured:

  • Structured Interview. Entry in The SAGE Encyclopedia of Social Science Research Methods by Floyd J. Fowler Jr. (edited by Michael S. Lewis-Beck, Alan E. Bryman, and Tim Futing Liao). A concise article noting standards, procedures, and recommendations for developing and testing structured interviews. For an example of structured interview questions, you may view the Current Population Survey, May 2008: Public Participation in the Arts Supplement (ICPSR 29641), Apr 15, 2011, at https://doi.org/10.3886/ICPSR29641.v1. (To see the survey questions, preview the user guide, which can be found under the "Data and Documentation" tab; then look for page 177, attachment 8.)

Semi-Structured:

  • Semi-Structured Interview. Entry in The SAGE Encyclopedia of Qualitative Research Methods by Lioness Ayres (edited by Lisa M. Given). The semi-structured interview is a qualitative data collection strategy in which the researcher asks informants a series of predetermined but open-ended questions. The researcher has more control over the topics of the interview than in unstructured interviews, but in contrast to structured interviews or questionnaires that use closed questions, there is no fixed range of responses to each question.

Unstructured:

  • Unstructured Interview. Entry in The SAGE Encyclopedia of Qualitative Research Methods by Michael W. Firmin (edited by Lisa M. Given). Unstructured interviews in qualitative research involve asking relatively open-ended questions of research participants in order to discover their percepts on the topic of interest. Interviews, in general, are a foundational means of collecting data when using qualitative research methods. They are designed to draw from the interviewee constructs embedded in his or her thinking and rationale for decision making. The researcher uses an inductive method in data gathering, regardless of whether the interview method is open, structured, or semi-structured. That is, the researcher does not wish to superimpose his or her own viewpoints onto the person being interviewed. Rather, inductively, the researcher wishes to understand the participant's perceptions, helping him or her to articulate percepts such that they will be understood clearly by the journal reader.

Genres and Uses

Focus Groups:

  • "Focus Groups." Annual Review of Sociology 22 (1996): 129-152, by David L. Morgan. Discusses the use of focus groups and group interviews as methods for gathering qualitative data used by sociologists and other academic and applied researchers. Focus groups are recommended for giving voice to marginalized groups and revealing the group effect on opinion formation.
  • Qualitative Research Methods: A Data Collector's Field Guide (See Module 4: "Focus Groups") by Mack, N., et al. This field guide is based on an approach to doing team-based, collaborative qualitative research that has repeatedly proven successful in research projects sponsored by Family Health International (FHI) throughout the developing world. With its straightforward delivery of information on the main qualitative methods being used in public health research today, the guide speaks to the need for simple yet effective instruction on how to do systematic and ethically sound qualitative research. The aim of the guide is thus practical. In bypassing extensive discussion on the theoretical underpinnings of qualitative research, it distinguishes itself as a how-to guide to be used in the field.

In-Depth (typically One-on-One):

  • A Practical Introduction to in-Depth Interviewing by Alan Morris. Are you new to qualitative research or a bit rusty and in need of some inspiration? Are you doing a research project involving in-depth interviews? Are you nervous about carrying out your interviews? This book will help you complete your qualitative research project by providing a nuts and bolts introduction to interviewing. With coverage of ethics, preparation strategies and advice for handling the unexpected in the field, this handy guide will help you get to grips with the basics of interviewing before embarking on your research. While recognising that your research question and the context of your research will drive your approach to interviewing, this book provides practical advice often skipped in traditional methods textbooks.
  • Qualitative Research Methods: A Data Collector's Field Guide (See Module 3: "In-Depth Interviews") by Mack, N., et al. See the description of this field guide under Focus Groups, above.

Folklore Research and Oral Histories:

In addition to the following resource, see the  Oral History   page of this guide for helpful resources on Oral History interviewing.

American Folklife Center at the Library of Congress. Folklife and Fieldwork: A Layman’s Introduction to Field Techniques. Interviews gathered for purposes of folklore research are similar to standard social science interviews in some ways, but also have a good deal in common with oral history approaches to interviewing. The focus in a folklore research interview is on documenting and trying to understand the interviewee's way of life relative to a culture or subculture you are studying. This guide includes helpful advice and tips for conducting fieldwork in folklore, such as tips for planning, conducting, recording, and archiving interviews.

Harvard Program on Survey Research (PSR): an interdisciplinary scientific program within the Institute for Quantitative Social Science which encourages and facilitates research and instruction in the theory and practice of survey research. The primary mission of PSR is to provide survey research resources to enhance the quality of teaching and research at Harvard.

  • Internet, Phone, Mail, and Mixed-Mode Surveys by Don A. Dillman; Jolene D. Smyth; Leah Melani Christian. The classic survey design reference, updated for the digital age. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets, and the use of do-it-yourself surveys, and Dillman's unique Tailored Design Method is also thoroughly explained. This new edition is complemented by copious examples within the text and accompanying website. It includes: Strategies and tactics for determining the needs of a given survey, how to design it, and how to effectively administer it. How and when to use mail, telephone, and Internet surveys to maximum advantage. Proven techniques to increase response rates. Guidance on how to obtain high-quality feedback from mail, electronic, and other self-administered surveys. Direction on how to construct effective questionnaires, including considerations of layout. The effects of sponsorship on the response rates of surveys. Use of capabilities provided by newly mass-used media: interactivity, presentation of aural and visual stimuli. The Fourth Edition reintroduces the telephone--including coordinating land and mobile.

User Experience (UX) and Marketing:

  • See the "UX & Market Research Interviews" tab on this guide, above. This genre may also draw on focus groups, described above.

Screening for Research Site Selection:

  • Research interviews are used not only to furnish research data for theoretical analysis in the social sciences, but also to plan other kinds of studies. For example, interviews may allow researchers to screen appropriate research sites to conduct empirical studies (such as randomized controlled trials) in a variety of fields, from medicine to law. In contrast to interviews conducted in the course of social research, such interviews do not typically serve as the data for final analysis and publication.

ENGAGING PARTICIPANTS

Research Ethics

  • Human Subjects (IRB) The Committee on the Use of Human Subjects (CUHS) serves as the Institutional Review Board for the University area, which includes the Cambridge and Allston campuses at Harvard. Find your IRB contact person, or learn about required ethics training. You may also find the IRB Lifecycle Guide helpful. This is the preferred IRB portal for Harvard graduate students and other researchers. IRB forms can be downloaded via the ESTR Library (click on the "Templates and Forms" tab, then navigate to pages 2 and 3 to find the documents labelled with "HUA" for the Harvard University Area IRB). Nota bene: You may use these forms only if you submit your study to the Harvard University IRB. The IRB office can be reached through email at [email protected] or by telephone at (617) 496-2847.
  • Undergraduate Research Training Program (URTP) Portal The URTP at Harvard University is a comprehensive platform to create better prepared undergraduate researchers. The URTP is comprised of research ethics training sessions, a student-focused curriculum, and an online decision form that will assist students in determining whether their project requires IRB review. Students should examine the  URTP's guide for student researchers: Introduction to Human Subjects Research Protection.  
  • Ethics reports From the Association of Internet Researchers (AoIR)  
  • Respect, Beneficence, and Justice: QDR General Guidance for Human Participants If you are hoping to share your qualitative interview data in a repository after it has been collected, you will need to plan accordingly via informed consent, careful de-identification procedures, and data access controls. Consider  consulting with the Qualitative Research Support Group at Harvard Library  and consulting with  Harvard's Dataverse contacts  to help you think through all of the contingencies and processes.  
  • "Conducting a Qualitative Child Interview: Methodological Considerations." Journal of Advanced Nursing 42/5 (2003): 434-441 by Kortesluoma, R., et al.  The purpose of this article is to illustrate the theoretical premises of child interviewing, as well as to describe some practical methodological solutions used during interviews. Factors that influence data gathered from children and strategies for taking these factors into consideration during the interview are also described.  
  • "Crossing Cultural Barriers in Research Interviewing." Qualitative Social Work 6/3 (2007): 353-372, by Sands, R., et al. This article critically examines a qualitative research interview in which cultural barriers between a white non-Muslim female interviewer and an African American Muslim interviewee, both from the USA, became evident and were overcome within the same interview.
  • Decolonizing Methodologies: Research and Indigenous Peoples by Linda Tuhiwai Smith  This essential volume explores intersections of imperialism and research - specifically, the ways in which imperialism is embedded in disciplines of knowledge and tradition as 'regimes of truth.' Concepts such as 'discovery' and 'claiming' are discussed and an argument presented that the decolonization of research methods will help to reclaim control over indigenous ways of knowing and being. The text includes case-studies and examples, and sections on new indigenous literature and the role of research in indigenous struggles for social justice.  

This resource, sponsored by University of Oregon Libraries, exemplifies the use of interviewing methodologies in research that foregrounds traditional knowledge. The methodology page summarizes the approach.

  • Ethics: The Need to Tread Carefully. Chapter in A Practical Introduction to in-Depth Interviewing by Alan Morris  Pay special attention to the sections in chapter 2 on "How to prevent and respond to ethical issues arising in the course of the interview," "Ethics in the writing up of your interviews," and "The Ethics of Care."  
  • Handbook on Ethical Issues in Anthropology by Joan Cassell (Editor); Sue-Ellen Jacobs (Editor)  This publication of the American Anthropological Association presents and discusses issues and sources on ethics in anthropology, as well as realistic case studies of ethical dilemmas. It is meant to help social science faculty introduce discussions of ethics in their courses. Some of the topics are relevant to interviews, or at least to studies of which interviews are a part. See chapters 3 and 4 for cases, with solutions and commentary, respectively.  
  • Research Ethics from the Chanie Wenjack School for Indigenous Studies, Trent University (Open Access). An overview of Indigenous research ethics and protocols from across the globe.
  • Resources for Equity in Research Consult these resources for guidance on creating and incorporating equitable materials into public health research studies that entail community engagement.

The SAGE Handbook of Qualitative Research Ethics by Ron Iphofen (Editor); Martin Tolich (Editor)  This handbook is a much-needed and in-depth review of the distinctive set of ethical considerations which accompanies qualitative research. This is particularly crucial given the emergent, dynamic and interactional nature of most qualitative research, which too often allows little time for reflection on the important ethical responsibilities and obligations. Contributions from leading international researchers have been carefully organized into six key thematic sections: Part One: Thick Descriptions Of Qualitative Research Ethics; Part Two: Qualitative Research Ethics By Technique; Part Three: Ethics As Politics; Part Four: Qualitative Research Ethics With Vulnerable Groups; Part Five: Relational Research Ethics; Part Six: Researching Digitally. This Handbook is a one-stop resource on qualitative research ethics across the social sciences that draws on the lessons learned and the successful methods for surmounting problems - the tried and true, and the new.

RESEARCH COMPLIANCE AND PRIVACY LAWS

Research Compliance Program for FAS/SEAS at Harvard: The Faculty of Arts and Sciences (FAS), including the School of Engineering and Applied Sciences (SEAS), and the Office of the Vice Provost for Research (OVPR) have established a shared Research Compliance Program (RCP). An area of common concern for interview studies is international projects and collaboration. RCP is a resource to provide guidance on which international activities may be impacted by US sanctions on countries, individuals, or entities and whether licenses or other disclosures are required to ship or otherwise share items, technology, or data with foreign collaborators.

  • Harvard Global Support Services (GSS) is for students, faculty, staff, and researchers who are studying, researching, or working abroad. Their services span safety and security, health, culture, outbound immigration, employment, financial and legal matters, and research center operations. These include travel briefings and registration, emergency response, guidance on international projects, and managing in-country operations.

Generative AI: Harvard-affiliated researchers should not enter data classified as confidential ( Level 2 and above ), including non-public research data, into publicly-available generative AI tools, in accordance with the University’s Information Security Policy. Information shared with generative AI tools using default settings is not private and could expose proprietary or sensitive information to unauthorized parties.

Privacy Laws: Be mindful of any potential privacy laws that may apply wherever you conduct your interviews. The General Data Protection Regulation is a high-profile example (see below):

  • General Data Protection Regulation (GDPR) This Regulation lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data. It protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data. The free movement of personal data within the Union shall be neither restricted nor prohibited for reasons connected with the protection of natural persons with regard to the processing of personal data. For a nice summary of what the GDPR requires, check out the GDPR "crash course" here.

SEEKING CONSENT  

If you would like to see examples of consent forms, ask your local IRB, or take a look at these resources:

  • Model consent forms for oral history, suggested by the Centre for Oral History and Digital Storytelling at Concordia University  
  • For NIH-funded research, see this  resource for developing informed consent language in research studies where data and/or biospecimens will be stored and shared for future use.

POPULATION SAMPLING

If you wish to assemble resources to aid in sampling, such as the USPS Delivery Sequence File, telephone books, or directories of organizations and listservs, please contact our  data librarian  or write to  [email protected] .

  • Research Randomizer   A free web-based service that permits instant random sampling and random assignment. It also contains an interactive tutorial perfect for students taking courses in research methods.  
  • Practical Tools for Designing and Weighting Survey Samples by Richard Valliant; Jill A. Dever; Frauke Kreuter  Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least three audiences: (1) Students seeking a more in-depth understanding of applied sampling either through a second semester-long course or by way of a supplementary reference; (2) Survey statisticians searching for practical guidance on how to apply concepts learned in theoretical or applied sampling courses; and (3) Social scientists and other survey practitioners who desire insight into the statistical thinking and steps taken to design, select, and weight random survey samples. Several survey data sets are used to illustrate how to design samples, to make estimates from complex surveys for use in optimizing the sample allocation, and to calculate weights. Realistic survey projects are used to demonstrate the challenges and provide a context for the solutions. The book covers several topics that either are not included or are dealt with in a limited way in other texts. These areas include: sample size computations for multistage designs; power calculations related to surveys; mathematical programming for sample allocation in a multi-criteria optimization setting; nuts and bolts of area probability sampling; multiphase designs; quality control of survey operations; and statistical software for survey sampling and estimation. An associated R package, PracTools, contains a number of specialized functions for sample size and other calculations. The data sets used in the book are also available in PracTools, so that the reader may replicate the examples or perform further analyses.  
  • Sampling: Design and Analysis by Sharon L. Lohr  Provides a modern introduction to the field of sampling. With a multitude of applications from a variety of disciplines, the book concentrates on the statistical aspects of taking and analyzing a sample. Overall, the book gives guidance on how to tell when a sample is valid or not, and how to design and analyze many different forms of sample surveys.  
  • Sampling Techniques by William G. Cochran  Clearly demonstrates a wide range of sampling methods now in use by governments, in business, market and operations research, social science, medicine, public health, agriculture, and accounting. Gives proofs of all the theoretical results used in modern sampling practice. New topics in this edition include the approximate methods developed for the problem of attaching standard errors or confidence limits to nonlinear estimates made from the results of surveys with complex plans.  
  • "Understanding the Process of Qualitative Data Collection" in Chapter 13 (pp. 103–1162) of 30 Essential Skills for the Qualitative Researcher by John W. Creswell  Provides practical "how-to" information for beginning researchers in the social, behavioral, and health sciences with many applied examples from research design, qualitative inquiry, and mixed methods.The skills presented in this book are crucial for a new qualitative researcher starting a qualitative project.  
  • Survey Methodology by Robert M. Groves; Floyd J. Fowler; Mick P. Couper; James M. Lepkowski; Eleanor Singer; Roger Tourangeau  Coverage includes sampling frame evaluation, sample design, development of questionnaires, evaluation of questions, alternative modes of data collection, interviewing, nonresponse, post-collection processing of survey data, and practices for maintaining scientific integrity.
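Tools like Research Randomizer handle simple random sampling and random assignment for you; the same basic operations can be sketched with nothing but the Python standard library. The participant IDs and group names below are invented for illustration:

```python
import random

def draw_sample(population, k, seed=42):
    """Draw a simple random sample of size k, without replacement."""
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible for auditing
    return rng.sample(population, k)

def random_assignment(participants, groups, seed=42):
    """Shuffle participants, then deal them round-robin into groups."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    return {g: shuffled[i::len(groups)] for i, g in enumerate(groups)}

ids = [f"P{n:03d}" for n in range(1, 101)]  # hypothetical pool: P001..P100
sample = draw_sample(ids, 10)               # 10 interviewees, drawn at random
arms = random_assignment(sample, ["interview", "control"])
```

Fixing the seed makes a draw reproducible, which is useful when documenting your sampling procedure; omit it for a fresh random draw each time.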

The way a qualitative researcher constructs and approaches interview questions should flow from, or align with, the methodological paradigm chosen for the study, whether that paradigm is interpretivist, critical, positivist, or participative in nature (or a combination of these).

Constructing Your Questions

Helpful texts:

  • "Developing Questions" in Chapter 4 (pp. 98–108) of Becoming Qualitative Researchers by Corrine Glesne  Ideal for introducing the novice researcher to the theory and practice of qualitative research, this text opens students to the diverse possibilities within this inquiry approach, while helping them understand how to design and implement specific research methods.  
  • "Learning to Interview in the Social Sciences" Qualitative Inquiry, 9(4) 2003, 643–668 by Roulston, K., deMarrais, K., & Lewis, J. B. See especially the section on "Phrasing and Negotiating Questions" on pages 653-655 and common problems with framing questions noted on pages 659 - 660.  
  • Qualitative Research Interviewing: Biographic Narrative and Semi-Structured Methods (See sections on “Lightly and Heavily Structured Depth Interviewing: Theory-Questions and Interviewer-Questions” and “Preparing for any Interviewing Sequence") by Tom Wengraf  Unique in its conceptual coherence and the level of practical detail, this book provides a comprehensive resource for those concerned with the practice of semi-structured interviewing, the most commonly used interview approach in social research, and in particular for in-depth, biographic narrative interviewing. It covers the full range of practices from the identification of topics through to strategies for writing up research findings in diverse ways.  
  • "Scripting a Qualitative Purpose Statement and Research Questions" in Chapter 12 (pp. 93–102) of 30 Essential Skills for the Qualitative Researcher by John W. Creswell  Provides practical "how-to" information for beginning researchers in the social, behavioral, and health sciences with many applied examples from research design, qualitative inquiry, and mixed methods.The skills presented in this book are crucial for a new qualitative researcher starting a qualitative project.  
  • Some Strategies for Developing Interview Guides for Qualitative Interviews by Sociology Department, Harvard University Includes general advice for conducting qualitative interviews, pros and cons of recording and transcription, guidelines for success, and tips for developing and phrasing effective interview questions.  
  • Tip Sheet on Question Wording by Harvard University Program on Survey Research

Let Theory Guide You:

The quality of your questions depends on how you situate them within a wider body of knowledge. Consider the following advice:

A good literature review has many obvious virtues. It enables the investigator to define problems and assess data. It provides the concepts on which percepts depend. But the literature review has a special importance for the qualitative researcher. This consists of its ability to sharpen his or her capacity for surprise (Lazarsfeld, 1972b). The investigator who is well versed in the literature now has a set of expectations the data can defy. Counterexpectational data are conspicuous, readable, and highly provocative data. They signal the existence of unfulfilled theoretical assumptions, and these are, as Kuhn (1962) has noted, the very origins of intellectual innovation. A thorough review of the literature is, to this extent, a way to manufacture distance. It is a way to let the data of one's research project take issue with the theory of one's field.

McCracken, G. (1988), The Long Interview, Sage: Newbury Park, CA, p. 31

When drafting your interview questions, remember that everything follows from your central research question. Also, on the way to writing your "operationalized" interview questions, it's helpful to draft broader, intermediate questions, couched in theory. Nota bene: While it is important to know the literature well before conducting your interview(s), be careful not to present yourself to your research participant(s) as "the expert," which would be presumptuous and could be intimidating. Rather, the purpose of your knowledge is to make you a better, keener listener.

If you'd like to supplement what you learned about relevant theories through your coursework and literature review, try these sources:

  • Annual Reviews   Review articles sum up the latest research in many fields, including social sciences, biomedicine, life sciences, and physical sciences. These are timely collections of critical reviews written by leading scientists.  
  • HOLLIS - search for resources on theories in your field   Modify this example search by entering the name of your field in place of "your discipline," then hit search.  
  • Oxford Bibliographies   Written and reviewed by academic experts, every article in this database is an authoritative guide to the current scholarship in a variety of fields, containing original commentary and annotations.  
  • ProQuest Dissertations & Theses (PQDT)   Indexes dissertations and masters' theses from most North American graduate schools as well as some European universities. Provides full text for most indexed dissertations from 1990-present.  
  • Very Short Introductions   Launched by Oxford University Press in 1995, Very Short Introductions offer concise introductions to a diverse range of subjects from Climate to Consciousness, Game Theory to Ancient Warfare, Privacy to Islamic History, Economics to Literary Theory.

CONDUCTING INTERVIEWS

Equipment and software:

  • Lamont Library loans microphones and podcast starter kits, which allow you to capture audio (you may record with software such as GarageBand). 
  • Cabot Library  loans digital recording devices, as well as USB microphones.

If you prefer to use your own device, you may purchase a small handheld audio recorder, or use your cell phone.

  • Audio Capture Basics (PDF)  - Helpful instructions, courtesy of the Lamont Library Multimedia Lab.
  • Getting Started with Podcasting/Audio:  Guidelines from Harvard Library's Virtual Media Lab for preparing your interviewee for a web-based recording (e.g., podcast, interview)
  • Camtasia Screen Recorder and Video Editor
  • Zoom: Video Conferencing, Web Conferencing
  • Visit the Multimedia Production Resources guide! Consult it to find and learn how to use audiovisual production tools, including cameras, microphones, studio spaces, and other equipment at Cabot Science Library and Lamont Library.
  • Try the virtual office hours offered by the Lamont Multimedia Lab!

TIPS FOR CONDUCTING INTERVIEWS

Quick handout:

  • Research Interviewing Tips (Courtesy of Dr. Suzanne Spreadbury)

Remote Interviews:  

  • For online or distant interviews, see "Remote Research & Virtual Fieldwork" on this guide.  
  • Deborah Lupton's Bibliography: Doing Fieldwork in a Pandemic

Books and articles:

  • "App-Based Textual Interviews: Interacting With Younger Generations in a Digitalized Social Reallity."International Journal of Social Research Methodology (12 June 2022). Discusses the use of texting platforms as a means to reach young people. Recommends useful question formulations for this medium.  
  • "Learning to Interview in the Social Sciences." Qualitative Inquiry, 9(4) 2003, 643–668 by Roulston, K., deMarrais, K., & Lewis, J. B. See especially the section on "Phrasing and Negotiating Questions" on pages 653-655 and common problems with framing questions noted on pages 659-660.  
  • "Slowing Down and Digging Deep: Teaching Students to Examine Interview Interaction in Depth." LEARNing Landscapes, Spring 2021 14(1) 153-169 by Herron, Brigette A. and Kathryn Roulston. Suggests analysis of videorecorded interviews as a precursor to formulating one's own questions. Includes helpful types of probes.  
  • Using Interviews in a Research Project by Nigel Joseph Mathers; Nicholas J Fox; Amanda Hunn; Trent Focus Group.  A work pack to guide researchers in developing interviews in the healthcare field. Describes interview structures, compares face-to-face and telephone interviews. Outlines the ways in which different types of interview data can be analysed.  
  • “Working through Challenges in Doing Interview Research.” International Journal of Qualitative Methods, (December 2011), 348–66 by Roulston, Kathryn.  The article explores (1) how problematic interactions identified in the analysis of focus group data can lead to modifications in research design, (2) an approach to dealing with reported data in representations of findings, and (3) how data analysis can inform question formulation in successive rounds of data generation. Findings from these types of examinations of interview data generation and analysis are valuable for informing both interview practice as well as research design.


The way a qualitative researcher transcribes interviews should flow from, or align with, the methodological paradigm chosen for the study, whether that paradigm is interpretivist, critical, positivist, or participative in nature (or a combination of these).

TRANSCRIPTION

Before embarking on a transcription project, it's worthwhile to invest in the time and effort necessary to capture good audio, which will make the transcription process much easier. If you haven't already done so, check out the  audio capture guidelines from Harvard Library's Virtual Media Lab , or  contact a media staff member  for customized recommendations. First and foremost, be mindful of common pitfalls by watching this short video that identifies  the most common errors to avoid!

SOFTWARE:  

  • Adobe Premiere Pro Speech-To-Text  automatically generates transcripts and adds captions to your videos. Harvard affiliates can download Adobe Premiere in the Creative Cloud Suite.  
  • GoTranscript  provides cost-effective human-generated transcriptions.  
  • pyTranscriber  is an app for generating automatic transcription and/or subtitles for audio and video files. It uses the Google Cloud Speech-to-Text service, has a friendly graphical user interface, and is purported to work nicely with Chinese.   
  • Otter provides a new way to capture, store, search and share voice conversations, lectures, presentations, meetings, and interviews. The startup is based in Silicon Valley with a team of experienced PhDs and engineers from Google, Facebook, Yahoo and Nuance (à la Dragon). Free accounts available. This is the software that Zoom uses to generate automated transcripts, so if you have access to a Zoom subscription, you have access to Otter transcriptions with it (applicable in several languages). As with any automated approach, be prepared to correct any errors after the fact, by hand.  
  • Panopto  is available to Harvard affiliates and generates  ASR (automated speech recognition) captions . You may upload compatible audio files into it. As with any automatically generated transcription, you will need to make manual revisions. ASR captioning is available in several  languages . Panopto maintains robust security practices, including strong authentication measures and end-to-end encryption, ensuring your content remains private and protected.  
  • Rev.com allows you to record and transcribe any calls on the iPhone, both outgoing and incoming. It may be useful for recording phone interviews. Rev lets you choose whether you want an AI- or human-generated transcription, with a fast turnaround. Rev has Service Organization Controls Type II (SOC2) certification (a SOC2 cert looks at and verifies an organization's processing integrity, privacy practices, and security safeguards).   
  • Scribie Audio/Video Transcription  provides automated or manual transcriptions for a small fee. As with any transcription service, some revisions will be necessary after the fact, particularly for its automated transcripts.  
  • Sonix  automatically transcribes, translates, and helps to organize audio and video files in over 40 languages. It's fast and affordable, with good accuracy. The free trial includes 30 minutes of free transcription.  
  • TranscriptionWing  uses a human touch process to clean up machine-generated transcripts so that the content will far more accurately reflect your audio recording.   
  • Whisper is a tool from OpenAI that facilitates transcription of sensitive audiovisual recordings (e.g., of research interviews) on your own device. Installation and use depend on your operating system and which version you install. Important note: The Whisper API, where audio is sent to OpenAI to be processed by them and then sent back (usually through a programming language like Python), is NOT appropriate for sensitive data. The model should be downloaded with tools such as those described in this FAQ, so that audio is kept on your local machine. For assistance, contact James Capobianco.

EQUIPMENT:  

  • Transcription pedals are in circulation and available to borrow from the Circulation desk at Lamont, or to use at Lamont Library's Media Lab on level B. For hand-transcribing your interviews, they work in conjunction with software such as Express Scribe, which is loaded on Media Lab computers and which you may also download for free on your own machine (Mac or PC versions; scroll down the downloads page for the latter). The pedals are plug-and-play USB, allow a wide range of playback speeds, and have 3 programmable buttons, typically set to rewind/play/fast-forward. Instructions included in the bag cover installation and set-up of the software and basic use of the pedals.

NEED HELP?  

  • Try the virtual office hours offered by the Lamont Multimedia Lab!    
  • If you're creating podcasts, login to  Canvas  and check out the  Podcasting/Audio guide . 

Helpful Texts:  

  • "Transcription as a Crucial Step of Data Analysis" in Chapter 5 of The SAGE Handbook of Qualitative Data Analysisby Uwe Flick (Editor)  Covers basic terminology for transcription, shares caveats for transcribers, and identifies components of vocal behavior. Provides notation systems for transcription, suggestions for transcribing turn-taking, and discusses new technologies and perspectives. Includes a bibliography for further reading.  
  • "Transcribing the Oral Interview: Part Art, Part Science " on p. 10 of the Centre for Community Knowledge (CCK) newsletter: TIMESTAMPby Mishika Chauhan and Saransh Srivastav

QUALITATIVE DATA ANALYSIS

Software:

  • Free download available for Harvard Faculty of Arts and Sciences (FAS) affiliates
  • Desktop access at Lamont Library Media Lab, 3rd floor
  • Desktop access at Harvard Kennedy School Library (with HKS ID)
  • Remote desktop access for Harvard affiliates from  IQSS Computer Labs . Email them at  [email protected] and ask for a new lab account and remote desktop access to NVivo.
  • Virtual Desktop Infrastructure (VDI) access available to Harvard T.H. Chan School of Public Health affiliates

CODING AND THEMEING YOUR DATA

Data analysis methods should flow from, or align with, the methodological paradigm chosen for your study, whether that paradigm is interpretivist, critical, positivist, or participative in nature (or a combination of these). Some established methods include Content Analysis, Critical Analysis, Discourse Analysis, Gestalt Analysis, Grounded Theory Analysis, Interpretive Analysis, Narrative Analysis, Normative Analysis, Phenomenological Analysis, Rhetorical Analysis, and Semiotic Analysis, among others. The following resources should help you navigate your methodological options and put into practice methods for coding, themeing, interpreting, and presenting your data.

  • SAGE Research Methods (SRM)  Users can browse content by topic, discipline, or format type (reference works, book chapters, definitions, etc.). SRM offers several research tools as well: a methods map, user-created reading lists, a project planner, and advice on choosing statistical tests.  
  • Abductive Coding: Theory Building and Qualitative (Re)Analysis by Vila-Henninger, et al.  The authors recommend an abductive approach to guide qualitative researchers who are oriented towards theory-building. They outline a set of tactics for abductive analysis, including the generation of an abductive codebook, abductive data reduction through code equations, and in-depth abductive qualitative analysis.  
  • Analyzing and Interpreting Qualitative Research: After the Interview by Charles F. Vanover, Paul A. Mihas, and Johnny Saldana (Editors)   Providing insight into the wide range of approaches available to the qualitative researcher and covering all steps in the research process, the authors utilize a consistent chapter structure that provides novice and seasoned researchers with pragmatic, "how-to" strategies. Each chapter author introduces the method, uses one of their own research projects as a case study of the method described, shows how the specific analytic method can be used in other types of studies, and concludes with three questions/activities to prompt class discussion or personal study.   
  • "Analyzing Qualitative Data." Theory Into Practice 39, no. 3 (2000): 146-54 by Margaret D. LeCompte   This article walks readers though rules for unbiased data analysis and provides guidance for getting organized, finding items, creating stable sets of items, creating patterns, assembling structures, and conducting data validity checks.  
  • "Coding is Not a Dirty Word" in Chapter 1 (pp. 1–30) of Enhancing Qualitative and Mixed Methods Research with Technology by Shalin Hai-Jew (Editor)   Current discourses in qualitative research, especially those situated in postmodernism, represent coding and the technology that assists with coding as reductive, lacking complexity, and detached from theory. In this chapter, the author presents a counter-narrative to this dominant discourse in qualitative research. The author argues that coding is not necessarily devoid of theory, nor does the use of software for data management and analysis automatically render scholarship theoretically lightweight or barren. A lack of deep analytical insight is a consequence not of software but of epistemology. Using examples informed by interpretive and critical approaches, the author demonstrates how NVivo can provide an effective tool for data management and analysis. The author also highlights ideas for critical and deconstructive approaches in qualitative inquiry while using NVivo. By troubling the positivist discourse of coding, the author seeks to create dialogic spaces that integrate theory with technology-driven data management and analysis, while maintaining the depth and rigor of qualitative research.   
  • The Coding Manual for Qualitative Researchers by Johnny Saldana   An in-depth guide to the multiple approaches available for coding qualitative data. Clear, practical and authoritative, the book profiles 32 coding methods that can be applied to a range of research genres from grounded theory to phenomenology to narrative inquiry. For each approach, Saldaña discusses the methods, origins, a description of the method, practical applications, and a clearly illustrated example with analytic follow-up. Essential reading across the social sciences.  
  • Flexible Coding of In-depth Interviews: A Twenty-first-century Approach by Nicole M. Deterding and Mary C. Waters The authors suggest steps in data organization and analysis to better utilize qualitative data analysis technologies and support rigorous, transparent, and flexible analysis of in-depth interview data.  
  • From the Editors: What Grounded Theory is Not by Roy Suddaby Walks readers through common misconceptions that hinder grounded theory studies, reinforcing the two key concepts of the grounded theory approach: (1) constant comparison of data gathered throughout the data collection process and (2) the determination of which kinds of data to sample in succession based on emergent themes (i.e., "theoretical sampling").  
  • “Good enough” methods for life-story analysis, by Wendy Luttrell. In Quinn N. (Ed.), Finding culture in talk (pp. 243–268). Demonstrates for researchers of culture and consciousness who use narrative how to concretely document reflexive processes in terms of where, how and why particular decisions are made at particular stages of the research process.   
  • The Ethnographic Interview by James P. Spradley  “Spradley wrote this book for the professional and student who have never done ethnographic fieldwork (p. 231) and for the professional ethnographer who is interested in adapting the author’s procedures (p. iv) ... Steps 6 and 8 explain lucidly how to construct a domain and a taxonomic analysis” (excerpted from book review by James D. Sexton, 1980). See also:  Presentation slides on coding and themeing your data, derived from Saldana, Spradley, and LeCompte Click to request access.  
  • Qualitative Data Analysis by Matthew B. Miles; A. Michael Huberman   A practical sourcebook for researchers who make use of qualitative data, presenting the current state of the craft in the design, testing, and use of qualitative analysis methods. Strong emphasis is placed on data displays matrices and networks that go beyond ordinary narrative text. Each method of data display and analysis is described and illustrated.  
  • "A Survey of Qualitative Data Analytic Methods" in Chapter 4 (pp. 89–138) of Fundamentals of Qualitative Research by Johnny Saldana   Provides an in-depth introduction to coding as a heuristic, particularly focusing on process coding, in vivo coding, descriptive coding, values coding, dramaturgical coding, and versus coding. Includes advice on writing analytic memos, developing categories, and themeing data.   
  • "Thematic Networks: An Analytic Tool for Qualitative Research." Qualitative Research : QR, 1(3), 385–405 by Jennifer Attride-Stirling Details a technique for conducting thematic analysis of qualitative material, presenting a step-by-step guide of the analytic process, with the aid of an empirical example. The analytic method presented employs established, well-known techniques; the article proposes that thematic analyses can be usefully aided by and presented as thematic networks.  
  • Using Thematic Analysis in Psychology by Virginia Braun and Victoria Clarke Walks readers through the process of reflexive thematic analysis, step by step. The method may be adapted in fields outside of psychology as relevant. Pair this with One Size Fits All? What Counts as Quality Practice in Reflexive Thematic Analysis? by Virginia Braun and Victoria Clarke
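The coding methods surveyed above are analytic rather than computational, but once segments have been coded (by hand or in software such as NVivo), a common first pass is to tally how often each code was applied and to gather excerpts under each code for comparison. A minimal sketch in Python, with invented codes and excerpts:

```python
from collections import Counter, defaultdict

# Hypothetical coded segments from an interview transcript: (code, excerpt) pairs.
segments = [
    ("trust", "I only share that with close friends."),
    ("privacy", "I read the consent form twice."),
    ("trust", "It took months before I opened up."),
    ("workload", "There was never enough time."),
]

code_counts = Counter(code for code, _ in segments)  # how often each code was applied
by_code = defaultdict(list)
for code, excerpt in segments:
    by_code[code].append(excerpt)  # group excerpts under their code for review
```

Frequencies alone do not make a theme; as Braun and Clarke stress, themes are built by interpreting the grouped excerpts, not by counting them.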

TESTING OR GENERATING THEORIES

The quality of your data analysis depends on how you situate what you learn within a wider body of knowledge. Consider the following advice:

Once you have coalesced around a theory, realize that a theory should reveal rather than color your discoveries. Allow your data to guide you to what's most suitable. Grounded theory researchers may develop their own theory where current theories fail to provide insight. This guide on Theoretical Models from Alfaisal University Library provides a helpful overview of using theory.

MANAGING & FINDING INTERVIEW DATA

Managing your elicited interview data, general guidance:

  • Research Data Management @ Harvard A reference guide with information and resources to help you manage your research data. See also: Harvard Research Data Security Policy , on the Harvard University Research Data Management website.  
  • Data Management For Researchers: Organize, Maintain and Share Your Data for Research Success by Kristin Briney. A comprehensive guide for scientific researchers providing everything they need to know about data management and how to organize, document, use and reuse their data.  
  • Open Science Framework (OSF) An open-source project management tool that makes it easy to collaborate within and beyond Harvard throughout a project's lifecycle. With OSF you can manage, store, and share documents, datasets, and other information with your research team. You can also publish your work to share it with a wider audience. Although data can be stored privately, because this platform is hosted on the Internet and designed with open access in mind, it is not a good choice for highly sensitive data.  
  • Free cloud storage solutions for Harvard affiliates to consider include: Google Drive, Dropbox, or OneDrive (up to DSL3)  

Data Confidentiality and Secure Handling:  

  • Data Security Levels at Harvard - Research Data Examples This resource provided by Harvard Data Security helps you determine what level of access is appropriate for your data. Determine whether it should be made available for public use, limited to the Harvard community, or be protected as either "confidential and sensitive," "high risk," or "extremely sensitive." See also:  Harvard Data Classification Table  
  • Harvard's Best Practices for Protecting Privacy and  Harvard Information Security Collaboration Tools Matrix Follow the nuts-and-bolts advice for privacy best practices at Harvard. The latter resource reveals the level of security that can be relied upon for a large number of technological tools and platforms used at Harvard to conduct business, such as email, Slack, Accellion Kiteworks, OneDrive/SharePoint, etc.  
  • “Protecting Participant Privacy While Maintaining Content and Context: Challenges in Qualitative Data De‐identification and Sharing.” Proceedings of the ASIST Annual Meeting 57 (1) (2020): e415-420 by Myers, Long, and Polasek Presents an informed and tested protocol, based on the De-Identification guidelines published by the Qualitative Data Repository (QDR) at Syracuse University. Qualitative researchers may consult it to guide their data de-identification efforts.  
  • QDS Qualitative Data Sharing Toolkit The Qualitative Data Sharing (QDS) project and its toolkit was funded by the NIH National Human Genome Research Institute (R01HG009351). It provides tools and resources to help researchers, especially those in the health sciences, share qualitative research data while protecting privacy and confidentiality. It offers guidance on preparing data for sharing through de-identification and access control. These health sciences research datasets in ICPSR's Qualitative Data Sharing (QDS) Project Series were de-identified using the QuaDS Software and the project’s QDS guidelines.  
  • Table of De-Identification Techniques  
  • Generative AI Harvard-affiliated researchers should not enter data classified as confidential ( Level 2 and above ), including non-public research data, into publicly-available generative AI tools, in accordance with the University’s Information Security Policy. Information shared with generative AI tools using default settings is not private and could expose proprietary or sensitive information to unauthorized parties.  
  • Harvard Information Security Quick Reference Guide Storage guidelines, based on the data's security classification level (according to its IRB classification) is displayed on page 2, under "handling."  
  • Email Encryption Harvard Microsoft 365 users can now send encrypted messages and files directly from the Outlook web or desktop apps. Encrypting an email adds an extra layer of security to the message and its attachments (up to 150MB), and means only the intended recipient (and their inbox delegates with full access) can view it. Message encryption in Outlook is approved for sending high risk ( level 4 ) data and below.  
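
Much de-identification work ultimately comes down to consistently replacing identifying strings with pseudonyms before a transcript leaves the secure environment. Below is a minimal sketch of that substitution step; the names, places, and pseudonym scheme are hypothetical, and real projects should follow the QDR-based guidelines and tools listed above rather than rely on a script like this alone.

```python
import re

# Hypothetical mapping from identifying strings to pseudonyms.
# In practice, store this key file separately from the de-identified transcripts.
PSEUDONYMS = {
    "Maria Lopez": "Participant 07",
    "Eldoret": "[CITY]",
}

def deidentify(text: str) -> str:
    """Replace each identifying string with its assigned pseudonym."""
    for real, fake in PSEUDONYMS.items():
        text = re.sub(re.escape(real), fake, text)
    return text

print(deidentify("Maria Lopez described the clinic in Eldoret."))
# -> Participant 07 described the clinic in [CITY].
```

Plain string replacement would also work for literal names; the regex form is used here so the same loop can later accommodate patterns such as phone numbers or dates.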

Sharing Qualitative Data:  

  • Repositories for Qualitative Data If you have cleared this intention with your IRB, secured consent from participants, and properly de-identified your data, consider sharing your interviews in one of the data repositories included in the link above. Depending on the nature of your research and the level of risk it may present to participants, sharing your interview data may not be appropriate. If there is any chance that sharing such data will be desirable, you will be much better off if you build this expectation into your plans from the beginning.  
  • Guide for Sharing Qualitative Data at ICPSR The Inter-university Consortium for Political and Social Research (ICPSR) has created this resource for investigators planning to share qualitative data at ICPSR. This guide provides an overview of elements and considerations for archiving qualitative data, identifies steps for investigators to follow during the research life cycle to ensure that others can share and reuse qualitative data, and provides information about exemplars of qualitative data  

International Projects:

  • Research Compliance Program for FAS/SEAS at Harvard The Faculty of Arts and Sciences (FAS), including the School of Engineering and Applied Sciences (SEAS), and the Office of the Vice Provost for Research (OVPR) have established a shared Research Compliance Program (RCP). An area of common concern for interview studies is international projects and collaboration . RCP is a resource to provide guidance on which international activities may be impacted by US sanctions on countries, individuals, or entities and whether licenses or other disclosure are required to ship or otherwise share items, technology, or data with foreign collaborators.

Finding Extant Interview Data

Finding journalistic interviews:

  • Academic Search Premier This all-purpose database is great for finding articles from magazines and newspapers. In the Advanced Search, it allows you to specify "Document Type":  Interview.  
  • Guide to Newspapers and Newspaper Indexes Use this guide, created by Harvard librarians, to identify newspaper collections you'd like to search. To locate interviews, try adding the term "interview" to your search, or explore a database's search interface for options to limit your search to interviews. Nexis Uni and Factiva are the two main databases for current news.
  • Listen Notes Search for podcast episodes at this podcast aggregator, and look for podcasts that include interviews. Make sure to vet the podcaster for accuracy and quality! (Listen Notes does not do much vetting.)  
  • NPR  and  ProPublica  are two sites that offer high-quality long-form reporting, including journalistic interviews, for free.

Finding Oral History and Social Research Interviews:  

  • To find oral histories, see the Oral History   page of this guide for helpful resources on Oral History interviewing.  
  • Repositories for Qualitative Data It has not been a customary practice among qualitative researchers in the social sciences to share raw interview data, but some have made this data available in repositories, such as the ones listed on the page linked above. You may find published data from structured interview surveys (e.g., questionnaire-based computer-assisted telephone interview data), as well as some semi-structured and unstructured interviews.  
  • If you are merely interested in studies interpreting data collected using interviews, rather than finding raw interview data, try databases like  PsycInfo ,  Sociological Abstracts , or  Anthropology Plus , among others. 

Finding Interviews in Archival Collections at Harvard Library:

In addition to the databases and search strategies mentioned under the  "Finding Oral History and Social Research Interviews" category above,  you may search for interviews and oral histories (whether in textual or audiovisual formats) held in archival collections at Harvard Library.

  • HOLLIS searches all documented collections at Harvard, whereas HOLLIS for Archival Discovery searches only those with finding aids. Although HOLLIS for Archival Discovery covers less material, you may find it easier to parse your search results, especially when you wish to view results at the item level (within collections). Try these approaches:

Search in  HOLLIS :  

  • To retrieve items available online, do an Advanced Search for  interview* OR "oral histor*" (in Subject), with Resource Type "Archives/Manuscripts," then refine your search by selecting "Online" under "Show Only" on the right of your initial result list.  Revise the search above by adding your topic in the Keywords or Subject field (for example:  African Americans ) and resubmitting the search.  
  •  To enlarge your results set, you may also leave out the "Online" refinement; if you'd like to limit your search to a specific repository, try the technique of searching for  Code: Library + Collection on the "Advanced Search" page .   

Search in  HOLLIS for Archival Discovery :  

  • To retrieve items available online, search for   interview* OR "oral histor*" limited to digital materials . Revise the search above by adding your topic (for example:  artist* ) in the second search box (if you don't see the box, click +).  
  • To preview results by collection, search for interview* OR "oral histor*" limited to collections. Revise the search above by adding your topic (for example: artist*) in the second search box (if you don't see the box, click +). Although this method does not allow you to isolate digitized content, you may find the refinement options on the right side of the screen (refine by repository, subject, or names) helpful. Once you select a given collection, you may search within it (e.g., for your topic or the term interview).

UX & MARKET RESEARCH INTERVIEWS

UX at Harvard Library

  • User Experience and Market Research interviews can inform the design of tangible products and services through responsive, outcome-driven insights. The  User Research Center  at Harvard Library specializes in this kind of user-centered design, digital accessibility, and testing. They also offer guidance and  resources  to members of the Harvard Community who are interested in learning more about UX methods. Contact [email protected] or consult the URC website for more information.

Websites  

  • User Interviews: The Beginner’s Guide (Chris Mears)  
  • Interviewing Users (Jakob Nielsen)

Books  

  • Interviewing Users: How to Uncover Compelling Insights by Steve Portigal; Grant McCracken (Foreword by)  Interviewing is a foundational user research tool that people assume they already possess. Everyone can ask questions, right? Unfortunately, that's not the case. Interviewing Users provides invaluable interviewing techniques and tools that enable you to conduct informative interviews with anyone. You'll move from simply gathering data to uncovering powerful insights about people.  
  • Rapid Contextual Design by Jessamyn Wendell; Karen Holtzblatt; Shelley Wood  This handbook introduces Rapid CD, a fast-paced, adaptive form of Contextual Design. Rapid CD is a hands-on guide for anyone who needs practical guidance on how to use the Contextual Design process and adapt it to tactical projects with tight timelines and resources. Rapid Contextual Design provides detailed suggestions on structuring the project and customer interviews, conducting interviews, and running interpretation sessions. The handbook walks you step-by-step through organizing the data so you can see your key issues, visioning new solutions, storyboarding to work out the details, and paper prototype interviewing to iterate the design, all with as little as a two-person team and only a few weeks to spare. Includes real project examples with actual customer data that illustrate how a CD project actually works.


Instructional Presentations on Interview Skills  

  • Interview/Oral History Research for RSRA 298B: Master's Thesis Reading and Research (Spring 2023) Slideshow covers: Why Interviews?, Getting Context, Engaging Participants, Conducting the Interview, The Interview Guide, Note Taking, Transcription, File management, and Data Analysis.  
  • Interview Skills From an online class on February 13, 2023:  Get set up for interview research. You will leave prepared to choose among the three types of interviewing methods, equipped to develop an interview schedule, aware of data management options and their ethical implications, and knowledgeable of technologies you can use to record and transcribe your interviews. This workshop complements Intro to NVivo, a qualitative data analysis tool useful for coding interview data.

NIH Data Management & Sharing Policy (DMSP) This policy, effective January 25, 2023, applies to all research, funded or conducted in whole or in part by NIH, that results in the generation of scientific data, including NIH-funded qualitative research. See examples of how the DMSP has been applied in qualitative research studies featured in the 2021 Qualitative Data Management Plan (DMP) Competition. As a resource for the community, NIH has developed guidance on informed consent language for research studies where data and/or biospecimens will be stored and shared for future use. It is important to note that the DMS Policy does NOT require that informed consent obtained from research participants allow for broad sharing and future use of data (either with or without identifiable private information). See the FAQ for more information.


Except where otherwise noted, this work is subject to a Creative Commons Attribution 4.0 International License , which allows anyone to share and adapt our material as long as proper attribution is given. For details and exceptions, see the Harvard Library Copyright Policy ©2021 Presidents and Fellows of Harvard College.

Five Tips for Conducting Effective Qualitative Interviews

An interviewer conducts a household survey in rural El Salvador for a Center for Health Policy and Inequalities Research study. Photo by Hy V. Huynh.

Published March 12, 2018 under Research News

In qualitative research, in-depth interviews can be an immensely helpful investigative tool. However, the nuances of one-on-one interviewing can sometimes make it difficult to obtain useful results. Rae Jean Proeschold-Bell , associate research professor and founding director of the Evidence Lab at the Duke Global Health Institute, frequently integrates qualitative interviews into her research. In this article, she shares five interviewing tips that have served her well.

1. Convey Intent

Proeschold-Bell says it’s important for the interviewer to know the intent behind each question so that it can be clearly conveyed to the interviewee. Understanding the intent of a question, she’s found, helps interviewers decide whether or not the participant has fully answered the question. This way, they can ask follow-up questions and not leave gaps at the time of data collection. Proeschold-Bell recommends writing the intent of each question below it in italics on the interview script. 

Proeschold-Bell also suggests a few more subtle techniques for helping interviewees understand what is really being asked and soliciting pertinent and thorough responses. Asking the question in several different ways can help clarify its meaning. Follow-up prompts such as “That’s really helpful; tell me more about that,” or “Can you describe what was unpleasant about it?” can also give interviewees helpful guidance in crafting their responses.

“You can also convey intent by explaining more broadly why you’re doing the research, so interviewees will be more likely to give you relevant information,” Proeschold-Bell said. 

2. Don’t Sway the Participants

Acquiescence bias, which occurs when interviewees agree with what they think the interviewer wants to hear instead of giving their unbiased answer, can often prevent interviewees from sharing all relevant information. Research from Focus Groups: A Practical Guide for Applied Research shows that when power dynamics are present in an interview, it may be especially difficult for an interviewee to give an honest answer.

To minimize acquiescence bias, interviewers can emphasize that the participant is the expert in the subject matter of the interview.  For example, they can start the interview by saying, “I’ve asked you to talk with me today because you are an expert in what it’s like to be a patient in Eldoret.” 

Interviewers should also avoid nodding or other body language that expresses agreement with the participant. Instead, interviewers should say, “That’s very helpful,” or “Thank you for those thoughts.” Otherwise, participants might elaborate on a point that isn’t actually very important to them just because the interviewer seemed to agree.   

Proeschold-Bell also recommends that interviewers pay attention to—and record—interviewees’ non-verbal responses, which often communicate feelings and attitudes that the verbal response doesn’t capture.

3. Eliminate Interviewer Bias

Proeschold-Bell says it’s critically important to eliminate interviewer bias throughout the interview process. Knowing the interview guide extremely well helps an interviewer pace the interview to avoid running out of time, and adhering to the scripted wording for each question helps maintain unbiased prompting across all interviews. Additionally, if an interviewee starts answering a question that is going to be asked later, the interviewer can ask them to wait.

It’s best to ask interview questions in a specific order because covering certain questions first may influence how interviewees think during later questions. Finally, she recommends, “Ask all questions of all respondents, even if you think you know what they’ll say. They will surprise you sometimes!”

4. Consider a “Test Run” Period

Proeschold-Bell sees her first several interviews for a study as pilots. Learning from these first few test runs and improving questions and interview techniques for future interviews can have a significant impact on the quality of the study. This means that data quality from the first few interviews may not be as strong since some of the questions change, but the data from the interviews later on will be more useful. Proeschold-Bell recommends numbering interviews chronologically to link interviews to the phase of development in which they were conducted.
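
Her suggestion to number interviews chronologically can be implemented with a consistent, sortable file-naming scheme. The sketch below is one illustrative convention; the study code and guide-version field are assumptions for the example, not part of her method.

```python
from datetime import date

def interview_filename(study: str, n: int, guide_version: int, d: date) -> str:
    """Build a sortable filename that ties an interview's chronological
    number to the version of the interview guide in use at the time."""
    return f"{study}_{d.isoformat()}_INT{n:03d}_guidev{guide_version}.txt"

print(interview_filename("ELD", 4, 2, date(2018, 3, 12)))
# -> ELD_2018-03-12_INT004_guidev2.txt
```

Zero-padding the interview number keeps files in chronological order in any file browser, and the guide-version suffix makes it easy to separate pilot-phase data from later interviews during analysis.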

5. Make Time for Post-Interview Reflection

After an interview, Proeschold-Bell recommends immediately reviewing the data. “This helps capture good ideas that may otherwise be forgotten,” she says. In fact, she suggests creating a review form with a few open-ended questions that can help capture strong reactions and flag questions that didn’t work well or questions that should be added. 

It’s also helpful, she says, to note responses that were different from those given in previous interviews. Doing this may generate ideas to analyze more carefully later on.

Looking for more research design tools? Check out Proeschold-Bell’s recent article, “ Five Tips for Designing an Effective Survey .”



Qualitative Interview: What it is & How to conduct one


A qualitative interview is commonly used in research projects involving new products, brand positioning, purchase dynamics, market research, social research, behavioral analysis, exploring market segments, and more. Recent data also suggests that it is highly effective in employee engagement initiatives.

It has also proven helpful for defining problems and for developing an approach to a particular problem.

What is a Qualitative Interview?

A qualitative interview is a research approach used in a qualitative study when more personal interaction is required and detailed, in-depth information is gathered from the participant. Qualitative interviews usually involve follow-up questions and are conducted in a conversation or discussion format.

A qualitative interview is a more personal form of research than general questionnaires or focus group studies. Such formats often include open-ended and follow-up questions.


How to conduct a Qualitative Interview?

Conducting a qualitative interview requires careful planning and implementation to ensure that you gather meaningful and rich data. Here are some steps to consider when conducting a qualitative research interview:

1. Define your research objectives. Clearly define the purpose of your interview and the specific research questions you want to address. This will help you design appropriate interview questions and an interview guide for your analysis.

2. Identify and recruit participants. Identify the target population or the specific individuals who can provide valuable insights related to your research questions. Consider criteria such as demographics, expertise, or experiences that align with your objectives, and use appropriate methods, such as purposive sampling, to recruit participants who can offer diverse perspectives.

3. Obtain informed consent. Before conducting the interview, ensure that participants understand the purpose, procedures, and potential risks or benefits of their involvement. Obtain their informed consent, clearly explaining their rights as participants, including confidentiality and their ability to withdraw from the study at any time.

4. Prepare an interview guide. Prepare a flexible guide with a set of open-ended questions designed to elicit participants’ perspectives, experiences, and insights related to your research objectives. Consider using probing techniques to encourage participants to elaborate on their responses and explore different dimensions of the topic.

5. Set the stage. Select a location that is comfortable, private, and free from distractions. Create a relaxed and welcoming atmosphere to help participants feel at ease and encourage open communication. Establish rapport and build trust by introducing yourself, explaining the purpose of the interview, and actively listening to participants’ responses.

6. Conduct the interview. Start with introductory questions to establish rapport. Follow the interview guide, but remain flexible and responsive to participants’ responses. Allow participants to speak freely and provide detailed answers, using probes to delve deeper into their experiences, emotions, and perspectives. Take notes or record the interview (with participants’ consent) to capture accurate and detailed information.

7. Stay neutral. Show respect for participants’ experiences and perspectives, even if they differ from yours. Avoid making judgments or imposing your own beliefs during the interview. Create a non-judgmental and inclusive environment that encourages participants to share their thoughts and feelings honestly.

8. Transcribe and analyze. Transcribe the recordings or review your notes promptly after each interview while the details remain fresh. Analyze the data using appropriate methods, such as thematic analysis, to identify patterns, themes, and insights. Ensure that the data is anonymized and handled according to ethical guidelines.
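
Thematic analysis itself is a careful, interpretive process, but a rough first pass over transcripts is sometimes automated to flag which ones to read first. Below is a minimal sketch of such a keyword tally; the codebook, theme names, and keywords are hypothetical, and no tally substitutes for actual coding of the data.

```python
import re
from collections import Counter

# Hypothetical codebook: candidate theme -> keyword stems that suggest it.
CODEBOOK = {
    "access_barriers": ["wait", "distance", "cost", "transport"],
    "trust": ["trust", "confide", "believe", "honest"],
}

def tally_themes(transcript: str) -> Counter:
    """Count how often each candidate theme's keyword stems appear."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter()
    for theme, keywords in CODEBOOK.items():
        counts[theme] = sum(w.startswith(k) for w in words for k in keywords)
    return counts

text = "The wait was long and the cost too high, but I trust my doctor."
print(tally_themes(text))
```

Here "wait" and "cost" each register once under access_barriers and "trust" once under trust; a transcript with unusually high counts for a theme is a candidate for early, careful manual coding.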

By following these steps, you can conduct a qualitative research interview that facilitates rich and meaningful discussions, resulting in valuable data analysis for your research process.


Types of Qualitative Interviews

Interviews can be conducted in several formats, and qualitative researchers choose among them based on their research objectives and the nature of the study. Here are the most common types of qualitative interviews:


Structured interviews involve a predetermined set of questions asked in the same order and manner to each participant. The questions are typically closed-ended or have limited response options. This type of interview is appropriate when researchers aim to collect specific information in a standardized way, allowing for easier comparison and analysis of responses across participants.

Semi-structured interviews combine predetermined questions with the flexibility for additional probing and follow-up questions. Researchers have a set of core questions to guide the interview but can adapt the process based on participants’ responses. This approach allows a deeper exploration of participants’ experiences, thoughts, and perspectives while maintaining some level of standardization.

Unstructured interviews involve open-ended questions and a free-flowing conversation between the interviewer and the participant. The interviewer may have a general topic or area of interest but allows the conversation to evolve naturally. Unstructured interviews provide a high degree of flexibility and allow participants to express themselves more freely, often leading to rich and nuanced data.

Each interview type has its strengths and suits different research purposes. Researchers should carefully select the type that aligns with their research objectives, the nature of the phenomenon under investigation, and the population being studied.


Advantages of Using Qualitative Interviews

Qualitative interview techniques offer several advantages as a research method. Here are some of the key advantages:

A qualitative interview allows researchers to delve deeply into participants’ experiences, perspectives, and opinions. Using open-ended questions and probing techniques, researchers can uncover rich and detailed information beyond mere surface-level responses. This in-depth exploration provides a comprehensive understanding of the research topic.

Qualitative interviews offer flexibility in adapting the data collection process to the specific needs of each participant. Researchers can tailor their questions, follow-up probes, and overall approach based on the participant’s responses, allowing for a more personalized and engaging research experience. This flexibility enhances the quality and richness of the data.

Qualitative interviews prioritize the voices and perspectives of participants. Through interactive and conversational exchanges, participants can express their thoughts, emotions, and beliefs in their own words. This approach ensures that the research captures individuals’ nuanced and diverse experiences, offering insights that may not be obtained through other methods.

A qualitative research interview provides a holistic understanding of the social and cultural context surrounding participants’ experiences. Researchers can explore the factors influencing participants’ perspectives, such as cultural norms, societal expectations, or personal histories. This contextual understanding enhances the interpretation and analysis of the data, providing a comprehensive view of the research topic.

Qualitative interviews are particularly effective when studying sensitive or complex topics. They allow participants to share their experiences and emotions in a safe and confidential environment, facilitating a deeper exploration of potentially challenging subjects. The method also enables researchers to capture these topics’ nuances, contradictions, and subtleties, contributing to a more comprehensive understanding.

Qualitative research interviews can empower participants by giving them a voice and acknowledging the value of their experiences. By actively listening and engaging in meaningful dialogue, researchers validate participants’ contributions and foster a sense of ownership over their narratives. This empowerment can positively affect participants’ self-esteem, self-reflection, and personal growth.

Overall, qualitative interviews provide researchers with a powerful tool to explore complex phenomena, gain in-depth insights, and understand the subjective experiences of individuals. By capitalizing on these advantages, researchers can generate valuable and nuanced data that advances knowledge in their fields.

Learn more by reading our guide: Types of Interviews .

Disadvantages of a Qualitative Interview

While qualitative interviews have many advantages, it is essential to acknowledge their potential limitations. Here are some of the disadvantages associated with qualitative interviews:

Qualitative interviews involve interaction between the researcher and participants, which introduces the possibility of subjective interpretations and biases. Researchers may unintentionally influence participants’ responses through questioning techniques, non-verbal cues, or personal beliefs. Researchers must be aware of their biases and take steps to minimize their impact on data collection and analysis.

The findings from qualitative interviews are typically based on small sample sizes and specific contexts, making it difficult to generalize the results to a larger population. While qualitative research aims to provide an in-depth understanding, it lacks the statistical representativeness that quantitative research methods offer. Therefore, caution must be exercised when applying qualitative interview findings to broader populations or contexts.

Qualitative interviews can be time-consuming and require substantial resources. Conducting in-depth interviews, transcribing recordings, and analyzing the qualitative data are labor-intensive tasks that require significant time and effort. Researchers must be prepared for this detailed, time-consuming work, especially when working with large or diverse participant samples.

Ensuring the validity and reliability of qualitative research interviews can be challenging. Validity refers to the extent to which the interview data accurately represent participants’ experiences and perspectives, while reliability relates to the consistency and replicability of the findings. Factors such as interviewer bias, participant recall, and social desirability may compromise the validity and reliability of the data. Researchers must employ rigorous methodologies, triangulate data from multiple sources, and establish trustworthiness to enhance the credibility of their findings.

Qualitative interviews capture participants’ experiences and perspectives at a specific time and within a particular context. However, these experiences may evolve or change over time or across contexts. Researchers must be mindful of this limitation, recognizing that their findings may only partially represent the dynamic nature of human behavior and perceptions.

Despite these disadvantages, qualitative interviews remain a valuable research method that offers unique insights into individuals’ experiences and perspectives.


Qualitative interviews are valuable for gaining in-depth insights into individuals’ experiences, perspectives, and behaviors. They offer a unique opportunity to explore complex phenomena, uncover rich narratives, and understand the underlying meanings and interpretations that individuals assign to their experiences.

To summarize, qualitative research can either be a valuable tool for discovering problems and enriching research programs with subjective data, or it can leave researchers with amorphous and contradictory data. The key is to use the approach in combination with other qualitative and quantitative research techniques to enhance the depth of the data gathered.


Authors: Harpal Singh & Shabeen Shareef



Research Methods Guide: Interview Research


Goals of Interview Research

  • Interviews help you explain, better understand, and explore research subjects' opinions, behaviors, experiences, phenomena and preferences.
  • Interview questions are usually open-ended, so that in-depth information can be collected.

Mode of Data Collection

Interviews can be conducted in several modes, including:

  • Face-to-face
  • Online (e.g. Skype, Google Hangouts)

FAQ: Conducting Interview Research

What are the important steps involved in interviews?

  • Think about who you will interview
  • Think about what kind of information you want to obtain from interviews
  • Think about why you want to pursue in-depth information around your research topic
  • Introduce yourself and explain the aim of the interview
  • Devise your questions so interviewees can help answer your research question
  • Have a sequence to your questions / topics by grouping them in themes
  • Make sure you can easily move back and forth between questions / topics
  • Make sure your questions are clear and easy to understand
  • Do not ask leading questions
  • Do you want to bring a second interviewer with you?
  • Do you want to bring a notetaker?
  • Do you want to record interviews? If so, do you have time to transcribe interview recordings?
  • Where will you interview people? Where is the setting with the least distraction?
  • How long will each interview take?
  • Do you need to address terms of confidentiality?

Do I have to choose either a survey or interviewing method?

No. In fact, many researchers use a mixed-methods approach: interviews can be a useful follow-up with selected survey respondents, e.g. to investigate their responses further.

Is training an interviewer important?

Yes. Since the interviewer can control the quality of the result, training the interviewer is crucial. If more than one interviewer is involved in your study, it is important that every interviewer understands the interviewing procedure and rehearses the process before beginning the formal study.



Chapter 13: Interviews

Danielle Berkovic

Learning outcomes

Upon completion of this chapter, you should be able to:

  • Understand when to use interviews in qualitative research.
  • Develop interview questions for an interview guide.
  • Understand how to conduct an interview.

What are interviews?

Interviewing is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one between the researcher and the participant. They are most appropriate when seeking to understand a participant's subjective view of an experience, and are also considered suitable for the exploration of sensitive topics.

What are the different types of interviews?

There are four main types of interviews:

  • Key stakeholder: A key stakeholder interview aims to explore one issue in detail with a person of interest or importance concerning the research topic. 3 Key stakeholder interviews seek the views of experts on some cultural, political or health aspects of the community, beyond their personal beliefs or actions. An example of a key stakeholder is the Chief Health Officer of Victoria (Australia’s second-most populous state) who oversaw the world’s longest lockdowns in response to the COVID-19 pandemic.
  • Dyad: A dyad interview aims to explore one issue in detail with a dyad (two people). This form of interviewing is used when one member of the dyad may need support or may not be wholly able to articulate themselves (e.g. people with cognitive impairment, or children). Each member's independence is acknowledged, and the interview is analysed as a unit. 4
  • Narrative: A narrative interview helps individuals tell their stories, and prioritises their own perspectives and experiences using the language that they prefer. 5 This type of interview has been widely used in social research but is gaining prominence in health research to better understand person-centred care, for example, negotiating exercise and food abstinence whilst living with Type 2 diabetes. 6,7
  • Life history: A life history interview allows the researcher to explore a person’s individual and subjective experiences within a history of the time framework. 8 Life history interviews challenge the researcher to understand how people’s current attitudes, behaviours and choices are influenced by previous experiences or trauma. Life history interviews have been conducted with Holocaust survivors 9 and youth who have been forcibly recruited to war. 10

Table 13.4 provides a summary of four studies, each adopting one of these types of interviews.

Interviewing techniques

There are two main interview techniques:

  • Semi-structured: Semi-structured interviewing aims to explore a few issues in moderate detail, to expand the researcher’s knowledge at some level. 11 Semi-structured interviews give the researcher the advantage of remaining reasonably objective while enabling participants to share their perspectives and opinions. The researcher should create an interview guide with targeted open questions to direct the interview. As examples, semi-structured interviews have been used to extend knowledge of why women might gain excess weight during pregnancy, 12 and to update guidelines for statin uptake. 13
  • In-depth: In-depth interviewing aims to explore a person’s subjective experiences and feelings about a particular topic. 14 In-depth interviews are often used to explore emotive (e.g. end-of-life care) 15 and complex (e.g. adolescent pregnancy) topics. 16 The researcher should create an interview guide with selected open questions to ask of the participant, but the participant should guide the direction of the interview more than in a semi-structured setting. In-depth interviews value participants’ lived experiences and are frequently used in phenomenology studies (as described in Chapter 6) .

When to use the different types of interviews

The type of interview a researcher uses should be determined by the study design, the research aims and objectives, and participant demographics. For example, if conducting a descriptive study, semi-structured interviews may be the best method of data collection. As explained in Chapter 5 , descriptive studies seek to describe phenomena, rather than to explain or interpret the data. A semi-structured interview, which seeks to expand upon some level of existing knowledge, will likely best facilitate this.

Similarly, if conducting a phenomenological study, in-depth interviews may be the best method of data collection. As described in Chapter 6 , the key concept of phenomenology is the individual. The emphasis is on the lived experience of that individual and the person’s sense-making of those experiences. Therefore, an in-depth interview is likely best placed to elicit that rich data.

While some interview types are better suited to certain study designs, there are no restrictions on the type of interview that may be used. For example, semi-structured interviews provide an excellent accompaniment to trial participation (see Chapter 11 about mixed methods), and key stakeholder interviews, as part of an action research study, can be used to define priorities, barriers and enablers to implementation.

How do I write my interview questions?

An interview aims to explore the experiences, understandings, opinions and motivations of research participants. The general rule is that the interviewee should speak for 80 per cent of the interview, and the interviewer should only be asking questions and clarifying responses for about 20 per cent of the interview. This percentage may differ depending on the interview type; for example, a semi-structured interview involves the researcher asking more questions than in an in-depth interview. Still, to facilitate free-flowing responses, it is important to use open-ended language that encourages participants to be expansive in their responses. Examples of open-ended terms include questions that start with 'who', 'how' and 'where'.
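The 80/20 balance can also be checked after the fact from a timestamped transcript. As a minimal sketch (our own illustration, not from the chapter), assuming segments are recorded as (speaker, start_seconds, end_seconds):

```python
# Illustrative sketch: estimate each speaker's share of talking time from
# timestamped transcript segments of (speaker, start_seconds, end_seconds).
segments = [
    ("interviewer", 0, 30),
    ("participant", 30, 180),
    ("interviewer", 180, 200),
    ("participant", 200, 300),
]

def talk_share(segments, speaker):
    """Fraction of total talking time attributed to one speaker."""
    total = sum(end - start for _, start, end in segments)
    spoken = sum(end - start for who, start, end in segments if who == speaker)
    return spoken / total

share = talk_share(segments, "participant")  # 250/300, i.e. roughly 83%
```

A share well below 0.8 for the participant may suggest the interviewer is talking too much or asking closed-ended questions.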

The researcher should avoid closed-ended questions that can be answered with yes or no, and limit conversation. For example, asking a participant ‘Did you have this experience?’ can elicit a simple ‘yes’, whereas asking them to ‘Describe your experience’, will likely encourage a narrative response. Table 13.1 provides examples of terminology to include and avoid in developing interview questions.

Table 13.1. Interview question formats to use and avoid

Use:

  • Tell me about…
  • What happened when…
  • Why is this important?
  • How did you feel when…
  • How do you…
  • What are the…
  • What does…

Avoid:

  • Do you think that…
  • Will you do this…
  • Did you believe that…
  • Were there issues from your perspective…
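As a rough illustration (our own sketch, not part of the chapter), the use/avoid patterns in Table 13.1 can be turned into a simple screening heuristic for a draft interview guide; the opener lists below are assumptions, not an exhaustive rule:

```python
# Rough heuristic (illustrative only): flag draft questions that open with
# closed-ended, yes/no phrasing, per the "avoid" patterns in Table 13.1.
CLOSED_OPENERS = ("do ", "did ", "will ", "were ", "are ", "is ",
                  "have you ", "can you ")

def looks_closed_ended(question: str) -> bool:
    """Return True if the question starts like a yes/no (closed) question."""
    return question.strip().lower().startswith(CLOSED_OPENERS)

# Screen a draft guide before pilot-testing.
draft = [
    "Did you have this experience?",
    "Tell me about your experience.",
    "How did you feel when you got the results?",
]
flagged = [q for q in draft if looks_closed_ended(q)]  # -> first question only
```

A flagged question is not automatically wrong, but it is a candidate for rewording into an open-ended form before the pilot interview.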

How long should my interview be?

There is no rule about how long an interview should take. Different types of interviews will likely run for different periods of time, but this also depends on the research question/s and the type of participant. For example, given that a semi-structured interview is seeking to expand on some previous knowledge, the interview may need no longer than 30 minutes, or up to one hour. An in-depth interview seeks to explore a topic in a greater level of detail and therefore, at a minimum, would be expected to last an hour. A dyad interview may be as short as 15 minutes (e.g. if the dyad is a person with dementia and a family member or caregiver) or longer, depending on the pairing.

Designing your interview guide

To decide what questions to ask in an interview guide, the researcher may consult the literature, speak to experts (including people with lived experience) about the research, and draw on their current knowledge. The topics and questions should be mapped to the research question/s, and the interview guide should be developed well in advance of commencing data collection. This allows time to pilot-test the interview guide. The pilot interview provides an opportunity to check the language and clarity of questions, the order and flow of the guide, and whether the instructions are clear to participants both before and after the interview. It can be beneficial to pilot-test the interview guide with someone who is not familiar with the research topic, to make sure that the language used is easily understood (and will be by participants, too).

The study design should determine the number of questions asked, and the intended duration of the interview should guide the length of the interview guide. The participant type may also shape the guide; for example, clinicians tend to be time-poor, and therefore shorter, focused interviews are optimal. An interview guide is also likely to be shorter for a descriptive study than for a phenomenological or ethnographic study, given the level of detail required. Chapter 5 outlined a descriptive study in which participants who had undergone percutaneous coronary intervention were interviewed. The interview guide consisted of four main questions and subsequent probing questions, linked to the research questions (see Table 13.2). 17

Table 13.2. Interview guide for a descriptive study

Research question: How does the patient feel, physically and psychologically, after their procedure?

  • Open questions: From your perspective, what would be considered a successful outcome of the procedure? How did you feel after the procedure? How did you feel one week after the procedure, and how does that compare with how you feel now?
  • Probing questions and topics: Did the procedure meet your expectations? How do you define whether the procedure was successful?

Research question: How does the patient function after their procedure?

  • Open questions: After your procedure, tell me about your ability to do your daily activities.
  • Probing questions and topics: Prompt for activities including gardening, housework, personal care, and work-related and family-related tasks. Did you attend cardiac rehabilitation? Can you tell us about your experience of cardiac rehabilitation? What effect has medication had on your recovery?

Research question: What are the long-term effects of the procedure?

  • Open questions: What, if any, lifestyle changes have you made since your procedure?
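A guide like the one in Table 13.2 can also be kept as structured data, so each open and probing question stays mapped to its research question during piloting and revision. This is our own illustrative sketch, not part of the chapter:

```python
# Illustrative sketch: an interview guide that keeps open and probing
# questions mapped to their research questions (cf. Table 13.2).
interview_guide = [
    {
        "research_question": "How does the patient feel after their procedure?",
        "open": [
            "From your perspective, what would be considered a successful "
            "outcome of the procedure?",
            "How did you feel after the procedure?",
        ],
        "probes": ["Did the procedure meet your expectations?"],
    },
    {
        "research_question": "What are the long-term effects of the procedure?",
        "open": [
            "What, if any, lifestyle changes have you made since your procedure?"
        ],
        "probes": [],
    },
]

def questions_for(guide, research_question):
    """Return (open questions, probes) mapped to one research question."""
    for item in guide:
        if item["research_question"] == research_question:
            return item["open"], item["probes"]
    return [], []
```

Storing the mapping explicitly makes it easy to spot research questions with no probes, and to incorporate feedback from the pilot without losing the link back to the research question/s.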

Table 13.3 is an example of a larger and more detailed interview guide, designed for the qualitative component of a mixed-methods study aiming to examine the work and financial effects of living with arthritis as a younger person. The questions are mapped to the World Health Organization’s International Classification of Functioning, Disability, and Health, which measures health and disability at individual and population levels. 18

Table 13.3. Detailed interview guide

Research question: How do young people experience their arthritis diagnosis?

  • Open questions: Tell me about your experience of being diagnosed with arthritis. How did being diagnosed with arthritis make you feel? Tell me about your experience of arthritis flare-ups: what do they feel like? What impacts arthritis flare-ups or feeling like your arthritis is worse? What circumstances lead to these feelings? Based on your experience, what do you think causes symptoms of arthritis to become worse?
  • Probing questions: When were you diagnosed with arthritis? What type of arthritis were you diagnosed with? Does anyone else in your family have arthritis? What relation are they to you?

Research question: What are the work impacts of arthritis on younger people?

  • Open questions: What is your field of work, and how long have you been in this role? How frequently do you work (full-time/part-time/casual)?
  • Probing questions: How has arthritis affected your work-related demands or career? How so? Has arthritis led you to reconsider your career? How so? Has arthritis affected your usual working hours each week? How so? How have changes to work or career because of your arthritis impacted other areas of life, e.g. mental health or family roles?

Research question: What are the financial impacts of living with arthritis as a younger person?

  • Open questions: Has your arthritis led to any financial concerns?
  • Probing questions: Financial concerns pertaining to:
    • Direct costs: rheumatologist; prescribed and non-prescribed medications (as well as supplements); allied health costs (rheumatology, physiotherapy, chiropractic, osteopathy, myotherapy); Pilates and gym/personal trainer fees; complementary therapies.
    • Indirect costs: workplace absenteeism, productivity, loss of wages, informal care, and the cost of different types of insurance, e.g. health insurance (joint replacements).

It is important to create an interview guide, for the following reasons:

  • The researcher should be familiar with their research questions.
  • Using an interview guide will enable the incorporation of feedback from the piloting process.
  • It is difficult to predict how participants will respond to interview questions. They may answer in a way that is anticipated or they may provide unanticipated insights that warrant follow-up. An interview guide (a physical or digital copy) enables the researcher to note these answers and follow-up with appropriate inquiry.
  • Participants will likely have provided heterogeneous answers to certain questions. The interview guide enables the researcher to note similarities and differences across various interviews, which may be important in data analysis.
  • Even experienced qualitative researchers get nervous before an interview! The interview guide provides a safety net if the researcher forgets their questions or needs to anticipate the next question.

Setting up the interview

In the past, most interviews were conducted in person or by telephone. Emerging technologies promote easier access to research participation (e.g. by people living in rural or remote communities, or for people with mobility limitations). Even in metropolitan settings, many interviews are now conducted electronically (e.g. using videoconferencing platforms). Regardless of your interview setting, it is essential that the interview environment is comfortable for the participant. This process can begin as soon as potential participants express interest in your research. Following are some tips from the literature and our own experiences of leading interviews:

  • Answer questions and set clear expectations . Participating in research is not an everyday task. People do not necessarily know what to expect during a research interview, and this can be daunting. Give people as much information as possible, answer their questions about the research and set clear expectations about what the interview will entail and how long it is expected to last. Let them know that the interview will be recorded for transcription and analysis purposes. Consider sending the interview questions a few days before the interview. This gives people time and space to reflect on their experiences, consider their responses to questions and to provide informed consent for their participation.
  • Consider your setting . If conducting the interview in person, consider the location and room in which the interview will be held. For example, if in a participant's home, be mindful of their private space. Ask if you should remove your shoes before entering their home. If they offer refreshments (which in our experience many participants do), accept them with gratitude if possible. These considerations apply beyond the participant's home; if using a room in an office setting, consider privacy and confidentiality, accessibility and potential for disruption. Consider the temperature as well as the furniture in the room, who may be able to overhear conversations and who may walk past. Similarly, if interviewing by phone or online, take time to assess the space, and if in a house or office that is not quiet or private, use headphones as needed.
  • Build rapport. The research topic may be important to participants from a professional perspective, or they may have deep emotional connections to the topic of interest. Regardless of the nature of the interview, it is important to remember that participants are being asked to open up to an interviewer who is likely to be a stranger. Spend some time with participants before the interview, to make sure that they are comfortable. Engage in some general conversation, and ask if they have any questions before you start. Remember that it is not a normal part of someone’s day to participate in research. Make it an enjoyable and/or meaningful experience for them, and it will enhance the data that you collect.
  • Let participants guide you. Oftentimes, the ways in which researchers and participants describe the same phenomena are different. In the interview, reflect the participant’s language. Make sure they feel heard and that they are willing and comfortable to speak openly about their experiences. For example, our research involves talking to older adults about their experience of falls. We noticed early in this research that participants did not use the word ‘fall’ but would rather use terms such as ‘trip’, ‘went over’ and ‘stumbled’. As interviewers we adopted the participant’s language into our questions.
  • Listen consistently and express interest. An interview is more complex than a simple question-and-answer format. The best interview data comes from participants feeling comfortable and confident to share their stories. By the time you are completing the 20th interview, it can be difficult to maintain the same level of concentration as with the first interview. Try to stay engaged: nod along with your participants, maintain eye contact, murmur in agreement and sympathise where warranted.
  • The interviewer is both the data collector and the data collection instrument. The data received is only as good as the questions asked. In qualitative research, the researcher influences how participants answer questions. It is important to remain reflexive and aware of how your language, body language and attitude might influence the interview. Being rested and prepared will enhance the quality of the questions asked and hence the data collected.
  • Avoid excessive use of ‘why’. It can be challenging for participants to recall why they felt a certain way or acted in a particular manner. Try to avoid asking ‘why’ questions too often, and instead adopt some of the open language described earlier in the chapter.

After your interview

When you have completed your interview, thank the participant and let them know they can contact you if they have any questions or follow-up information they would like to provide. If the interview has covered sensitive topics or the participant has become distressed throughout the interview, make sure that appropriate referrals and follow-up are provided (see section 6).

Download the recording from your device and make sure it is saved in a secure location that can only be accessed by people on the approved research team (see Chapters 35 and 36).

It is important to know what to do immediately after each interview is completed. Interviews should be transcribed – that is, reproduced verbatim for data analysis. Transcribing data is an important step in the process of analysis, but it is very time-consuming; transcribing a 60-minute interview can take up to 8 hours. Data analysis is discussed in Section 4.
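Using the rule of thumb above (up to roughly eight hours of transcription per hour of audio), it is worth budgeting transcription time before finalising a sample size. A minimal sketch, assuming manual verbatim transcription:

```python
# Minimal sketch: budget manual transcription time, assuming up to
# ~8 hours of transcription per hour of recorded audio (rule of thumb above).
HOURS_PER_AUDIO_HOUR = 8

def transcription_hours(n_interviews: int, avg_minutes: float) -> float:
    """Worst-case transcription hours for a set of interviews."""
    return n_interviews * (avg_minutes / 60) * HOURS_PER_AUDIO_HOUR

# Twenty one-hour interviews imply up to 160 hours of transcription.
budget = transcription_hours(20, 60)
```

Estimates like this help decide early whether to transcribe in-house or to allow time and funds for a transcription service.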

Table 13.4. Examples of the four types of interviews

Cuthbertson, 2019 (key stakeholder interviews)

  • Study design: convergent mixed-methods study
  • Interview guide: Appendix A
  • Participants: 30 key stakeholders (emergency management or disaster healthcare practitioners, academics specialising in disaster management in the Oceania region, and policy managers)
  • Aim: 'To investigate threats to the health and well-being of societies associated with disaster impact in Oceania.' [abstract]
  • Country: Australia, Fiji, Indonesia, Aotearoa New Zealand, Timor Leste and Tonga
  • Length of interview: 45–60 minutes
  • Sample of interview questions [Appendix A]:
    1. What do you believe are the top five disaster risks or threats in the Oceania region today?
    2. What disaster risks do you believe are emerging in the Oceania region over the next decade?
    3. Why do you think these are risks?
    4. What are the drivers of these risks?
    5. Do you have any suggestions on how we can improve disaster risk assessment?
    6. Are the current disaster risk plans and practices suited to the future disaster risks? If not, why? If not, what do you think needs to be done to improve them?
    7. What are the key areas of disaster practice that can enhance future community resilience to disaster risk?
    8. What are the barriers or inhibitors to facilitating this practice?
    9. What are the solutions or facilitators to enhancing community resilience?
  • Analysis: thematic analysis guided by the Hazard and Peril Glossary for describing and categorising disasters, as applied by the Centre for Research on the Epidemiology of Disasters Emergency Events Database
  • Main themes [Results, Box 1]:
    1. Climate change is observed as a contemporary and emerging disaster risk.
    2. Risk is contextual to the different countries, communities and individuals in Oceania.
    3. Human development trajectories and their impact, along with perceptions of a changing world, are viewed as drivers of current and emerging risks.
    4. Current disaster risk plans and practices are not suited to future disaster risks.
    5. Increased education of risk and risk assessment at a local level to empower community risk ownership.

Bannon, 2021 (dyad interviews)

  • Study design: qualitative dyadic study
  • Interview guide: eAppendix Supplement
  • Participants: 23 dyads
  • Aim: 'To explore the lived experiences of couples managing young-onset dementia using an integrated dyadic coping model.' [abstract]
  • Country: United States
  • Length of interview: 60 minutes
  • Sample of interview questions [eAppendix Supplement]:
    1. We like to start by learning more about what you each first noticed that prompted the evaluations you went through to get to the diagnosis. Can you each tell me about the earliest symptoms you noticed?
    2. What are the most noticeable or troubling symptoms that you have experienced since the time of diagnosis? How have your changes in functioning impacted you? Emotionally, how do you feel about your symptoms and the changes in functioning you are experiencing?
    3. Are you open with your friends and family about the diagnosis? Have you experienced any stigma related to your diagnosis?
    4. What is your understanding of the diagnosis? What is your understanding about how this condition will affect you both in the future? How are you getting information about this diagnosis?
  • Analysis: thematic analysis guided by the Dyadic Coping Theoretical Framework
  • Main themes [abstract]:
    1. Stress communication
    2. Positive individual dyadic coping
    3. Positive conjoint dyadic coping
    4. Negative individual dyadic coping
    5. Negative conjoint dyadic coping

McGranahan, 2020 (narrative interviews)

  • Study design: narrative interview study
  • Interview guide: not provided, but the text states that 'qualitative semi-structured narrative interviews' were conducted. [methods]
  • Participants: 28
  • Aim: 'To explore the experiences and views of people with psychotic experiences who have not received any treatment or other support from mental health services for the past 5 years.' [abstract]
  • Country: England
  • Length of interview: 40–120 minutes
  • Sample of interview questions: not provided
  • Analysis: inductive thematic analysis as outlined by Braun and Clarke
  • Main themes [abstract]:
    1. Perceiving psychosis as positive
    2. Making sense of psychotic experiences
    3. Finding sources of strength
    4. Negative past experiences of mental health services
    5. Positive past experiences with individual clinicians

Gutierrez-Garcia, 2021 (life history interviews)

  • Study design: life history and lifeline techniques
  • Interview guide: not provided, but the text states that 'an open and semi-structured question guide was designed for use.' [methods]
  • Participants: 7
  • Aim: 'To analyse the use of life histories and lifelines in the study of female genital mutilation in the context of cross-cultural research in participants with different languages.' [abstract]
  • Country: Spain
  • Length of interview: 3 sessions. Session 1: life history interview. Session 2: lifeline activity, in which participants used drawings to complement or enhance their interview. Session 3: the researchers and participants worked together to finalise the lifeline. The life history interviews ran for 40–60 minutes; the timing of sessions 2 and 3 is not provided.
  • Sample of interview questions: not provided
  • Analysis: phenomenological method proposed by Giorgi (sense of the whole):
    1. Reading the entire description to obtain a general sense of the discourse.
    2. The researcher goes back to the beginning and reads the text again, with the aim of distinguishing the meaning units by separating the perspective of the phenomenon of interest.
    3. The researcher expresses the contents of the units of meaning more clearly by creating categories.
    4. The researcher synthesises the units and categories of meaning into a consistent statement that takes into account the participant's experience and language.
  • Main themes:
    1. Important moments and their relationship with female genital mutilation
    2. The ritual knife: how sharp or blunt it is at different stages, where and how women are subsequently held as a result
    3. Changing relationships with family: how being subject to female genital mutilation changed relationships with mothers
    4. Female genital mutilation increases the risk of future childbirth complications, which change relationships with family and healthcare systems
    5. Managing experiences with early exposure to physical and sexual violence across the lifespan

Interviews are the most common data collection technique in qualitative research. There are four main types of interviews; the one you choose will depend on your research question, aims and objectives. It is important to formulate open-ended interview questions that are understandable and easy for participants to answer. Key considerations in setting up the interview will enhance the quality of the data obtained and the experience of the interview for the participant and the researcher.

  • Gill P, Stewart K, Treasure E, Chadwick B. Methods of data collection in qualitative research: interviews and focus groups. Br Dent J. 2008;204(6):291-295. doi:10.1038/bdj.2008.192
  • DeJonckheere M, Vaughn LM. Semistructured interviewing in primary care research: a balance of relationship and rigour. Fam Med Community Health. 2019;7(2):e000057. doi:10.1136/fmch-2018-000057
  • Nyanchoka L, Tudur-Smith C, Porcher R, Hren D. Key stakeholders’ perspectives and experiences with defining, identifying and displaying gaps in health research: a qualitative study. BMJ Open. 2020;10(11):e039932. doi:10.1136/bmjopen-2020-039932
  • Morgan DL, Ataie J, Carder P, Hoffman K. Introducing dyadic interviews as a method for collecting qualitative data. Qual Health Res. 2013;23(9):1276-84. doi:10.1177/1049732313501889
  • Picchi S, Bonapitacola C, Borghi E, et al. The narrative interview in therapeutic education. The diabetic patients’ point of view. Acta Biomed. 2018;89(6-S):43-50. doi:10.23750/abm.v89i6-S.7488
  • Stuij M, Elling A, Abma T. Negotiating exercise as medicine: narratives from people with type 2 diabetes. Health (London). 2021;25(1):86-102. doi:10.1177/1363459319851545
  • Buchmann M, Wermeling M, Lucius-Hoene G, Himmel W. Experiences of food abstinence in patients with type 2 diabetes: a qualitative study. BMJ Open. 2016;6(1):e008907. doi:10.1136/bmjopen-2015-008907
  • Jessee E. The Life History Interview. In: Handbook of Research Methods in Health Social Sciences. 2018:1-17 (Chapter 80-1).
  • Sheftel A, Zembrzycki S. Only Human: A Reflection on the Ethical and Methodological Challenges of Working with “Difficult” Stories. The Oral History Review. 2019;37(2):191-214. doi:10.1093/ohr/ohq050
  • Harnisch H, Montgomery E. “What kept me going”: A qualitative study of avoidant responses to war-related adversity and perpetration of violence by former forcibly recruited children and youth in the Acholi region of northern Uganda. Soc Sci Med. 2017;188:100-108. doi:10.1016/j.socscimed.2017.07.007
  • Ruslin, Mashuri S, Rasak MSA, Alhabsyi M, Alhabsyi F, Syam H. Semi-structured Interview: A Methodological Reflection on the Development of a Qualitative Research Instrument in Educational Studies. IOSR-JRME. 2022;12(1):22-29. doi:10.9790/7388-1201052229
  • Chang T, Llanes M, Gold KJ, Fetters MD. Perspectives about and approaches to weight gain in pregnancy: a qualitative study of physicians and nurse midwives. BMC Pregnancy & Childbirth. 2013;13(47). doi:10.1186/1471-2393-13-47
  • DeJonckheere M, Robinson CH, Evans L, et al. Designing for Clinical Change: Creating an Intervention to Implement New Statin Guidelines in a Primary Care Clinic. JMIR Hum Factors. 2018;5(2):e19. doi:10.2196/humanfactors.9030
  • Knott E, Rao AH, Summers K, Teeger C. Interviews in the social sciences. Nature Reviews Methods Primers. 2022;2(1). doi:10.1038/s43586-022-00150-6
  • Bergenholtz H, Missel M, Timm H. Talking about death and dying in a hospital setting – a qualitative study of the wishes for end-of-life conversations from the perspective of patients and spouses. BMC Palliat Care. 2020;19(1):168. doi:10.1186/s12904-020-00675-1
  • Olorunsaiye CZ, Degge HM, Ubanyi TO, Achema TA, Yaya S. “It’s like being involved in a car crash”: teen pregnancy narratives of adolescents and young adults in Jos, Nigeria. Int Health. 2022;14(6):562-571. doi:10.1093/inthealth/ihab069
  • Ayton DR, Barker AL, Peeters G, et al. Exploring patient-reported outcomes following percutaneous coronary intervention: A qualitative study. Health Expect. 2018;21(2):457-465. doi:10.1111/hex.12636
  • World Health Organization. International Classification of Functioning, Disability and Health (ICF). WHO. https://www.who.int/standards/classifications/international-classification-of-functioning-disability-and-health#:~:text=ICF%20is%20the%20WHO%20framework,and%20measure%20health%20and%20disability.
  • Cuthbertson J, Rodriguez-Llanes JM, Robertson A, Archer F. Current and Emerging Disaster Risks Perceptions in Oceania: Key Stakeholders Recommendations for Disaster Management and Resilience Building. Int J Environ Res Public Health. 2019;16(3). doi:10.3390/ijerph16030460
  • Bannon SM, Grunberg VA, Reichman M, et al. Thematic Analysis of Dyadic Coping in Couples With Young-Onset Dementia. JAMA Netw Open. 2021;4(4):e216111. doi:10.1001/jamanetworkopen.2021.6111
  • McGranahan R, Jakaite Z, Edwards A, Rennick-Egglestone S, Slade M, Priebe S. Living with Psychosis without Mental Health Services: A Narrative Interview Study. BMJ Open .  2021;11(7):e045661. doi:10.1136/bmjopen-2020-045661
  • Gutiérrez-García AI, Solano-Ruíz C, Siles-González J, Perpiñá-Galvañ J. Life Histories and Lifelines: A Methodological Symbiosis for the Study of Female Genital Mutilation. Int J Qual Methods . 2021;20doi:10.1177/16094069211040969

Qualitative Research – a practical guide for health and social care researchers and practitioners Copyright © 2023 by Danielle Berkovic is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.

J Basic Clin Pharm. 2014;5(4), September–November 2014

Qualitative research method-interviewing and observation

Shazia Jamshed

Department of Pharmacy Practice, Kulliyyah of Pharmacy, International Islamic University Malaysia, Kuantan Campus, Pahang, Malaysia

Buckley and Chiang define research methodology as “a strategy or architectural design by which the researcher maps out an approach to problem-finding or problem-solving.”[1] According to Crotty, research methodology is a comprehensive strategy “that silhouettes our choice and use of specific methods relating them to the anticipated outcomes,”[2] but the choice of research methodology is based upon the type and features of the research problem.[3] According to Johnson et al., mixed method research is “a class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, theories and/or language into a single study.”[4] In order to capture diverse opinions and views, qualitative findings need to be supplemented with quantitative results.[5] These research methodologies are therefore considered complementary to, rather than incompatible with, each other.[6]

Qualitative research methodology is considered suitable when the researcher or investigator is either exploring a new field of study or intends to ascertain and theorize about prominent issues.[6,7] Many qualitative methods have been developed to provide an in-depth and extensive understanding of issues through their textual interpretation; the most common types are interviewing and observation.[7]

Interviewing

This is the most common format of data collection in qualitative research. According to Oakley, the qualitative interview is a kind of framework in which practices and standards are not only recorded but also achieved, challenged and reinforced.[8] Since no research interview entirely lacks structure,[9] most qualitative research interviews are either semi-structured, lightly structured or in-depth.[9] Unstructured interviews are generally suggested for long-term fieldwork; they allow respondents to express themselves in their own way and at their own pace, with minimal hold on their responses.[10]

Pioneers of ethnography developed the use of unstructured interviews with local key informants, that is, collecting data through observation and field notes while involving themselves with the study participants. To be precise, the unstructured interview resembles a conversation more than an interview, and is often described as a “controlled conversation” skewed towards the interests of the interviewer.[11] Non-directive interviews, a form of unstructured interview, aim to gather in-depth information and usually have no pre-planned set of questions.[11] Another type of unstructured interview is the focused interview, in which the interviewer is well aware of the respondent and, when the conversation deviates from the main issue, refocuses the respondent towards the key subject.[11] A further type is the informal, conversational interview, based on an unplanned set of questions generated instantaneously during the interview.[11]

In contrast, semi-structured interviews are in-depth interviews in which respondents answer preset open-ended questions; they are widely employed by different healthcare professionals in their research. Semi-structured, in-depth interviews are utilized extensively as an interviewing format, with an individual or sometimes with a group.[6] These interviews are conducted once, with an individual or a group, and generally last from 30 minutes to more than an hour.[12] Semi-structured interviews are based on a semi-structured interview guide: a schematic presentation of the questions or topics to be explored by the interviewer.[12] To make optimum use of interview time, interview guides help cover many respondents more systematically and comprehensively, and keep the interview focused on the desired line of action.[12] The questions in the interview guide comprise a core question and many associated questions related to that central question, which improve further through pilot testing of the guide.[7] To capture interview data more effectively, recording the interview is considered the appropriate choice, although it is sometimes a matter of controversy between researcher and respondent. Handwritten notes taken during the interview are relatively unreliable, and the researcher might miss key points. Recording the interview makes it easier for the researcher to focus on the interview content and the verbal prompts, and enables the transcriptionist to generate a verbatim transcript of the interview.
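The core-question-plus-probes structure of an interview guide can be sketched as plain data. A minimal, hypothetical Python sketch (topic names and question wording are illustrative, not from the source):

```python
# A semi-structured interview guide modeled as plain data: each topic has a
# core open-ended question plus associated follow-up probes. All wording here
# is illustrative.
interview_guide = {
    "Medication routines": {
        "core": "Can you walk me through how you manage your medications day to day?",
        "probes": [
            "What makes that routine easier or harder?",
            "Could you tell me a little more about that?",
        ],
    },
    "Interactions with pharmacists": {
        "core": "How would you describe your interactions with your pharmacist?",
        "probes": ["Can you give me a recent example?"],
    },
}

def print_guide(guide):
    """Print the guide in interview order: core question first, then probes."""
    for topic, block in guide.items():
        print(f"== {topic} ==")
        print(f"  Core: {block['core']}")
        for probe in block["probes"]:
            print(f"  Probe: {probe}")

print_guide(interview_guide)
```

Keeping the guide as data makes it easy to pilot-test and revise question wording without restructuring the rest of the document.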

Similarly, in focus groups, invited groups of people are interviewed in a discussion setting in the presence of a session moderator, and these discussions generally last about 90 minutes.[7] Like every research technique, group discussions have their own merits and demerits: participants can express their opinions openly, but only a limited number of issues can be covered in such settings, which may generate fewer initiatives and suggestions about the research topic.

Observation

Observation is a qualitative research method that includes not only participant observation but also ethnography and fieldwork. Observational research designs may involve multiple study sites, and observational data can be integrated as auxiliary or confirmatory research.[11]

Research can be visualized and perceived as a painstaking, methodical effort to examine, investigate and restructure realities, theories and applications. Research methods reflect the approach taken to the research problem: depending upon the need, the method may be qualitative, quantitative, or an amalgam of both. By adopting a qualitative methodology, a prospective researcher can fine-tune pre-conceived notions and extend the thought process, analyzing and estimating the issues from an in-depth perspective. This can be carried out through one-to-one interviews or issue-directed discussions. Observational methods are, at times, supplemental means of corroborating research findings.


Chapter 10: Qualitative Data Collection & Analysis Methods

10.3 Conducting Qualitative Interviews

Qualitative interviews might feel more like a conversation than an interview to respondents; however, the researcher is usually guiding the conversation with the goal of gathering information from the respondent. A key difference between qualitative and quantitative interviewing is that qualitative interviews rely on open-ended questions: questions for which a researcher does not provide answer options. Open-ended questions demand more of participants than closed-ended questions because they require participants to come up with their own words, phrases, or sentences to respond.

In a qualitative interview, the researcher usually develops a guide in advance to which he or she then refers during the interview (or memorizes beforehand). An interview guide is a list of topics or questions that the interviewer hopes to cover during the course of an interview. It is called a guide because it is used to guide the interviewer, but it is not inflexible. Think of an interview guide like your agenda for the day or your to-do list: both probably contain all the items you hope to check off or accomplish, but you are not obliged to accomplish everything on the list, or to do so in the exact order you wrote it down. Perhaps emerging events will lead you to rearrange your schedule, or perhaps you simply will not get to everything on the list.

Interview guides should outline issues that a researcher feels are likely to be important, but because participants are asked to provide answers in their own words, and to raise points that they believe are important, each interview is likely to flow a little differently. While the opening question in an in-depth interview may be the same across all interviews, from that point on what the participant says will shape how the interview proceeds. This is what makes in-depth interviewing so exciting. It is also what makes in-depth interviewing rather challenging to conduct. It takes a skilled interviewer to be able to ask questions, actually listen to respondents, and pick up on cues about when to follow up, when to move on, and when to simply let the participant speak without guidance or interruption.

Interview guides can list topics or questions. The specific format of an interview guide might depend on your style, experience, and comfort level as an interviewer, or on your topic; either way, interview guides are the result of thoughtful and careful work on the part of a researcher. It is important to ensure that the topics and questions are organized thematically and in the order in which they are likely to proceed (keep in mind, however, that the flow of a qualitative interview is determined in part by what a respondent has to say).

Sometimes researchers may create two versions of the guide for a qualitative interview: one version contains a very brief outline of the interview (perhaps with just topic headings), and another version contains detailed questions underneath each topic heading. In this case, the researcher might use the detailed guide to prepare and practice in advance of actually conducting interviews, and then bring just the brief outline to the interview. Bringing an outline, as opposed to a very long list of detailed questions, to an interview encourages the researcher to actually listen to what a participant is telling her. An overly-detailed interview guide will be difficult to navigate through during an interview and could give respondents the incorrect impression that the interviewer is more interested in her questions than in the participant’s answers.
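The two-version idea above can be made mechanical: if the detailed guide is kept as structured data, the brief outline is simply its ordered topic headings. A minimal, hypothetical Python sketch (topic and question wording are illustrative):

```python
# A detailed interview guide: topic headings mapped to their full question
# lists. The brief version brought into the interview room is derived from it.
detailed_guide = {
    "Warm-up and background": [
        "Where are you from?",
        "What are you studying?",
    ],
    "Experience of the course": [
        "What has stood out to you so far?",
        "Could you tell me a little more about that?",
    ],
}

# The brief outline is just the ordered topic headings, so the two versions
# cannot drift apart as questions are revised.
brief_outline = list(detailed_guide.keys())
print(brief_outline)
```

Deriving the outline from the detailed guide, rather than maintaining two separate documents, keeps the versions in sync.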

Begin to construct your interview guide by brainstorming. There are no rules at the brainstorming stage—simply list all the topics and questions that come to mind when you think about your research question. Once you have developed a pretty good list, you can begin to pare it down by cutting questions and topics that seem redundant, and grouping like questions and topics together. If you have not done so yet, you may also want to come up with question and topic headings for your grouped categories. You should also consult the scholarly literature to find out what kinds of questions other interviewers have asked in studies of similar topics. As with quantitative survey research, it is best not to place very sensitive or potentially controversial questions at the very beginning of your qualitative interview guide. You need to give participants the opportunity to warm up to the interview and to feel comfortable talking with you. Finally, get some feedback on your interview guide. Ask your friends, family members, and your professors for some guidance and suggestions once you have come up with what you think is a pretty strong guide. Chances are they will catch a few things you had not noticed.
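The brainstorm-then-pare-down step above can likewise be sketched as a small routine: drop exact duplicates while preserving the order in which questions occurred to you. A hypothetical Python sketch (questions are illustrative):

```python
# A brainstormed question list with a redundant entry; paring it down keeps
# the first occurrence of each question, in original order.
brainstorm = [
    "How did you first hear about the programme?",
    "What was your first week like?",
    "How did you first hear about the programme?",  # redundant duplicate
    "Who did you turn to for support?",
]

seen = set()
pared = []
for question in brainstorm:
    if question not in seen:
        seen.add(question)
        pared.append(question)

print(pared)
```

In practice, near-duplicates (the same idea in different wording) still need a human eye; this only catches exact repeats.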

In terms of the specific questions you include on your guide, there are a few guidelines worth noting. First, try to avoid questions that can be answered with a simple yes or no, or, if you do choose to include such questions, be sure to include follow-up questions. Remember, one of the benefits of qualitative interviews is that you can ask participants for more information; be sure to do so. While it is a good idea to ask follow-up questions, try to avoid asking “why” as your follow-up question, since “why” questions can appear to be confrontational, even if that is not your intention. Often people will not know how to respond to “why.” This may be the case because they do not know why themselves. Instead of “why,” it is recommended that you say something like, “could you tell me a little more about that?” This allows participants to explain themselves further without feeling that they are being doubted or questioned in a hostile way.

Also, try to avoid phrasing your questions in a leading way. For example, rather than asking, “What do you think about people who drink and drive?” you could ask, “How do you feel about drinking and driving?” Finally, as noted earlier in this section, remember to keep most, if not all, of your questions open-ended. The key to a successful qualitative interview is giving participants the opportunity to share information in their own words and in their own way.

Even after the interview guide is constructed, the interviewer is not yet ready to begin conducting interviews. The researcher next has to decide how to collect and maintain the information that is provided by participants. It is probably most common for qualitative interviewers to take audio recordings of the interviews they conduct. Recording interviews allows the researcher to focus on her or his interaction with the interview participant rather than being distracted by trying to take notes. Of course, not all participants will feel comfortable being recorded and sometimes even the interviewer may feel that the subject is so sensitive that recording would be inappropriate. If this is the case, it is up to the researcher to balance excellent note-taking with exceptional question-asking and even better listening. It can be quite challenging to do all three at the same time. Recording is best, if you can do so. Whether you will be recording your interviews or not (and especially if not), it is crucial to practice the interview in advance. Ideally, try to find a friend or two willing to participate in a couple of trial runs with you. Even better, try and find a friend or two who are similar in at least some ways to your sample. They can give you the best feedback on your questions and your interview demeanor.

All interviewers should be aware of, give some thought to, and plan for, several additional factors, such as where to conduct an interview and how to make participants as comfortable as possible during an interview. Because these factors should be considered by both qualitative and quantitative interviewers, we will return to them in Chapter 11 “Issues to Consider for All Interview Types.”

Research Methods for the Social Sciences: An Introduction Copyright © 2020 by Valerie Sheppard is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


Imperial College London

Best practice for interviews

At the root of interviewing is an interest in understanding the lived experiences of other people (Seidman, 2006). Interviews invite the participant to make sense of their own experiences and to share these experiences with the researcher. Interviews are therefore an appropriate method when researchers want to learn from and understand the experiences of others. Important educational issues facing Imperial College include the wellbeing of staff and students, and their experiences of new curricula and pedagogies such as active learning and technologically-enhanced learning. Interviews offer powerful insight into individual experiences of these issues, which can help Imperial improve overall. If you are new to interviewing, it might seem like an unnatural situation. However, interviews are great opportunities for collecting rich data, as participants open up their lives for us to investigate. The data that emerge from interviews are qualitative, often in the form of text from interview transcripts. These data can help us to describe people, explain phenomena, and understand experiences, among other things (Jacob & Furgerson, 2012). Even if you have experience of interviews, these tips can help you make the most of your interview.


Create a comfortable environment in the interview setting

Establishing an environment and setting where the participant will be comfortable is paramount to conducting a successful interview. The location of the interview itself should be comfortable for the participant (Herzog, 2012). If possible, try to conduct the interview somewhere that the participant knows well. Choose a quiet, private place (Jacob & Furgerson, 2012). For example, it would not be appropriate to conduct an interview discussing sensitive topics like a student’s sense of belonging in a very public setting like the Junior Common Room, where there is an increased chance that your interview might be overheard, and the participant may not feel comfortable to speak freely about the topic as a result (Elwood & Martin, 2000).

Establish trust and rapport with the participants

It is important that your participants feel comfortable being honest with you in their responses to your questions. When building rapport, be especially aware of your tone when you ask questions: convey that you are interested, and ask questions in a way that invites all types of responses. To help participants feel more at ease:

  • Start by asking for some background information about the participant, like where they are from, what they are studying, and other background questions that are relevant to the study (Jacob & Furgerson, 2012).
  • Make yourself more relatable to the participant by sharing some personal information about yourself. If a participant seems hesitant, ask them about something they have expressed interest in earlier in the interview, then redirect the conversation back to your established interview questions.

Be cautious of “over-rapport,” which happens when a participant tries to please the interviewer by saying what they think is expected of them (Grinyer & Thomas, 2012).

Follow an interview protocol

The interview protocol includes your interview questions, but it can also be much more than that (Jacob & Furgerson, 2012). A good interview protocol will remind you to follow proper procedures, such as collecting informed consent (see below), checking that audio equipment is working, and confirming that the participant is happy to be recorded. In addition to your main interview questions, the protocol can include prompts to use if the participant struggles to understand a question or to provide answers, or if the participant’s responses stray from the topic. It may also include a script for you to read to open and close the interview. You don’t always need to include all of the above in your interview protocol, but it is advisable to at least have your list of interview questions written down.
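A protocol of the kind described above can be represented as a staged checklist, so the interviewer can see at a glance what remains to be done. A minimal, hypothetical Python sketch (item wording is illustrative, not from the source):

```python
# An interview protocol as a staged checklist. Items are ticked off as they
# are completed; `remaining` reports what is still outstanding for a stage.
protocol = [
    ("before", "Collect signed informed consent"),
    ("before", "Check that audio equipment is working"),
    ("before", "Confirm the participant is happy to be recorded"),
    ("during", "Read the opening script"),
    ("during", "Ask main questions; use prompts if responses stray"),
    ("after", "Read the closing script and thank the participant"),
]

completed = set()

def tick(item):
    """Mark a checklist item as done."""
    completed.add(item)

def remaining(stage):
    """Return the items for a stage that have not yet been ticked off."""
    return [text for s, text in protocol if s == stage and text not in completed]

tick("Collect signed informed consent")
print(remaining("before"))
```

Even on paper, grouping protocol items by stage in this way makes it harder to skip a procedural step in the moment.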

Collect informed consent

Most often, this is done by providing the participant with a participant information sheet explaining the research and detailing the risks and benefits associated with their participation in the interview, and a consent form for the participant to sign, indicating that they have understood the participant information sheet and agree to participate in the interview with you. Please consult the Imperial College London Education Ethics Review Process (EERP) webpage for resources on participant information sheets and informed consent forms.

Be an active listener

Remember to maintain eye contact to convey compassion and show that you are listening (boyd, 2015). If a participant’s response seems unclear, do not be afraid to ask for clarification. Try to make the interview feel like a natural conversation, which means you should refrain from taking too many notes during the interview. It is advisable to audio or video record the interview with the permission of the participant so that you can focus on the conversation. However, it is also advisable to prepare note-taking equipment (Talmage, 2012), both as a back-up to recording equipment and to make key notes during the interview. These could be conceptual ideas that spring to mind during the interview or simply points that you would like to ask more about later in the interview. You may also want to tick things off your interview protocol. Consider a backup recording option: for example, if you use an audio recorder as your primary device, prepare your phone or tablet to record the interview as a backup.

Be mindful of power relations

Social roles shape the interview process (DiCicco-Bloom & Crabtree, 2006). If you are a member of staff interviewing students or colleagues, there are likely to be power relations and these may affect the interview (Wang & Yan, 2012). For example, a student you are interviewing might feel they need to give you the ‘right’ answer because you are in a position of authority in the Imperial context. Reassure participants that there are no right or wrong answers (Greene & Hogan, 2005), that their experiences are important, and that their participation is completely voluntary and that there will be absolutely no negative consequences of withdrawing from the interview. If you are interviewing students to evaluate a particular module, you may wish to emphasise that their participation in the interview will have no impact on their grades.

Consider the ethical implications of interviewing students within your class or department. It is better to have a neutral/external interviewer to interview your students instead, particularly if you are using interviews to evaluate your teaching practice or the effectiveness of your module.

Check your bias

Be careful not to let your own assumptions get in the way of hearing perspectives or stories that you do not expect to hear (Johnson & Rowlands, 2012). Keep an open mind, and pay equal attention to all of your interview participants to collect, and make the most of, rich data.

boyd, d. (2015). Making Sense of Teen Life: Strategies for Capturing Ethnographic Data in a Networked Era. In E. Hargittai, & C. Sandvig, Digital Research Confidential: The Secrets of Studying Behavior Online (pp. 79-102). Cambridge, MA: The MIT Press.

DiCicco-Bloom, B., & Crabtree, B. F. (2006). The qualitative research interview. Medical Education, 40 , 314-321.

Elwood, S. A., & Martin, D. G. (2000). 'Placing' Interviews: Location and Scales of Power in Qualitative Research. Professional Geographer, 52 (4), 649-657.

Gehlbach, H. (2015). User Guide: Panorama Student Survey. Boston: Panorama Education. Retrieved from https://www.panoramaed.com/panorama-student-survey

Gehlbach, H., & Artino Jr., A. R. (2018). The survey checklist (manifesto). Academic Medicine, 93 (3), 360-366. Retrieved from https://journals.lww.com/academicmedicine/fulltext/2018/03000/The_Survey_Checklist__Manifesto_.18.aspx#pdf-link

Gehlbach, H., & Brinkworth, M. E. (2011). Measure twice, cut down error: A process for enhancing the validity of survey scales. Review of General Psychology, 15 (4), 380-387. Retrieved from https://dash.harvard.edu/bitstream/handle/1/8138346/Gehlbach%20-%20Measure%20twice%208-31-11.pdf?sequence=1&isAllowed=y

Greene, S., & Hogan, D. (2005). Exploring Meaning in Interviews with Children. In S. Greene, & D. Hogan (Eds.), Researching Children's Experience (pp. 142-158). London, UK: SAGE Publications Ltd.

Grinyer, A., & Thomas, C. (2012). The Value of Interviewing on Multiple Locations or Longitudinally. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 219-230). London, U.K.: SAGE.

Herzog, H. (2012). Interview Location and Its Social Meaning. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 207-217). London, U.K.: SAGE.

Jacob, S. A., & Furgerson, S. P. (2012). Writing Interview Protocols and Conducting Interviews: Tips for Students New to the Field of Qualitative Research. The Qualitative Report, 17 (2), 1-10.

Johnson, J. M., & Rowlands, T. (2012). The Interpersonal Dynamics of In-depth Interviewing. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 99-113). London, U.K.: SAGE.

Krosnick, J. A., & Presser, S. (2010). Question and questionnaire design. In P. V. Marsden, & J. D. Wright (Eds.), Handbook of Survey Research. Bingley, England: Emerald Group Publishing.

Schwarz, N. (1999). Self-reports: how the questions shape the answers. American Psychology, 54 , 93-105.

Seidman, I. (2006). Interviewing as Qualitative Research: A Guide for Researchers in Education and the Social Sciences (3rd ed.). New York: Teachers College Press.

Talmage, J. B. (2012). Listening to, and for, the Research Interview. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 295-304). London, U.K.: SAGE.

Wang, J., & Yan, Y. (2012). The Interview Question. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 231-242). London, U.K.: SAGE.

Warren, C. A. (2012). Interviewing as Social Interaction. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 129-142). London, U.K.: SAGE.

In-depth interviews: The best strategies to gain high-quality insights in qualitative marketing research


In-depth interviews are an important methodology in qualitative marketing research. They offer researchers insights from real people. This article will share some techniques to consider when conducting an in-depth interview to make the best of your time with interviewees.

How to conduct a successful in-depth interview

Editor’s note: Lyndsay Sund is the senior project manager at Syncscript. This is an edited version of an article that originally appeared under the title “ Mastering the Art of In-depth Interviews: Effective Techniques for Uncovering Insights .”

In-depth interviews are the cornerstone of qualitative research. They provide rich, detailed insights on complex issues that surveys and quantitative methods can’t capture. However, an engaged respondent can only take you so far. The effectiveness of these interviews hinges on the interviewer’s skill in fostering trust and eliciting genuine insights, which can significantly enhance the quality of your findings. In this article, we’ll explore some proven techniques for conducting in-depth interviews that yield actionable results.

Establish rapport with interviewees

Building rapport is essential for a successful interview. Begin by introducing yourself warmly, expressing genuine interest and creating a comfortable atmosphere. Empathy, active listening and validation of participants’ experiences are key elements in establishing rapport. A comfortable respondent often makes for a more interesting interview.

Ask the right questions

Questions in the discussion guide are often scripted, but it’s important to maintain some flexibility, allowing room for spontaneous exploration of topics that arise. Don’t hesitate to ask for clarification or probe deeper into certain topics. It is important to invite participants to share stories or examples, which can yield richer insights. 

We’ve noticed that if you sound like you’re reading from a questionnaire, respondents will give shorter answers. Vary your voice and connect the questions to their previous responses or to what you know from the screener, and your respondents will be more likely to answer thoroughly. We’ve found that the discussion guide is an excellent outline, but word choice and a more conversational style matter.

Practice active listening and summarize answers

Demonstrate genuine interest through attentive body language, using both verbal and non-verbal cues and supportive listening, even if you are on a web-based platform. Periodically mirror or paraphrase key points to show understanding and allow the respondent to confirm or clarify. Such probes delve deeper into participants’ perspectives, revealing nuances and underlying motivations.

Proactive listening: Noticing non-verbal cues

In addition to verbal responses, pay close attention to participants’ non-verbal cues. Facial expressions, body language and tone of voice offer valuable insights into emotions, attitudes and underlying sentiments. Proactive listening to these cues enables interviewers to delve deeper into participants’ subconscious thoughts and feelings. 

Embrace silence

Silence can be a powerful tool in in-depth interviews. Allow moments of silence after posing a question to give participants ample time to process and formulate their responses. Resist the urge to fill silences with additional questions or commentary. Often, participants use these pauses to delve into deeper thoughts, resulting in richer insights. Remember the old dial-up internet service where we waited patiently to access the internet? The same rule applies here.

The post-interview process: Follow-up with research participants

Always thank your interviewees for their time and insights and, when applicable, ask if they’d like to receive the findings of the study. Also, invite feedback on the interview process: “Is there anything I didn’t ask you that I should have?” This is the perfect opportunity to gain last-minute valuable insights for future interviews. After the interview, the focus shifts to analyzing the data.

Mastering effective techniques for conducting in-depth interviews is a valuable skill across various fields. By carefully preparing, building rapport, asking insightful questions, actively listening and ethically handling post-interview processes, you can uncover deep insights that surface-level methods cannot reach. Whether you’re a seasoned researcher or new to qualitative interviews, applying these strategies will enhance the quality and depth of your findings, leading to more impactful outcomes.



A practical guide for conducting qualitative research in medical education: Part 1-How to interview

Affiliations.

  • 1 Department of Emergency Medicine University of California, Los Angeles, David Geffen School of Medicine at UCLA Los Angeles California USA.
  • 2 Department of Emergency Medicine Ronald Reagan UCLA Medical Center Los Angeles California USA.
  • 3 Department of Emergency Medicine University of California, Davis Health System Sacramento California USA.
  • 4 Department of Emergency Medicine Harbor-UCLA Medical Center Torrance California USA.
  • PMID: 34471795
  • PMCID: PMC8325517
  • DOI: 10.1002/aet2.10646


Conflict of interest statement

The authors have no potential conflicts to disclose.



  • Methodology
  • Open access
  • Published: 21 June 2024

The Rapid Implementation Feedback (RIF) report: real-time synthesis of qualitative data for proactive implementation planning and tailoring

  • Erin P. Finley (ORCID: orcid.org/0000-0003-4497-7721) 1,2,
  • Joya G. Chrystal 1,
  • Alicia R. Gable 1,
  • Erica H. Fletcher 1,
  • Agatha Palma 1,
  • Ismelda Canelo 1,
  • Rebecca S. Oberman 1,
  • La Shawnta S. Jackson 1,
  • Rachel Lesser 1,
  • Tannaz Moin 1,3,
  • Bevanne Bean-Mayberry 1,3,
  • Melissa M. Farmer 1 &
  • Alison Hamilton 1,3

Implementation Science Communications, volume 5, Article number: 69 (2024)


Background

Qualitative methods are a critical tool for enhancing implementation planning and tailoring, yet rapid turn-around of qualitative insights can be challenging in large implementation trials. The Department of Veterans Affairs-funded EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI) is conducting a hybrid type 3 effectiveness-implementation trial comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three evidence-based practices (EBPs) for women Veterans. We describe the development of the Rapid Implementation Feedback (RIF) report, a pragmatic, team-based approach for the rapid synthesis of qualitative data to aid implementation planning and tailoring, as well as findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI.

Methods

Trained qualitative staff conducted 125 semi-structured pre-implementation interviews with frontline staff, providers, and leadership across 16 VA sites between October 2021 and October 2022. High-priority topic domains informed by the updated Consolidated Framework for Implementation Research were selected in dialogue between EMPOWER 2.0 implementation and evaluation teams, and relevant key points were summarized for each interview to produce a structured RIF report, with emergent findings about each site highlighted in weekly written and verbal communications. Process evaluation was conducted to assess EMPOWER 2.0 team experiences with the RIF report across pre-implementation data collection and synthesis and implementation planning and tailoring.

Results

Weekly RIF updates supported continuous EMPOWER 2.0 team communication around key findings, particularly questions and concerns raised by participating sites related to the three EBPs. Introducing the RIF report into team processes enhanced: team communication; quality and rigor of qualitative data; sensemaking around emergent challenges; understanding of site readiness; and tailoring of REP and EBQI implementation strategies. RIF report findings have facilitated rapid tailoring of implementation planning and rollout, supporting increased responsiveness to sites’ needs and concerns.

Conclusions

The RIF report provides a structured strategy for distillation of time-sensitive findings, continuous team communication amid a complex multi-site implementation effort, and effective tailoring of implementation rollout in real-time. Use of the RIF report may also support trust-building by enhancing responsiveness to sites during pre- and early implementation.

Trial registration

Enhancing Mental and Physical Health of Women Veterans (NCT05050266); https://clinicaltrials.gov/study/NCT05050266?term=EMPOWER%202.0&rank=1

Date of registration: 09/09/2021.

Peer Review reports

Contributions to the literature

Tailoring implementation strategies for specific site needs is often critical for successful implementation. However, few approaches ensure that implementation teams possess the necessary information to deliver timely, tailored strategies in multi-site trials.

We introduce a practical approach, the Rapid Implementation Feedback (RIF) report, designed to share critical information within implementation and evaluation teams. We illustrate how the RIF report has proven instrumental in fostering effective communication and tailoring within the EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI).

The RIF report offers a method for sharing pertinent and time-sensitive findings, empowering teams to swiftly and effectively tailor implementation in real time.

As implementation science has matured, implementation trials have become increasingly complex, often comparing two or more implementation strategies, integrating multiple quantitative and qualitative methods, and occurring across a dozen or more sites. Such complex initiatives require larger teams of implementation researchers and practitioners to conduct, raising challenges for effective and timely communication within teams. Meanwhile, tailoring interventions and implementation rollout to align with the unique strengths and challenges at individual sites – recognized as a valuable and often requisite strategy for achieving implementation and sustainment [ 1 , 2 , 3 ] – requires intensive, flexible, and dynamic engagement with sites. Contextual factors must be assessed, key partners identified, and critical information synthesized and shared to allow for rapid sensemaking and problem-solving.

The growth of implementation science as a field has been accompanied by an acceleration in the variety, rigor, and rapidity of qualitative methods available to support real-world research translation [ 4 , 5 ]. Effective work in implementation often requires gathering information that is purposeful and systematic, represents a variety of partners and perspectives, and accurately synthesizes diverse viewpoints to support meaningful communication and decision-making at every stage of implementation. Accordingly, an array of methodological strategies for supporting participatory and partner-engaged processes [ 6 , 7 ], rapid qualitative data collection and analysis [ 8 , 9 ], and ethnographic and observational approaches [ 10 , 11 , 12 ] have emerged, offering a growing array of qualitative methods to meet the needs of a given study or initiative.

To make use of these methods effectively, work and team processes suitable for the implementation context are needed. The importance of strong communication and relationship networks within implementing sites and teams has been recognized since the early days of the field [ 13 , 14 , 15 ], and recent scholarship has examined how relational communication is embedded within most strategies for implementation [ 16 ], trust-building [ 17 ], and scale-up and spread [ 18 ]. Yet relatively little scholarship has put forward methods for ensuring timely and effective communication within implementation teams, particularly amid efforts to achieve site-level tailoring in real-time. Across eight years of conducting hybrid effectiveness-implementation trials in support of improved care delivery for women Veterans, our team has learned that effective tailoring requires capturing and sharing critical information in an ongoing way [ 4 , 10 , 19 ]. In the first part of this article, we describe the development of a pragmatic, team-based approach for the rapid synthesis of qualitative data to support implementation planning and tailoring: the Rapid Implementation Feedback (RIF) report. In the latter part, we describe findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI, outlining how use of this approach has evolved our work.

Background and study overview

Women Veterans represent the fastest-growing proportion of VA healthcare users. Despite substantial VA investment in women’s health, gender disparities persist in certain health outcomes, including cardiovascular and metabolic risk and mental health [ 20 , 21 , 22 ]. In tailoring healthcare delivery for women, prior studies suggest that women Veterans prefer gender-specific care and telehealth options [ 19 , 23 ]. In response, the VA EMPOWER 2.0 QUERI is conducting a hybrid type 3 effectiveness-implementation trial [ 24 ] comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three virtual evidence-based practices (EBPs) for women Veterans in 20 VA sites across the United States: (1) Diabetes Prevention Program (DPP) to reduce risk of progressing to type 2 diabetes [ 25 ]; (2) Telephone Lifestyle Coaching (TLC) to reduce cardiovascular risk [ 26 ]; and (3) Reach Out, Stay Strong Essentials (ROSE) to prevent postpartum depression [ 27 ]. REP combines phased intervention packaging, tailoring, training and technical assistance, and re-customization during maintenance/sustainment [ 28 ], while EBQI offers a systematic quality improvement method for engaging frontline providers in improvement efforts via tailoring, multi-level partnership, and ongoing facilitation [ 29 ]. We selected these bundled implementation strategies, REP and EBQI, based on their strong evidence for effectively supporting implementation in diverse healthcare settings [ 28 , 30 ]. Both of these strategies draw upon pre-implementation needs assessment and planned tailoring as key activities for successful implementation, which we postulated would be important based on our experience in the prior EMPOWER QUERI (2015–2020) [ 19 , 30 ]. These activities were deemed to be non-research by the VA Office of Patient Care Services prior to funding.

To coordinate the separate implementation and evaluation elements of our work, we established distinct-but-overlapping teams under the broader umbrella of EMPOWER 2.0, dedicated to: (1) implementing each of the EBPs (DPP, TLC, ROSE), with these smaller teams led by principal investigators for each EBP; (2) providing REP- or EBQI-consistent implementation support at each site (i.e., “REP team” and “EBQI team” project directors); and (3) executing qualitative and quantitative components of our overall evaluation (described in detail in [ 24 ]), in the form of the “qualitative team” and “measures team,” respectively.

EMPOWER 2.0 engagement and outreach

Working in concert across these implementation and evaluation teams, EMPOWER 2.0 followed a standardized process for engaging with sites (Fig. 1). Initial efforts (beginning pre-funding) involved reaching out to partners at the regional Veterans Integrated Service Network (VISN) level to introduce the EBPs, answer questions, and request a list of potential VA medical centers (VAMCs) within the VISN that might be appropriate for implementation. Following EMPOWER 2.0’s cluster-randomized study design, VISNs were assigned to participate in two of the EBPs (either TLC and ROSE or DPP and ROSE; ROSE was offered to all sites in an effort to ensure an adequate number of pregnant Veteran participants) [ 24 ]. We extended invitations to identified VAMCs to participate in the two EBPs available in their VISN. If sites expressed interest, we conducted an introductory meeting with providers and leadership from Primary Care, Women’s Health, Mental Health, Whole Health [ 31 ], and/or Health Promotion and Disease Prevention, as appropriate to the EBP and each site’s local organization of care. Once a site confirmed their participation, they were randomized to receive either the REP or the EBQI implementation strategy. Following randomization, they were asked to identify a point person for each EBP and key individuals who would be likely to participate in local EBP implementation teams and/or play an important role in supporting implementation (e.g., VAMC leadership). These individuals (e.g., Medical Director, Health Educator) were then invited to participate in pre-implementation interviews prior to initiating REP or EBQI at their site. In each VISN, partners at the VISN level were also invited to participate in pre-implementation interviews, to obtain broader perspectives on the regional women’s health context and priorities.
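The two-level assignment described above can be sketched in a few lines. This is an illustrative toy, not the study’s actual randomization procedure; the VISN and site names and the helper functions are hypothetical.

```python
import random

# Sketch of the EMPOWER 2.0 assignment logic: VISNs receive one of two EBP
# bundles (ROSE is in both, so it is offered to all sites), and each
# participating site is then randomized to the REP or EBQI strategy.
EBP_BUNDLES = [("TLC", "ROSE"), ("DPP", "ROSE")]
STRATEGIES = ["REP", "EBQI"]

def assign_visn_bundle(visns, rng):
    """Cluster-level step: assign each VISN one of the two EBP bundles."""
    return {visn: rng.choice(EBP_BUNDLES) for visn in visns}

def randomize_site(site, rng):
    """Site-level step: randomize a confirmed site to REP or EBQI."""
    return rng.choice(STRATEGIES)

rng = random.Random(0)  # fixed seed so the sketch is reproducible
bundles = assign_visn_bundle(["VISN-A", "VISN-B"], rng)
for site in ["Site-1", "Site-2"]:
    print(site, randomize_site(site, rng))
```

In the actual trial the design is cluster-randomized with balancing considerations beyond a coin flip; the sketch only shows the order of the two assignment steps.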

Figure 1: EMPOWER 2.0 QUERI site-level outreach, randomization, and engagement

Pre-implementation qualitative interviews

Intended to assess sites’ needs and resources and enable pre-implementation tailoring prior to launch, EMPOWER 2.0 pre-implementation interviews examined baseline care practices for each relevant care condition (prediabetes for DPP; cardiovascular risk for TLC; perinatal mental health for ROSE), as well as updated Consolidated Framework for Implementation Research (CFIR) domains including inner and outer setting, innovation, individuals (e.g., characteristics: motivation) and implementation process [ 32 ]. Semi-structured interview guides (previously published [ 24 ]) were developed building on prior work in the original EMPOWER QUERI [ 30 ] and the Women’s Health Patient-Aligned Clinical Team trial [ 33 ]. We have an expert qualitative team, each of whom has master’s or PhD-level training in qualitative methods and years of experience in conducting team-based qualitative research, including using rapid qualitative analysis approaches [ 8 , 9 ]. Most team members have worked together on EMPOWER and other projects for over five years.

Between October 2021 and October 2022, the qualitative team completed 125 interviews across 16 sites, with site- and VISN-level participants representing a range of roles, including Women Veteran Program Managers, Women’s Health Primary Care Providers, Maternity Care Coordinators, primary care team members, health coaches, and nutritionists. Pre-implementation interviews took an average of 57 days (range 15–108 days) to complete per site and included 4–13 participants, depending on the size and complexity of the care facility.

Developing the Rapid Implementation Feedback (RIF) report

The EMPOWER 2.0 qualitative team has a well-established approach to conducting rapid qualitative analysis [ 8 , 19 ] and strong personnel infrastructure and expertise. Even so, once pre-implementation interviews began, challenges quickly arose in ensuring that findings were being communicated to EMPOWER 2.0 implementation teams for DPP, TLC, and ROSE in a timely and effective manner, particularly given that each team was working with multiple sites concurrently. Key questions included: how do we ensure early findings are shared in time to support pre-implementation tailoring? How do we communicate effectively across the qualitative team conducting interviews and the teams responsible for implementation? And how do we keep qualitative team members up-to-date on implementation, so they are well-informed for interviews?

In responding to these challenges, we developed the Rapid Implementation Feedback (RIF) report to support data distillation and bidirectional feedback across our qualitative and implementation teams. In developing the RIF, the EMPOWER 2.0 implementation teams, which are composed of investigators and project directors for each EBP who provide external implementation support for each site, met with the qualitative interview team and agreed upon high-priority topic domains to be extracted from the interviews. These domains were related to implementation planning and included critical roles for implementation planning and launch ; implementation concerns and/or demand for the EBP ; and use of data to track women Veterans’ population health needs (see Table  1 ). These topics reflected both specific CFIR subdomains included in the pre-implementation interview guide (e.g., use of data as an assessment of the CFIR subdomain for information technology infrastructure ), as well as higher-level domains combined to aid in prioritizing key issues (e.g., germane responses related to inner setting , individual characteristics , and implementation process were combined into implementation concerns ). These topic domains were used to create a RIF report template (see Appendix 1 ), which was organized under headings by VISN (outer setting), site (inner setting), and EBP [ 32 ]; the same domains were selected for all EBPs, ensuring consistency in data distillation across the project. Compiling the RIF report ensured that, for example, all interview data relevant to critical roles for implementation planning for ROSE in Site A were collated and easy to locate. 
Thereafter, at the conclusion of an interview, the qualitative team reviewed interview notes and/or Microsoft Teams transcripts and extracted key points relevant to each priority topic; in doing so, team members followed a process similar to that used in developing structured summaries for rapid qualitative analysis [ 8 , 34 ], but differing by a targeted focus on relatively few domains. For each interview, the analyst would summarize key points related to each RIF domain (e.g., critical roles for implementation planning and launch ), as well as any brief or particularly salient quotes; every key point or quote was also labeled with a study identification number indicating the role of the respondent. The resulting key points and quotes were then added to the RIF report, creating a single, up-to-date written resource for implementation teams, which was cumulatively updated over time.

This approach to analysis is distinct in two key ways from the data distillation process typically used in rapid qualitative analysis [ 8 , 34 , 35 , 36 ]. First, in rapid qualitative analysis, templated summaries are first created at the level of the individual interview or other data episode, so that each data episode is associated with a summary of contents that can later be compiled into a multi-episode matrix. Second, structured summaries are traditionally intended to capture all of the key findings in a given data episode, and thus are both more comprehensive and less focused than the RIF report. By contrast, the RIF report collapsed two steps (i.e., summary then matrix) into one (i.e., RIF report) to assemble a targeted selection of high-priority data. In addition, because the data for each domain were collated from the beginning into a single document, the process of assessing data heterogeneity (e.g., diversity of opinions) and adequacy (e.g., saturation) for a given site was expedited. Up-to-date findings could be made available to the implementation teams on a consistent basis, despite the fact that the qualitative team was often interviewing among multiple sites concurrently. During this period, EMPOWER 2.0 held a weekly full-team meeting to coordinate implementation and evaluation efforts. The day before this weekly meeting, the updated RIF report was sent to the full EMPOWER 2.0 team in a secure encrypted email, with new additions highlighted for easy reference; the team was also notified if there were no RIF updates for the week. As implementation teams were also working concurrently across multiple sites, the RIF report became a centralized resource for organizing essential information in a dynamic environment.
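The data structure this implies can be illustrated with a minimal sketch: summarized key points are collated into a single cumulative report keyed by VISN, site, EBP, and topic domain, with the current week’s additions flagged. The domain labels come from the description above; the sites, study IDs, example key points, and rendering format are hypothetical.

```python
from collections import defaultdict

# Topic domains agreed between the implementation and qualitative teams.
DOMAINS = [
    "critical roles for implementation planning and launch",
    "implementation concerns and/or demand for the EBP",
    "use of data to track women Veterans' population health needs",
]

def new_report():
    # Nested mapping: report[visn][site][ebp][domain] -> list of key points.
    return defaultdict(lambda: defaultdict(lambda: defaultdict(lambda: defaultdict(list))))

def add_key_point(report, visn, site, ebp, domain, study_id, text, week):
    """Add one summarized key point, labeled with the respondent's study ID."""
    assert domain in DOMAINS
    report[visn][site][ebp][domain].append({"id": study_id, "text": text, "week": week})

def render(report, current_week):
    """Render the cumulative report, flagging this week's additions as NEW."""
    lines = []
    for visn, sites in report.items():
        lines.append(f"== {visn} ==")
        for site, ebps in sites.items():
            lines.append(f"-- {site} --")
            for ebp, by_domain in ebps.items():
                for domain, points in by_domain.items():
                    lines.append(f"[{ebp}] {domain}:")
                    for p in points:
                        flag = " (NEW)" if p["week"] == current_week else ""
                        lines.append(f"  - ({p['id']}) {p['text']}{flag}")
    return "\n".join(lines)

report = new_report()
add_key_point(report, "VISN-A", "Site A", "ROSE", DOMAINS[0],
              "ID-07", "Maternity Care Coordinator seen as key point person", week=1)
add_key_point(report, "VISN-A", "Site A", "ROSE", DOMAINS[1],
              "ID-12", "Concern about staffing for outreach calls", week=2)
print(render(report, current_week=2))
```

Because every key point lands directly in its site- and domain-specific bucket, assessing heterogeneity or saturation for a site is a matter of reading one collated section rather than compiling per-interview summaries first, which is the shortcut the RIF report takes relative to standard rapid qualitative analysis.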

Although the brief written RIF expedited communication of time-sensitive information across teams, challenges continued to arise in coordinating activities, tailoring EBPs, and general communication with sites. We therefore added a verbal update to the RIF report (see Fig. 2), summarizing new additions to the RIF as part of our overall EMPOWER 2.0 weekly meeting. Updates were brief, organized by site, and included a summary of interviews conducted that week, along with the roles interviewed and unique findings (e.g., staff turnover issues). Members of the qualitative team also gave feedback on whether saturation had been reached at a site, or whether additional interviewing would be helpful in developing a snapshot of key site features, strengths, and potential challenges.

Figure 2: Core components of the Rapid Implementation Feedback (RIF) report

Process evaluation

To assess whether the RIF was an effective method for communication and coordination, we conducted a process evaluation of EMPOWER 2.0 teams’ experiences of using the RIF report. We reviewed periodic reflections conducted by the first author as part of EMPOWER 2.0’s overall implementation evaluation with 11 members of five internal teams: those responsible for leading DPP, TLC, and ROSE implementation (i.e., PIs and Co-PIs), and for supporting sites using REP and EBQI implementation strategies (i.e., project directors). Periodic reflections [ 10 ] are lightly guided discussions conducted by phone or teleconference software, which allow for consistent documentation of implementation activities, processes, and events, both planned and unexpected. We adapted the original periodic reflection template [ 10 ] as a discussion guide for EMPOWER 2.0 (previously published [ 24 ]). Reflections lasted 15–60 minutes, with length roughly corresponding to the amount of recent implementation activity, and were conducted monthly or bi-monthly with each team.

In examining how the RIF report was working for our teams, we conducted thematic analysis [ 37 ] of all periodic reflections ( n  = 32) completed with EMPOWER 2.0 teams between October 2021, when the RIF was first introduced, and October 2022. All text relevant to the RIF report was extracted and reviewed inductively for key themes associated with perceived impacts of the RIF, resulting in a preliminary set of emergent themes, which were codified into a codebook. All segments of extracted text were then reviewed again and assigned codes as appropriate to their meaning; central findings for each code/theme were then distilled. This preliminary analysis was conducted by the lead author and then presented back to the full EMPOWER 2.0 team to allow for debriefing and member checking [ 38 , 39 ] over a series of meetings. Team members provided substantive feedback that aided in refining themes, and offered additional reflection and commentary on the RIF report and its role within team processes.
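The tallying step of such an analysis can be illustrated with a toy sketch. The actual coding was performed by human analysts against a refined codebook; the keyword matching, theme names, and excerpts below are invented stand-ins for that judgment-based process.

```python
# Simplified stand-in for applying a codebook to extracted reflection
# excerpts and tallying how often each theme appears across segments.
CODEBOOK = {
    "communication": ["communication", "feedback", "in sync"],
    "data quality": ["rigor", "interview guide", "impartial"],
    "site readiness": ["readiness", "turnover", "staffing"],
}

def code_segment(segment, codebook):
    """Return the set of themes whose keywords appear in a text segment."""
    text = segment.lower()
    return {theme for theme, kws in codebook.items() if any(k in text for k in kws)}

def tally_themes(segments, codebook):
    """Count, per theme, how many segments were assigned that code."""
    counts = {theme: 0 for theme in codebook}
    for seg in segments:
        for theme in code_segment(seg, codebook):
            counts[theme] += 1
    return counts

segments = [
    "The weekly feedback kept both teams in sync.",
    "It helped us stay impartial as we modified the interview guide.",
]
print(tally_themes(segments, CODEBOOK))
```

In practice the assignment of codes is interpretive and was refined through debriefing and member checking, so no keyword rule could reproduce it; the sketch only shows the bookkeeping once codes are assigned.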

We identified five interconnected impacts associated with introducing the RIF report into EMPOWER 2.0 team processes: enhanced communication across teams; improved quality and rigor of qualitative data; heightened sensemaking around emergent challenges; increased understanding of site readiness; and informed tailoring of REP and EBQI implementation strategies. We describe each of these in turn below.

Enhanced communication across teams

As intended, the RIF was felt to be an effective strategy for improving communication across EMPOWER 2.0’s internal teams. Having the RIF available in written format created an easily accessible resource for implementation teams as they prepared for next steps in engaging with sites, and for qualitative team members as they prepared for upcoming interviews. The verbal RIF update, because it occurred alongside implementation team updates as part of the weekly team call, ensured that information-sharing was bidirectional in real time. The continuous flow of information provided a regular opportunity for answering questions, clarifying areas of potential confusion, and identifying where additional information was needed. Additionally, the RIF served to keep all team members in sync with site-specific information on an ongoing basis.

“I love that the qualitative team is giving us real-time feedback. I don’t think I’ve ever done that except informally. I think that’s been a really nice addition to our meetings.” [EBP 1 lead]

On the whole, the enhanced communication among teams was felt to support team “synergy” and increase synchronization of activities in continued data-gathering and site engagement.

Improved quality and rigor of qualitative data

Although improving rigor was not an explicit goal of developing the RIF report, introducing this structured process was felt to have improved both the quality of data collection and the rigor of early analyses. Because of the improved bidirectional communication occurring as part of the weekly verbal RIF report with implementation teams, qualitative team members felt as though they had an increased understanding of implementation activities and site-level context. This in turn was felt to improve the quality of their interviewing by allowing them to ask more attuned follow-up questions and to prioritize topics that were “meaningful to inform implementation.”

“[We] felt very disconnected in the beginning like we didn’t have any information. Having the weekly calls to talk about these things was really helpful.” [Qualitative team member 1]

Qualitative team members also reported feeling more consistent and “in sync” in their processes for interviewing and preparing the RIF report, as the weekly discussions provided an opportunity for the team to observe, confer, and calibrate regarding the conduct of interviews and the content and level of detail included in ongoing RIF updates.

“It helps us stay impartial as interviewers across stakeholders, across sites, and as we modify the interview guide. It kept all of us…aligned with the parts we need to dig deeper into because they’re RIF/high priority.” [Qualitative team member 2]

In addition, introducing the RIF report was felt to increase the trustworthiness of preliminary analyses and data distillation, because while initial data reviews can be impressionistic or anecdotal, the RIF provided a structured and systematic way of consolidating multi-site data from the first pass. Because the RIF report provided early synthesis, it also aided in generating ideas for targeted analysis and coding conducted as part of evaluation activities in later phases.

Heightened sensemaking around emergent challenges

Arising out of the enhanced team communication, and perhaps supported by the improved quality of information being gathered and distilled by the qualitative team, discussions prompted by the RIF helped the EMPOWER 2.0 team to identify and develop solutions to emergent challenges. As one example, the qualitative team quickly realized that, while it is common practice to keep implementation-focused and evaluation-focused teams distinct in an effort to reduce bias in hybrid trials, sites viewed everyone associated with EMPOWER 2.0 – including interviewers – as an “ambassador” of the project. Interviewers found early on that they were fielding important questions from sites regarding the EBPs and/or implementation plans, and often lacked the information to provide an appropriate response, which placed them in an awkward position. After this issue was raised as part of a weekly RIF update, the teams worked together to develop a living Frequently Asked Questions document to help interviewers answer common questions that were coming up during interviews. This document was later helpful in standardizing communication with sites more generally, serving as a resource for implementation teams as well.

In a second example, a key pre-implementation effort by the EMPOWER 2.0 measures team involved developing a dashboard of population health and performance metrics tailored to provide actionable information to sites on the healthcare needs of their women Veterans. As preparations for site launch continued, and discussions of RIF findings informed ongoing planning efforts, the measures team realized they lacked information on how sites were using existing population health and performance measures. The measures and qualitative teams then worked together to update the interview guide and add priority domains to the RIF report to aid in dashboard development. Having integrated these additions, the qualitative team was able to rapidly confirm the need for a dashboard display of women-only performance measures, and data were used to support tailoring to sites’ needs.

Increased understanding of site readiness

Reflecting the enhanced communication and improved data quality associated with adopting the RIF report, the EMPOWER 2.0 teams were also better able to develop timely assessments of site readiness. The distillation of qualitative interview data provided important contextual information about site-level participants’ level of EBP awareness, motivation, and competing demands prior to implementation planning meetings.

“They just seem generally enthusiastic.” [EBP 2 lead]

“Most of what I was picking up on was people saying, ‘We don’t have anyone to do it.’ Just sites saying that they don’t have people…they don’t want to take it on right now.” [EBP 3 lead]

Readied with this information, implementation teams were able to prepare for engagement and planning efforts with a greater understanding of what the critical issues were likely to be.

Informed tailoring of REP and EBQI strategies

Finally, building on an improved understanding of sites’ pre-implementation readiness, EMPOWER 2.0 teams felt better equipped to engage in planned tailoring of site outreach and implementation activities within the REP and EBQI strategy bundles. For example, when a key leader at one site was revealed to be “not entirely on board” with DPP implementation, the DPP team lead was able to offer targeted outreach to acknowledge and address the concerns expressed. When concerns were raised about staffing and EBP ownership prior to launch of ROSE, the ROSE team lead expressed, “We were prepared for tough conversations.”

“That became our ‘MO’…anything that comes up [in the RIF], we’ll try to address in the kick-off [meeting with sites] to show that we’re helping in addressing their questions.” [EBP 1 lead]

Discussion

The RIF report was developed in response to the challenge, within the EMPOWER 2.0 hybrid type 3 effectiveness-implementation trial, of distilling and sharing critical information among internal teams as they pursued distinct implementation and evaluation tasks with an evolving cast of dynamic sites. Combined, the RIF report’s written and verbal components provide a method and process for rapidly extracting high-priority, actionable data, sharing these data in a focused and digestible way, and supporting team sensemaking and tailoring of implementation approaches in real time.

In evaluating the RIF report process, we found that its key benefits were interconnected and mutually reinforcing. Bidirectional communication increased the quality of qualitative data collection, which in turn improved the depth and salience of the data conveyed to the implementation teams, which in turn increased the teams’ ability to engage in active sensemaking and identify effective strategies for tailoring the implementation approach at each site. The tight informational feedback loop allowed us to be nimble and iterative both in data-gathering (e.g., by adding novel domains to the RIF as needed) and in tailoring (e.g., by allowing us to customize early messaging to address sites’ most pressing concerns).

Tailoring and adaptation of both interventions and implementation strategies have been recognized as essential for the successful translation of research into routine practice [40–43]. In response, a variety of qualitative and mixed-methods approaches have been put forward for capturing feedback from diverse partners, including user-centered adaptation [44], the Method for Program Adaptation through Community Engagement (M-PACE) [45], the ADAPT guidance [46], concept mapping [47], and intervention mapping [48]. These approaches have strengthened the capacity of implementation researchers and practitioners to gather and synthesize often wide-ranging perspectives into actionable guidance for improving the acceptability, feasibility, appropriateness, and compatibility of interventions and implementation strategies. Yet there remains significant opportunity to streamline and systematize methods for tailoring in the context of hybrid type 2 and 3 trials, which often conduct formative evaluation in real time amid simultaneous data collection and implementation activities. In addition to providing a model for how to embed a structured method for data capture, distillation, and sharing within a complex implementation trial, we believe the RIF report offers a pragmatic method to improve both the quality of information synthesis and the ability of teams to engage in timely sensemaking.

Creating an effective internal communication process via the RIF supported tailored delivery of EBPs at each site, which in turn was felt to enhance the relationships between EMPOWER 2.0 QUERI members and site partners. The role of relationships as an underlying and underexplored element within implementation has garnered increasing attention [15]. Bartley et al. [16] conducted an analysis of the Expert Recommendations for Implementing Change (ERIC) taxonomy of implementation strategies [49], and found that nearly half (36 of 73) could be classified as highly or semi-relational in nature. Connelly and collaborators [50] developed a Relational Facilitation Guidebook based in relational coordination and the principle that high-quality communication and relationships result in improved healthcare quality. Metz and colleagues [17] have proposed a theoretical model for building trusting relationships to support implementation, drawing on theory and research evidence to identify both technical and relational strategies associated with nurturing trust. There is considerable overlap between Metz et al.’s strategies and the processes supported by adopting the RIF report in EMPOWER 2.0, particularly those related to bidirectional communication, co-learning, and frequent interactions, which in turn enabled greater responsiveness to sites. We found the structured communication offered by the RIF helped to support trust-building both within EMPOWER 2.0 and in our teams’ interactions with sites.

Future teams weighing potential use of the RIF report should first consider whether the RIF report is suitable to their project goals and resources. It may be less suitable for teams whose timelines allow for traditional coding-based or rapid qualitative approaches to data analysis, who do not intend to engage in formative evaluation or planned tailoring, or who have concerns that any modifications to the implementation approach may be incompatible with their trial design. In EMPOWER 2.0, core components for determining fidelity to implementation strategy in both study arms (REP and EBQI) were identified before initiating pre-implementation activities, and both strategies included planned tailoring to address specific conditions at sites (e.g., perceived patient needs, key professional roles and service lines to be involved). We were thus able to ensure that no decisions made in RIF-related or other discussions varied from our trial protocol.

Teams electing to adopt the RIF report may benefit from discussing how best to integrate this method into their workflow, and what specific tailoring of the RIF report is needed to ensure alignment with their implementation, research, and/or evaluation goals. We recommend that teams discuss and come to consensus on four RIF elements: (1) selected high-priority topic domains, e.g., site-level concerns, which may be higher-level or more closely focused on implementation theory constructs, as appropriate to the project; (2) what data sources will be included (e.g., data from provider or leadership interviews, surveys, or periodic reflections); (3) the preferred format for written and verbal RIF reports, including salient categories for organizing information (e.g., by site or professional role); and (4) the preferred frequency of sharing RIF reports. Given the established importance of identifying effective local champions in implementation [51–54], the domains of critical roles and service lines for implementation planning and launch are likely to be of value for many projects, as is the domain of implementation concerns, which encapsulates important doubts or anxieties expressed by respondents that may be addressable by the implementation team. Teams documenting shifts to the implementation approach in response to respondent feedback might also consider adding a tailoring/action items or next steps domain to track decisions made during discussions of RIF findings. With regard to frequency, weekly RIF reports worked well for EMPOWER 2.0 because this tempo aligned with existing meetings and the busy pace of pre-implementation activities, but this frequency may not be necessary for all teams. Dialogue across these issues is likely to be of value for teams in developing a shared understanding of how project goals will be operationalized, and may allow for more agile responses when change is needed or challenges arise.
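One lightweight way for a team to record the outcome of that consensus discussion is as a small structured configuration object. The sketch below is a hypothetical illustration: the RIFConfig class, domain names, sources, and cadence are assumptions for the example, not an artifact described by the authors.

```python
# Hypothetical configuration object capturing the four RIF consensus elements.
# All names and values below are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class RIFConfig:
    domains: list        # (1) selected high-priority topic domains
    data_sources: list   # (2) data sources to be included
    report_format: dict  # (3) written/verbal format and organizing categories
    frequency: str       # (4) cadence of sharing RIF reports


config = RIFConfig(
    domains=[
        "site-level concerns",
        "critical roles and service lines",
        "implementation concerns",
        "tailoring/action items",
    ],
    data_sources=["provider interviews", "leadership interviews", "periodic reflections"],
    report_format={"written": "organized by site", "verbal": "weekly team call"},
    frequency="weekly",
)


def validate(cfg: RIFConfig) -> bool:
    """A configuration is usable once all four consensus elements are specified."""
    return bool(cfg.domains and cfg.data_sources and cfg.report_format and cfg.frequency)
```

Making these decisions explicit in one place can also make later changes (e.g., adding a new domain mid-study, as EMPOWER 2.0 did) easy to document.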

There are several limitations to the process evaluation described here. First, it should be noted that periodic reflections were conducted by the first author, who has worked with most members of the implementation teams for at least five years. As an ethnographic method occurring repeatedly over time, reflections benefit from the long-term relationship built between discussion lead and participants, and may be subject to less reporting bias than other data collection methods [10]. Nonetheless, the potential for biased reporting should be acknowledged. We endeavored to ensure the accuracy, completeness, and trustworthiness of findings [39, 55, 56] by engaging in multiple rounds of member checking with the EMPOWER 2.0 team, first in dedicated meetings and later in preparing and revising this manuscript.

In considering the limitations of the RIF report as a methodological approach to support effective distillation and tailoring, it is important to note that this process was developed and executed by a highly trained and experienced team, which likely helped qualitative team members complete the structured reports in a timely and consistent manner. We found that analyses conducted for the RIF report were adequate to support all of the pre-implementation tailoring required for this initiative; however, projects – and particularly projects occurring earlier in the implementation pipeline than this hybrid type 3 trial – may vary in their early-stage analytic needs. Notably, no negative impacts associated with introducing the RIF were identified by team members; this may reflect the fact that the RIF report replaced other rapid qualitative analysis activities (e.g., developing structured summaries for each interview) rather than adding to the team workload. The EMPOWER 2.0 core team also builds on significant experience working together over time, which may have enhanced the quality of communication and coordination emerging from RIF updates. The RIF report may not be relevant or appropriate in implementation efforts where formative evaluation and/or tailoring are not intended or desirable (e.g., in implementation trials assessing the effectiveness of strategies that do not include planned tailoring), although its step-by-step process for synthesizing data relevant to high-priority topics for rapid communication is likely to have broad utility. Future research should consider whether the RIF report has generalizability as a method for use in less complex implementation studies, or by smaller or less experienced teams.

Conclusions

Rapid qualitative methods are a critical tool for enhancing implementation planning, communication, and tailoring, but can be challenging to execute in the context of complex implementation trials, such as those occurring across multiple sites and requiring coordination across implementation and evaluation teams. The RIF report extends rapid qualitative methods by providing a structured process to enhance focused data distillation and timely communication across teams, laying the groundwork for an up-to-date assessment of site readiness, improved identification and sensemaking around emergent problems, and effective and responsive tailoring to meet the needs of diverse sites.

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available as participants have not provided consent for sharing; de-identified portions may be available from the corresponding author on reasonable request.

Abbreviations

DPP: Diabetes Prevention Program

EBP: Evidence-Based Practice

EBQI: Evidence-Based Quality Improvement

EMPOWER: Enhancing Mental and Physical Health of Women Veterans through Engagement and Retention

QUERI: Quality Enhancement Research Initiative

REP: Replicating Effective Programs

RIF: Rapid Implementation Feedback

ROSE: Reach Out, Stay Strong Essentials

TLC: Telephone Lifestyle Coaching

VA: Veterans Affairs

VAMC: VA Medical Center

VISN: Veterans Integrated Service Network

References

1. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implement Sci. 2014;9(1):102.

2. Treichler EBH, Mercado R, Oakes D, Perivoliotis D, Gallegos-Rodriguez Y, Sosa E, et al. Using a stakeholder-engaged, iterative, and systematic approach to adapting collaborative decision skills training for implementation in VA psychosocial rehabilitation and recovery centers. BMC Health Serv Res. 2022;22(1):1543.

3. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117. https://doi.org/10.1186/1748-5908-8-117

4. Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516.

5. Cohen D, Crabtree BF, Damschroder LJ, Hamilton AB, Heurtin-Roberts S, Leeman J, et al. Qualitative Methods in Implementation Science. National Cancer Institute; 2018. Available from: https://cancercontrol.cancer.gov/sites/default/files/2020-09/nci-dccps-implementationscience-whitepaper.pdf

6. Cunningham-Erves J, Mayo-Gamble T, Vaughn Y, Hawk J, Helms M, Barajas C, et al. Engagement of community stakeholders to develop a framework to guide research dissemination to communities. Health Expect. 2020;23(4):958–68.

7. Hamilton AB, Brunner J, Cain C, Chuang E, Luger TM, Canelo I, et al. Engaging multilevel stakeholders in an implementation trial of evidence-based quality improvement in VA women’s health primary care. Behav Med Pract Policy Res. 2017;7(3):478–85.

8. Hamilton AB. Qualitative methods in rapid turn-around health services research. VA HSR&D Cyberseminar Spotlight on Women’s Health; 2013 Dec 11. Available from: http://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/780-notes.pdf

9. St. George SM, Harkness AR, Rodriguez-Diaz CE, Weinstein ER, Pavia V, Hamilton AB. Applying Rapid Qualitative Analysis for Health Equity: Lessons Learned Using “EARS” With Latino Communities. Int J Qual Methods. 2023;22:160940692311649.

10. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. 2018;18(1):153.

11. Gertner AK, Franklin J, Roth I, Cruden GH, Haley AD, Finley EP, et al. A scoping review of the use of ethnographic approaches in implementation research and recommendations for reporting. Implement Res Pract. 2021;2:263348952199274.

12. Palinkas LA, Zatzick D. Rapid Assessment Procedure Informed Clinical Ethnography (RAPICE) in Pragmatic Clinical Trials of Mental Health Services Implementation: Methods and Applied Case Study. Adm Policy Ment Health. 2019;46(2):255–70.

13. Lanham HJ, McDaniel RR, Crabtree BF, Miller WL, Stange KC, Tallia AF, et al. How Improving Practice Relationships Among Clinicians and Nonclinicians Can Improve Quality in Primary Care. Jt Comm J Qual Patient Saf. 2009;35(9):457–66.

14. Miake-Lye IM, Delevan DM, Ganz DA, Mittman BS, Finley EP. Unpacking organizational readiness for change: an updated systematic review and content analysis of assessments. BMC Health Serv Res. 2020;20(1):106.

15. Finley EP, Closser S, Sarker M, Hamilton AB. Editorial: The theory and pragmatics of power and relationships in implementation. Front Health Serv. 2023;3:1168559.

16. Bartley L, Metz A, Fleming WO. What implementation strategies are relational? Using Relational Theory to explore the ERIC implementation strategies. Front Health Serv. 2022;2:913585.

17. Metz A, Jensen T, Farley A, Boaz A, Bartley L, Villodas M. Building trusting relationships to support implementation: A proposed theoretical model. Front Health Serv. 2022;2:894599.

18. Ketley D. A new and unique resource to help you spread and scale innovation and improvement. NHS Horizons. 2023. Available from: https://horizonsnhs.com/a-new-and-unique-resource-to-help-you-spread-and-scale-innovation-and-improvement/

19. Dyer KE, Moreau JL, Finley E, Bean-Mayberry B, Farmer MM, Bernet D, et al. Tailoring an evidence-based lifestyle intervention to meet the needs of women Veterans with prediabetes. Women Health. 2020;60(7):748–62.

20. Goldstein KM, Melnyk SD, Zullig LL, Stechuchak KM, Oddone E, Bastian LA, et al. Heart Matters: Gender and Racial Differences in Cardiovascular Disease Risk Factor Control Among Veterans. Women’s Health Issues. 2014;24(5):477–83.

21. Vimalananda VG, Biggs ML, Rosenzweig JL, Carnethon MR, Meigs JB, Thacker EL, et al. The influence of sex on cardiovascular outcomes associated with diabetes among older black and white adults. J Diabetes Complications. 2014;28(3):316–22.

22. Breland JY, Phibbs CS, Hoggatt KJ, Washington DL, Lee J, Haskell S, et al. The Obesity Epidemic in the Veterans Health Administration: Prevalence Among Key Populations of Women and Men Veterans. J Gen Intern Med. 2017;32(S1):11–7.

23. Sheahan KL, Goldstein KM, Than CT, Bean-Mayberry B, Chanfreau CC, Gerber MR, et al. Women Veterans’ Healthcare Needs, Utilization, and Preferences in Veterans Affairs Primary Care Settings. J Gen Intern Med. 2022;37(S3):791–8.

24. Hamilton AB, Finley EP, Bean-Mayberry B, Lang A, Haskell SG, Moin T, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER) 2.0 QUERI: study protocol for a cluster-randomized hybrid type 3 effectiveness-implementation trial. Implement Sci Commun. 2023;4(1):23.

25. Moin T, Damschroder LJ, AuYoung M, Maciejewski ML, Havens K, Ertl K, et al. Results From a Trial of an Online Diabetes Prevention Program Intervention. Am J Prev Med. 2018;55(5):583–91.

26. Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ. Implementation evaluation of the Telephone Lifestyle Coaching (TLC) program: organizational factors associated with successful implementation. Behav Med Pract Policy Res. 2017;7(2):233–41.

27. Zlotnick C, Tzilos G, Miller I, Seifer R, Stout R. Randomized controlled trial to prevent postpartum depression in mothers on public assistance. J Affect Disord. 2016;189:263–8.

28. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007;2(1):42. https://doi.org/10.1186/1748-5908-2-42

29. Rubenstein LV, Stockdale SE, Sapir N, Altman L, Dresselhaus T, Salem-Schatz S, et al. A Patient-Centered Primary Care Practice Approach Using Evidence-Based Quality Improvement: Rationale, Methods, and Early Assessment of Implementation. J Gen Intern Med. 2014;29(S2):589–97.

30. Hamilton AB, Farmer MM, Moin T, Finley EP, Lang AJ, Oishi SM, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER): a protocol for a program of research. Implement Sci. 2017;12(1). https://doi.org/10.1186/s13012-017-0658-9

31. Kligler B. Whole Health in the Veterans Health Administration. Glob Adv Health Med. 2022;11:2164957X2210772.

32. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17(1):75.

33. Yano EM, Darling JE, Hamilton AB, Canelo I, Chuang E, Meredith LS, et al. Cluster randomized trial of a multilevel evidence-based quality improvement approach to tailoring VA Patient Aligned Care Teams to the needs of women Veterans. Implement Sci. 2015;11(1):101.

34. Nevedal AL, Reardon CM, Opra Widerquist MA, Jackson GL, Cutrona SL, White BS, et al. Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implement Sci. 2021;16(1):67.

35. Kowalski C, Nevedal AL, Finley EP, Young J, Lewinski A, Midboe AM, et al. Raising expectations for rapid qualitative implementation efforts: guidelines to ensure rigor in rapid qualitative study design, conduct, and reporting. 16th Annual Conference on the Science of Dissemination and Implementation in Health; 2023 Dec 13; Washington, D.C.

36. Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. 2019;14(1):11.

37. Braun V, Clarke V. Thematic analysis. In: Cooper H, Camic PM, Long DL, Panter AT, Rindskopf D, Sher KJ, editors. APA handbook of research methods in psychology, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological. Washington, DC: American Psychological Association; 2012. p. 57–71.

38. Torrance H. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research. J Mix Methods Res. 2012;6(2):111–23.

39. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation? Qual Health Res. 2016;26(13):1802–11.

40. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(1):65. https://doi.org/10.1186/1748-5908-8-65

41. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16(1):36.

42. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.

43. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to Improve the Selection and Tailoring of Implementation Strategies. J Behav Health Serv Res. 2017;44(2):177–94.

44. Ware P, Ross HJ, Cafazzo JA, Laporte A, Gordon K, Seto E. User-Centered Adaptation of an Existing Heart Failure Telemonitoring Program to Ensure Sustainability and Scalability: Qualitative Study. JMIR Cardio. 2018;2(2):e11466.

45. Chen EK, Reid MC, Parker SJ, Pillemer K. Tailoring Evidence-Based Interventions for New Populations: A Method for Program Adaptation Through Community Engagement. Eval Health Prof. 2013;36(1):73–92.

46. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts—the ADAPT guidance. BMJ. 2021;374:n1679.

47. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1). https://doi.org/10.1186/s13012-015-0295-0

48. Fernandez ME, Ruiter RAC, Markham CM, Kok G. Intervention Mapping: Theory- and Evidence-Based Health Promotion Program Planning: Perspective and Examples. Front Public Health. 2019;7:209.

49. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1). https://doi.org/10.1186/s13012-015-0209-1

50. Connelly B, Gilmartin H, Hale A, Kenney R, Morgon B, Sjoberg H. The Relational Facilitation Guidebook. Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care; 2023 Feb. Available from: https://www.seattledenvercoin.research.va.gov/education/rc/docs/Relational_Facilitation_Guidebook.pdf

51. Bonawitz K, Wetmore M, Heisler M, Dalton VK, Damschroder LJ, Forman J, et al. Champions in context: which attributes matter for change efforts in healthcare? Implement Sci. 2020;15(1):62.

52. Demes JAE, Nickerson N, Farand L, Montekio VB, Torres P, Dube JG, et al. What are the characteristics of the champion that influence the implementation of quality improvement programs? Eval Program Plann. 2020;80:101795.

53. Flanagan ME, Plue L, Miller KK, Schmid AA, Myers L, Graham G, et al. A qualitative study of clinical champions in context: Clinical champions across three levels of acute care. SAGE Open Med. 2018;6:205031211879242.

54. Wood K, Giannopoulos V, Louie E, Baillie A, Uribe G, Lee KS, et al. The role of clinical champions in facilitating the use of evidence-based practice in drug and alcohol and mental health settings: A systematic review. Implement Res Pract. 2020;1:263348952095907.

55. Morse JM, Barrett M, Mayan M, Olson K, Spiers J. Verification strategies for establishing reliability and validity in qualitative research. Int J Qual Methods. 2002;1(2):13–22.

56. Abraham TH, Finley EP, Drummond KL, Haro EK, Hamilton AB, Townsend JC, et al. A Method for Developing Trustworthiness and Preserving Richness of Qualitative Data During Team-Based Analysis of Large Data Sets. Am J Eval. 2021;42(1):139–56.


Acknowledgements

All views expressed are those of the authors and do not represent the views of the US Government or the Department of Veterans Affairs. The authors would like to thank the EMPOWER QUERI 2.0 team, the VA Women’s Health Research Network (SDR 10-012), the participating Veteran Integrated Service Networks, and the women Veterans who inspire this work. Dr. Hamilton is supported by a VA HSR&D Research Career Scientist Award (RCS 21-135). Dr. Moin also receives support from the NIH/NIDDK (R01DK124503, R01DK127733, and R18DK122372), the Centers for Disease Control and Prevention (U18DP006535), the Patient-Centered Outcomes Research Institute (PCORI; SDM-2018C2-13543), the Department of Veterans Affairs (CSP NODES, CSP#2002), and UCLA/UCOP.

We would like to acknowledge funding from the VA Quality Enhancement Research Initiative (QUERI; QUE 20–028), the VA QUERI Rapid Qualitative Methods for Implementation Practice Hub (QIS 22–234), and VA Health Services Research & Development (Hamilton; RCS 21–135).

Author information

Authors and affiliations

Center for the Study of Healthcare Innovation, Implementation, and Policy (CSHIIP), VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA

Erin P. Finley, Joya G. Chrystal, Alicia R. Gable, Erica H. Fletcher, Agatha Palma, Ismelda Canelo, Rebecca S. Oberman, La Shawnta S. Jackson, Rachel Lesser, Tannaz Moin, Bevanne Bean-Mayberry, Melissa M. Farmer & Alison Hamilton

Joe R. & Teresa Lozano Long School of Medicine, The University of Texas Health Science Center at San Antonio, San Antonio, TX, USA

Erin P. Finley

David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, USA

Tannaz Moin, Bevanne Bean-Mayberry & Alison Hamilton


Contributions

The original Rapid Implementation Feedback (RIF) report format was developed by JC, AG, AH, and EPF, with feedback from EHF, AP, IC, RO, LSJ, RL, TM, BBM, and MF. The analysis for this manuscript was planned by EPF, AH, JC, and AG. Preliminary analysis was conducted by EPF, with refinement and verification of findings provided by all authors during member checking meetings. The first draft was written by EPF, JC, AG, EHF, and AH. All authors reviewed, edited, and approved the final manuscript.

Corresponding author

Correspondence to Erin P. Finley.

Ethics declarations

Ethics approval and consent to participate

This proposal was funded through VA’s Quality Enhancement Research Initiative (QUERI), which uses operational funds to support program improvement. QUERI projects are conducted as quality improvement for the purposes of program implementation and evaluation and are approved as such by the main VA operations partner, which was the VA Office of Patient Care Services for EMPOWER 2.0 (approval received 11/26/2019). All interview participants provided oral, recorded consent for participation.

Consent for publication

Not applicable.

Competing interests

Erin P. Finley and Alison Hamilton are on the editorial board for Implementation Science Communications.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Finley, E.P., Chrystal, J.G., Gable, A.R. et al. The Rapid Implementation Feedback (RIF) report: real-time synthesis of qualitative data for proactive implementation planning and tailoring. Implement Sci Commun 5, 69 (2024). https://doi.org/10.1186/s43058-024-00605-9


Received: 14 December 2023

Accepted: 09 June 2024

Published: 21 June 2024

DOI: https://doi.org/10.1186/s43058-024-00605-9


Keywords

  • Rapid qualitative methods
  • Implementation strategies
  • Implementation planning
  • Evidence-based practice

Implementation Science Communications

ISSN: 2662-2211



  • Open access
  • Published: 25 June 2024

Achieving research impact in medical research through collaboration across organizational boundaries: Insights from a mixed methods study in the Netherlands

  • Jacqueline C. F. van Oijen   ORCID: orcid.org/0000-0002-5100-0671 1 ,
  • Annemieke van Dongen-Leunis 1 ,
  • Jeroen Postma 1 ,
  • Thed van Leeuwen 2 &
  • Roland Bal 1  

Health Research Policy and Systems volume 22, Article number: 72 (2024)


Background

In the Netherlands, university medical centres (UMCs) bear primary responsibility for conducting medical research and delivering highly specialized care. The TopCare program was a policy experiment lasting 4 years in which three non-academic hospitals received funding from the Dutch Ministry of Health to also conduct medical research and deliver highly specialized care in specific domains. This study investigates research collaboration outcomes for all Dutch UMCs and non-academic hospitals in general and, more specifically, for the domains in the non-academic hospitals participating in the TopCare program. Additionally, it explores the organizational boundary work employed by these hospitals to foster productive research collaborations.

Methods

A mixed methods research design was employed, combining quantitative bibliometric analysis of publications and citations across all Dutch UMCs and non-academic hospitals and the TopCare domains, analysis of geographical distances, document analysis and ethnographic interviews with actors in the TopCare program.

Results

Quantitative analysis shows that, over the period of study, international collaboration increased among all hospitals while national collaboration and single institution research declined slightly. Collaborative efforts correlated with higher impact scores, and international collaboration scored higher than national collaboration. A total of 60% of all non-academic hospitals’ publications were produced in collaboration with UMCs, whereas almost 30% of the UMCs’ publications were the result of such collaboration. Non-academic hospitals showed a higher rate of collaboration with the UMC that was nearest geographically, whereas TopCare hospitals prioritized expertise over geographical proximity within their specialized domains. Boundary work mechanisms adopted by TopCare hospitals included aligning research activities with organizational mindset (identity), bolstering research infrastructure (competence) and finding and mobilizing strategic partnerships with academic partners (power). These efforts aimed to establish credibility and attractiveness as collaboration partners.

Conclusions

Research collaboration between non-academic hospitals and UMCs, particularly where this also involves international collaboration, pays off in terms of publications and impact. The TopCare hospitals used the program’s resources to perform boundary work aimed at becoming an attractive and credible collaboration partner for academia. Local factors such as research history, strategic domain focus, in-house expertise, patient flows, infrastructure and network relationships influenced collaboration dynamics within TopCare hospitals and between them and UMCs.


Introduction

Research collaboration has taken flight worldwide in recent decades [ 1 ], as reflected by the growing number of authors listed on research papers [ 2 , 3 ]. Collaborative research has become the norm for many, if not most, scientific disciplines [ 4 , 5 , 6 , 7 , 8 ]. Several studies have found a positive relationship between collaboration and output [ 9 , 10 , 11 , 12 , 13 ]. Publications resulting from research collaborations tend to be cited more frequently [ 14 , 15 , 16 , 17 , 18 ] and to be of higher research quality [ 5 , 14 , 19 , 20 ]. In particular, international collaboration can lead to more citations [ 17 , 21 , 22 , 23 , 24 ], although there are major differences internationally and between fields [ 25 ]. Moreover, international collaboration is often set as an eligibility requirement for European research grants, which have become necessary as national-level resources dwindle. Funding consortia also encourage and require boundary crossings, such as research collaborations between academia and societal partners. Collaboration within public research organizations and universities further plays a crucial role in the international dissemination of knowledge [ 26 ].

In the medical domain, initiatives have been rolled out in numerous countries to encourage long-term collaboration and the exchange of knowledge and research findings. Each initiative takes a strategic approach to assembling the processes needed to support these exchanges across the boundaries of stakeholder groups. In the Netherlands, medical research has traditionally been concentrated in public academia, especially the university medical centres (UMCs). Increasingly, however, research activities are being undertaken in non-academic teaching hospitals (hereafter, non-academic hospitals), driven by their changing patterns of patient influx. In 2013, a Dutch study based on citation analysis showed that collaboration between UMCs and non-academic hospitals leads to high-quality research [ 27 ]. There was further encouragement for medical research in Dutch non-academic hospitals in 2014, when a 4-year policy experiment, the TopCare program, was launched, with three such hospitals receiving additional funding from the Ministry of Health to also provide highly specialized care and undertake medical research. Funding for this combination of care and research is available for UMCs under the budgetary “academic component” of the Dutch healthcare system. Such additional funds are not available for non-academic hospitals, nor can they allocate their regular budgets to research. In the past, these hospitals managed to conduct research and provide specialized care through their own financial and time investments, or by securing occasional external research funding. The TopCare policy experiment was thus meant to find new ways of organizing and funding highly specialized care and medical research in non-academic hospitals.

Despite the increasing emphasis on research collaboration, we still know little about its impact and how it can be achieved. This study integrates two sides of research collaboration in Dutch hospitals and combines elements of quantitative and qualitative research for a broad (output and impact) and deep (boundary work to achieve collaboration) understanding of the phenomenon. We define research collaboration as collaboration between two or more organizations (at least one being a UMC or non-academic hospital) that has resulted in a co-authored (joint) scientific publication [ 28 ]. The research questions are: How high is the level of collaboration in the Dutch medical research field, what is the impact of collaboration, and how are productive research collaborations achieved?

To answer these questions, we performed mixed methods research in UMCs and non-academic hospitals. To examine the impact of various collaboration models – namely, single institution, national and international – across all eight Dutch UMCs and 28 non-academic hospitals between 2009 and 2018/2019, we conducted a bibliometric analysis of publications and citations. We additionally carried out a similar analysis for the TopCare non-academic hospitals between 2010 and 2016 to examine the effects of collaboration in the two domains funded through the program at each hospital. The latter timeframe was chosen to match the duration of the program, 2014–2018. We further conducted an in-depth qualitative analysis of the organizational boundary work done by two non-academic hospitals participating in the TopCare program to initiate and enhance productive research collaborations around specialized research and care within and between hospitals on a national level. Historically, such endeavours have been predominantly reserved for UMCs. The program was therefore a unique opportunity to examine such boundary work.

Background and theory

The landscape of medical research in the Netherlands

Collaboration in medical research

The Netherlands has a three-tiered hospital system: general hospitals (including non-academic hospitals), specialized hospitals focusing on a specific medical field or patient population, and UMCs. Nowadays, there are 7 UMCs, 17 specialized hospitals and 58 general hospitals, of which 26 are non-academic [ 29 ].

UMCs receive special funding (the budgetary “academic component”) for research and oversee medical training programs in their region. Non-academic hospitals do not receive structural government funding for medical research and have less chance of obtaining other funding because they are not formally acknowledged as knowledge-producing organizations. Research has less priority in most of these hospitals than in UMCs. On the introduction of government policies regarding competition in healthcare and the development of quality guidelines emphasizing high-volume treatments, some non-academic hospitals began focusing on specific disease areas, in a bid to distinguish themselves from other hospitals and to perform research in and hence develop more knowledge about these priority areas. This led to a greater concentration of highly specialized care [ 30 ]. Non-academic hospitals have also become important partners in medical research for UMCs due to their large patient volumes.

The TopCare program

To further stimulate research in non-academic hospitals, the Ministry of Health awarded three such hospitals €28.8 million in funding over a 4-year period (2014–2018) to support medical research and specialized care for which they do not normally receive funding [ 31 ]. It should be noted that, in non-academic hospitals, the concept of highly specialized research and care applies not to the entire hospital but rather to specific departments or disease areas. This is why the TopCare non-academic hospitals have been evaluated on the basis of specific domains. The funding recipients were two non-academic hospitals and one specialized hospital. In this article, we focus on UMCs and general non-academic hospitals and therefore excluded the specialized hospital from our analysis. Hospital #1 is the largest non-academic hospital in the Netherlands (1100 beds), even larger than some UMCs. Its fields of excellence (known as “domains”) are lung and heart care. Hospital #2 is a large non-academic hospital (950 beds) that focuses on emergency care and neurology. According to the two hospitals, these four highly specialized care and research-intensive domains are comparable to high-complexity care and research in UMCs [ 31 ].

The TopCare program ran through ZonMw, the Netherlands Organization for Health Research and Development, the main funding body for health research in the Netherlands. ZonMw established a committee to assess the research proposals and complex care initiatives of the participating hospitals and to set several criteria for funding eligibility. One requirement was that participating hospitals had to collaborate with universities or UMCs on research projects and were not allowed to conduct basic research in the context of the program, as this was seen as the special province of UMCs.

Boundary work

In the qualitative part of this study, we analyse the boundary work done by actors to influence organizational boundaries as well as the practices undertaken to initiate or enhance collaboration between TopCare non-academic hospitals and academia (universities and UMCs). We refer to boundary work when actors create, shape or disrupt organizational boundaries [ 32 , 33 , 34 , 35 ]. In particular, boundary work involves opening a boundary for collaboration and creating linkages with external partners [ 36 ]. In this article, we use three organizational boundary concepts – “identity”, “competence” and “power” – out of four presented by Santos and Eisenhardt. These concepts are concerned with fostering collaboration, whereas the fourth is concerned with “efficiency” and is less relevant here. Identity involves creating a reputation for research to become an attractive partner while preserving identity. Competence involves creating opportunities for research, for example, in manpower and infrastructure. Finally, power involves creating a negotiating position vis-à-vis relevant others [ 35 ].

This study draws on several types of data and analysis: (1) quantitative bibliometric data on the publications and citations of all eight Dutch UMCs and 28 non-academic hospitals, and (2) quantitative bibliometric data on the publications and citations in the four domains of two TopCare non-academic hospitals, together with qualitative (policy) document analysis and in-depth ethnographic interviews with various actors in the Dutch TopCare program. The quantitative data collected from Dutch UMCs and non-academic hospitals were utilized to contextualize data gathered within the TopCare program. We discuss and explain the data collection and methodology in detail in the two sections below.

Quantitative approach: bibliometric analysis of all 8 Dutch UMCs and 28 non-academic hospitals

Data collection

We performed a bibliometric analysis of the publications of 28 non-academic hospitals and 8 UMCs in the Netherlands between 2009 and 2018. Data for the study were derived from the Center for Science and Technology Studies’ (CWTS) in-house version of the Web of Science (WoS) database. The year 2009 was chosen because the address affiliations in publications are more accurately defined from this year onward. To examine trends over time, we divided the period 2009–2018/2019 into two blocks of 4 years and an additional year for citation impact measurement (2009–2012/2013 and 2014–2017/2018; see explanation in Appendix 1).

Methodology

The bibliometric analysis includes several bibliometric indicators that describe both the output and impact of the relevant research (Table  5 in Appendix 1). One of the indicators, the mean normalized citation score (MNCS), reveals the average impact of a hospital’s publications compared with the average score of all other publications in that area of research. If the MNCS is higher than 1, then on average, the output of that hospital’s domain is cited more often than an “average” publication in that research area.
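To make the field normalization concrete, the following is a minimal sketch of how an MNCS-style score could be computed. The data structures and the field/year bucketing are illustrative assumptions, not the CWTS implementation.

```python
from collections import defaultdict
from statistics import mean

def mncs(publications, reference_set):
    """Mean normalized citation score: each publication's citation count is
    divided by the average citations of all publications in the same field
    and year; the MNCS is the mean of those ratios (1.0 = world average)."""
    # Baseline: average citations per (field, year) across the reference set.
    buckets = defaultdict(list)
    for pub in reference_set:
        buckets[(pub["field"], pub["year"])].append(pub["citations"])
    baseline = {key: mean(counts) for key, counts in buckets.items()}

    ratios = [
        pub["citations"] / baseline[(pub["field"], pub["year"])]
        for pub in publications
    ]
    return mean(ratios)

# Toy reference set: cardiology averages 4 citations, neurology averages 2.
world = (
    [{"field": "cardiology", "year": 2015, "citations": c} for c in (2, 4, 6)]
    + [{"field": "neurology", "year": 2015, "citations": c} for c in (1, 3)]
)
hospital = [
    {"field": "cardiology", "year": 2015, "citations": 8},  # ratio 8/4 = 2.0
    {"field": "neurology", "year": 2015, "citations": 1},   # ratio 1/2 = 0.5
]
print(mncs(hospital, world))  # 1.25, i.e. cited above the world average
```

Because each ratio is normalized within its own field and year, a hospital publishing in a low-citation specialty is not penalized relative to one publishing in citation-dense fields.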

To map the ways hospitals cooperate, we follow two lines of analysis. The first is centred around a typology of scientific activities and differentiates between (i) a single institution (SI;  all publications with only one address) and (ii) international collaboration (IC; collaboration with at least one international partner). All other publications are grouped as (iii) national collaboration (NC; collaboration with Dutch organizations only).
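As a rough sketch, this typology can be expressed as a classification over a publication's author affiliations; the (institution, country) tuple representation and the example names are assumptions for illustration.

```python
def collaboration_type(affiliations):
    """Classify one publication from its set of (institution, country) pairs:
    SI = single institution, IC = at least one non-Dutch partner,
    NC = multiple Dutch organizations only."""
    institutions = {inst for inst, _ in affiliations}
    if len(institutions) == 1:
        return "SI"
    if any(country != "NL" for _, country in affiliations):
        return "IC"
    return "NC"

# Illustrative examples (institution names are placeholders):
print(collaboration_type({("Hospital A", "NL")}))                      # SI
print(collaboration_type({("Hospital A", "NL"), ("UMC B", "NL")}))     # NC
print(collaboration_type({("Hospital A", "NL"), ("Clinic C", "US")}))  # IC
```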

The second line is centred around geographical distance and size of collaboration. The geographical distances between each non-academic hospital and each of the eight UMCs were measured in Google Maps. The size of collaboration was measured by counting the joint publications of each non-academic hospital and the eight UMCs. Subsequently, we assessed whether the non-academic hospitals also had the most joint publications with the nearest UMC.
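The check in this second line of analysis can be sketched as follows, for a single non-academic hospital; the distances and publication counts are hypothetical.

```python
def nearest_umc_is_top_partner(distances_km, joint_pubs):
    """Does the geographically nearest UMC also account for the most joint
    publications? Both arguments map UMC names to a value: road distance in
    km and the number of co-authored papers, respectively."""
    nearest = min(distances_km, key=distances_km.get)
    top_partner = max(joint_pubs, key=joint_pubs.get)
    return nearest == top_partner

dist = {"UMC A": 12.0, "UMC B": 45.0, "UMC C": 80.0}
pubs = {"UMC A": 140, "UMC B": 95, "UMC C": 30}
print(nearest_umc_is_top_partner(dist, pubs))  # True
```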

Quantitative and qualitative approach to the two TopCare hospitals and their four domains: the “TopCare program” case study

Quantitative approach

The quantitative approach to the TopCare program relies on a bibliometric analysis of publications within each hospital’s two domains: lung and heart care in TopCare non-academic hospital #1, and trauma and neurology in TopCare non-academic hospital #2. Our bibliometric analysis focused on publications within the four selected TopCare domains between 2010 and 2016, following the same methodology described in the previous section under ‘Data collection’. Each domain provided an overview of its publications. The number of publications produced by the two domains at each TopCare hospital is combined in the results. Although this timeframe differs from the broader analysis of all UMCs and non-academic hospitals, comparing these two periods offers insights into the “representative position” of the two domains of each non-academic hospital participating in the TopCare program, in terms of publications and citations.

Qualitative approach

We took a qualitative approach to analysing the collaborative activities in the two TopCare non-academic hospitals, where each domain has its own leadership arrangements, regional demographic priorities and history of research collaboration [cf. 37 ]. This part of the study consisted of interviews and document analysis.

Ethnographic interviews

Over the course of the 4-year program, J.P. and/or R.B. conducted and recorded 90 semi-structured interviews that were then transcribed. For this study, we used repeated in-depth ethnographic interviews with the main actors in the Dutch TopCare program, which took place between 2014 and 2018. We conducted a total of 27 interviews; 20 of the interviews were with a single person, 5 with two persons, and 2 with three persons. The interviews were held with 20 different respondents; 12 respondents were interviewed multiple times. Table 1 presents the different respondents in non-academic hospitals #1 and #2.

Document analysis

Desk research was performed for documents related to the TopCare program (Table  6 – details of document analysis in Appendix 1).

Data analysis

The bibliometric analysis of the four domains in the two TopCare non-academic hospitals follows the same methodology as described in Abramo et al. [ 1 ].

We tested the assumption that joint publications are most frequent between a non-academic hospital and its nearest UMC. We describe the geographical distance between a TopCare non-academic hospital and a collaborating academic partner as “nearby” when both are located in the same region.

The ethnographic interviews were audio-recorded and transcribed in full with the respondents’ permission. These transcripts were subject to close reading and coding by two authors, J.P. and J.O., to identify key themes derived from the theory [ 35 ] (Table  7 in the Appendix). These were then discussed and debated with the wider research team with the goal of developing a critical interpretation of the boundary work done to initiate or enhance research collaboration [cf. 37 ]. The processed interview data were submitted to the respondents for member check. All respondents gave permission to use the data for this study, including the specific quotes. In the Netherlands, this research requires no ethical approval.

Triangulating the results of the document analysis and the interviews enabled us to identify several overarching themes within each boundary concept (identity, competence and power). These themes were utilized as a framework for structuring individual paragraphs, which we explain in greater detail in Table 4 in the Results.

Bibliometric analysis of all Dutch UMCs and non-academic hospitals

This section reports the results of the quantitative bibliometric analysis of the output, trends and impact of collaboration between all UMCs and non-academic hospitals from 2009 to 2018/2019. It provides a broad picture of the output – in terms of research publications – of both existing and ongoing collaborations between all UMCs and non-academic hospitals within the specified timeframe. It furthermore describes the analysis results concerning the relationship between collaboration and the geographical distance between two collaborating hospitals.

Output: distribution of the types of collaboration for UMCs and non-academic hospitals from 2009 to 2018/2019

The first step in understanding the degree of collaboration between hospitals is to measure the research output by number of publications. The total number of publications between 2009 and 2018 is shown in Table  8 ( Appendix 1) and Fig.  1 .

Fig. 1: Types of collaboration for UMCs and non-academic hospitals from 2009 to 2018/2019 (# = total number of publications; percentages show the share accounted for by single institution, national collaboration and international collaboration).

The majority of these publications (89%) are affiliated with UMCs. UMCs, in particular, tend to have a relatively higher proportion of single-institution publications and are more engaged in international collaboration. This pattern may be indicative of UMCs’ enhanced access to research grants and EU subsidies, as well as their active involvement in international consortia.

Collaboration between UMCs and non-academic hospitals accounts for a far larger share of output for non-academic hospitals than for UMCs: 70% of all publications originating from a non-academic hospital were the result of joint efforts between a UMC and a non-academic hospital, whereas only 8% of all UMC publications were produced in collaboration with a non-academic hospital (Table 8 in Appendix 1).

Trend analysis of collaboration in relative number of publications

Table 9 (Appendix 1) and Fig. 2 show the relative number of publications of all 8 UMCs and all 28 non-academic hospitals in the two periods: 2009–2012/2013 and 2014–2017/2018. For both UMCs and non-academic hospitals, international collaboration accounted for a relatively larger share of publications in recent years.

Fig. 2: Type of research collaboration for UMCs and non-academic hospitals over time (percentages show the share accounted for by single institution, national collaboration and international collaboration in each period).

Analysis of relationship between distance and collaboration

As the non-academic hospitals often collaborate with UMCs, it is interesting to analyse these collaborations geographically (distance). The assumption is that geographical proximity matters, with the most-frequent joint publications being between a non-academic hospital and the nearest UMC.

Figure 3 shows that 61% (17 out of 28) of the non-academic hospitals collaborate most frequently with their nearest UMC. Geographical proximity is thus an important, but not the only, determining factor in collaboration.

Fig. 3: Collaboration with nearest UMC from 2009 to 2018.

Impact of collaboration on bibliometric output of UMCs and non-academic hospitals

The mean normalized citation scores (MNCS) shown in Table  2 cover all 8 UMCs and 28 non-academic hospitals.

The MNCS in Table  2 and the mean normalized journal scores (MNJS) in Table  10 (Appendix 1) show similar patterns. The impact score for both UMCs and non-academic hospitals is greatest for international collaboration. Non-academic hospitals’ single-institution publications score lower than the global average, which was defined as 1.

In sum, the quantitative analysis exposes two trends. The first is growth in international collaboration for all UMCs and non-academic hospitals over time; the analysis also shows that collaboration is associated with higher MNCS impact scores. Second, geographical proximity between UMCs and non-academic hospitals is an important but not the only determining factor in collaboration. This is the context in which the TopCare program operated in 2014–2018.

“TopCare program” case study

This section presents the results of our analysis of the collaboration networks of the two TopCare non-academic hospitals, consisting of: (1) quantitative bibliometric analysis of the output and impact of these networks between 2010 and 2016, along with the geographical distance to their academic partners, and (2) qualitative ethnographic interviews to identify the boundary work conducted by these hospitals.

Bibliometric analysis of the two TopCare non-academic hospitals’ international and national collaboration networks across four domains

The results of the bibliometric analysis indicate the representative positions of the two domains within each TopCare non-academic hospital. Between 2010 and 2016, these hospitals generated a higher number of single-institution publications compared with the average of all non-academic hospitals. Percentage-wise, their output resembled that of the UMCs, underscoring their leading positions in their respective domains. The percentage of publications based on national collaboration in the domains of TopCare hospital #2 is comparable to that of non-academic hospitals overall, while there is more international collaboration in the domains of TopCare hospital #1 than at non-academic hospitals overall (Fig.  4 , Appendix 1 and Fig.  1 ). The impact of the research is above the global average, and the publications have a higher average impact when there is collaboration with international partners; this is true across all four domains (Table  11 in Appendix 1).

In terms of geographical distance, only the neurology domain of TopCare hospital #2 collaborates with an academic partner within the same region. All other domains collaborate with partners outside the region, a striking difference from the geographical results shown in Fig.  3 .

Ethnographic analysis

This section reviews the results of our ethnographic analysis of the two TopCare hospitals from 2014 to 2018. To analyse the boundary work these hospitals performed to initiate and/or enhance productive research collaborations, we use the framework suggested by Santos and Eisenhardt (2005) for examining organizational boundary work through the concepts of identity, competence and power. Table 3 provides a description of each boundary and how these concepts are defined in our case study on the basis of the overarching themes in the document analysis and the interviews.

Identity: enhancing hospitals’ value proposition

In the TopCare program, the non-academic hospitals used their unique history and expertise to create a joint research focus in a domain and to enhance their positions and influence their collaboration with UMCs and universities.

A manager in hospital #1’s lung domain explained the work being done from a historical perspective, emphasizing not only the innovative history of the hospital but also its central position in patient care:

The first-ever lung lavage, lung transplant and angioplasty were performed in this hospital. Nationally, this hospital has always, and we’re talking about 50–60 years ago now, been at the forefront, and has always invested in this line of research and care. So that is truly institutionally built, there is just that history and you can’t just copy that. And we have the numbers: for interstitial lung diseases, we have 2000 patients in our practice and receive 600 new patients per year. (interview with manager at hospital #1 in 2018).

To explain why patient care and research into rare interstitial lung diseases is centred in hospital #1 as a strategic domain focus, a leading international pulmonary physician – a “boundary spanner” (see below) – pointed to the importance of building team expertise and creating facilities:

I lead that care program for interstitial lung diseases and preside over the related research. I’ve often been asked: you’re a professor, so why don’t you go to a UMC, couldn’t you do much more there? But the care was developed here [in this hospital]. The expertise needed to recognize interstitial lung diseases depends not only on me but also on the radiologist and pathologist; together we have a team that can do this. We have created facilities that no other hospital has for these diseases. If I leave to do the same work in a UMC, I’d have to start over and I’d be going back 30 years. (interview with pulmonary physician at hospital #1 in 2014).

The doctors working in this hospital’s lung and heart domains finance the working hours they put into research themselves. “This fits in with the spirit of a top clinical hospital and the entrepreneurial character of our hospital.” (interview with project leader at hospital #1 in 2018).

Hospital #2, the result of a merger in 2016, struggled to find its strategic focus. A surgical oncologist at this hospital clarified one of the disadvantages of the merger: “People are [still] busy dealing with the money and positions, and the gaze is turned inward, the primary processes. So clinical research is very low on the agenda.” She added that a small project team acting on behalf of the hospital’s board of directors (BoD) was seeking the best-fit profile for the program, a search that had raised some opposition in departments excluded from the chosen strategic focus. Ultimately, the hospital began to showcase its highly specialized care in the field of neurosurgical treatments: it had a long history in this field and had been the first to use a Gamma Knife device for treating brain tumours. The experts in this domain could thus act as authorities, and they became a national centre of expertise. Their strategic partner was a nearby UMC, and they treated relevant patients from other hospitals in their region.

To generate impact, a domain’s research priorities must be aligned with the focus of the hospital. A member of the BoD of hospital #2 stressed the urgency of “specializing or focusing on a particular area of care” and emphasized that the TopCare budget was being utilized to create a joint focus within a domain. The resulting collective identity mobilized support internally and was recognized as valuable by third parties. An important reason for both hospitals to join the TopCare program was to be able to position themselves strategically as attractive and credible research partners:

The focus is on the domains of neurology and trauma because we think as a non-academic hospital we have something extra to offer: the very close relationship between patient care and research, because we have a larger number of patients of this type here than the universities. (interview with care manager at hospital #2 in 2013).

In short, the boundary of identity requires a closer alignment between these hospitals’ research activities and their strategic objectives and organizational mindset, and demands that they also showcase their staff’s expertise. The TopCare program offered opportunities to transform and consolidate their identity by enhancing their value proposition, that is, their unique history, strategic domain focus, expertise and number of patients.

Competence: Enhancing research infrastructures

All domains in the TopCare program chose to utilize the TopCare funding to invest in their research infrastructure, and to build research networks to share and learn. A research infrastructure consists of all the organizational, human, material and technological facilities needed for specialist care and research [ 31 ].

The TopCare data show that funding is essential for generating research impact. A manager at hospital #1 described its current financial circumstances:

A lot of research and much of the care is currently not funded, it is actually paid for mostly by the hospital... We have had massive budgetary adjustments the past two or three years. ...It is increasingly difficult to finance these kinds of activities within your own operation. (interview with manager at hospital #1 in 2018).

The TopCare funding was used to enhance the material infrastructure in hospital #1’s heart domain:

A number of things in healthcare are really terribly expensive, and there is simply no financing at all for them. …Cardiac devices, for example. We are constantly trying things out, but there’s no compensation for it. (interview with project leader at hospital #1 in 2018).

Hospital #1 had a long-standing and firm relationship with a UMC in the lung domain, giving it a solid material infrastructure. For example, there were spaces where researchers, especially PhD students, could meet, collaborate and share knowledge [ 31 ]. Another essential part of the material infrastructure for the lung domain was the biobank, as highlighted by a leading international pulmonary physician:

Our board of directors made funds available through the innovation fund to start up a biobank, but developing it and keeping it afloat has now been made possible thanks to the TopCare funding. It’s a gift from heaven! It will allow for further expansion and we can now seek international cooperation. (interview with pulmonary physician at hospital #1 in 2014).

Notably, the program allowed both non-academic hospitals to digitize their infrastructure, for example, with clinical registration and data management systems. According to an orthopaedic surgeon at hospital #2, “Logistics have been created, which can very easily be applied to other areas. By purchasing a data system, everyone can record data in a similar way.”

Besides data infrastructure, the human dimension was another crucial factor in the research infrastructure. Instead of working on research “at night”, it became embedded in physicians’ working hours. All domains indicated the importance of having researchers, statisticians and data management expertise available to ensure and enhance the quality of research, and both hospitals invested in research staffing.

After losing many research-minded traumatologists to academia, hospital #2 decided to invest in dedicated researchers to form an intermediate layer of full-time senior researchers linked to clinicians within the two domains.

I personally think this is the most important layer in a hospital, with both a doctor and a senior researcher supervising students and PhD candidates. Clinicians ask practical questions and researchers ask a lot of theoretical questions. Both perspectives are needed to change practices. I have also learned that it takes a few years before the two can understand each other’s language. (interview with neurosurgeon at hospital #2 in 2018).

Competence: Finding alignments within hospitals and research networks

The program offered the hospitals opportunities to structure internal forms of collaboration and build a knowledge base within a domain. For example, hospital #1 organized educational sessions with all PhD students in the heart domain.

Having more researchers working in our hospital has given the whole research culture a boost, as well as the fact that they are producing more publications and dissertations. (interview with cardiologist at hospital #1 in 2018).

Hospital #2 also encouraged cross-domain learning by organizing meetings between the neurology and trauma domains.

You know, you may not be able to do much together content-wise, but you can learn a lot from each other in terms of the obstacles you face (interview with project manager at hospital #2 in 2016).

At the beginning there was resistance to participating in the program.

It was doom and gloom; without more support, groups refused to join. That kind of discussion. So the financial details have been important in terms of willingness to participate. (interview with surgical oncologist at hospital #2 in 2018).

Another obstacle was local approval for multicentre studies, which led to considerable delay (interview with psychologist at hospital #2 in 2018). Overall, the TopCare program created a flywheel effect for other domains that proved essential for internal collaborations (interview with surgical oncologist at hospital #2 in 2018).

In hospital #1, collaboration between the heart and lung domains grew closer.

Divisions between the different disciplines are much less pronounced in our hospital than in UMCs. So it’s much easier to work together. We’d already collaborated closely on lung diseases, and this has improved during the program. (interview with cardiologist at hospital #1 in 2016)

At the network level, the TopCare data show that most researchers participated in national networks. For example, the neurology domain in hospital #2 had established a network of 16 non-academic hospitals. Limited funding prevented researchers at non-academic hospitals from attending many international seminars, and they had more trouble building their international networks. One exception concerned the researchers in the lung domain of hospital #1, who expanded their international network by organizing an international seminar during the TopCare program and by contributing to other national and international seminars.

Each TopCare domain provided highly specialized care and wanted to become a centre of expertise. However, a hospital can only provide highly specialized care if research is conducted to determine the best treatment strategies. The data show how the two are interwoven.

For example, a PhD student has sought to collaborate with a UMC on a specific aorta subject in which we have greater expertise and more volume in terms of patients than UMCs. Based on this link with this UMC, a different policy was drawn up and also implemented immediately in all kinds of other UMCs. (interview with cardiologist at hospital #1 in 2018).

Often, a leading scientist who is the driving force behind a domain in a hospital is a “boundary spanner”, a person in a unique position to bridge organizational boundaries and foster research collaboration by “enabling exchange between production and use of knowledge” [ 40 , p. 1176], [ 41 ]. For example, the leading pulmonary physician in hospital #1 is a boundary spanner who has done a huge amount of work to enhance collaboration. With interstitial lung disease care being concentrated in this hospital, this professor can offer fellowships and stimulate virtual knowledge-sharing by video conferencing for “second-opinion” consultations. The TopCare funding was used to finance this. The network operates successfully at the non-academic level.

These consultations are with colleagues in other hospitals and they avoid patients having to be referred. (interview with project leader at hospital #1 in 2018).

Our network now [in 2018] consists of more than 14 hospitals, which we call every week to discuss patients with an interstitial lung disease. …UMCs participate indirectly in this network. For example, the north has a specific centre for this disease in a non-academic hospital and a nearby UMC refers patients to this centre, who are then discussed in our network. (interview with pulmonary physician at hospital #1 in 2018).

This physician also noted that the network was still growing; other colleagues from non-academic hospitals wanted to join it.

Yesterday, colleagues from XX and XX were here. And they all said, “I’ve never learned so much about interstitial lung diseases.” We’re imparting enormous amounts of expertise. (interview with pulmonary physician at hospital #1 in 2018).

In sum, focusing on the boundary of competence, the TopCare hospitals created and mobilized resources to invest in their research infrastructure. In every domain, this infrastructure was used to strengthen the relationship between research, care and education, and to build and enhance internal and external research networks to share and learn.

Power: Enhancing the relationship with or finding and mobilizing strategic academic partners

For TopCare non-academic hospitals, the boundary of power is concerned with creating the right sphere of influence, meaning BoDs and administrators attempt to find and mobilize new strategic partners and build mutual relationships with various stakeholders at different levels.

A project leader at hospital #2 emphasized that the additional resources of the TopCare program created an opportunity for the non-academic hospitals “to show our collaborative partners that we’re a valuable partner.” For once, the tables were turned:

We’ve always had a good relationship with one UMC; they always used the data from our surgeries. But it’s nice that we can finally ask them whether they want to join us. That makes it a little more equal, and we can be a clinical partner. (interview with neurosurgeon at hospital #2 in 2018).

One of the requirements in each domain when applying to ZonMw for funding was alignment with academia in a research and innovation network. Collaboration often appeared more difficult at the administrative level when the academic partners worked in the same field of expertise, and tended to be more successful when the partners focused on different fields, where their interests did not conflict. According to a board member at hospital #2 who played a crucial role in a partnership agreement, a conscious decision was taken beforehand to seek partners beyond the medical domain as well.

There may be conflict with other groups within the walls of a UMC and I don’t see that as promising. You have to work together and we aren’t in a real position to do so. (interview with board member at hospital #2 in 2018).

Just before the end of the program, it was announced that this hospital had concluded a partnership agreement with a university to broaden their joint research program alongside neurology and trauma. An important prerequisite was that both organizations invest 1 million euros in the partnership. The board member revealed that the relationship with this university had in fact existed for some time:

So we went and talked to the university and they became interested. Then the top level was reorganized and replaced and we had to start from scratch again. That took a lot of time. Our goals were to awaken the enthusiasm of the board and at least three deans, otherwise it would be a very isolated matter. And we succeeded. Last week we had a matchmaking meeting at the university and there were about 50 pitches showing how we could be of value to each other. (interview with board member at hospital #2 in 2018)

Looking back, he defined the conditions for a successful collaboration with academia:

In terms of substance, the two sides have to be going in the same direction and complement each other, for example, in expertise, techniques, and/or facilities. And what is really important is that people know each other and are willing to meet each other…and there must be appreciation. (interview with board member at hospital #2 in 2018).

The trauma domain in hospital #2 wanted to become a trauma research centre in its region, and after investing in its research infrastructure, it found a new strategic academic partner:

We have also found new partners, for example, the Social Health Care Department of a UMC [name]. And that really has become a strong partnership; the intent was there for years, but we had no money. (interview with epidemiologist at hospital #2 in 2018).

The neurology domain at this hospital worked to form a network with a university of technology and a university social science department.

Officially, our hospital can’t serve as a co-applicant for funding and that is frustrating. However, I am pleased to show that we are contributing to innovation. (interview with neurosurgeon at hospital #2 in 2018).

A board member at this hospital reflected on the qualities needed for research and concluded: “The neuro group has more of those intrinsic qualities than the trauma group. …I think the trauma group is actually at a crossroads and will think twice about whether they can attract capacity to develop the research side or fall back to a very basic level.”

In hospital #1, administrators rejected a proposal to collaborate with the nearest UMC submitted by medical specialists in the heart domain. Past conflicts and unsuccessful ventures still influenced the present, even though the individuals involved had already left.

A further factor was raised by a manager at hospital #1, who reflected on the importance of obtaining a professorship in the heart domain:

If we can, even on the basis of any kind of appointment, obtain a professorship from the heart centre, then yes, that helps! …I think it just helps throughout the whole operation, politically speaking, as extra confirmation, extra legitimization for that status. (interview with manager at hospital #1 in 2016).

Eventually, hospital #1 managed to find alignment with a UMC in another region during the program and a medical specialist from the hospital became a professor by special appointment.

This UMC showed the greatest determination, actually, while we could have chosen to collaborate with the nearest UMC [but we didn’t]. And there was actually also a real click between both the administrators and the specialists. (interview with manager at hospital #1 in 2018).

Additionally, the TopCare data show that, while there may be close alignment with the nearest UMC, collaboration is not limited to this and proximity can sometimes even be detrimental (for example, in some cases hospitals compete for patients). As research and care in the TopCare hospitals’ domains became more specialized, they required the specific expertise of UMCs in other regions.

One critical dependency in the collaboration between a university or UMC and a non-academic hospital is the distribution of dissertation premiums, valued at about €100,000 per successful PhD track. Currently, after completion of a dissertation, the premium goes entirely to the university or UMC, even when much of the candidate’s research and supervision takes place in a non-academic hospital [ 31 ]. This structural difference makes collaboration less financially valuable to non-academic hospitals. For example, the leading pulmonary physician in hospital #1 is a professor who is affiliated with both a UMC and a non-academic hospital, a boundary spanner who works across organizational boundaries, is successful in research, and bears responsibility for a significant proportion of the research output in the lung domain and in the collaboration with other organizations. Moreover, he does most of the PhD supervision, and his students do their work in hospital #1. Despite all this work, the dissertation premium goes to the UMC. Although efforts have been made to change this, certain institutional structures are so strongly embedded that it is difficult to open the organizational boundary.

Power: Aligning with the BoDs and administrators of the TopCare non-academic hospitals

During our research, we observed how the BoDs and administrators of the two TopCare hospitals discussed the progress of the program and worked together to learn from each other.

We can learn a lot from hospital #1 regarding the organization of our research, we think. That has been very inspiring. …On the other hand, the focus has been very centred on getting the domain and project requests funded at all. (interview with care manager at hospital #2 in 2013).

The BoDs opted for an approach aimed at building mutual trust and understanding. As a result, their alliance became more intensive during the program. By the time the program’s final report was released, both BoDs were leveraging their power to influence ZonMw’s next step: the follow-up to TopCare. They had a targeted plan for their lobbying. For example, after mutual coordination, the BoD of each hospital sent a letter to the Ministry of Health sketching their vision for the future.

In summary, for the TopCare hospitals, the boundary of power centred on finding alignment with strategic academic partners and the other BoDs and administrators in the TopCare program. Moreover, ties with strategic partners were important for extending the organization’s sphere of influence [ 33 ] in building and enhancing productive research collaborations. These hospitals recognized that they could not dismantle the existing structure of research funding, and they therefore committed themselves to trying to extend the TopCare program. Table 4 summarizes the opportunities and challenges within the three boundary concepts.

Discussion

In our study, we used a mixed methods research design to explore research collaborations by focusing on the research output and impact of UMCs and non-academic hospitals in the Netherlands and by zeroing in on the boundary work of two Dutch non-academic hospitals in achieving collaboration.

Our bibliometric analysis shows that collaboration matters, especially for non-academic hospitals. Access to research grants, EU funding and international collaborations is harder for non-academic hospitals, and they need to collaborate with UMCs to generate research impact, assessed by means of MNCS impact scores. Conversely, non-academic hospitals are important for UMCs because they have a larger volume of patients. When UMCs and non-academic hospitals collaborate, their impact scores are higher. Impact scores are, moreover, higher for international collaborative publications across all types of hospital and all periods. More in-depth research is needed into why collaboration increases impact.

Bibliometric analysis of the domains of the two TopCare non-academic hospitals underscores their leading role in these domains. Upon receiving TopCare funding, the hospitals had to engage in various forms of boundary work to meet the requirement mandated by ZonMw of establishing a research collaboration with academia. They used the additional program resources to invest [ 33 ] in opening a boundary for research collaboration with academic partners.

Identity work involves creating an image of the organizational unit that legitimizes its research and care status in line with the dominant mindset of the organization. In practice, the relevant unit needs to establish a distinctive history and domain focus that aligns with the organizational strategy of the hospital, in-house expertise and patient flow. This requires coordination work with the BoD. However, not all domains have been successful in creating such an identity. It proved much more difficult for the trauma domain, for example, because its research is less highly specialized and more fragmented than that of the other domains.

Competence work focuses on organizational (a well-functioning science support unit), technological (registration systems) and material (floor space or biobank) infrastructure, depending on individual requirements. Additionally, tremendous efforts go into the human dimension of infrastructure, as TopCare hospitals consider research staff and the availability of time for doctors to be important conditions for building structurally supportive research programs. In a previous study, we highlighted that collaboration between all non-academic hospitals within the Association of Top Clinical Teaching Hospitals (STZ) is essential for strengthening their research infrastructure [ 42 ], and can also be seen as a matter of efficiency [ 35 ]. Moreover, in each TopCare hospital, competence work served to bring domains together to facilitate shared learning. Knowledge-sharing across departments or communities is an example of opening boundaries to facilitate integration, convergence or enrichment of points of view [ 36 , 43 , 44 ].

Professors with double affiliations can act as boundary spanners. They play a significant role as experts in a domain by creating its distinctive character, and they surmount borders and break down barriers through their network relationships with other hospitals. Additionally, these persons are responsible for a significant share of the research output in their domain and conduct research with worldwide impact in collaboration with other organizations. Their boundary work must be recognized as essential because they bring usable knowledge to the table, create opportunities for improved relationships across disciplines, enhance communication between stakeholders and facilitate more productive research collaborations [cf. 45 ].

The TopCare hospitals do much less work in the power dimension because the domains in which they operate are adjacent to those of academia. Our study shows that more successful, productive research collaborations are created when the hospital’s academic partner works in a complementary but not identical field. Only in one case, the heart domain, did collaboration succeed in an identical field, but that was because the academic partner was located outside the hospital’s region and was therefore not a competitor. According to Joo et al., a potential partner’s suitability is determined not only by complementarity (their unique contribution to research collaboration in terms of expertise, skills, knowledge, contexts or resources) but also by compatibility and capacity. Partner compatibility involves alignment in vision, commitment, trust, culture, values, norms and working styles, which facilitate rapport-building and cross-institutional collaboration [ 46 ]. TopCare data indicate that research collaborations should be managed to ensure all partners can operate as equals [ 47 ]. Partner capacity refers to the ability to provide timely resources (for example, expertise, skills or knowledge) for projects, as well as leadership commitment, community engagement and institutional support for long-term, mission-driven goals, such as the joint research program in neurology and trauma at hospital #2 and a university.

These three qualitative criteria – partner compatibility, complementarity and capacity – are aspects of power dynamics that influence strategic decisions about recruiting research partners. Generally, power dynamics shape a hospital’s strategic choices regarding whether to collaborate, with whom to partner and the extent of the research collaboration [ 48 ]. Future research should examine these power dynamics in a more integrated manner to unlock the full potential of collaboration [ 46 ].

It was possible to unravel how non-academic hospitals participating in the TopCare program engaged in research collaborations with academia. As the program did not interfere with the existing care, research and financing structures within the UMCs, it allowed TopCare non-academic hospitals to also combine top clinical care and research. The boundary concepts allow us to observe a dual dynamic in the collaboration: the opening of boundaries while simultaneously maintaining certain limits. Opening boundaries refers to facilitating collaboration through activities related to identity and competence, while maintaining them involves the power balance. The temporary program did not disrupt the existing power balance associated with the budgetary “academic component” and the dissertation premiums that accrue to academia. Overall, then, the power dimension may well be the primary factor that made it impossible for the TopCare non-academic hospitals to attain their ultimate goal of securing a consistent form of funding for their research and top clinical care. Instead, the national authorities introduced a new, temporary funding program for non-academic hospitals, and preserved the status quo favouring academia.

A key finding is that, if a hospital is successful in establishing coherence between the different forms of boundary work, it can create productive research collaborations and generate research impact. The TopCare hospitals performed boundary work to strengthen their research infrastructure (competence) and their research status (identity) and create a favourable negotiating position opposite academia (power). For example, choosing the lung domain as the hospital’s strategic focus (identity) and establishing a database as a fundamental source of information for research by a boundary spanner (competence) generated sufficient power to make the hospital a key player in this field and a much-respected collaboration partner, nationally and internationally. However, some restrictions remained in place, such as the national lung research network consisting only of non-academic hospitals, with UMCs participating only indirectly.

Another key finding is that possessing a substantial budget is not in itself enough to ensure successful research collaboration. It is clear from this study that extensive boundary work is also needed to facilitate research collaboration. Given the absence of structural funding, the TopCare non-academic hospitals were under pressure to deliver results during the program, making research collaboration even more crucial for them than for the UMCs in this context. Additionally, because highly specialized care and research at the TopCare non-academic hospitals required unique expertise, they had a growing need for collaboration at the national level. Contrary to assumptions and the findings of our analysis of UMCs and non-academic hospitals overall, their collaborative partners were not predominantly located at the nearest UMC.

Does our study align with the literature and support the results of similar initiatives, such as the establishment of Collaborations for Leadership in Applied Health Research and Care (CLAHRC), a regional multi-agency research network of universities and local national health service (NHS) organizations focused on improving patient outcomes in England by conducting and utilizing applied health research [ 49 ]? And what does it contribute to previous research?

While differences exist between the NHS and the Dutch healthcare system, there are also noteworthy parallels that make a comparison possible: encouraging networks to boost research productivity, fostering collaboration within a competitive system, and funding research that is relevant to patient needs and public health priorities. Further parallels with the CLAHRC findings include creating strong local research infrastructures and local networks [ 49 ], and using influential and skilled boundary spanners [ 49 , 50 ]. In addition, we found that research history, strategic domain focus, in-house expertise, patient flows and network relationships pre-conditioned the TopCare hospitals’ collaboration with academia. Our results further show that, for non-academic hospitals seeking to create productive research collaborations, it is essential to work in complementary fields and to establish coherence between identity, competence and power.

Our findings indicate that, after opening a boundary with academia, the focus of the TopCare hospitals was on searching for mutual engagement. These hospitals tried to clarify their added value by creating boundaries to distinguish themselves from UMCs, and attempted to extend the TopCare program without it overlapping with the budgetary “academic component”, so that it posed no threat to the UMCs. Boundary-crossing involves a two-way interaction of mutual engagement and commitment to change in practices [ 51 ]. It is likely that the program did not last long enough to instigate changes in practices, as it can take time to develop mutual understanding and foster trusting relationships [ 52 ].

Based on the CLAHRC results and our research findings, the trend towards regionalization in the Netherlands [ 53 ] and a new leading and coordinating role for UMCs in this research landscape [ 52 , 54 ] can only be successful if boundary work is conducted, allowing research-minded non-academic hospitals to:

Build a “collaborative identity” [ 50 , 55 , 56 ] over time with their academic partners (identity);

Establish added value in their research infrastructures compared with that of their academic partners (competence);

Create solid networks for learning and sharing knowledge [ 55 , 57 ] with their academic partners (competence);

Mobilize boundary spanners to bridge disciplinary and professional boundaries in research, teaching and practice [ 49 , 50 , 55 , 58 ] and publish articles in collaboration with academic partners with high research impact (competence);

Find the inspiration and confidence to increase their co-dependence to, for example, gain benefits from interacting with different partners in the field [ 35 ] (power); and

Create long-term collaborations with academia across sectors over time, as well as within sectors; this requires iterative and continual engagement between clinicians, academics, managers, practitioners and patients (power) [ 49 , 52 ].

It is conceivable that the evaluation of the follow-up to the TopCare program, which extends to 2025, will shed light on these next steps.

Our results demonstrate that research collaboration is important and should be encouraged. However, the current methods used to assess researchers do not reflect this importance. Reward systems and metrics focus on the performance of individual researchers and may even discourage the development of medical research networks and collaboration [52, 59]. There is ongoing debate about, and rising criticism of, the dominance of scientific impact scores as a measure of the performance of health researchers and research organizations [60]. Other forms of impact, such as the societal impact of medical research, are becoming more important, and different metrics are being developed. Research collaboration among individuals and organizations should be incentivized and rewarded, and should be embedded in performance assessment and in the core competences of all actors involved [61]. New ways of rewarding research collaboration within organizations should therefore be explored.

Limitations

This study is limited, both geographically and institutionally, to the Netherlands, and factors other than national and international research collaborations may explain the increase in research output and impact. For example, the research articles in our sample have not been analysed on substantive aspects such as methodology and funding. A bias may therefore have been introduced. Furthermore, the research output and impact of the TopCare non-academic hospitals that we measured was limited to the 4-year program period. A further limitation was the use of these hospitals’ research output as a measure of the influence of the TopCare program, as we were interested not only in the short-term effects (publications) but also in the long-term ones (on the work conducted to build research infrastructures). Moreover, the focus in the qualitative material concerning the TopCare program was on the two TopCare non-academic hospitals and, more specifically, on their national rather than their international collaborations.

Conclusions

Research collaboration between non-academic hospitals and academia in the Netherlands pays off in terms of publications and impact. For the publication of scientific articles, collaboration between UMCs and non-academic hospitals appears to be more prevalent and impactful for non-academic hospitals than for UMCs. When UMCs and non-academic hospitals collaborate, their impact scores tend to be higher. More research is needed into why collaboration leads to more impact.

Non-academic hospitals showed a higher rate of collaboration with the nearest UMC, whereas collaborative partners of TopCare hospitals were not predominantly located at the nearest UMC. TopCare hospitals prioritized expertise over geographical proximity as a predictor of their collaborative efforts, particularly as research and care in their domains became more specialized.

Drawing on the additional resources of the TopCare program, participating non-academic hospitals invested significantly in boundary work to open boundaries for research collaboration with academic partners and, simultaneously, to create boundaries that distinguished them from UMCs. Identity work was performed to ensure that their history and domain focuses were coherent with the dominant mindset of their organization, while competence work was done to enhance their research infrastructure. The human dimension of the infrastructure received considerable attention: more research staff, time made available for doctors and recognition that boundary spanners facilitate research collaborations.

Power work to find and mobilize strategic academic partners was mostly focused on complementary fields, as non-academic hospitals work in domains adjacent to those of academia. The TopCare hospitals tended to avoid power conflicts, resulting in a preservation of the status quo favouring academia.

The local research history, strategic domain focus, in-house expertise, patient flows, infrastructure and network relationships of each TopCare hospital influenced collaboration with academia [cf. 37, 58]. Increased coherence between the different forms of boundary work led to productive research collaborations and generated research impact. To meet future requirements, such as regionalization, further boundary work is needed to create long-term collaborations and new ways of rewarding research collaboration within organizations.

Availability of data and materials

The datasets used and/or analysed during the study are available from the corresponding author upon reasonable request.

The names of the UMCs and non-academic hospitals and their numbers are not up to date due to mergers in the intervening period. The database contains data on eight UMCs; today there are seven, as two UMCs in Amsterdam merged in 2018. There are 28 non-academic hospitals in the database, whereas today 27 such hospitals are members of the Association of Top Clinical Teaching Hospitals ( https://www.stz.nl ). To ensure data consistency, the database remains unchanged.

Abbreviations

BoD: Board of directors

CWTS: Center for Science and Technology Studies

IC: International collaboration

MNCS: Mean normalized citation score

MNJS: Mean normalized journal score

NC: National collaboration

NFU: Netherlands Federation of University Medical Centers

SI: Single institution

STZ: Association of Top Clinical Teaching Hospitals

UMC: University medical centre
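The collaboration types and citation indicator abbreviated above can be made concrete with a small sketch. A publication is classed as single institution (SI), national collaboration (NC) or international collaboration (IC) from its author affiliations, and the mean normalized citation score (MNCS) divides each paper's citations by the average for its field. The function names and sample records below are illustrative assumptions, not data or code from the study.

```python
from statistics import mean

def collaboration_type(affiliations):
    """Classify a publication as SI, NC or IC from (institution, country) pairs."""
    institutions = {inst for inst, _ in affiliations}
    countries = {country for _, country in affiliations}
    if len(countries) > 1:
        return "IC"   # authors from more than one country
    if len(institutions) > 1:
        return "NC"   # one country, several institutions
    return "SI"       # a single institution

def mncs(papers):
    """Mean normalized citation score: citations relative to the field average."""
    return mean(p["citations"] / p["field_avg"] for p in papers)

# Illustrative records; institutions, citation counts and field averages are made up.
pubs = [
    {"affils": [("UMC Utrecht", "NL"), ("St. Antonius", "NL")], "citations": 12, "field_avg": 8.0},
    {"affils": [("Erasmus MC", "NL"), ("Oxford", "UK")], "citations": 30, "field_avg": 10.0},
    {"affils": [("ETZ", "NL")], "citations": 4, "field_avg": 8.0},
]
types = [collaboration_type(p["affils"]) for p in pubs]
print(types)       # ['NC', 'IC', 'SI']
print(mncs(pubs))  # (1.5 + 3.0 + 0.5) / 3 ≈ 1.667; > 1 means above the field average
```

An MNCS above 1 indicates citation impact above the worldwide field average, which is how the normalized scores reported for the collaboration types should be read.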

References

1. Abramo G, D’Angelo CA, Di Costa F. Research collaboration and productivity: is there correlation? High Educ. 2009. https://doi.org/10.1007/s10734-008-9139-z.

2. De Solla Price DJ. Little science, big science. New York: Columbia University Press; 1963.

3. Narin F, Carpenter MP. National publication and citation comparisons. JASIS&T. 1975. https://doi.org/10.1002/asi.4630260203.

4. Beaver D, Rosen R. Studies in scientific collaboration: part III – professionalization and the natural history of modern scientific co-authorship. Scientometrics. 1979. https://doi.org/10.1007/BF02016308.

5. Katz JS, Martin BR. What is research collaboration? Res Policy. 1997. https://doi.org/10.1016/S0048-7333(96)00917-1.

6. Clark BY, Llorens JJ. Investments in scientific research: examining the funding threshold effects on scientific collaboration and variation by academic discipline. PSJ. 2012. https://doi.org/10.1111/j.1541-0072.2012.00470.x.

7. Bozeman B, Fay D, Slade CP. Research collaboration in universities and academic entrepreneurship: the-state-of-the-art. J Technol Transf. 2013. https://doi.org/10.1007/s10961-012-9281-8.

8. Van Raan AF. Measuring science. In: Moed HF, Glänzel W, Schmoch U, editors. Handbook of quantitative science and technology research: the use of patent and publication statistics in studies of S&T systems. Dordrecht: Springer; 2004. p. 19–50. https://doi.org/10.1007/1-4020-2755-9_2.

9. Lotka AJ. The frequency distribution of scientific productivity. J Wash Acad Sci. 1926;16:317–23.

10. De Solla Price DJ, Beaver D. Collaboration in an invisible college. Am Psychol. 1966. https://doi.org/10.1037/h0024051.

11. Zuckerman H. Nobel laureates in science: patterns of productivity, collaboration, and authorship. Am Sociol Rev. 1967;32:391–403.

12. Morrison PS, Dobbie G, McDonald FJ. Research collaboration among university scientists. High Educ Res Dev. 2003. https://doi.org/10.1080/0729436032000145149.

13. Lee S, Bozeman B. The impact of research collaboration on scientific productivity. Soc Stud Sci. 2005. https://doi.org/10.1177/0306312705052359.

14. Beaver DB. Collaboration and teamwork in physics. Czechoslov J Phys B. 1986. https://doi.org/10.1007/BF01599717.

15. Acedo FJ, Barroso C, Casanueva C, Galán JL. Co-authorship in management and organizational studies: an empirical and network analysis. J Manag Stud. 2006. https://doi.org/10.1111/j.1467-6486.2006.00625.x.

16. Wuchty S, Jones BF, Uzzi B. The increasing dominance of teams in production of knowledge. Science. 2007. https://doi.org/10.1126/science.1136099.

17. Sooryamoorthy R. Do types of collaboration change citation? Collaboration and citation patterns of South African science publications. Scientometrics. 2009. https://doi.org/10.1007/s11192-009-2126-z.

18. Gazni A, Didegah F. Investigating different types of research collaboration and citation impact: a case study of Harvard University’s publications. Scientometrics. 2011. https://doi.org/10.1007/s11192-011-0343-8.

19. Landry R, Traore N, Godin B. An econometric analysis of the effect of collaboration on academic research productivity. High Educ. 1996. https://doi.org/10.1007/BF00138868.

20. Laband DN, Tollison RD. Intellectual collaboration. J Political Econ. 2000. https://doi.org/10.1086/262132.

21. Van Raan A. The influence of international collaboration on the impact of research results: some simple mathematical considerations concerning the role of self-citations. Scientometrics. 1998;42(3):423–8.

22. Glänzel W. National characteristics in international scientific co-authorship relations. Scientometrics. 2001. https://doi.org/10.1023/a:1010512628145.

23. Glänzel W, Schubert A. Analysing scientific networks through co-authorship. In: Moed HF, Glänzel W, Schmoch U, editors. Handbook of quantitative science and technology research: the use of patent and publication statistics in studies of S&T systems. Dordrecht: Kluwer; 2004. p. 257–76. https://doi.org/10.1007/1-4020-2755-9_20.

24. Didegah F, Thelwall M. Which factors help authors produce the highest impact research? Collaboration, journal and document properties. J Informetr. 2013. https://doi.org/10.1016/J.JOI.2013.08.006.

25. Thelwall M, Maflahi N. Academic collaboration rates and citation associations vary substantially between countries and fields. J Assoc Inf Sci Technol. 2020. https://doi.org/10.1002/asi.24315.

26. Archibugi D, Coco A. International partnerships for knowledge in business and academia: a comparison between Europe and the USA. Technovation. 2004. https://doi.org/10.1016/S0166-4972(03)00141-X.

27. Levi M, Sluiter HE, Van Leeuwen T, Rook M, Peeters G. Medisch-wetenschappelijk onderzoek in Nederland: Hoge kwaliteit door samenwerking UMC’s en opleidingsziekenhuizen. NTvG. 2013;157:A6081.

28. Abramo G, D’Angelo CA, Di Costa F. University-industry research collaboration: a model to assess university capability. High Educ. 2011. https://doi.org/10.48550/arXiv.1811.01763.

29. Centraal Bureau voor de Statistiek. Health care institutions; key figures, finance and personnel. https://www.cbs.nl/nl-nl/cijfers/detail/83652ENG. Accessed 6 Mar 2024.

30. Postma J, Zuiderent-Jerak T. Beyond volume indicators and centralization: toward a broad perspective on policy for improving quality of emergency care. Ann Emerg Med. 2017. https://doi.org/10.1016/j.annemergmed.2017.02.020.

31. Postma JP, Van Dongen-Leunis A, Van Hakkaart-van Roijen L, Bal RA. Evaluatie Topzorg. Een evaluatie van 4 jaar specialistische zorg en wetenschappelijk onderzoek in het St. Antonius Ziekenhuis, het Oogziekenhuis en het ETZ. Rotterdam: Erasmus School of Health Policy & Management; 2018.

32. Gieryn TF. Boundary-work and the demarcation of science from non-science: strains and interests in professional ideologies of scientists. Am Sociol Rev. 1983. https://doi.org/10.2307/2095325.

33. Gieryn TF. Cultural boundaries of science: credibility on the line. Chicago: University of Chicago Press; 1999.

34. Abbott A. The system of professions. Chicago: University of Chicago Press; 1988.

35. Santos FM, Eisenhardt KM. Organizational boundaries and theories of organization. Organ Sci. 2005. https://doi.org/10.1287/orsc.1050.0152.

36. Chreim S, Langley A, Comeau-Vallée M, Huq JL, Reay T. Leadership as boundary work in healthcare teams. Leadership. 2013. https://doi.org/10.1177/174271501246.

37. Waring J, Crompton A, Overton C, Roe B. Decentering health research networks: framing collaboration in the context of narrative incompatibility and regional geo-politics. Public Policy Adm. 2022. https://doi.org/10.1177/0952076720911686.

38. Siaw CA, Sarpong D. Dynamic exchange capabilities for value co-creation in ecosystems. J Bus Res. 2021. https://doi.org/10.1016/j.jbusres.2021.05.060.

39. Velter M, Bitzer V, Bocken N, Kemp R. Boundary work for collaborative sustainable business model innovation: the journey of a Dutch SME. J Bus Models. 2021. https://doi.org/10.5278/jbm.v9i4.6267.

40. Bednarek AT, Wyborn C, Cvitanovic C, Meyer R, Colvin RM, Addison PF, et al. Boundary spanning at the science–policy interface: the practitioners’ perspectives. Sustain Sci. 2018. https://doi.org/10.1007/s11625-018-0550-9.

41. Neal JW, Neal ZP, Brutzman B. Defining brokers, intermediaries, and boundary spanners: a systematic review. Evid Policy. 2022. https://doi.org/10.1332/174426420X16083745764324.

42. Van Oijen JCF, Wallenburg I, Bal R, Grit KJ. Institutional work to maintain, repair, and improve the regulatory regime: how actors respond to external challenges in the public supervision of ongoing clinical trials in the Netherlands. PLoS ONE. 2020. https://doi.org/10.1371/journal.pone.0236545.

43. Carlile PR. Transferring, translating, and transforming: an integrative framework for managing knowledge across boundaries. Organ Sci. 2004. https://doi.org/10.1287/ORSC.1040.0094.

44. Orlikowski WJ. Knowing in practice: enacting a collective capability in distributed organizing. Organ Sci. 2002. https://doi.org/10.1287/orsc.13.3.249.2776.

45. Goodrich KA, Sjostrom KD, Vaughan C, Nichols L, Bednarek A, Lemos MC. Who are boundary spanners and how can we support them in making knowledge more actionable in sustainability fields? Curr Opin Environ Sustain. 2020. https://doi.org/10.1016/j.cosust.2020.01.001.

46. Joo J, Selingo J, Alamuddin R. Unlocking the power of collaboration: how to develop a successful collaborative network in and around higher education. Ithaka S+R; 2019.

47. McDonald J, Jayasuriya R, Harris MF. The influence of power dynamics and trust on multidisciplinary collaboration: a qualitative case study of type 2 diabetes mellitus. BMC Health Serv Res. 2012. https://doi.org/10.1186/1472-6963-12-63.

48. Harrington S, Fox S, Molinder HT. Power, partnership, and negotiations: the limits of collaboration. WPA-LOGAN. 1998;21:52–64.

49. Soper B, Hinrichs S, Drabble S, Yaqub O, Marjanovic S, Hanney S, et al. Delivering the aims of the Collaborations for Leadership in Applied Health Research and Care: understanding their strategies and contributions. Health Serv Deliv Res. 2015. https://doi.org/10.3310/hsdr03250.

50. Lockett A, El Enany N, Currie G, Oborn E, Barrett M, Racko G, Bishop S, Waring J. A formative evaluation of Collaboration for Leadership in Applied Health Research and Care (CLAHRC): institutional entrepreneurship for service innovation. Health Serv Deliv Res. 2014. https://doi.org/10.3310/hsdr02310.

51. Engeström Y. The horizontal dimension of expansive learning: weaving a texture of cognitive trails in the terrain of health care in Helsinki. In: Achtenhagen F, John EG, editors. Milestones of vocational and occupational education and training. Bielefeld: W. Bertelsmann Verlag; 2003. p. 152–79.

52. Gezondheidsraad. Onderzoek waarvan je beter wordt: Een heroriëntatie op umc-onderzoek. Den Haag: Gezondheidsraad; 2016.

53. van der Woerd O, Schuurmans J, Wallenburg I, van der Scheer W, Bal R. Heading for health policy reform: transforming regions of care from geographical place into governance object. Policy Politics. 2024. https://doi.org/10.1332/03055736Y2024D000000030.

54. Iping R, Kroon M, Steegers C, van Leeuwen T. A research intelligence approach to assess the research impact of the Dutch university medical centers. Health Res Policy Syst. 2022. https://doi.org/10.1186/s12961-022-00926-y.

55. Rycroft-Malone J, Burton C, Wilkinson JE, Harvey G, McCormack B, Baker R, et al. Collective action for knowledge mobilisation: a realist evaluation of the Collaborations for Leadership in Applied Health Research and Care. Health Serv Deliv Res. 2015. https://doi.org/10.3310/hsdr03440.

56. Kislov R, Harvey G, Walshe K. Collaborations for leadership in applied health research and care: lessons from the theory of communities of practice. Implement Sci. 2011. https://doi.org/10.1186/1748-5908-6-64.

57. Harvey G, Fitzgerald L, Fielden S, McBride A, Waterman H, Bamford D, et al. The NIHR collaboration for leadership in applied health research and care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy. Implement Sci. 2011. https://doi.org/10.1186/1748-5908-6-96.

58. Currie G, Lockett A, Enany NE. From what we know to what we do: lessons learned from the translational CLAHRC initiative in England. J Health Serv Res Policy. 2013. https://doi.org/10.1177/1355819613500484.

59. Hurley TJ. Collaborative leadership: engaging collective intelligence to achieve results across organisational boundaries. White paper. Oxford Leadership; 2011.

60. DORA. The declaration. https://sfdora.org/read. Accessed 6 Mar 2024.

61. O’Leary R, Gerard C. Collaboration across boundaries: insights and tips from federal senior executives. Washington: IBM Center for The Business of Government; 2012.

62. Traag VA, Waltman L, Van Eck NJ. From Louvain to Leiden: guaranteeing well-connected communities. Sci Rep. 2019;9(1):5223. https://doi.org/10.1038/s41598-019-41695-z.


Acknowledgements

The authors thank the two reviewers and the members of the Health Care Governance department of Erasmus School of Health Policy & Management, Erasmus University Rotterdam for their helpful comments on earlier drafts. We are particularly indebted to Kor Grit for his helpful comments and critical appraisal of this paper.

Funding

The TopCare program was funded by the Netherlands Organization for Health Research and Development (ZonMw) ( www.zonmw.nl/en ) under Grant Number 80-84200-98-14001. ZonMw had no role in the design or conduct of the study; the collection, management, analysis and interpretation of the data; or the preparation, review and approval of the manuscript.

Author information

Authors and affiliations

Erasmus School of Health Policy & Management, Erasmus University Rotterdam, P.O. Box 1738, 3000 DR, Rotterdam, The Netherlands

Jacqueline C. F. van Oijen, Annemieke van Dongen-Leunis, Jeroen Postma & Roland Bal

Centre for Science and Technology Studies, Leiden University, Leiden, The Netherlands

Thed van Leeuwen


Contributions

Conceptualization: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Formal analysis: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Funding acquisition: R.B. (TopCare program). Investigation: A.v.D.L. and T.v.L. (database analysis of UMCs and non-academic hospitals and TopCare domains) and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Methodology: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Project administration: T.v.L. and A.v.D.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains) and J.P. (TopCare program). Supervision: T.v.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains) and R.B. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains, and ethnographic interviews in the TopCare program). Visualization: A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains). Original draft: J.v.O., A.v.D.L. and R.B. Draft & revision: J.v.O., A.v.D.L., J.P., T.v.L. and R.B. All authors read and approved the final manuscript (and agreed both to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, would be appropriately investigated and resolved, with the resolution documented in the literature).

Corresponding author

Correspondence to Jacqueline C. F. van Oijen .

Ethics declarations

Ethics approval and consent to participate

Not applicable; at the time we were conducting the research, ethical approval was not required. Nowadays our facility has an Ethics Committee that assesses research proposals involving human subjects (including interview studies), but this was not the case then. This study is not subject to the Dutch Medical Research Involving Human Subjects Act (WMO); it concerns collaboration on medical research in TopCare non-academic hospitals. For research not subject to the WMO, local policy and applicable procedures apply; as the TopCare program began in 2014, there were, as yet, no institutional rules in this area.

Consent for publication

Member check is part of our policy of informed consent of respondents and consent for publication. Specifically, we gave respondents the opportunity to peruse and add to quotes from their semi-structured interviews and to confirm our interpretation. The focus was on confirming and amending the quote and verifying the interpretation. The research team discussed the feedback received from the respondents and weighed it against the context of data analysis. Any disagreement on a respondent’s feedback was discussed directly with the respondent until consensus was reached. The STZ and NFU have given permission to use the data collected by CWTS on behalf of the NFU and STZ for the bibliometric analysis of this study. They have taken note of the results of this study and agreed to its publication.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

See Fig. 4 and Tables 5, 6, 7, 8, 9, 10 and 11.

UMCs produce 18 times (= 27,592/1,503) more SI, four times (= 42,557/10,880) more NC and 14 times (= 82,540/5,896) more IC publications than non-academic hospitals.

Of all publications, 89% (= 152,688/170,967) are attributed to UMCs and 11% (= 18,279/170,967) to non-academic hospitals.

Joint publications in national collaboration: 82% (= 8,943/10,880) for non-academic hospitals and 21% (= 8,943/42,557) for UMCs.

Joint international publications: 66% (= 3,874/5,896) for non-academic hospitals and 5% (= 3,874/82,540) for UMCs.

Joint publications overall: 70% (= 12,816/18,279) for non-academic hospitals and 8% (= 12,816/152,688) for UMCs.

Relationship between joint publications and total publications in each type of collaboration: 17% (= 8,943/53,436) national collaboration and 4% (= 3,874/88,435) international collaboration.
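As a quick consistency check, the headline ratios above follow directly from the publication counts quoted in the text. The short script below (a sketch; the counts are copied from the figures above) reproduces them.

```python
# Publication counts per collaboration type, 2010–2016, copied from the text above.
umc = {"SI": 27_592, "NC": 42_557, "IC": 82_540}      # UMC publications
nonac = {"SI": 1_503, "NC": 10_880, "IC": 5_896}      # non-academic hospital publications
joint_nc, joint_ic = 8_943, 3_874                     # joint UMC/non-academic publications

print(round(umc["SI"] / nonac["SI"]))        # 18 times more SI publications
print(round(umc["NC"] / nonac["NC"]))        # 4 times more NC publications
print(round(umc["IC"] / nonac["IC"]))        # 14 times more IC publications
print(round(100 * joint_nc / nonac["NC"]))   # 82% of non-academic NC output is joint
print(round(100 * joint_ic / nonac["IC"]))   # 66% of non-academic IC output is joint
```

The same counts show why collaboration matters far more for the non-academic hospitals: the joint output is a large share of their totals but only a small share of the much larger UMC totals.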

Fig. 4 Types of collaboration involving TopCare hospitals #1 and #2 between 2010 and 2016. #, total number of publications

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

van Oijen, J.C.F., van Dongen-Leunis, A., Postma, J. et al. Achieving research impact in medical research through collaboration across organizational boundaries: Insights from a mixed methods study in the Netherlands. Health Res Policy Sys 22, 72 (2024). https://doi.org/10.1186/s12961-024-01157-z


Received : 31 December 2022

Accepted : 31 May 2024

Published : 25 June 2024

DOI : https://doi.org/10.1186/s12961-024-01157-z


Keywords

  • Collaboration
  • Research impact
  • Bibliometric analysis
  • Organizational boundary work

Health Research Policy and Systems

ISSN: 1478-4505



  18. Chapter 13: Interviews

    What are interviews? An interviewing method is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one with the researcher and the participant. Interviews are most appropriate when seeking to understand a ...

  19. PDF Writing Interview Protocols and Conducting Interviews: Tips for

    10. Be willing to make "on the spot" revisions to your interview protocol. Many times when you are conducting interviews a follow up question may pop into your mind. If a question occurs to you in the interview ask it. Sometimes the "ah-ha" question that makes a great project comes to you in the moment.

  20. Qualitative research method-interviewing and observation

    As no research interview lacks structure most of the qualitative research interviews are either semi-structured, lightly structured or in-depth. Unstructured interviews are generally suggested in conducting long-term field work and allow respondents to let them express in their own ways and pace, with minimal hold on respondents' responses.

  21. 10.3 Conducting Qualitative Interviews

    10.3 Conducting Qualitative Interviews. Qualitative interviews might feel more like a conversation than an interview to respondents, however the researcher is usually guiding the conversation with the goal of gathering information from a respondent. A key difference between qualitative and quantitative interviewing is that qualitative ...

  22. How to Conduct Online Qualitative Interviews

    Conducting interviews in qualitative research has traditionally been a common data collection method. With more accessible technologies and the onset of the COVID-19 pandemic, researchers facilitating interviews have increasingly become interested in using online methods. This guide provides a summary of how to facilitate an online qualitative ...

  23. Best practice for interviews

    Best practice for interviews. At the root of interviewing is an interest in understanding the lived experiences of other people (Seidman, 2006). Interviews invite the participant to make sense of their own experiences and to share these experiences with the researcher. Interviews are therefore an appropriate method when researchers want to ...

  24. In-depth interviews: The best strategies to gain high-quality insights

    How to conduct a successful in-depth interview. Editor's note: Lyndsay Sund is the senior project manager at Syncscript. This is an edited version of an article that originally appeared under the title "Mastering the Art of In-depth Interviews: Effective Techniques for Uncovering Insights." In-depth interviews are the cornerstone of qualitative research.

  25. A practical guide for conducting qualitative research in medical

    A practical guide for conducting qualitative research in medical education: Part 1-How to interview AEM Educ Train . 2021 Jul 1;5(3):e10646. doi: 10.1002/aet2.10646.
