The life history interviews ran for 40-60 minutes. The timing for sessions 2 and 3 is not provided.
Interviews are the most common data collection technique in qualitative research. There are four main types of interviews; the one you choose will depend on your research question, aims and objectives. It is important to formulate open-ended interview questions that are understandable and easy for participants to answer. Key considerations in setting up the interview will enhance the quality of the data obtained and the experience of the interview for the participant and the researcher.
Qualitative Research – a practical guide for health and social care researchers and practitioners Copyright © 2023 by Danielle Berkovic is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.
Shazia Jamshed
Department of Pharmacy Practice, Kulliyyah of Pharmacy, International Islamic University Malaysia, Kuantan Campus, Pahang, Malaysia
Buckley and Chiang define research methodology as "a strategy or architectural design by which the researcher maps out an approach to problem-finding or problem-solving."[1] According to Crotty, research methodology is a comprehensive strategy "that silhouettes our choice and use of specific methods relating them to the anticipated outcomes,"[2] but the choice of research methodology depends upon the type and features of the research problem.[3] According to Johnson et al., mixed-method research is "a class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, theories and/or language into a single study."[4] To capture diverse opinions and views, qualitative findings need to be supplemented with quantitative results.[5] These research methodologies are therefore considered complementary to each other rather than incompatible.[6]
Qualitative research methodology is considered suitable when the researcher or investigator is either exploring a new field of study or intends to ascertain and theorize prominent issues.[6,7] Many qualitative methods have been developed to provide an in-depth and extensive understanding of issues through textual interpretation; the most common types are interviewing and observation.[7]
This is the most common format of data collection in qualitative research. According to Oakley, the qualitative interview is a type of framework in which practices and standards are not only recorded, but also achieved, challenged and reinforced.[8] As no research interview lacks structure,[9] most qualitative research interviews are either semi-structured, lightly structured or in-depth.[9] Unstructured interviews are generally suggested for long-term fieldwork; they allow respondents to express themselves in their own way and at their own pace, with minimal constraint on their responses.[10]
Pioneers of ethnography developed the use of unstructured interviews with local key informants, that is, by collecting data through observation and field notes while involving themselves with study participants. To be precise, the unstructured interview resembles a conversation more than an interview and is often described as a "controlled conversation" skewed towards the interests of the interviewer.[11] Non-directive interviews, a form of unstructured interview, aim to gather in-depth information and usually have no pre-planned set of questions.[11] Another type of unstructured interview is the focused interview, in which the interviewer is well acquainted with the respondent and, when the respondent strays from the main issue, refocuses them on the key subject.[11] A further type is the informal, conversational interview, based on an unplanned set of questions that are generated spontaneously during the interview.[11]
In contrast, semi-structured interviews are in-depth interviews in which respondents answer preset open-ended questions; they are widely employed by healthcare professionals in their research. Semi-structured, in-depth interviews are used extensively as an interviewing format, whether with an individual or with a group.[6] These interviews are conducted once, with an individual or a group, and generally last from 30 minutes to more than an hour.[12] They are based on a semi-structured interview guide, a schematic presentation of the questions or topics to be explored by the interviewer.[12] To make optimum use of interview time, interview guides help the interviewer explore respondents' views more systematically and comprehensively, and keep the interview focused on the desired line of inquiry.[12] The questions in the interview guide comprise a core question and many associated questions related to it, which are refined through pilot testing of the guide.[7] To capture interview data more effectively, recording the interview is considered an appropriate choice, although it is sometimes a matter of controversy between the researcher and the respondent. Handwritten notes taken during the interview are relatively unreliable, and the researcher might miss key points. Recording makes it easier for the researcher to focus on the interview content and verbal prompts, and enables the transcriptionist to generate a verbatim transcript of the interview.
Similarly, in focus groups, invited groups of people are interviewed in a discussion setting in the presence of a session moderator; these discussions generally last 90 minutes.[7] Like every research technique, group discussions have their own merits and demerits: they allow participants to express their opinions openly, but only a limited number of issues can be covered in such settings, which may generate fewer initiatives and suggestions about the research topic.
Observation is a qualitative research method that includes not only participant observation, but also ethnography and fieldwork. Observational research designs often involve multiple study sites, and observational data can be integrated as auxiliary or confirmatory research.[11]
Research can be visualized as a painstaking, methodical effort to examine, investigate and restructure realities, theories and applications. Research methods reflect the approach taken to the research problem. Depending on the need, the research method may be qualitative, quantitative, or an amalgam of both. By adopting a qualitative methodology, a prospective researcher can fine-tune pre-conceived notions and extend the analysis, examining and appraising issues from an in-depth perspective. This can be carried out through one-to-one interviews or issue-directed discussions. Observational methods are sometimes a supplemental means of corroborating research findings.
Chapter 10: Qualitative Data Collection & Analysis Methods
Qualitative interviews might feel more like a conversation than an interview to respondents; however, the researcher is usually guiding the conversation with the goal of gathering information from the respondent. A key difference between qualitative and quantitative interviewing is that qualitative interviews contain open-ended questions, that is, questions for which the researcher does not provide answer options. Open-ended questions demand more of participants than closed-ended questions because they require participants to come up with their own words, phrases, or sentences to respond.
In a qualitative interview, the researcher usually develops a guide in advance to which he or she then refers during the interview (or memorizes beforehand). An interview guide is a list of topics or questions that the interviewer hopes to cover during the course of an interview. It is called a guide because it is used to guide the interviewer, but it is not inflexible. Think of an interview guide like your agenda for the day or your to-do list: both probably contain all the items you hope to check off or accomplish, but it is not mandatory to accomplish everything on the list, or to accomplish it in the exact order you wrote it down. Perhaps emerging events will influence you to rearrange your schedule, or perhaps you simply will not get to everything on the list.
Interview guides should outline issues that a researcher feels are likely to be important, but because participants are asked to provide answers in their own words, and to raise points that they believe are important, each interview is likely to flow a little differently. While the opening question in an in-depth interview may be the same across all interviews, from that point on what the participant says will shape how the interview proceeds. This is what makes in-depth interviewing so exciting. It is also what makes in-depth interviewing rather challenging to conduct. It takes a skilled interviewer to ask questions, actually listen to respondents, and pick up on cues about when to follow up, when to move on, and when to simply let the participant speak without guidance or interruption.
Interview guides can list topics or questions. The specific format of an interview guide might depend on your style, experience, and comfort level as an interviewer or with your topic; in any case, interview guides are the result of thoughtful and careful work on the part of a researcher. It is important to organize the topics and questions thematically and in the order in which they are likely to proceed (keeping in mind, however, that the flow of a qualitative interview is in part determined by what a respondent has to say).
Sometimes researchers create two versions of the guide for a qualitative interview: one version contains a very brief outline of the interview (perhaps with just topic headings), and another contains detailed questions underneath each topic heading. In this case, the researcher might use the detailed guide to prepare and practice in advance of conducting interviews, and then bring just the brief outline to the interview. Bringing an outline, as opposed to a very long list of detailed questions, encourages the researcher to actually listen to what a participant is saying. An overly detailed interview guide is difficult to navigate during an interview and could give respondents the incorrect impression that the interviewer is more interested in the questions than in the participant's answers.
Begin to construct your interview guide by brainstorming. There are no rules at the brainstorming stage—simply list all the topics and questions that come to mind when you think about your research question. Once you have developed a pretty good list, you can begin to pare it down by cutting questions and topics that seem redundant, and grouping like questions and topics together. If you have not done so yet, you may also want to come up with question and topic headings for your grouped categories. You should also consult the scholarly literature to find out what kinds of questions other interviewers have asked in studies of similar topics. As with quantitative survey research, it is best not to place very sensitive or potentially controversial questions at the very beginning of your qualitative interview guide. You need to give participants the opportunity to warm up to the interview and to feel comfortable talking with you. Finally, get some feedback on your interview guide. Ask your friends, family members, and your professors for some guidance and suggestions once you have come up with what you think is a pretty strong guide. Chances are they will catch a few things you had not noticed.
In terms of the specific questions you include on your guide, there are a few guidelines worth noting. First, try to avoid questions that can be answered with a simple yes or no, or, if you do choose to include such questions, be sure to include follow-up questions. Remember, one of the benefits of qualitative interviews is that you can ask participants for more information; be sure to do so. While it is a good idea to ask follow-up questions, try to avoid asking “why” as your follow-up question, since “why” questions can appear to be confrontational, even if that is not your intention. Often people will not know how to respond to “why.” This may be the case because they do not know why themselves. Instead of “why,” it is recommended that you say something like, “could you tell me a little more about that?” This allows participants to explain themselves further without feeling that they are being doubted or questioned in a hostile way.
Also, try to avoid phrasing your questions in a leading way. For example, rather than asking, “What do you think about people who drink and drive?” you could ask, “How do you feel about drinking and driving?” Finally, as noted earlier in this section, remember to keep most, if not all, of your questions open-ended. The key to a successful qualitative interview is giving participants the opportunity to share information in their own words and in their own way.
Even after the interview guide is constructed, the interviewer is not yet ready to begin conducting interviews. The researcher next has to decide how to collect and maintain the information that is provided by participants. It is probably most common for qualitative interviewers to take audio recordings of the interviews they conduct. Recording interviews allows the researcher to focus on her or his interaction with the interview participant rather than being distracted by trying to take notes. Of course, not all participants will feel comfortable being recorded and sometimes even the interviewer may feel that the subject is so sensitive that recording would be inappropriate. If this is the case, it is up to the researcher to balance excellent note-taking with exceptional question-asking and even better listening. It can be quite challenging to do all three at the same time. Recording is best, if you can do so. Whether you will be recording your interviews or not (and especially if not), it is crucial to practice the interview in advance. Ideally, try to find a friend or two willing to participate in a couple of trial runs with you. Even better, try and find a friend or two who are similar in at least some ways to your sample. They can give you the best feedback on your questions and your interview demeanor.
All interviewers should be aware of, give some thought to, and plan for, several additional factors, such as where to conduct an interview and how to make participants as comfortable as possible during an interview. Because these factors should be considered by both qualitative and quantitative interviewers, we will return to them in Chapter 11, "Issues to Consider for All Interview Types."
Research Methods for the Social Sciences: An Introduction Copyright © 2020 by Valerie Sheppard is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.
At the root of interviewing is an interest in understanding the lived experiences of other people (Seidman, 2006). Interviews invite the participant to make sense of their own experiences and to share these experiences with the researcher. Interviews are therefore an appropriate method when researchers want to learn from and understand the experiences of others. Important educational issues facing Imperial College include the wellbeing of staff and students, and their experiences of new curricula and pedagogies such as active learning and technologically enhanced learning. Interviews offer powerful insight into individual experiences of these issues, which can help Imperial improve overall. If you are new to interviewing, it might seem like an unnatural situation. However, interviews are great opportunities for collecting rich data, as participants open up their lives for us to investigate. The data that emerge from interviews are qualitative, often in the form of text from interview transcripts. These data can help us to describe people, explain phenomena, and understand experiences, among other things (Jacob & Furgerson, 2012). Even if you have experience of interviews, these tips can help you make the most out of your interview.
Create a comfortable environment in the interview setting.
Establishing an environment and setting where the participant will be comfortable is paramount to conducting a successful interview. The location of the interview itself should be comfortable for the participant (Herzog, 2012). If possible, try to conduct the interview somewhere that the participant knows well. Choose a quiet, private place (Jacob & Furgerson, 2012). For example, it would not be appropriate to conduct an interview discussing sensitive topics like a student’s sense of belonging in a very public setting like the Junior Common Room, where there is an increased chance that your interview might be overheard, and the participant may not feel comfortable to speak freely about the topic as a result (Elwood & Martin, 2000).
It is important that your participants feel comfortable being honest with you in their responses to your questions. When building rapport, be especially aware of your tone when you ask questions: convey that you are interested, and ask questions in a way that invites all types of responses. At the same time, be cautious of "over-rapport," which happens when a participant tries to please the interviewer by saying what they think is expected of them (Grinyer & Thomas, 2012).
The interview protocol includes your interview questions, but it can also be much more than that (Jacob & Furgerson, 2012). A good interview protocol will remind you to carry out proper procedures such as collecting informed consent (see below) and checking that audio equipment is working and that the participant is happy to be recorded. In addition to your main interview questions, the protocol could include prompts for you to use if the participant struggles to understand a question or to provide answers, or if the participant's responses stray from the topic. It may also include a script for you to read to open and close the interview. You do not always need to include all of the above on your interview protocol, but it is advisable to at least have your list of interview questions written down.
Most often, this is done by providing the participant with a participant information sheet explaining the research and detailing the risks and benefits associated with participation in the interview, together with a consent form for the participant to sign indicating that they have understood the participant information sheet and agree to participate in the interview with you. Please consult the Imperial College London Education Ethics Review Process (EERP) webpage for resources on participant information sheets and informed consent forms.
Remember to maintain eye contact to convey compassion and show that you are listening (boyd, 2015). If a participant's response seems unclear, do not be afraid to ask for clarification. Try to make the interview feel like a natural conversation, which means refraining from taking too many notes during the interview. It is advisable to audio- or video-record the interview, with the participant's permission, so that you can focus on the conversation. Even so, it is wise to prepare note-taking equipment (Talmage, 2012), both as a back-up to recording equipment and to jot down key notes during the interview. These could be key conceptual ideas that spring to mind during the interview or simply points that you would like to ask more about later in the interview. You may also want to tick items off your interview protocol. Consider a backup recording option: for example, if you use an audio recorder as your primary device, consider preparing your phone or tablet to record the interview as a backup.
Social roles shape the interview process (DiCicco-Bloom & Crabtree, 2006). If you are a member of staff interviewing students or colleagues, there are likely to be power relations and these may affect the interview (Wang & Yan, 2012). For example, a student you are interviewing might feel they need to give you the ‘right’ answer because you are in a position of authority in the Imperial context. Reassure participants that there are no right or wrong answers (Greene & Hogan, 2005), that their experiences are important, and that their participation is completely voluntary and that there will be absolutely no negative consequences of withdrawing from the interview. If you are interviewing students to evaluate a particular module, you may wish to emphasise that their participation in the interview will have no impact on their grades.
Consider the ethical implications of interviewing students within your class or department. It is better to have a neutral/external interviewer to interview your students instead, particularly if you are using interviews to evaluate your teaching practice or the effectiveness of your module.
Be careful not to let your own assumptions get in the way of hearing perspectives or stories that you do not expect to hear (Johnson & Rowlands, 2012). Keep an open mind, and pay equal attention to all of your interview participants to collect and make the most of rich data.
boyd, d. (2015). Making Sense of Teen Life: Strategies for Capturing Ethnographic Data in a Networked Era. In E. Hargittai & C. Sandvig (Eds.), Digital Research Confidential: The Secrets of Studying Behavior Online (pp. 79-102). Cambridge, MA: The MIT Press.
DiCicco-Bloom, B., & Crabtree, B. F. (2006). The qualitative research interview. Medical Education, 40, 314-321.
Elwood, S. A., & Martin, D. G. (2000). 'Placing' interviews: Location and scales of power in qualitative research. Professional Geographer, 52(4), 649-657.
Gehlbach, H. (2015). User Guide: Panorama Student Survey. Boston: Panorama Education. Retrieved from https://www.panoramaed.com/panorama-student-survey
Gehlbach, H., & Artino Jr., A. R. (2018). The survey checklist (manifesto). Academic Medicine, 93(3), 360-366. Retrieved from https://journals.lww.com/academicmedicine/fulltext/2018/03000/The_Survey_Checklist__Manifesto_.18.aspx#pdf-link
Gehlbach, H., & Brinkworth, M. E. (2011). Measure twice, cut down error: A process for enhancing the validity of survey scales. Review of General Psychology, 15(4), 380-387. Retrieved from https://dash.harvard.edu/bitstream/handle/1/8138346/Gehlbach%20-%20Measure%20twice%208-31-11.pdf?sequence=1&isAllowed=y
Greene, S., & Hogan, D. (2005). Exploring Meaning in Interviews with Children. In S. Greene & D. Hogan (Eds.), Researching Children's Experience (pp. 142-158). London, UK: SAGE Publications Ltd.
Grinyer, A., & Thomas, C. (2012). The Value of Interviewing on Multiple Locations or Longitudinally. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 219-230). London, UK: SAGE.
Herzog, H. (2012). Interview Location and Its Social Meaning. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 207-217). London, UK: SAGE.
Jacob, S. A., & Furgerson, S. P. (2012). Writing Interview Protocols and Conducting Interviews: Tips for Students New to the Field of Qualitative Research. The Qualitative Report, 17(2), 1-10.
Johnson, J. M., & Rowlands, T. (2012). The Interpersonal Dynamics of In-depth Interviewing. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 99-113). London, UK: SAGE.
Krosnick, J. A., & Presser, S. (2010). Question and questionnaire design. In P. V. Marsden & J. D. Wright (Eds.), Handbook of Survey Research. Bingley, England: Emerald Group Publishing.
Schwarz, N. (1999). Self-reports: How the questions shape the answers. American Psychologist, 54, 93-105.
Seidman, I. (2006). Interviewing as Qualitative Research: A Guide for Researchers in Education and the Social Sciences (3rd ed.). New York: Teachers College Press.
Talmage, J. B. (2012). Listening to, and for, the Research Interview. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 295-304). London, UK: SAGE.
Wang, J., & Yan, Y. (2012). The Interview Question. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 231-242). London, UK: SAGE.
Warren, C. A. (2012). Interviewing as Social Interaction. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research (2nd ed., pp. 129-142). London, UK: SAGE.
In-depth interviews are an important methodology in qualitative marketing research. They offer researchers insights from real people. This article will share some techniques to consider when conducting an in-depth interview to make the best of your time with interviewees.
Editor’s note: Lyndsay Sund is the senior project manager at Syncscript. This is an edited version of an article that originally appeared under the title “ Mastering the Art of In-depth Interviews: Effective Techniques for Uncovering Insights .”
In-depth interviews are the cornerstone of qualitative research. They provide rich, detailed insights on complex issues that surveys and quantitative methods can't capture. However, an engaged respondent can only take you so far. The effectiveness of these interviews hinges on the interviewer's skill in fostering trust and eliciting genuine insights, which can significantly enhance the quality of your findings. In this article, we'll explore some proven techniques for conducting in-depth interviews that yield actionable results.
Building rapport is essential for a successful interview. Begin by introducing yourself warmly, expressing genuine interest and creating a comfortable atmosphere. Empathy, active listening and validation of participants’ experiences are key elements in establishing rapport. A comfortable respondent often makes for a more interesting interview.
Questions in the discussion guide are often scripted, but it’s important to maintain some flexibility, allowing room for spontaneous exploration of topics that arise. Don’t hesitate to ask for clarification or probe deeper into certain topics. It is important to invite participants to share stories or examples, which can yield richer insights.
We’ve noticed that if you sound like you’re reading from a questionnaire, the respondents will give shorter answers. Vary your voice, connect the questions to their previous responses or what you know from the screener and your respondents will be more likely to answer more thoroughly. We’ve found that the discussion guide is an excellent outline, but word choice and following a more conversational style matters.
Demonstrate genuine interest through attentive body language, both verbal and non-verbal cues, and supportive listening, even if you are on a web-based platform. Mirror or paraphrase key points periodically to show understanding and allow the respondent to confirm or clarify. Such probes delve deeper into participants' perspectives, revealing nuances and underlying motivations.
In addition to verbal responses, pay close attention to participants’ non-verbal cues. Facial expressions, body language and tone of voice offer valuable insights into emotions, attitudes and underlying sentiments. Proactive listening to these cues enables interviewers to delve deeper into participants’ subconscious thoughts and feelings.
Silence can be a powerful tool in in-depth interviews. Allow moments of silence after posing a question to give participants ample time to process and formulate their responses. Resist the urge to fill silences with additional questions or commentary. Often, participants use these pauses to delve into deeper thoughts, resulting in richer insights. Remember the old dial-up internet service where we waited patiently to access the internet? The same rule applies here.
Always thank your interviewees for their time and insights and, when applicable, ask if they’d like to receive the findings of the study. Also, invite feedback on the interview process: “Is there anything I didn’t ask you that I should have?” This is the perfect opportunity to gain last-minute valuable insights for future interviews. After the interview, the focus shifts to analyzing the data.
Mastering effective techniques for conducting in-depth interviews is a valuable skill across various fields. By carefully preparing, building rapport, asking insightful questions, actively listening and ethically handling post-interview processes, you can uncover deep insights that surface-level methods cannot reach. Whether you’re a seasoned researcher or new to qualitative interviews, applying these strategies will enhance the quality and depth of your findings, leading to more impactful outcomes.
An official website of the United States government
The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.
The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.
Implementation Science Communications volume 5, Article number: 69 (2024)
Qualitative methods are a critical tool for enhancing implementation planning and tailoring, yet rapid turn-around of qualitative insights can be challenging in large implementation trials. The Department of Veterans Affairs-funded EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI) is conducting a hybrid type 3 effectiveness-implementation trial comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three evidence-based practices (EBPs) for women Veterans. We describe the development of the Rapid Implementation Feedback (RIF) report, a pragmatic, team-based approach for the rapid synthesis of qualitative data to aid implementation planning and tailoring, as well as findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI.
Trained qualitative staff conducted 125 semi-structured pre-implementation interviews with frontline staff, providers, and leadership across 16 VA sites between October 2021 and October 2022. High-priority topic domains informed by the updated Consolidated Framework for Implementation Research were selected in dialogue between EMPOWER 2.0 implementation and evaluation teams, and relevant key points were summarized for each interview to produce a structured RIF report, with emergent findings about each site highlighted in weekly written and verbal communications. Process evaluation was conducted to assess EMPOWER 2.0 team experiences with the RIF report across pre-implementation data collection and synthesis and implementation planning and tailoring.
Weekly RIF updates supported continuous EMPOWER 2.0 team communication around key findings, particularly questions and concerns raised by participating sites related to the three EBPs. Introducing the RIF report into team processes enhanced: team communication; quality and rigor of qualitative data; sensemaking around emergent challenges; understanding of site readiness; and tailoring of REP and EBQI implementation strategies. RIF report findings have facilitated rapid tailoring of implementation planning and rollout, supporting increased responsiveness to sites’ needs and concerns.
The RIF report provides a structured strategy for distillation of time-sensitive findings, continuous team communication amid a complex multi-site implementation effort, and effective tailoring of implementation rollout in real-time. Use of the RIF report may also support trust-building by enhancing responsiveness to sites during pre- and early implementation.
Trial registration: Enhancing Mental and Physical Health of Women Veterans (NCT05050266); https://clinicaltrials.gov/study/NCT05050266?term=EMPOWER%202.0&rank=1
Date of registration: 09/09/2021.
Tailoring implementation strategies for specific site needs is often critical for successful implementation. However, few approaches ensure that implementation teams possess the necessary information to deliver timely, tailored strategies in multi-site trials.
We introduce a practical approach, the Rapid Implementation Feedback (RIF) report, designed to share critical information within implementation and evaluation teams. We illustrate how the RIF report has proven instrumental in fostering effective communication and tailoring within the EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI).
The RIF report offers a method for sharing pertinent and time-sensitive findings, empowering teams to swiftly and effectively tailor implementation in real time.
As implementation science has matured, implementation trials have become increasingly complex, often comparing two or more implementation strategies, integrating multiple quantitative and qualitative methods, and occurring across a dozen or more sites. Such complex initiatives require larger teams of implementation researchers and practitioners to conduct, raising challenges for effective and timely communication within teams. Meanwhile, tailoring interventions and implementation rollout to align with the unique strengths and challenges at individual sites – recognized as a valuable and often requisite strategy for achieving implementation and sustainment [ 1 , 2 , 3 ] – requires intensive, flexible, and dynamic engagement with sites. Contextual factors must be assessed, key partners identified, and critical information synthesized and shared to allow for rapid sensemaking and problem-solving.
The growth of implementation science as a field has been accompanied by an acceleration in the variety, rigor, and rapidity of qualitative methods available to support real-world research translation [ 4 , 5 ]. Effective work in implementation often requires gathering information that is purposeful and systematic, represents a variety of partners and perspectives, and accurately synthesizes diverse viewpoints to support meaningful communication and decision-making at every stage of implementation. Accordingly, an array of methodological strategies for supporting participatory and partner-engaged processes [ 6 , 7 ], rapid qualitative data collection and analysis [ 8 , 9 ], and ethnographic and observational approaches [ 10 , 11 , 12 ] have emerged, offering a growing array of qualitative methods to meet the needs of a given study or initiative.
To make use of these methods effectively, work and team processes suitable for the implementation context are needed. The importance of strong communication and relationship networks within implementing sites and teams has been recognized since the early days of the field [ 13 , 14 , 15 ], and recent scholarship has examined how relational communication is embedded within most strategies for implementation [ 16 ], trust-building [ 17 ], and scale-up and spread [ 18 ]. Yet relatively little scholarship has put forward methods for ensuring timely and effective communication within implementation teams, particularly amid efforts to achieve site-level tailoring in real-time. Across eight years of conducting hybrid effectiveness-implementation trials in support of improved care delivery for women Veterans, our team has learned that effective tailoring requires capturing and sharing critical information in an ongoing way [ 4 , 10 , 19 ]. In the first part of this article, we describe the development of a pragmatic, team-based approach for the rapid synthesis of qualitative data to support implementation planning and tailoring: the Rapid Implementation Feedback (RIF) report. In the latter part, we describe findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI, outlining how use of this approach has evolved our work.
Women Veterans represent the fastest-growing proportion of VA healthcare users. Despite substantial VA investment in women’s health, gender disparities persist in certain health outcomes, including cardiovascular and metabolic risk and mental health [ 20 , 21 , 22 ]. In tailoring healthcare delivery for women, prior studies suggest that women Veterans prefer gender-specific care and telehealth options [ 19 , 23 ]. In response, the VA EMPOWER 2.0 QUERI is conducting a hybrid type 3 effectiveness-implementation trial [ 24 ] comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three virtual evidence-based practices (EBPs) for women Veterans in 20 VA sites across the United States: (1) Diabetes Prevention Program (DPP) to reduce risk of progressing to type 2 diabetes [ 25 ]; (2) Telephone Lifestyle Coaching (TLC) to reduce cardiovascular risk [ 26 ]; and (3) Reach Out, Stay Strong Essentials (ROSE) to prevent postpartum depression [ 27 ]. REP combines phased intervention packaging, tailoring, training and technical assistance, and re-customization during maintenance/sustainment [ 28 ], while EBQI offers a systematic quality improvement method for engaging frontline providers in improvement efforts via tailoring, multi-level partnership, and ongoing facilitation [ 29 ]. We selected these bundled implementation strategies, REP and EBQI, based on their strong evidence for effectively supporting implementation in diverse healthcare settings [ 28 , 30 ]. Both of these strategies draw upon pre-implementation needs assessment and planned tailoring as key activities for successful implementation, which we postulated would be important based on our experience in the prior EMPOWER QUERI (2015–2020) [ 19 , 30 ]. These activities were deemed to be non-research by the VA Office of Patient Care Services prior to funding.
To coordinate the separate implementation and evaluation elements of our work, we established distinct-but-overlapping teams under the broader umbrella of EMPOWER 2.0, dedicated to: (1) implementing each of the EBPs (DPP, TLC, ROSE), with these smaller teams led by principal investigators for each EBP; (2) providing REP- or EBQI-consistent implementation support at each site (i.e., “REP team” and “EBQI team” project directors); and (3) executing qualitative and quantitative components of our overall evaluation (described in detail in [ 24 ]), in the form of the “qualitative team” and “measures team,” respectively.
Working in concert across these implementation and evaluation teams, EMPOWER 2.0 followed a standardized process for engaging with sites (Fig. 1 ). Initial efforts (beginning pre-funding) involved reaching out to partners at the regional Veterans Integrated Service Network (VISN) level to introduce the EBPs, answer questions, and request a list of potential VA medical centers (VAMCs) within the VISN that might be appropriate for implementation. Following EMPOWER 2.0’s cluster-randomized study design, VISNs were assigned to participate in two of the EBPs (either TLC and ROSE or DPP and ROSE; ROSE was offered to all sites in an effort to ensure an adequate number of pregnant Veteran participants) [ 24 ]. We extended invitations to identified VAMCs to participate in the two EBPs available in their VISN. If sites expressed interest, we conducted an introductory meeting with providers and leadership from Primary Care, Women’s Health, Mental Health, Whole Health [ 31 ], and/or Health Promotion and Disease Prevention, as appropriate to the EBP and each site’s local organization of care. Once a site confirmed their participation, they were randomized to receive either the REP or the EBQI implementation strategy. Following randomization, they were asked to identify a point person for each EBP and key individuals who would be likely to participate in local EBP implementation teams and/or play an important role in supporting implementation (e.g., VAMC leadership). These individuals (e.g., Medical Director, Health Educator) were then invited to participate in pre-implementation interviews prior to initiating REP or EBQI at their site. In each VISN, partners at the VISN level were also invited to participate in pre-implementation interviews, to obtain broader perspectives on the regional women’s health context and priorities.
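The two-level assignment described above (EBP pairs assigned at the VISN cluster level, implementation strategy randomized at the site level) can be sketched in miniature. This is an illustrative simplification only; the trial's actual cluster randomization procedure is specified in the published protocol [ 24 ], and all identifiers below are hypothetical.

```python
import random

# EBP pairs available to a VISN; ROSE is offered to all sites.
EBP_PAIRS = [("TLC", "ROSE"), ("DPP", "ROSE")]

def assign_visn(visn_id, rng):
    """Cluster-level assignment: a VISN receives one of the two EBP pairs."""
    return {"visn": visn_id, "ebps": rng.choice(EBP_PAIRS)}

def randomize_site(site_id, visn_assignment, rng):
    """Site-level randomization: a confirmed site receives REP or EBQI."""
    return {
        "site": site_id,
        "ebps": visn_assignment["ebps"],
        "strategy": rng.choice(["REP", "EBQI"]),
    }

rng = random.Random(0)  # seeded so the sketch is reproducible
visn = assign_visn("VISN-1", rng)
site = randomize_site("Site A", visn, rng)
```

After randomization, the site would proceed to identify point persons and candidates for pre-implementation interviews, as described above.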
Fig. 1 EMPOWER 2.0 QUERI site-level outreach, randomization, and engagement
Intended to assess sites’ needs and resources and enable pre-implementation tailoring prior to launch, EMPOWER 2.0 pre-implementation interviews examined baseline care practices for each relevant care condition (prediabetes for DPP; cardiovascular risk for TLC; perinatal mental health for ROSE), as well as updated Consolidated Framework for Implementation Research (CFIR) domains including inner and outer setting, innovation, individuals (e.g., characteristics: motivation) and implementation process [ 32 ]. Semi-structured interview guides (previously published [ 24 ]) were developed building on prior work in the original EMPOWER QUERI [ 30 ] and the Women’s Health Patient-Aligned Clinical Team trial [ 33 ]. We have an expert qualitative team, each of whom has master’s or PhD-level training in qualitative methods and years of experience in conducting team-based qualitative research, including using rapid qualitative analysis approaches [ 8 , 9 ]. Most team members have worked together on EMPOWER and other projects for over five years.
Between October 2021 and October 2022, the qualitative team completed 125 interviews across 16 sites, with site and VISN-level participants representing a range of roles, including Women Veteran Program Managers, Women’s Health Primary Care Providers, Maternity Care Coordinators, primary care team members, health coaches, and nutritionists. Pre-implementation interviews took an average of 57 days (range 15–108 days) to complete per site, and included 4–13 participants depending on the size and complexity of the care facility.
The EMPOWER 2.0 qualitative team has a well-established approach to conducting rapid qualitative analysis [ 8 , 19 ] and strong personnel infrastructure and expertise. Even so, once pre-implementation interviews began, challenges quickly arose in ensuring that findings were being communicated to EMPOWER 2.0 implementation teams for DPP, TLC, and ROSE in a timely and effective manner, particularly given that each team was working with multiple sites concurrently. Key questions included: How do we ensure early findings are shared in time to support pre-implementation tailoring? How do we communicate effectively across the qualitative team conducting interviews and the teams responsible for implementation? And how do we keep qualitative team members up to date on implementation, so they are well informed for interviews?
In responding to these challenges, we developed the Rapid Implementation Feedback (RIF) report to support data distillation and bidirectional feedback across our qualitative and implementation teams. In developing the RIF, the EMPOWER 2.0 implementation teams, which are composed of investigators and project directors for each EBP who provide external implementation support for each site, met with the qualitative interview team and agreed upon high-priority topic domains to be extracted from the interviews. These domains were related to implementation planning and included critical roles for implementation planning and launch ; implementation concerns and/or demand for the EBP ; and use of data to track women Veterans’ population health needs (see Table 1 ). These topics reflected both specific CFIR subdomains included in the pre-implementation interview guide (e.g., use of data as an assessment of the CFIR subdomain for information technology infrastructure ), as well as higher-level domains combined to aid in prioritizing key issues (e.g., germane responses related to inner setting , individual characteristics , and implementation process were combined into implementation concerns ). These topic domains were used to create a RIF report template (see Appendix 1 ), which was organized under headings by VISN (outer setting), site (inner setting), and EBP [ 32 ]; the same domains were selected for all EBPs, ensuring consistency in data distillation across the project. Compiling the RIF report ensured that, for example, all interview data relevant to critical roles for implementation planning for ROSE in Site A were collated and easy to locate. 
Thereafter, at the conclusion of an interview, the qualitative team reviewed interview notes and/or Microsoft Teams transcripts and extracted key points relevant to each priority topic; in doing so, team members followed a process similar to that used in developing structured summaries for rapid qualitative analysis [ 8 , 34 ], but differing by a targeted focus on relatively few domains. For each interview, the analyst would summarize key points related to each RIF domain (e.g., critical roles for implementation planning and launch ), as well as any brief or particularly salient quotes; every key point or quote was also labeled with a study identification number indicating the role of the respondent. The resulting key points and quotes were then added to the RIF report, creating a single, up-to-date written resource for implementation teams, which was cumulatively updated over time.
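The distillation step just described amounts to a simple cumulative data structure: key points and quotes from each interview are appended under their VISN, site, EBP, and priority domain, so that all data relevant to a given domain at a given site stay collated. The sketch below is a hypothetical illustration of that structure; the domain names paraphrase Table 1, and the site, respondent ID, and example entry are invented (the actual template is in Appendix 1).

```python
from collections import defaultdict

# Priority topic domains agreed upon by the implementation and qualitative
# teams (names paraphrased from Table 1 of the text).
DOMAINS = [
    "critical_roles",           # critical roles for implementation planning and launch
    "implementation_concerns",  # concerns and/or demand for the EBP
    "use_of_data",              # use of data to track population health needs
]

# RIF report: VISN -> site -> EBP -> domain -> list of (respondent_id, key_point)
rif_report = defaultdict(
    lambda: defaultdict(lambda: defaultdict(lambda: defaultdict(list)))
)

def add_interview(visn, site, ebp, respondent_id, extracted):
    """Append one interview's key points under each priority domain.

    Only the targeted RIF domains are carried forward; everything else in
    the interview is left for later, fuller analysis.
    """
    for domain, key_points in extracted.items():
        if domain not in DOMAINS:
            continue
        for point in key_points:
            rif_report[visn][site][ebp][domain].append((respondent_id, point))

# Hypothetical example entry
add_interview(
    "VISN-1", "Site A", "ROSE", "ID-017",
    {"critical_roles": ["Maternity Care Coordinator likely local champion"]},
)
```

Because entries accumulate into a single document keyed by site and domain, checking the diversity and adequacy of data for a site reduces to reading one collated list, which is the expediting effect described in the next paragraph.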
This approach to analysis is distinct in two key ways from the data distillation process typically used in rapid qualitative analysis [ 8 , 34 , 35 , 36 ]. First, in rapid qualitative analysis, templated summaries are first created at the level of the individual interview or other data episode, so that each data episode is associated with a summary of contents that can later be compiled into a multi-episode matrix. Second, structured summaries are traditionally intended to capture all of the key findings in a given data episode, and thus are both more comprehensive and less focused than the RIF report. By contrast, the RIF report collapsed two steps (i.e., summary then matrix) into one (i.e., RIF report) to assemble a targeted selection of high-priority data. In addition, because the data for each domain were collated from the beginning into a single document, the process of assessing data heterogeneity (e.g., diversity of opinions) and adequacy (e.g., saturation) for a given site was expedited. Up-to-date findings could be made available to the implementation teams on a consistent basis, despite the fact that the qualitative team was often interviewing among multiple sites concurrently. During this period, EMPOWER 2.0 held a weekly full-team meeting to coordinate implementation and evaluation efforts. The day before this weekly meeting, the updated RIF report was sent to the full EMPOWER 2.0 team in a secure encrypted email, with new additions highlighted for easy reference; the team was also notified if there were no RIF updates for the week. As implementation teams were also working concurrently across multiple sites, the RIF report became a centralized resource for organizing essential information in a dynamic environment.
Although the brief written RIF expedited communication of time-sensitive information across teams, challenges continued to arise in coordinating activities, tailoring EBPs, and general communication with sites. We therefore added a verbal update to the RIF report (see Fig. 2 ), summarizing new additions as part of our overall EMPOWER 2.0 weekly meeting. Updates were concise, organized by site, and included a summary of interviews conducted that week, along with the roles interviewed and unique findings (e.g., staff turnover issues). Members of the qualitative team also gave feedback on whether saturation had been reached at a site, or whether additional interviewing would help in developing a snapshot of key site features, strengths, and potential challenges.
Fig. 2 Core components of the Rapid Implementation Feedback (RIF) report
To assess whether the RIF was an effective method for communication and coordination, we conducted a process evaluation of EMPOWER 2.0 teams’ experiences of using the RIF report. We reviewed periodic reflections conducted by the first author as part of EMPOWER 2.0’s overall implementation evaluation with 11 members of five internal teams: those responsible for leading DPP, TLC, and ROSE implementation (i.e., PIs and Co-PIs), and for supporting sites using REP and EBQI implementation strategies (i.e., project directors). Periodic reflections [ 10 ] are lightly guided discussions conducted by phone or teleconference software, which allow for consistent documentation of implementation activities, processes, and events, both planned and unexpected. We adapted the original periodic reflection template [ 10 ] as a discussion guide for EMPOWER 2.0 (previously published [ 24 ]). Reflections lasted 15–60 minutes, with length roughly corresponding to the amount of recent implementation activity, and were conducted monthly or bi-monthly with each team.
In examining how the RIF report was working for our teams, we conducted thematic analysis [ 37 ] of all periodic reflections ( n = 32) completed with EMPOWER 2.0 teams between October 2021, when the RIF was first introduced, and October 2022. All text relevant to the RIF report was extracted and reviewed inductively for key themes associated with perceived impacts of the RIF, resulting in a preliminary set of emergent themes, which were codified into a codebook. All segments of extracted text were then reviewed again and assigned codes as appropriate to their meaning; central findings for each code/theme were then distilled. This preliminary analysis was conducted by the lead author and then presented back to the full EMPOWER 2.0 team to allow for debriefing and member checking [ 38 , 39 ] over a series of meetings. Team members provided substantive feedback that aided in refining themes, and offered additional reflection and commentary on the RIF report and its role within team processes.
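As a purely mechanical illustration of the coding pass described above, one can imagine assigning codebook themes to extracted text segments and tallying matches. This toy sketch stands in for the human judgment that thematic analysis actually requires; the codebook labels and keywords are invented, not the study's actual codes.

```python
# Toy codebook: theme label -> trigger keywords (invented for illustration;
# the actual codebook was developed inductively from the reflections).
codebook = {
    "communication": ["feedback", "meeting", "in sync"],
    "data_quality": ["interview", "rigor", "saturation"],
}

def assign_codes(segment, codebook):
    """Return every theme whose keywords appear in a text segment."""
    text = segment.lower()
    return [
        theme
        for theme, keywords in codebook.items()
        if any(kw in text for kw in keywords)
    ]

segments = [
    "Real-time feedback in our weekly meetings kept teams in sync.",
    "Discussing saturation improved the rigor of interviewing.",
]
coded = {seg: assign_codes(seg, codebook) for seg in segments}
```

In the real process, of course, codes were assigned by analysts and then refined through team debriefing and member checking, not by keyword matching.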
We identified five interconnected impacts associated with introducing the RIF report into EMPOWER 2.0 team processes: enhanced communication across teams; improved quality and rigor of qualitative data; heightened sensemaking around emergent challenges; increased understanding of site readiness; and informed tailoring of REP and EBQI implementation strategies. We describe each of these in turn below.
As intended, the RIF was felt to be an effective strategy for improving communication across EMPOWER 2.0’s internal teams. Having the RIF available in written format created an easily accessible resource for implementation teams as they prepared for next steps in engaging with sites, and for qualitative team members as they prepared for upcoming interviews. The verbal RIF update, because it occurred alongside implementation team updates as part of the weekly team call, ensured that information-sharing was bidirectional in real time. The continuous flow of information provided a regular opportunity for answering questions, clarifying areas of potential confusion, and identifying where additional information was needed. Additionally, the RIF served to keep all team members in sync with site-specific information on an ongoing basis.
“I love that the qualitative team is giving us real-time feedback. I don’t think I’ve ever done that except informally. I think that’s been a really nice addition to our meetings.” [EBP 1 lead]
On the whole, the enhanced communication among teams was felt to support team “synergy” and increase synchronization of activities in continued data-gathering and site engagement.
Although improving rigor was not an explicit goal of developing the RIF report, introducing this structured process was felt to have improved both the quality of data collection and the rigor of early analyses. Because of the improved bidirectional communication occurring as part of the weekly verbal RIF report with implementation teams, qualitative team members felt as though they had an increased understanding of implementation activities and site-level context. This in turn was felt to improve the quality of their interviewing by allowing them to ask more attuned follow-up questions and to prioritize topics that were “meaningful to inform implementation.”
“[We] felt very disconnected in the beginning like we didn’t have any information. Having the weekly calls to talk about these things was really helpful.” [Qualitative team member 1]
Qualitative team members also reported feeling more consistent and “in sync” in their processes for interviewing and preparing the RIF report, as the weekly discussions provided an opportunity for the team to observe, confer, and calibrate regarding the conduct of interviews and the content and level of detail included in ongoing RIF updates.
“It helps us stay impartial as interviewers across stakeholders, across sites, and as we modify the interview guide. It kept all of us…aligned with the parts we need to dig deeper into because they’re RIF/high priority.” [Qualitative team member 2].
In addition, introducing the RIF report was felt to increase the trustworthiness of preliminary analyses and data distillation, because while initial data reviews can be impressionistic or anecdotal, the RIF provided a structured and systematic way of consolidating multi-site data from the first pass. Because the RIF report provided early synthesis, it also aided in generating ideas for targeted analysis and coding conducted as part of evaluation activities in later phases.
Arising out of the enhanced team communication, and perhaps supported by the improved quality of information being gathered and distilled by the qualitative team, discussions prompted by the RIF helped the EMPOWER 2.0 team to identify and develop solutions to emergent challenges. As one example, the qualitative team quickly realized that, while it is common practice to keep implementation-focused and evaluation-focused teams distinct in an effort to reduce bias in hybrid trials, sites viewed everyone associated with EMPOWER 2.0 – including interviewers – as an “ambassador” of the project. Interviewers found early on that they were fielding important questions from sites regarding the EBPs and/or implementation plans, and often lacked the information to provide an appropriate response, which placed them in an awkward position. After this issue was raised as part of a weekly RIF update, the teams worked together to develop a living Frequently Asked Questions document to help interviewers answer common questions that were coming up during interviews. This document was later helpful in standardizing communication with sites more generally, serving as a resource for implementation teams as well.
In a second example, a key pre-implementation effort by the EMPOWER 2.0 measures team involved developing a dashboard of population health and performance metrics tailored to provide actionable information to sites on the healthcare needs of their women Veterans. As preparations for site launch continued, and discussions of RIF findings informed ongoing planning efforts, the measures team realized they lacked information on how sites were using existing population health and performance measures. The measures and qualitative teams then worked together to update the interview guide and add priority domains to the RIF report to aid in dashboard development. Having integrated these additions, the qualitative team was able to rapidly confirm the need for a dashboard display of women-only performance measures, and data were used to support tailoring to sites’ needs.
Reflecting the enhanced communication and improved data quality associated with adopting the RIF report, the EMPOWER 2.0 teams were also more able to develop timely assessments of site readiness. The distillation of qualitative interview data provided important contextual information about site-level participants’ level of EBP awareness, motivation, and competing demands prior to implementation planning meetings.
“They just seem generally enthusiastic.” [EBP 2 lead]

“Most of what I was picking up on was people saying, ‘We don’t have anyone to do it.’ Just sites saying that they don’t have people…they don’t want to take it on right now.” [EBP 3 lead]
Readied with this information, implementation teams were able to prepare for engagement and planning efforts with a greater understanding of what the critical issues were likely to be.
Finally, building on an improved understanding of sites’ pre-implementation readiness, EMPOWER 2.0 teams felt better equipped to engage in planned tailoring of site outreach and implementation activities within the REP and EBQI strategy bundles. For example, when a key leader at one site was revealed to be “not entirely on board” with DPP implementation, the DPP team lead was able to offer targeted outreach to acknowledge and address the concerns expressed. When concerns were raised about staffing and EBP ownership prior to launch of ROSE, the ROSE team lead expressed, “We were prepared for tough conversations.”
“That became our ‘MO’…anything that comes up [in the RIF], we’ll try to address in the kick-off [meeting with sites] to show that we’re helping in addressing their questions.” [EBP 1 lead]
The RIF report was developed in response to the challenge, within the EMPOWER 2.0 hybrid type 3 effectiveness-implementation trial, of distilling and sharing critical information among internal teams as they pursued distinct implementation and evaluation tasks with an evolving cast of dynamic sites. Combined, the RIF report’s written and verbal components provide a method and process for rapidly extracting high-priority, actionable data, sharing these data in a focused and digestible way, and supporting team sensemaking and tailoring of implementation approaches in real time.
In evaluating the RIF report process, we found that its key benefits were interconnected and mutually reinforcing. Bidirectional communication increased the quality of qualitative data collection, which in turn improved the depth and salience of the data conveyed to the implementation teams, which in turn increased the teams’ ability to engage in active sensemaking and identify effective strategies for tailoring the implementation approach at each site. The tight informational feedback loop allowed us to be nimble and iterative both in data-gathering (e.g., by adding novel domains to the RIF as needed) and in tailoring (e.g., by allowing us to customize early messaging to address sites’ most pressing concerns).
Tailoring and adaptation of both interventions and implementation strategies have been recognized as essential for the successful translation of research into routine practice [ 40 , 41 , 42 , 43 ]. In response, a variety of qualitative and mixed-methods approaches have been put forward for capturing feedback from diverse partners, including user-centered adaptation [ 44 ], the Method for Program Adaptation through Community Engagement (M-PACE) [ 45 ], the ADAPT guidance [ 46 ], concept mapping [ 47 ], and intervention mapping [ 48 ]. These approaches have strengthened capacity for implementation researchers and practitioners to gather and synthesize often wide-ranging perspectives into actionable guidance for improving the acceptability, feasibility, appropriateness, and compatibility of interventions and implementation strategies. Yet there remains significant opportunity to streamline and systematize methods for tailoring in the context of hybrid type 2 and 3 trials, which often conduct formative evaluation in real time amid simultaneous data collection and implementation activities. In addition to providing a model for how to embed a structured method for data capture, distillation, and sharing within a complex implementation trial, we believe the RIF report offers a pragmatic method to improve both the quality of information synthesis and the ability of teams to engage in timely sensemaking.
Creating an effective internal communication process via the RIF supported tailored delivery of EBPs at each site, which in turn was felt to enhance the relationships between EMPOWER 2.0 QUERI members and site partners. The role of relationships as an underlying and underexplored element within implementation has garnered increasing attention [ 15 ]. Bartley et al. [ 16 ] conducted an analysis of the Expert Recommendations for Implementing Change (ERIC) taxonomy of implementation strategies [ 49 ], and found that nearly half (36 of 73) could be classified as highly or semi-relational in nature. Connelly and collaborators [ 50 ] developed a Relational Facilitation Guidebook based in relational coordination and the principle that high-quality communication and relationships result in improved healthcare quality. Metz and colleagues [ 17 ] have proposed a theoretical model for building trusting relationships to support implementation, drawing on theory and research evidence to identify both technical and relational strategies associated with nurturing trust. There is considerable overlap between Metz et al.’s strategies and the processes supported by adopting the RIF report in EMPOWER 2.0, particularly those related to bidirectional communication, co-learning, and frequent interactions, which in turn enabled greater responsiveness to sites. We found the structured communication offered by the RIF helped to support trust-building both within EMPOWER 2.0 and in our teams’ interactions with sites.
Future teams weighing potential use of the RIF report should first consider whether it is suited to their project goals and resources. It may be less suitable for teams whose timelines allow for traditional coding-based or rapid qualitative approaches to data analysis, who do not intend to engage in formative evaluation or planned tailoring, or who have concerns that any modifications to the implementation approach may be incompatible with their trial design. In EMPOWER 2.0, core components for determining fidelity to the implementation strategies in both study arms (REP and EBQI) were identified before initiating pre-implementation activities, and both strategies included planned tailoring to address specific conditions at sites (e.g., perceived patient needs, key professional roles and service lines to be involved). We were thus able to ensure that no decisions made in RIF-related or other discussions deviated from our trial protocol.
Teams electing to adopt the RIF report may benefit from discussing how best to integrate this method into their workflow, and what specific tailoring of the RIF report is needed to ensure alignment with their implementation, research, and/or evaluation goals. We recommend that teams discuss and come to consensus on four RIF elements: (1) selected high-priority topic domains, e.g., site-level concerns, which may be higher-level or more closely focused on implementation theory constructs, as appropriate to the project; (2) what data sources will be included (e.g., data from provider or leadership interviews, surveys, or periodic reflections); (3) the preferred format for written and verbal RIF reports, including salient categories for organizing information (e.g., by site or professional role); and (4) the preferred frequency of sharing RIF reports. Given the established importance of identifying effective local champions in implementation [ 51 , 52 , 53 , 54 ], the domains of critical roles and service lines for implementation planning and launch are likely to be of value for many projects, as is the domain of implementation concerns, which encapsulates important doubts or anxieties expressed by respondents that may be addressable by the implementation team. Teams documenting shifts to the implementation approach in response to respondent feedback might also consider adding a tailoring/action items or next steps domain to track decisions made during discussions of RIF findings. With regard to frequency, weekly RIF reports worked well for EMPOWER 2.0 because this tempo aligned with existing meetings and the busy pace of pre-implementation activities, but this frequency may not be necessary for all teams. Dialogue on these issues is likely to help teams develop a shared understanding of how project goals will be operationalized, and may allow for more agile responses when change is needed or challenges arise.
There are several limitations to the process evaluation described here. First, it should be noted that periodic reflections were conducted by the first author, who has worked with most members of the implementation teams for at least five years. As an ethnographic method occurring repeatedly over time, reflections benefit from the long-term relationship built between discussion lead and participants, and may be subject to less reporting bias than other data collection methods [ 10 ]. Nonetheless, the potential for biased reporting should be acknowledged. We endeavored to ensure the accuracy, completeness, and trustworthiness of findings [ 39 , 55 , 56 ] by engaging in multiple rounds of member checking with the EMPOWER 2.0 team, first in dedicated meetings and later in preparing and revising this manuscript.
In considering the limitations of the RIF report as a methodological approach to support effective distillation and tailoring, it is important to note that this process was developed and executed by a highly trained and experienced team, which likely helped qualitative team members complete the structured reports in a timely and consistent manner. We found that analyses conducted for the RIF report were adequate to support all of the pre-implementation tailoring required for this initiative; however, projects – and particularly projects occurring earlier in the implementation pipeline than this hybrid type 3 trial – may vary in their early-stage analytic needs. Notably, no negative impacts associated with introducing the RIF were identified by team members; this may reflect the fact that the RIF report replaced other rapid qualitative analysis activities (e.g., developing structured summaries for each interview) rather than adding to the team workload. It should be noted that the EMPOWER 2.0 core team also builds on significant experience working together over time, which may have enhanced the quality of communication and coordination emerging from RIF updates. The RIF report may not be relevant or appropriate in implementation efforts where formative evaluation and/or tailoring are not intended or desirable (e.g., in implementation trials assessing the effectiveness of strategies that do not include planned tailoring), although its step-by-step process for synthesizing data relevant to high-priority topics for rapid communication is likely to have broad utility. Future research should consider whether the RIF report has generalizability as a method for use in less complex implementation studies, or by smaller or less experienced teams.
Rapid qualitative methods are a critical tool for enhancing implementation planning, communication, and tailoring, but can be challenging to execute in the context of complex implementation trials, such as those occurring across multiple sites and requiring coordination across implementation and evaluation teams. The RIF report extends rapid qualitative methods by providing a structured process to enhance focused data distillation and timely communication across teams, laying the groundwork for an up-to-date assessment of site readiness, improved identification and sensemaking around emergent problems, and effective and responsive tailoring to meet the needs of diverse sites.
The datasets generated and/or analysed during the current study are not publicly available as participants have not provided consent for sharing; de-identified portions may be available from the corresponding author on reasonable request.
Abbreviations
DPP: Diabetes Prevention Program
EBP: Evidence-Based Practice
EBQI: Evidence-Based Quality Improvement
EMPOWER: Enhancing Mental and Physical Health of Women Veterans through Engagement and Retention
QUERI: Quality Enhancement Research Initiative
REP: Replicating Effective Programs
RIF: Rapid Implementation Feedback
ROSE: Reach Out, Stay Strong Essentials
TLC: Telephone Lifestyle Coaching
VA: Veterans Affairs
VAMC: VA Medical Center
VISN: Veterans Integrated Service Network
Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implementation Sci. 2014;9(1):102.
Treichler EBH, Mercado R, Oakes D, Perivoliotis D, Gallegos-Rodriguez Y, Sosa E, et al. Using a stakeholder-engaged, iterative, and systematic approach to adapting collaborative decision skills training for implementation in VA psychosocial rehabilitation and recovery centers. BMC Health Serv Res. 2022;22(1):1543.
Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Science. 2013;8(1). Available from: https://doi.org/10.1186/1748-5908-8-117. Cited 2017 Mar 14.
Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516.
Cohen D, Crabtree BF, Damschroder LJ, Hamilton AB, Heurtin-Roberts S, Leeman J, et al. Qualitative Methods in Implementation Science. National Cancer Institute; 2018. Available from: https://cancercontrol.cancer.gov/sites/default/files/2020-09/nci-dccps-implementationscience-whitepaper.pdf
Cunningham-Erves J, Mayo-Gamble T, Vaughn Y, Hawk J, Helms M, Barajas C, et al. Engagement of community stakeholders to develop a framework to guide research dissemination to communities. Health Expect. 2020;23(4):958–68.
Hamilton AB, Brunner J, Cain C, Chuang E, Luger TM, Canelo I, et al. Engaging multilevel stakeholders in an implementation trial of evidence-based quality improvement in VA women’s health primary care. Behav Med Pract Policy Res. 2017;7(3):478–85.
Hamilton. Qualitative methods in rapid turn-around health services research. 2013 Dec 11; VA HSR&D Cyberseminar Spotlight on Women’s Health. Available from: http://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/780-notes.pdf
St. George SM, Harkness AR, Rodriguez-Diaz CE, Weinstein ER, Pavia V, Hamilton AB. Applying Rapid Qualitative Analysis for Health Equity: Lessons Learned Using “EARS” With Latino Communities. Int J Qual Methods. 2023;22:160940692311649.
Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. 2018;18(1):153.
Gertner AK, Franklin J, Roth I, Cruden GH, Haley AD, Finley EP, et al. A scoping review of the use of ethnographic approaches in implementation research and recommendations for reporting. Implementation Research and Practice. 2021;2:263348952199274.
Palinkas LA, Zatzick D. Rapid Assessment Procedure Informed Clinical Ethnography (RAPICE) in Pragmatic Clinical Trials of Mental Health Services Implementation: Methods and Applied Case Study. Adm Policy Ment Health. 2019;46(2):255–70.
Lanham HJ, McDaniel RR, Crabtree BF, Miller WL, Stange KC, Tallia AF, et al. How Improving Practice Relationships Among Clinicians and Nonclinicians Can Improve Quality in Primary Care. Jt Comm J Qual Patient Saf. 2009;35(9):457–66.
Miake-Lye IM, Delevan DM, Ganz DA, Mittman BS, Finley EP. Unpacking organizational readiness for change: an updated systematic review and content analysis of assessments. BMC Health Serv Res. 2020;20(1):106.
Finley EP, Closser S, Sarker M, Hamilton AB. Editorial: The theory and pragmatics of power and relationships in implementation. Front Health Serv. 2023;23(3):1168559.
Bartley L, Metz A, Fleming WO. What implementation strategies are relational? Using Relational Theory to explore the ERIC implementation strategies. Front Health Serv. 2022;17(2):913585.
Metz A, Jensen T, Farley A, Boaz A, Bartley L, Villodas M. Building trusting relationships to support implementation: A proposed theoretical model. Front Health Serv. 2022;23(2):894599.
Ketley D. A new and unique resource to help you spread and scale innovation and improvement. NHS Horizons. 2023. Available from: https://horizonsnhs.com/a-new-and-unique-resource-to-help-you-spread-and-scale-innovation-and-improvement/ Cited 2023 Mar 27
Dyer KE, Moreau JL, Finley E, Bean-Mayberry B, Farmer MM, Bernet D, et al. Tailoring an evidence-based lifestyle intervention to meet the needs of women Veterans with prediabetes. Women Health. 2020;60(7):748–62.
Goldstein KM, Melnyk SD, Zullig LL, Stechuchak KM, Oddone E, Bastian LA, et al. Heart Matters: Gender and Racial Differences Cardiovascular Disease Risk Factor Control Among Veterans. Women’s Health Issues. 2014;24(5):477–83.
Vimalananda VG, Biggs ML, Rosenzweig JL, Carnethon MR, Meigs JB, Thacker EL, et al. The influence of sex on cardiovascular outcomes associated with diabetes among older black and white adults. J Diabetes Complications. 2014;28(3):316–22.
Breland JY, Phibbs CS, Hoggatt KJ, Washington DL, Lee J, Haskell S, et al. The Obesity Epidemic in the Veterans Health Administration: Prevalence Among Key Populations of Women and Men Veterans. J GEN INTERN MED. 2017;32(S1):11–7.
Sheahan KL, Goldstein KM, Than CT, Bean-Mayberry B, Chanfreau CC, Gerber MR, et al. Women Veterans’ Healthcare Needs, Utilization, and Preferences in Veterans Affairs Primary Care Settings. J GEN INTERN MED. 2022;37(S3):791–8.
Hamilton AB, Finley EP, Bean-Mayberry B, Lang A, Haskell SG, Moin T, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER) 2.0 QUERI: study protocol for a cluster-randomized hybrid type 3 effectiveness-implementation trial. Implement Sci Commun. 2023 Mar 8;4(1):23.
Moin T, Damschroder LJ, AuYoung M, Maciejewski ML, Havens K, Ertl K, et al. Results From a Trial of an Online Diabetes Prevention Program Intervention. Am J Prev Med. 2018;55(5):583–91.
Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ. Implementation evaluation of the Telephone Lifestyle Coaching (TLC) program: organizational factors associated with successful implementation. Behav Med Pract Policy Res. 2017;7(2):233–41.
Zlotnick C, Tzilos G, Miller I, Seifer R, Stout R. Randomized controlled trial to prevent postpartum depression in mothers on public assistance. J Affect Disord. 2016;189:263–8.
Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implementation Science. 2007;2(1). Available from: https://doi.org/10.1186/1748-5908-2-42. Cited 2017 May 11.
Rubenstein LV, Stockdale SE, Sapir N, Altman L, Dresselhaus T, Salem-Schatz S, et al. A Patient-Centered Primary Care Practice Approach Using Evidence-Based Quality Improvement: Rationale, Methods, and Early Assessment of Implementation. J GEN INTERN MED. 2014;29(S2):589–97.
Hamilton AB, Farmer MM, Moin T, Finley EP, Lang AJ, Oishi SM, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER): a protocol for a program of research. Implementation Science. 2017;12(1). Available from: https://doi.org/10.1186/s13012-017-0658-9. Cited 2018 Jan 5.
Kligler B. Whole Health in the Veterans Health Administration. Glob Adv Health Med. 2022;11:2164957X2210772.
Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implementation Sci. 2022;17(1):75.
Yano EM, Darling JE, Hamilton AB, Canelo I, Chuang E, Meredith LS, et al. Cluster randomized trial of a multilevel evidence-based quality improvement approach to tailoring VA Patient Aligned Care Teams to the needs of women Veterans. Implementation Sci. 2015;11(1):101.
Nevedal AL, Reardon CM, Opra Widerquist MA, Jackson GL, Cutrona SL, White BS, et al. Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implementation Sci. 2021;16(1):67.
Kowalski C, Nevedal AL, Finley EP, Young J, Lewinski A, Midboe AM, et al. Raising expectations for rapid qualitative implementation efforts: guidelines to ensure rigor in rapid qualitative study design, conduct, and reporting. 16th Annual Conference on the Science of Dissemination and Implementation in Health; 2023 Dec 13; Washington, D.C.
Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implementation Sci. 2019;14(1):11.
Braun V, Clarke V. Thematic analysis. In: Cooper H, Camic PM, Long DL, Panter AT, Rindskopf D, Sher KJ, editors. APA handbook of research methods in psychology, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological. Washington: American Psychological Association. 2012;57–71. Available from: http://content.apa.org/books/13620-004 Cited 2023 Mar 28
Torrance H. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research. J Mixed Methods Res. 2012;6(2):111–23.
Birt L, Scott S, Cavers D, Campbell C, Walter F. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation? Qual Health Res. 2016;26(13):1802–11.
Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science. 2013;8(1). Available from: https://doi.org/10.1186/1748-5908-8-65. Cited 2017 Sep 5.
Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implementation Sci. 2021;16(1):36.
Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implementation Sci. 2019;14(1):58.
Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to Improve the Selection and Tailoring of Implementation Strategies. J Behav Health Serv Res. 2017;44(2):177–94.
Ware P, Ross HJ, Cafazzo JA, Laporte A, Gordon K, Seto E. User-Centered Adaptation of an Existing Heart Failure Telemonitoring Program to Ensure Sustainability and Scalability: Qualitative Study. JMIR Cardio. 2018;2(2):e11466.
Chen EK, Reid MC, Parker SJ, Pillemer K. Tailoring Evidence-Based Interventions for New Populations: A Method for Program Adaptation Through Community Engagement. Eval Health Prof. 2013;36(1):73–92.
Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts—the ADAPT guidance. BMJ. 2021;374:n1679.
Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science. 2015;10(1). Available from: https://doi.org/10.1186/s13012-015-0295-0. Cited 2017 Sep 5.
Fernandez ME, Ruiter RAC, Markham CM, Kok G. Intervention Mapping: Theory- and Evidence-Based Health Promotion Program Planning: Perspective and Examples. Front Public Health. 2019;7:209.
Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10(1). Available from: https://doi.org/10.1186/s13012-015-0209-1. Cited 2017 Nov 2.
Connelly B, Gilmartin H, Hale A, Kenney R, Morgon B, Sjoberg H. The Relational Facilitation Guidebook [Internet]. Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care; 2023 Feb. Available from: https://www.seattledenvercoin.research.va.gov/education/rc/docs/Relational_Facilitation_Guidebook.pdf Cited 2023 Mar 29
Bonawitz K, Wetmore M, Heisler M, Dalton VK, Damschroder LJ, Forman J, et al. Champions in context: which attributes matter for change efforts in healthcare? Implementation Sci. 2020;15(1):62.
Demes JAE, Nickerson N, Farand L, Montekio VB, Torres P, Dube JG, et al. What are the characteristics of the champion that influence the implementation of quality improvement programs? Eval Program Plann. 2020;80:101795.
Flanagan ME, Plue L, Miller KK, Schmid AA, Myers L, Graham G, et al. A qualitative study of clinical champions in context: Clinical champions across three levels of acute care. SAGE Open Medicine. 2018;6:205031211879242.
Wood K, Giannopoulos V, Louie E, Baillie A, Uribe G, Lee KS, et al. The role of clinical champions in facilitating the use of evidence-based practice in drug and alcohol and mental health settings: A systematic review. Implementation Research and Practice. 2020;1:263348952095907.
Morse JM, Barrett M, Mayan M, Olson K, Spiers J. Verification strategies for establishing reliability and validity in qualitative research. Int J Qual Methods. 2002;1(2):13–22.
Abraham TH, Finley EP, Drummond KL, Haro EK, Hamilton AB, Townsend JC, et al. A Method for Developing Trustworthiness and Preserving Richness of Qualitative Data During Team-Based Analysis of Large Data Sets. Am J Eval. 2021;42(1):139–56.
All views expressed are those of the authors and do not represent the views of the US Government or the Department of Veterans Affairs. The authors would like to thank the EMPOWER 2.0 QUERI team, the VA Women’s Health Research Network (SDR 10-012), the participating Veterans Integrated Service Networks, and the women Veterans who inspire this work. Dr. Hamilton is supported by a VA HSR&D Research Career Scientist Award (RCS 21-135). Dr. Moin also receives support from the NIH/NIDDK (R01DK124503, R01DK127733, and R18DK122372), the Centers for Disease Control and Prevention (U18DP006535), the Patient-Centered Outcomes Research Institute (PCORI; SDM-2018C2-13543), the Department of Veterans Affairs (CSP NODES, CSP#2002), and UCLA/UCOP.
We would like to acknowledge funding from the VA Quality Enhancement Research Initiative (QUERI; QUE 20–028), the VA QUERI Rapid Qualitative Methods for Implementation Practice Hub (QIS 22–234), and VA Health Services Research & Development (Hamilton; RCS 21–135).
Authors and affiliations.
Center for the Study of Healthcare Innovation, Implementation, and Policy (CSHIIP), VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA
Erin P. Finley, Joya G. Chrystal, Alicia R. Gable, Erica H. Fletcher, Agatha Palma, Ismelda Canelo, Rebecca S. Oberman, La Shawnta S. Jackson, Rachel Lesser, Tannaz Moin, Bevanne Bean-Mayberry, Melissa M. Farmer & Alison Hamilton
Joe R. & Teresa Lozano Long School of Medicine, The University of Texas Health Science Center at San Antonio, San Antonio, TX, USA
Erin P. Finley
David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, USA
Tannaz Moin, Bevanne Bean-Mayberry & Alison Hamilton
The original Rapid Implementation Feedback (RIF) report format was developed by JC, AG, AH, and EPF, with feedback from EHF, AP, IC, RO, LSJ, RL, TM, BBM, and MF. The analysis for this manuscript was planned by EPF, AH, JC, and AG. Preliminary analysis was conducted by EPF, with refinement and verification of findings provided by all authors during member checking meetings. The first draft was written by EPF, JC, AG, EHF, and AH. All authors reviewed, edited, and approved the final manuscript.
Correspondence to Erin P. Finley .
Ethics approval and consent to participate.
This proposal was funded through VA’s Quality Enhancement Research Initiative (QUERI), which uses operational funds to support program improvement. QUERI projects are conducted as quality improvement for the purposes of program implementation and evaluation and are approved as such by the main VA operations partner, which was the VA Office of Patient Care Services for EMPOWER 2.0 (approval received 11/26/2019). All interview participants provided oral, recorded consent for participation.
Not applicable.
Erin P. Finley and Alison Hamilton are on the editorial board for Implementation Science Communications.
Publisher’s note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary material 1.
Rights and permissions.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article.
Finley, E.P., Chrystal, J.G., Gable, A.R. et al. The Rapid Implementation Feedback (RIF) report: real-time synthesis of qualitative data for proactive implementation planning and tailoring. Implement Sci Commun 5 , 69 (2024). https://doi.org/10.1186/s43058-024-00605-9
Received : 14 December 2023
Accepted : 09 June 2024
Published : 21 June 2024
DOI : https://doi.org/10.1186/s43058-024-00605-9
ISSN: 2662-2211
Health Research Policy and Systems, volume 22, Article number: 72 (2024)
In the Netherlands, university medical centres (UMCs) bear primary responsibility for conducting medical research and delivering highly specialized care. The TopCare program was a policy experiment lasting 4 years in which three non-academic hospitals received funding from the Dutch Ministry of Health to also conduct medical research and deliver highly specialized care in specific domains. This study investigates research collaboration outcomes for all Dutch UMCs and non-academic hospitals in general and, more specifically, for the domains in the non-academic hospitals participating in the TopCare program. Additionally, it explores the organizational boundary work employed by these hospitals to foster productive research collaborations.
A mixed-methods research design was employed, combining quantitative bibliometric analysis of publications and citations across all Dutch UMCs and non-academic hospitals and within the TopCare domains, analysis of geographical distances, document analysis and ethnographic interviews with actors in the TopCare program.
Quantitative analysis shows that, over the period of study, international collaboration increased among all hospitals while national collaboration and single institution research declined slightly. Collaborative efforts correlated with higher impact scores, and international collaboration scored higher than national collaboration. A total of 60% of all non-academic hospitals’ publications were produced in collaboration with UMCs, whereas almost 30% of the UMCs’ publications were the result of such collaboration. Non-academic hospitals showed a higher rate of collaboration with the UMC that was nearest geographically, whereas TopCare hospitals prioritized expertise over geographical proximity within their specialized domains. Boundary work mechanisms adopted by TopCare hospitals included aligning research activities with organizational mindset (identity), bolstering research infrastructure (competence) and finding and mobilizing strategic partnerships with academic partners (power). These efforts aimed to establish credibility and attractiveness as collaboration partners.
Research collaboration between non-academic hospitals and UMCs, particularly where this also involves international collaboration, pays off in terms of publications and impact. The TopCare hospitals used the program’s resources to perform boundary work aimed at becoming an attractive and credible collaboration partner for academia. Local factors such as research history, strategic domain focus, in-house expertise, patient flows, infrastructure and network relationships influenced collaboration dynamics within TopCare hospitals and between them and UMCs.
Research collaboration has taken flight worldwide in recent decades [ 1 ], as reflected by the growing number of authors listed on research papers [ 2 , 3 ]. Collaborative research has become the norm for many, if not most, scientific disciplines [ 4 , 5 , 6 , 7 , 8 ]. Several studies have found a positive relationship between collaboration and output [ 9 , 10 , 11 , 12 , 13 ]. Publications resulting from research collaborations tend to be cited more frequently [ 14 , 15 , 16 , 17 , 18 ] and to be of higher research quality [ 5 , 14 , 19 , 20 ]. In particular, international collaboration can lead to more citations [ 17 , 21 , 22 , 23 , 24 ], although there are major differences internationally and between fields [ 25 ]. Moreover, international collaboration is often set as an eligibility requirement for European research grants, which have become necessary as national-level resources dwindle. Funding consortia also encourage and require boundary crossings, such as research collaborations between academia and societal partners. Collaboration within public research organizations and universities further plays a crucial role in the international dissemination of knowledge [ 26 ].
In the medical domain, initiatives have been rolled out in numerous countries to encourage long-term collaboration and the exchange of knowledge and research findings. Each initiative takes a strategic approach to assembling the processes needed to support these exchanges across the boundaries of stakeholder groups. In the Netherlands, medical research has traditionally been concentrated in public academia, especially the university medical centres (UMCs). Increasingly, however, research activities are being undertaken in non-academic teaching hospitals (hereafter, non-academic hospitals), driven by their changing patterns of patient influx. In 2013, a Dutch study based on citation analysis showed that collaboration between UMCs and non-academic hospitals leads to high-quality research [ 27 ]. There was further encouragement for medical research in Dutch non-academic hospitals in 2014, when a 4-year policy experiment, the TopCare program, was launched, with three such hospitals receiving additional funding from the Ministry of Health to also provide highly specialized care and undertake medical research. Funding for this combination of care and research is available for UMCs under the budgetary “academic component” of the Dutch healthcare system. Such additional funds are not available for non-academic hospitals, nor can they allocate their regular budgets to research. In the past, these hospitals managed to conduct research and provide specialized care through their own financial and time investments, or by securing occasional external research funding. The TopCare policy experiment was thus meant to find new ways of organizing and funding highly specialized care and medical research in non-academic hospitals.
Despite the increasing emphasis on research collaboration, we still know little about its impact and how it can be achieved. This study integrates two sides of research collaboration in Dutch hospitals and combines elements of quantitative and qualitative research for a broad (output and impact) and deep (boundary work to achieve collaboration) understanding of the phenomenon. We define research collaboration as collaboration between two or more organizations (at least one being a UMC or non-academic hospital) that has resulted in a co-authored (joint) scientific publication [ 28 ]. The research questions are: How high is the level of collaboration in the Dutch medical research field, what is the impact of collaboration, and how are productive research collaborations achieved?
To answer these questions, we performed mixed methods research in UMCs and non-academic hospitals. To examine the impact of various collaboration models – namely, single institution, national and international – across all eight Dutch UMCs and 28 non-academic hospitals between 2009 and 2018/2019, we conducted a bibliometric analysis of publications and citations. We additionally carried out a similar analysis for the TopCare non-academic hospitals between 2010 and 2016 to examine the effects of collaboration in the two domains funded through the program at each hospital. The latter timeframe was chosen to match the duration of the program, 2014–2018. We further conducted an in-depth qualitative analysis of the organizational boundary work done by two non-academic hospitals participating in the TopCare program to initiate and enhance productive research collaborations around specialized research and care within and between hospitals on a national level. Historically, such endeavours have been predominantly reserved for UMCs. The program was therefore a unique opportunity to examine such boundary work.
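The three collaboration models compared in the bibliometric analysis (single institution, national, international) can be illustrated with a brief sketch. This is our own illustration under simple assumptions, not the authors' actual pipeline, and the sample records below are hypothetical: each publication is assigned to one model based on the institutions and countries listed in its author affiliations.

```python
# Illustrative sketch (not the study's actual bibliometric pipeline):
# classify a publication by its author affiliations, given as
# (institution, country) pairs. Hospital/UMC names are hypothetical.

def classify_collaboration(affiliations):
    """Return the collaboration model for one publication."""
    institutions = {inst for inst, _ in affiliations}
    countries = {country for _, country in affiliations}
    if len(countries) > 1:          # authors span more than one country
        return "international"
    if len(institutions) > 1:       # multiple institutions, one country
        return "national"
    return "single institution"     # all authors share one institution

publications = [
    [("Hospital A", "NL")],                         # one institution only
    [("Hospital A", "NL"), ("UMC B", "NL")],        # two Dutch institutions
    [("UMC B", "NL"), ("University C", "DE")],      # cross-border co-authorship
]

for pub in publications:
    print(classify_collaboration(pub))
```

A real analysis would additionally need to normalize affiliation strings (the same hospital is often listed under several name variants) before counting distinct institutions.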
The landscape of medical research in the Netherlands

Collaboration in medical research
The Netherlands has a three-tiered hospital system: general hospitals (including non-academic hospitals), specialized hospitals focusing on a specific medical field or patient population, and UMCs. Currently, there are 7 UMCs, 17 specialized hospitals and 58 general hospitals, of which 26 are non-academic [ 29 ].
UMCs receive special funding (the budgetary “academic component”) for research and oversee medical training programs in their region. Non-academic hospitals do not receive structural government funding for medical research and have less chance of obtaining other funding because they are not formally acknowledged as knowledge-producing organizations. Research has less priority in most of these hospitals than in UMCs. Following the introduction of government policies promoting competition in healthcare and the development of quality guidelines emphasizing high-volume treatments, some non-academic hospitals began focusing on specific disease areas in a bid to distinguish themselves from other hospitals and to perform research in, and hence develop more knowledge about, these priority areas. This led to a greater concentration of highly specialized care [ 30 ]. Non-academic hospitals have also become important partners in medical research for UMCs due to their large patient volumes.
To further stimulate research in non-academic hospitals, the Ministry of Health awarded three such hospitals €28.8 million in funding over a 4-year period (2014–2018) to support medical research and specialized care for which they do not normally receive funding [ 31 ]. It should be noted that, in non-academic hospitals, the concept of highly specialized research and care applies not to the entire hospital but rather to specific departments or disease areas. This is why the TopCare non-academic hospitals have been evaluated on the basis of specific domains. The funding recipients were two non-academic hospitals and one specialized hospital. In this article, we focus on UMCs and general non-academic hospitals and therefore excluded the specialized hospital from our analysis. Hospital #1 is the largest non-academic hospital in the Netherlands (1100 beds), even larger than some UMCs. Its fields of excellence (known as “domains”) are lung and heart care. Hospital #2 is a large non-academic hospital (950 beds) that focuses on emergency care and neurology. According to the two hospitals, these four highly specialized care and research-intensive domains are comparable to high-complexity care and research in UMCs [ 31 ].
The TopCare program ran through ZonMw, the Netherlands Organization for Health Research and Development, the main funding body for health research in the Netherlands. ZonMw established a committee to assess the research proposals and complex care initiatives of the participating hospitals and to set several criteria for funding eligibility. One requirement was that participating hospitals had to collaborate with universities or UMCs on research projects and were not allowed to conduct basic research in the context of the program, as this was seen as the special province of UMCs.
In the qualitative part of this study, we analyse the boundary work done by actors to influence organizational boundaries as well as the practices undertaken to initiate or enhance collaboration between TopCare non-academic hospitals and academia (universities and UMCs). We refer to boundary work when actors create, shape or disrupt organizational boundaries [ 32 , 33 , 34 , 35 ]. In particular, boundary work involves opening a boundary for collaboration and creating linkages with external partners [ 36 ]. In this article, we use three organizational boundary concepts – “identity”, “competence” and “power” – out of four presented by Santos and Eisenhardt. These concepts are concerned with fostering collaboration, whereas the fourth is concerned with “efficiency” and is less relevant here. Identity involves creating a reputation for research to become an attractive partner while preserving identity. Competence involves creating opportunities for research, for example, in manpower and infrastructure. Finally, power involves creating a negotiating position vis-à-vis relevant others [ 35 ].
The data for this study consist of different types of analysis: (1) quantitative bibliometric data on the publications and citations of all eight Dutch UMCs and 28 non-academic hospitals, and (2) quantitative bibliometric data on the publications and citations in the four domains of two TopCare non-academic hospitals, qualitative (policy) document analysis and in-depth ethnographic interviews with various actors in the Dutch TopCare program. The quantitative data collected from Dutch UMCs and non-academic hospitals were utilized to contextualize data gathered within the TopCare program. We discuss and explain the data collection and methodology in detail in the two sections below.
Quantitative approach: bibliometric analysis of all 8 Dutch UMCs and 28 non-academic hospitals
We performed a bibliometric analysis of the publications of 28 non-academic hospitals and 8 UMCs Footnote 1 in the Netherlands between 2009 and 2018. Data for the study were derived from the Center for Science and Technology Studies’ (CWTS) in-house version of the Web of Science (WoS) database. The year 2009 was chosen because the address affiliations in publications are more accurately defined from this year onward. To examine trends over time, we divided the period 2009–2018/2019 into two blocks of 4 years and an additional year for citation impact measurement (2009–2012/2013 and 2014–2017/2018; see explanation in Appendix 1).
The bibliometric analysis includes several bibliometric indicators that describe both the output and impact of the relevant research (Table 5 in Appendix 1). One of the indicators, the mean normalized citation score (MNCS), reveals the average impact of a hospital’s publications compared with the average score of all other publications in that area of research. If the MNCS is higher than 1, then on average, the output of that hospital’s domain is cited more often than an “average” publication in that research area.
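The MNCS calculation can be illustrated with a minimal sketch. The publication records and field baselines below are hypothetical stand-ins; in practice, CWTS derives the field/year baselines from the full Web of Science database.

```python
def mncs(publications, field_baselines):
    """Mean normalized citation score: each publication's citation count
    is divided by the average citations of all publications in the same
    field and year, and the resulting ratios are averaged."""
    ratios = [p["citations"] / field_baselines[(p["field"], p["year"])]
              for p in publications]
    return sum(ratios) / len(ratios)

# Hypothetical example: two publications in fields with different baselines.
pubs = [
    {"citations": 12, "field": "cardiology", "year": 2015},
    {"citations": 3, "field": "neurology", "year": 2015},
]
baselines = {("cardiology", 2015): 8.0, ("neurology", 2015): 4.0}

print(mncs(pubs, baselines))  # (12/8 + 3/4) / 2 = 1.125, above the average of 1
```

A score above 1 indicates that the set of publications is, on average, cited more often than comparable publications in the same fields and years.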
To map the ways hospitals cooperate, we follow two lines of analysis. The first is centred around a typology of scientific activities and differentiates between (i) a single institution (SI; all publications with only one address) and (ii) international collaboration (IC; collaboration with at least one international partner). All other publications are grouped as (iii) national collaboration (NC; collaboration with Dutch organizations only).
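This typology can be sketched as a simple classification over a publication's affiliation addresses. The address fields below are hypothetical stand-ins for the WoS affiliation data used in the study.

```python
def collaboration_type(addresses):
    """Classify a publication by its affiliation addresses:
    SI - single institution (only one address)
    IC - international collaboration (at least one non-Dutch partner)
    NC - national collaboration (multiple Dutch organizations only)"""
    institutions = {(a["org"], a["country"]) for a in addresses}
    if len(institutions) == 1:
        return "SI"
    if any(a["country"] != "NL" for a in addresses):
        return "IC"
    return "NC"

print(collaboration_type([{"org": "UMC A", "country": "NL"}]))      # SI
print(collaboration_type([{"org": "UMC A", "country": "NL"},
                          {"org": "Clinic B", "country": "DE"}]))   # IC
print(collaboration_type([{"org": "UMC A", "country": "NL"},
                          {"org": "Hospital C", "country": "NL"}])) # NC
```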
The second line is centred around geographical distance and size of collaboration. The geographical distances between each non-academic hospital and each of the eight UMCs were measured in Google Maps. The size of collaboration was measured by counting the joint publications of each non-academic hospital and the eight UMCs. Subsequently, we assessed whether the non-academic hospitals also had the most joint publications with the nearest UMC.
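The nearest-UMC check in this second line of analysis can be sketched as follows. The distance and co-publication figures are hypothetical; in the study, distances were measured in Google Maps and joint publications were counted from the bibliometric data.

```python
def collaborates_with_nearest(distances_km, joint_pubs):
    """Return True if the UMC with the most joint publications is also
    the geographically nearest UMC for this non-academic hospital."""
    nearest = min(distances_km, key=distances_km.get)
    top_partner = max(joint_pubs, key=joint_pubs.get)
    return nearest == top_partner

# Hypothetical figures for one non-academic hospital.
distances = {"UMC A": 12.0, "UMC B": 45.0, "UMC C": 80.0}
pubs = {"UMC A": 310, "UMC B": 120, "UMC C": 40}

print(collaborates_with_nearest(distances, pubs))  # True
```

Applying this check to each of the 28 non-academic hospitals yields the share reported in the Results.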
Quantitative and qualitative approach to the two TopCare hospitals and their four domains, the “TopCare program” case study
The quantitative approach to the TopCare program relies on a bibliometric analysis of publications within each hospital’s two domains: lung and heart care in TopCare non-academic hospital #1, and trauma and neurology in TopCare non-academic hospital #2. Our bibliometric analysis focused on publications within the four selected TopCare domains between 2010 and 2016, following the same methodology described in the previous section under ‘Data collection’. Each domain provided an overview of its publications. The number of publications produced by the two domains at each TopCare hospital is combined in the results. Although this timeframe differs from the broader analysis of all UMCs and non-academic hospitals, comparing these two periods offers insights into the “representative position” of the two domains of each non-academic hospital participating in the TopCare program, in terms of publications and citations.
We took a qualitative approach to analysing the collaborative activities in the two TopCare non-academic hospitals, where each domain has its own leadership arrangements, regional demographic priorities and history of research collaboration [cf. 37 ]. This part of the study consisted of interviews and document analysis.
Over the course of the 4-year program, J.P. and/or R.B. conducted and recorded 90 semi-structured interviews that were then transcribed. For this study, we used repeated in-depth ethnographic interviews with the main actors in the Dutch TopCare program, which took place between 2014 and 2018. We conducted a total of 27 interviews; 20 of the interviews were with a single person, 5 with two persons, and 2 with three persons. The interviews were held with 20 different respondents; 12 respondents were interviewed multiple times. Table 1 presents the different respondents in non-academic hospitals #1 and #2.
Desk research was performed for documents related to the TopCare program (Table 6 – details of document analysis in Appendix 1).
The bibliometric analysis of the four domains in the two TopCare non-academic hospitals follows the same methodology as described in Abramo et al. [ 1 ].
We tested the assumption that joint publications are most frequent between a non-academic hospital and its nearest UMC. We classified the geographical distance between a TopCare non-academic hospital and a collaborating academic partner as “nearby” when both work within the same region.
The ethnographic interviews were audio-recorded and transcribed in full with the respondents’ permission. These transcripts were subject to close reading and coding by two authors, J.P. and J.O., to identify key themes derived from the theory [ 35 ] (Table 7 in the Appendix). These were then discussed and debated with the wider research team with the goal of developing a critical interpretation of the boundary work done to initiate or enhance research collaboration [cf. 37 ]. The processed interview data were submitted to the respondents for a member check. All respondents gave permission to use the data for this study, including the specific quotes. Under Dutch law, this type of research does not require ethical approval.
Triangulating the results of the document analysis and the interviews enables us to identify different overarching themes within each boundary concept (identity, competence and power). These themes were utilized as a framework for structuring individual paragraphs, which we explain in greater detail in Table 4 in the Results.
Bibliometric analysis of all Dutch UMCs and non-academic hospitals
This section reports the results of the quantitative bibliometric analysis of the output, trends and impact of collaboration between all UMCs and non-academic hospitals from 2009 to 2018/2019. It provides a broad picture of the output – in terms of research publications – of both existing and ongoing collaborations between all UMCs and non-academic hospitals within the specified timeframe. It furthermore describes the analysis results concerning the relationship between collaboration and the geographical distance between two collaborating hospitals.
The first step in understanding the degree of collaboration between hospitals is to measure the research output by number of publications. The total number of publications between 2009 and 2018 is shown in Table 8 ( Appendix 1) and Fig. 1 .
Types of collaboration for UMCs and non-academic hospitals from 2009 to 2018/2019. # Total number of publications. Percentage of total (100%) accounted for by single institution, national collaboration and international collaboration
The majority of these publications (89%) are affiliated with UMCs. UMCs, in particular, tend to have a relatively higher proportion of single-institution publications and are more engaged in international collaboration. This pattern may be indicative of UMCs’ enhanced access to research grants and EU subsidies, as well as their active involvement in international consortia.
Collaboration between UMCs and non-academic hospitals appears to be more prevalent and impactful for non-academic hospitals than for UMCs: 70% of all publications originating from a non-academic hospital were the result of joint efforts between a UMC and a non-academic hospital, whereas only 8% of all UMC publications were produced in collaboration with a non-academic hospital (Table 8 in Appendix 1).
Table 9 (Appendix 1) and Fig. 2 show the relative number of publications of all 8 UMCs and all 28 non-academic hospitals in the two periods: 2009–2012/2013 and 2014–2017/2018. For both UMCs and non-academic hospitals, international collaboration accounted for a relatively larger share of publications in recent years.
Type of research collaboration for UMCs and non-academic hospitals over time. Percentage of total (100%) accounted for by single institution, national collaboration and international collaboration in each period
As the non-academic hospitals often collaborate with UMCs, it is worth analysing these collaborations in terms of geographical distance. The assumption is that geographical proximity matters, with joint publications being most frequent between a non-academic hospital and its nearest UMC.
Figure 3 shows that 61% (17 out of 28) of the non-academic hospitals collaborate most frequently with their nearest UMC. Geographical proximity is thus an important but not the only determining factor in collaboration.
Collaboration with nearest UMC from 2009 to 2018
The mean normalized citation scores (MNCS) shown in Table 2 cover all 8 UMCs and 28 non-academic hospitals.
The MNCS in Table 2 and the mean normalized journal scores (MNJS) in Table 10 (Appendix 1) show similar patterns. The impact score for both UMCs and non-academic hospitals is greatest for international collaboration. Non-academic hospitals’ single-institution publications score lower than the global average, which was defined as 1.
In sum, the quantitative analysis reveals two trends. The first is growth in international collaboration for all UMCs and non-academic hospitals over time; such collaboration is also associated with higher MNCS impact scores. Second, geographical proximity between UMCs and non-academic hospitals is an important but not the only determining factor in collaboration. This is the context in which the TopCare program operated in 2014–2018.
“TopCare program” case study
This section presents the results of our analysis of the collaboration networks of the two TopCare non-academic hospitals, consisting of: (1) quantitative bibliometric analysis of the output and impact of these networks between 2010 and 2016, along with the geographical distance to their academic partners, and (2) qualitative ethnographic interviews to identify the boundary work conducted by these hospitals.
Bibliometric analysis of the two TopCare non-academic hospitals’ international and national collaboration networks across four domains
The results of the bibliometric analysis indicate the representative positions of the two domains within each TopCare non-academic hospital. Between 2010 and 2016, these hospitals generated a higher number of single-institution publications compared with the average of all non-academic hospitals. Percentage-wise, their output resembled that of the UMCs, underscoring their leading positions in their respective domains. The percentage of publications based on national collaboration in the domains of TopCare hospital #2 is comparable to that of non-academic hospitals overall, while there is more international collaboration in the domains of TopCare hospital #1 than at non-academic hospitals overall (Fig. 4 , Appendix 1 and Fig. 1 ). The impact of the research is above the global average, and the publications have a higher average impact when there is collaboration with international partners; this is true across all four domains (Table 11 in Appendix 1).
In terms of geographical distance, only the neurology domain of TopCare hospital #2 collaborates with an academic partner within the same region. All other domains collaborate with partners outside the region, a striking difference from the geographical results shown in Fig. 3 .
Ethnographic analysis
This section reviews the results of our ethnographic analysis of the two TopCare hospitals from 2014 to 2018. To analyse the boundary work these hospitals performed to initiate and/or enhance productive research collaborations, we use the framework suggested by Santos and Eisenhardt (2005) for examining organizational boundary work through the concepts of identity, competence and power. Table 3 provides a description of each boundary and how these concepts are defined in our case study on the basis of the overarching themes in the document analysis and the interviews.
In the TopCare program, the non-academic hospitals used their unique history and expertise to create a joint research focus in a domain and to enhance their positions and influence their collaboration with UMCs and universities.
A manager in hospital #1’s lung domain explained the work being done from a historical perspective, emphasizing not only the innovative history of the hospital but also its central position in patient care:
The first-ever lung lavage, lung transplant and angioplasty were performed in this hospital. Nationally, this hospital has always, and we’re talking about 50–60 years ago now, been at the forefront, and has always invested in this line of research and care. So that is truly institutionally built, there is just that history and you can’t just copy that. And we have the numbers: for interstitial lung diseases, we have 2000 patients in our practice and receive 600 new patients per year. (interview with manager at hospital #1 in 2018).
To explain why patient care and research into rare interstitial lung diseases is centred in hospital #1 as a strategic domain focus, a leading international pulmonary physician – a “boundary spanner” (see below) – pointed to the importance of building team expertise and creating facilities:
I lead that care program for interstitial lung diseases and preside over the related research. I’ve often been asked: you’re a professor, so why don’t you go to a UMC, couldn’t you do much more there? But the care was developed here [in this hospital]. The expertise needed to recognize interstitial lung diseases depends not only on me but also on the radiologist and pathologist; together we have a team that can do this. We have created facilities that no other hospital has for these diseases. If I leave to do the same work in a UMC, I’d have to start over and I’d be going back 30 years. (interview with pulmonary physician at hospital #1 in 2014).
The doctors working in this hospital’s lung and heart domains finance the working hours they put into research themselves. “This fits in with the spirit of a top clinical hospital and the entrepreneurial character of our hospital.” (interview with project leader at hospital #1 in 2018).
Hospital #2, the result of a merger in 2016, struggled to find its strategic focus. A surgical oncologist at this hospital clarified one of the disadvantages of the merger: “People are [still] busy dealing with the money and positions, and the gaze is turned inward, the primary processes. So clinical research is very low on the agenda.” She continued by saying that a small project team acting on behalf of the hospital’s board of directors (BoD) was seeking the best-fit profile for the program, which had raised some opposition in departments excluded from the chosen strategic focus. As a consequence, the hospital had begun to showcase its highly specialized care in the field of neurosurgical treatments. It had a long history and was the first to use a Gamma Knife device for treating brain tumours. The experts in this domain could thus act as authorities, and they became a national centre of expertise. Their strategic partner was a nearby UMC, and they treated relevant patients from other hospitals in their region.
To generate impact, research priorities in a domain are aligned with the focus of the hospital. A member of the BoD of hospital #2 stressed the urgency of “specializing or focusing on a particular area of care” and emphasized that the TopCare budget was being utilized to create a joint focus within a domain. The resulting collective identity mobilized internal support and was recognized as valuable by third parties. An important reason for both hospitals to join the TopCare program was to be able to position themselves strategically as attractive and credible research partners:
The focus is on the domains of neurology and trauma because we think as a non-academic hospital we have something extra to offer: the very close relationship between patient care and research, because we have a larger number of patients of this type here than the universities. (interview with care manager at hospital #2 in 2013).
In short, the boundary of identity requires a closer alignment between these hospitals’ research activities and their strategic objectives and organizational mindset, and demands that they also showcase their staff’s expertise. The TopCare program offered opportunities to transform and consolidate their identity by enhancing their value proposition, that is, their unique history, strategic domain focus, expertise and number of patients.
All domains in the TopCare program chose to utilize the TopCare funding to invest in their research infrastructure, and to build research networks to share and learn. A research infrastructure consists of all the organizational, human, material and technological facilities needed for specialist care and research [ 31 ].
The TopCare data show that funding is essential for generating research impact. A manager at hospital #1 described its current financial circumstances:
A lot of research and much of the care is currently not funded, it is actually paid for mostly by the hospital... We have had massive budgetary adjustments the past two or three years. ...It is increasingly difficult to finance these kinds of activities within your own operation. (interview with manager at hospital #1 in 2018).
The TopCare funding was used to enhance the material infrastructure in hospital #1’s heart domain:
A number of things in healthcare are really terribly expensive, and there is simply no financing at all for them. …Cardiac devices, for example. We are constantly trying things out, but there’s no compensation for it. (interview with project leader at hospital #1 in 2018).
Hospital #1 had a long-standing and firm relationship with a UMC in the lung domain, giving it a solid material infrastructure. For example, there were spaces where researchers, especially PhD students, could meet, collaborate and share knowledge [ 31 ]. Another essential part of the material infrastructure for the lung domain was the biobank, as highlighted by a leading international pulmonary physician:
Our board of directors made funds available through the innovation fund to start up a biobank, but developing it and keeping it afloat has now been made possible thanks to the TopCare funding. It’s a gift from heaven! It will allow for further expansion and we can now seek international cooperation. (interview with pulmonary physician at hospital #1 in 2014).
Notably, the program allowed both non-academic hospitals to digitize their infrastructure, for example, with clinical registration and data management systems. According to an orthopaedic surgeon at hospital #2, “Logistics have been created, which can very easily be applied to other areas. By purchasing a data system, everyone can record data in a similar way.”
Besides investing in data infrastructure, the human dimension was another crucial factor in the research infrastructure. Instead of working on research “at night”, it became embedded in physicians’ working hours. All domains indicated the importance of having researchers, statisticians and data management expertise available to ensure and enhance the quality of research, and both hospitals invested in research staffing.
After losing many research-minded traumatologists to academia, hospital #2 decided to invest in dedicated researchers to form an intermediate layer of full-time senior researchers linked to clinicians within the two domains.
I personally think this is the most important layer in a hospital, with both a doctor and a senior researcher supervising students and PhD candidates. Clinicians ask practical questions and researchers ask a lot of theoretical questions. Both perspectives are needed to change practices. I have also learned that it takes a few years before the two can understand each other’s language. (interview with neurosurgeon at hospital #2 in 2018).
The program offered the hospitals opportunities to structure internal forms of collaboration and build a knowledge base within a domain. For example, hospital #1 organized educational sessions with all PhD students in the heart domain.
Having more researchers working in our hospital has given the whole research culture a boost, as well as the fact that they are producing more publications and dissertations. (interview with cardiologist at hospital #1 in 2018).
Hospital #2 also encouraged cross-domain learning by organizing meetings between the neurology and trauma domains.
You know, you may not be able to do much together content-wise, but you can learn a lot from each other in terms of the obstacles you face (interview with project manager at hospital #2 in 2016).
At the beginning there was resistance to participating in the program.
It was doom and gloom; without more support, groups refused to join. That kind of discussion. So the financial details have been important in terms of willingness to participate. (interview with surgical oncologist at hospital #2 in 2018).
Another obstacle was local approval for multicentre studies, which led to considerable delay (interview with psychologist at hospital #2 in 2018). Overall, the TopCare program created a flywheel effect for other domains that proved essential for internal collaborations (interview with surgical oncologist at hospital #2 in 2018).
In hospital #1, collaboration between the heart and lung domains grew closer.
Divisions between the different disciplines are much less pronounced in our hospital than in UMCs. So it’s much easier to work together. We’d already collaborated closely on lung diseases, and this has improved during the program. (interview with cardiologist at hospital #1 in 2016)
At the network level, the TopCare data show that most researchers participated in national networks. For example, the neurology domain in hospital #2 had established a network of 16 non-academic hospitals. Limited funding prevented researchers at non-academic hospitals from attending many international seminars, and they had more trouble building their international networks. One exception concerned the researchers in the lung domain of hospital #1, who expanded their international network by organizing an international seminar during the TopCare program and by contributing to other national and international seminars.
Each TopCare domain provided highly specialized care and wanted to become a centre of expertise. However, a hospital can only provide highly specialized care if research is conducted to determine the best treatment strategies. The data show how the two are interwoven.
For example, a PhD student has sought to collaborate with a UMC on a specific aorta subject in which we have greater expertise and more volume in terms of patients than UMCs. Based on this link with this UMC, a different policy was drawn up and also implemented immediately in all kinds of other UMCs. (interview with cardiologist at hospital #1 in 2018).
Often, a leading scientist who is the driving force behind a domain in a hospital is a “boundary spanner”, a person in a unique position to bridge organizational boundaries and foster research collaboration by “enabling exchange between production and use of knowledge” [ 40 , p. 1176], [ 41 ]. For example, the leading pulmonary physician in hospital #1 is a boundary spanner who has done a huge amount of work to enhance collaboration. With interstitial lung disease care being concentrated here, this professor can offer fellowships and stimulate virtual knowledge-sharing by video conferencing for “second-opinion” consultations. The TopCare funding was used to finance this. The network is successful at a non-academic level.
These consultations are with colleagues in other hospitals and they avoid patients having to be referred. (interview with project leader at hospital #1 in 2018). Our network now [in 2018] consists of more than 14 hospitals, which we call every week to discuss patients with an interstitial lung disease. …UMCs participate indirectly in this network. For example, the north has a specific centre for this disease in a non-academic hospital and a nearby UMC refers patients to this centre, who are then discussed in our network. (interview with pulmonary physician at hospital #1 in 2018).
This physician also noted that the network was still growing; other colleagues from non-academic hospitals wanted to join it.
Yesterday, colleagues from XX and XX were here. And they all said, “I’ve never learned so much about interstitial lung diseases.” We’re imparting enormous amounts of expertise. (interview with pulmonary physician at hospital #1 in 2018).
In sum, focusing on the boundary of competence, the TopCare hospitals created and mobilized resources to invest in their research infrastructure. In every domain, this infrastructure was used to strengthen the relationship between research, care and education, and to build and enhance internal and external research networks to share and learn.
For TopCare non-academic hospitals, the boundary of power is concerned with creating the right sphere of influence, meaning BoDs and administrators attempt to find and mobilize new strategic partners and build mutual relationships with various stakeholders at different levels.
A project leader at hospital #2 emphasized that the additional resources of the TopCare program created an opportunity for the non-academic hospitals “to show our collaborative partners that we’re a valuable partner.” For once, the tables were turned:
We’ve always had a good relationship with one UMC; they always used the data from our surgeries. But it’s nice that we can finally ask them whether they want to join us. That makes it a little more equal, and we can be a clinical partner. (interview with neurosurgeon at hospital #2 in 2018).
One of the requirements in each domain when applying to ZonMw for funding was alignment with academia in a research and innovation network. Collaboration often appeared more difficult at the administrative level when the academic partners worked in the same field of expertise, and tended to be more successful when the partners focused on different fields, where their interests did not conflict. According to a board member at hospital #2 who played a crucial role in a partnership agreement, a conscious decision was taken beforehand to seek partners beyond the medical domain as well.
There may be conflict with other groups within the walls of a UMC and I don’t see that as promising. You have to work together and we aren’t in a real position to do so. (interview with board member at hospital #2 in 2018).
Just before the end of the program, it was announced that this hospital had concluded a partnership agreement with a university to broaden their joint research program alongside neurology and trauma. An important prerequisite was that both organizations invest 1 million euros in the partnership. The board member revealed that the relationship with this university had in fact existed for some time:
So we went and talked to the university and they became interested. Then the top level was reorganized and replaced and we had to start from scratch again. That took a lot of time. Our goals were to awaken the enthusiasm of the board and at least three deans, otherwise it would be a very isolated matter. And we succeeded. Last week we had a matchmaking meeting at the university and there were about 50 pitches showing how we could be of value to each other. (interview with board member at hospital #2 in 2018)
Looking back, he defined the conditions for a successful collaboration with academia:
In terms of substance, the two sides have to be going in the same direction and complement each other, for example, in expertise, techniques, and/or facilities. And what is really important is that people know each other and are willing to meet each other…and there must be appreciation. (interview with board member at hospital #2 in 2018).
The trauma domain in hospital #2 wanted to become a trauma research centre in its region, and after investing in its research infrastructure, it found a new strategic academic partner:
We have also found new partners, for example, the Social Health Care Department of a UMC [name]. And that really has become a strong partnership; the intent was there for years, but we had no money. (interview with epidemiologist at hospital #2 in 2018).
The neurology domain at this hospital worked to form a network with a university of technology and a university social science department.
Officially, our hospital can’t serve as a co-applicant for funding and that is frustrating. However, I am pleased to show that we are contributing to innovation. (interview with neurosurgeon at hospital #2 in 2018).
A board member at this hospital reflected on the qualities needed for research and concluded: “The neuro group has more of those intrinsic qualities than the trauma group. …I think the trauma group is actually at a crossroads and will think twice about whether they can attract capacity to develop the research side or fall back to a very basic level.”
In hospital #1, administrators rejected a proposal to collaborate with the nearest UMC submitted by medical specialists in the heart domain. Past conflicts and unsuccessful ventures still influenced the present, even though the individuals involved had already left.
A further factor was raised by a manager at hospital #1, who reflected on the importance of obtaining a professorship in the heart domain:
If we can, even on the basis of any kind of appointment, obtain a professorship from the heart centre, then yes, that helps! …I think it just helps throughout the whole operation, politically speaking, as extra confirmation, extra legitimization for that status. (interview with manager at hospital #1 in 2016).
Eventually, hospital #1 managed to find alignment with a UMC in another region during the program and a medical specialist from the hospital became a professor by special appointment.
This UMC showed the greatest determination, actually, while we could have chosen to collaborate with the nearest UMC [but we didn’t]. And there was actually also a real click between both the administrators and the specialists. (interview with manager at hospital #1 in 2018).
Additionally, the TopCare data show that, while there may be close alignment with the nearest UMC, collaboration is not limited to this and proximity can sometimes even be detrimental (for example, in some cases hospitals compete for patients). As research and care in the TopCare hospitals’ domains became more specialized, they required the specific expertise of UMCs in other regions.
One critical dependency in the collaboration between a university or UMC and a non-academic hospital is the distribution of dissertation premiums, valued at about €100,000 per successful PhD track. Currently, after completion of a dissertation, the premium goes entirely to the university or UMC, even when much of the candidate’s research and supervision takes place in a non-academic hospital [ 31 ]. This structural arrangement makes collaboration less financially valuable to non-academic hospitals. For example, the leading pulmonary physician in hospital #1 is a professor affiliated with both a UMC and a non-academic hospital, a boundary spanner who works across organizational boundaries, is successful in research, and bears responsibility for a significant proportion of the research output in the lung domain and in the collaboration with other organizations. Moreover, he does most of the PhD supervision, and his students do their work in hospital #1. Despite all this work, the dissertation premium goes to the UMC. Although efforts have been made to change this, certain institutional structures are so strongly embedded that it is difficult to open the organizational boundary.
During our research, we observed how the BoDs and administrators of the two TopCare hospitals discussed the progress of the program and worked together to learn from each other.
We can learn a lot from hospital #1 regarding the organization of our research, we think. That has been very inspiring. …On the other hand, the focus has been very centred on getting the domain and project requests funded at all. (interview with care manager at hospital #2 in 2013).
The BoDs opted for an approach aimed at building mutual trust and understanding. As a result, their alliance became more intensive during the program. By the time the program’s final report was released, both BoDs were leveraging their power to influence ZonMw’s next step: the follow-up to TopCare. They had a targeted plan for their lobbying. For example, after mutual coordination, the BoD of each hospital sent a letter to the Ministry of Health sketching their vision for the future.
In summary, for the TopCare hospitals, the boundary of power centred on finding alignment with strategic academic partners and the other BoDs and administrators in the TopCare program. Moreover, ties with strategic partners were important for extending the organization’s sphere of influence [ 33 ] in building and enhancing productive research collaborations. These hospitals recognized that they could not dismantle the existing structure of research funding, and they therefore committed themselves to trying to extend the TopCare program. Table 4 summarizes the opportunities and challenges within the three boundary concepts.
In our study, we used a mixed methods research design to explore research collaborations by focusing on the research output and impact of UMCs and non-academic hospitals in the Netherlands and by zeroing in on the boundary work of two Dutch non-academic hospitals for achieving collaboration.
Our bibliometric analysis shows that collaboration matters, especially for non-academic hospitals. Access to research grants, EU funding and international collaborations is harder for non-academic hospitals, and they need to collaborate with UMCs to generate research impact, assessed by means of MNCS impact scores. Conversely, non-academic hospitals are important for UMCs because they have a larger volume of patients. When UMCs and non-academic hospitals collaborate, their impact scores are higher. Impact scores are, moreover, higher for international collaborative publications across all types of hospital and all periods. More in-depth research is needed into why collaboration increases impact.
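The MNCS referred to above can be illustrated with a minimal sketch: each publication’s citation count is divided by the expected (world-average) citation count for comparable publications, and the scores are averaged, so that 1.0 represents the world average. The numbers below are invented for illustration; the actual CWTS computation normalizes by field, publication year and document type against a full citation database.

```python
# Simplified illustration of the mean normalized citation score (MNCS).
# A value above 1.0 indicates citation impact above the world average.

def mncs(publications):
    """publications: list of (citations, expected_citations) pairs,
    where expected_citations is the world-average citation count for
    comparable publications (same field, year, document type)."""
    ratios = [cites / expected for cites, expected in publications]
    return sum(ratios) / len(ratios)

# Hypothetical output of one hospital: three papers with their actual
# citation counts and the (invented) world-average baselines.
papers = [(12, 8.0), (3, 6.0), (20, 10.0)]
print(round(mncs(papers), 2))  # prints 1.33, i.e. above world average
```

A collaborative publication is counted for every participating organization, which is one reason impact comparisons between collaboration types use this normalized score rather than raw citation counts.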
Bibliometric analysis of the domains of the two TopCare non-academic hospitals underscores their leading role in these domains. Upon receiving TopCare funding, the hospitals had to engage in various forms of boundary work to meet ZonMw’s requirement of establishing a research collaboration with academia. They used the additional program resources to invest [ 33 ] in opening a boundary for research collaboration with academic partners.
Identity work involves creating an image of the organizational unit that legitimizes its research and care status in line with the dominant mindset of the organization. In practice, the relevant unit needs to establish a distinctive history and domain focus that aligns with the organizational strategy of the hospital, in-house expertise and patient flow. However, not all domains have been successful in creating such an identity. It proved much more difficult for the trauma domain, for example, because its research is less highly specialized and more fragmented than that of the other domains.
Competence work focuses on organizational (a well-functioning science support unit), technological (registration systems) and material (floor space or biobank) infrastructure, depending on individual requirements. Additionally, tremendous efforts go into the human dimension of infrastructure, as TopCare hospitals consider research staff and making time available for doctors to be important conditions for building structurally supportive research programs. In a previous study, we highlighted that collaboration between all non-academic hospitals within the Association of Top Clinical Teaching Hospitals (STZ) is essential for strengthening their research infrastructure [ 42 ], and can also be seen as a matter of efficiency [ 35 ]. Moreover, in each TopCare hospital, competence work served to bring domains together to facilitate shared learning. Knowledge-sharing across departments or communities is an example of opening boundaries to facilitate integration, convergence or enrichment of points of view [ 36 , 43 , 44 ].
Professors with double affiliations can act as boundary spanners. They play a significant role as experts in a domain by creating its distinctive character, and they surmount borders and break down barriers through their network relationships with other hospitals. Additionally, these persons are responsible for a significant share of the research output in their domain and conduct research with worldwide impact in collaboration with other organizations. Their boundary work must be recognized as essential because they bring usable knowledge to the table, create opportunities for improved relationships across disciplines, enhance communication between stakeholders and facilitate more productive research collaborations [cf. 45 ].
The TopCare hospitals do much less work in the power dimension because the domains in which they operate are adjacent to those of academia. Our study shows that more successful, productive research collaborations are created when the hospital’s academic partner works in a complementary but not identical field. Only in one case, the heart domain, did collaboration succeed in an identical field, but that was because the academic partner was located outside the hospital’s region and was therefore not a competitor. According to Joo et al., a potential partner’s suitability is determined not only by complementarity (their unique contribution to research collaboration in terms of expertise, skills, knowledge, contexts or resources) but also by compatibility and capacity. Partner compatibility involves alignment in vision, commitment, trust, culture, values, norms and working styles, which facilitates rapport-building and cross-institutional collaboration [ 46 ]. TopCare data indicate that research collaborations should be managed to ensure all partners can operate as equals [ 47 ]. Partner capacity refers to the ability to provide timely resources (for example, expertise, skills or knowledge) for projects, as well as leadership commitment, community engagement and institutional support for long-term, mission-driven goals, such as the joint research program in neurology and trauma at hospital #2 and a university.
These three qualitative criteria – partner compatibility, complementarity and capacity – are aspects of power dynamics that influence strategic decisions about recruiting research partners. Generally, power dynamics shape a hospital’s strategic choices regarding whether to collaborate, with whom to partner and the extent of the research collaboration [ 48 ]. Future research should examine these power dynamics in a more integrated manner to unlock the full potential of collaboration [ 46 ].
It was possible to unravel how non-academic hospitals participating in the TopCare program engaged in research collaborations with academia. As the program did not interfere with the existing care, research and financing structures within the UMCs, it allowed TopCare non-academic hospitals to also combine top clinical care and research. The boundary concepts allow us to observe a dual dynamic in the collaboration: the opening of boundaries while simultaneously maintaining certain limits. Opening boundaries refers to facilitating collaboration through activities related to identity and competence, while maintaining them involves the power balance. The temporary program did not disrupt the existing power balance associated with the budgetary “academic component” and the dissertation premiums that accrue to academia. Overall, then, the power dimension may well be the primary factor that made it impossible for the TopCare non-academic hospitals to attain their ultimate goal: securing a consistent form of funding for their research and top clinical care. Instead, the national authorities introduced a new, temporary funding program for non-academic hospitals, and preserved the status quo favouring academia.
A key finding is that, if a hospital is successful in establishing coherence between the different forms of boundary work, it can create productive research collaborations and generate research impact. The TopCare hospitals performed boundary work to strengthen their research infrastructure (competence) and their research status (identity) and create a favourable negotiating position opposite academia (power). For example, choosing the lung domain as the hospital’s strategic focus (identity) and establishing a database as a fundamental source of information for research by a boundary spanner (competence) generated sufficient power to make the hospital a key player in this field and a much-respected collaboration partner, nationally and internationally. However, some restrictions remained in place, such as the national lung research network consisting only of non-academic hospitals, with UMCs participating only indirectly.
Another key finding is that possessing a substantial budget is not in itself enough to ensure successful research collaboration. It is clear from this study that extensive boundary work is also needed to facilitate research collaboration. Given the absence of structural funding, the TopCare non-academic hospitals were under pressure to deliver results during the program, making research collaboration even more crucial for them than for the UMCs in this context. Additionally, because highly specialized care and research at the TopCare non-academic hospitals required unique expertise, they had a growing need for collaboration at the national level. Contrary to assumptions and the findings of our analysis of UMCs and non-academic hospitals overall, their collaborative partners were not predominantly located at the nearest UMC.
Does our study align with the literature and support the results of similar initiatives, such as the establishment of Collaborations for Leadership in Applied Health Research and Care (CLAHRC), a regional multi-agency research network of universities and local National Health Service (NHS) organizations focused on improving patient outcomes in England by conducting and utilizing applied health research [ 49 ]? And what does it contribute to previous research?
While differences exist between the NHS and the healthcare system in the Netherlands, there are also noteworthy parallels that render a comparison possible: encouraging networks to boost research productivity, fostering collaboration within a competitive system, and funding research that is relevant to patient needs and public health priorities. Building on the CLAHRC findings, there are further parallels, such as creating strong local research infrastructures and local networks [ 49 ] and using influential and skilled boundary spanners [ 49 , 50 ]. In addition, we found that research history, strategic domain focus, in-house expertise, patient flows and network relationships pre-conditioned the TopCare hospitals’ collaboration with academia. Our results further show that, for non-academic hospitals seeking to create productive research collaborations, it is essential to work in complementary fields and to establish coherence between identity, competence and power.
Our findings indicate that, after opening a boundary with academia, the focus of the TopCare hospitals was on searching for mutual engagement. These hospitals tried to clarify their added value by creating boundaries to distinguish themselves from UMCs, and attempted to extend the TopCare program without it overlapping with the budgetary “academic component”, so that it posed no threat to the UMCs. Boundary-crossing involves a two-way interaction of mutual engagement and commitment to change in practices [ 51 ]. It is likely that the program did not last long enough to instigate changes in practices, as it can take time to develop mutual understanding and foster trusting relationships [ 52 ].
Based on the CLAHRC results and our research findings, the trend towards regionalization in the Netherlands [ 53 ] and a new leading and coordinating role for UMCs in this research landscape [ 52 , 54 ] can only be successful if boundary work is conducted, allowing research-minded non-academic hospitals to:
Build a “collaborative identity” [ 50 , 55 , 56 ] over time with their academic partners (identity);
Establish added value in their research infrastructures compared with that of their academic partners (competence);
Create solid networks for learning and sharing knowledge [ 55 , 57 ] with their academic partners (competence);
Mobilize boundary spanners to bridge disciplinary and professional boundaries in research, teaching and practice [ 49 , 50 , 55 , 58 ] and publish articles in collaboration with academic partners with high research impact (competence);
Find the inspiration and confidence to increase their co-dependence and thereby, for example, gain benefits from interacting with different partners in the field [ 35 ] (power); and
Create long-term collaborations with academia across sectors over time, as well as within sectors; this requires iterative and continual engagement between clinicians, academics, managers, practitioners and patients (power) [ 49 , 52 ].
It is conceivable that the evaluation of the follow-up study to the TopCare program, which will extend to 2025, could unravel these next steps.
Our results demonstrate that collaboration in research is important and should be encouraged. However, the current methods used to assess researchers underestimate this importance. Reward systems and metrics focus on the performance of individual researchers and may even discourage the development of medical research networks and collaboration [ 52 , 59 ]. There is ongoing debate about and rising criticism of the dominance of scientific impact scores as a measure of the performance of health researchers and research organizations [ 60 ]. Other forms of impact, such as the societal impact of medical research, are becoming more important, and different metrics are being developed. Research collaboration among individuals and organizations should be incentivized and rewarded, and should also be embedded in performance assessment and the core competences of all actors involved [ 61 ]. New ways of rewarding research collaboration within organizations should therefore be explored.
This study is limited, both geographically and institutionally, to the Netherlands, and factors other than national and international research collaborations may explain the increase in research output and impact. For example, the research articles in our sample have not been analysed for substantive aspects such as methodology and funding. A bias may therefore have been introduced. Furthermore, the research output and impact of the TopCare non-academic hospitals that we measured were limited to the 4-year program period. A further limitation was the use of these hospitals’ research output as a measure of the influence of the TopCare program, as we were interested not only in the short-term effects (publications) but also in the long-term ones (on the work conducted to build research infrastructures). Moreover, the focus in the qualitative material concerning the TopCare program was on the two TopCare non-academic hospitals and, more specifically, on their national rather than their international collaborations.
Research collaboration between non-academic hospitals and academia in the Netherlands pays off in terms of publications and impact. For the publication of scientific articles, collaboration between UMCs and non-academic hospitals appears to be more prevalent and impactful for non-academic hospitals than for UMCs. When UMCs and non-academic hospitals collaborate, their impact scores tend to be higher. More research is needed into why collaboration leads to more impact.
Non-academic hospitals showed a higher rate of collaboration with the nearest UMC, whereas collaborative partners of TopCare hospitals were not predominantly located at the nearest UMC. TopCare hospitals prioritized expertise over geographical proximity as a predictor of their collaborative efforts, particularly as research and care in their domains became more specialized.
Drawing on the additional resources of the TopCare program, participating non-academic hospitals invested significantly in boundary work to open boundaries for research collaboration with academic partners and, simultaneously, to create boundaries that distinguished them from UMCs. Identity work was performed to ensure that their history and domain focuses were coherent with the dominant mindset of their organization, while competence work was done to enhance their research infrastructure. The human dimension of the infrastructure received considerable attention: more research staff, time made available for doctors and recognition that boundary spanners facilitate research collaborations.
Power work to find and mobilize strategic academic partners was mostly focused on complementary fields, as non-academic hospitals work in domains adjacent to those of academia. The TopCare hospitals tended to avoid power conflicts, resulting in a preservation of the status quo favouring academia.
The local research history, strategic domain focus, in-house expertise, patient flows, infrastructure and network relationships of each TopCare hospital influenced collaboration with academia [cf. 37 , 58 ]. Increased coherence between the different forms of boundary work led to productive research collaborations and generated research impact. To meet future requirements, such as regionalization, further boundary work is needed to create long-term collaborations and new ways of rewarding research collaboration within organizations.
The datasets used and/or analysed during the study are available from the corresponding author upon reasonable request.
The names of the UMCs and non-academic hospitals and their numbers are not up to date due to mergers in the intervening period. The database contains data on eight UMCs; today there are seven, as two UMCs in Amsterdam merged in 2018. There are 28 non-academic hospitals in the database, whereas today 27 such hospitals are members of the Association of Top Clinical Teaching Hospitals ( https://www.stz.nl ). To ensure data consistency, the database remains unchanged.
BoD: Board of directors
CWTS: Centre for Science and Technology Studies
IC: International collaboration
MNCS: Mean normalized citation score
MNJS: Mean normalized journal score
NC: National collaboration
NFU: Netherlands Federation of University Medical Centers
SI: Single institution
STZ: Association of Top Clinical Teaching Hospitals
UMC: University medical centre
Abramo G, D’Angelo CA, Di Costa F. Research collaboration and productivity: is there correlation? High Educ. 2009. https://doi.org/10.1007/s10734-008-9139-z .
De Solla Price DJ. Little science, big science. New York: Columbia University Press; 1963.
Narin F, Carpenter MP. National publication and citation comparisons. JASIS&T. 1975. https://doi.org/10.1002/asi.4630260203 .
Beaver D, Rosen R. Studies in scientific collaboration: part III – professionalization and the natural history of modern scientific co-authorship. Scientometrics. 1979. https://doi.org/10.1007/BF02016308 .
Katz JS, Martin BR. What is research collaboration? Res Policy. 1997. https://doi.org/10.1016/S0048-7333(96)00917-1 .
Clark BY, Llorens JJ. Investments in scientific research: examining the funding threshold effects on scientific collaboration and variation by academic discipline. PSJ. 2012. https://doi.org/10.1111/j.1541-0072.2012.00470.x .
Bozeman B, Fay D, Slade CP. Research collaboration in universities and academic entrepreneurship: the-state-of-the-art. J Technol Transf. 2013. https://doi.org/10.1007/s10961-012-9281-8 .
Van Raan AF. Measuring science. In: Moed HF, Glänzel W, Schmoch U, editors. Handbook of quantitative science and technology research: the use of patent and publication statistics in studies of S&T systems. Dordrecht: Springer; 2004. p. 19–50. https://doi.org/10.1007/1-4020-2755-9_2 .
Lotka AJ. The frequency distribution of scientific productivity. J Wash Acad Sci. 1926;16:317–23.
De Solla Price DJ, Beaver D. Collaboration in an invisible college. Am Psychol. 1966. https://doi.org/10.1037/h0024051 .
Zuckerman H. Nobel laureates in science: patterns of productivity, collaboration, and authorship. Am Sociol Rev. 1967;32:391–403.
Morrison PS, Dobbie G, McDonald FJ. Research collaboration among university scientists. High Educ Res Dev. 2003. https://doi.org/10.1080/0729436032000145149 .
Lee S, Bozeman B. The impact of research collaboration on scientific productivity. Soc Stud Sci. 2005. https://doi.org/10.1177/0306312705052359 .
Beaver DB. Collaboration and teamwork in physics. Czechoslov J Phys B. 1986. https://doi.org/10.1007/BF01599717 .
Acedo FJ, Barroso C, Casanueva C, Galán JL. Co-authorship in management and organizational studies: an empirical and network analysis. J Manag Stud. 2006. https://doi.org/10.1111/j.1467-6486.2006.00625.x .
Wuchty S, Jones BF, Uzzi B. The increasing dominance of teams in production of knowledge. Science. 2007. https://doi.org/10.1126/science.1136099 .
Sooryamoorthy R. Do types of collaboration change citation? Collaboration and citation patterns of South African science publications. Scientometrics. 2009. https://doi.org/10.1007/s11192-009-2126-z .
Gazni A, Didegah F. Investigating different types of research collaboration and citation impact: a case study of Harvard University’s publications. Scientometrics. 2011. https://doi.org/10.1007/s11192-011-0343-8 .
Landry R, Traore N, Godin B. An econometric analysis of the effect of collaboration on academic research productivity. High Educ. 1996. https://doi.org/10.1007/BF00138868 .
Laband DN, Tollison RD. Intellectual collaboration. J Political Econ. 2000. https://doi.org/10.1086/262132 .
Van Raan A. The influence of international collaboration on the impact of research results: some simple mathematical considerations concerning the role of self-citations. Scientometrics. 1998;42(3):423–8.
Glänzel W. National characteristics in international scientific co-authorship relations. Scientometrics. 2001. https://doi.org/10.1023/a:1010512628145 .
Glänzel W, Schubert A. Analysing scientific networks through co-authorship. In: Moed HF, Glänzel W, Schmoch U, editors. Handbook of quantitative science and technology research: the use of patent and publication statistics in studies of S&T systems. Dordrecht: Kluwer; 2004. p. 257–76. https://doi.org/10.1007/1-4020-2755-9_20 .
Didegah F, Thelwall M. Which factors help authors produce the highest impact research? Collaboration, journal and document properties. J Informetr. 2013. https://doi.org/10.1016/J.JOI.2013.08.006 .
Thelwall M, Maflahi N. Academic collaboration rates and citation associations vary substantially between countries and fields. J Assoc Inf Sci Technol. 2020. https://doi.org/10.1002/asi.24315 .
Archibugi D, Coco A. International partnerships for knowledge in business and academia: a comparison between Europe and the USA. Technovation. 2004. https://doi.org/10.1016/S0166-4972(03)00141-X .
Levi M, Sluiter HE, Van Leeuwen T, Rook M, Peeters G. Medisch-wetenschappelijk onderzoek in Nederland: Hoge kwaliteit door samenwerking UMC’s en opleidingsziekenhuizen. NTvG. 2013;157:A6081.
Abramo G, D’Angelo CA, Di Costa F. University-industry research collaboration: a model to assess university capability. High Educ. 2011. https://doi.org/10.48550/arXiv.1811.01763 .
Centraal Bureau voor de Statistiek. Health care institutions; key figures, finance and personnel. https://www.cbs.nl/nl-nl/cijfers/detail/83652ENG . Accessed 6 Mar 2024.
Postma J, Zuiderent-Jerak T. Beyond volume indicators and centralization: toward a broad perspective on policy for improving quality of emergency care. Ann Emerg Med. 2017. https://doi.org/10.1016/j.annemergmed.2017.02.020 .
Postma JP, Van Dongen-Leunis A, Van Hakkaart-van Roijen L, Bal RA. Evaluatie Topzorg. Een evaluatie van 4 jaar specialistische zorg en wetenschappelijk onderzoek in het St. Antonius Ziekenhuis, het Oogziekenhuis en het ETZ. Rotterdam: Erasmus School of Health Policy & Management; 2018.
Gieryn TF. Boundary-work and the demarcation of science from non-science: strains and interests in professional ideologies of scientists. Am Sociol Rev. 1983. https://doi.org/10.2307/2095325 .
Gieryn TF. Cultural boundaries of science: credibility on the line. Chicago: University of Chicago Press; 1999.
Abbott A. The system of professions. Chicago: University of Chicago Press; 1988.
Santos FM, Eisenhardt KM. Organizational boundaries and theories of organization. Organ Sci. 2005. https://doi.org/10.1287/orsc.1050.0152 .
Chreim S, Langley A, Comeau-Vallée M, Huq JL, Reay T. Leadership as boundary work in healthcare teams. Leadership. 2013. https://doi.org/10.1177/174271501246 .
Waring J, Crompton A, Overton C, Roe B. Decentering health research networks: framing collaboration in the context of narrative incompatibility and regional geo-politics. Public Policy Adm. 2022. https://doi.org/10.1177/0952076720911686 .
Siaw CA, Sarpong D. Dynamic exchange capabilities for value co-creation in ecosystems. J Bus Res. 2021. https://doi.org/10.1016/j.jbusres.2021.05.060 .
Velter M, Bitzer V, Bocken N, Kemp R. Boundary work for collaborative sustainable business model innovation: the journey of a Dutch SME. J Bus Models. 2021. https://doi.org/10.5278/jbm.v9i4.6267 .
Bednarek AT, Wyborn C, Cvitanovic C, Meyer R, Colvin RM, Addison PF, et al. Boundary spanning at the science–policy interface: the practitioners’ perspectives. Sustain Sci. 2018. https://doi.org/10.1007/s11625-018-0550-9 .
Neal JW, Neal ZP, Brutzman B. Defining brokers, intermediaries, and boundary spanners: a systematic review. Evid Policy. 2022. https://doi.org/10.1332/174426420X16083745764324 .
Van Oijen JCF, Wallenburg I, Bal R, Grit KJ. Institutional work to maintain, repair, and improve the regulatory regime: how actors respond to external challenges in the public supervision of ongoing clinical trials in the Netherlands. PLoS ONE. 2020. https://doi.org/10.1371/journal.pone.0236545 .
Carlile PR. Transferring, translating, and transforming: an integrative framework for managing knowledge across boundaries. Organ Sci. 2004. https://doi.org/10.1287/ORSC.1040.0094 .
Orlikowski WJ. Knowing in practice: enacting a collective capability in distributed organizing. Organ Sci. 2002. https://doi.org/10.1287/orsc.13.3.249.2776 .
Goodrich KA, Sjostrom KD, Vaughan C, Nichols L, Bednarek A, Lemos MC. Who are boundary spanners and how can we support them in making knowledge more actionable in sustainability fields? Curr Opin Environ Sustain. 2020. https://doi.org/10.1016/j.cosust.2020.01.001 .
Joo J, Selingo J, Alamuddin R. Unlocking the power of collaboration. How to develop a successful collaborative network in and around higher education. Ithaka S+R; 2019.
McDonald J, Jayasuriya R, Harris MF. The influence of power dynamics and trust on multidisciplinary collaboration: a qualitative case study of type 2 diabetes mellitus. BMC Health Serv Res. 2012. https://doi.org/10.1186/1472-6963-12-63 .
Harrington S, Fox S, Molinder HT. Power, partnership, and negotiations: the limits of collaboration. WPA-LOGAN. 1998;21:52–64.
Soper B, Hinrichs S, Drabble S, Yaqub O, Marjanovic S, Hanney S, et al. Delivering the aims of the Collaborations for Leadership in Applied Health Research and Care: understanding their strategies and contributions. Health Serv Deliv Res. 2015. https://doi.org/10.3310/hsdr03250 .
Lockett A, El Enany N, Currie G, Oborn E, Barrett M, Racko G, Bishop S, Waring J. A formative evaluation of Collaboration for Leadership in Applied Health Research and Care (CLAHRC): institutional entrepreneurship for service innovation. Health Serv Deliv Res. 2014. https://doi.org/10.3310/hsdr02310 .
Engeström Y. The horizontal dimension of expansive learning: weaving a texture of cognitive trails in the terrain of health care in Helsinki. In: Achtenhagen F, John EG, editors. Milestones of vocational and occupational education and training. Bielefelt: W Bertelmanns Verlag; 2003. p. 152–79.
Gezondheidsraad. Onderzoek waarvan je beter wordt: Een heroriëntatie op umc-onderzoek. Den Haag: Gezondheidsraad; 2016.
van der Woerd O, Schuurmans J, Wallenburg I, van der Scheer W, Bal R. Heading for health policy reform: transforming regions of care from geographical place into governance object. Policy Politics. 2024. https://doi.org/10.1332/03055736Y2024D000000030 .
Iping R, Kroon M, Steegers C, van Leeuwen T. A research intelligence approach to assess the research impact of the Dutch university medical centers. Health Res Policy Syst. 2022. https://doi.org/10.1186/s12961-022-00926-y .
Rycroft-Malone J, Burton C, Wilkinson JE, Harvey G, McCormack B, Baker R, et al. Collective action for knowledge moblisation: a realist evaluation of the collaborations for leadership in applied Health Research and care. Health Serv Deliv Res. 2015. https://doi.org/10.3310/hsdr03440 .
Kislov R, Harvey G, Walshe K. Collaborations for leadership in applied health research and care: lessons from the theory of communities of practice. Implement Sci. 2011. https://doi.org/10.1186/1748-5908-6-64 .
Harvey G, Fitzgerald L, Fielden S, McBride A, Waterman H, Bamford D, et al. The NIHR collaboration for leadership in applied health research and care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy. Implement Sci. 2011. https://doi.org/10.1186/1748-5908-6-96 .
Currie G, Lockett A, Enany NE. From what we know to what we do: lessons learned from the translational CLAHRC initiative in England. J Health Serv Res Policy. 2013. https://doi.org/10.1177/1355819613500484 .
Hurley TJ. Collaborative leadership: engaging collective intelligence to achieve results across organisational boundaries. White Paper. Oxford Leadership. 2011.
DORA. The declaration. https://sfdora.org/read . Accessed 6 Mar 2024.
O’Leary R, Gerard C. Collaboration across boundaries: insights and tips from federal senior executives. Washington: IBM Center for The Business of Government; 2012.
Traag VA, Waltman L, Van Eck NJ. From Louvain to Leiden: guaranteeing well-connected communities. Sci Rep. 2019;9(1):5223. https://doi.org/10.1038/s41598-019-41695-z .
Article CAS Google Scholar
Download references
The authors thank the two reviewers and the members of the Health Care Governance department of Erasmus School of Health Policy & Management, Erasmus University Rotterdam for their helpful comments on earlier drafts. We are particularly indebted to Kor Grit for his helpful comments and critical appraisal of this paper.
The TopCare program was funded by the Netherlands Organization for Health Research and Development (ZonMw) ( www.zonmw.nl/en ) under Grant [Number 80-84200-98-14001]. ZonMw had no role in the design or conduct of the study; the collection, management, analysis and interpretation of the data; or the preparation, review and approval of the manuscript.
Authors and affiliations.
Erasmus School of Health Policy & Management, Erasmus University Rotterdam, P.O. Box 1738, 3000 DR, Rotterdam, The Netherlands
Jacqueline C. F. van Oijen, Annemieke van Dongen-Leunis, Jeroen Postma & Roland Bal
Centre for Science and Technology Studies, Leiden University, Leiden, The Netherlands
Thed van Leeuwen
Conceptualization: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Formal analysis: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Funding acquisition: R.B. (TopCare program). Investigation: A.v.D.L. and T.v.L. (database analysis of UMCs and non-academic hospitals and TopCare domains) and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Methodology: J.v.O., A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals); A.v.D.L. and T.v.L. (bibliometric analysis of TopCare domains); and J.v.O., J.P. and R.B. (ethnographic interviews in the TopCare program). Project administration: T.v.L. and A.v.D.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains) and J.P. (TopCare program). Supervision: T.v.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains) and R.B. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains, and ethnographic interviews in the TopCare program). Visualization: A.v.D.L. and T.v.L. (bibliometric analysis of UMCs and non-academic hospitals and TopCare domains). Original draft: J.v.O., A.v.D.L. and R.B. Draft and revision: J.v.O., A.v.D.L., J.P., T.v.L. and R.B. All authors read and approved the final manuscript (and agreed both to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, would be appropriately investigated and resolved and that the resolution would be documented in the literature).
Correspondence to Jacqueline C. F. van Oijen.
Ethics approval and consent to participate.
Not applicable; at the time we were conducting the research, ethical approval was not required. Nowadays our facility has an Ethics Committee that assesses research proposals involving human subjects (including interview studies), but this was not the case then. This study is not subject to the Dutch Medical Research Involving Human Subjects Act (WMO); it concerns collaboration on medical research in TopCare non-academic hospitals. For research not subject to the WMO, local policy and applicable procedures apply; as the TopCare program began in 2014, there were, as yet, no institutional rules in this area.
Member check is part of our policy of informed consent of respondents and consent for publication. Specifically, we gave respondents the opportunity to peruse and add to quotes from their semi-structured interviews and to confirm our interpretation. The focus was on confirming and amending the quote and verifying the interpretation. The research team discussed the feedback received from the respondents and weighed it against the context of data analysis. Any disagreement on a respondent’s feedback was discussed directly with the respondent until consensus was reached. The STZ and NFU have given permission to use the data collected by CWTS on behalf of the NFU and STZ for the bibliometric analysis of this study. They have taken note of the results of this study and agreed to its publication.
The authors declare that they have no competing interests.
Publisher’s note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
See Fig. 4 and Tables 5 , 6 , 7 , 8 , 9 , 10 and 11 .
UMCs produce 18 times (= 27,592/1,503) more SI, four times (= 42,557/10,880) more NC and 14 times (= 82,540/5,896) more IC publications than non-academic hospitals.
Of all publications, 89% (= 152,688/170,967) are attributed to UMCs and 11% (= 18,279/170,967) to non-academic hospitals.
Joint publications in national collaboration: 82% (= 8,943/10,880) for non-academic hospitals and 21% (= 8,943/42,557) for UMCs.
Joint international publications: 66% (= 3,874/5,896) for non-academic hospitals and 5% (= 3,874/82,540) for UMCs.
Joint publications overall: 70% (= 12,816/18,279) for non-academic hospitals and 8% (= 12,816/152,688) for UMCs.
Share of joint publications in total publications for each type of collaboration: 17% (= 8,943/53,436) for national collaboration and 4% (= 3,874/88,435) for international collaboration.
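The percentages above follow directly from the reported publication counts. As a quick sanity check, a minimal Python sketch reproduces the joint-publication shares (counts taken from the figures above; the reading of SI, NC and IC as single-institution, national-collaboration and international-collaboration publications is our assumption based on the surrounding text):

```python
# Publication counts as reported above.
# Assumption: SI = single-institution, NC = national collaboration,
# IC = international collaboration.
umc = {"SI": 27_592, "NC": 42_557, "IC": 82_540}
non_academic = {"SI": 1_503, "NC": 10_880, "IC": 5_896}
joint_nc = 8_943  # joint national publications
joint_ic = 3_874  # joint international publications

def share(part, whole):
    """Percentage of `part` in `whole`, rounded to a whole percent."""
    return round(100 * part / whole)

print(share(joint_nc, non_academic["NC"]))  # 82: share of non-academic NC output that is joint
print(share(joint_nc, umc["NC"]))           # 21: share of UMC NC output that is joint
print(share(joint_ic, non_academic["IC"]))  # 66: share of non-academic IC output that is joint
print(share(joint_ic, umc["IC"]))           # 5: share of UMC IC output that is joint
```

The asymmetry is visible at a glance: the same pool of joint publications makes up most of the non-academic hospitals' collaborative output but only a small fraction of the UMCs'.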
Types of collaboration involving TopCare hospitals #1 and #2 between 2010 and 2016. #, total number of publications
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article.
van Oijen, J.C.F., van Dongen-Leunis, A., Postma, J. et al. Achieving research impact in medical research through collaboration across organizational boundaries: Insights from a mixed methods study in the Netherlands. Health Res Policy Sys 22, 72 (2024). https://doi.org/10.1186/s12961-024-01157-z
Received : 31 December 2022
Accepted : 31 May 2024
Published : 25 June 2024
ISSN: 1478-4505
Conducting interviews involves a well-planned and deliberate process to collect accurate and valid data. Here's a step-by-step guide on how to conduct interviews in qualitative research, broken down into three stages: 1. Before the interview. The first step in conducting a qualitative interview is determining your research question.
Summary. The qualitative research interview is a powerful data-collection tool which affords researchers in medical education opportunities to explore unknown areas of education and practice within medicine. This paper articulates 12 tips for consideration when conducting qualitative research interviews, and outlines the qualitative research ...
A qualitative research interview is a one-to-one data collection session between a researcher and a participant. Interviews may be carried out face-to-face, over the phone or via video call using a service like Skype or Zoom. There are three main types of qualitative research interview - structured, unstructured or semi-structured.
Qualitative Inquiry 9(3):335-354. Weighs the potential benefits and harms of conducting interviews on topics that may cause emotional distress. Argues that the researcher's skills and code of ethics should ensure that the interviewing process provides more of a benefit to both participant and researcher than a harm to the former.
5. Not keeping your golden thread front of mind. We touched on this a little earlier, but it is a key point that should be central to your entire research process. You don't want to end up with pages and pages of data after conducting your interviews and realize that it is not useful to your research aims.
Qualitative research interviews are depth interviews. They elicit detailed feedback from your leads and customers. Unstructured interviews reveal why people react in a certain way or make certain decisions. According to The Hartford, qualitative research provides an anecdotal look into your business. That provides an important form of data.
An interview is a qualitative research method that relies on asking questions in order to collect data. Interviews involve two or more people, one of whom is the interviewer asking the questions. ... Depending on the type of interview you are conducting, your questions will differ in style, phrasing, and intention. Structured interview ...
Vancouver, Canada. Abstract. Interviews are one of the most promising ways of collecting qualitative data through establishment of a communication between researcher and the interviewee. Re ...
Gentle: lets people finish; gives them time to think; tolerates pauses. 5. Sensitive: listens attentively to what is said and how it is said; is empathetic in dealing with the interviewee. 6. Open: responds to what is important to interviewee and is flexible. 7. Steering: knows what he/she wants to find out. 8.
Abstract. The qualitative research interview is an important data collection tool for a variety of methods used within the broad spectrum of medical education research. However, many medical teachers and life science researchers undergo a steep learning curve when they first encounter qualitative interviews, both in terms of new theory but also ...
InterViews by Steinar Kvale Interviewing is an essential tool in qualitative research and this introduction to interviewing outlines both the theoretical underpinnings and the practical aspects of the process. After examining the role of the interview in the research process, Steinar Kvale considers some of the key philosophical issues relating ...
TIPSHEET - QUALITATIVE INTERVIEWING. Qualitative interviewing provides a method for collecting rich and detailed information about how individuals experience, understand and explain events in their lives. This tipsheet offers an introduction to the topic and some advice on carrying out eff...
In-depth interviewing is a qualitative research technique that involves conducting intensive individual interviews with a small number of respondents to explore their perspectives on a particular idea, program, or situation. For example, we might ask participants, staff, and others associated with a program about their experiences and ...
In this article, she shares five interviewing tips that have served her well. 1. Convey Intent. Proeschold-Bell says it's important for the interviewer to know the intent behind each question so that it can be clearly conveyed to the interviewee. Understanding the intent of a question, she's found, helps interviewers decide whether or not ...
Qualitative interviews usually involve follow-up questions and are conducted in a conversation or discussion format. A qualitative interview is a more personal form of research agenda compared to general questionnaires or focused group studies. Such formats often include open-ended and follow-up questions. LEARN ABOUT: Behavioral Research.
Qualitative interview is a broad term uniting semi-structured and unstructured interviews. Qualitative interviewing is less structured and more likely to evolve as a natural conversation; it is often conducted in the form of respondents narrating their personal experiences or life histories. Qualitative interviews can be part of ethnography ...
Introduce yourself and explain the aim of the interview. Devise your questions so interviewees can help answer your research question. Have a sequence to your questions / topics by grouping them in themes. Make sure you can easily move back and forth between questions / topics. Make sure your questions are clear and easy to understand.
What are interviews? An interviewing method is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one with the researcher and the participant. Interviews are most appropriate when seeking to understand a ...
10. Be willing to make "on the spot" revisions to your interview protocol. Many times when you are conducting interviews a follow up question may pop into your mind. If a question occurs to you in the interview ask it. Sometimes the "ah-ha" question that makes a great project comes to you in the moment.
As no research interview entirely lacks structure, most qualitative research interviews are either semi-structured, lightly structured or in-depth. Unstructured interviews are generally suggested for long-term fieldwork; they allow respondents to express themselves in their own way and at their own pace, with minimal constraint on their responses.
10.3 Conducting Qualitative Interviews. Qualitative interviews might feel more like a conversation than an interview to respondents, however the researcher is usually guiding the conversation with the goal of gathering information from a respondent. A key difference between qualitative and quantitative interviewing is that qualitative ...
Conducting interviews in qualitative research has traditionally been a common data collection method. With more accessible technologies and the onset of the COVID-19 pandemic, researchers facilitating interviews have increasingly become interested in using online methods. This guide provides a summary of how to facilitate an online qualitative ...
Best practice for interviews. At the root of interviewing is an interest in understanding the lived experiences of other people (Seidman, 2006). Interviews invite the participant to make sense of their own experiences and to share these experiences with the researcher. Interviews are therefore an appropriate method when researchers want to ...
How to conduct a successful in-depth interview. Editor's note: Lyndsay Sund is the senior project manager at Syncscript. This is an edited version of an article that originally appeared under the title "Mastering the Art of In-depth Interviews: Effective Techniques for Uncovering Insights." In-depth interviews are the cornerstone of qualitative research.
A practical guide for conducting qualitative research in medical education: Part 1 - How to interview. AEM Educ Train. 2021 Jul 1;5(3):e10646. doi: 10.1002/aet2.10646.
Qualitative methods are a critical tool for enhancing implementation planning and tailoring, yet rapid turn-around of qualitative insights can be challenging in large implementation trials. The Department of Veterans Affairs-funded EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI) is conducting a hybrid type 3 effectiveness-implementation trial comparing the impact of Replicating ...
In the Netherlands, university medical centres (UMCs) bear primary responsibility for conducting medical research and delivering highly specialized care. The TopCare program was a policy experiment lasting 4 years in which three non-academic hospitals received funding from the Dutch Ministry of Health to also conduct medical research and deliver highly specialized care in specific domains.
Purpose: The purpose of this study was twofold: 1) to describe the aspects of technology that were involved in the process of integrating writing in the teaching of a discipline; and 2) to identify some students' skills/competences that began to improve thanks to the integration of writing in the teaching of a content course. Methodology: Qualitative practices were used in data collection processes ...