7 CFR § 3406.20 - Evaluation criteria for research proposals.

The maximum score a research proposal can receive is 150 points. Unless otherwise stated in the annual solicitation published in the Federal Register, the peer review panel will consider the following criteria and weights to evaluate proposals submitted:

(a) Significance of the problem:
This criterion is used to assess the likelihood that the project will advance or have a substantial impact upon the body of knowledge constituting the natural and social sciences undergirding the agricultural, natural resources, and food systems.
(1) Impact—Is the problem or opportunity to be addressed by the proposed project clearly identified, outlined, and delineated? Are research questions or hypotheses precisely stated? Is the project likely to further advance food and agricultural research and knowledge? Does the project have potential for augmenting the food and agricultural scientific knowledge base? Does the project address a State, regional, national, or international problem(s)? Will the benefits to be derived from the project transcend the applicant institution or the grant period? 15 points.
(2) Continuation plans—Are there plans for continuation or expansion of the project beyond USDA support? Are there plans for continuing this line of research or research support activity with the use of institutional funds after the end of the grant? Are there indications of external, non-Federal support? Are there realistic plans for making the project self-supporting? What is the potential for royalty or patent income, technology transfer or university-business enterprises? What are the probabilities of the proposed activity or line of inquiry being pursued by researchers at other institutions? 10 points.
(3) Innovation—Are significant aspects of the project based on an innovative or a non-traditional approach? Does the project reflect creative thinking? To what degree does the venture reflect a unique approach that is new to the applicant institution or new to the entire field of study? 10 points.
(4) Products and results—Are the expected products and results of the project clearly outlined and likely to be of high quality? Will project results be of an unusual or unique nature? Will the project contribute to a better understanding of or an improvement in the quality, distribution, or effectiveness of the Nation's food and agricultural scientific and professional expertise base, such as increasing the participation of women and minorities? 15 points.
(b) Overall approach and cooperative linkages:
This criterion relates to the soundness of the proposed approach and the quality of the partnerships likely to evolve as a result of the project.
(1) Proposed approach—Do the objectives and plan of operation appear to be sound and appropriate relative to the proposed initiative(s) and the impact anticipated? Is the proposed sequence of work appropriate? Does the proposed approach reflect sound knowledge of current theory and practice and awareness of previous or ongoing related research? If the proposed project is a continuation of a current line of study or currently funded project, does the proposal include sufficient preliminary data from the previous research or research support activity? Does the proposed project flow logically from the findings of the previous stage of study? Are the procedures scientifically and managerially sound? Are potential pitfalls and limitations clearly identified? Are contingency plans delineated? Does the timetable appear to be readily achievable? 5 points.
(2) Evaluation—Are the evaluation plans adequate and reasonable? Do they allow for continuous or frequent feedback during the life of the project? Are the individuals involved in project evaluation skilled in evaluation strategies and procedures? Can they provide an objective evaluation? Do evaluation plans facilitate the measurement of project progress and outcomes? 5 points.
(3) Dissemination—Does the proposed project include clearly outlined and realistic mechanisms that will lead to widespread dissemination of project results, including national electronic communication systems, publications and presentations at professional society meetings? 5 points.
(4) Partnerships and collaborative efforts—Does the project have significant potential for advancing cooperative ventures between the applicant institution and a USDA agency? Does the project workplan include an effective role for the cooperating USDA agency(s)? Will the project encourage and facilitate better working relationships in the university science community, as well as between universities and the public or private sector? Does the project encourage appropriate multi-disciplinary collaboration? Will the project lead to long-term relationships or cooperative partnerships that are likely to enhance research quality or supplement available resources? 15 points.
(c) Institutional capacity building:
This criterion relates to the degree to which the project will strengthen the research capacity of the applicant institution. In the case of a joint project proposal, it relates to the degree to which the project will strengthen the research capacity of the applicant institution and that of any other institution assuming a major role in the conduct of the project.
(1) Institutional enhancement—Will the project help the institution to advance the expertise of current faculty in the natural or social sciences; provide a better research environment, state-of-the-art equipment, or supplies; enhance library collections related to the area of research; or enable the institution to provide efficacious organizational structures and reward systems to attract, hire and retain first-rate research faculty and students—particularly those from underrepresented groups? 15 points.
(2) Institutional commitment—Is there evidence to substantiate that the institution attributes a high-priority to the project, that the project is linked to the achievement of the institution's long-term goals, that it will help satisfy the institution's high-priority objectives, or that the project is supported by the institution's strategic plans? Will the project have reasonable access to needed resources such as scientific instrumentation, facilities, computer services, library and other research support resources? 15 points.
(d) Personnel resources 10 points
This criterion relates to the number and qualifications of the key persons who will carry out the project. Are designated project personnel qualified to carry out a successful project? Are there sufficient numbers of personnel associated with the project to achieve the stated objectives and the anticipated outcomes? Will the project help develop the expertise of young scientists at the doctoral or post-doctorate level?
(e) Budget and cost-effectiveness:
This criterion relates to the extent to which the total budget adequately supports the project and is cost-effective.
(1) Budget—Is the budget request justifiable? Are costs reasonable and necessary? Will the total budget be adequate to carry out project activities? Are the source(s) and amount(s) of non-Federal matching support clearly identified and appropriately documented? For a joint project proposal, is the shared budget explained clearly and in sufficient detail? 10 points.
(2) Cost-effectiveness—Is the proposed project cost-effective? Does it demonstrate a creative use of limited resources, maximize research value per dollar of USDA support, achieve economies of scale, leverage additional funds or have the potential to do so, focus expertise and activity on a high-priority research initiative(s), or promote coalition building for current or future ventures? 5 points.
(f) Overall quality of proposal 5 points
This criterion relates to the degree to which the proposal complies with the application guidelines and is of high quality. Is the proposal enhanced by its adherence to instructions (table of contents, organization, pagination, margin and font size, the 20-page limitation, appendices, etc.); accuracy of forms; clarity of budget narrative; well prepared vitae for all key personnel associated with the project; and presentation (are ideas effectively presented, clearly articulated, thoroughly explained, etc.)?
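
By way of illustration, the sketch below (Python, not part of the regulation) tallies a hypothetical panel's scores against the maxima listed above; the shortened criterion labels and the example scores are ours, not the regulation's.

```python
# Minimal sketch: tally a research proposal's panel score against the
# per-criterion maxima listed in 7 CFR 3406.20. Labels are shortened and the
# awarded scores below are purely illustrative.

MAX_POINTS = {
    "impact": 15,
    "continuation_plans": 10,
    "innovation": 10,
    "products_and_results": 15,
    "proposed_approach": 5,
    "evaluation": 5,
    "dissemination": 5,
    "partnerships": 15,
    "institutional_enhancement": 15,
    "institutional_commitment": 15,
    "personnel_resources": 10,
    "budget": 10,
    "cost_effectiveness": 5,
    "overall_quality": 5,
}

def total_score(awarded: dict[str, float]) -> float:
    """Sum awarded points, refusing scores that exceed a criterion's weight."""
    total = 0.0
    for criterion, points in awarded.items():
        cap = MAX_POINTS[criterion]  # KeyError on unknown criteria is intentional
        if not 0 <= points <= cap:
            raise ValueError(f"{criterion}: {points} outside 0..{cap}")
        total += points
    return total

# Example: a partially scored proposal.
print(total_score({"impact": 12, "innovation": 8, "partnerships": 13}))  # 33.0
```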

Evaluation of research proposals by peer review panels: broader panels for broader assessments?

Rebecca Abma-Schouten, Joey Gijbels, Wendy Reijmerink, Ingeborg Meijer, Evaluation of research proposals by peer review panels: broader panels for broader assessments?, Science and Public Policy, Volume 50, Issue 4, August 2023, Pages 619–632, https://doi.org/10.1093/scipol/scad009

Panel peer review is widely used to decide which research proposals receive funding. Through this exploratory observational study at two large biomedical and health research funders in the Netherlands, we gain insight into how scientific quality and societal relevance are discussed in panel meetings. We explore, in ten review panel meetings of biomedical and health funding programmes, how panel composition and formal assessment criteria affect the arguments used. We observe that more scientific arguments are used than arguments related to societal relevance and expected impact. Also, more diverse panels result in a wider range of arguments, largely for the benefit of arguments related to societal relevance and impact. We discuss how funders can contribute to the quality of peer review by creating a shared conceptual framework that better defines research quality and societal relevance. We also contribute to a further understanding of the role of diverse peer review panels.

Scientific biomedical and health research is often supported by project or programme grants from public funding agencies such as governmental research funders and charities. Research funders primarily rely on peer review, often a combination of independent written review and discussion in a peer review panel, to inform their funding decisions. Peer review panels have the difficult task of integrating and balancing the various assessment criteria to select and rank the eligible proposals. With the increasing emphasis on societal benefit and being responsive to societal needs, the assessment of research proposals ought to include broader assessment criteria, including both scientific quality and societal relevance, and a broader perspective on relevant peers. This results in new practices of including non-scientific peers in review panels ( Del Carmen Calatrava Moreno et al. 2019 ; Den Oudendammer et al. 2019 ; Van den Brink et al. 2016 ). Relevant peers, in the context of biomedical and health research, include, for example, health-care professionals, (healthcare) policymakers, and patients as the (end-)users of research.

Currently, in scientific and grey literature, much attention is paid to what legitimate criteria are and to deficiencies in the peer review process, for example, focusing on the role of chance and the difficulty of assessing interdisciplinary or ‘blue sky’ research ( Langfeldt 2006 ; Roumbanis 2021a ). Our research primarily builds upon the work of Lamont (2009) , Huutoniemi (2012) , and Kolarz et al. (2016) . Their work articulates how the discourse in peer review panels can be understood by giving insight into disciplinary assessment cultures and social dynamics, as well as how panel members define and value concepts such as scientific excellence, interdisciplinarity, and societal impact. At the same time, there is little empirical work on what actually is discussed in peer review meetings and to what extent this is related to the specific objectives of the research funding programme. Such observational work is especially lacking in the biomedical and health domain.

The aim of our exploratory study is to learn what arguments panel members use in a review meeting when assessing research proposals in biomedical and health research programmes. We explore how arguments used in peer review panels are affected by (1) the formal assessment criteria and (2) the inclusion of non-scientific peers in review panels, also called (end-)users of research, societal stakeholders, or societal actors. We add to the existing literature by focusing on the actual arguments used in peer review assessment in practice.

To this end, we observed ten panel meetings across eight biomedical and health research programmes at two large research funders in the Netherlands: the governmental research funder The Netherlands Organisation for Health Research and Development (ZonMw) and the charitable research funder the Dutch Heart Foundation (DHF). Our first research question focuses on what arguments panel members use when assessing research proposals in a review meeting. The second examines to what extent these arguments correspond with the formal criteria on scientific quality and societal impact creation, as described in the programme brochure and assessment form. The third question focuses on how the arguments used differ between panel members with different perspectives.

2.1 Relation between science and society

To understand the dual focus of scientific quality and societal relevance in research funding, a theoretical understanding and a practical operationalisation of the relation between science and society are needed. The conceptualisation of this relationship affects both who are perceived as relevant peers in the review process and the criteria by which research proposals are assessed.

The relationship between science and society is neither constant over time nor static, and it remains much debated. Scientific knowledge can have a huge impact on societies, either intended or unintended. Vice versa, the social environment and structure in which science takes place influence the rate of development, the topics of interest, and the content of science. However, the second part of this inter-relatedness between science and society generally receives less attention ( Merton 1968 ; Weingart 1999 ).

From a historical perspective, scientific and technological progress contributed to the view that science was valuable on its own account and that science and the scientist stood independent of society. While this protected science from unwarranted political influence, societal disengagement with science resulted in less authority by science and debate about its contribution to society. This interdependence and mutual influence contributed to a modern view of science in which knowledge development is valued both on its own merit and for its impact on, and interaction with, society. As such, societal factors and problems are important drivers for scientific research. This warrants that the relation and boundaries between science, society, and politics need to be organised and constantly reinforced and reiterated ( Merton 1968 ; Shapin 2008 ; Weingart 1999 ).

Glerup and Horst (2014) conceptualise the value of science to society and the role of society in science in four rationalities that reflect different justifications for their relation and thus also for who is responsible for (assessing) the societal value of science. The rationalities are arranged along two axes: one is related to the internal or external regulation of science and the other is related to either the process or the outcome of science as the object of steering. The first two rationalities of Reflexivity and Demarcation focus on internal regulation in the scientific community. Reflexivity focuses on the outcome: its central idea is that science, and thus scientists, should learn from societal problems and provide solutions. Demarcation focuses on the process: science should continuously question its own motives and methods. The latter two rationalities of Contribution and Integration focus on external regulation. The core of the outcome-oriented Contribution rationality is that scientists do not necessarily see themselves as ‘working for the public good’. Science should thus be regulated by society to ensure that outcomes are useful. The central idea of the process-oriented Integration rationality is that societal actors should be involved in science in order to influence the direction of research.

Research funders can be seen as external or societal regulators of science. They can focus on organising the process of science, Integration, or on scientific outcomes that function as solutions for societal challenges, Contribution. In the Contribution perspective, a funder could enhance outside (societal) involvement in science to ensure that scientists take responsibility to deliver results that are needed and used by society. From the Integration perspective, it follows that actors from science and society need to work together in order to produce the best results. In this perspective, there is a lack of integration between science and society, and more collaboration and dialogue are needed to develop a new kind of integrative responsibility ( Glerup and Horst 2014 ). This argues for the inclusion of other types of evaluators in research assessment. In reality, these rationalities are not mutually exclusive and also not strictly separated. As a consequence, multiple rationalities can be recognised in the reasoning of scientists and in the policies of research funders today.

2.2 Criteria for research quality and societal relevance

The rationalities of Glerup and Horst have consequences for which language is used to discuss societal relevance and impact in research proposals. Even though the main ingredients are quite similar, as a consequence of the coexisting rationalities in science, societal aspects can be defined and operationalised in different ways ( Alla et al. 2017 ). In the definition of societal impact by Reed, emphasis is placed on the outcome: the contribution to society. It includes the significance for society, the size of the potential impact, and the reach, the number of people or organisations benefiting from the expected outcomes ( Reed et al. 2021 ). Other models and definitions focus more on the process of science and its interaction with society. Spaapen and Van Drooge introduced productive interactions in the assessment of societal impact, highlighting direct contact between researchers and other actors. A key idea is that interaction in different domains leads to impact in different domains ( Meijer 2012 ; Spaapen and Van Drooge 2011 ). Definitions that focus on the process often refer to societal impact as (1) something that can take place in distinguishable societal domains, (2) something that needs to be actively pursued, and (3) something that requires interactions with societal stakeholders (or users of research) ( Hughes and Kitson 2012 ; Spaapen and Van Drooge 2011 ).

Glerup and Horst show that process- and outcome-oriented aspects can be combined in the operationalisation of criteria for assessing research proposals on societal aspects. The funders participating in this study likewise include both the outcome (the value created in different domains) and the process (productive interactions with stakeholders) in their formal assessment criteria for societal relevance and impact. Different labels are used for these criteria, such as societal relevance, societal quality, and societal impact ( Abma-Schouten 2017 ; Reijmerink and Oortwijn 2017 ). In this paper, we use societal relevance or societal relevance and impact.

Scientific quality in research assessment frequently refers to all aspects and activities in the study that contribute to the validity and reliability of the research results and that contribute to the integrity and quality of the research process itself. The criteria commonly include the relevance of the proposal for the funding programme, the scientific relevance, originality, innovativeness, methodology, and feasibility ( Abdoul et al. 2012 ). Several studies demonstrated that quality is seen as not only a rich concept but also a complex concept in which excellence and innovativeness, methodological aspects, engagement of stakeholders, multidisciplinary collaboration, and societal relevance all play a role ( Geurts 2016 ; Roumbanis 2019 ; Scholten et al. 2018 ). Another study showed a comprehensive definition of ‘good’ science, which includes creativity, reproducibility, perseverance, intellectual courage, and personal integrity. It demonstrated that ‘good’ science involves not only scientific excellence but also personal values and ethics, and engagement with society ( Van den Brink et al. 2016 ). Noticeable in these studies is the connection made between societal relevance and scientific quality.

In summary, the criteria for scientific quality and societal relevance are conceptualised in different ways, and perspectives on the role of societal value creation and the involvement of societal actors vary strongly. Research funders hence have to pay attention to what the criteria mean to the panel members they recruit, and to navigate and negotiate how the criteria are applied in assessing research proposals. To be able to do so, more insight is needed into which elements of scientific quality and societal relevance are discussed in practice by peer review panels.

2.3 Role of funders and societal actors in peer review

National governments and charities are important funders of biomedical and health research. How this funding is distributed varies per country. Project funding is frequently allocated based on research programming by specialised public funding organisations, such as the Dutch Research Council in the Netherlands and ZonMw for health research. The DHF, the second largest private non-profit research funder in the Netherlands, provides project funding ( Private Non-Profit Financiering 2020 ). Funders, as so-called boundary organisations, can act as key intermediaries between government, science, and society ( Jasanoff 2011 ). Their responsibility is to develop effective research policies connecting societal demands and scientific ‘supply’. This includes setting up and executing fair and balanced assessment procedures ( Sarewitz and Pielke 2007 ). Herein, the role of societal stakeholders is receiving increasing attention ( Benedictus et al. 2016 ; De Rijcke et al. 2016 ; Dijstelbloem et al. 2013 ; Scholten et al. 2018 ).

All charitable health research funders in the Netherlands have, in the last decade, included patients at different stages of the funding process, including in assessing research proposals ( Den Oudendammer et al. 2019 ). To facilitate research funders in involving patients in assessing research proposals, the federation of Dutch patient organisations set up an independent reviewer panel with (at-risk) patients and direct caregivers ( Patiëntenfederatie Nederland, n.d .). Other foundations have set up societal advisory panels including a wider range of societal actors than patients alone. The Committee Societal Quality (CSQ) of the DHF includes, for example, (at-risk) patients and a wide range of cardiovascular health-care professionals who are not active as academic researchers. This model is also applied by the Diabetes Foundation and the Princess Beatrix Muscle Foundation in the Netherlands ( Diabetesfonds, n.d .; Prinses Beatrix Spierfonds, n.d .).

In 2014, the Lancet presented a series of five papers about biomedical and health research known as the ‘increasing value, reducing waste’ series ( Macleod et al. 2014 ). The authors addressed several issues as well as potential solutions that funders can implement. They highlight, among others, the importance of improving the societal relevance of the research questions and including the burden of disease in research assessment in order to increase the value of biomedical and health science for society. A better understanding of and an increasing role of users of research are also part of the described solutions ( Chalmers et al. 2014 ; Van den Brink et al. 2016 ). This is also in line with the recommendations of the 2013 Declaration on Research Assessment (DORA) ( DORA 2013 ). These recommendations influence the way in which research funders operationalise their criteria in research assessment, how they balance the judgement of scientific and societal aspects, and how they involve societal stakeholders in peer review.

2.4 Panel peer review of research proposals

To assess research proposals, funders rely on the services of peer experts to review the thousands or perhaps millions of research proposals seeking funding each year. While often associated with scholarly publishing, peer review also includes the ex ante assessment of research grant and fellowship applications ( Abdoul et al. 2012 ). Peer review of proposals often includes a written assessment of a proposal by an anonymous peer and a peer review panel meeting to select the proposals eligible for funding. Peer review is an established component of professional academic practice, is deeply embedded in the research culture, and essentially consists of experts in a given domain appraising the professional performance, creativity, and/or quality of scientific work produced by others in their field of competence ( Demicheli and Di Pietrantonj 2007 ). The history of peer review as the default approach for scientific evaluation and accountability is, however, relatively young. While the term was unheard of in the 1960s, by 1970, it had become the standard. Since that time, peer review has become increasingly diverse and formalised, resulting in more public accountability ( Reinhart and Schendzielorz 2021 ).

While many studies have been conducted concerning peer review in scholarly publishing, peer review in grant allocation processes has been less discussed ( Demicheli and Di Pietrantonj 2007 ). The most extensive work on this topic has been conducted by Lamont (2009) , who studied peer review panels in five American research funding organisations, including observing three panels. Other examples include Roumbanis’s ethnographic observations of ten review panels at the Swedish Research Council in natural and engineering sciences ( Roumbanis 2017 , 2021a ). Also, Huutoniemi was able to study, but not observe, four panels on environmental studies and social sciences of the Academy of Finland ( Huutoniemi 2012 ). Additionally, Van Arensbergen and Van den Besselaar (2012) examined peer review through interviews and by analysing the scores and outcomes at different stages of the peer review process in a talent funding programme. Particularly interesting is the study by Luo and colleagues of 164 written panel review reports, which showed that reviews from panels that included non-scientific peers described broader and more concrete impact topics. Mixed panels also more often connected research processes and characteristics of applicants with impact creation ( Luo et al. 2021 ).

While these studies primarily focused on peer review panels in other disciplinary domains or are based on interviews or reports instead of direct observations, we believe that many of the findings are relevant to the functioning of panels in the context of biomedical and health research. From this literature, we learn to have realistic expectations of peer review. It is inherently difficult to predict in advance which research projects will provide the most important findings or breakthroughs ( Lee et al. 2013 ; Pier et al. 2018 ; Roumbanis 2021a , 2021b ). At the same time, these limitations may not substantiate the replacement of peer review by another assessment approach ( Wessely 1998 ). Many topics addressed in the literature are inter-related and relevant to our study, such as disciplinary differences and interdisciplinarity, social dynamics and their consequences for consistency and bias, and suggestions to improve panel peer review ( Lamont and Huutoniemi 2011 ; Lee et al. 2013 ; Pier et al. 2018 ; Roumbanis 2021a , b ; Wessely 1998 ).

Different scientific disciplines show different preferences and beliefs about how to build knowledge and thus have different perceptions of excellence. However, panellists are willing to respect and acknowledge other standards of excellence ( Lamont 2009 ). Evaluation cultures also differ between scientific fields. Science, technology, engineering, and mathematics panels might, in comparison with panellists from social sciences and humanities, be more concerned with the consistency of the assessment across panels and therefore with clear definitions and uses of assessment criteria ( Lamont and Huutoniemi 2011 ). However, there is still much to learn about how panellists’ cognitive affiliations with particular disciplines unfold in the evaluation process. Therefore, the assessment of interdisciplinary research is much more complex than just improving the criteria or procedure, because less explicit repertoires would also need to change ( Huutoniemi 2012 ).

Social dynamics play a role as panellists may differ in their motivation to engage in allocation processes, which could create bias ( Lee et al. 2013 ). Placing emphasis on meeting established standards or thoroughness in peer review may promote uncontroversial and safe projects, especially in a situation where strong competition puts pressure on experts to reach a consensus ( Langfeldt 2001 ,2006 ). Personal interest and cognitive similarity may also contribute to conservative bias, which could negatively affect controversial or frontier science ( Luukkonen 2012 ; Roumbanis 2021a ; Travis and Collins 1991 ). Central in this part of literature is that panel conclusions are the outcome of and are influenced by the group interaction ( Van Arensbergen et al. 2014a ). Differences in, for example, the status and expertise of the panel members can play an important role in group dynamics. Insights from social psychology on group dynamics can help in understanding and avoiding bias in peer review panels ( Olbrecht and Bornmann 2010 ). For example, group performance research shows that more diverse groups with complementary skills make better group decisions than homogenous groups. Yet, heterogeneity can also increase conflict within the group ( Forsyth 1999 ). Therefore, it is important to pay attention to power dynamics and maintain team spirit and good communication ( Van Arensbergen et al. 2014a ), especially in meetings that include both scientific and non-scientific peers.

The literature also provides funders with starting points to improve the peer review process. For example, the explicitness of review procedures positively influences the decision-making processes ( Langfeldt 2001 ). Strategic voting and decision-making appear to be less frequent in panels that rate than in panels that rank proposals. Also, an advisory instead of a decisional role may improve the quality of the panel assessment ( Lamont and Huutoniemi 2011 ).

Despite different disciplinary evaluative cultures, formal procedures, and criteria, panel members with different backgrounds develop shared customary rules of deliberation that facilitate agreement and help avoid situations of conflict ( Huutoniemi 2012 ; Lamont 2009 ). This is a necessary prerequisite for opening up peer review panels to include non-academic experts. When doing so, it is important to realise that panel review is a social, emotional, and interactional process. It is therefore important to also take these non-cognitive aspects into account when studying cognitive aspects ( Lamont and Guetzkow 2016 ), as we do in this study.

In summary, what we learn from the literature is that (1) the specific criteria used to operationalise the scientific quality and societal relevance of research are important, (2) the rationalities from Glerup and Horst predict that not everyone values societal aspects and involves non-scientists in peer review to the same extent and in the same way, (3) this may affect the way peer review panels discuss these aspects, and (4) peer review is a challenging group process that could accommodate other rationalities in order to prevent bias towards specific scientific criteria. To disentangle these aspects, we have carried out an observational study of a diverse range of peer review panel sessions using a fixed set of criteria focusing on scientific quality and societal relevance.

3.1 Research assessment at ZonMw and the DHF

The peer review approach and the criteria used by both the DHF and ZonMw are largely comparable. Funding programmes at both organisations start with a brochure describing the purposes, goals, and conditions for research applications, as well as the assessment procedure and criteria. Both organisations apply a two-stage process. In the first phase, reviewers are asked to write a peer review. In the second phase, a panel reviews the application based on the advice of the written reviews and the applicants’ rebuttal. The panels advise the board on eligible proposals for funding including a ranking of these proposals.

There are also differences between the two organisations. At ZonMw, the criteria for societal relevance and quality are operationalised in the ZonMw Framework Fostering Responsible Research Practices ( Reijmerink and Oortwijn 2017 ). This contributes to a common operationalisation of both quality and societal relevance on the level of individual funding programmes. Important elements in the criteria for societal relevance are, for instance, stakeholder participation, (applying) holistic health concepts, and the added value of knowledge in practice, policy, and education. The framework was developed to optimise the funding process from the perspective of knowledge utilisation and includes concepts like productive interactions and Open Science. It is part of the ZonMw Impact Assessment Framework aimed at guiding the planning, monitoring, and evaluation of funding programmes ( Reijmerink et al. 2020 ). At ZonMw, interdisciplinary panels are set up specifically for each funding programme. Panels are interdisciplinary in nature with academics of a wide range of disciplines and often include non-academic peers, like policymakers, health-care professionals, and patients.

At the DHF, the criteria for scientific quality and societal relevance, at the DHF called societal impact , find their origin in the strategy report of the advisory committee CardioVascular Research Netherlands ( Reneman et al. 2010 ). This report forms the basis of the DHF research policy focusing on scientific and societal impact by creating national collaborations in thematic, interdisciplinary research programmes (the so-called consortia) connecting preclinical and clinical expertise into one concerted effort. An International Scientific Advisory Committee (ISAC) was established to assess these thematic consortia. This panel consists of international scientists, primarily with expertise in the broad cardiovascular research field. The DHF criteria for societal impact were redeveloped in 2013 in collaboration with their CSQ. This panel assesses and advises on the societal aspects of proposed studies. The societal impact criteria include the relevance of the health-care problem, the expected contribution to a solution, attention to the next step in science and towards implementation in practice, and the involvement of and interaction with (end-)users of research (R.Y. Abma-Schouten and I.M. Meijer, unpublished data). Peer review panels for consortium funding are generally composed of members of the ISAC, members of the CSQ, and ad hoc panel members relevant to the specific programme. CSQ members often have a pre-meeting before the final panel meetings to prepare and empower CSQ representatives participating in the peer review panel.

3.2 Selection of funding programmes

To compare and evaluate observations between the two organisations, we selected funding programmes that were relatively comparable in scope and aims. The criteria were (1) a translational and/or clinical objective and (2) a selection procedure in which review panels were responsible for the (final) relevance and quality assessment of grant applications. In total, we selected eight programmes: four at each organisation. At the DHF, two programmes were chosen in which the CSQ did not participate, to better disentangle the role of the panel composition. For each programme, we observed the selection process, varying from one session on one day (taking 2–8 h) to multiple sessions over several days. Ten sessions were observed in total, of which eight were final peer review panel meetings and two were CSQ meetings preparing for the panel meeting.

After management approval for the study in both organisations, we asked the programme managers and panel chairpersons of the selected programmes for their consent to the observation; none refused participation. Panel members were informed about the planned observation and the anonymous analyses in a passive consent procedure.

To ensure the independence of this evaluation, the selection of the grant programmes and of the peer review panels observed was at the discretion of the project team of this study. The observations and supervision of the analyses were performed by the senior author, who is not affiliated with the funders.

3.3 Observation matrix

Given the lack of a common operationalisation for scientific quality and societal relevance, we decided to use an observation matrix with a fixed set of detailed aspects as a gold standard to score the brochures, the assessment forms, and the arguments used in panel meetings. The matrix used for the observations of the review panels was based upon and adapted from a ‘grant committee observation matrix’ developed by Van Arensbergen. The original matrix informed a literature review on the selection of talent through peer review and the social dynamics in grant review committees ( van Arensbergen et al. 2014b ). The matrix includes four categories of aspects: societal relevance, scientific quality, applicant-related, and committee-related aspects (see  Table 1 ). The aspects of scientific quality and societal relevance were adapted to fit the operationalisation of scientific quality and societal relevance of the organisations involved. The aspects concerning societal relevance were derived from the CSQ criteria, and the aspects concerning scientific quality were based on the scientific criteria of the first panel observed. The four argument types related to the panel were kept as they were. This committee-related category reflects statements that are related to the personal experience or preference of a panel member and can be seen as signals for bias. This category also includes statements that compare a project with another project without further substantiation. The three applicant-related arguments in the original observation matrix were extended with a fourth on social skills in communication with society. We added health technology assessment (HTA) because one programme specifically focused on this aspect. We tested our version of the observation matrix in pilot observations.

Table 1. Aspects included in the observation matrix and examples of arguments.

Criterion: scientific quality
Fit in programme objectives: ‘This disease is underdiagnosed, and undertreated, and therefore fits the criteria of this call very well.’ ‘Might have a relevant impact on patient care, but to what extent does it align with the aims of this programme.’
Match science and health-care problem: ‘It is not properly compared to the current situation (standard of care).’ ‘Super relevant application with a fitting plan, perhaps a little too mechanistic.’
International competitiveness: ‘Something is done all over the world, but they do many more evaluations, however.’
Feasibility of the aims: ‘… because this is a discovery study the power calculation is difficult, but I would recommend to increase the sample size.’ ‘It’s very risky, because this is an exploratory … study without hypotheses.’ ‘The aim is to improve …, but there is no control to compare with.’ ‘Well substantiated that they are able to achieve the objectives.’
Plan of work: ‘Will there be enough cases in this cohort?’ ‘The budget is no longer correct.’ ‘Plan is good, but … doubts about the approach, because too little information….’

Criterion: societal relevance
Health-care problem: ‘Relevant problem for a small group.’ ‘… but is this a serious health condition?’ ‘Prevalence is low, but patients do die, morbidity is very high.’
Contribution to solution: ‘What will this add since we already do…?’ ‘It is unclear what the intervention will be after the diagnosis.’ ‘Relevance is not good. Side effects are not known and neither is effectiveness.’
Next step in science: ‘What is needed to go from this retrospective study towards implementation?’ ‘It’s not clear whether that work package is necessary or “nice to have”.’ ‘Knowledge utilisation paragraph is standard, as used by copywriters.’
Activities towards partners: ‘What do the applicants do to change the current practice?’ ‘Important that the company also contributes financially to the further development.’ ‘This proposal includes a good communication plan.’
Participation/diversity: ‘A user committee is described, but it isn’t well thought through: what is their role?’ ‘It’s also important to invite relatives of patients to participate.’ ‘They thought really well what their patient group can contribute to the study plan.’

Applicant-related aspects
Scientific publication applicant: ‘One project leader only has one original paper, …, focus more on other diseases.’ ‘Publication output not excellent. Conference papers and posters of local meetings, CV not so strong.’
Background applicant: ‘… not enough with this expertise involved in the leadership.’ ‘Very good CV, … has won many awards.’ ‘Candidate is excellent, top 10 to 20 in this field….’
Reputation applicant: ‘… the main applicant is a hotshot in this field.’ ‘Candidate leads cohorts as …, gets a no.’
Societal skills: ‘Impressed that they took my question seriously, that made my day.’ ‘They were very honest about overoptimism in the proposal.’ ‘Good group, but they seem quite aware of their own brilliance.’

HTA
HTA: ‘Concrete revenues are negative, however improvement in quality-adjusted life years but very shaky.’

Committee-related aspects
Personal experience with the applicant: ‘This researcher only wants to acquire knowledge, nothing further.’ ‘I reviewed him before and he is not very good at interviews.’
Personal/unasserted preference: ‘Excellent presentation, much better than the application.’ (Without further elaboration) ‘This academic lab has advantages, but also disadvantages with regard to independence.’ ‘If it can be done anywhere, it is in this group.’
Relation with applicants’ institute/network: ‘May come up with new models, they’re linked with a group in … who can do this very well.’
Comparison with other applications: ‘What is the relevance compared to the other proposal? They do something similar.’ ‘Look at the proposals as a whole, portfolio, we have clinical and we have fundamental.’

3.4 Observations

Data were primarily collected through observations. Our observations of review panel meetings were non-participatory: the observer and goal of the observation were introduced at the start of the meeting, without further interactions during the meeting. To aid in the processing of observations, some meetings were audiotaped (sound only). Presentations or responses of applicants were not noted and were not part of the analysis. The observer made notes on the ongoing discussion and scored the arguments while listening. One meeting was not attended in person and only observed and scored by listening to the audiotape recording. Because this made identification of the panel members unreliable, this panel meeting was excluded from the analysis of the third research question on how arguments used differ between panel members with different perspectives.

3.5 Grant programmes and the assessment criteria

We gathered and analysed all brochures and assessment forms used by the review panels in order to answer our second research question on the correspondence of the arguments used with the formal criteria. Several programmes consisted of multiple grant calls: in that case, the specific call brochure was gathered and analysed, not the overall programme brochure. Additional documentation (e.g. instructional presentations at the start of the panel meeting) was not included in the document analysis. All included documents were marked using the aforementioned observation matrix. The panel-related arguments were not used because this category reflects the personal arguments of panel members, which are not part of brochures or instructions. To avoid potential differences in scoring methods, two of the authors each independently scored half of the documents, which were afterwards checked and validated by the other. Differences were discussed until a consensus was reached.

3.6 Panel composition

In order to answer the third research question, background information on panel members was collected. We categorised the panel members into five common types of panel members: scientific, clinical scientific, health-care professional/clinical, patient, and policy. First, a list of all panel members was composed, including their scientific and professional backgrounds and affiliations. The theoretical notion that reviewers represent different types of users of research, and therefore different potential impact domains (academic, social, economic, and cultural), guided the categorisation ( Meijer 2012 ; Spaapen and Van Drooge 2011 ). Because clinical researchers play a dual role, both advancing research as fellow academics and using the research output in health-care practice, we divided the academic members into two categories of non-clinical and clinical researchers. Multiple types of professional actors participated in each review panel. These were divided into two groups for the analysis: health-care professionals (without current academic activity) and policymakers in the health-care sector. No representatives of the private sector participated in the observed review panels. From the public domain, (at-risk) patients and patient representatives were part of several review panels. Only publicly available information was used to classify the panel members. Members were assigned to one category only: categorisation took place based on the specific role and expertise for which they were appointed to the panel.
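
As an illustration of this categorisation, the sketch below assigns hypothetical panel members to exactly one of the five categories; the member identifiers and roles are invented for the example and are not taken from the study data.

```python
# Minimal sketch of the member categorisation described above: each panel
# member is assigned to exactly one of five categories, based on the role and
# expertise for which they were appointed. Members and roles are hypothetical.
PANEL_CATEGORIES = (
    "scientific",
    "clinical scientific",
    "health-care professional/clinical",
    "patient",
    "policy",
)

members = {
    "member_01": "clinical scientific",                 # clinical researcher (dual role)
    "member_02": "scientific",                          # non-clinical academic
    "member_03": "health-care professional/clinical",   # no current academic activity
    "member_04": "patient",                             # (at-risk) patient representative
    "member_05": "policy",                              # policymaker in the health-care sector
}

# Guard against categories outside the predefined set.
assert all(category in PANEL_CATEGORIES for category in members.values())
```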

In two of the four DHF programmes, the assessment procedure included the CSQ. In these two programmes, representatives of this CSQ participated in the scientific panel to articulate the findings of the CSQ meeting during the final assessment meeting. Two grant programmes were assessed by a review panel with solely (clinical) scientific members.

3.7 Analysis

Data were processed using ATLAS.ti 8 and Microsoft Excel 2010 to produce descriptive statistics. All observed arguments were coded and given a randomised identification code for the panel member using that particular argument. The number of times an argument type was observed was used as an indicator for the relative importance of that argument in the appraisal of proposals. With this approach, a practical and reproducible method for research funders to evaluate the effect of policy changes on peer review was developed. If codes or notes were unclear, post-observation validation of codes was carried out based on observation matrix notes. Arguments that were noted by the observer but could not be matched with an existing code were first coded as a ‘non-existing’ code, and these were resolved by listening back to the audiotapes. Arguments that could not be assigned to a panel member were assigned a ‘missing panel member’ code. A total of 4.7 per cent of all codes were assigned a ‘missing panel member’ code.
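
To make the counting step concrete, the sketch below tallies coded arguments by argument type and by observation-matrix category, and reports the share of arguments carrying a 'missing panel member' code. The records and code labels are hypothetical; the actual coding and counting were done in ATLAS.ti 8 and Microsoft Excel.

```python
# Minimal sketch of the counting step described above: each observed argument
# carries an argument-type code and a (randomised) panel-member code, and the
# frequency of each argument type is used as an indicator of its relative
# importance in the appraisal. Records and labels here are hypothetical.
from collections import Counter

observations = [
    {"meeting": "P1", "member": "M03", "argument": "feasibility_of_aims"},
    {"meeting": "P1", "member": "M07", "argument": "health_care_problem"},
    {"meeting": "P1", "member": "MISSING", "argument": "plan_of_work"},
    # ... one record per argument scored during a panel meeting
]

CATEGORY = {  # maps argument types to the observation-matrix categories
    "feasibility_of_aims": "scientific quality",
    "plan_of_work": "scientific quality",
    "health_care_problem": "societal relevance",
    # ... remaining aspects omitted for brevity
}

by_argument = Counter(o["argument"] for o in observations)
by_category = Counter(CATEGORY[o["argument"]] for o in observations)
missing_member = sum(o["member"] == "MISSING" for o in observations)

print(by_argument.most_common())
print(by_category)
print(f"share with missing panel member code: {missing_member / len(observations):.1%}")
```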

After the analyses, two meetings were held to reflect on the results: one with the CSQ and the other with the programme coordinators of both organisations. The goal of these meetings was to improve our interpretation of the findings, disseminate the results derived from this project, and identify topics for further analyses or future studies.

3.8 Limitations

Our study focuses on the final phase of the peer review process of research applications in a real-life setting. Our design, a non-participant observation of peer review panels, also introduced several challenges ( Liu and Maitlis 2010 ).

First, the independent review phase or pre-application phase was not part of our study. We therefore could not assess to what extent attention to certain aspects of scientific quality or societal relevance and impact in the review phase influenced the topics discussed during the meeting.

Second, the most important challenge of overt non-participant observations is the observer effect: the danger of causing reactivity in those under study. We believe that the consequences of this effect on our conclusions were limited because panellists are used to external observers in the meetings of these two funders. The observer briefly explained the goal of the study during the introductory round of the panel in general terms. The observer sat as unobtrusively as possible and avoided reactivity to discussions. Similar to previous observations of panels, we experienced that the fact that an observer was present faded into the background during a meeting ( Roumbanis 2021a ). However, a limited observer effect can never be entirely excluded.

Third, our choice to score only the arguments raised, and not the responses of the applicants or information on the content of the proposals, has both advantages and drawbacks. With this approach, we could assure the anonymity of the grant procedures reviewed, the applicants and proposals, panels, and individual panellists. This was an important condition for the funders involved. We took the frequency with which arguments were used as a proxy for the relative importance of each argument in decision-making, which undeniably also has its caveats. Our data collection approach limits more in-depth reflection on which arguments were decisive in decision-making and on group dynamics during the interaction with the applicants, as non-verbal and non-content-related comments were not captured in this study.

Fourth, despite this being one of the largest observational studies of the peer review assessment of grant applications, with ten panels observed across eight grant programmes, many variables, both within and beyond our view, might explain differences in the arguments used. Examples of ‘confounding’ variables are the many variations in panel composition, the differences in objectives of the programmes, and the range of the funding programmes. Our study should therefore be seen as exploratory and thus warrants caution in drawing conclusions.

4.1 Overview of observational data

The grant programmes included in this study reflected a broad range of biomedical and health funding programmes, ranging from fellowship grants to translational research and applied health research. All formal documents available to the applicants and to the review panel were retrieved for both ZonMw and the DHF. In total, eighteen documents corresponding to the eight grant programmes were studied. The number of proposals assessed per programme varied from three to thirty-three. The duration of the panel meetings varied between 2 h and two consecutive days. Together, this resulted in a large spread in the number of total arguments used in an individual meeting and in a grant programme as a whole. In the shortest meeting, 49 arguments were observed versus 254 in the longest, with a mean of 126 arguments per meeting and on average 15 arguments per proposal.

Overall, we found consistency between how criteria were operationalised in the grant programmes’ brochures and in the assessment forms of the review panels. At the same time, because the number of elements included in the observation matrix is limited, there was considerable diversity in the arguments that fell within each aspect (see examples in  Table 1 ). Some of these differences could possibly be explained by differences in the language used and the level of detail in the observation matrix, the brochure, and the panel’s instructions. This was especially the case for the applicant-related aspects, for which the observation matrix was more detailed than the text in the brochures and assessment forms.

In interpreting our findings, it is important to take into account that, even though our data were largely complete and the observation matrix matched well with the description of the criteria in the brochures and assessment forms, there was a large diversity in the type and number of arguments used and in the number of proposals assessed in the grant programmes included in our study.

4.2 Wide range of arguments used by panels: scientific arguments used most

For our first research question, we explored the number and type of arguments used in the panel meetings. Figure 1 provides an overview of the arguments used. Scientific quality was discussed most. The number of times the feasibility of the aims was discussed clearly stands out in comparison with all other arguments. The match between the science and the problem studied and the plan of work were also frequently discussed aspects of scientific quality. International competitiveness of the proposal was discussed the least of the five scientific arguments.

Figure 1. The number of arguments used in panel meetings.

Attention was paid to societal relevance and impact in the panel meetings of both organisations. Yet, the language used differed somewhat between organisations. The contribution to a solution and the next step in science were the most often used societal arguments. At ZonMw, the impact of the health-care problem studied and the activities towards partners were less frequently discussed than the other three societal arguments. At the DHF, the five societal arguments were used equally often.

With the exception of the fellowship programme meeting, applicant-related arguments were not often used. The fellowship panel used arguments related to the applicant and to scientific quality about equally often. Committee-related arguments were also rarely used in the majority of the eight grant programmes observed. In three of the ten panel meetings, one or two arguments were observed that related to personal experience with the applicant or their direct network. In seven of the ten meetings, statements were observed that were not substantiated or that were explicitly announced as reflecting a personal preference. Their frequency varied between one and seven statements per meeting (sixteen in total), which is low in comparison with the other arguments used (see Fig. 1 for examples).

4.3 Use of arguments varied strongly per panel meeting

The balance between scientific and societal arguments varied strongly per grant programme, panel, and organisation. At ZonMw, two meetings showed an approximately equal balance of societal and scientific arguments; in the other two meetings, scientific arguments were used two to four times as often as societal arguments. At the DHF, three types of panels were observed, and each showed a different pattern in the relative use of societal and scientific arguments. In the two CSQ-only meetings, societal arguments were used approximately twice as often as scientific arguments. In the two meetings of the scientific panels, societal arguments were used infrequently (between zero and four times per argument category). In the combined societal and scientific panel meetings, the use of societal and scientific arguments was more balanced.
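A minimal sketch of how this balance could be quantified per meeting is given below; the meeting labels and counts are hypothetical and only illustrate the ratio computation.

```python
# Minimal sketch: balance of scientific vs. societal arguments per meeting.
# All counts are invented placeholders, not the observed data.
meetings = {
    "ZonMw meeting A": {"scientific": 60, "societal": 58},
    "ZonMw meeting B": {"scientific": 80, "societal": 20},
    "DHF CSQ-only":    {"scientific": 25, "societal": 52},
    "DHF scientific":  {"scientific": 70, "societal": 4},
}

for name, counts in meetings.items():
    sci, soc = counts["scientific"], counts["societal"]
    ratio = sci / soc if soc else float("inf")
    print(f"{name}: {sci} scientific vs {soc} societal (ratio {ratio:.1f})")
```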

4.4 Match of arguments used by panels with the assessment criteria

To answer our second research question, we looked into the relation between the arguments used and the formal criteria. We observed that the range of arguments used was often broader than the description of the criteria in the brochure and assessment instruction. However, arguments related to aspects that were consistently included in the brochure and instruction seemed to be discussed more frequently than in programmes where those aspects were not consistently included or were not included at all. Although the match of the science with the health-care problem and the background and reputation of the applicant were not always made explicit in the brochure or instructions, they were discussed in many panel meetings. Supplementary Fig. S1 visualises how the arguments used differ between the programmes in which those aspects were, or were not, consistently included in the brochure and instruction forms.
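The comparison described here can be sketched as a simple grouping of argument counts by whether the related criterion was consistently included in the brochure and instruction; the observations below are invented placeholders, not the programmes studied.

```python
# Minimal sketch: mean argument use grouped by whether the related criterion
# was consistently included in the programme documents. Placeholder data.
from collections import defaultdict
from statistics import mean

# (programme, argument, times used, criterion consistently included?)
observations = [
    ("P1", "match science-problem",   14, True),
    ("P2", "match science-problem",    6, False),
    ("P3", "activities to partners",   2, False),
    ("P4", "activities to partners",   9, True),
]

by_inclusion = defaultdict(list)
for _, _, count, included in observations:
    by_inclusion[included].append(count)

for included, counts in by_inclusion.items():
    label = "consistently included" if included else "not (consistently) included"
    print(f"{label}: mean {mean(counts):.1f} uses per programme")
```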

4.5 Two-thirds of the assessment was driven by scientific panel members

To answer our third question, we looked into the differences in arguments used between panel members representing a scientific, clinical scientific, professional, policy, or patient perspective. In each research programme, the majority of panellists had a scientific background (n = 35); thirty-four members had a clinical scientific background, twenty had a health professional/clinical background, eight represented a policy perspective, and fifteen represented a patient perspective. Of the total number of arguments (1,097), two-thirds were made by members with a scientific or clinical scientific perspective. Members with a scientific background engaged most actively in the discussion, with a mean of twelve arguments per member; clinical scientists and health-care professionals participated with a mean of nine arguments each, while members with a policy or patient perspective put forward the fewest arguments on average, namely seven and eight, respectively. Figure 2 provides a complete overview of the total and mean number of arguments used by the different disciplines in the various panels.

Figure 2. The total and mean number of arguments displayed per subgroup of panel members.
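As an illustration of how the totals and means behind Figure 2 are computed, the sketch below divides argument totals by the panel sizes reported in the text. The panel sizes come from the paragraph above; the argument totals are placeholder values chosen only to roughly reproduce the reported means.

```python
# Minimal sketch: total and mean number of arguments per panel-member background.
# Panel sizes follow the text; the argument totals are invented placeholders
# chosen to roughly reproduce the means reported in the study.
panel_sizes = {
    "scientific": 35,
    "clinical scientific": 34,
    "health professional": 20,
    "policy": 8,
    "patient": 15,
}
argument_totals = {
    "scientific": 420,
    "clinical scientific": 306,
    "health professional": 180,
    "policy": 56,
    "patient": 120,
}

grand_total = sum(argument_totals.values())
for background, total in argument_totals.items():
    share = total / grand_total
    per_member = total / panel_sizes[background]
    print(f"{background:>21}: {total:4d} arguments "
          f"({share:4.0%} of all), {per_member:.0f} per member")
```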

4.6 Diverse use of arguments by panellists, but background matters

In meetings of both organisations, we observed a diverse use of arguments by the panel members. Yet, the use of arguments varied depending on the background of the panel member (see Fig. 3). Those with a scientific or clinical scientific perspective primarily used scientific arguments. As could be expected, health-care professionals and patients used societal arguments more often.

Figure 3. The use of arguments differentiated by panel member background.

A further breakdown of arguments across backgrounds showed clear differences in the use of scientific arguments between the different disciplines of panellists. Scientists and clinical scientists discussed the feasibility of the aims more than twice as often as their second most frequently used element of scientific quality, the match between the science and the problem studied. Patients and members with a policy or health professional background put forward fewer but more varied scientific arguments.

Patients and health-care professionals accounted for approximately half of the societal arguments used, despite forming a much smaller part of the panels' overall composition. In other words, members with a scientific perspective were less likely to use societal arguments: the relevance of the health-care problem studied, activities towards partners, and arguments related to participation and diversity were not used often by this group. Patients often used arguments related to patient participation and diversity and activities towards partners, although the frequency of the latter differed per organisation.

The majority of the applicant-related arguments were put forward by scientists, including clinical scientists. Committee-related arguments were very rare and are therefore not differentiated by panel member background, except for comments comparing a proposal with other applications; these were mainly put forward by panel members with a scientific background. HTA-related arguments were often used by panel members with a scientific perspective, whereas panel members with other perspectives rarely used them (see Supplementary Figs S2–S4 for a visual presentation of the differences between panel members on all aspects included in the matrix).

5.1 Explanations for arguments used in panels

Our observations show that most arguments related to scientific quality were used frequently. However, except for feasibility, the frequency with which arguments were used varied strongly between the meetings and between the individual proposals discussed. That most arguments were not used consistently is not surprising given previous studies showing heterogeneity in grant application assessments and low consistency in comments and scores by independent reviewers (Abdoul et al. 2012; Pier et al. 2018). In an analysis of written assessments on nine observed dimensions, no dimension was used in more than 45 per cent of the reviews (Hartmann and Neidhardt 1990).

There are several possible explanations for this heterogeneity. Roumbanis (2021a) described how being responsive to the different challenges in the proposals and to the points of attention arising from the written assessments influenced discussion in panels. Also, when a disagreement arises, more time is spent on discussion (Roumbanis 2021a). One could infer that unambiguous, and thus not debated, aspects might remain largely undetected in our study. We believe, however, that the main points relevant to the assessment will not remain entirely unmentioned, because most panels in our study started the discussion with a short summary of the proposal, the written assessment, and the rebuttal. Lamont (2009), however, points out that opening statements serve more goals than decision-making alone: they can also increase the credibility of the panellist by showing their comprehension and balanced assessment of an application. We therefore cannot entirely disentangle whether the arguments observed most often were also considered most important or decisive, or whether they were simply the topics that led to the most disagreement.

An interesting difference from Roumbanis' study was the available discussion time per proposal. In our study, most panels handled a limited number of proposals, allowing for longer discussions than the often 2-minute time frame that Roumbanis (2021b) described, potentially contributing to a wider range of arguments being discussed. Limited time per proposal might also limit the number of panellists contributing to the discussion of each proposal (De Bont 2014).

5.2 Reducing heterogeneity by improving operationalisation and the consequent use of assessment criteria

We found that the language used for the operationalisation of the assessment criteria in programme brochures and in the observation matrix was much more detailed than in the instruction for the panel, which was often very concise. The exercise also illustrated that many terms were used interchangeably.

This was especially true for the applicant-related aspects. Several panels discussed how talent should be assessed. This confusion is understandable considering the changing values in research and its assessment (Moher et al. 2018) and the fact that the funders' instruction was very concise. For example, it was not made explicit whether the individual or the team should be assessed. Van Arensbergen et al. (2014b) described how, in grant allocation processes, talent is generally assessed using a limited set of characteristics. More objective and quantifiable outputs often prevailed at the expense of recognising and rewarding a broad variety of skills and traits combining professional, social, and individual capital (DORA 2013).

In addition, committee-related arguments, such as personal experiences with the applicant or their institute, were rarely used in our study. Comparisons between proposals were sometimes made without further argumentation, mainly by scientific panel members. This was especially pronounced in one (fellowship) grant programme with a high number of proposals. In this programme, the panel meeting concentrated on quickly comparing the quality of the applicants and the proposals based on the reviewers' judgements, rather than on a more in-depth discussion of the different aspects of the proposals. Because the review phase was not part of this study, the question of which aspects were used for the assessment of the proposals in this panel remains partially unanswered. However, weighing and comparing proposals on different aspects and with different inputs is a core element of scientific peer review, both of papers and of grants (Hirschauer 2010). The large role of scientific panel members in comparing proposals is therefore not surprising.

One could anticipate that more consistent language in the operationalisation of criteria may lead to more clarity for both applicants and panellists and to more consistency in the assessment of research proposals. The trend in our observations was that arguments were used less when the related criteria were not, or not consistently, included in the brochure and panel instruction. It remains, however, challenging to disentangle the influence of the formal definitions of criteria on the arguments used. Previous studies also encountered difficulties in studying the role of the formal instruction in peer review but concluded that this role is relatively limited (Langfeldt 2001; Reinhart 2010).

The lack of a clear operationalisation of criteria can contribute to heterogeneity in peer review, as many scholars have found that assessors differ in their conceptualisation of good science and in the importance they attach to various aspects of research quality and societal relevance (Abdoul et al. 2012; Geurts 2016; Scholten et al. 2018; Van den Brink et al. 2016). The large variation in, and the absence of a gold standard for, the interpretation of scientific quality and societal relevance affects the consistency of peer review. As a consequence, it is challenging to systematically evaluate and improve peer review in order to fund the research that contributes most to science and society. To contribute to responsible research and innovation, it is therefore important that funders invest in a more consistent and conscientious peer review process (Curry et al. 2020; DORA 2013).

A common conceptualisation of scientific quality and societal relevance and impact could improve the alignment between views on good scientific conduct, programmes’ objectives, and the peer review in practice. Such a conceptualisation could contribute to more transparency and quality in the assessment of research. By involving panel members from all relevant backgrounds, including the research community, health-care professionals, and societal actors, in a better operationalisation of criteria, more inclusive views of good science can be implemented more systematically in the peer review assessment of research proposals. The ZonMw Framework Fostering Responsible Research Practices is an example of an initiative aiming to support standardisation and integration ( Reijmerink et al. 2020 ).

Given the lack of a common definition or conceptualisation of scientific quality and societal relevance, an important decision in our study was to use a fixed set of detailed aspects of two important criteria as a gold standard for scoring the brochures, the panel instructions, and the arguments used by the panels. This approach proved helpful in disentangling the different components of scientific quality and societal relevance. That said, it is important not to oversimplify the causes of heterogeneity in peer review, because these substantive arguments are not independent of non-cognitive, emotional, or social aspects (Lamont and Guetzkow 2016; Reinhart 2010).

5.3 Do more diverse panels contribute to a broader use of arguments?

Both funders participating in our study have an outspoken public mission that requires sufficient attention to societal aspects in assessment processes. In reality, as observed in several panels, the main focus of peer review meetings is on scientific arguments. In addition to the possible explanations given earlier, the composition of the panel might play a role in explaining the arguments used in panel meetings. Our results show that health-care professionals and patients bring in more societal arguments than scientists, including those who are also clinicians. It is, however, not that simple: in the more diverse panels, panel members used more societal arguments than in the less diverse panels, regardless of their backgrounds.

Observing ten panel meetings was sufficient to explore differences in the arguments used by panel members with different backgrounds. The pattern of primarily scientific arguments being raised by panels with mainly scientific members is not surprising: assessing the scientific content of grant proposals is their main task and fits their competencies. One could therefore argue, depending on how one views the relationship between science and society, that health-care professionals and patients might be better suited to assess the value of research results for potential users. Scientific panel members and clinical scientists in our study used fewer arguments that reflect on opening up and connecting science directly to others who can take it further (be they industry, health-care professionals, or other stakeholders). Patients filled this gap, as these two types of arguments were the ones they put forward most often. Making an active connection with society apparently requires a broader, more diverse panel before scientists direct their attention to more societal arguments. Evident from our observations is that the presence of patients and health-care professionals in a panel seemed to increase the attention that all panel members, including scientists, paid to arguments beyond the scientific ones. This conclusion is congruent with the observation that there was a more equal balance in the use of societal and scientific arguments in the scientific panels in which the CSQ participated. It illustrates that opening up peer review panels to non-scientific members creates an opportunity to focus on both the contribution and the integrative rationality (Glerup and Horst 2014) or, in other words, to allow productive interactions between scientific and non-scientific actors. This corresponds with previous research suggesting that, with regard to societal aspects, reviews from mixed panels were broader and richer (Luo et al. 2021). In panels with non-scientific experts, more emphasis was placed on the role of the proposed research process in increasing the likelihood of societal impact than on the causal importance of scientific excellence for broader impacts. This is in line with the finding that panels with more disciplinary diversity, in range and also by including generalist experts, applied more versatile styles to reach consensus and paid more attention to relevance and pragmatic value (Huutoniemi 2012).

Our observations further illustrate that patients and health-care professionals were less vocal in panels than (clinical) scientists and were in the minority. This could reflect their social role and lower perceived authority in the panel. Several guides are available for funders to stimulate the equal participation of patients in science, and these are also applicable to their involvement in peer review panels. Measures to be taken include support and training to help prepare patients for deliberations with renowned scientists, and explicitly addressing power differences (De Wit et al. 2016). Panel chairs and programme officers have to set and supervise the conditions for the functioning of both the individual panel members and the panel as a whole (Lamont 2009).

5.4 Suggestions for future studies

In future studies, it is important to further disentangle the role of the operationalisation and appraisal of assessment criteria in reducing heterogeneity in the arguments used by panels. More controlled, experimental settings would be a valuable addition to the currently mainly observational methodologies for disentangling some of the cognitive and social factors that influence the functioning and argumentation of peer review panels. Reusing the data from the panel observations and from the written reports could also provide a starting point for a bottom-up approach to creating a more consistent and shared conceptualisation and operationalisation of assessment criteria.

To further understand the effects of opening up review panels to non-scientific peers, it is valuable to compare the role of diversity and interdisciplinarity in solely scientific panels versus panels that also include non-scientific experts.

In future studies, differences between domains and types of research should also be addressed. We hypothesise that biomedical and health research may be better suited to the inclusion of non-scientific peers in panels than other research domains. For example, it would be valuable to better understand whether potentially relevant users can be identified well enough in other research fields and to what extent non-academics can contribute to assessing the possible value of research, especially early-stage or blue-sky research.

The goal of our study was to explore in practice which arguments regarding the main criteria of scientific quality and societal relevance were used by peer review panels of biomedical and health research funding programmes. We showed that there is a wide diversity in the number and range of arguments used, but three main scientific aspects were discussed most frequently: is the approach feasible, does the science match the problem, and is the work plan scientifically sound? Nevertheless, these scientific aspects were accompanied by a substantial amount of discussion of societal aspects, of which the contribution to a solution was the most prominent. In comparison with scientific panellists, non-scientific panellists, such as health-care professionals, policymakers, and patients, often used a wider range of arguments and more societal arguments. Even more striking was that, even though non-scientific peers were often outnumbered and less vocal in panels, scientists also used a wider range of arguments when non-scientific peers were present.

It is relevant that two health research funders collaborated in the current study to reflect on and improve peer review in research funding. Few published studies describe live observations of peer review panel meetings. Many studies focus on alternatives to peer review or reflect on the outcomes of the peer review process, rather than on the practice and improvement of the peer review assessment of grant proposals. Privacy and confidentiality concerns of funders also contribute to the lack of information on the functioning of peer review panels. In this study, both organisations were willing to participate because of their interest in research funding policies in relation to enhancing the societal value and impact of science. The study provided them with practical suggestions, for example, on how to improve the alignment of the language used in programme brochures and in the instructions of review panels, and contributed to valuable knowledge exchange between the organisations. We hope that this publication stimulates more research funders to evaluate their peer review approach in research funding and to share their insights.

For a long time, research funders relied solely on scientists to design and execute the peer review of research proposals, thereby delegating responsibility for the process. Although review panels have discretionary authority, it is important that funders set and supervise the process and its conditions. We argue that one of these conditions should be the diversification of peer review panels and the opening up of panels to non-scientific peers.

Supplementary material is available at Science and Public Policy online.

Details of the data and information on how to request access are available from the first author.

Joey Gijbels and Wendy Reijmerink are employed by ZonMw. Rebecca Abma-Schouten is employed by the Dutch Heart Foundation and, as an external PhD candidate, is affiliated with the Centre for Science and Technology Studies, Leiden University.

A special thanks to the panel chairs and programme officers of ZonMw and the DHF for their willingness to participate in this project. We thank Diny Stekelenburg, an internship student at ZonMw, for her contributions to the project. Our sincerest gratitude to Prof. Paul Wouters, Sarah Coombs, and Michiel van der Vaart for proofreading and their valuable feedback. Finally, we thank the editors and anonymous reviewers of Science and Public Policy for their thorough and insightful reviews and recommendations. Their contributions are recognisable in the final version of this paper.

Abdoul   H. , Perrey   C. , Amiel   P. , et al.  ( 2012 ) ‘ Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices ’, PLoS One , 7 : 1 – 15 .


Abma-Schouten   R. Y. ( 2017 ) ‘ Maatschappelijke Kwaliteit van Onderzoeksvoorstellen ’, Dutch Heart Foundation .

Alla   K. , Hall   W. D. , Whiteford   H. A. , et al.  ( 2017 ) ‘ How Do We Define the Policy Impact of Public Health Research? A Systematic Review ’, Health Research Policy and Systems , 15 : 84.

Benedictus   R. , Miedema   F. , and Ferguson   M. W. J. ( 2016 ) ‘ Fewer Numbers, Better Science ’, Nature , 538 : 453 – 4 .

Chalmers   I. , Bracken   M. B. , Djulbegovic   B. , et al.  ( 2014 ) ‘ How to Increase Value and Reduce Waste When Research Priorities Are Set ’, The Lancet , 383 : 156 – 65 .

Curry   S. , De Rijcke   S. , Hatch   A. , et al.  ( 2020 ) ‘ The Changing Role of Funders in Responsible Research Assessment: Progress, Obstacles and the Way Ahead ’, RoRI Working Paper No. 3, London : Research on Research Institute (RoRI) .

De Bont   A. ( 2014 ) ‘ Beoordelen Bekeken. Reflecties op het Werk van Een Programmacommissie van ZonMw ’, ZonMw .

De Rijcke   S. , Wouters   P. F. , Rushforth   A. D. , et al.  ( 2016 ) ‘ Evaluation Practices and Effects of Indicator Use—a Literature Review ’, Research Evaluation , 25 : 161 – 9 .

De Wit   A. M. , Bloemkolk   D. , Teunissen   T. , et al.  ( 2016 ) ‘ Voorwaarden voor Succesvolle Betrokkenheid van Patiënten/cliënten bij Medisch Wetenschappelijk Onderzoek ’, Tijdschrift voor Sociale Gezondheidszorg , 94 : 91 – 100 .

Del Carmen Calatrava Moreno   M. , Warta   K. , Arnold   E. , et al.  ( 2019 ) Science Europe Study on Research Assessment Practices . Technopolis Group Austria .


Demicheli   V. and Di Pietrantonj   C. ( 2007 ) ‘ Peer Review for Improving the Quality of Grant Applications ’, Cochrane Database of Systematic Reviews , 2 : MR000003.

Den Oudendammer   W. M. , Noordhoek   J. , Abma-Schouten   R. Y. , et al.  ( 2019 ) ‘ Patient Participation in Research Funding: An Overview of When, Why and How Amongst Dutch Health Funds ’, Research Involvement and Engagement , 5 .

Diabetesfonds ( n.d. ) Maatschappelijke Adviesraad < https://www.diabetesfonds.nl/over-ons/maatschappelijke-adviesraad > accessed 18 Sept 2022 .

Dijstelbloem   H. , Huisman   F. , Miedema   F. , et al.  ( 2013 ) ‘ Science in Transition Position Paper: Waarom de Wetenschap Niet Werkt Zoals het Moet, En Wat Daar aan te Doen Is ’, Utrecht : Science in Transition .

Forsyth   D. R. ( 1999 ) Group Dynamics , 3rd edn. Belmont : Wadsworth Publishing Company .

Geurts   J. ( 2016 ) ‘ Wat Goed Is, Herken Je Meteen ’, NRC Handelsblad < https://www.nrc.nl/nieuws/2016/10/28/wat-goed-is-herken-je-meteen-4975248-a1529050 > accessed 6 Mar 2022 .

Glerup   C. and Horst   M. ( 2014 ) ‘ Mapping “Social Responsibility” in Science ’, Journal of Responsible Innovation , 1 : 31 – 50 .

Hartmann   I. and Neidhardt   F. ( 1990 ) ‘ Peer Review at the Deutsche Forschungsgemeinschaft ’, Scientometrics , 19 : 419 – 25 .

Hirschauer   S. ( 2010 ) ‘ Editorial Judgments: A Praxeology of “Voting” in Peer Review ’, Social Studies of Science , 40 : 71 – 103 .

Hughes   A. and Kitson   M. ( 2012 ) ‘ Pathways to Impact and the Strategic Role of Universities: New Evidence on the Breadth and Depth of University Knowledge Exchange in the UK and the Factors Constraining Its Development ’, Cambridge Journal of Economics , 36 : 723 – 50 .

Huutoniemi   K. ( 2012 ) ‘ Communicating and Compromising on Disciplinary Expertise in the Peer Review of Research Proposals ’, Social Studies of Science , 42 : 897 – 921 .

Jasanoff   S. ( 2011 ) ‘ Constitutional Moments in Governing Science and Technology ’, Science and Engineering Ethics , 17 : 621 – 38 .

Kolarz   P. , Arnold   E. , Farla   K. , et al.  ( 2016 ) Evaluation of the ESRC Transformative Research Scheme . Brighton : Technopolis Group .

Lamont   M. ( 2009 ) How Professors Think : Inside the Curious World of Academic Judgment . Cambridge : Harvard University Press .

Lamont M. and Guetzkow J. (2016) 'How Quality Is Recognized by Peer Review Panels: The Case of the Humanities', in M. Ochsner, S. E. Hug, and H.-D. Daniel (eds) Research Assessment in the Humanities, pp. 31–41. Cham: Springer International Publishing.

Lamont M. and Huutoniemi K. (2011) 'Comparing Customary Rules of Fairness: Evaluative Practices in Various Types of Peer Review Panels', in C. Camic, N. Gross, and M. Lamont (eds) Social Knowledge in the Making, pp. 209–32. Chicago: The University of Chicago Press.

Langfeldt   L. ( 2001 ) ‘ The Decision-making Constraints and Processes of Grant Peer Review, and Their Effects on the Review Outcome ’, Social Studies of Science , 31 : 820 – 41 .

——— ( 2006 ) ‘ The Policy Challenges of Peer Review: Managing Bias, Conflict of Interests and Interdisciplinary Assessments ’, Research Evaluation , 15 : 31 – 41 .

Lee   C. J. , Sugimoto   C. R. , Zhang   G. , et al.  ( 2013 ) ‘ Bias in Peer Review ’, Journal of the American Society for Information Science and Technology , 64 : 2 – 17 .

Liu F. and Maitlis S. (2010) 'Nonparticipant Observation', in A. J. Mills, G. Durepos, and E. Wiebe (eds) Encyclopedia of Case Study Research, pp. 609–11. Los Angeles: SAGE.

Luo J., Ma L., and Shankar K. (2021) 'Does the Inclusion of Non-academic Reviewers Make Any Difference for Grant Impact Panels?', Science & Public Policy, 48: 763–75.

Luukkonen   T. ( 2012 ) ‘ Conservatism and Risk-taking in Peer Review: Emerging ERC Practices ’, Research Evaluation , 21 : 48 – 60 .

Macleod   M. R. , Michie   S. , Roberts   I. , et al.  ( 2014 ) ‘ Biomedical Research: Increasing Value, Reducing Waste ’, The Lancet , 383 : 101 – 4 .

Meijer   I. M. ( 2012 ) ‘ Societal Returns of Scientific Research. How Can We Measure It? ’, Leiden : Center for Science and Technology Studies, Leiden University .

Merton   R. K. ( 1968 ) Social Theory and Social Structure , Enlarged edn. [Nachdr.] . New York : The Free Press .

Moher   D. , Naudet   F. , Cristea   I. A. , et al.  ( 2018 ) ‘ Assessing Scientists for Hiring, Promotion, And Tenure ’, PLoS Biology , 16 : e2004089.

Olbrecht   M. and Bornmann   L. ( 2010 ) ‘ Panel Peer Review of Grant Applications: What Do We Know from Research in Social Psychology on Judgment and Decision-making in Groups? ’, Research Evaluation , 19 : 293 – 304 .

Patiëntenfederatie Nederland ( n.d. ) Ervaringsdeskundigen Referentenpanel < https://www.patientenfederatie.nl/zet-je-ervaring-in/lid-worden-van-ons-referentenpanel > accessed 18 Sept 2022.

Pier E. L., Brauer M., Filut A., et al. (2018) 'Low Agreement among Reviewers Evaluating the Same NIH Grant Applications', Proceedings of the National Academy of Sciences, 115: 2952–7.

Prinses Beatrix Spierfonds ( n.d. ) Gebruikerscommissie < https://www.spierfonds.nl/wie-wij-zijn/gebruikerscommissie > accessed 18 Sep 2022 .

Rathenau Instituut (2020) Private Non-profit Financiering van Onderzoek in Nederland < https://www.rathenau.nl/nl/wetenschap-cijfers/geld/wat-geeft-nederland-uit-aan-rd/private-non-profit-financiering-van#:∼:text=R%26D%20in%20Nederland%20wordt%20gefinancierd,aan%20wetenschappelijk%20onderzoek%20in%20Nederland > accessed 6 Mar 2022.

Reneman   R. S. , Breimer   M. L. , Simoons   J. , et al.  ( 2010 ) ‘ De toekomst van het cardiovasculaire onderzoek in Nederland. Sturing op synergie en impact ’, Den Haag : Nederlandse Hartstichting .

Reed   M. S. , Ferré   M. , Marin-Ortega   J. , et al.  ( 2021 ) ‘ Evaluating Impact from Research: A Methodological Framework ’, Research Policy , 50 : 104147.

Reijmerink   W. and Oortwijn   W. ( 2017 ) ‘ Bevorderen van Verantwoorde Onderzoekspraktijken Door ZonMw ’, Beleidsonderzoek Online. accessed 6 Mar 2022.

Reijmerink   W. , Vianen   G. , Bink   M. , et al.  ( 2020 ) ‘ Ensuring Value in Health Research by Funders’ Implementation of EQUATOR Reporting Guidelines: The Case of ZonMw ’, Berlin : REWARD|EQUATOR .

Reinhart   M. ( 2010 ) ‘ Peer Review Practices: A Content Analysis of External Reviews in Science Funding ’, Research Evaluation , 19 : 317 – 31 .

Reinhart   M. and Schendzielorz   C. ( 2021 ) Trends in Peer Review . SocArXiv . < https://osf.io/preprints/socarxiv/nzsp5 > accessed 29 Aug 2022.

Roumbanis   L. ( 2017 ) ‘ Academic Judgments under Uncertainty: A Study of Collective Anchoring Effects in Swedish Research Council Panel Groups ’, Social Studies of Science , 47 : 95 – 116 .

——— ( 2021a ) ‘ Disagreement and Agonistic Chance in Peer Review ’, Science, Technology & Human Values , 47 : 1302 – 33 .

——— ( 2021b ) ‘ The Oracles of Science: On Grant Peer Review and Competitive Funding ’, Social Science Information , 60 : 356 – 62 .

VSNU, NFU, KNAW, NWO, and ZonMw (2019) 'Ruimte voor ieders talent (Position Paper)', Den Haag. < https://www.universiteitenvannederland.nl/recognitionandrewards/wp-content/uploads/2019/11/Position-paper-Ruimte-voor-ieders-talent.pdf >.

DORA (2013) San Francisco Declaration on Research Assessment. < https://sfdora.org > accessed 2 Jan 2022.

Sarewitz   D. and Pielke   R. A.  Jr. ( 2007 ) ‘ The Neglected Heart of Science Policy: Reconciling Supply of and Demand for Science ’, Environmental Science & Policy , 10 : 5 – 16 .

Scholten   W. , Van Drooge   L. , and Diederen   P. ( 2018 ) Excellent Is Niet Gewoon. Dertig Jaar Focus op Excellentie in het Nederlandse Wetenschapsbeleid . The Hague : Rathenau Instituut .

Shapin   S. ( 2008 ) The Scientific Life : A Moral History of a Late Modern Vocation . Chicago : University of Chicago press .

Spaapen   J. and Van Drooge   L. ( 2011 ) ‘ Introducing “Productive Interactions” in Social Impact Assessment ’, Research Evaluation , 20 : 211 – 8 .

Travis   G. D. L. and Collins   H. M. ( 1991 ) ‘ New Light on Old Boys: Cognitive and Institutional Particularism in the Peer Review System ’, Science, Technology & Human Values , 16 : 322 – 41 .

Van Arensbergen   P. and Van den Besselaar   P. ( 2012 ) ‘ The Selection of Scientific Talent in the Allocation of Research Grants ’, Higher Education Policy , 25 : 381 – 405 .

Van Arensbergen P., Van der Weijden I., and Van den Besselaar P. (2014a) 'The Selection of Talent as a Group Process: A Literature Review on the Social Dynamics of Decision Making in Grant Panels', Research Evaluation, 23: 298–311.

—— ( 2014b ) ‘ Different Views on Scholarly Talent: What Are the Talents We Are Looking for in Science? ’, Research Evaluation , 23 : 273 – 84 .

Van den Brink , G. , Scholten , W. , and Jansen , T. , eds ( 2016 ) Goed Werk voor Academici . Culemborg : Stichting Beroepseer .

Weingart   P. ( 1999 ) ‘ Scientific Expertise and Political Accountability: Paradoxes of Science in Politics ’, Science & Public Policy , 26 : 151 – 61 .

Wessely   S. ( 1998 ) ‘ Peer Review of Grant Applications: What Do We Know? ’, The Lancet , 352 : 301 – 5 .





How to Write a Research Proposal | Examples & Templates

Published on October 12, 2022 by Shona McCombes and Tegan George. Revised on November 21, 2023.

Structure of a research proposal

A research proposal describes what you will investigate, why it’s important, and how you will conduct your research.

The format of a research proposal varies between fields, but most proposals will contain at least these elements:

  • Introduction
  • Literature review
  • Research design
  • Reference list

While the sections may vary, the overall objective is always the same. A research proposal serves as a blueprint and guide for your research plan, helping you get organized and feel confident in the path forward you choose to take.


Academics often have to write research proposals to get funding for their projects. As a student, you might have to write a research proposal as part of a grad school application , or prior to starting your thesis or dissertation .

In addition to helping you figure out what your research can look like, a proposal can also serve to demonstrate why your project is worth pursuing to a funder, educational institution, or supervisor.

Research proposal aims
  • Show your reader why your project is interesting, original, and important.
  • Demonstrate your comfort and familiarity with your field.
  • Show that you understand the current state of research on your topic.
  • Make a case for your methodology.
  • Demonstrate that you have carefully thought about the data, tools, and procedures necessary to conduct your research.
  • Confirm that your project is feasible within the timeline of your program or funding deadline.

Research proposal length

The length of a research proposal can vary quite a bit. A bachelor’s or master’s thesis proposal can be just a few pages, while proposals for PhD dissertations or research funding are usually much longer and more detailed. Your supervisor can help you determine the best length for your work.

One trick to get started is to think of your proposal’s structure as a shorter version of your thesis or dissertation , only without the results , conclusion and discussion sections.

Download our research proposal template


Writing a research proposal can be quite challenging, but a good starting point could be to look at some examples. We’ve included a few for you below.

  • Example research proposal #1: “A Conceptual Framework for Scheduling Constraint Management”
  • Example research proposal #2: “Medical Students as Mediators of Change in Tobacco Use”

Like your dissertation or thesis, the proposal will usually have a title page that includes:

  • The proposed title of your project
  • Your supervisor’s name
  • Your institution and department

The first part of your proposal is the initial pitch for your project. Make sure it succinctly explains what you want to do and why.

Your introduction should:

  • Introduce your topic
  • Give necessary background and context
  • Outline your  problem statement  and research questions

To guide your introduction , include information about:

  • Who could have an interest in the topic (e.g., scientists, policymakers)
  • How much is already known about the topic
  • What is missing from this current knowledge
  • What new insights your research will contribute
  • Why you believe this research is worth doing

As you get started, it’s important to demonstrate that you’re familiar with the most important research on your topic. A strong literature review  shows your reader that your project has a solid foundation in existing knowledge or theory. It also shows that you’re not simply repeating what other people have already done or said, but rather using existing research as a jumping-off point for your own.

In this section, share exactly how your project will contribute to ongoing conversations in the field by:

  • Comparing and contrasting the main theories, methods, and debates
  • Examining the strengths and weaknesses of different approaches
  • Explaining how you will build on, challenge, or synthesize prior scholarship

Following the literature review, restate your main  objectives . This brings the focus back to your own project. Next, your research design or methodology section will describe your overall approach, and the practical steps you will take to answer your research questions.

Building a research proposal methodology

To finish your proposal on a strong note, explore the potential implications of your research for your field. Emphasize again what you aim to contribute and why it matters.

For example, your results might have implications for:

  • Improving best practices
  • Informing policymaking decisions
  • Strengthening a theory or model
  • Challenging popular or scientific beliefs
  • Creating a basis for future research

Last but not least, your research proposal must include correct citations for every source you have used, compiled in a reference list . To create citations quickly and easily, you can use our free APA citation generator .

Some institutions or funders require a detailed timeline of the project, asking you to forecast what you will do at each stage and how long it may take. While not always required, be sure to check the requirements of your project.

Here’s an example schedule to help you get started. You can also download a template at the button below.

Download our research schedule template

Example research schedule

Research phase and objectives (deadline)
1. Background research and literature review (20th January)
2. Research design planning and data analysis methods (13th February)
3. Data collection and preparation with selected participants and code interviews (24th March)
4. Data analysis of interview transcripts (22nd April)
5. Writing (17th June)
6. Revision of final work (28th July)
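If it helps to sanity-check such a schedule, the sketch below converts the example deadlines into the number of days available per phase. The calendar year and the project start date are assumed for illustration only.

```python
# Minimal sketch: days available per phase, derived from the example deadlines above.
# The year (2025) and the 1 January start date are assumptions for illustration.
from datetime import date

deadlines = [
    ("Background research and literature review", date(2025, 1, 20)),
    ("Research design planning",                  date(2025, 2, 13)),
    ("Data collection and interview coding",      date(2025, 3, 24)),
    ("Data analysis of interview transcripts",    date(2025, 4, 22)),
    ("Writing",                                   date(2025, 6, 17)),
    ("Revision of final work",                    date(2025, 7, 28)),
]

previous = date(2025, 1, 1)  # assumed project start
for phase, deadline in deadlines:
    days = (deadline - previous).days
    print(f"{phase}: {days} days available (deadline {deadline:%d %b})")
    previous = deadline
```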

If you are applying for research funding, chances are you will have to include a detailed budget. This shows your estimates of how much each part of your project will cost.

Make sure to check what type of costs the funding body will agree to cover. For each item, include:

  • Cost : exactly how much money do you need?
  • Justification : why is this cost necessary to complete the research?
  • Source : how did you calculate the amount?

To determine your budget, think about the following (a small worked example follows this list):

  • Travel costs : do you need to go somewhere to collect your data? How will you get there, and how much time will you need? What will you do there (e.g., interviews, archival research)?
  • Materials : do you need access to any tools or technologies?
  • Help : do you need to hire any research assistants for the project? What will they do, and how much will you pay them?
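A minimal sketch of such a budget table is shown below; all items, amounts, justifications, and sources are invented placeholders.

```python
# Minimal sketch: a budget table with cost, justification, and source per item.
# Every figure and item here is an invented placeholder.
budget = [
    # (item, cost, justification, source of the estimate)
    ("Travel to archive",   850.00, "Two data-collection visits", "Rail fares plus per diem"),
    ("Transcription tool",  300.00, "Interview transcription",    "Vendor price list"),
    ("Research assistant", 2400.00, "Coding of 30 interviews",    "80 h at institutional rate"),
]

total = sum(cost for _, cost, _, _ in budget)
for item, cost, justification, source in budget:
    print(f"{item:<20} {cost:8.2f}  {justification} ({source})")
print(f"{'Total':<20} {total:8.2f}")
```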

If you want to know more about the research process , methodology , research bias , or statistics , make sure to check out some of our other articles with explanations and examples.

Methodology

  • Sampling methods
  • Simple random sampling
  • Stratified sampling
  • Cluster sampling
  • Likert scales
  • Reproducibility

 Statistics

  • Null hypothesis
  • Statistical power
  • Probability distribution
  • Effect size
  • Poisson distribution

Research bias

  • Optimism bias
  • Cognitive bias
  • Implicit bias
  • Hawthorne effect
  • Anchoring bias
  • Explicit bias

Once you’ve decided on your research objectives , you need to explain them in your paper, at the end of your problem statement .

Keep your research objectives clear and concise, and use appropriate verbs to accurately convey the work that you will carry out for each one.

I will compare …

A research aim is a broad statement indicating the general purpose of your research project. It should appear in your introduction at the end of your problem statement , before your research objectives.

Research objectives are more specific than your research aim. They indicate the specific ways you’ll address the overarching aim.

A PhD, which is short for philosophiae doctor (doctor of philosophy in Latin), is the highest university degree that can be obtained. In a PhD, students spend 3–5 years writing a dissertation , which aims to make a significant, original contribution to current knowledge.

A PhD is intended to prepare students for a career as a researcher, whether that be in academia, the public sector, or the private sector.

A master’s is a 1- or 2-year graduate degree that can prepare you for a variety of careers.

All master’s involve graduate-level coursework. Some are research-intensive and intend to prepare students for further study in a PhD; these usually require their students to write a master’s thesis . Others focus on professional training for a specific career.

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

The best way to remember the difference between a research plan and a research proposal is that they have fundamentally different audiences. A research plan helps you, the researcher, organize your thoughts. On the other hand, a dissertation proposal or research proposal aims to convince others (e.g., a supervisor, a funding body, or a dissertation committee) that your research topic is relevant and worthy of being conducted.



Molecular Foundry

PROPOSAL GUIDE 6. Proposal Questions and Evaluation Criteria

Project Goals and Significance (20 points)

Proposal question: Describe the scientific or technological motivation, long-term goals, and significance of your project in the context of your field of study. Describe your immediate goal in the context of a one-year Molecular Foundry User Project.

Review criteria: To what extent is the proposed research expected to significantly advance the scientific or technological field? How likely is the proposed work to produce high-quality publications?

Project Plan and Timeline (15 points)

Proposal question: Describe your project plan, indicating what work will be done at your home institution and what will be done in the Foundry facilities requested in “Resource Request”. Describe how each of these pieces of work contributes to the immediate goal identified in “Project Goals and Significance”. Describe the expected timeline of your project, including an estimate of the time required for each piece of work.

Review criteria: Is the experimental plan achievable at the Foundry within the proposed project term (not to exceed one year for a Standard Proposal)? How will the results obtained at the Foundry complement and propel follow-up research at the user’s home institution?

Resource Request (15 points)

Proposal question: Describe how Foundry capabilities and expertise are needed to realize the immediate goal identified in “Project Goals and Significance”. Identify the lead facility and any support facilities requested, and give a short description of the intended work at each.

Review criteria: Is the request for Foundry resources well justified? To what extent will the proposed work take advantage of the unique capabilities or combinations of capabilities/expertise (either within a single facility, or across the Foundry as a whole)?

Relevant Experience (10 points)

Proposal question: Describe the current level of expertise of each researcher in relation to the proposed work. For researchers expected to do hands-on work at the Foundry, be explicit about their level of training or experience on the requested instruments and capabilities.

Review criteria: Are the users adequately prepared for efficient use of limited Foundry resources? How do the users’ track records of innovative, technically demanding research inform the likelihood of success of the proposed project?
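As an illustration of how the point weights combine, the sketch below sums hypothetical reviewer scores against the maxima of the four criteria listed in this section; any additional criteria in the guide are not included, and the reviewer scores are invented.

```python
# Minimal sketch: combining per-criterion scores into a proposal total.
# Maximum points follow the four criteria shown above; the scores are invented.
criteria_max = {
    "Project goals and significance": 20,
    "Project plan and timeline":      15,
    "Resource request":               15,
    "Relevant experience":            10,
}
reviewer_scores = {
    "Project goals and significance": 17,
    "Project plan and timeline":      12,
    "Resource request":               13,
    "Relevant experience":             8,
}

total = sum(reviewer_scores.values())
maximum = sum(criteria_max.values())
for criterion, maxi in criteria_max.items():
    print(f"{criterion:<32} {reviewer_scores[criterion]:>2}/{maxi}")
print(f"{'Total':<32} {total:>2}/{maximum}")
```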



SciELO in Perspective


How to assess research proposals?

By Lilian Nassi-Calò


The peer review of research proposals (grants) aims to judge the merit of projects and researchers and to enable the best to be selected for funding. The high number of candidates and proposals, however, has saturated reviewers, who find themselves immersed in growing numbers of projects without knowing the best way to assess them.

In a post previously published in this blog, the possibility of making reviews of grant proposals openly available was discussed as a way to help researchers devise better proposals, while allowing public recognition of referees and helping to prevent fraud in the appraisal process. This alternative draws on the successful experience of journals that have made peer reviewers' comments openly available alongside the published article.

Recently, Ewan Birney, director of the European Molecular Biology Laboratory's European Bioinformatics Institute in Hinxton, UK, asked his Twitter followers for practical suggestions on how to identify the best candidates from the hundreds of research grant submissions received by his institution. To his surprise, the scientific community responded enthusiastically with many suggestions, which, in turn, led to further comments on Twitter. The experience was reported in Nature 1 , which is also receiving comments on its page.

Birney 2 started the debate on Twitter by asking for a proxy for quality, other than the journal title, to assess the competence of candidates whose articles together added up to about 2,500. Yoav Gilad 3 , a geneticist at the University of Chicago, IL, US, advised him to read the 2,500 abstracts or the papers themselves, even if it meant including more referees in the assessment process. Birney said that he considered this correct, although not feasible. Birney thinks, like many, that a journal's title or its Impact Factor (IF) does not necessarily reflect the individual quality of the papers. Moreover, his task is even more difficult because it includes assessing proposals that do not fall exactly within his area of expertise. “Of course, even if I was using journal as proxy here it wouldn’t help me – everyone here has published ‘well’”.

The discussion continued on Twitter with a suggestion from Stephen Curry 4 , a structural biologist at Imperial College London, to ask candidates to identify their four most relevant publications and justify their choices in a one-page report. Richard Sever 5 , co-founder of the Cold Spring Harbor Laboratory (CSHL) repository of biomedical articles, bioRxiv, and assistant director of CSHL Press, considered it a good idea, pointing out, however, that this method could end up selecting candidates who are good at writing one-page summaries.

The biggest concern for Birney in using citation-based metrics, as suggested by many researchers, lies in the fact that they vary considerably between disciplines and may not be comparable in a heterogeneous sample. Hugo Hilton 6 , an immunologist at Stanford University in CA, US, expressed his concern, as a candidate, that selection processes are subject to unclear criteria and to classic biases such as the prestige of the journals in which applicants publish. It is worth mentioning here the Declaration on Research Assessment (DORA) of 2012 7 , in which members of the American Society for Cell Biology pledged not to use the IF to evaluate researchers in grant proposals, career promotions, and hiring, precisely to avoid such distortions. To date, the Declaration has been signed by over 150 prominent scientists and 80 academic organizations.

Birney says that referees should have a certain degree of autonomy in assessing the proposals and that there is no problem if they do not all follow exactly the same procedures in their assessments. “I would prefer subjective but unbiased opinions, and five of them with different criteria than trying to unify the criteria so we all agree with the same answers.” However, he points out, transparency in the process is essential.

Despite being aware of the problems of using journal prestige as a proxy for quality, Birney believes that its use is unavoidable given the large volume of proposals and candidates. He also advises candidates to highlight their achievements clearly in the proposal, rather than simply pointing to journal titles in their publication lists.

The paper in Nature received several comments suggesting ways to speed up the evaluation process and come up with shortlists. Registered users can also submit their views on the topic 8 . Join the discussion!

1. CHAWLA, D.S. How to judge scientists’ strengths. Nature . 2015, vol. 527, p. 279. DOI: 10.1038/527279f

2. Ewan Birney: http://twitter.com/ewanbirney

3. Yoav Gilad: http://twitter.com/Y_Gilad

4. Stephen Curry: http://twitter.com/Stephen_Curry

5. Richard Sever: http://twitter.com/cshperspectives

6. Hugo Hilton: http://twitter.com/Hilton_HG

7. SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Declaration recommends eliminate the use of Impact factor for research evaluation . SciELO in Perspective. [viewed 22 November 2015]. Available from: http://blog.scielo.org/en/2013/07/16/declaration-recommends-eliminate-the-use-of-impact-factor-for-research-evaluation/

8. < http://www.nature.com/foxtrot/svc/login?type=commenting >

CHAWLA, D.S. How to judge scientists’ strengths. Nature . 2015, vol. 527, p. 279. DOI: 10.1038/527279f

MALHOTRA, V. and MARDER, E. Peer review: The pleasure of publishing – originally published in the journal eLife in January/2015 . SciELO in Perspective. [viewed 21 November 2015]. Available from: http://blog.scielo.org/en/2015/05/11/peer-review-the-pleasure-of-publishing-originally-published-in-the-journal-elife-in-january2015/

SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Could grant proposal reviews be made available openly?. SciELO in Perspective. [viewed 21 November 2015]. Available from: http://blog.scielo.org/en/2015/03/20/could-grant-proposal-reviews-be-made-available-openly/

SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Declaration recommends eliminate the use of Impact factor for research evaluation . SciELO in Perspective. [viewed 22 November 2015]. Available from: http://blog.scielo.org/en/2013/07/16/declaration-recommends-eliminate-the-use-of-impact-factor-for-research-evaluation/

SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Paper proposes four pillars for scholarly communication to favor the speed and the quality of science . SciELO in Perspective. [viewed 21 November 2015]. Available from: http://blog.scielo.org/en/2013/07/31/paper-proposes-four-pillars-for-scholarly-communication-to-favor-the-speed-and-the-quality-of-science/

SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Peer-review as a research topic in its own right . SciELO in Perspective. [viewed 21 November 2015]. Available from: http://blog.scielo.org/en/2015/04/24/peer-review-as-a-research-topic-in-its-own-right/

SCIENTIFIC ELECTRONIC LIBRARY ONLINE. Scientometrics of peer-reviewers – will they be finally recognized? . SciELO in Perspective. [viewed 21 November 2015]. Available from: http://blog.scielo.org/en/2014/05/14/scientometrics-of-peer-reviewers-will-they-be-finally-recognized/

External links

bioRxiv – < http://biorxiv.org/ >

San Francisco Declaration on Research Assessment – < http://am.ascb.org/dora/ >

About Lilian Nassi-Calò

Lilian Nassi-Calò studied chemistry at Instituto de Química – USP, holds a doctorate in biochemistry from the same institution, and completed a post-doctorate as an Alexander von Humboldt fellow in Wuerzburg, Germany. After her studies, she was a professor and researcher at IQ-USP. She also worked as an industrial chemist and is currently Coordinator of Scientific Communication at BIREME/PAHO/WHO and a collaborator of SciELO.

Translated from the original in Portuguese by Lilian Nassi-Calò.


Proposal Assessment

If this proposal has been submitted to a specific scheme, then the research council will provide additional information in the Instructions to Reviewer section.

- EPSRC specific

- ESRC specific

- MRC specific

- NC3Rs specific

- STFC specific

EPSRC - Specific Requirements:

Scheme Specific Guidance:

This proposal has been submitted against a specific scheme/call, which will have explicit aims and objectives and which will have set out additional assessment criteria relating to meeting these. You should ensure you have read the scheme guidance and/or call document and should comment here on how well the proposal meets the aims of the call and the extent to which it addresses all the specific criteria.

EPSRC reviewer guidance and the specific assessment criteria for each scheme are available on the EPSRC website at https://www.epsrc.ac.uk/funding/assessmentprocess/review/.

If the proposal has been submitted in response to a published call, you are asked to read that call document and to make your assessment of the proposal within the context of the aims, objectives and specific assessment criteria for that call. The call document can be found on the EPSRC website by following this link: http://www.epsrc.ac.uk/funding/calls/.


ESRC - Specific Requirements

ESRC Academic Assessment guidance

Reviewer scoring scale

ESRC uses a numerical reviewer’s scoring scale from 1 to 6. This scale is used for all ESRC schemes where proposals are sent to external peer reviewers and then to a panel meeting for a final funding recommendation; this includes Research Grant proposals. Please note that proposals to fast-track calls are assessed using the Panel Introducer scoring scale, from 1-10.

Please also be aware that for Research Grant proposals we allow an applicant (PI) response to reviewers’ comments.

For your overall score, please use the following:

* These descriptions refer solely to scientific quality for simplicity. However, your score should take into account all the assessment criteria for the specific scheme, as detailed below.

If you feel unable to assess a proposal against a particular criterion, you can also indicate this by ticking ‘Unable to assess’ against that criterion on the reviewer form (effectively score 0).

Detailed notes on Academic Reviewer Guidance covering Assessment Criteria can be located here: http://www.esrc.ac.uk/funding/guidance-for-peer-reviewers/

ESRC User Assessment Guidance

Reviewer scoring scales

ESRC uses a numerical reviewer’s scoring scale from 1 to 6. This scale is now used for all ESRC schemes where proposals are sent to external peer reviewers and then to a panel meeting for a final funding recommendation; this includes Research Grant proposals. Please note that proposals to fast-track calls are assessed using the Panel Introducer scoring scale, from 1-10.

High (equivalent to score 6)

Research of high importance to users of research, i.e., of such novelty or timeliness and promise that a significant contribution to policy or practice is likely.

Worthy (equivalent to score 4)

Research that will add to understanding and is worthy of support but which may not be of such relevance or urgency as to have a significant influence on policy or practice.

Reject (equivalent to score 2)

Research which is flawed in its proposed contribution to policy or practice or is repetitious of other work.

Detailed comments in support of these scores should be provided in the free text overall assessment section. 

If you wish to comment on scientific quality, please use the above ‘Academic’ scoring table guide.

Detailed notes on User Reviewer Guidance covering Assessment Criteria can be located here: http://www.esrc.ac.uk/funding/guidance-for-peer-reviewers/


MRC – Specific Requirements

Proposal Assessment (criteria)

Reviews are based around three core criteria:

Importance: how important are the questions, or gaps in knowledge, that are being addressed?

Scientific potential: what are the prospects for good scientific progress?

Resources requested: are the funds requested essential for the work, and do the importance and scientific potential justify funding on the scale requested?   

We also ask reviewers to consider other aspects of the research, such as the potential impact, ethical issues, data management plans, appropriate use of animals, the research environment and more. Detailed criteria for the different schemes we operate can be found in the Reviewers Handbook, along with a series of questions that you should consider when preparing your review.

For further guidance for peer reviewers please see the MRC website by selecting:   http://www.mrc.ac.uk/funding/peer-review/guidance-for-peer-reviewers/

NC3Rs - Specific Requirements:

ALL comments in this section will be sent, unedited, to the applicant. Your identity will not be revealed.  

Scientific Potential

Please consider the following:

What are the prospects for good scientific progress?

Has the host Research Organisation demonstrated a commitment to supporting the work?

Is there a firm foundation to take the work forward?

Are collaborators well chosen?

Research Plans

Please comment on:

How innovative is the proposal? To your knowledge, is the same or similar work being undertaken elsewhere?

Are the experimental plans realistic and feasible, given the aims of the research and the resources?

Are the methods and study designs appropriate? Are sufficient details given?

If appropriate, is there suitable preliminary data included? Note: this may be limited for the pilot study grant scheme.

What are the scientific, technical or organisational challenges and have the applicants identified plans to tackle them?

In the case of applications for Pilot Study grants, how will the work be developed and how feasible are the subsequent proposals?

Is information provided on what the next steps for evaluation, validation and implementation will be?

With regard to animal work

- Has information been provided on care, husbandry and refinements to procedures?

- Has the number of animals been minimised?

Ethics and Research Governance of the proposal

In completing this section please consider the following:

Is the work ethically acceptable?

Are there any ethical issues that need separate consideration?

Are the ethical review and research governance arrangements clear and acceptable?

Where applicable, have replacement, refinement and reduction been applied to the proposed work?

Risks of research misuse

Please consider if there are any ethical, safety or security issues, or other potential adverse consequences, associated with the proposed research.

Are there any tangible risks that the research would generate outcomes that could be misused for harmful purposes?

Are there any actions which could lead to harm to humans, animals or the environment - including terrorist misuse?

If such issues exist, have these been addressed satisfactorily in the proposal?

Relevance to NC3Rs Strategy

Practical advances in applying the 3Rs to animal research are important in order to ensure high-quality, reproducible and humane science, and to address public concerns regarding the use of animals. One of the key aims of the NC3Rs is to promote the development of new research approaches that have a reduced reliance on the use of animals and/or lead to improved animal welfare. The Centre does this partly through funding high-quality research which advances knowledge in each of the 3Rs.

Please comment on the relevance of the proposal to the NC3Rs strategy.

Is the relevance to the NC3Rs priorities clearly and convincingly explained?

Have the applicants provided a clear assessment of the predicted advances in the 3Rs?

Do you agree with this assessment?

STFC - Specific Requirements:

For calls against the STFC Standard and Project Peer Review Panel scheme, please answer the questions on Strengths, Weaknesses, and Resources. The Impact section is only relevant to PPRP.

Further guidance is available below, and the STFC assessment criteria can be found on the ‘documents to review’ help text page.

For Astronomy Grants Panel proposals it is highly likely that you will be asked to comment on several projects within the proposal. It is essential that the reviewer clearly identifies each project separately when providing comments.

For strengths and weaknesses please include your thoughts on the proposal with emphasis on:

  • the importance of the science and supporting the proposed project(s)
  • the international competitiveness. How does the project fit with the international context and what is the international relevance?
  • the scientific excellence

Please also provide your thoughts on:

  • the applicants’ track record
  • potential for international leadership.  How and why would you rate the UK activity in this area and its international standing?
  • the timeliness of the proposal; and
  • the feasibility of the outlined work**

For Resources, comment on the justification for the level of resources requested and their appropriateness to deliver the stated aims. Please state whether the resources requested have been justified and are appropriate (including facility requests such as computing), or what modifications you would recommend. Please pay particular attention to staffing and equipment.

For the questions on Impact, please see the on-screen guidance within the form.

IPS/Follow on Fund and CLASP proposals should be assessed to make sure there is evidence of knowledge exchange that will stimulate technology exploitation through the identified route to market. There are three main criteria, Economic Impact, Social Impact and Overall Quality, which are defined below:

Economic Impact:

  • Likelihood of commercialisation (route to market, inc. cost)
  • Economic benefit to UK (inc. cost savings)
  • Market assessment (need, size, competitors, value, location)
  • IP Management plan

Social Impact:

  • Societal benefits
  • Staff training / Capacity building
  • Dissemination plan 
  • Academic Benefits

Overall Quality:

  • Scientific quality (current technology status, objectives and deliverables)
  • Risk management
  • User engagement
  • Suitability of applicants and partners (outline who is doing what)
  • Value for money (justification of costs)
  • Added value of IPS funding

*UKRI recognises that the COVID-19 pandemic has caused major interruptions and disruptions across our communities and is committed to ensuring that individual applicants and their wider team, including partners and networks, are not penalised for any disruption to their career(s), such as breaks and delays, disruptive working patterns and conditions, the loss of on-going work, and role changes that may have been caused by the pandemic.

When undertaking your assessment of the research project, you should consider the unequal impacts that COVID-19-related disruption might have had on the track record and career development of the individuals included in the proposal, and you should focus on the capability of the applicant and their wider team to deliver the research they are proposing.

** UKRI acknowledges that it is a challenge for applicants to determine the future impacts of COVID-19 while the pandemic continues to evolve. Applicants have been advised that their applications should be based on the information available at the point of submission and, if applicable, the known application-specific impacts of COVID-19 should be accounted for. Where known impacts have occurred, these should have been highlighted in the application, including the assumptions/information at the point of submission. Applicants were not required to include contingency plans for the potential impacts of COVID-19. Requests for travel, both domestically and internationally, could be included in accordance with the relevant scheme guidelines, noting the above advice.

When undertaking your assessment of the research project you should assess the project as written, noting that any changes that the project might require in the future, which arise from the COVID-19 pandemic, will be resolved as a post-award issue by UKRI if the project is successful. Potential complications related to COVID-19 should not affect your assessment or the score you give the project.

Cornell University


Formal Review of Research Proposals

When is Formal Review Required?

Student & Campus Life research projects that will use substantial resources of the Cornell community must be formally reviewed by the committee before they can be initiated. At a minimum, this includes research that draws participants from a major institutional database, for example, those maintained by the University Registrar; Office of the Dean of Students; Fraternity, Sorority and Independent Living; and Class Councils. Regardless of how potential participants are to be identified, research that meets the following criteria will also require formal review by the committee:

  • Involves more than 100 participants for a quantitative data collection method (e.g., survey research) or 25 participants for a qualitative data collection method (e.g., focus groups or interviews);
  • Is broader in scope than program evaluation (e.g., asks about more than just program-based experiences or includes individuals who did not participate in the target program or event); and
  • Will require a substantial amount of participants’ time (e.g., protocols that will take more than 10 or 15 minutes to complete, or longitudinal research designs).

Conversely, research projects that are very limited in scope, and research that is conducted exclusively for program evaluation purposes (i.e., research that examines the program-related experiences of students who participate in a specific program or event) will generally be exempt from formal review by the committee.

Submitting a Proposal for Formal Review

The committee meets monthly during the fall, winter and spring semesters to formally review research proposals and conduct related business. At least eight weeks before the anticipated launch date of the project, researchers should submit a SCLRG research proposal form to Leslie Meyerhoff or Marne Einarson. The proposal form asks for information about the purpose and proposed design of the study, as well as draft versions of data collection instruments. Samples of completed research proposals are available here and here.

The following criteria will be used by the committee to evaluate research proposals:

  • Importance: Does the research address an important issue at Cornell? Will it provide useful information for academic planning or providing services to Cornell students?
  • Content and Design: Does the proposed methodology fit the research question(s)? Are the questions well-constructed and easily understood? Is the instrument of reasonable length? Have the questions been pretested?
  • Population and Sampling Methodology: Who is the target population? Is the sampling methodology appropriate to the research question(s)? Has the same student cohort and/or sample been used in other recent research? Could a smaller sample be drawn to achieve the same objective? How will the researcher(s) gain access to the proposed participants?
  • Timing: Does the proposed timing of the research overlap with or follow closely upon other research directed toward the same population? When were data on this issue last collected at Cornell? Is the data collection period scheduled at a time when students are likely to respond?
  • Data Management and Dissemination: Who will have access to the data? What are the provisions for secure storage of the data? Can data from this research be linked to other data sets? What is the plan for analyzing the data and disseminating the results? How will research results contribute to better decision making? How will research results be shared more broadly?
  • Resources: What resources will be required to conduct this research (e.g., instrument design, Web application development, mail and/or e-mail services, data entry and analysis)? From where will these resources be obtained?
  • Overall Impact: What will be the impact of the study? Are there any conceivable negative impacts on the University? Will the study overburden respondents? Overall, do the expected benefits of the study appear to outweigh the costs?

Based on their evaluation of the research proposal, the committee may decide to:

  • Approve the project as submitted
  • Approve the project with recommendations for changes that must be adopted before the project can be initiated
  • Require revisions and re-submission of the project before approval is granted
  • Reject the project (e.g., the potential benefits of the data do not justify the costs of collection; the research design has weaknesses that cannot be rectified)

IRB Approval

If research results will not be used exclusively for internal purposes (e.g., they will be presented or published beyond Cornell, or used for an undergraduate honors thesis, master’s thesis or doctoral dissertation), researchers may also be required to obtain approval from Cornell’s Institutional Review Board for Human Participants (IRB). IRB approval should be sought after the proposal has been reviewed by the SAS Research Group. The committee should subsequently be informed of the decision of the IRB.


Research Method


How To Write A Research Proposal – Step-by-Step [Template]

How To Write a Research Proposal

Writing a research proposal involves several steps to ensure a well-structured and comprehensive document. Here is an explanation of each step:

1. Title and Abstract

  • Choose a concise and descriptive title that reflects the essence of your research.
  • Write an abstract summarizing your research question, objectives, methodology, and expected outcomes. It should provide a brief overview of your proposal.

2. Introduction:

  • Provide an introduction to your research topic, highlighting its significance and relevance.
  • Clearly state the research problem or question you aim to address.
  • Discuss the background and context of the study, including previous research in the field.

3. Research Objectives

  • Outline the specific objectives or aims of your research. These objectives should be clear, achievable, and aligned with the research problem.

4. Literature Review:

  • Conduct a comprehensive review of relevant literature and studies related to your research topic.
  • Summarize key findings, identify gaps, and highlight how your research will contribute to the existing knowledge.

5. Methodology:

  • Describe the research design and methodology you plan to employ to address your research objectives.
  • Explain the data collection methods, instruments, and analysis techniques you will use.
  • Justify why the chosen methods are appropriate and suitable for your research.

6. Timeline:

  • Create a timeline or schedule that outlines the major milestones and activities of your research project.
  • Break down the research process into smaller tasks and estimate the time required for each task (see the sketch after this list).
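
For illustration only, the short sketch below (in Python, with made-up task names, durations, and a hypothetical start date) turns such a task breakdown into a simple sequential schedule with start and end dates; a spreadsheet or Gantt chart tool would serve equally well.

```python
from datetime import date, timedelta

# Hypothetical task breakdown with estimated durations in weeks.
tasks = [
    ("Literature review", 6),
    ("Instrument design", 4),
    ("Data collection", 12),
    ("Data analysis", 8),
    ("Writing and dissemination", 6),
]

start = date(2025, 1, 6)  # assumed project start date
for name, weeks in tasks:
    end = start + timedelta(weeks=weeks)
    print(f"{name:<26} {start} -> {end}")
    start = end  # the next task begins when the previous one ends
```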

7. Resources:

  • Identify the resources needed for your research, such as access to specific databases, equipment, or funding.
  • Explain how you will acquire or utilize these resources to carry out your research effectively.

8. Ethical Considerations:

  • Discuss any ethical issues that may arise during your research and explain how you plan to address them.
  • If your research involves human subjects, explain how you will ensure their informed consent and privacy.

9. Expected Outcomes and Significance:

  • Clearly state the expected outcomes or results of your research.
  • Highlight the potential impact and significance of your research in advancing knowledge or addressing practical issues.

10. References:

  • Provide a list of all the references cited in your proposal, following a consistent citation style (e.g., APA, MLA).

11. Appendices:

  • Include any additional supporting materials, such as survey questionnaires, interview guides, or data analysis plans.

Research Proposal Format

The format of a research proposal may vary depending on the specific requirements of the institution or funding agency. However, the following is a commonly used format for a research proposal:

1. Title Page:

  • Include the title of your research proposal, your name, your affiliation or institution, and the date.

2. Abstract:

  • Provide a brief summary of your research proposal, highlighting the research problem, objectives, methodology, and expected outcomes.

3. Introduction:

  • Introduce the research topic and provide background information.
  • State the research problem or question you aim to address.
  • Explain the significance and relevance of the research.

4. Literature Review:

  • Review relevant literature and studies related to your research topic.
  • Summarize key findings and identify gaps in the existing knowledge.
  • Explain how your research will contribute to filling those gaps.

5. Research Objectives:

  • Clearly state the specific objectives or aims of your research.
  • Ensure that the objectives are clear, focused, and aligned with the research problem.

6. Methodology:

  • Describe the research design and methodology you plan to use.
  • Explain the data collection methods, instruments, and analysis techniques.
  • Justify why the chosen methods are appropriate for your research.

7. Timeline:

8. Resources:

  • Explain how you will acquire or utilize these resources effectively.

9. Ethical Considerations:

  • If applicable, explain how you will ensure informed consent and protect the privacy of research participants.

10. Expected Outcomes and Significance:

11. References:

12. Appendices:

Research Proposal Template

Here’s a template for a research proposal:

1. Introduction:

2. Literature Review:

3. Research Objectives:

4. Methodology:

5. Timeline:

6. Resources:

7. Ethical Considerations:

8. Expected Outcomes and Significance:

9. References:

10. Appendices:

Research Proposal Sample

Title: The Impact of Online Education on Student Learning Outcomes: A Comparative Study

1. Introduction

Online education has gained significant prominence in recent years, especially due to the COVID-19 pandemic. This research proposal aims to investigate the impact of online education on student learning outcomes by comparing them with traditional face-to-face instruction. The study will explore various aspects of online education, such as instructional methods, student engagement, and academic performance, to provide insights into the effectiveness of online learning.

2. Objectives

The main objectives of this research are as follows:

  • To compare student learning outcomes between online and traditional face-to-face education.
  • To examine the factors influencing student engagement in online learning environments.
  • To assess the effectiveness of different instructional methods employed in online education.
  • To identify challenges and opportunities associated with online education and suggest recommendations for improvement.

3. Methodology

3.1 Study Design

This research will utilize a mixed-methods approach to gather both quantitative and qualitative data. The study will include the following components:

3.2 Participants

The research will involve undergraduate students from two universities, one offering online education and the other providing face-to-face instruction. A total of 500 students (250 from each university) will be selected randomly to participate in the study.
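
To make the selection step concrete, here is a minimal sketch in Python, assuming hypothetical enrollment rosters (lists of student IDs) for the two universities; it simply draws 250 students at random, without replacement, from each roster.

```python
import random

def select_participants(roster, n=250, seed=42):
    """Randomly draw n student IDs from a roster (hypothetical data)."""
    rng = random.Random(seed)     # fixed seed so the draw is reproducible
    return rng.sample(roster, n)  # sampling without replacement

# Hypothetical rosters, one per university (IDs are placeholders).
university_a_students = [f"A{i:04d}" for i in range(1, 2001)]  # online education
university_b_students = [f"B{i:04d}" for i in range(1, 2001)]  # face-to-face

sample_a = select_participants(university_a_students)  # 250 students
sample_b = select_participants(university_b_students)  # 250 students
print(len(sample_a) + len(sample_b))                    # 500 participants in total
```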

3.3 Data Collection

The research will employ the following data collection methods:

  • Quantitative: Pre- and post-assessments will be conducted to measure students’ learning outcomes. Data on student demographics and academic performance will also be collected from university records.
  • Qualitative: Focus group discussions and individual interviews will be conducted with students to gather their perceptions and experiences regarding online education.

3.4 Data Analysis

Quantitative data will be analyzed using statistical software, employing descriptive statistics, t-tests, and regression analysis. Qualitative data will be transcribed, coded, and analyzed thematically to identify recurring patterns and themes.
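
As an illustration of the quantitative part of this plan, the sketch below assumes a hypothetical file scores.csv with columns group (online or face-to-face), pre, and post; it computes descriptive statistics of the learning gains, an independent-samples t-test comparing the two groups, and a simple regression of post-test scores on pre-test scores and group membership.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Hypothetical data: one row per student with group membership and test scores.
df = pd.read_csv("scores.csv")       # assumed columns: group, pre, post
df["gain"] = df["post"] - df["pre"]  # learning gain per student

# Descriptive statistics of the gains, by group.
print(df.groupby("group")["gain"].describe())

# Independent-samples (Welch's) t-test comparing gains between the two groups.
online = df.loc[df["group"] == "online", "gain"]
face_to_face = df.loc[df["group"] == "face-to-face", "gain"]
t_stat, p_value = stats.ttest_ind(online, face_to_face, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Regression of post-test scores on pre-test scores and group membership.
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(model.summary())
```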

4. Ethical Considerations

The study will adhere to ethical guidelines, ensuring the privacy and confidentiality of participants. Informed consent will be obtained, and participants will have the right to withdraw from the study at any time.

5. Significance and Expected Outcomes

This research will contribute to the existing literature by providing empirical evidence on the impact of online education on student learning outcomes. The findings will help educational institutions and policymakers make informed decisions about incorporating online learning methods and improving the quality of online education. Moreover, the study will identify potential challenges and opportunities related to online education and offer recommendations for enhancing student engagement and overall learning outcomes.

6. Timeline

The proposed research will be conducted over a period of 12 months, including data collection, analysis, and report writing.

7. Budget

The estimated budget for this research includes expenses related to data collection, software licenses, participant compensation, and research assistance. A detailed budget breakdown will be provided in the final research plan.

8. Conclusion

This research proposal aims to investigate the impact of online education on student learning outcomes through a comparative study with traditional face-to-face instruction. By exploring various dimensions of online education, this research will provide valuable insights into the effectiveness and challenges associated with online learning. The findings will contribute to the ongoing discourse on educational practices and help shape future strategies for maximizing student learning outcomes in online education settings.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


Stanford Research Development Office

Guidance for Writing Proposal Sections

Created: 10/06/22

Updated: 08/19/24

More resources will be added as we continue to develop this page. (Most recent content update: July 2, 2024)

The following proposal sections, listed in alphabetical order, are commonly required by a variety of funders. For each, we have provided resources to assist in preparing content; some have been developed by RDO, while others are curated from trusted internal and external sources.

The resources below are intended to be a starting point. Solicitations will often specify unique requirements for each of these sections. Always check the requirements from your specific agency and call. 

Biographical Sketches

These documents provide evidence of an individual's qualifications for the role played in a proposed project and are generally requested in specific formats based on sponsor. For most STEM sponsors, RDO recommends using SciEnCV for generating and saving your biosketch as it will be easier to update and the interface allows reformatting for use in proposals for different sponsors.

  • Start here : Stanford ORA Biosketch Resource Page - Stanford resource with links to NSF and NIH biosketch guidance as well as SciEnCV resources
  • SciEnCV - a tool for assembling biographical information for federal sponsors that can easily be ported into multiple sponsor formats. It is quickly becoming an accepted (and often required) biosketch format for many sponsors including NIH, NSF, and DOE
  • NSF SciEnCV FAQs and Guide - start here if you need help setting up your SciEnCV account or run into questions along the way

Broader Impacts

Broader Impacts requirements generally ask for the answer to the question "how does your research benefit society?" This term and requirement are commonly associated with NSF, but other agencies can also have similar requirements. The resources below help to describe the breadth of what broader impacts can be as well as give advice on how to develop a vision and craft a compelling story about the broader impacts of your work. 

  • Stanford Grant Writing Academy Broader Impacts Resources - Among other information, includes a short video explaining NSF's BI requirement and suggestions on crafting a strong BI element for your proposal
  • Stanford Office of Education and STEM Outreach - A part of the Office of Community Engagement, ESO serves as a nexus connecting Stanford faculty, students, and postdocs with youth, schoolteachers, nonprofit organizations, and the broader community with the goals of increasing engagement, participation, equity and inclusion in STEM fields
  • ARIS Broader Impacts Toolkit - resources from the Center for Advancing Research Impact in Society designed to assist proposal teams as they develop broader impact projects

Budget and Budget Justifications

Budgets are an integral part of proposals that have a direct effect on how monies can be used, are tracked, and are audited in the post award period.

  • Start here : Stanford ORA Budget Resource Page - find templates and helpful links and information including California's partial sales and use tax exemption for research and development equipment
  • Stanford VPDoR Rates page - tables, policies, and information on F&A rates, fringe benefit rates, and others

Conflicts of Interest

Often sponsors require a list of collaborators and other affiliates in a form that allows the agency to ensure that no conflicts exist in the process of selecting reviewers or to check for PI conflict of interest in various areas. These can be in the form of "COA", "Collaborator", "COI" or other documents. Be sure to check and follow your sponsor's guidelines for these documents; many provide their own specific required templates.

  • Stanford Global Engagement Review Program coordinates input from multiple offices that advise on various aspects of foreign engagements to assess risks related to undue foreign influence, research security, and integrity

Data Management Plans

Many funding agencies will require a data management plan (DMP) as part of a proposal. The DMP describes the types of data you expect to collect, how they will be managed, and how access and preservation will be accomplished over time.

  • Start here : Stanford Libraries has a Resource Page with information about DMPs including access to an Online Data Management Plan Tool for creating a ready-to-use plan for your proposal
  • DMP Self Assessment Questionnaire (Stanford Libraries)
  • Stanford Libraries Data Management Services assists researchers with data preservation and access and has other data tools and services available 
  • Lane Medical Library NIH DMSP Checklist
  • Stanford University IT data Storage Recommendations
  • DOE suggested elements for a DMP
  • NASA DMP guidance
  • NEH guidelines for digital humanities  
  • NIH DMS Plan policy (new guidelines effective January 25, 2023)
  • NIH Sample Plans for different contexts
  • NSF DMP requirements (also includes links to directorate-specific guidances)
  • NSF FAQs for public access  
  • RDO has prepared a guide to creating NSF Data Management Plans (current guide reflects NSF PAPPG 23-1; to be updated after PAPPG 24-1 goes into effect May 20, 2024)
  • Effective practices for making research data discoverable and citable (NSF Dear Colleague Letter, March 2022)

Diversity Plans

Sponsors sometimes require demonstration that a project team will make specific efforts to promote diversity, equity, and inclusion. These requirements vary by sponsor and it is important to understand the level at which the activities are to take place. For example, does the sponsor want to see activities that are community-focused or targeted to the researchers and staff you will have on the project? In any case, a strong diversity plan also includes evaluation strategies and metrics for success. 

  • Stanford RDO's thought starter for DOE PIER Plan
  • Stanford SoM PDO template for NIH PEDP
  • DOE CBP: Community Benefits Plans (page includes links to templates)
  • DOE PIER: Promoting Inclusive and Equitable Research Plans
  • DOE DEI Informational Resources
  • NIH PEDP: Plan for Enhancing Diverse Perspectives

Evaluation Plans

Often addressed in multiple sections of a proposal, evaluation plans are an important component of understanding whether a project or strategy is effective and successful. Developing robust evaluation plans at the proposal stage can demonstrate to the reviewers and funders that you have thought about what "success" means and how you will be certain you will achieve it or adjust practices to course correct along the way. These are commonly requested for educational activities, outreach plans, workforce development strategies, and management plans.

  • American Evaluation Association Find an Evaluator Tool - a directory that can be searched by location, area of expertise, or name

Institutional Support

Funding agencies may request that cost sharing, details on facilities, equipment, and other resources available to the proposal team, and other forms of institutional support be included with proposals. The scope and format of these requirements will depend on the specific funding opportunity or call. RDO recommends starting early in your proposal development process and working in collaboration with department or school leadership to identify and request appropriate institutional support for your proposal.

  • RDO's Thought Starter: Stanford Institutional Support for Large, Strategic Grant Proposals - a list of support that may be appropriate for large, strategic proposals that are beyond the usual scale for a given discipline. Contains notes on how to start the conversations necessary to secure different types of institutional support, relevant policies set by the University, and other factors to consider.

Management Plans

Management plans are common elements of large collaborative or center grants. This section is intended to demonstrate to reviewers how teams will work together to accomplish the various goals of a project. Some plans also require detailed administrative information as well as plans for evaluation of project activities (see section on Evaluation Plans above).

  • Start here : RDO Management Plan Guidelines - six common topics for consideration when devising a management plan for STEM center grants
  • RDO resources for collaboration and team science
  • DOE's EFRC Good Management Practices - while it originated from a specific DOE program, this document contains excellent advice that is generalizable to other research center management strategies

Postdoctoral Mentoring Plans

Postdoctoral mentoring plans (PMPs) are often required in STEM-focused proposals where a postdoctoral researcher's involvement is indicated. These serve as roadmaps for both mentor and mentee to navigate the key aspects of mentorship and professional development of postdocs. It's best to avoid using a boilerplate approach and instead tailor the Plan to the specific program you are proposing, institution you are with, and/or postdoc(s) to be mentored.

  • Start here : RDO's Postdoctoral Mentoring Plan Guidelines - an NSF-focused document with prompts and suggestions for writing an effective PMP that is also useful in thinking of strategies to fulfill PMP requirements for other sponsors. Note: Proposals due or submitted on or after May 20, 2024 will be required to submit a Mentoring Plan applicable to both graduate students and postdoctoral researchers, in lieu of the prior Postdoctoral Mentoring Plan requirement. Please see NSF PAPPG 24-1 for details. 
  • National Postdoc Association Institutional Guide to Postdoc Mentorship - includes specific guidance on PMPs as well as links to resources on mentorship

Safety Plans

Sponsors sometimes request information on protocols and plans related to safety in various contexts, including the laboratory, field sites, or any off-campus work environment. The university has policies and procedures related to these topics, which, along with other resources, are linked below.

  • Start here : Stanford EH&S website - central website for Stanford safety services and support which also includes information on training, standard operating procedures, and many safety related resources for the campus community
  • Stanford ORA template for NSF Plans for Safe and Inclusive Working Environments for Off-campus Research - an NSF-focused document with instructions, applicable University policy information, and fillable fields for PIs to complete their project-specific information


Assessment of applications - MRC

The criteria against which applications should be assessed relate directly to the core responsive mode application questions:

  • vision of the project
  • approach to the project
  • capability of the applicant or applicants and the project team to deliver the project
  • resources requested to do the project
  • ethical and responsible research and innovation considerations of the project

Some opportunities will have additional questions that reflect their disciplinary specific requirements. Further detail on what assessors should be looking for is available on the relevant funding opportunity page under the ‘how to apply’ section.

The Medical Research Council (MRC) also requires sex to be justified in the experimental design of grant applications involving animals, and human and animal tissues and cells, as part of the sex in experimental design requirement.

MRC’s policy on embedding diversity in research design will apply to applications involving human participants, samples or data, submitted to opportunity deadlines after 1 September 2023.

When undertaking your assessment of the research, you should consider the unequal impacts that COVID-19-related disruption described by the applicants might have had on the research, track record and career development of those individuals included in the application.

Last updated: 11 April 2024

