Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies

Abstract

Background

In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency and instead incentivise them to optimise their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are.

Methods

For 38 German university medical centres, we searched between December 2020 and February 2021 for institutional policies on academic degrees and academic appointments, as well as for the websites of their core facilities and their general research pages. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order).

Results

While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Institutional policies for academic degrees and academic appointments had frequent mentions of traditional metrics.

Conclusions

References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.

Background

In recent years, the field of biomedicine has seen broad and increasing reflection on its research practices. Various authors have pointed out that flaws in the choice of research questions and in the conduct of biomedical research lead to research waste [1]. These statements have been accompanied by findings that biomedical research often fails to reproduce [2,3,4,5,6], which ultimately hampers the goal of biomedical research: translating findings into medical practice and thereby improving healthcare [1].

More concretely, while some authors have discussed a possible low base rate of true hypotheses [7] and others have pointed to necessary changes in how research is funded [8] and regulated [9], much of the discussion has focused on the design, conduct, dissemination and reporting of biomedical research. It has been argued that the field fundamentally lacks transparency, with study reports, research protocols or participant data often not publicly accessible, and many research findings not being published at all [10]. If findings are published, they often lack sufficient detail and suffer from selective reporting of outcomes or limitations [12]. In addition, authors have pointed to flaws in biomedical study design and statistical analyses [7, 13, 14]. A recent survey from the Netherlands found that some of these so-called questionable research practices are more prevalent in the biomedical field than in other fields [11].

Several solutions have been proposed to address the flaws in the design, conduct, dissemination and reporting of biomedical research. One of the most widely discussed proposals is the call for more transparent, or “open,” science along all steps of biomedical research. One of these steps is study registration, that is, registering study protocols before data collection, which is supposed to disclose flexibility in data analysis that might lead to false-positive results [15,16,17]. There have been calls to increase the robustness of science, for example, by asking researchers to choose adequately large samples, appropriately randomise participants and blind subjects, experimenters and outcome assessors, and by supporting them in doing so [3, 4, 18, 19]. Researchers have been urged to share their data, code and protocols to increase the transparency and reproducibility of biomedical research [20], and to report all research results in a timely manner, in line with established reporting guidelines, and ideally without paywalls (open access). This is supposed to tackle the prevalent publication bias by which only positive results are reported in journals [21], which distorts the evidence base and thus leads to research waste, for example, by encouraging follow-up studies that would have been considered futile if all research had been reported. To aid in this, new publication formats, namely, preprints and registered reports [22], have been established. All of these procedures are, in the long run, supposed to increase trust in science and lead to more reproducible research [23]. Additionally, more emphasis has been put on the actual replication of studies [24], and there have also been calls to abandon [25], redefine [26] or better justify [27] statistical significance thresholds; however, these suggestions have been subject to debate.

To date, the uptake of the aforementioned robust and transparent practices has been slow [28,29,30,31,32,33]. Many have pointed out that the current incentive structures for researchers do not sufficiently incentivise them to invest in robustness and transparency and instead incentivise them to optimise their fitness in the struggle for publications and grants [34,35,36,37]. To receive promotion and ultimately tenure, researchers are evaluated based primarily on how many journal articles (with high impact factors) they publish and how much grant money they secure [35]. The negative influence of this so-called publication pressure on research quality has been shown by mathematical simulations [35, 36] as well as by empirical surveys indicating that it is positively associated with questionable research practices and negatively associated with responsible research practices [11, 38]. It has been said that all stakeholder groups, including funders and journals, must contribute [9, 12] to an incentive system that actually rewards robust and transparent research practices. Funders, for example, could award grants based not only on publication numbers but also on the adoption of open practices. Publishers, in turn, could provide peer review that embraces open practices (allowing peer reviewers to better serve as quality control gatekeepers and detect questionable research practices [11]) and could base editorial decisions on the soundness of the research rather than publishing only positive findings. As some studies show, this is currently not always the case [39, 40].

The role and influence of research institutions have thus far been less prominently discussed [3]. Since research institutions define the requirements for academic degrees, academic appointments and available intramural funding, their policies and regulations could, and do [11, 38], have a strong impact on researchers’ capability, opportunity and motivation to apply robust and transparent research practices in their work. With regard to university policies, some changes have already been proposed. One of these changes is abandoning the current dysfunctional incentive systems of promotion [35, 36]. Another is an increased focus on transparent practices: the signatories of the San Francisco Declaration on Research Assessment (DORA) call for institutions to clearly highlight “that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published” [41]. More specifically, Moher et al. [42] suggest that rewards, incentives and performance metrics at institutions should align with the full dissemination of research, the reuse of original datasets and more complete reporting, namely, the sharing of protocols, code and data, as well as the preregistration of research (see also the publications by the League of European Research Universities [43] and others [12, 44,45,46,47]). Mejlgaard et al. [48] propose that institutions should incentivise making data findable, accessible, interoperable and reusable (FAIR) [49]. Begley et al. [3] suggest similar rules for academic degrees and academic appointments but with regard to the robustness of the research. These authors also demand that institutions mandate the use of reporting guidelines, such as the ARRIVE (Animal Research: Reporting of In Vivo Experiments) guidelines [50] or the CONSORT (Consolidated Standards of Reporting Trials) guidelines [51]. Additionally, core facilities such as clinical research units and animal research facilities provide centralised services for the conduct of clinical or animal studies (including animal protection and research according to the so-called 3R principles: replace, reduce, refine [52]). These core facilities could have additional influence [53], for example, by recommending that researchers report their results in a timely and nonselective way or by requiring researchers to adhere to established reporting guidelines.

Studying the uptake of the aforementioned recommendations in institutional policies could inform areas for improvement in policy-making at universities. To our knowledge, however, only one study [54] has dealt with this issue, sampling the biomedical faculties of 170 universities worldwide and searching their criteria for promotion and tenure. The authors report that mentions of traditional criteria of research evaluation were very frequent, while mentions of robust and transparent research practices were rare.

In this cross-sectional study, we aim to describe whether and how relevant policies of university medical centres (UMCs) in Germany support the robust and transparent conduct of research and how prevalent traditional metrics of career progression are. We chose to investigate only German UMCs, as this ensures better comparability of the institutions: different countries have different regulatory environments (for example, German UMCs are currently in the process of implementing new good scientific practice regulations mandated by the German Research Foundation [Deutsche Forschungsgemeinschaft, DFG]), different curricula for medical studies and different frameworks for postgraduate degrees. The focus on Germany also allows us to perform in-depth data collection of German-language documents.

Methods

A detailed methodology is described in our preregistered study protocol, available at https://osf.io/wu69s/ (including a list of protocol amendments and deviations). The following section provides a summary of the methods, which are reported in accordance with the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines [55].

Sampling and search strategy

We obtained a list of all German medical faculties from the website of the German medical faculty council (Medizinischer Fakultätentag). For each of the 38 faculties (as of December 2020), we performed a manual search of their websites between 14 December 2020 and 12 February 2021. The search terms and strategy were based on discussions in our full research team after piloting; they are presented in detail in our protocol. The search was done by the first author (MRH), who searched the websites of both the medical faculties and the adjacent university hospitals, looking for the sources presented in Table 1.

Table 1 Data sources that were screened in this study

For the PhD and habilitation regulations and the application forms and procedural guidelines for tenure, we saved all related policy documents. For the websites of clinical research units, the websites of animal research facilities, 3R centres and animal protection offices, and the general research websites, we went through each website in detail (including all subpages), saving only those websites and documents that contained any mention of one of the indicators summarised in Table 2 (see Additional file 1: Table S1 for a more fine-grained terminology with subcategories).

Table 2 Indicators that were chosen for inclusion in this study

We chose both the indicators of robust and transparent research and the traditional metrics of career progression based on their frequent discussion in the literature as either cornerstones of more robust and transparent biomedical research or as incentives leading to the opposite [3, 39, 41, 45, 48]. We also chose them for their consistency with previous research works [54] and publications from our institute [32, 37].

Data extraction

All documents were imported into qualitative research software (MAXQDA 2020, Release 20.3.0, VERBI GmbH, Germany). We applied deductive content analysis [56]. One rater (MRH) went through all of the documents and coded whether there was any mention of the prespecified indicators of robust and transparent research, as well as of the traditional metrics of career progression. While we searched all documents for the indicators of robust and transparent research, we searched only the PhD and habilitation regulations and the application forms and procedural guidelines for tenure for the traditional metrics, as these documents relate specifically to career progression.

If a certain indicator was found, the rater decided whether it was merely mentioned (e.g. a university explaining what open access is, or a clinical research unit stating that 60% of clinical trial results were published) or whether that procedure was incentivised/required (e.g. a university specifically requiring a certain impact factor to receive top marks in the PhD, or a clinical research unit offering support with summary results reporting of clinical trials). Thus, while we refer to the traditional indicators as “metrics” in line with their common usage, there is no actual difference between indicators and metrics in this respect: both can be mentioned, incentivised or required. We based our assessment of incentivised/required on the COM-B model of behaviour change [57], which distinguishes between capability, opportunity and motivation to change behaviour and lists education, persuasion, incentivisation, coercion, training, restriction, environmental restructuring, modelling and enablement as potential interventions. We coded anything that could increase the capability, opportunity or motivation to engage in a behaviour as “incentivised” or “required”. The resulting three-level coding scheme is illustrated in the sketch below.
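As a compact illustration of this three-level coding scheme, the following minimal Python sketch shows how each document–indicator pair maps onto the categories used for the counts reported below. This is illustrative only: the actual coding was performed in MAXQDA, and all names and example codings here are hypothetical.

```python
from enum import Enum
from collections import Counter

class Code(Enum):
    NOT_MENTIONED = 0   # indicator does not appear in the document
    MENTIONED = 1       # e.g. a general explanation of what open access is
    INCENTIVISED = 2    # e.g. a certain impact factor required for top marks

# Hypothetical codings: (UMC, document type, indicator) -> code
codings = {
    ("UMC-01", "phd_regulation", "open_access"): Code.MENTIONED,
    ("UMC-01", "phd_regulation", "impact_factor"): Code.INCENTIVISED,
    ("UMC-02", "phd_regulation", "open_access"): Code.NOT_MENTIONED,
}

# Per indicator, count documents with any mention; being incentivised or
# required implies being mentioned (as in the result tables below).
mentions = Counter(
    indicator
    for (_umc, _doc, indicator), code in codings.items()
    if code is not Code.NOT_MENTIONED
)
print(mentions)  # Counter({'open_access': 1, 'impact_factor': 1})
```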

A second, independent rater (AF) went through the documents of 10 of the 38 UMCs.

Results

The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for the calculations of inter-rater reliability, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review). The inter-rater reliability in our sample of 10 UMCs, measured by Cohen’s kappa, was κ = 0.806, indicating a high level of agreement; we therefore deemed further double-coding unnecessary.
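For readers who want to verify such an agreement statistic themselves, the following minimal sketch shows a Cohen’s kappa calculation in Python. The actual analysis code is in the GitHub repository linked above; the rating vectors here are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical parallel codings by two raters over the same
# document-indicator pairs (1 = indicator coded as present, 0 = absent).
rater_1 = [1, 0, 1, 1, 0, 0, 1, 0]
rater_2 = [1, 0, 1, 0, 0, 0, 1, 0]

# kappa = (p_o - p_e) / (1 - p_e): observed agreement p_o corrected for
# the agreement p_e expected by chance alone.
print(f"Cohen's kappa: {cohen_kappa_score(rater_1, rater_2):.3f}")
```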

Overall, the web searches of the 38 German UMCs yielded 339 documents. We found PhD regulations for 37 UMCs (97%), habilitation regulations for 35 UMCs (92%), tenure application forms for 25 UMCs (66%) and procedural guidelines for tenure for 11 UMCs (29%). We found 38 general research websites (100%), 32 websites of clinical research units (84%) and 23 animal research websites (61%; see Table 3). Additional file 1: Table S2 shows numbers for each UMC.

Table 3 Number of documents we included for each university and document type

The results are presented in detail in Tables 4 and 5, divided by each procedure and each type of document or website. Additional file 1: Tables S3 and S4 provide more detailed data on the subcategories of the indicators of robust and transparent science. Tables 6 and 7 provide example quotes.

Table 4 Number of university medical centres that mention indicators of robust and transparent science in each of the included sources
Table 5 Number of university medical centres that mention traditional indicators of career progression in each of the included sources
Table 6 Examples of mentions of each indicator of robust and transparent science, divided by policy type (empty sections indicate that no section regarding the indicator was found)
Table 7 Examples of mentions of each traditional indicator, divided by policy type (empty sections indicate that no section regarding the metric was found)

Indicators of robust and transparent science

Study registration

Study registration was not mentioned in any (0%) of the documents regarding academic promotion and tenure. Thirty-four percent of the websites of clinical research units mentioned registration, with 31% also incentivising or requiring the practice, mostly in the form of clinical research units offering support with registering clinical studies. Only 4% of animal research websites and 5% of general research websites mentioned registration: the one animal research website provided a link to an animal study registry, while the two general research websites endorsed the practice.

Reporting of results

Eight percent of the PhD regulations and 3% of the habilitation regulations mentioned results reporting; these mentions included general requirements that the respective thesis be published. The one habilitation regulation that did so also referred to timely publication, asking individuals to publish their thesis no later than 2 years after receiving the degree. Results reporting was also mentioned by 9% of clinical research unit websites, 4% of animal research websites and 21% of general research websites. All mentions expressed general endorsements of, or highlighted education regarding, the publication of all results. One of the clinical research units further offered help with the publication process. The one animal research facility that mentioned results reporting provided a tool to identify publication formats that fit the characteristics of the respective datasets. When the general research websites mentioned results reporting, they usually referred to statements in the university’s or the DFG’s good scientific practice guidelines for publishing research.

Data/code/protocol sharing

Data, code or protocol sharing was mentioned in only one PhD regulation (3%), in which supervisors were asked to consider data sharing in the evaluation of the thesis. No habilitation regulations, tenure application forms or procedural guidelines for tenure mentioned this indicator (0%). Likewise, no clinical research unit website mentioned the sharing of data, code or protocols (0%). Four percent of animal research websites and 21% of general research websites mentioned data, code or protocol sharing. In the case of the one animal research facility, the mention was a general introduction to the FAIR principles [49] of data sharing. The general research websites included endorsements of data and code sharing, mostly within the university’s good scientific practice guidelines.

Open access

Sixteen percent of PhD regulations and 3% of habilitation regulations mentioned open access. In one PhD regulation, PhD supervisors were asked to also keep in mind whether the work was published with open access. In the other cases, the PhD regulations mentioned that the university library had the right to publish the submitted thesis in a repository (green open access). No clinical research unit website (0%) and only 4% of animal research websites mentioned open access; in the case of the one animal research facility, the mention was a link to an interview in which an “open access culture” was announced. Thirty-four percent of general research websites mentioned open access; these websites either generally recommended open access or referred to the university’s open access publishing funds.

Measures to improve robustness

Robustness was mentioned in 3% of PhD regulations but in none (0%) of the habilitation regulations, tenure application forms or procedural guidelines for tenure. Robustness was mentioned by 81% of the websites of clinical research units and 26% of the animal research websites. The clinical research units usually offered services to help with power calculations and randomisation (and, in a few cases, blinding). On the animal research websites, the mentions pointed to documents recommending power calculation as part of an effort to protect animals, to courses on robust animal research and to general informational material on these issues. None (0%) of the general research webpages mentioned robustness.

Traditional indicators

Number of publications

References to publication numbers were made in 100% of PhD regulations and 91% of habilitation regulations; these regulations listed a certain number of publications as a requirement to obtain a PhD or habilitation, respectively. No tenure application forms referred to the number of publications, aside from requirements to provide a complete list of publications. Procedural guidelines for tenure referred to the number of publications in 27% of cases.

Number and value of grants

None (0%) of the PhD regulations mentioned grant money. Among the habilitation regulations, 11% mentioned grant money, while 84% of the tenure application forms did, in which case applicants were required to provide a complete list of grants awarded. Twenty-seven percent of the procedural guidelines for tenure also mentioned grants; these passages stated that experience with grants was expected or required applicants to provide a list of the grants they had received.

Impact factor

Sixteen percent of the PhD regulations and 63% of the habilitation regulations mentioned impact factors, with most of them establishing concrete incentives or requirements. These two types of regulations contained passages that asked doctoral students or habilitation candidates to publish in high-impact journals to achieve the highest grade (summa cum laude), or passages that allowed PhD students to publish only one paper instead of three if that paper appeared in a sufficiently “good” journal. Tenure application forms mentioned impact factors in 72% of cases, mostly requiring applicants to provide a list of the impact factors of each journal they had published in. None (0%) of the procedural guidelines for tenure mentioned impact factors.

Authorship order

Ninety-seven percent of the PhD regulations and 80% of the habilitation regulations mentioned authorship order, in all cases as an incentive or requirement. These regulations required PhD students and habilitation candidates to publish a portion of their articles as the first or last author (e.g. a very common requirement for German PhD students is to publish three papers, one of them with first or last authorship). Sixty-eight percent of tenure application forms also mentioned authorship order, noting that applicants should provide a list of publications divided by authorship position. None (0%) of the procedural guidelines for tenure had a related section.

Discussion

In this study, we aimed to assess how and to what extent the 38 German UMCs promote robust and transparent research in their publicly available institutional policies for academic degrees, academic appointments, core facilities and research in general. We also investigated the presence of traditional metrics of researcher evaluation. Our results show that current UMC policies on academic degrees (e.g. PhD regulations) or appointments (e.g. tenure application forms) contain very few (less than 10%) references to our chosen indicators of robust and transparent research, such as study registration, reporting of results, data/code/protocol sharing or measures to improve robustness (e.g. sample size calculation, randomisation, blinding). An exception is open access, which was mentioned in 16% (6 out of 37) of PhD regulations, in most cases referring to a repository to which the thesis could be publicly uploaded. In contrast, the number of publications and the authorship order were frequently mentioned in UMC policies on academic degrees and appointments, particularly PhD and habilitation regulations (more than 80%). The majority of application forms for tenure further mentioned impact factors and secured grant money (more than 70%).

The UMCs’ websites for clinical and animal research included more frequent mentions of robust and transparent research, but these differed by the type of website. Clinical research unit websites frequently mentioned study registration and measures to improve robustness, while animal research websites only had frequent mentions of measures to improve robustness. These mentions were mostly related to sample size calculations and randomisation. The general research websites had the most frequent mentions of open access, reporting of results, and data, code or protocol sharing. In most of these cases, the indicators were mentioned in the good scientific practice guidelines. In the case of open access, some websites also featured references to a university-wide open access publishing fund.

Our findings are in line with those of a similar study that collected data from an international sample [54]. The authors found very frequent mentions of traditional criteria for research evaluation, while mentions of robust and transparent research practices were even less frequent than in our study: none of the documents mentioned publishing in open access venues, registering research or adhering to reporting guidelines, and only one mentioned data sharing. The results are unsurprising, given recent findings that practices for robust and transparent research are only very slowly becoming more prevalent [30, 32]; however, they stand in stark contrast to the calls of the various experts and institutions that have urged institutions to align their promotion criteria with robust and transparent research [3, 41,42,43, 47, 48, 58, 59]. While we focused exclusively on a full sample of all German UMCs, our approach could also be applied to other countries.

It is important to keep in mind that policies and incentives are constantly changing. As mentioned in the introduction, a major German funder, the DFG, recently reworked its good scientific practice guidelines [60] and expects universities to implement them in their own good scientific practice guidelines by July 2022. For the first time, these guidelines state that measures to avoid bias in research, such as blinding, should be used and that researchers should document all information and generally publish all results, including those that do not support the hypothesis. They also recommend the open sharing of data and materials in accordance with the FAIR principles and suggest that authors consider alternative publication platforms, such as academic repositories. Some German UMCs might have already changed their internal good scientific practice guidelines by the time the data collection for this study was conducted, while others had not; for this reason, we did not explicitly include these guidelines in our web search (we did include them, however, if we found them on the general research websites).

One limitation of our study is that the raters were not blinded; blinding was not possible because the policies were identifiable from their context. Another limitation is that we only searched for publicly available policies and did not additionally survey relevant representatives of the 38 UMCs to identify further policies. For the two types of tenure-related policies in particular, we found relevant policies for only 66% (application forms) and 29% (procedural guidelines) of all UMCs. We refrained from this additional step, however, because the results across the available tenure policies showed a very homogeneous pattern of no mentions (0%) of measures for robust and transparent research, and we assumed that this pattern did not differ in policies that were not publicly available.

While our study focused on reviewing policies for academic degrees and academic appointments, as well as research and core facility websites, for mentions of robust and transparent research, there are other ways for institutions to promote these practices. One example is the performance-based allocation of intramural resources, the so-called Leistungsorientierte Mittelvergabe (LOM). The LOM might also have a strong influence on researcher behaviour, and it has been proposed that it should be based on the transparency of research [61]. Another example is education on robust and transparent research practices, which has already become a target of reform in Germany. These reforms aim explicitly at training for medical students, who normally do not receive any training in research methodology, to allow them to better understand the evidence base of biomedical research [62,63,64]. Education aimed at postgraduates might mostly be organised and announced via internal channels of a university and thus not be visible to our web search-based methodology. Third, robustness and transparency might be improved by better supervision or better action against research misconduct, including better whistleblowing systems [48]. Nevertheless, we are convinced that our approach was able to find policies that cover many institutional incentives, especially policies for promotion and tenure, which have a strong influence on researcher behaviour.

Additionally, initiatives for transparent research exist at the federal and national levels (e.g. Projekt DEAL for open access). While universities remain responsible for incorporating such national incentives and policies into their own regulations, future research might also focus on these other incentives and policies in the biomedical field.

More generally, there is discussion about how academic institutions, or the academic system in general, need to change to facilitate better research. Some have argued that new regulations for open and transparent research might not lead to genuine change for the better but rather to box-ticking, for example, by arguing that reporting guidelines are not really of help [65] or by showing that study registrations sometimes lack specificity [66]. Additionally, questions have been raised about whether assessing individual researchers is the right strategy at all [67]. Criticism has also been directed at the general work structures in academia, with some arguing that short-term, non-permanent contracts [68] and a general overreliance on third-party funding [69, 70] lead to an unhealthy amount of competition and to power imbalances in academia, which in turn facilitate the use of questionable research practices. Research institutions and academia at large are complex systems with many layers of incentives, and it is as yet unclear which measures will lead to a change for the better.

Future research should therefore also address the effects of policies and other institutional activities designed to increase robust and transparent research practices [71]. Thus far, only a few studies have done so. For example, Keyes et al. [72] evaluated the effect of a clinical trial registration and reporting programme, which proved successful. More generally, there is a lack of research on interventions targeting organisational climate and culture in academia [73].

Conclusion

In summary, current policies at German UMCs, especially those for academic degrees and academic appointments, do not promote procedures for robust and transparent research. In contrast, the number of publications and the authorship order play a dominant role in almost all UMC policies on academic degrees and appointments, and most of the tenure- and appointment-related policies further reward impact factors and grant money secured. This stands in stark contrast to the calls of the various experts and institutions that have urged institutions to align their promotion criteria with robust and transparent research.

Availability of data and materials

The datasets generated and analysed during the current study are available in a repository on the Open Science Framework (https://osf.io/4pzjg/). The code for inter-rater reliability calculations, which also includes robustness checks, is available on GitHub (https://github.com/Martin-R-H/umc-policy-review).

Abbreviations

ARRIVE:

Animal Research: Reporting of In Vivo Experiments

CONSORT:

Consolidated Standards of Reporting Trials

DFG:

Deutsche Forschungsgemeinschaft (German Research Foundation)

DORA:

San Francisco Declaration on Research Assessment

DRKS:

Deutsches Register Klinischer Studien (German Clinical Trials Register)

FAIR:

Findable, accessible, interoperable and reusable

LOM:

Leistungsorientierte Mittelvergabe (performance-based allocation of funding)

PhD:

Doctor of Philosophy

STROBE:

Strengthening the Reporting of Observational Studies in Epidemiology

UMC:

University medical centre

3R:

Replace, reduce, refine

References

  1. Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4.

  2. Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3.

  3. Begley CG, Buchan AM, Dirnagl U. Robust research: institutions must do their part for reproducibility. Nature. 2015;525(7567):25–7.

  4. Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712.

  5. Errington TM, Denis A, Perfito N, Iorns E, Nosek BA. Challenges for assessing replicability in preclinical cancer biology. eLife. 2021;10:e67995.

  6. Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.

  7. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2(8):e124.

  8. Chalmers I, Bracken MB, Djulbegovic B, Garattini S, Grant J, Gülmezoglu AM, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156–65.

  9. Salman RA-S, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet. 2014;383(9912):176–85.

  10. Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.

  11. Gopalakrishna G, ter Riet G, Vink G, Stoop I, Wicherts JM, Bouter L. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: a survey among academic researchers in The Netherlands [Internet]. MetaArXiv; 2021 Jul [cited 2022 Jan 16]. Available from: https://osf.io/vk9yt.

  12. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.

  13. Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

  14. Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of bias in reports of in vivo research: a focus for improvement. PLoS Biol. 2015;13(10):e1002273.

  15. Simmons JP, Nelson LD, Simonsohn U. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol Sci. 2011;22(11):1359–66.

  16. Botvinik-Nezer R, Holzmeister F, Camerer CF, Dreber A, Huber J, Johannesson M, et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature. 2020;582:84–8.

  17. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci USA. 2018;115(11):2600–6.

  18. Nature. Announcement: towards greater reproducibility for life-sciences research in Nature. Nature. 2017;546(7656):8.

  19. Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The experimental design assistant. PLoS Biol. 2017;15(9):e2003779.

  20. Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1(1):0021.

  21. DeVito NJ, Goldacre B. Catalogue of bias: publication bias. BMJ Evid Based Med. 2019;24(2):53–4.

  22. Chambers C. What’s next for registered reports? Nature. 2019;573(7773):187–9.

  23. Nosek BA, Spies JR, Motyl M. Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspect Psychol Sci. 2012;7(6):615–31.

  24. Drude NI, Martinez Gamboa L, Danziger M, Dirnagl U, Toelch U. Improving preclinical studies through replications. eLife. 2021;10:e62101.

  25. Amrhein V, Greenland S, McShane B. Retire statistical significance. Nature. 2019;567:305–7.

  26. Benjamin DJ, Berger JO, Johannesson M, Nosek BA, Wagenmakers E-J, Berk R, et al. Redefine statistical significance. Nat Hum Behav. 2018;2(1):6–10.

  27. Lakens D, Adolfi FG, Albers CJ, Anvari F, Apps MAJ, Argamon SE, et al. Justify your alpha. Nat Hum Behav. 2018;2(3):168–71.

  28. Hobert A, Jahn N, Mayr P, Schmidt B, Taubert N. Open access uptake in Germany 2010–2018: adoption in a diverse research landscape. Scientometrics. 2021. https://doi.org/10.1007/s11192-021-04002-0.

  29. Keyes A, Mayo-Wilson E, Atri N, Lalji A, Nuamah PS, Tetteh O, et al. Time from submission of Johns Hopkins University trial results to posting on ClinicalTrials.gov. JAMA Intern Med. 2020;180(2):317.

  30. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. Dirnagl U, editor. PLoS Biol. 2018;16(11):e2006930.

  31. Wieschowski S, Biernot S, Deutsch S, Glage S, Bleich A, Tolba R, et al. Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres. Lopes LC, editor. PLoS ONE. 2019;14(11):e0223758.

  32. Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. J Clin Epidemiol. 2019;115:37–45.

  33. Scheliga K, Friesike S. Putting open science into practice: a social dilemma? First Monday [Internet]. 2014 Aug 24 [cited 2022 Feb 1]; Available from: https://journals.uic.edu/ojs/index.php/fm/article/view/5381.

  34. Flier J. Faculty promotion must assess reproducibility. Nature. 2017;549(7671):133.

  35. Higginson AD, Munafò MR. Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLoS Biol. 2016;14(11):e2000995.

  36. Smaldino PE, McElreath R. The natural selection of bad science. R Soc Open Sci. 2016;3:160384.

  37. Strech D, Weissgerber T, Dirnagl U. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLoS Biol. 2020;18(2):e3000576.

  38. Gopalakrishna G, Wicherts JM, Vink G, Stoop I. Prevalence of responsible research practices among academics in the Netherlands [Internet]. MetaArXiv; 2021. Available from: https://osf.io/preprints/metaarxiv/xsn94/.

  39. Robson SG, Baum MA, Beaudry JL, Beitner J, Brohmer H, Chin JM, et al. Promoting open science: a holistic approach to changing behaviour. Collabra Psychol. 2021;7(1):30137.

  40. Minnerup J, Wersching H, Diederich K, Schilling M, Ringelstein EB, Wellmann J, et al. Methodological quality of preclinical stroke studies is not required for publication in high-impact journals. J Cereb Blood Flow Metab. 2010;30(9):1619–24.

  41. San Francisco Declaration on Research Assessment [Internet]. 2013 [cited 2021 Jul 26]. Available from: https://sfdora.org/read/.

  42. Moher D, Glasziou P, Chalmers I, Nasser M, Bossuyt PMM, Korevaar DA, et al. Increasing value and reducing waste in biomedical research: who’s listening? Lancet. 2016;387(10027):1573–86.

  43. Lerouge I, Hol T. Towards a research integrity culture at universities: from recommendations to implementation [Internet]. 2020 Jan. Available from: https://www.leru.org/publications/towards-a-research-integrity-culture-at-universities-from-recommendations-to-implementation.

  44. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018;16(3):e2004089.

  45. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: fostering research integrity. PLoS Biol. 2020;18(7):e3000737.

  46. McKiernan EC. Imagining the “open” university: sharing scholarship to improve research and education. PLoS Biol. 2017;15(10):e1002614.

  47. Wissenschaftsrat. Perspektiven der Universitätsmedizin [Internet]. 2016 [cited 2021 Aug 3]. Available from: https://www.wissenschaftsrat.de/download/archiv/5663-16.pdf?__blob=publicationFile&v=1.

  48. Mejlgaard N, Bouter LM, Gaskell G, Kavouras P, Allum N, Bendtsen A-K, et al. Research integrity: nine ways to move from talk to walk. Nature. 2020;586(7829):358–60.

  49. Wilkinson MD, Dumontier M, Aalbersberg IjJ, Appleton G, Axton M, Baak A, et al. The FAIR guiding principles for scientific data management and stewardship. Sci Data. 2016;3(1):160018.

  50. Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. Reporting animal research: explanation and elaboration for the ARRIVE guidelines 2.0. Boutron I, editor. PLoS Biol. 2020;18(7):e3000411.

  51. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010. https://doi.org/10.1136/bmj.c332.

  52. Russell WMS, Burch RL. The principles of humane experimental technique. London: Methuen; 1959.

  53. Kos-Braun IC, Gerlach B, Pitzer C. A survey of research quality in core facilities. eLife. 2020;9:e62212.

  54. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020. https://doi.org/10.1136/bmj.m2081.

  55. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet. 2007;370(9596):1453–7.

  56. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15.

  57. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.

  58. Ioannidis JPA, Khoury MJ. Assessing value in biomedical research: the PQRST of appraisal and reward. JAMA. 2014;312(5):483.

  59. Deutsche Forschungsgemeinschaft. Replizierbarkeit von Ergebnissen in der Medizin und Biomedizin. Stellungnahme der Arbeitsgruppe “Qualität in der Klinischen Forschung” der DFG-Senatskommission für Grundsatzfragen in der Klinischen Forschung [Internet]. DFG; 2018. Available from: https://www.dfg.de/download/pdf/dfg_im_profil/reden_stellungnahmen/2018/180507_stellungnahme_replizierbarkeit_sgkf.pdf.

  60. Deutsche Forschungsgemeinschaft. Guidelines for safeguarding good research practice. Code of conduct. 2019 Sep 15 [cited 2021 May 20]; Available from: https://zenodo.org/record/3923602.

  61. Kip M, Bobrov E, Riedel N, Scheithauer H, Gazlig T, Dirnagl U. Einführung von Open Data als zusätzlicher Indikator für die Leistungsorientierte Mittelvergabe (LOM)-Forschung an der Charité—Universitätsmedizin Berlin. 2019; p. 1.

  62. Ratte A, Drees S, Schmidt-Ott T. The importance of scientific competencies in German medical curricula—the student perspective. BMC Med Educ. 2018;18(1):146.

  63. Medizinischer Fakultätentag. Positionspapier Vermittlung von Wissenschaftskompetenz im Medizinstudium [Internet]. Medizinischer Fakultätentag; 2017 [cited 2022 Jan 16]. Available from: https://medizinische-fakultaeten.de/medien/stellungnahmen/positionspapier-vermittlung-von-wissenschaftskompetenz-im-medizinstudium/.

  64. Bundesministerium für Bildung und Forschung. Masterplan Medizinstudium 2020 [Internet]. Bundesministerium für Bildung und Forschung; 2017 [cited 2022 Jan 16]. Available from: https://www.bmbf.de/bmbf/shareddocs/kurzmeldungen/de/masterplan-medizinstudium-2020.html.

  65. Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ. 2001;322(7294):1115–7.

  66. Bakker M, Veldkamp CLS, van Assen MALM, Crompvoets EAV, Ong HH, Nosek BA, et al. Ensuring the quality and specificity of preregistrations. Bero L, editor. PLoS Biol. 2020;18(12):e3000937.

  67. Tiokhin L, Panchanathan K, Smaldino PE, Lakens D. Shifting the level of selection in science [Internet]. MetaArXiv; 2021 Oct [cited 2022 Jan 30]. Available from: https://osf.io/juwck.

  68. Dirnagl U. #IchbinHannah and the fight for permanent jobs for postdocs: how a fictitious postdoc (almost) triggered a fundamental reform of German academia. EMBO Rep. 2022. https://doi.org/10.15252/embr.202254623.

  69. Barbutev A-S. Wir brauchen einen Systemwechsel. ZEIT Campus [Internet]. 2021 Oct 30 [cited 2022 Feb 1]; Available from: https://www.zeit.de/campus/2021-10/ichbinhanna-hochschule-sabine-kunst-birgitt-riegraf-paderborn-befristete-stellen-mittelbau

  70. Janotta L, Lukman C. Wer gut betreut, schadet seiner Karriere. FAZ.NET [Internet]. 2021 Nov 20 [cited 2022 Feb 1]; Available from: https://www.faz.net/aktuell/wirtschaft/arm-und-reich/ichbinhanna-aerger-ueber-arbeitsverhaeltnisse-in-der-wissenschaft-17644369.html

  71. Bouter L. What research institutions can do to foster research integrity. Sci Eng Ethics. 2020;26(4):2363–9.

  72. Keyes A, Mayo-Wilson E, Nuamah P, Lalji A, Tetteh O, Ford DE. Creating a program to support registering and reporting clinical trials at Johns Hopkins University. Acad Med. 2021;96(4):529–33.

  73. Viđak M, Barać L, Tokalić R, Buljan I, Marušić A. Interventions for organizational climate and culture in academia: a scoping review. Sci Eng Ethics. 2021;27(2):24.

  74. Strauss M, Ehlers J, Gerß J, Klotz L, Reinecke H, Leischik R. Status Quo—Die Anforderungen an die medizinische Habilitation in Deutschland. DMW. 2020;145(23):e130–6.

  75. Schiermeier Q. Breaking the Habilitation habit. Nature. 2002;415(6869):257–8.

Acknowledgements

The authors would like to acknowledge Danielle Rice, Miriam Kip, Tim Neumann, Delwen Franzen and Tamarinde Haven for their helpful comments on the first drafts of the protocol. They would also like to thank the two reviewers and the editor for their feedback on the first submitted manuscript drafts.

Funding

Open Access funding enabled and organised by Projekt DEAL. This work was funded by the German Federal Ministry of Education and Research (BMBF 01PW18012). The funder had no role in the study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information

Contributions

MRH: conceptualisation, data curation, formal analysis, investigation, methodology, writing—original draft. AF: formal analysis, investigation, writing—review & editing. DS: conceptualisation, funding acquisition, methodology, project administration, resources, supervision, writing—review & editing. All authors read and approved the final manuscript.

Corresponding author

Correspondence to M. R. Holst.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

All authors are affiliated with German university medical centres. They declare no further conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Table S1.

Indicators that were chosen for inclusion in this study (detailed view). Table S2. Number of documents we included for each university and document type. Table S3. Number of university medical centres that mention indicators of robust and transparent science (fine-grained structure) in each of the included sources. Table S4. Number of university medical centres that incentivise or require indicators of robust and transparent science (fine-grained structure) in each of the included sources.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Holst, M.R., Faust, A. & Strech, D. Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies. Health Res Policy Sys 20, 39 (2022). https://doi.org/10.1186/s12961-022-00841-2
