Open science in research synthesis
Wed-02
Presented by: Tamara Heck
Research syntheses are a method for systematically searching, selecting and analysing research literature to answer a specific research question. They should be conducted rigorously and according to current methodological standards. Guidelines such as those by Cochrane and Campbell set out the methodological process and show how to document data and results (e.g. via PRISMA: Page et al., 2021) to ensure transparency, comprehensiveness, reproducibility and, at best, reusability in line with open science principles.
However, with the growing popularity of research syntheses, we currently observe two phenomena. First, differing methodological processes have led to a multitude of review types (Sutton et al., 2019), which differ in their goals, processes and data output. Second, researchers apply new AI tools to conduct research syntheses more efficiently (Marshall & Wallace, 2019). These semi-automated tools can support the screening and synthesis processes on the basis of text mining and statistical relevance measurements. Again, the data, their format and the result output differ depending on the chosen method and tool.
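The abstract does not prescribe a particular implementation, but such semi-automated screening tools typically rank candidate records by textual relevance to the review question. A minimal illustrative sketch of this idea, assuming Python with scikit-learn and purely hypothetical example data, might look like this:

```python
# Minimal sketch (illustrative only, not the method of any specific tool):
# ranking candidate abstracts for screening with TF-IDF text mining and a
# simple statistical relevance measure (cosine similarity).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical review question and candidate abstracts.
query = "effects of open data practices on reproducibility of research syntheses"
abstracts = [
    "A systematic review of open data policies and study reproducibility.",
    "Machine learning methods for protein structure prediction.",
    "Reporting standards and transparency in evidence synthesis.",
]

# Text mining step: represent the query and abstracts as TF-IDF vectors.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([query] + abstracts)

# Relevance measurement: cosine similarity of each abstract to the query.
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()

# Present candidates in descending relevance order for human screening.
for score, text in sorted(zip(scores, abstracts), reverse=True):
    print(f"{score:.2f}  {text}")
```

The ranked output would then be reviewed by humans; how such intermediate scores and screening decisions are documented is exactly the kind of data output the contribution asks about.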
In view of these phenomena, this contribution asks how research syntheses and their data should be documented and preserved in accordance with open science and open data principles. It discusses current guidelines and the extent to which their descriptions support making research syntheses transparent and reproducible. In doing so, it focuses on the different types and processes of research synthesis and their impact on data and result output as well as on their documentation.
Related to these challenges is the complexity of research synthesis itself. While guidelines support researchers in describing synthesis processes as explicitly as possible and in delineating individual steps from one another, theories of information search behaviour characterise literature search and selection as complex (Kuhlthau, 2004) and non-linear (Foster, 2005). This may work against transparent and reproducible documentation of data selection and evaluation. The contribution therefore addresses whether and how good documentation of a research synthesis and its data can reduce the complexity of information search and achieve the transparency needed to reproduce and reuse research synthesis data.
Foster, A. (2005). A non-linear model of information seeking behaviour. Information Research, 10(2). http://InformationR.net/ir/10-2/paper222.html
Kuhlthau, C. C. (2004). Seeking Meaning: A Process Approach to Library and Information Services (2nd ed.). Libraries Unlimited.
Marshall, I. J. & Wallace, B. C. (2019). Toward systematic review automation: a practical guide to using machine learning tools in research synthesis. Systematic Reviews, 8(1), 163. https://doi.org/10.1186/s13643-019-1074-9
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., . . . Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ (Clinical research ed.), 372, n71. https://doi.org/10.1136/bmj.n71
Sutton, A., Clowes, M., Preston, L. & Booth, A. (2019). Meeting the review family: exploring review types and associated information retrieval requirements. Health Information and Libraries Journal, 36(3), 202-222. https://doi.org/10.1111/hir.12276