Open Science as a Contribution to Quality in Educational Research
Abstract
The starting point of this contribution is the debate about the robustness of empirical findings in neighboring disciplines, notably social psychology, which culminated in the so-called "replication crisis". Educational research has not yet been affected to the same extent by such debates about replications and questionable research practices, but the problems are similar in some subfields. It may therefore be only a matter of time before these controversies reach educational research as well. Against this background, we argue how open science can help increase the robustness of findings in educational research. In particular, we take up three open science practices: preregistration, open materials, and open data. We introduce each practice and examine how it can be implemented in educational research. In doing so, we address the specific conditions of educational research compared with its neighboring disciplines, as well as its limitations and particularities. We close with a plea for transparency.