“It might be this, it should be that…” uncertainty and doubt in day-to-day research practice

European Journal for Philosophy of Science - Volume 9 - Pages 1-21 - 2019
Jutta Schickore1, Nora Hangel2
1Department of History and Philosophy of Science and Medicine, Indiana University, Bloomington, USA
2Zukunftskolleg, University of Konstanz, Konstanz, Germany

Abstract

This paper examines how scientists conceptualize their research methodologies. Do scientists raise concerns about vague criteria and genuine uncertainties in experimental practice? If so, what sorts of issues do they identify as problematic? Do scientists acknowledge the presence of value judgments in scientific research, and do they reflect on the relation between epistemic and non-epistemic criteria for decision-making? We present findings from an analysis of qualitative interviews with 63 scientific researchers about their views on good research practice. We argue that analysts of science should attend to scientists’ conceptualizations of the criteria and practical judgments that scientific inquiry involves. While scientists’ accounts of their own research methodologies do not by themselves give a full picture of how science really works, they can point to areas of concern, and they can inspire and direct philosophical reflection on how science works. Throughout the interviews, the participating researchers offered specific examples from their own research contexts to illustrate their methodological points. These examples reveal that scientists often struggle to evaluate the quality of their data, to determine whether the available evidence confirms their hypothesis, to judge whether a replication was successful, and to decide to what extent they can rely on peer-reviewed papers. General ideas about good research methods do not translate directly into specific evaluation criteria or strategies that can guide research and help validate empirical data.
