Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability

New Media & Society, Volume 20, Issue 3, pp. 973-989, 2018
Mike Ananny 1, Kate Crawford 2,3,4
1 University of Southern California, USA
2 MIT, USA
3 Microsoft Research New York City, USA
4 New York University, USA

Abstract

Models for understanding and holding systems accountable have long rested upon ideals and logics of transparency. Being able to see a system is sometimes equated with being able to know how it works and govern it—a pattern that recurs in recent work about transparency and computational systems. But can “black boxes” ever be opened, and if so, would that ever be sufficient? In this article, we critically interrogate the ideal of transparency, trace some of its roots in scientific and sociotechnical epistemological cultures, and present 10 limitations to its application. We specifically focus on the inadequacy of transparency for understanding and governing algorithmic systems and sketch an alternative typology of algorithmic accountability grounded in constructive engagements with the limitations of transparency ideals.
