AI & SOCIETY
SCOPUS (1987-1996, 1998-2023), ESCI-ISI
e-ISSN: 1435-5655
ISSN: 0951-5666
Publisher: SPRINGER, Springer London
Featured articles
Biometric facial recognition is an artificial intelligence technology involving the automated comparison of facial features, used by law enforcement to identify unknown suspects from photographs and closed-circuit television. Its capabilities are expanding rapidly alongside advances in artificial intelligence, and it has great potential to solve crime. However, it also carries significant privacy and other ethical implications that call for law and regulation. This article examines the rise of biometric facial recognition, its current applications and legal developments, and conducts an ethical analysis of the issues that arise. Ethical principles are applied to mediate the potential conflicts this information technology creates between security, on the one hand, and individual privacy, autonomy, and democratic accountability, on the other. These principles can be used to support appropriate law and regulation for the technology as it continues to develop.
In philosophy of mind, zombies are imaginary creatures that are exact physical duplicates of conscious subjects but lack any first-personal experience. Zombies are meant to show that physicalism (the theory that the universe is made up entirely of physical components) is false. In this paper, I apply the zombie thought experiment to the realm of morality to assess whether moral agency is independent of sentience. Algorithms, I argue, are a kind of functional moral zombie, such that thinking about the latter can help us better understand and regulate the former. I contend that the main reason algorithms can be neither autonomous nor accountable is that they lack sentience. Moral zombies and algorithms are incoherent as moral agents because they lack the moral understanding necessary to be morally responsible. To understand what it means to inflict pain on someone, it is necessary to have experiential knowledge of pain. At most, for an algorithm that feels nothing, ‘values’ will be items on a list, possibly prioritised according to a number that represents their weight. But entities that do not feel cannot value, and beings that do not value cannot act for moral reasons.