Papers by Karen Lancaster

Ethics and Information Technology, 2021
Humanoid robots used for sexual purposes (sexbots) are beginning to look increasingly lifelike. It is possible for a user to have a bespoke sexbot created which matches their exact requirements in skin pigmentation, hair and eye colour, body shape, and genital design. This means that it is possible, and increasingly easy, for a sexbot to be created which bears a very high degree of resemblance to a particular person. There is a small but steadily increasing literature exploring some of the ethical issues surrounding sexbots; however, sexbots made to look like particular people are something which, as yet, has not been philosophically addressed in the literature. In this essay I argue that creating a lifelike sexbot to represent and resemble someone is an act of sexual objectification which morally requires consent, and that doing so without the person's consent is intrinsically wrong. I consider two sexbot creators: Roy and Fred. Roy creates a sexbot of Katie with her consent, and Fred creates a sexbot of Jane without her consent. I draw on the work of Alan Goldman, Rae Langton, and Martha Nussbaum in particular to demonstrate that creating a sexbot of a particular person requires consent if it is to be intrinsically permissible.

Philosophy in the Contemporary World, 2019
An elderly patient in a care home only wants human nurses to provide her care, not robots. If she selected her carers based on skin colour, it would be seen as racist and morally objectionable, but is choosing a human nurse instead of a robot also morally objectionable and speciesist? A plausible response is that it is not, because humans provide a better standard of care than robots do, making such a choice justifiable. In this paper, I show why this response is incorrect, because robots can theoretically care as well as human nurses can. I differentiate between practical caring and emotional caring, and I argue that robots can match the standard of practical care given by human nurses, and they can simulate emotional care. There is growing evidence that people respond positively to robotic creatures and carebots, and AI software is apt to emotionally support patients in spite of the machine's own lack of emotions. I make the case that the appearance of emotional care is sufficient, and need not be linked to emotional states within the robot. After all, human nurses undoubtedly 'fake' emotional care and compassion sometimes, yet their patients still feel adequately cared for. I show that it is a mistake to claim that 'the human touch' is in itself a contributor to a higher standard of care; 'the robotic touch' will suffice. Nevertheless, it is not speciesist to favour human nurses over carebots, because carebots do not (currently) suffer as the result of such a choice.

The Robotic Touch

In the residential care home where Maude lives, there are some human nurses and some carebots. Maude only wants human nurses to provide her care.
If Maude only wanted white nurses to provide her care purely on the basis of their skin colour, we would claim she was being racist; so when she chooses a human nurse over a carebot purely on the basis of species membership, why do we not claim she is being speciesist? Two plausible responses are:

1. Humans provide a better standard of care than robots do (making the choice of a human nurse over a carebot justifiable).
2. Robots do not have moral standing because they cannot suffer, and therefore there is nothing morally problematic about choosing a human over a robot.

I address the first of these responses and demonstrate that it is unfounded, because some robots are capable of providing a standard of care which is equal to or higher than that provided by human nurses. I further show that it is a mistake to claim that 'the human touch' is valuable simply because it comes from a human being.