Researchers argue AI ethics should include animal welfare

In 2015, a great white shark – a species listed as vulnerable on the threatened species list – was killed under Western Australia’s catch-to-kill policy after being detected swimming near a bathing beach.

Authorities were only aware of the shark’s presence due to tracking data collected for the purposes of scientific research and conservation.

The shark’s tale is a cautionary one. 

It’s one of the case studies University of Melbourne researchers have included in new research highlighting the potential harms to animals that might result from big data and generative artificial intelligence (AI) technologies.

Their paper, a framework for assessing AI harms to non-human animals, is published in Philosophy & Technology.

Humans are usually front-of-mind in conversations about the ethics of big data and AI. 

Lead author Dr Simon Coghlan is a philosopher at the University of Melbourne working in digital ethics. He has a background in veterinary science. 

Coghlan says: “There are different ways AI can affect animals for the worse, but also for the better”.

The paper (co-authored with Melbourne Law School Professor Christine Parker) focuses on potential AI harms to animals, given these tend to be neglected in discussions about AI and ethics, he says.

The framework outlines three ways AI can harm animals: intentionally, unintentionally, and through foregone benefits. Harms can occur directly or indirectly and can result from both legal and illegal uses of AI.

The use of AI-enabled drones for illegal wildlife trafficking is an example of using the technology intentionally, causing direct harm to animals. 

“It’s much easier to capture those animals in the wild if you’ve got cameras that can automatically identify valuable species. 

“And this has already been done, so poachers have used this kind of tracking – drones might be involved as well – to track down animals and to kill them, and to sell their body parts,” he says.

Legal harms from AI might include uses which intensify factory farming; self-driving cars programmed to ignore small animals; or systems used to capture feral animals which incidentally harm non-target species.

Unintentional and indirect harms can be hidden, Coghlan says. For instance, emissions from energy-intensive generative AI contribute to climate change, and digital recommender systems can reinforce a human-centred view of the world.

“We’re familiar with this (with humans) that AI can absorb biases from the training data, and therefore sometimes produce offensive or discriminatory kind of outputs […]

“Something comparable could occur with animals where the training of the AI could result in discriminatory or negative depictions of animals […] describe animals as filthy and disgusting. Or misrepresent them as treated well in industrial farming.”

The probabilistic nature of AI chatbots and digital platforms may amplify anthropocentric views with indirect costs to animals, the paper explains. Examples include the promotion of fast food and fast fashion, both linked to greater use of animals in factory farms, and gambling advertising that encourages horse and dog racing.

Dr Nick Schuster, an AI ethicist based at ANU’s Humanising Machine Intelligence project, used the example of asking ChatGPT about the ‘morality of eating meat’ and ethical veganism.

Because the outputs of AI chatbots are determined by training data and probability, these systems tend to ignore legitimate minority viewpoints.

“When you ask [ChatGPT] about the morality of eating meat, it kind of hedges. It says: ‘well, it’s a controversial issue, and ultimately, it’s a personal choice’.”

“I still worry a lot that there are very legitimate minority ethical views out there and value systems that just aren’t going to get represented,” Schuster says.

Coghlan hopes the framework will be taken up in discussions about AI ethics and regulation. 

“We’d like to put that on people’s radar a bit more. That AI is not just related to human wellbeing, but it’s also related to animal wellbeing as well as the environment.”
