Decoding Animal Communication w/ Jane Lawton
BONUS | Dubai Future Forum #05
Listen on Apple Podcasts | Spotify | YouTube | SoundCloud | Goodpods | CastBox | RSS Feed
Bonus episode recorded live from the Dubai Future Forum at the Museum of the Future in partnership with the Dubai Future Foundation on 20 November 2024.
Summary
Earth Species Project’s Jane Lawton shares her insights on how artificial intelligence is used to decode animal communication, how new technology challenges human-centric views of intelligence, and how the ‘voices’ of other species can inform conservation efforts and influence rights-for-nature debates.
Guest Bio
Jane Lawton has over 30 years of international experience working with leading organisations focused on sustainable development and nature conservation. Throughout her career, she has explored various pathways to solve the complex puzzle of living sustainably on Earth. She has held senior roles in Asia, Europe, and North America, working with organisations such as Forum for the Future, The B Team, IUCN (International Union for Conservation of Nature), the Nature Conservancy of Canada, and the Jane Goodall Institute of Canada. She is currently Director of Impact at the Earth Species Project, a non-profit focused on using AI to decode animal communication with the ultimate goal of transforming how humans relate to the rest of nature.
Show Notes
00:00 Animal Communication
01:29 Earth Species Project
04:37 Science Behind Animal Communication
07:24 Technological Approaches and Challenges to Decoding Non-Humans
11:56 Implications for Conservation
17:18 Future Possibilities
Links
Transcript (AI-Generated)
NOTE: This transcript is AI-generated and unedited. It may contain errors. A human transcription is coming soon.
Jane Lawton: What if you were actually able to use AI to discern the preferences and the needs and the desires of an animal? What if their voice was admissible in a court of law?
Luke Robert Mason: You're listening to the FUTURES Podcast, live from the Dubai Future Forum at the Museum of the Future, a location where imagining new possibilities can occur through conversation. On this show we meet the scientists, technologists, artists, and philosophers working to imagine the sorts of developments that might dramatically alter what it means to be human.
Some of their predictions will be preferable, others might seem impossible, but none of them are inevitable. My name is Luke Robert Mason and I'm your host for this session. Of all the potential uses for artificial intelligence, the possibility that we could use it to speak with our animal companions is one of the most awe-inspiring.
In his 1974 essay, "What Is It Like to Be a Bat?", philosopher Thomas Nagel argued that humans could only imagine what it would be like to embody a bat. He claimed that it would be impossible to fully understand them on a conscious level. But artificial intelligence may soon change that. Today, we're joined by Jane Lawton, one of the leading experts working with the team at the Earth Species Project on cutting-edge technologies to communicate with our non-human animal counterparts and companions.
So Jane, welcome to the podcast lounge. And I guess my first question is: what, and why? I mean, what inspired the Earth Species Project to decide to use artificial intelligence to decode non-human animal communication and behavior?
Jane Lawton: Yeah, good question. We were founded by a couple of people who come from backgrounds in the technology sector: early at Twitter, working with the Mozilla Foundation.
And I think they were people who were, first of all, beginning to be a little disillusioned with what technology was bringing to us and how it was helping to advance things for humanity, and who were deeply interested in nature. Artificial intelligence has been around for a long time, but I think they were very inspired by what we were beginning to see with artificial intelligence and human language specifically.
Back in 2017, there was a really major shift that most people don't know about, when new techniques in machine learning allowed computers to essentially translate between human languages without the use of a Rosetta Stone or a dictionary. That's pretty fundamental. Hearing about that, and being super interested in nature and some of the challenges we have with nature, they hypothesized that we might be able to transfer some of that technology into better understanding what animals are saying. So this is the foundation, the inspiration, behind the Earth Species Project. And it's also really been moved on by the fact that we now have ChatGPT, right? Everyone is reckoning with AI. Everyone is beginning to understand the possibilities of it. We have new models that are allowing us to translate across modalities.
So if you think about Midjourney and DALL·E, being able to prompt the model with a line of text and get a beautiful image out. Those are things that are super useful when thinking about animal communication, and the fact that animals communicate in a bunch of different modalities, not just vocalizations.
So all of that, paired with new techniques for deploying sensors on animals. These new sophisticated sensors capture not just vocalizations but also movement, and often environmental context. What we're starting to get is a really rich picture of what's going on with animals: a lot of data that needs to be analyzed, and new machine learning techniques that allow us to analyze that data.
It's kind of like all the conditions are coming together.
Luke Robert Mason: Well, the question I have then is: in what way do animals have language? You mentioned the word multimodal, so there are multimodal ways in which animals communicate. Does nature only communicate through sound, or are there other channels as well? Do animals have different accents? Do animals have different languages depending on where they are based in the world?
Jane Lawton: They definitely do. I mean, it's been documented that within just one species, orca, there are many, many different dialects across different pods. But coming back to that core question of language: this all depends on how you define language. Human beings are very, very good at categorizing things, putting things in boxes, and creating definitions and boundaries in order to separate things from each other. So on the strictly scientific definition of language, I think most scientists would say that animals do not possess it, that it is uniquely human.
Yet if you start to break down the different components of what we consider to be language, many elements of it are starting to be documented in other species. Complex ways of referring to things, for example. We've long known that vervet monkeys have not just a single alarm call to say to each other, hey, there's a predator around, but a distinct alarm call for an eagle, a predator above, and a different alarm call for a snake, a predator below. Which makes perfect sense, because they're going to have to respond differently in those situations. We know that many, many different animal species actually have names for each other. Dolphins have signature whistles that encode the identity of the individual they're speaking to.
Recent research on elephants shows that they have distinct names for each other. Parrots have been demonstrated to have distinct names that they actually learn from their parents. And really recent research by Adriano Lameira at the University of Warwick on orangutans demonstrates that they can communicate about events in the past, about something not present.
So, you know, technically, given our human definition of language, animals don't possess all of those traits, yet we are starting to see many of the hallmarks emerge. And we're learning so much every day. I would imagine that many of our beliefs about language will be upended very quickly.
Luke Robert Mason: Wow.
So what are some of those methods? You mentioned alarm calls. You take the sound of an alarm call, and then how do you go about understanding it from a technological perspective, using AI or some of these tools? Is it learning those alarm calls, or are you training a model? How does it work?
Jane Lawton: Yeah. So our niche at the Earth Species Project is that we're the machine learning specialists, right? We're not out in the field gathering data; we're not out researching whales. We work really closely with partners who are experts in the communication of those species and who've been studying them for a really long time.
And what we are really focusing on right now is essentially building a large language model, a GPT for nature, if you will, that is trained on large amounts of data, can solve many of the different tasks that biologists typically need to solve in order to understand communication, and is generalizable across species.
There are some organizations that are studying one specific species and building machine learning models in order to understand that species. We're actually interested in building a generalizable model that will, in essence, build the entire field of AI and animal communication.
And we just released a new model, actually, which is doing this. It's early, but it's doing this very well. What's fascinating is that it's trained on large amounts of data, including animal data but also human speech and music. What it's showing in these early stages is that it can do the basic tasks: detecting vocalizations in recordings, identifying which species a vocalization comes from, and then going further to classify those vocalizations. Is it an alarm call, for example? Is it an affiliation call, like an I'm-greeting-you call? All the different types of calls that have already been identified by biologists and ethologists. And it is also, astonishingly, able to learn enough to identify species in a recording that it has never been exposed to before. It hasn't been trained on data from that species. So this is where you see the learning coming in. And similar to what we're seeing in human language, we don't always know why the model is performing as it is. Which is both scary and exciting, I guess.
Luke Robert Mason: In that case, how do you validate the accuracy of the AI?
How do you know that it's interpreting those signals correctly? I mean, a science fiction author would take this idea and maybe suggest that the AI could be tricking the humans, using the animal to express its own agency, the AI's agency, rather than the animal's. Is there a way to validate animal communication, or is this one of the grand challenges still to be overcome?
Jane Lawton: It is, I mean, it is an ongoing challenge. Because of course, as human beings, we don't speak whale, or crow. And so...
Luke Robert Mason: Yes, we do, if you've seen Finding Nemo.
Jane Lawton: So we don't always know how well the models are doing. But one of the very first things we did as an organization was establish benchmark datasets.
That's been needed for the progress of AI in the human domain as well, but it's really important in animal communication. These are datasets that are annotated, so once you develop a model, you can test it against the benchmark to understand how well it is performing. That's super important.
I think one of the other ways we'll be going at this challenge is using the new models we develop to replicate what human scientists have already been able to discover about another species. If we can get a model to the place where it's finding the same things that have already been discovered, we'll know we're on the right track.
Luke Robert Mason: So what happens when we start decoding language? Could we start understanding animal cultures? Might we realize they have new cultural ways of communicating? Is language the gateway to understanding so much about nature?
Jane Lawton: Beautiful question. I mean, we see communication as almost a window into the intelligences and the worlds of other species. So along with that comes culture, for sure. And again, rich, complex cultures have been documented in many, many species. One of the projects we're doing, which I think is a great illustration of how this can be used to help with conservation efforts, involves the Hawaiian crow, which has been extinct in the wild for quite some time.
There are two different populations being kept in captivity in Hawaii, and there's interest in reintroducing the crows at some point in the future. But these are such a vocal species, using vocalizations and other forms of communication to do a lot of things together, and they often work together. So one of the concerns is this: the crows have been in captivity for quite a long time. If they're reintroduced, will they have lost vocabulary over that period? And is that vocabulary important for them to survive in the wild? So the project currently is documenting the different forms of communication that they have, their vocal repertoire, and comparing it to the vocal repertoire of crows in the wild, for which we have some recordings, to see if there are differences.
Luke Robert Mason: So you mentioned there how it's going to help conservation efforts, but would it contribute to philosophical debates, perhaps about animal consciousness, or even animal rights?
Jane Lawton: I really hope it will, actually. I think the thing I feel most passionate about with this work is that it does a great job of demonstrating to human beings how limited we are in our own perception and understanding.
Just take the fact that we now know plants both emit sounds and can hear to a certain degree. Amazing research has been done by scientists in Tel Aviv. They did research on tomato plants and tobacco plants, and when the plants are distressed, when they're dehydrated and in need of water, they actually emit sounds at frequencies humans can't hear.
But those sounds can be captured, sped up, and shifted into a frequency range we can engage with. It just goes to show how much is happening in the world around us that we are completely unaware of. So if we're able to demonstrate these limitations in human beings, I think this will give rise to a greater sense of humility, an acknowledgement that we are not necessarily the superior beings on the planet, that this whole myth of human exceptionalism is something we have created. And that in and of itself, I think, opens doors in people's minds to new ways of engaging with nature and connecting with it.
But I also think it can be used in more direct ways. Think about the burgeoning rights-for-nature movement as an example. Right now we're trying to figure out how to accord rights to ecosystems, to rivers, in order to protect them better. But what if you were actually able to use AI to discern the preferences and the needs and the desires of an animal, and you were in a situation where you were debating human-wildlife conflict, and whether or not bears should be exterminated or culled from a particular area because they were coming into conflict with humans?
What if their voice was admissible in a court of law? Maybe AI in the future will be able to help us do that. Maybe we'll be able to actually bring the voice of other species into some of our decision-making processes. Imagine, thirty years in the future, is there a Council of the Whales that convenes when the UN is debating the future of the oceans?
I don't know, but I think it opens up a lot of possibilities.
Luke Robert Mason: Wow, I didn't know that plants could hear.
Jane Lawton: Yeah. On the hearing side, one of the experiments that has been done is on evening primrose flowers. They basically played a whole series of sounds to the plants and registered their responses.
And there was very little response until they played them the sound of an approaching pollinator. Once they heard the sound of the pollinator, the plants almost immediately began producing more, and sweeter, nectar in response to that sound.
Luke Robert Mason: Wow.
Jane Lawton: So that definitely indicates that there is some response happening in the plants.
Luke Robert Mason: There were leaves dropping?
Jane Lawton: There were leaves dropping, exactly.
Luke Robert Mason: So, dolphins, monkeys, domestic pets: which animal do you think we're going to understand first? Do you think there's one animal species with a language we might be able to understand, that we might finally be able to talk to?
Jane Lawton: Yeah, that's a really interesting question.
I'm not sure we know yet. The approach we're taking, as I said, is species-agnostic. We're trying to build the tools that will help advance this across the animal domain, across the tree of life, in essence. And I think one of the things that's important is that this is not about novelty, right?
Pretty much everybody, when I mention the work that we do, says, oh my God, I'm going to be able to talk to my dog or my cat. Which is great; I'm sure everybody wants to do that. But if we're really thinking about this work as something transformative, something that shifts human mindsets, then talking to your dog or your cat is probably not going to do that, because we already put dogs and cats in a totally separate category from all other animals, right? We separate them out; they're special; we treat them differently. So if we want to be transformative, I think we are going to focus, as we move forward, on a suite of species chosen on a few criteria. One: is it going to be feasible to do the decoding, which means, is there enough data available? Two: are they a social species? Because with species that are not social, there's going to be less communication within the species for us to decode. And three: do they occupy, in some way, shape, or form, a world that we might be able to connect to or understand? But simultaneously, I feel it's going to be really important for us to work with species where there might be a surprise element for humans.
So if you think about the documentary film My Octopus Teacher, I don't know if you've seen that?
Luke Robert Mason: Yes, yeah.
Jane Lawton: I watched it and thought, oh my God, I had no idea that there was this depth of complexity and almost emotion present in these creatures that are so alien to us. And I think it's that surprise factor that often makes people change their minds.
So we'll be looking for those particular areas. Right now we're doing a lot of work on crows: the Hawaiian crow, but also a really interesting group of carrion crows in the north of Spain, which are cooperative breeders. They actually bring their young up together in kind of crèche-like environments.
So we're doing a lot of work there on how you pair their movements to their vocalizations. You raised the whole multimodal thing before; that's interesting work we're doing there. And birds, I think, will definitely be on the list.
Luke Robert Mason: So if we could do what you're hoping the Earth Species Project can do, and we can uplift animals using AI, so we can understand them and they can talk back to us and we can talk to them:
do you think they'll listen to us?
Jane Lawton: Well, that's a really good question. I think I'd like to flip it, because I don't think we're actually trying to get to a place where we are in two-way communication with another species. I mean, we might need to use two-way communication to validate what we think is the meaning of what's going on in their communication.
But whether we actually want to get to a point where we're having this back-and-forth Dr. Dolittle-type moment with another species is questionable. Whether they even want to do that with us is a huge question. So I think this is really more about listening. It's more about learning and understanding, and at some point in this process we may get into some more profound communication, but I think that's up for grabs.
Luke Robert Mason: Alright, don't spoil my dream that one day in the future I can interview animals on my podcast. So on that note, I just want to thank you for joining us for the FUTURES Podcast, live from the Dubai Future Forum. If you like what you've heard, you can find out more by visiting www.futurespodcast.net.
Thank you, Jane. It was a fascinating conversation and I really appreciate you joining us at the Dubai Future Forum on the podcast stage.
Jane Lawton: Thanks, Luke.
Credits
If you enjoyed listening to this episode of the FUTURES Podcast you can help support the show by doing the following:
Subscribe on Apple Podcasts | Spotify | Google Podcasts | YouTube | SoundCloud | Goodpods | CastBox | RSS Feed
Write us a review on Apple Podcasts or Spotify
Subscribe to our mailing list through Substack
Producer & Host: Luke Robert Mason
Assistant Audio Editor: Ramzan Bashir
Transcription: Beth Colquhoun
Join the conversation on Facebook, Instagram, and Twitter at @FUTURESPodcast
Follow Luke Robert Mason on Twitter at @LukeRobertMason
Subscribe & Support the Podcast at http://futurespodcast.net