# S02E04 - Using Animals to Design Better Robots

For years, scientists have been studying animals and their behaviours. We’ve tracked animals using cameras, GPS and radio sensors. But as the age of technology progresses, scientists have found a much better way to interact with and learn from the animal kingdom... using robots.

CREDITS

This episode of Moonshot was hosted by Kristofor Lawson (@kristoforlawson) and Andrew Moon (@moonytweets).

Research for this episode by our intern Caralene Ho.

Our theme music is by Breakmaster Cylinder.

And our cover artwork is by Andrew Millist.

TRANSCRIPT

David Attenborough: “Most reptiles head for the shade when it gets too hot. To see where this gopher tortoise is heading here in Florida, I’m going to use this: a remotely controlled mini camera on wheels, with its own lights.”

KRIS: For years, scientists have been studying animals and their behaviours. We’ve tracked them using cameras, GPS and radio sensors.

David Attenborough: “Using the latest scanning techniques we can create a picture of the mound’s interior.”

KRIS: And in our quest to learn more - we created animatronics.

GRAB: Spypup is the result of a huge amount of work in a small London studio.

KRIS: Which are basically models of animals that can move mechanically... similar to what you might see at a museum.

GRAB: Beneath the skin is a miracle of animatronic engineering, a skeleton of articulated metal limbs controlled by sophisticated electronics and servos.

KRIS: We installed cameras in these animatronics and then placed them out in the wild, to get up close and personal with the wildlife in their natural habitat.

GRAB: The moves of the different creatures’ real-life counterparts are programmed and tested. Each one takes months to design and build.

ANDREW: But as the Age of Technology progresses, scientists have found a much better way to interact with the animal kingdom... using robots.

GRAB: We are constructing robots the size of insects. Robots made out of entirely soft materials. We use nature to inspire the robots that we build.

KRIS: Welcome to Moonshot - the show exploring the world’s biggest ideas and the people making them happen. I’m Kristofor Lawson.

ANDREW: And I’m Andrew Moon.

KRIS: And in this episode we’re looking at biorobotics and the people who are out there designing robots that are based on the animal world.

ANDREW: But before we dive too deep into the cybernetic kingdom - here’s a word from our sponsors.

Barbara Webb: Biorobotics, for me, at least, is slightly different from biologically inspired robotics. So with biological inspiration, you're looking at biology and trying to use that to build a better robot, whereas, in biorobotics, part of the aim is actually to understand the biology by building the robot. So the robot is intended as a model of the biological system that you can use to test your understanding of how it works.

KRIS: This is Barbara Webb, Professor of Biorobotics at the University of Edinburgh. Her fascination with insects has led her to build robots that are modelled after bugs.

Barbara Webb: Well, insects are very competent at a lot of things that robots are very bad at doing, just in general. Getting around the world, not getting damaged, being able to do everything from fly, to swim, to move on water, walk over rough surfaces, and do all that in a directed way, so that they're avoiding predators and they're catching prey, and often navigating, collecting food, and navigating over quite long distances very accurately. So there's a whole set of things that they can do in all sorts of different environments, and they have very tiny brains, so they must be doing something pretty good with those brains, because they don't have supercomputers at their disposal.

KRIS: And it’s this drive to find out what insects do with those little brains which has led Barbara into the field of robotics.

Barbara Webb: One reason why we use robots as our model is that we want to try and capture all of that. So we want to understand everything from how they sense the world, what information they're getting from the world, how they're processing that, and then how they're using that to control their limbs, or their wings, or whatever system they're using to move.

Ant-Man Grab: “Alright guys I’m in position, I’m going to signal the ants.”

ANDREW: Just like the creators of ‘Ant-Man’, Barbara has also developed an interest in ants. And more specifically, desert ants.

Barbara Webb: We do studies of ants in the field, so we actually go to a field site and follow the ants. You follow them with a camera, or we've even followed them with [a] differential GPS system, so that we can keep a track of where they've been, and we do different kinds of manipulations in that situation. We'll pick up an ant from someplace and put it down somewhere else that may be less familiar, for example, to see what it's able to do from that position.

ANDREW: Most ants leave a trail of pheromones the moment they leave the nest. Not just to find their way back home, but to guide other ants to likely food sources, which is why we see long trails of ants moving in the same direction.

ANDREW: But desert ants are different in that they don’t leave these breadcrumb trails. And part of the reason is that they hunt for food alone. Because the desert can be so hot and food so hard to come by, there’s a good chance they won’t find anything at all. And it’s this ability to hunt for food on their own that Barbara finds so interesting:

Barbara Webb: They actually use visual memory, and they use integration of the distance and direction that they've travelled to keep track of where they are relative to their nest, which might be hundreds of metres away, and a small hole in the ground.

KRIS: And it turns out that these desert ants actually have very little trouble finding their way home because they pack a whole lot of cool tech into those tiny little bodies.

Barbara Webb: They have almost 360 degree vision, they're able to see ultraviolet light, which we think might be quite important for how they recognise places in the world, and they can also see polarised light in the sky, which they can use as a compass system. So we'll try and replicate those sorts of sensory systems on the robot, and then programme the robot with what we think is the control system, and then we see if it does the same thing as the ant.
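The path integration Barbara describes - accumulating the distance and direction of each step into a running "home vector" - can be sketched in a few lines. This is a hypothetical illustration of the general technique, not code from Barbara's lab:

```python
import math

def integrate_path(steps):
    """Accumulate (heading_degrees, distance) steps into a home vector.

    Returns (distance_home, bearing_home_degrees): how far, and in which
    compass direction, the ant (or robot) must travel to reach the nest.
    """
    x = y = 0.0
    for heading_deg, distance in steps:
        heading = math.radians(heading_deg)
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    # The home vector points opposite to the net displacement.
    distance_home = math.hypot(x, y)
    bearing_home = math.degrees(math.atan2(-y, -x)) % 360
    return distance_home, bearing_home

# Example: forage 10 m at heading 0, then 10 m at heading 90.
dist, bearing = integrate_path([(0, 10.0), (90, 10.0)])
# Home is ~14.14 m away, back along the opposite bearing (225 degrees).
```

On the real ant the heading would come from the polarised-light compass and the distance from a step counter; the sketch simply shows why those two signals are enough to get home in a straight line.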

KRIS: There are so many traits in animals and insects that we could study and adapt from. But what can we really do with these characteristics once we’ve wired them into our robots?

Barbara Webb: I'd like to think the most helpful thing will be for using robots in more natural or unstructured situations, so if you imagine trying to use robots in agriculture, where they're having to deal with uneven ground and vegetation that's not in any kind of orderly way that they can build a map and use a map to get around, they have to actually navigate with the kind of immediate surroundings. I think it's in areas like that that they might be the most useful.

Navinda Kottege: So we have a wide range of sizes. So the smallest ones that we have are mainly designed to go into what we call confined spaces, so spaces that are difficult or too dangerous for humans to go into.

KRIS: This is Navinda Kottege, a senior research scientist at the Robotics and Autonomous Systems Group, which is a research group within CSIRO's Data61. Navinda works on developing a range of robots that are modelled on insects… and it all started because of frogs.

KRIS: The CSIRO in Australia is a government funded organisation that does a lot of research on agriculture. And as part of one of these projects Navinda’s job was to monitor frogs along a riverbed using some acoustic sensors.

KRIS: It didn’t take them long to realise the limitation of these acoustic sensors: they were always deployed in fixed locations - meaning Navinda couldn’t track the frogs as they moved around…

Navinda Kottege: So, these were very steep ravines, and it was really difficult to access….That's when I started looking at legged robots. We bought some off the shelf platforms, at that time. Put some of our sensor payload, started doing some testing. We soon realised the limitations of the off the shelf bought robot kits. And, then that's when we started looking at designing our own robots.

ANDREW: The robots that the Data61 team have built are based on a hexapod design. That is, they have six legs which allow them to move freely across terrain, and they can withstand slopes of up to 45 degrees. They also come in a range of sizes, from not much bigger than a dinner plate right through to robots which are two metres tall, making them perfect to send out on a mission to explore a forest landscape, or into an area where humans just don’t want to go.

Navinda Kottege: An example is, let's say in the manufacturing domain when they build an aircraft, they still have a human crawling into the wing cavity of an aircraft to do the necessary checks and coatings and all of that. And as you can imagine, it's not a very pleasant environment to be in, so that's one of the motivating applications that we had initially, to build a robot, a legged robot that can go into this sort of a space. And in the surveying and inspection domain, we still have to send people into underfloor spaces and to ceiling cavities to do inspections and to do scanning to figure out where the pipes and ducts are...

ANDREW: These animal-like robots would not only be beneficial for squeezing into tight spaces or monitoring frogs - but they could also help make rescue missions safer for both rescuers and those that need help.

Navinda Kottege: If you think of the aftermath of a natural disaster such as an earthquake or a mine collapse where you have extremely unstable terrain, you have unstructured terrain. Those are the environments that are really suitable for legged robots, because legged robots don't need pathways or roadways to travel on. All they need is fairly small foot holes, and that means you can actually send one of these robots to carry a sensor payload that can help in locating survivors or communicating with survivors. And be able to even carry essential medication or that sort of essential things for people who are trapped in this sort of environment.

KRIS: Right, so you'd be taking the risk away from the humans and then placing that onto a robot in these sorts of situations?

Navinda Kottege: Yes exactly, so even in the application of sending a robot into a confined space to do inspection, what we're suggesting is where we can put cameras, thermal cameras, lidars, all sorts of sensors on the robot and then have a means of transporting this back to a human operator or a human expert who can make the judgements and make the assessments from a safe and comfortable location, rather than putting themselves at risk at the hazardous location.

KRIS: And while eventually these robots would be able to move through a location using artificial intelligence… at the moment you need to have actual people behind the controls, driving the robot to its destination.

KRIS: So this particular robot that we’re looking at is quite small and it can sort of rotate on the spot. Its legs are just able to move around its body.

KRIS: You’re controlling the speed that it moves? Or are you controlling…

Navinda Kottege: So we are giving velocity commands. So, I’m providing the linear and angular velocity at which it should move. And then, it decides what gait to perform and how to move each of the joints to make it move forward, backwards, or to rotate.

KRIS: Is it difficult to drive the robot like this?

Navinda Kottege: No. This one is fairly easy. So, it's using a remote controller, so even you probably can have a go, if you want.

KRIS: Yeah. It looks like a PlayStation controller.

Navinda Kottege: Yes. Yes, it does.

KRIS: Okay, so you're going to have to explain to me what to-

Navinda Kottege: Sure. That makes it go forward, backwards, left, right. And, this makes it rotate on the spot.

KRIS: Okay. So, forward, backwards, and rotate. Okay. Right. And then ... That's so cool. It's, actually, it's really easy to move it around.

KRIS: How do I tell which way is forward?

Navinda Kottege: That's an interesting one. You're actually looking at the back of the robot, there. In the front, there's a camera.

KRIS: Right. So, how quickly can it actually move?

Navinda Kottege: Right now, we haven't optimised it for speed, because we wanted it to go over a relatively rough terrain. But, we can make it go a bit faster. At the moment, I think the top speed would be about, maybe, about 30 centimetres per second.

KRIS: Excellent, that's great. Thank you very much. I imagined it being a much harder thing to control. Does the complexity change with the size of the robot, in terms of being able to control it?

Navinda Kottege: Not really, because the different ... As I said, we have a family of robots, going from the smallest one to the two metre tall version. We actually run the same software base on all the robots, which means it has, pretty much, exactly the same controls. But, different robots do have additional functionality. Some of them have the functionality to use some of the legs to manipulate objects or obstacles. So then, the controller would have some additional functionality for you to invoke that behaviour. But, in terms of just making it move around, it's fairly similar.

ANDREW: And we’ll continue our look at these animal-inspired robots… right after this break.

ANDREW: Welcome back to Moonshot, I’m Andrew Moon. In this episode we have been delving into the world of robots modelled on how animals roam our world, and insects aren’t the only creatures scientists and creators are drawing their inspiration from.

Pleurobot TED Talk: “This is Pleurobot. Pleurobot is a robot that we designed to closely mimic a salamander species called Pleurodeles waltl.”

ANDREW: This is Auke Ijspeert who is the head of the Biorobotics Laboratory at the Swiss Federal Institute of Technology at Lausanne. He introduced Pleurobot at a TED conference in 2015. And his source of inspiration for the salamander-like robot has a closer relationship to humans than you might think.

Pleurobot TED Talk: “It makes a wonderful link between swimming as you find it in eels or fish and quadruped locomotion as you see in mammals, in cats, and in humans.”

ANDREW: Now Auke is hoping to use his robot as a scientific device for improving neuroscience. And rather than only investing in engineers and computer scientists - the Pleurobot team includes neurobiologists - because the team is actually hoping to better understand how animals move - specifically focusing on how the spinal cord controls motion.

KRIS: A major component behind movement is the spinal cord, which has reflexes that control our sensorimotor coordination. Even with the most advanced technology, it is still especially difficult to record activity in the spinal cord because it is protected by the spine. And understanding what happens there becomes ever more important as we look to help people who have issues due to spinal cord injuries. But with a biorobot like Pleurobot, studying the role of this activity in movement becomes possible.

Pleurobot TED Talk: “It's very important to understand how the spinal cord works, how it interacts with the body, and how the brain communicates with the spinal cord. This is where the robots and models that I've presented today will hopefully play a key role towards these very important goals.”

KRIS: Now as we’ve mentioned on Moonshot before - Boston Dynamics is another big player in this animal-like robot space. The company created a laboratory prototype of a cheetah. And while it doesn’t exactly look like the real thing - it certainly could give most humans a run for their money...reaching speeds of around 48 kilometres per hour, which is actually faster than Usain Bolt.

KRIS: Big Dog is another robot they worked on. Its name is quite fitting since, at 1 metre high and 109 kg in weight, it’s much bigger than your average Great Dane. Big Dog was made to go out into the real world. It can climb slopes, walk in snow and water, and can even carry up to 45 kg.

ANDREW: Boston Dynamics was sold in 2017 to robotics giant SoftBank - and that acquisition is already starting to see the company change focus to one of commercialisation.

Marc Raibert: “So SpotMini is in pre-production now. We’ve built 10 units; that’s a design that’s close to a manufacturable design. We’ve built them in house, but with help from contract manufacturer type people. We have a plan later this year to build a hundred with contract manufacturers, and that’s the prelude to getting them into a higher rate production.”

KRIS: That’s the CEO of Boston Dynamics, Marc Raibert, speaking at TechCrunch Disrupt in May, announcing that the company is going to start selling their SpotMini robot in 2019. And that’s a big deal for a company that has mainly been sustaining itself through military contracts.

Marc Raibert: “We’re not saying what the price point is yet but I’ll tell you that this prototype, which if you just looked at it you couldn’t tell the difference from the previous one… but it’s about a ten times reduction in cost.”

KRIS: But as good as the Boston Dynamics robots are, if we look more broadly at the field of robotics and these animal inspired robots there’s still plenty of room to improve.

Barbara Webb: So we have several implementations of the robot ant, or we sometimes call it the ant bot.

KRIS: This is Barbara Webb again:

Barbara Webb: So far, we've been able to show that the basic mechanisms work, but what we tend to find is that it's never as robust on the robot as it is in the ant. For example, it will work for the robot as long as it's on a nice, flat floor, but if we put it over bumpy ground, the ant is fine with bumpy ground and the robot stops working. Or the ant can deal with the fact that the sun has moved at changing times of day, but our robot at the moment is not very robust to changing the lighting situation, and so on. What we usually find is, if we keep things simple enough, we can usually show, in principle, that the mechanism should work, but there's nearly always something that the ant can do that the robot can't.

ANDREW: Technology still has many limitations, but we’re slowly edging towards a future where these robots are more life-like than ever, and Navinda and his team at the CSIRO are constantly looking for ways to improve their hexapod robots. They’re brainstorming ways to come up with the best robotic legs, integrate the best sensors, and apply artificial intelligence techniques that will improve the overall functionality.

Navinda Kottege: And the other one is trying to use machine learning techniques to allow the robots to be able to more intelligently navigate in an environment. Be able to learn from its mistakes like we humans would do. Let's say if we tried to walk on an area that we've never walked on before, let's say loose gravel or loose soil or something like that, initially we'd be very cautious, we will try to very cautiously tread that environment, but once we've done that we actually have that skill in our skill repertoire and we can use that if we see a similar terrain again.

Navinda Kottege: So we are trying to give our robots that sort of ability where they can learn from their experience and be able to apply that knowledge when they see a similar terrain the next time. So these are a few of the different research topics that we are trying to address.

Barbara Webb: We really have solved the problem of how to build the nervous system, and a very compact machine that is very similar to what the nervous system does as a machine, if you like, but what we can’t build is an equivalent of a muscle that drives the leg of an insect, that actually operates with anything like the same power, with the same energy consumption, with the same ratio of size, and so forth. In the mechanics, we just can’t copy the mechanical capabilities of these animals at the moment, I think.

KRIS: Do you think that’s something that we’ll be able to do in the future, or do you think that’s something that insects and animals will be able to retain for themselves?

Barbara Webb: No, I think it’s possible, and it will probably come through new, smart materials. I think that’s probably the area where these kind of breakthroughs will come, but I think there’s a long way to go to get something like that, so I think, relatively speaking, that’s a very unexplored and new area.

KRIS: In a way then, the evolution of biorobotics is one filled with constant iterations and improvements. And for Barbara and others like her, there’s almost an art-like appeal to her two-, four-, and six-legged subjects - studying every element to understand not only how things work, but what might come next.

Barbara Webb: I think it's because you can't help but be impressed when you look at animals and what they can do, and if you've worked in robotics for a while, you begin to appreciate that more and more and more, that every time you see a bird fly or a cat jump onto a high table from the ground, or, for me, now even seeing a simple insect crawling across the floor, you can't help but be amazed at how robust and capable they are compared to anything we're able to build. I mean, it may turn out to be something like ... I think an interesting contrasting example is flight, where people spent a long time trying to build things that flew like birds, and then only really made progress when they stopped trying to make them like birds and came up with completely new principles, but I think if they hadn't been inspired by birds in the first place, it wouldn't have happened.